Sample records for extensive computational study

  1. Evaluation of a computational model to predict elbow range of motion

    PubMed Central

    Nishiwaki, Masao; Johnson, James A.; King, Graham J. W.; Athwal, George S.

    2014-01-01

    Computer models capable of predicting elbow flexion and extension range of motion (ROM) limits would be useful for assisting surgeons in improving the outcomes of surgical treatment of patients with elbow contractures. A simple and robust computer-based model was developed that predicts elbow joint ROM using bone geometries calculated from computed tomography image data. The model assumes a hinge-like flexion-extension axis, and that elbow passive ROM limits can be based on terminal bony impingement. The model was validated against experimental results with a cadaveric specimen, and predicted the flexion and extension limits of the intact joint to within 0° and 3°, respectively. The model also predicted the flexion and extension limits to within 1° and 2°, respectively, when simulated osteophytes were inserted into the joint. Future studies based on this approach will be used for the prediction of elbow flexion-extension ROM in patients with primary osteoarthritis to help identify motion-limiting hypertrophic osteophytes, and will eventually permit real-time computer-assisted navigated excisions. PMID:24841799

  2. Quantification of effect of sequential posteromedial release on flexion and extension gaps: a computer-assisted study in cadaveric knees.

    PubMed

    Mullaji, Arun; Sharma, Amit; Marawar, Satyajit; Kanna, Raj

    2009-08-01

    A novel sequence of posteromedial release consistent with the surgical technique of total knee arthroplasty was performed in 15 cadaveric knees. Medial and lateral flexion and extension gaps were measured after each step of the release using a computed tomography-free computer navigation system. A spring-loaded distractor and a manual distractor were used to distract the joint. Posterior cruciate ligament release increased the flexion gap more than the extension gap; deep medial collateral ligament release had a negligible effect; semimembranosus release increased the flexion gap medially; reduction osteotomy increased medial flexion and extension gaps; superficial medial collateral ligament release increased the medial joint gap more in flexion and caused severe instability. This sequence of release led to incremental and differential effects on flexion-extension gaps and has implications for correcting varus deformity.

  3. Computer-Assisted Learning in UK Engineering Degree Programmes: Lessons Learned from an Extensive Case Study Programme

    ERIC Educational Resources Information Center

    Rothberg, S. J.; Lamb, F. M.; Willis, L.

    2006-01-01

    This paper gives a synopsis of an extensive programme of case studies on real uses of computer-assisted learning (CAL) materials within UK engineering degree programmes. The programme was conducted between 2000 and 2003 and followed a questionnaire-based survey looking at CAL use in the UK and in Australia. The synopsis reveals a number of key…

  4. Neck postures in air traffic controllers with and without neck/shoulder disorders.

    PubMed

    Arvidsson, Inger; Hansson, Gert-Ake; Mathiassen, Svend Erik; Skerfving, Staffan

    2008-03-01

    Prolonged computer work with an extended neck is commonly believed to be associated with an increased risk of neck-shoulder disorders. The aim of this study was to compare neck postures during computer work between female cases with neck-shoulder disorders and healthy referents. Based on physical examinations, 13 cases and 11 referents were selected among 70 female air traffic controllers with the same computer-based work tasks and identical workstations. Postures and movements were measured by inclinometers placed on the forehead and upper back (C7/Th1) during authentic air traffic control. A recently developed method was applied to assess flexion/extension in the neck, calculated as the difference between head and upper back flexion/extension. Cases and referents did not differ significantly in neck posture (median neck flexion/extension: -10 degrees vs. -9 degrees; p=0.9). Hence, the belief that neck extension posture is associated with neck-shoulder disorders in computer work is not supported by the present data.

  5. 10 CFR 76.74 - Computation and extension of time.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 2 2011-01-01 2011-01-01 false Computation and extension of time. 76.74 Section 76.74 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) CERTIFICATION OF GASEOUS DIFFUSION PLANTS Certification § 76.74 Computation and extension of time. (a) In computing any period of time, the day of the act...

  6. 10 CFR 76.74 - Computation and extension of time.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 2 2012-01-01 2012-01-01 false Computation and extension of time. 76.74 Section 76.74 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) CERTIFICATION OF GASEOUS DIFFUSION PLANTS Certification § 76.74 Computation and extension of time. (a) In computing any period of time, the day of the act...

  7. 10 CFR 76.74 - Computation and extension of time.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 2 2013-01-01 2013-01-01 false Computation and extension of time. 76.74 Section 76.74 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) CERTIFICATION OF GASEOUS DIFFUSION PLANTS Certification § 76.74 Computation and extension of time. (a) In computing any period of time, the day of the act...

  8. 10 CFR 76.74 - Computation and extension of time.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Computation and extension of time. 76.74 Section 76.74 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) CERTIFICATION OF GASEOUS DIFFUSION PLANTS Certification § 76.74 Computation and extension of time. (a) In computing any period of time, the day of the act...

  9. 10 CFR 76.74 - Computation and extension of time.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 2 2014-01-01 2014-01-01 false Computation and extension of time. 76.74 Section 76.74 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) CERTIFICATION OF GASEOUS DIFFUSION PLANTS Certification § 76.74 Computation and extension of time. (a) In computing any period of time, the day of the act...

  10. Additional extensions to the NASCAP computer code, volume 1

    NASA Technical Reports Server (NTRS)

    Mandell, M. J.; Katz, I.; Stannard, P. R.

    1981-01-01

    Extensions and revisions to a computer code that comprehensively analyzes problems of spacecraft charging (NASCAP) are documented. Using a fully three-dimensional approach, it can accurately predict spacecraft potentials under a variety of conditions. Among the extensions are a multiple electron/ion gun test tank capability and the ability to model anisotropic and time-dependent space environments. Also documented are a greatly extended MATCHG program and the preliminary version of NASCAP/LEO. The interactive MATCHG code was developed into an extremely powerful tool for the study of material-environment interactions. NASCAP/LEO, a three-dimensional code to study current collection under conditions of high voltages and short Debye lengths, was distributed for preliminary testing.

  11. Effectiveness of Computer-Assisted Mathematics Education (CAME) over Academic Achievement: A Meta-Analysis Study

    ERIC Educational Resources Information Center

    Demir, Seda; Basol, Gülsah

    2014-01-01

    The aim of the current study is to determine the overall effects of Computer-Assisted Mathematics Education (CAME) on academic achievement. After an extensive review of the literature, studies using Turkish samples and observing the effects of Computer-Assisted Education (CAE) on mathematics achievement were examined. As a result of this…

  12. Technical Development and Application of Soft Computing in Agricultural and Biological Engineering

    USDA-ARS's Scientific Manuscript database

    Soft computing is a set of “inexact” computing techniques, which are able to model and analyze very complex problems. For these complex problems, more conventional methods have not been able to produce cost-effective, analytical, or complete solutions. Soft computing has been extensively studied and...

  13. Development of Soft Computing and Applications in Agricultural and Biological Engineering

    USDA-ARS's Scientific Manuscript database

    Soft computing is a set of “inexact” computing techniques, which are able to model and analyze very complex problems. For these complex problems, more conventional methods have not been able to produce cost-effective, analytical, or complete solutions. Soft computing has been extensively studied and...

  14. Notebook computer use on a desk, lap and lap support: effects on posture, performance and comfort.

    PubMed

    Asundi, Krishna; Odell, Dan; Luce, Adam; Dennerlein, Jack T

    2010-01-01

    This study quantified postures of users working on a notebook computer situated in their lap and tested the effect of using a device designed to increase the height of the notebook when placed on the lap. A motion analysis system measured head, neck and upper extremity postures of 15 adults as they worked on a notebook computer placed on a desk (DESK), the lap (LAP) and a commercially available lapdesk (LAPDESK). Compared with the DESK, the LAP increased downwards head tilt 6 degrees and wrist extension 8 degrees. Shoulder flexion and ulnar deviation decreased 13 degrees and 9 degrees, respectively. Compared with the LAP, the LAPDESK decreased downwards head tilt 4 degrees, neck flexion 2 degrees, and wrist extension 9 degrees. Users reported less discomfort and difficulty in the DESK configuration. Use of the lapdesk improved postures compared with the lap; however, all configurations resulted in high values of wrist extension, wrist deviation and downwards head tilt. STATEMENT OF RELEVANCE: This study quantifies postures of users working with a notebook computer in typical portable configurations. A better understanding of the postures assumed during notebook computer use can improve usage guidelines to reduce the risk of musculoskeletal injuries.

  15. Transient Side Load Analysis of Out-of-Round Film-Cooled Nozzle Extensions

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Lin, Jeff; Ruf, Joe; Guidos, Mike

    2012-01-01

    There was interest in understanding the impact of out-of-round nozzle extension on the nozzle side load during transient startup operations. The out-of-round nozzle extension could be the result of asymmetric internal stresses, deformation induced by previous tests, and asymmetric loads induced by hardware attached to the nozzle. The objective of this study was therefore to computationally investigate the effect of out-of-round nozzle extension on the nozzle side loads during an engine startup transient. The rocket engine studied encompasses a regeneratively cooled chamber and nozzle, along with a film-cooled nozzle extension. The computational methodology is based on an unstructured-grid, pressure-based computational fluid dynamics formulation, and transient inlet boundary flow properties derived from an engine system simulation. Six three-dimensional cases were performed, with the out-of-roundness achieved by three different degrees of ovalization elongated on the lateral y and z axes: slightly out-of-round, more out-of-round, and significantly out-of-round. The results show that the separation line jump was the primary source of the peak side loads. Compared to the peak side load of the perfectly round nozzle, the peak side loads increased for the slightly and more ovalized nozzle extensions, and either increased or decreased for the two significantly ovalized nozzle extensions. A theory is presented to explain these observations, based on the counteraction between the destabilizing effect of the asymmetrical flow exacerbated at lower degrees of ovalization and the stabilizing effect of the more symmetrical flow also created by ovalization.

  16. Sessional, Weekly and Diurnal Patterns of Computer Lab Usage by Students Attending a Regional University in Australia

    ERIC Educational Resources Information Center

    Spennemann, Dirk H. R.; Atkinson, John; Cornforth, David

    2007-01-01

    Most universities have invested in extensive infrastructure in the form of computer laboratories and computer kiosks. However, is this investment justified when it is suggested that students work predominantly from home using their own computers? This paper provides an empirical study investigating how students at a regional multi-campus…

  17. Improving Communicative Competence through Synchronous Communication in Computer-Supported Collaborative Learning Environments: A Systematic Review

    ERIC Educational Resources Information Center

    Huang, Xi

    2018-01-01

    Computer-supported collaborative learning facilitates the extension of second language acquisition into social practice. Studies on its achievement effects speak directly to the pedagogical notion of treating communicative practice in synchronous computer-mediated communication (SCMC): real-time communication that takes place between human beings…

  18. Minimizing Dispersion in FDTD Methods with CFL Limit Extension

    NASA Astrophysics Data System (ADS)

    Sun, Chen

    The CFL extension in FDTD methods is receiving considerable attention in order to reduce the computational effort and save the simulation time. One of the major issues in the CFL extension methods is the increased dispersion. We formulate a decomposition of FDTD equations to study the behaviour of the dispersion. A compensation scheme to reduce the dispersion in CFL extension is constructed and proposed. We further study the CFL extension in a FDTD subgridding case, where we improve the accuracy by acting only on the FDTD equations of the fine grid. Numerical results confirm the efficiency of the proposed method for minimising dispersion.
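
The trade-off described above can be made concrete with a small sketch. Assuming the standard explicit Yee scheme (the abstract does not specify the authors' exact formulation, and the function names here are illustrative), the CFL bound on the time step and the 1-D numerical phase velocity, whose deviation from c is the dispersion error, can be written as:

```python
import numpy as np

C0 = 299792458.0  # vacuum speed of light, m/s

def cfl_dt(dx, dy, dz, c=C0):
    """Largest stable time step for the explicit 3-D Yee FDTD scheme."""
    return 1.0 / (c * np.sqrt(1.0 / dx**2 + 1.0 / dy**2 + 1.0 / dz**2))

def numerical_phase_velocity(dx, dt, wavelength, c=C0):
    """1-D FDTD numerical phase velocity from the discrete dispersion
    relation sin(w*dt/2)/(c*dt) = sin(k*dx/2)/dx."""
    w = 2.0 * np.pi * c / wavelength
    k_num = (2.0 / dx) * np.arcsin((dx / (c * dt)) * np.sin(w * dt / 2.0))
    return w / k_num
```

At the 1-D "magic" time step dt = dx/c the numerical phase velocity equals c exactly; moving dt away from that point changes the velocity, which is the dispersion that CFL-extension schemes must compensate for.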

  19. The Effect of Home Computer Use on Children's Cognitive and Non-Cognitive Skills

    ERIC Educational Resources Information Center

    Fiorini, M.

    2010-01-01

    In this paper we investigate the effect of using a home computer on children's development. In most OECD countries 70% or more of the households have a computer at home and children use computers quite extensively, even at very young ages. We use data from the Longitudinal Study of Australian Children (LSAC), which follows an Australian cohort…

  20. Vectorization with SIMD extensions speeds up reconstruction in electron tomography.

    PubMed

    Agulleiro, J I; Garzón, E M; García, I; Fernández, J J

    2010-06-01

    Electron tomography allows structural studies of cellular structures at molecular detail. Large 3D reconstructions are needed to meet the resolution requirements. The processing time to compute these large volumes may be considerable, and so high performance computing techniques have traditionally been used. This work presents a vector approach to tomographic reconstruction that relies on the exploitation of the SIMD extensions available in modern processors in combination with other single-processor optimization techniques. This approach succeeds in producing full-resolution tomograms with an important reduction in processing time, as evaluated with the most common reconstruction algorithms, namely WBP and SIRT. The main advantage stems from the fact that this approach runs on standard computers without the need for specialized hardware, which facilitates the development, use and management of programs. Future trends in processor design open excellent opportunities for vector processing with processors' SIMD extensions in the field of 3D electron microscopy.
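
The speed-up mechanism can be illustrated with a toy backprojection kernel, the core operation of WBP. This is a hedged sketch in NumPy, whose whole-array operations map onto the processor's SIMD units; the function names and the nearest-neighbour interpolation are illustrative, not the authors' code:

```python
import numpy as np

def backproject_scalar(sinogram, angles, size):
    """Naive backprojection with an explicit per-pixel inner loop."""
    recon = np.zeros((size, size))
    center = size // 2
    for a, theta in enumerate(angles):
        c, s = np.cos(theta), np.sin(theta)
        for y in range(size):
            for x in range(size):
                # nearest detector bin for pixel (x, y) at this angle
                t = int(np.rint((x - center) * c + (y - center) * s)) + center
                if 0 <= t < sinogram.shape[1]:
                    recon[y, x] += sinogram[a, t]
    return recon

def backproject_vector(sinogram, angles, size):
    """Same computation expressed over whole arrays: NumPy executes the
    elementwise arithmetic with the CPU's SIMD (vector) instructions."""
    recon = np.zeros((size, size))
    center = size // 2
    xs, ys = np.meshgrid(np.arange(size) - center, np.arange(size) - center)
    for a, theta in enumerate(angles):
        t = np.rint(xs * np.cos(theta) + ys * np.sin(theta)).astype(int) + center
        valid = (t >= 0) & (t < sinogram.shape[1])
        recon[valid] += sinogram[a][t[valid]]
    return recon
```

Both functions produce identical volumes; only the vector form exposes the arithmetic to SIMD execution, which is where the reported reduction in processing time comes from.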

  1. An Analysis of the Use of Cloud Computing among University Lecturers: A Case Study in Zimbabwe

    ERIC Educational Resources Information Center

    Musungwini, Samuel; Mugoniwa, Beauty; Furusa, Samuel Simbarashe; Rebanowako, Taurai George

    2016-01-01

    Cloud computing is a novel model of computing that may bring extensive benefits to users, institutions, businesses and academics, while at the same time also giving rise to new risks and challenges. This study looked at the benefits of using Google Docs for researchers and academics and analysed the factors affecting the adoption and use of the…

  2. 48 CFR 6302.6 - Computation and extension of time limits (Rule 6).

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... of time limits (Rule 6). 6302.6 Section 6302.6 Federal Acquisition Regulations System DEPARTMENT OF TRANSPORTATION BOARD OF CONTRACT APPEALS RULES OF PROCEDURE 6302.6 Computation and extension of time limits (Rule 6). (a) Computation. Except as otherwise provided by law, in computing any period of time prescribed...

  3. An Extensive Reading Strategy to Promote Online Writing for Elementary Students in the 1:1 Digital Classroom

    ERIC Educational Resources Information Center

    Sun, Zhong; Yang, Xian Min; He, Ke Kang

    2016-01-01

    The rapid development of the digital classroom has made it possible to combine extensive reading with online writing, yet research and development in this area are lacking. This study explores the impact of online writing after extensive reading in a classroom setting in China where there was one computer for each student (a 1:1 digital…

  4. Multi-scale Computational Electromagnetics for Phenomenology and Saliency Characterization in Remote Sensing

    DTIC Science & Technology

    2016-07-15

    AFRL-AFOSR-JP-TR-2016-0068, Multi-scale Computational Electromagnetics for Phenomenology and Saliency Characterization in Remote Sensing, Hean-Teik … The report covers the extension of multi-scale computational electromagnetics to applications in microwave remote sensing, as well as extension of modelling capability with computational flexibility to study …

  6. Social correlates of leisure-time sedentary behaviours in Canadian adults.

    PubMed

    Huffman, S; Szafron, M

    2017-03-01

    Research on the correlates of sedentary behaviour among adults is needed to design health interventions to modify this behaviour. This study explored the associations of social correlates with leisure-time sedentary behaviour of Canadian adults, and whether these associations differ between different types of sedentary behaviour. A sample of 12,021 Canadian adults was drawn from the 2012 Canadian Community Health Survey, and analyzed using binary logistic regression to model the relationships that marital status, the presence of children in the household, and social support have with overall time spent sitting, using a computer, playing video games, watching television, and reading during leisure time. Covariates included gender, age, education, income, employment status, perceived health, physical activity level, body mass index (BMI), and province or territory of residence. Extensive computer time was primarily negatively related to being in a common law relationship, and primarily positively related to being single/never married. Being single/never married was positively associated with extensive sitting time in men only. Having children under 12 in the household was protective against extensive video game and reading times. Increasing social support was negatively associated with extensive computer time in men and women, while among men increasing social support was positively associated with extensive sitting time. Computer, video game, television, and reading time have unique correlates among Canadian adults. Marital status, the presence of children in the household, and social support should be considered in future analyses of sedentary activities in adults.

  7. Enhancing Assignment Perceptions in Students with Mathematics Learning Disabilities by Including More Work: An Extension of Interspersal Research

    ERIC Educational Resources Information Center

    Wildmon, Mark E.; Skinner, Christopher H.; Watson, T. Steuart; Garrett, L. Shan

    2004-01-01

    Active student responding is often required to remedy computation skill deficits in students with learning disabilities. However, these students may find computation assignments unrewarding and frustrating, and be less likely to choose to engage in assigned computation tasks. In the current study, middle school students with learning disabilities…

  8. Barriers and Incentives to Computer Usage in Teaching

    DTIC Science & Technology

    1988-09-29

    … classes with one or two computers. Research Methods: The two major methods of data-gathering employed in this study were intensive and extensive classroom observation and repeated extended interviews with students and teachers. Administrators were also interviewed when appropriate. Classroom observers used …

  9. Assessing Affordances of Selected Cloud Computing Tools for Language Teacher Education in Nigeria

    ERIC Educational Resources Information Center

    Ofemile, Abdulmalik Yusuf

    2015-01-01

    This paper reports part of a study that hoped to understand Teacher Educators' (TE) assessment of the affordances of selected cloud computing tools ranked among the top 100 for the year 2010. Research has shown that ICT, and by extension cloud computing, has positive impacts on daily life, and this informed the Nigerian government's policy to…

  10. NASA Computational Case Study SAR Data Processing: Ground-Range Projection

    NASA Technical Reports Server (NTRS)

    Memarsadeghi, Nargess; Rincon, Rafael

    2013-01-01

    Radar technology is used extensively by NASA for remote sensing of the Earth and other planetary bodies. In this case study, we learn about different computational concepts for processing radar data. In particular, we learn how to correct a slanted radar image by projecting it onto the surface that was sensed by a radar instrument.

  11. Can a tablet device alter undergraduate science students' study behavior and use of technology?

    PubMed

    Morris, Neil P; Ramsay, Luke; Chauhan, Vikesh

    2012-06-01

    This article reports findings from a study investigating undergraduate biological sciences students' use of technology and computer devices for learning and the effect of providing students with a tablet device. A controlled study was conducted to collect quantitative and qualitative data on the impact of a tablet device on students' use of devices and technology for learning. Overall, we found that students made extensive use of the tablet device for learning, using it in preference to laptop computers to retrieve information, record lectures, and access learning resources. In line with other studies, we found that undergraduate students only use familiar Web 2.0 technologies and that the tablet device did not alter this behavior for the majority of tools. We conclude that undergraduate science students can make extensive use of a tablet device to enhance their learning opportunities without institutions changing their teaching methods or computer systems, but that institutional intervention may be needed to drive changes in student behavior toward the use of novel Web 2.0 technologies.

  12. Determinants of Computer Utilization by Extension Personnel: A Structural Equations Approach

    ERIC Educational Resources Information Center

    Sivakumar, Paramasivan Sethuraman; Parasar, Bibudha; Das, Raghu Nath; Anantharaman, Mathevanpillai

    2014-01-01

    Purpose: Information technology (IT) has tremendous potential for fostering grassroots development and the Indian government has created various capital-intensive computer networks to promote agricultural development. However, research studies have shown that information technology investments are not always translated into productivity gains due…

  13. A Fast Algorithm for the Convolution of Functions with Compact Support Using Fourier Extensions

    DOE PAGES

    Xu, Kuan; Austin, Anthony P.; Wei, Ke

    2017-12-21

    In this paper, we present a new algorithm for computing the convolution of two compactly supported functions. The algorithm approximates the functions to be convolved using Fourier extensions and then uses the fast Fourier transform to efficiently compute Fourier extension approximations to the pieces of the result. Finally, the complexity of the algorithm is O(N(log N)^2), where N is the number of degrees of freedom used in each of the Fourier extensions.
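
The core identity the algorithm exploits is the convolution theorem: convolution in physical space is pointwise multiplication of Fourier coefficients. A minimal discrete sketch of that identity follows; here zero-padding stands in for the paper's Fourier-extension step, which treats non-periodic functions far more carefully:

```python
import numpy as np

def conv_via_fft(f, g, dx=1.0):
    """Linear convolution of two compactly supported sampled functions.
    Padding both to length len(f) + len(g) - 1 makes the circular (FFT)
    convolution equal the linear one; dx scales the sum toward an integral."""
    n = len(f) + len(g) - 1
    F = np.fft.rfft(f, n)      # each transform costs O(N log N)
    G = np.fft.rfft(g, n)
    return np.fft.irfft(F * G, n) * dx
```

With dx = 1 this reproduces the ordinary discrete linear convolution, but in O(N log N) operations rather than O(N^2).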

  15. OpenFlow Extensions for Programmable Quantum Networks

    DTIC Science & Technology

    2017-06-19

    OpenFlow Extensions for Programmable Quantum Networks, by Venkat Dasari, Nikolai Snow, and Billy Geerhart, Computational and Information Sciences Directorate. … Quantum networks and quantum computing have been receiving a surge of interest recently. … communicate using entangled particles and perform calculations using quantum logic gates. Additionally, quantum computing uses a quantum bit (qubit) …

  16. Computers for real time flight simulation: A market survey

    NASA Technical Reports Server (NTRS)

    Bekey, G. A.; Karplus, W. J.

    1977-01-01

    An extensive computer market survey was made to determine those available systems suitable for current and future flight simulation studies at Ames Research Center. The primary requirement is for the computation of relatively high frequency content (5 Hz) math models representing powered lift flight vehicles. The Rotor Systems Research Aircraft (RSRA) was used as a benchmark vehicle for computation comparison studies. The general nature of helicopter simulations and a description of the benchmark model are presented, and some of the sources of simulation difficulties are examined. A description of various applicable computer architectures is presented, along with detailed discussions of leading candidate systems and comparisons between them.

  17. 3-D modeling of ductile tearing using finite elements: Computational aspects and techniques

    NASA Astrophysics Data System (ADS)

    Gullerud, Arne Stewart

    This research focuses on the development and application of computational tools to perform large-scale, 3-D modeling of ductile tearing in engineering components under quasi-static to mild loading rates. Two standard models for ductile tearing---the computational cell methodology and crack growth controlled by the crack tip opening angle (CTOA)---are described and their 3-D implementations are explored. For the computational cell methodology, quantification of the effects of several numerical issues---computational load step size, procedures for force release after cell deletion, and the porosity for cell deletion---enables construction of computational algorithms to remove the dependence of predicted crack growth on these issues. This work also describes two extensions of the CTOA approach into 3-D: a general 3-D method and a constant front technique. Analyses compare the characteristics of the extensions, and a validation study explores the ability of the constant front extension to predict crack growth in thin aluminum test specimens over a range of specimen geometries, absolute sizes, and levels of out-of-plane constraint. To provide a computational framework suitable for the solution of these problems, this work also describes the parallel implementation of a nonlinear, implicit finite element code. The implementation employs an explicit message-passing approach using the MPI standard to maintain portability, a domain decomposition of element data to provide parallel execution, and a master-worker organization of the computational processes to enhance future extensibility. A linear preconditioned conjugate gradient (LPCG) solver serves as the core of the solution process.
The parallel LPCG solver utilizes an element-by-element (EBE) structure of the computations to permit a dual-level decomposition of the element data: domain decomposition of the mesh provides efficient coarse-grain parallel execution, while decomposition of the domains into blocks of similar elements (same type, constitutive model, etc.) provides fine-grain parallel computation on each processor. A major focus of the LPCG solver is a new implementation of the Hughes-Winget element-by-element (HW) preconditioner. The implementation employs a weighted dependency graph combined with a new coloring algorithm to provide load-balanced scheduling for the preconditioner and overlapped communication/computation. This approach enables efficient parallel application of the HW preconditioner for arbitrary unstructured meshes.
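
The role of the coloring algorithm can be shown with a small stand-in. This greedy sketch (hypothetical, and far simpler than the weighted, load-balanced version described above) colors elements so that no two elements sharing a node receive the same color; all elements of one color class can then be processed concurrently without write conflicts:

```python
from collections import defaultdict

def color_elements(elements):
    """Greedy coloring of a mesh given as a list of node tuples.
    Returns {element_index: color}; elements sharing any node always
    end up in different color classes."""
    node_to_elems = defaultdict(list)
    for e, nodes in enumerate(elements):
        for n in nodes:
            node_to_elems[n].append(e)
    colors = {}
    for e, nodes in enumerate(elements):
        # colors already used by neighbours that share a node with e
        taken = {colors[o] for n in nodes for o in node_to_elems[n] if o in colors}
        c = 0
        while c in taken:
            c += 1
        colors[e] = c
    return colors
```

Elements of one color share no nodes, so their contributions to the global residual vector touch disjoint entries and can be accumulated in parallel.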

  18. Cooperative combinatorial optimization: evolutionary computation case study.

    PubMed

    Burgin, Mark; Eberbach, Eugene

    2008-01-01

    This paper presents a formalization of the notion of cooperation and competition of multiple systems that work toward a common optimization goal of the population using evolutionary computation techniques. It is proved that evolutionary algorithms are more expressive than conventional recursive algorithms, such as Turing machines. Three classes of evolutionary computations are introduced and studied: bounded finite, unbounded finite, and infinite computations. Universal evolutionary algorithms are constructed. Such properties of evolutionary algorithms as completeness, optimality, and search decidability are examined. A natural extension of evolutionary Turing machine (ETM) model is proposed to properly reflect phenomena of cooperation and competition in the whole population.
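
For readers unfamiliar with the machinery being formalized, a minimal generational evolutionary algorithm looks like the following sketch. It is a plain bit-string optimizer illustrating selection, crossover, and mutation only, not the evolutionary Turing machine model itself, and all names and parameters are illustrative:

```python
import random

def evolve(fitness, genome_len=16, pop_size=30, generations=60, seed=1):
    """Minimal elitist evolutionary algorithm over bit strings."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(genome_len)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]           # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, genome_len)   # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.1:               # occasional point mutation
                i = rng.randrange(genome_len)
                child[i] ^= 1
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

# OneMax: fitness is simply the number of 1 bits
best = evolve(sum)
```

On the OneMax problem this converges quickly; the paper's concern is formalizing how multiple such evolving populations cooperate or compete toward a shared optimization goal.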

  19. Computational Studies of Chemical Reactions: The HNC-HCN and CH[subscript 3]NC-CH[subscript 3]CN Isomerizations

    ERIC Educational Resources Information Center

    Halpern, Arthur M.

    2006-01-01

    The application of computational methods to the isomerization of hydrogen isocyanide to hydrogen cyanide, HNC-HCN, is described. The exercise is then logically extended to the isomerization of the methyl-substituted compounds, methyl isocyanide and methyl cyanide, CH[subscript 3]NC-CH[subscript 3]CN.

  20. Evaluation of the effect of different stretching patterns on force decay and tensile properties of elastomeric ligatures

    PubMed Central

    Aminian, Amin; Nakhaei, Samaneh; Agahi, Raha Habib; Rezaeizade, Masoud; Aliabadi, Hamed Mirzazadeh; Heidarpour, Majid

    2015-01-01

    Background: There has been much research on elastomeric ligatures, but clinical conditions in the different stages of treatment are not exactly reproduced in the laboratory. The aim of this in vitro study was to simulate clinical conditions and evaluate the effect of three stretching patterns on the force, tensile strength (TS) and extension to TS of the elastomers over 8 weeks. Materials and Methods: Forces, TS and extension to TS of two different brands of elastomers were measured at baseline, at 24 h, and at 2-, 4-, and 8-week intervals using a testing machine. During the study period, the elastomers were stored in three different types of jig (uniform stretching, and 1 and 3 mm point stretching) designed by the computer-aided design and computer-aided manufacturing technique in order to simulate the different stages of orthodontic treatment. Results: The elastomeric ligatures under study exhibited a similar force decay pattern. The maximum force decay occurred during the first 24 h (49.9% ± 15%), and force decay reached 75.7% ± 8% after 8 weeks. In general, TS decreased during the study period and extension to TS increased. Conclusion: Although the elastic behavior of all the ligatures under study was similar, the residual force, TS and extension to TS were greater in elastomers under the point stretching patterns. PMID:26759597

  1. Evaluation of the effect of different stretching patterns on force decay and tensile properties of elastomeric ligatures.

    PubMed

    Aminian, Amin; Nakhaei, Samaneh; Agahi, Raha Habib; Rezaeizade, Masoud; Aliabadi, Hamed Mirzazadeh; Heidarpour, Majid

    2015-01-01

    There has been much research on elastomeric ligatures, but clinical conditions in the different stages of treatment are not exactly reproduced in the laboratory. The aim of this in vitro study was to simulate clinical conditions and evaluate the effect of three stretching patterns on the force, tensile strength (TS) and extension to TS of the elastomers over 8 weeks. Forces, TS and extension to TS of two different brands of elastomers were measured at baseline, at 24 h, and at 2-, 4-, and 8-week intervals using a testing machine. During the study period, the elastomers were stored in three different types of jig (uniform stretching, and 1 and 3 mm point stretching) designed by the computer-aided design and computer-aided manufacturing technique in order to simulate the different stages of orthodontic treatment. The elastomeric ligatures under study exhibited a similar force decay pattern. The maximum force decay occurred during the first 24 h (49.9% ± 15%), and force decay reached 75.7% ± 8% after 8 weeks. In general, TS decreased during the study period and extension to TS increased. Although the elastic behavior of all the ligatures under study was similar, the residual force, TS and extension to TS were greater in elastomers under the point stretching patterns.

  2. Online System for Faster Multipoint Linkage Analysis via Parallel Execution on Thousands of Personal Computers

    PubMed Central

    Silberstein, M.; Tzemach, A.; Dovgolevsky, N.; Fishelson, M.; Schuster, A.; Geiger, D.

    2006-01-01

    Computation of LOD scores is a valuable tool for mapping disease-susceptibility genes in the study of Mendelian and complex diseases. However, computation of exact multipoint likelihoods of large inbred pedigrees with extensive missing data is often beyond the capabilities of a single computer. We present a distributed system called “SUPERLINK-ONLINE,” for the computation of multipoint LOD scores of large inbred pedigrees. It achieves high performance via the efficient parallelization of the algorithms in SUPERLINK, a state-of-the-art serial program for these tasks, and through the use of the idle cycles of thousands of personal computers. The main algorithmic challenge has been to efficiently split a large task for distributed execution in a highly dynamic, nondedicated running environment. Notably, the system is available online, which allows computationally intensive analyses to be performed with no need for either the installation of software or the maintenance of a complicated distributed environment. As the system was being developed, it was extensively tested by collaborating medical centers worldwide on a variety of real data sets, some of which are presented in this article. PMID:16685644
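    The quantity being parallelized, the LOD score, compares the likelihood of linkage at a recombination fraction theta against free recombination (theta = 0.5). A toy two-point version can be sketched as follows (SUPERLINK computes far more complex multipoint likelihoods over whole pedigrees; the counts and theta here are illustrative):

```python
# Toy two-point LOD score: LOD(theta) = log10( L(theta) / L(0.5) ),
# with L(theta) = theta^R * (1 - theta)^(n - R) for R recombinants
# among n informative meioses. Illustrative only; SUPERLINK-ONLINE
# computes exact multipoint likelihoods over large pedigrees.
import math

def lod_score(recombinants, meioses, theta):
    non_recombinants = meioses - recombinants

    def likelihood(t):
        return t ** recombinants * (1 - t) ** non_recombinants

    return math.log10(likelihood(theta) / likelihood(0.5))

# 1 recombinant in 10 meioses, tested at theta = 0.1
print(round(lod_score(recombinants=1, meioses=10, theta=0.1), 3))
```

A LOD score above 3 is the conventional threshold for declaring linkage; the multipoint case replaces the simple likelihood above with a sum over all consistent inheritance patterns, which is what makes the computation so expensive.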

  3. 40 CFR 305.6 - Computation and extension of time.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 29 2012-07-01 2012-07-01 false Computation and extension of time. 305.6 Section 305.6 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SUPERFUND..., AND LIABILITY ACT (CERCLA) ADMINISTRATIVE HEARING PROCEDURES FOR CLAIMS AGAINST THE SUPERFUND General...

  4. 40 CFR 305.6 - Computation and extension of time.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 29 2013-07-01 2013-07-01 false Computation and extension of time. 305.6 Section 305.6 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SUPERFUND..., AND LIABILITY ACT (CERCLA) ADMINISTRATIVE HEARING PROCEDURES FOR CLAIMS AGAINST THE SUPERFUND General...

  5. 40 CFR 305.6 - Computation and extension of time.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 28 2011-07-01 2011-07-01 false Computation and extension of time. 305.6 Section 305.6 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SUPERFUND..., AND LIABILITY ACT (CERCLA) ADMINISTRATIVE HEARING PROCEDURES FOR CLAIMS AGAINST THE SUPERFUND General...

  6. Tomo3D 2.0--exploitation of advanced vector extensions (AVX) for 3D reconstruction.

    PubMed

    Agulleiro, Jose-Ignacio; Fernandez, Jose-Jesus

    2015-02-01

    Tomo3D is a program for fast tomographic reconstruction on multicore computers. Its high speed stems from code optimization, vectorization with Streaming SIMD Extensions (SSE), multithreading and optimization of disk access. Recently, Advanced Vector eXtensions (AVX) have been introduced in the x86 processor architecture. Compared to SSE, AVX doubles the number of simultaneous operations, thus pointing to a potential twofold gain in speed. However, in practice, achieving this potential is extremely difficult. Here, we provide a technical description and an assessment of the optimizations included in Tomo3D to take advantage of AVX instructions. Tomo3D 2.0 allows huge reconstructions to be calculated in standard computers in a matter of minutes. Thus, it will be a valuable tool for electron tomography studies with increasing resolution needs.

  7. PRINCESS: Privacy-protecting Rare disease International Network Collaboration via Encryption through Software guard extensionS

    PubMed Central

    Chen, Feng; Wang, Shuang; Jiang, Xiaoqian; Ding, Sijie; Lu, Yao; Kim, Jihoon; Sahinalp, S. Cenk; Shimizu, Chisato; Burns, Jane C.; Wright, Victoria J.; Png, Eileen; Hibberd, Martin L.; Lloyd, David D.; Yang, Hai; Telenti, Amalio; Bloss, Cinnamon S.; Fox, Dov; Lauter, Kristin; Ohno-Machado, Lucila

    2017-01-01

    Abstract Motivation: We introduce PRINCESS, a privacy-preserving international collaboration framework for analyzing rare disease genetic data that are distributed across different continents. PRINCESS leverages Software Guard Extensions (SGX) and hardware for trustworthy computation. Unlike a traditional international collaboration model, where individual-level patient DNA are physically centralized at a single site, PRINCESS performs a secure and distributed computation over encrypted data, fulfilling institutional policies and regulations for protected health information. Results: To demonstrate PRINCESS’ performance and feasibility, we conducted a family-based allelic association study for Kawasaki Disease, with data hosted in three different continents. The experimental results show that PRINCESS provides secure and accurate analyses much faster than alternative solutions, such as homomorphic encryption and garbled circuits (over 40 000× faster). Availability and Implementation: https://github.com/achenfengb/PRINCESS_opensource Contact: shw070@ucsd.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28065902

  8. Near-term hybrid vehicle program, phase 1. Appendix B: Design trade-off studies report. Volume 3: Computer program listings

    NASA Technical Reports Server (NTRS)

    1979-01-01

    A description and listing is presented of two computer programs: Hybrid Vehicle Design Program (HYVELD) and Hybrid Vehicle Simulation Program (HYVEC). Both of the programs are modifications and extensions of similar programs developed as part of the Electric and Hybrid Vehicle System Research and Development Project.

  9. Digital receiver study and implementation

    NASA Technical Reports Server (NTRS)

    Fogle, D. A.; Lee, G. M.; Massey, J. C.

    1972-01-01

    Computer software was developed which makes it possible to use any general purpose computer with A/D conversion capability as a PSK receiver for low data rate telemetry processing. Carrier tracking, bit synchronization, and matched filter detection are all performed digitally. To aid in the implementation of optimum computer processors, a study of general digital processing techniques was performed which emphasized various techniques for digitizing general analog systems. In particular, the phase-locked loop was extensively analyzed as a typical non-linear communication element. Bayesian estimation techniques for PSK demodulation were studied. A hardware implementation of the digital Costas loop was developed.

  10. DAKOTA Design Analysis Kit for Optimization and Terascale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Dalbey, Keith R.; Eldred, Michael S.

    2010-02-24

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes (computational models) and iterative analysis methods. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and analysis of computational models on high performance computers. A user provides a set of DAKOTA commands in an input file and launches DAKOTA. DAKOTA invokes instances of the computational models, collects their results, and performs systems analyses. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, polynomial chaos, stochastic collocation, and epistemic methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as hybrid optimization, surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. Services for parallel computing, simulation interfacing, approximation modeling, fault tolerance, restart, and graphics are also included.

  11. Research Area 3: Mathematical Sciences: 3.4, Discrete Mathematics and Computer Science

    DTIC Science & Technology

    2015-06-10

    Charles Chui, Hrushikesh Mhaskar. MRA contextual-recovery extension of smooth functions on manifolds, Applied and Computational Harmonic Analysis. International Society for Optics and Photonics, 2010.

  12. 19 CFR 201.14 - Computation of time, additional hearings, postponements, continuances, and extensions of time.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 19 Customs Duties 3 2014-04-01 2014-04-01 false Computation of time, additional hearings, postponements, continuances, and extensions of time. 201.14 Section 201.14 Customs Duties UNITED STATES INTERNATIONAL TRADE COMMISSION GENERAL RULES OF GENERAL APPLICATION Initiation and Conduct of Investigations...

  13. 19 CFR 201.14 - Computation of time, additional hearings, postponements, continuances, and extensions of time.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 19 Customs Duties 3 2013-04-01 2013-04-01 false Computation of time, additional hearings, postponements, continuances, and extensions of time. 201.14 Section 201.14 Customs Duties UNITED STATES INTERNATIONAL TRADE COMMISSION GENERAL RULES OF GENERAL APPLICATION Initiation and Conduct of Investigations...

  14. 19 CFR 210.6 - Computation of time, additional hearings, postponements, continuances, and extensions of time.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 19 Customs Duties 3 2010-04-01 2010-04-01 false Computation of time, additional hearings, postponements, continuances, and extensions of time. 210.6 Section 210.6 Customs Duties UNITED STATES INTERNATIONAL TRADE COMMISSION INVESTIGATIONS OF UNFAIR PRACTICES IN IMPORT TRADE ADJUDICATION AND ENFORCEMENT...

  15. 45 CFR 150.429 - Computation of time and extensions of time.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 1 2010-10-01 2010-10-01 false Computation of time and extensions of time. 150.429 Section 150.429 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES REQUIREMENTS RELATING TO HEALTH CARE ACCESS CMS ENFORCEMENT IN GROUP AND INDIVIDUAL INSURANCE MARKETS Administrative Hearings...

  16. 45 CFR 150.429 - Computation of time and extensions of time.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 1 2011-10-01 2011-10-01 false Computation of time and extensions of time. 150.429 Section 150.429 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES REQUIREMENTS RELATING TO HEALTH CARE ACCESS CMS ENFORCEMENT IN GROUP AND INDIVIDUAL INSURANCE MARKETS Administrative Hearings...

  17. 76 FR 27274 - Defense Federal Acquisition Regulation Supplement; Rules of the Armed Services Board of Contract...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-11

    ... 23 Post-Hearing Briefs Rule 24 Transcript of Proceedings Rule 25 Withdrawal of Exhibits... from Court TIME, COMPUTATION, AND EXTENSIONS Rule 33 Time, Computation and Extensions EX PARTE COMMUNICATIONS Rule 34 Ex parte Communications SANCTIONS Rule 35 Sanctions EFFECTIVE DATE AND APPLICABILITY Rule...

  18. Inertial subsystem functional and design requirements for the orbiter (Phase B extension baseline)

    NASA Technical Reports Server (NTRS)

    Flanders, J. H.; Green, J. P., Jr.

    1972-01-01

    The design requirements use the Phase B extension baseline system definition. This means that a GNC computer is specified for all command control functions instead of a central computer communicating with the ISS through a databus. Forced air cooling is used instead of cold plate cooling.

  19. Outcomes of Orbital Floor Reconstruction After Extensive Maxillectomy Using the Computer-Assisted Fabricated Individual Titanium Mesh Technique.

    PubMed

    Zhang, Wen-Bo; Mao, Chi; Liu, Xiao-Jing; Guo, Chuan-Bin; Yu, Guang-Yan; Peng, Xin

    2015-10-01

    Orbital floor defects after extensive maxillectomy can cause severe esthetic and functional deformities. Orbital floor reconstruction using the computer-assisted fabricated individual titanium mesh technique is a promising method. This study evaluated the application and clinical outcomes of this technique. This retrospective study included 10 patients with orbital floor defects after maxillectomy performed from 2012 through 2014. A 3-dimensional individual stereo model based on mirror images of the unaffected orbit was obtained to fabricate an anatomically adapted titanium mesh using computer-assisted design and manufacturing. The titanium mesh was inserted into the defect using computer navigation. The postoperative globe projection and orbital volume were measured and the incidence of postoperative complications was evaluated. The average postoperative globe projection was 15.91 ± 1.80 mm on the affected side and 16.24 ± 2.24 mm on the unaffected side (P = .505), and the average postoperative orbital volume was 26.01 ± 1.28 and 25.57 ± 1.89 mL, respectively (P = .312). The mean mesh depth was 25.11 ± 2.13 mm. The mean follow-up period was 23.4 ± 7.7 months (12 to 34 months). Of the 10 patients, 9 did not develop diplopia or a decrease in visual acuity and ocular motility. Titanium mesh exposure was not observed in any patient. All patients were satisfied with their postoperative facial symmetry. Orbital floor reconstruction after extensive maxillectomy with an individual titanium mesh fabricated using computer-assisted techniques can preserve globe projection and orbital volume, resulting in successful clinical outcomes.

  20. Guidelines for Computing Longitudinal Dynamic Stability Characteristics of a Subsonic Transport

    NASA Technical Reports Server (NTRS)

    Thompson, Joseph R.; Frank, Neal T.; Murphy, Patrick C.

    2010-01-01

    A systematic study is presented to guide the selection of a numerical solution strategy for URANS computation of a subsonic transport configuration undergoing simulated forced oscillation about its pitch axis. Forced oscillation is central to the prevalent wind tunnel methodology for quantifying aircraft dynamic stability derivatives from force and moment coefficients, which is the ultimate goal for the computational simulations. Extensive computations are performed that lead to key insights into the critical numerical parameters affecting solution convergence. A preliminary linear harmonic analysis is included to demonstrate the potential of extracting dynamic stability derivatives from computational solutions.

  1. Research in the Aloha system

    NASA Technical Reports Server (NTRS)

    Abramson, N.

    1974-01-01

    The Aloha system was studied, developed, and extended to advanced forms of computer communications networks. Theoretical and simulation studies of Aloha-type radio channels for use in packet-switched communications networks were performed. Improved versions of the Aloha communications techniques and their extensions were tested experimentally. A packet radio repeater suitable for use with the operational Aloha system network was developed. General studies of the organization of multiprocessor systems, centered on the development of the BCC 500 computer, were concluded.

  2. Learning, epigenetics, and computation: An extension on Fitch's proposal. Comment on “Toward a computational framework for cognitive biology: Unifying approaches from cognitive neuroscience and comparative cognition” by W. Tecumseh Fitch

    NASA Astrophysics Data System (ADS)

    Okanoya, Kazuo

    2014-09-01

    The comparative computational approach of Fitch [1] attempts to renew the classical David Marr paradigm of computation, algorithm, and implementation by introducing an evolutionary view of the relationship between neural architecture and cognition. This comparative evolutionary view provides constraints useful in narrowing down the problem space for both cognition and neural mechanisms. I will provide two examples from our own studies that reinforce and extend Fitch's proposal.

  3. Study to document low thrust trajectory optimization programs HILTOP and ASTOP

    NASA Technical Reports Server (NTRS)

    Horsewood, J. L.; Mann, F. I.; Pines, S.

    1974-01-01

    Detailed documentation of the HILTOP and ASTOP computer programs is presented along with results of the analyses of the possible extension of the HILTOP program and results of an extra-ecliptic mission study performed with HILTOP.

  4. Transient Three-Dimensional Side Load Analysis of a Film Cooled Nozzle

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Guidos, Mike

    2008-01-01

    Transient three-dimensional numerical investigations of the side load physics for an engine encompassing a film-cooled nozzle extension and a regeneratively cooled thrust chamber were performed. The objectives of this study were to identify the three-dimensional side load physics and to compute the associated aerodynamic side load using an anchored computational methodology. The computational methodology is based on an unstructured-grid, pressure-based computational fluid dynamics formulation and a transient inlet history based on an engine system simulation. Ultimately, the computational results will be provided to the nozzle designers for estimating the effect of the peak side load on the nozzle structure. Computations simulating engine startup at ambient pressures corresponding to sea level and three high altitudes were performed. In addition, computations for both engine startup and shutdown transients were performed for a stub nozzle operating at sea level. For the engine with the full nozzle extension, the computational results show that during sea-level startup the peak side load occurs when the lambda shock steps into the turbine exhaust flow, with the side load caused by the transition from free-shock separation to restricted-shock separation second largest; the side loads decrease rapidly and progressively as the ambient pressure decreases. For the stub nozzle operating at sea level, the computed side loads during both startup and shutdown become very small because of the much reduced flow area.

  5. A Bitslice Implementation of Anderson's Attack on A5/1

    NASA Astrophysics Data System (ADS)

    Bulavintsev, Vadim; Semenov, Alexander; Zaikin, Oleg; Kochemazov, Stepan

    2018-03-01

    The A5/1 keystream generator is a part of Global System for Mobile Communications (GSM) protocol, employed in cellular networks all over the world. Its cryptographic resistance was extensively analyzed in dozens of papers. However, almost all corresponding methods either employ a specific hardware or require an extensive preprocessing stage and significant amounts of memory. In the present study, a bitslice variant of Anderson's Attack on A5/1 is implemented. It requires very little computer memory and no preprocessing. Moreover, the attack can be made even more efficient by harnessing the computing power of modern Graphics Processing Units (GPUs). As a result, using commonly available GPUs this method can quite efficiently recover the secret key using only 64 bits of keystream. To test the performance of the implementation, a volunteer computing project was launched. 10 instances of A5/1 cryptanalysis have been successfully solved in this project in a single week.
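    The bitslice idea can be illustrated with A5/1's majority clocking rule: pack one bit from each of 64 candidate states into a machine word, and a handful of bitwise operations then evaluates all 64 candidates at once. A minimal sketch (not the paper's attack implementation; the packed values are illustrative):

```python
# Bitslice sketch: evaluate A5/1's majority clocking function for 64
# candidate states at once, using a Python integer as 64 one-bit lanes.
# Illustrative only; not the paper's full GPU attack implementation.

MASK = (1 << 64) - 1  # keep results within 64 lanes

def majority(a, b, c):
    """Bitwise majority: lane i is 1 iff at least two of a, b, c have bit i set."""
    return ((a & b) | (a & c) | (b & c)) & MASK

# Each integer packs one register clocking bit from each of 64 candidates.
a = 0b1100
b = 0b1010
c = 0b0110
print(bin(majority(a, b, c)))
```

In a real bitslice implementation every gate of the cipher is rewritten this way, so one pass over the keystream tests a full word of candidates in parallel, which is what makes the attack practical on GPUs.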

  6. A Computational Procedure for Identifying Bilinear Representations of Nonlinear Systems Using Volterra Kernels

    NASA Technical Reports Server (NTRS)

    Kvaternik, Raymond G.; Silva, Walter A.

    2008-01-01

    A computational procedure for identifying the state-space matrices corresponding to discrete bilinear representations of nonlinear systems is presented. A key feature of the method is the use of first- and second-order Volterra kernels (first- and second-order pulse responses) to characterize the system. The present method is based on an extension of a continuous-time bilinear system identification procedure given in a 1971 paper by Bruni, di Pillo, and Koch. The analytical and computational considerations that underlie the original procedure and its extension to the title problem are presented and described, pertinent numerical considerations associated with the process are discussed, and results obtained from the application of the method to a variety of nonlinear problems from the literature are presented. The results of these exploratory numerical studies are decidedly promising and provide sufficient credibility for further examination of the applicability of the method.
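    The model class in question, a discrete bilinear system, augments a linear state-space model with a state-input product term. A minimal simulation sketch follows (the matrices are illustrative, not from the paper, which identifies A, N, B, C from Volterra kernels rather than simulating known ones):

```python
# Minimal sketch of simulating a discrete bilinear system
#   x[k+1] = A x[k] + N x[k] u[k] + B u[k],   y[k] = C x[k].
# The matrices are illustrative; the paper's procedure identifies
# such matrices from first- and second-order Volterra kernels.
import numpy as np

def simulate_bilinear(A, N, B, C, u, x0):
    x = np.asarray(x0, dtype=float)
    outputs = []
    for uk in u:
        outputs.append(C @ x)
        x = A @ x + (N @ x) * uk + B * uk  # bilinear term couples x and u
    return np.array(outputs)

A = np.array([[0.9, 0.1], [0.0, 0.8]])
N = np.array([[0.0, 0.05], [0.02, 0.0]])
B = np.array([1.0, 0.5])
C = np.array([1.0, 0.0])
y = simulate_bilinear(A, N, B, C, u=[1.0, 0.0, 0.0], x0=[0.0, 0.0])
print(y)
```

With N = 0 this reduces to an ordinary linear state-space model; it is the N x[k] u[k] term that lets the bilinear class approximate nonlinear systems.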

  7. Progress Monitoring with Computer Adaptive Assessments: The Impact of Data Collection Schedule on Growth Estimates

    ERIC Educational Resources Information Center

    Nelson, Peter M.; Van Norman, Ethan R.; Klingbeil, Dave A.; Parker, David C.

    2017-01-01

    Although extensive research exists on the use of curriculum-based measures for progress monitoring, little is known about using computer adaptive tests (CATs) for progress-monitoring purposes. The purpose of this study was to evaluate the impact of the frequency of data collection on individual and group growth estimates using a CAT. Data were…

  8. Derivation of improved load transformation matrices for launchers-spacecraft coupled analysis, and direct computation of margins of safety

    NASA Technical Reports Server (NTRS)

    Klein, M.; Reynolds, J.; Ricks, E.

    1989-01-01

    Load and stress recovery from transient dynamic studies is improved by using an extended acceleration vector in the modal acceleration technique applied to structural analysis. Extension of the normal LTM (load transformation matrices) stress recovery to automatically compute margins of safety is presented, with an application to the Hubble Space Telescope.

  9. Computational neurobiology is a useful tool in translational neurology: the example of ataxia

    PubMed Central

    Brown, Sherry-Ann; McCullough, Louise D.; Loew, Leslie M.

    2014-01-01

    Hereditary ataxia, or motor incoordination, affects approximately 150,000 Americans and hundreds of thousands of individuals worldwide with onset from as early as mid-childhood. Affected individuals exhibit dysarthria, dysmetria, action tremor, and dysdiadochokinesia. In this review, we consider an array of computational studies derived from experimental observations relevant to human neuropathology. A survey of related studies illustrates the impact of integrating clinical evidence with data from mouse models and computational simulations. Results from these studies may help explain findings in mice, and after extensive laboratory study, may ultimately be translated to ataxic individuals. This inquiry lays a foundation for using computation to understand neurobiochemical and electrophysiological pathophysiology of spinocerebellar ataxias and may contribute to development of therapeutics. The interdisciplinary analysis suggests that computational neurobiology can be an important tool for translational neurology. PMID:25653585

  10. A glacier runoff extension to the Precipitation Runoff Modeling System

    USGS Publications Warehouse

    Van Beusekom, Ashley E.; Viger, Roland

    2016-01-01

    A module to simulate glacier runoff, PRMSglacier, was added to PRMS (Precipitation Runoff Modeling System), a distributed-parameter, physical-process hydrological simulation code. The extension does not require extensive on-glacier measurements or computational expense but still relies on physical principles over empirical relations as much as is feasible while maintaining model usability. PRMSglacier is validated on two basins in Alaska, the Wolverine and Gulkana Glacier basins, which have been studied since 1966 and have a substantial amount of data with which to test model performance over a long period of time covering a wide range of climatic and hydrologic conditions. When error in field measurements is considered, the Nash-Sutcliffe efficiencies of streamflow are 0.87 and 0.86, the absolute bias fractions of the winter mass balance simulations are 0.10 and 0.08, and the absolute bias fractions of the summer mass balances are 0.01 and 0.03, all computed over 42 years for the Wolverine and Gulkana Glacier basins, respectively. Without taking into account measurement error, the values are still within the range achieved by the more computationally expensive codes tested over shorter time periods.
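    The skill metric quoted in this record, the Nash-Sutcliffe efficiency, can be computed directly from observed and simulated flows: a value of 1 is a perfect match, and values near 0.87 indicate good agreement. A minimal sketch with illustrative data:

```python
# Nash-Sutcliffe efficiency (NSE), the streamflow skill score used in
# the PRMSglacier validation:
#   NSE = 1 - sum((sim - obs)^2) / sum((obs - mean(obs))^2)
# The flow values below are illustrative, not from the study.

def nash_sutcliffe(observed, simulated):
    mean_obs = sum(observed) / len(observed)
    sse = sum((s - o) ** 2 for o, s in zip(observed, simulated))
    variance = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / variance

obs = [10.0, 12.0, 14.0, 16.0]
sim = [10.5, 11.5, 14.5, 15.5]
print(nash_sutcliffe(obs, sim))
```

NSE = 0 means the model is no better than predicting the observed mean, and negative values mean it is worse, which is why values of 0.86 to 0.87 over 42 years are a strong result.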

  11. Computer use changes generalization of movement learning.

    PubMed

    Wei, Kunlin; Yan, Xiang; Kong, Gaiqing; Yin, Cong; Zhang, Fan; Wang, Qining; Kording, Konrad Paul

    2014-01-06

    Over the past few decades, one of the most salient lifestyle changes for us has been the use of computers. For many of us, manual interaction with a computer occupies a large portion of our working time. Through neural plasticity, this extensive movement training should change our representation of movements (e.g., [1-3]), just like search engines affect memory [4]. However, how computer use affects motor learning is largely understudied. Additionally, as virtually all participants in studies of perception and actions are computer users, a legitimate question is whether insights from these studies bear the signature of computer-use experience. We compared non-computer users with age- and education-matched computer users in standard motor learning experiments. We found that people learned equally fast but that non-computer users generalized significantly less across space, a difference negated by two weeks of intensive computer training. Our findings suggest that computer-use experience shaped our basic sensorimotor behaviors, and this influence should be considered whenever computer users are recruited as study participants.

  12. Changes in posture through the use of simple inclines with notebook computers placed on a standard desk.

    PubMed

    Asundi, Krishna; Odell, Dan; Luce, Adam; Dennerlein, Jack T

    2012-03-01

    This study evaluated the use of simple inclines as a portable peripheral for improving head and neck postures during notebook computer use on tables in portable environments such as hotel rooms, cafés, and airport lounges. A 3D motion analysis system measured head, neck and right upper extremity postures of 15 participants as they completed a 10 min computer task in six different configurations, all on a fixed height desk: no-incline, 12° incline, 25° incline, no-incline with external mouse, 25° incline with an external mouse, and a commercially available riser with external mouse and keyboard. After completion of the task, subjects rated the configuration for comfort and ease of use and indicated perceived discomfort in several body segments. Compared to the no-incline configuration, use of the 12° incline reduced forward head tilt and neck flexion while increasing wrist extension. The 25° incline further reduced head tilt and neck flexion while further increasing wrist extension. The 25° incline received the lowest comfort and ease of use ratings and the highest perceived discomfort score. For portable, temporary computing environments where internal input devices are used, users may find improved head and neck postures with acceptable wrist extension postures with the utilization of a 12° incline.

  13. Valgus extension overload syndrome and stress injury of the olecranon.

    PubMed

    Ahmad, Christopher S; ElAttrache, Neal S

    2004-10-01

    Basic science studies have improved our understanding of the pathomechanics for valgus extension overload and olecranon stress fractures. These disorders result from repetitive abutment of the olecranon into the olecranon fossa combined with valgus torques, resulting in impaction and shear along the posteromedial olecranon. The patient history and physical examination are similar for each disorder. Imaging studies including plain radiographs, computed tomography, MRI or bone scan may be necessary for accurate diagnosis. Clinical and basic science support mandatory and careful assessment of the medial collateral ligament when valgus extension overload is identified and limited debridement of the olecranon when surgery is indicated. For stress fractures that fail nonoperative management, treatment with internal fixation provides good results.

  14. Practice in Computer-Based Testing and Performance on the National Certification Examination for Nurse Anesthetists

    ERIC Educational Resources Information Center

    Dosch, Michael P.

    2010-01-01

    The general aim of the present retrospective study was to examine the test mode effect, that is, the difference in performance when tests are taken on computer (CBT), or by paper and pencil (PnP). The specific purpose was to examine the degree to which extensive practice in CBT in graduate students in nurse anesthesia would raise scores on a…

  15. Computing Evans functions numerically via boundary-value problems

    NASA Astrophysics Data System (ADS)

    Barker, Blake; Nguyen, Rose; Sandstede, Björn; Ventura, Nathaniel; Wahl, Colin

    2018-03-01

    The Evans function has been used extensively to study spectral stability of travelling-wave solutions in spatially extended partial differential equations. To compute Evans functions numerically, several shooting methods have been developed. In this paper, an alternative scheme for the numerical computation of Evans functions is presented that relies on an appropriate boundary-value problem formulation. Convergence of the algorithm is proved, and several examples, including the computation of eigenvalues for a multi-dimensional problem, are given. The main advantage of the scheme proposed here compared with earlier methods is that the scheme is linear and scalable to large problems.

  16. Effect of Gender on Computer Use and Attitudes of College Seniors

    NASA Astrophysics Data System (ADS)

    McCoy, Leah P.; Heafner, Tina L.

    Male and female students have historically had different computer attitudes and levels of computer use. These equity issues are of interest to researchers and practitioners who seek to understand why a digital divide exists between men and women. In this study, these questions were examined in an intensive computing environment in which all students at one university were issued identical laptop computers and used them extensively for 4 years. Self-reported computer use was examined for effects of gender. Attitudes toward computers were also assessed and compared for male and female students. The results indicated that when the technological environment was institutionally equalized for male and female students, many traditional findings of gender differences were not evident.

  17. Emulating a million machines to investigate botnets.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rudish, Donald W.

    2010-06-01

    Researchers at Sandia National Laboratories in Livermore, California are creating what is in effect a vast digital petri dish able to hold one million operating systems at once in an effort to study the behavior of rogue programs known as botnets. Botnets are used extensively by malicious computer hackers to steal computing power from Internet-connected computers. The hackers harness the stolen resources into a scattered but powerful computer that can be used to send spam, execute phishing scams or steal digital information. These remote-controlled 'distributed computers' are difficult to observe and track. Botnets may take over parts of tens of thousands or in some cases even millions of computers, making them among the world's most powerful computers for some applications.

  18. Coupled Aerodynamic and Structural Sensitivity Analysis of a High-Speed Civil Transport

    NASA Technical Reports Server (NTRS)

    Mason, B. H.; Walsh, J. L.

    2001-01-01

    An objective of the High Performance Computing and Communication Program at the NASA Langley Research Center is to demonstrate multidisciplinary shape and sizing optimization of a complete aerospace vehicle configuration by using high-fidelity, finite-element structural analysis and computational fluid dynamics aerodynamic analysis. In a previous study, a multidisciplinary analysis system for a high-speed civil transport was formulated to integrate a set of existing discipline analysis codes, some of them computationally intensive. This paper extends that study by formulating and implementing the sensitivity analysis for the coupled aerodynamic and structural analysis problem. Uncoupled stress sensitivities computed with a constant load vector in a commercial finite element analysis code are compared to coupled aeroelastic sensitivities computed by finite differences. The computational expense of these sensitivity calculation methods is discussed.

  19. Two-phase computed tomography study of warthin tumor of parotid gland: differentiation from other parotid gland tumors and its pathologic explanation.

    PubMed

    Woo, Seung Hoon; Choi, Dae-Seob; Kim, Jin-pyeong; Park, Jung Je; Joo, Yeon Hee; Chung, Phil-Sang; Kim, Bo-Young; Ko, Young-Hyeh; Jeong, Han-Sin; Kim, Hyung-Jin

    2013-01-01

    The objective of this study was to define the radiological characteristics of 2-phase computed tomography (CT) of parotid gland Warthin tumors (WTs) with a pathologic basis for these findings. We prospectively enrolled 116 patients with parotid gland tumor who underwent preoperative 2-phase CT scans (scanning delays of 30 and 120 seconds). The attenuation changes and enhancement patterns were analyzed according to pathology. We also evaluated size-matched samples of WTs and pleomorphic adenoma by staining CD31, vascular endothelial growth factor-receptor 2, collagen IV, and smooth muscle actin. Computed tomography numbers in WTs were significantly higher than those in other tumors in early-phase scans and lower in delayed scans. Pathologically, CD31(+) blood vessel area was significantly higher in WTs than in pleomorphic adenomas. In addition, WTs had an extensive capillary network and many leaky blood vessels. The enhancement pattern of early fill-in and early washout is the typical finding of WTs on 2-phase CT scans, which may be attributed pathologically to abundant blood vessels and an extensive capillary network.

  20. PRINCESS: Privacy-protecting Rare disease International Network Collaboration via Encryption through Software guard extensionS.

    PubMed

    Chen, Feng; Wang, Shuang; Jiang, Xiaoqian; Ding, Sijie; Lu, Yao; Kim, Jihoon; Sahinalp, S Cenk; Shimizu, Chisato; Burns, Jane C; Wright, Victoria J; Png, Eileen; Hibberd, Martin L; Lloyd, David D; Yang, Hai; Telenti, Amalio; Bloss, Cinnamon S; Fox, Dov; Lauter, Kristin; Ohno-Machado, Lucila

    2017-03-15

    We introduce PRINCESS, a privacy-preserving international collaboration framework for analyzing rare disease genetic data that are distributed across different continents. PRINCESS leverages Software Guard Extensions (SGX) and hardware for trustworthy computation. Unlike a traditional international collaboration model, where individual-level patient DNA data are physically centralized at a single site, PRINCESS performs a secure and distributed computation over encrypted data, fulfilling institutional policies and regulations for protected health information. To demonstrate PRINCESS' performance and feasibility, we conducted a family-based allelic association study for Kawasaki Disease, with data hosted in three different continents. The experimental results show that PRINCESS provides secure and accurate analyses much faster than alternative solutions, such as homomorphic encryption and garbled circuits (over 40 000× faster). https://github.com/achenfengb/PRINCESS_opensource. shw070@ucsd.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  1. Modernization and optimization of a legacy open-source CFD code for high-performance computing architectures

    NASA Astrophysics Data System (ADS)

    Gel, Aytekin; Hu, Jonathan; Ould-Ahmed-Vall, ElMoustapha; Kalinkin, Alexander A.

    2017-02-01

    Legacy codes remain a crucial element of today's simulation-based engineering ecosystem due to the extensive validation process and investment in such software. The rapid evolution of high-performance computing architectures necessitates the modernization of these codes. One approach to modernization is a complete overhaul of the code. However, this could require extensive investments, such as rewriting in modern languages, new data constructs, etc., which will necessitate systematic verification and validation to re-establish the credibility of the computational models. The current study advocates a more incremental approach and is a culmination of several modernization efforts of the legacy code MFIX, which is an open-source computational fluid dynamics code that has evolved over several decades, widely used in multiphase flows and still being developed by the National Energy Technology Laboratory. Two different modernization approaches, 'bottom-up' and 'top-down', are illustrated. Preliminary results show up to an 8.5× improvement at the selected kernel level with the first approach, and up to a 50% improvement in total simulated time with the latter, for the demonstration cases and target HPC systems employed.

  2. Precision of lumbar intervertebral measurements: does a computer-assisted technique improve reliability?

    PubMed

    Pearson, Adam M; Spratt, Kevin F; Genuario, James; McGough, William; Kosman, Katherine; Lurie, Jon; Sengupta, Dilip K

    2011-04-01

    Comparison of intra- and interobserver reliability of digitized manual and computer-assisted intervertebral motion measurements and classification of "instability." To determine if computer-assisted measurement of lumbar intervertebral motion on flexion-extension radiographs improves reliability compared with digitized manual measurements. Many studies have questioned the reliability of manual intervertebral measurements, although few have compared the reliability of computer-assisted and manual measurements on lumbar flexion-extension radiographs. Intervertebral rotation, anterior-posterior (AP) translation, and change in anterior and posterior disc height were measured with a digitized manual technique by three physicians and by three other observers using computer-assisted quantitative motion analysis (QMA) software. Each observer measured 30 sets of digital flexion-extension radiographs (L1-S1) twice. Shrout-Fleiss intraclass correlation coefficients for intra- and interobserver reliabilities were computed. The stability of each level was also classified (instability defined as >4 mm AP translation or >10° rotation), and the intra- and interobserver reliabilities of the two methods were compared using adjusted percent agreement (APA). Intraobserver reliability intraclass correlation coefficients were substantially higher for the QMA technique than for the digitized manual technique across all measurements: rotation 0.997 versus 0.870, AP translation 0.959 versus 0.557, change in anterior disc height 0.962 versus 0.770, and change in posterior disc height 0.951 versus 0.283. The same pattern was observed for interobserver reliability (rotation 0.962 vs. 0.693, AP translation 0.862 vs. 0.151, change in anterior disc height 0.862 vs. 0.373, and change in posterior disc height 0.730 vs. 0.300). The QMA technique was also more reliable for the classification of "instability." Intraobserver APAs ranged from 87% to 97% for QMA versus 60% to 73% for digitized manual measurements, while interobserver APAs ranged from 91% to 96% for QMA versus 57% to 63% for digitized manual measurements. The use of QMA software substantially improved the reliability of lumbar intervertebral measurements and the classification of instability based on flexion-extension radiographs.
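
    The instability criterion quoted in the abstract (>4 mm AP translation or 10° rotation) is simple enough to sketch directly. The function name, signature, and units handling below are illustrative assumptions, not the study's software:

```python
def classify_instability(ap_translation_mm, rotation_deg,
                         translation_limit_mm=4.0, rotation_limit_deg=10.0):
    """Flag a lumbar level as 'unstable' when AP translation exceeds 4 mm
    or intervertebral rotation exceeds 10 degrees (thresholds from the
    abstract; the interface here is hypothetical)."""
    return (ap_translation_mm > translation_limit_mm
            or rotation_deg > rotation_limit_deg)
```

    For example, a level measured at 5.2 mm of AP translation is flagged regardless of its rotation, while 3.9 mm with 9.9° of rotation is not.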

  3. Using Predictability for Lexical Segmentation

    ERIC Educational Resources Information Center

    Çöltekin, Çagri

    2017-01-01

    This study investigates a strategy based on predictability of consecutive sub-lexical units in learning to segment a continuous speech stream into lexical units using computational modeling and simulations. Lexical segmentation is one of the early challenges during language acquisition, and it has been studied extensively through psycholinguistic…

  4. An In-House Prototype for the Implementation of Computer-Based Extensive Reading in a Limited-Resource School

    ERIC Educational Resources Information Center

    Mayora, Carlos A.; Nieves, Idami; Ojeda, Victor

    2014-01-01

    A variety of computer-based models of Extensive Reading have emerged in the last decade. Different Information and Communication Technologies online usually support these models. However, such innovations are not feasible in contexts where the digital breach limits the access to Internet. The purpose of this paper is to report a project in which…

  5. Flexion-relaxation ratio in computer workers with and without chronic neck pain.

    PubMed

    Pinheiro, Carina Ferreira; dos Santos, Marina Foresti; Chaves, Thais Cristina

    2016-02-01

    This study evaluated the flexion-relaxation phenomenon (FRP) and flexion-relaxation ratios (FR-ratios) using surface electromyography (sEMG) of the cervical extensor muscles of computer workers with and without chronic neck pain, as well as of healthy subjects who were not computer users. This study comprised 60 subjects 20-45 years of age, of which 20 were computer workers with chronic neck pain (CPG), 20 were computer workers without neck pain (NPG), and 20 were control individuals who do not use computers for work and use them less than 4 h/day for other purposes (CG). FRP and FR-ratios were analyzed using sEMG of the cervical extensors. Analysis of FR-ratios showed smaller values in the semispinalis capitis muscles of the two groups of workers compared to the control group. The reference FR-ratio (flexion relaxation ratio [FRR], defined as the maximum activity in 1 s of the re-extension/full flexion sEMG activity) was significantly higher in the computer workers with neck pain compared to the CG (CPG: 3.10, 95% confidence interval [CI95%] 2.50-3.70; NPG: 2.33, CI95% 1.93-2.74; CG: 1.99, CI95% 1.81-2.17; p<0.001). The FR-ratios and FRR of sEMG in this study suggested that computer use could increase recruitment of the semispinalis capitis during neck extension (concentric and eccentric phases), which could explain our results. These results also suggest that the FR-ratios of the semispinalis may be a potential functional predictive neuromuscular marker of asymptomatic neck musculoskeletal disorders since even asymptomatic computer workers showed altered values. On the other hand, the FRR values of the semispinalis capitis demonstrated a good discriminative ability to detect neck pain, and such results suggested that each FR-ratio could have a different application. Copyright © 2016 Elsevier Ltd. All rights reserved.
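
    The reference FRR defined above (peak 1-second activity during re-extension divided by full-flexion activity) can be sketched as follows. The RMS windowing, overlap, and sampling-rate handling are assumptions of this sketch, not the authors' processing pipeline:

```python
import numpy as np

def flexion_relaxation_ratio(reextension_emg, full_flexion_emg, fs=1000):
    """Hypothetical FRR sketch: maximum 1-second RMS window during the
    re-extension phase divided by the RMS of the full-flexion phase."""
    rms = lambda x: float(np.sqrt(np.mean(np.square(x))))
    win = fs                     # samples in one second
    step = max(1, win // 4)      # 75% window overlap (assumed)
    peaks = [rms(reextension_emg[i:i + win])
             for i in range(0, len(reextension_emg) - win + 1, step)]
    return max(peaks) / rms(full_flexion_emg)
```

    On this definition, a ratio near 1 indicates the extensors remain as active in full flexion as in re-extension (absence of the relaxation phenomenon), while higher values indicate stronger relaxation at full flexion relative to the re-extension burst.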

  6. Impact of computer use on children's vision.

    PubMed

    Kozeis, N

    2009-10-01

    Today, millions of children use computers on a daily basis. Extensive viewing of the computer screen can lead to eye discomfort, fatigue, blurred vision and headaches, dry eyes and other symptoms of eyestrain. These symptoms may be caused by poor lighting, glare, an improper work station set-up, vision problems of which the person was not previously aware, or a combination of these factors. Children can experience many of the same symptoms related to computer use as adults. However, some unique aspects of how children use computers may make them more susceptible than adults to the development of these problems. In this study, the most common eye symptoms related to computer use in childhood, the possible causes and ways to avoid them are reviewed.

  7. Evolutionary and biological metaphors for engineering design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jakiela, M.

    1994-12-31

    Since computing became generally available, there has been strong interest in using computers to assist and automate engineering design processes. Specifically, for design optimization and automation, nonlinear programming and artificial intelligence techniques have been extensively studied. New computational techniques, based upon the natural processes of evolution, adaptation, and learning, are showing promise because of their generality and robustness. This presentation will describe the use of two such techniques, genetic algorithms and classifier systems, for a variety of engineering design problems. Structural topology optimization, meshing, and general engineering optimization are shown as example applications.

  8. Generating finite cyclic and dihedral groups using sequential insertion systems with interactions

    NASA Astrophysics Data System (ADS)

    Fong, Wan Heng; Sarmin, Nor Haniza; Turaev, Sherzod; Yosman, Ahmad Firdaus

    2017-04-01

    The operation of insertion has been studied extensively throughout the years for its impact in many areas of theoretical computer science such as DNA computing. Since its introduction as a generalization of the concatenation operation, many variants of insertion have been proposed, each with its own computational properties. In this paper, we introduce a new variant, called sequential insertion systems with interactions, that enables the generation of some special types of groups. We show that these new systems are able to generate all finite cyclic and dihedral groups.

  9. ANTLR Tree Grammar Generator and Extensions

    NASA Technical Reports Server (NTRS)

    Craymer, Loring

    2005-01-01

    A computer program implements two extensions of ANTLR (Another Tool for Language Recognition), which is a set of software tools for translating source codes between different computing languages. ANTLR supports predicated-LL(k) lexer and parser grammars, a notation for annotating parser grammars to direct tree construction, and predicated tree grammars. [LL(k) signifies left-right, leftmost derivation with k tokens of look-ahead, referring to certain characteristics of a grammar.] One of the extensions is a syntax for tree transformations. The other extension is the generation of tree grammars from annotated parser or input tree grammars. These extensions can simplify the process of generating source-to-source language translators and they make possible an approach, called "polyphase parsing," to translation between computing languages. The typical approach to translator development is to identify high-level semantic constructs such as "expressions," "declarations," and "definitions" as fundamental building blocks in the grammar specification used for language recognition. The polyphase approach is to lump ambiguous syntactic constructs during parsing and then disambiguate the alternatives in subsequent tree transformation passes. Polyphase parsing is believed to be useful for generating efficient recognizers for C++ and other languages that, like C++, have significant ambiguities.

  10. Computational strategies for three-dimensional flow simulations on distributed computer systems

    NASA Technical Reports Server (NTRS)

    Sankar, Lakshmi N.; Weed, Richard A.

    1995-01-01

    This research effort is directed towards an examination of issues involved in porting large computational fluid dynamics codes in use within the industry to a distributed computing environment. This effort addresses strategies for implementing the distributed computing in a device independent fashion and load balancing. A flow solver called TEAM presently in use at Lockheed Aeronautical Systems Company was acquired to start this effort. The following tasks were completed: (1) The TEAM code was ported to a number of distributed computing platforms including a cluster of HP workstations located in the School of Aerospace Engineering at Georgia Tech; a cluster of DEC Alpha Workstations in the Graphics visualization lab located at Georgia Tech; a cluster of SGI workstations located at NASA Ames Research Center; and an IBM SP-2 system located at NASA ARC. (2) A number of communication strategies were implemented. Specifically, the manager-worker strategy and the worker-worker strategy were tested. (3) A variety of load balancing strategies were investigated. Specifically, the static load balancing, task queue balancing and the Crutchfield algorithm were coded and evaluated. (4) The classical explicit Runge-Kutta scheme in the TEAM solver was replaced with an LU implicit scheme. And (5) the implicit TEAM-PVM solver was extensively validated through studies of unsteady transonic flow over an F-5 wing, undergoing combined bending and torsional motion. These investigations are documented in extensive detail in the dissertation, 'Computational Strategies for Three-Dimensional Flow Simulations on Distributed Computing Systems', enclosed as an appendix.

  11. Computational strategies for three-dimensional flow simulations on distributed computer systems

    NASA Astrophysics Data System (ADS)

    Sankar, Lakshmi N.; Weed, Richard A.

    1995-08-01

    This research effort is directed towards an examination of issues involved in porting large computational fluid dynamics codes in use within the industry to a distributed computing environment. This effort addresses strategies for implementing the distributed computing in a device independent fashion and load balancing. A flow solver called TEAM presently in use at Lockheed Aeronautical Systems Company was acquired to start this effort. The following tasks were completed: (1) The TEAM code was ported to a number of distributed computing platforms including a cluster of HP workstations located in the School of Aerospace Engineering at Georgia Tech; a cluster of DEC Alpha Workstations in the Graphics visualization lab located at Georgia Tech; a cluster of SGI workstations located at NASA Ames Research Center; and an IBM SP-2 system located at NASA ARC. (2) A number of communication strategies were implemented. Specifically, the manager-worker strategy and the worker-worker strategy were tested. (3) A variety of load balancing strategies were investigated. Specifically, the static load balancing, task queue balancing and the Crutchfield algorithm were coded and evaluated. (4) The classical explicit Runge-Kutta scheme in the TEAM solver was replaced with an LU implicit scheme. And (5) the implicit TEAM-PVM solver was extensively validated through studies of unsteady transonic flow over an F-5 wing, undergoing combined bending and torsional motion. These investigations are documented in extensive detail in the dissertation, 'Computational Strategies for Three-Dimensional Flow Simulations on Distributed Computing Systems', enclosed as an appendix.

  12. Tuning the cache memory usage in tomographic reconstruction on standard computers with Advanced Vector eXtensions (AVX)

    PubMed Central

    Agulleiro, Jose-Ignacio; Fernandez, Jose-Jesus

    2015-01-01

    Cache blocking is a technique widely used in scientific computing to minimize the exchange of information with main memory by reusing the data kept in cache memory. In tomographic reconstruction on standard computers using vector instructions, cache blocking turns out to be central to optimize performance. To this end, sinograms of the tilt-series and slices of the volumes to be reconstructed have to be divided into small blocks that fit into the different levels of cache memory. The code is then reorganized so as to operate with a block as much as possible before proceeding with another one. This data article is related to the research article titled Tomo3D 2.0 – Exploitation of Advanced Vector eXtensions (AVX) for 3D reconstruction (Agulleiro and Fernandez, 2015) [1]. Here we present data of a thorough study of the performance of tomographic reconstruction by varying cache block sizes, which allows derivation of expressions for their automatic quasi-optimal tuning. PMID:26217710
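
    The cache-blocking scheme described above can be sketched in a few lines: the slice is traversed tile by tile so each tile is fully processed before moving on. This is a generic NumPy illustration of the idea, not the Tomo3D code, and the block size is a placeholder for a value tuned to the cache hierarchy:

```python
import numpy as np

def blocked_apply(slice2d, op, block=64):
    """Process a 2-D slice in cache-sized tiles, finishing each tile before
    proceeding to the next (illustrative; real cache-blocking gains require
    a compiled language where loop order controls memory traffic)."""
    out = np.empty_like(slice2d)
    rows, cols = slice2d.shape
    for r in range(0, rows, block):
        for c in range(0, cols, block):
            tile = slice2d[r:r + block, c:c + block]
            out[r:r + block, c:c + block] = op(tile)
    return out
```

    The result is identical to applying `op` to the whole slice at once; only the order in which memory is touched changes, which is what the block-size tuning in the article optimizes.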

  13. Tuning the cache memory usage in tomographic reconstruction on standard computers with Advanced Vector eXtensions (AVX).

    PubMed

    Agulleiro, Jose-Ignacio; Fernandez, Jose-Jesus

    2015-06-01

    Cache blocking is a technique widely used in scientific computing to minimize the exchange of information with main memory by reusing the data kept in cache memory. In tomographic reconstruction on standard computers using vector instructions, cache blocking turns out to be central to optimize performance. To this end, sinograms of the tilt-series and slices of the volumes to be reconstructed have to be divided into small blocks that fit into the different levels of cache memory. The code is then reorganized so as to operate with a block as much as possible before proceeding with another one. This data article is related to the research article titled Tomo3D 2.0 - Exploitation of Advanced Vector eXtensions (AVX) for 3D reconstruction (Agulleiro and Fernandez, 2015) [1]. Here we present data of a thorough study of the performance of tomographic reconstruction by varying cache block sizes, which allows derivation of expressions for their automatic quasi-optimal tuning.

  14. From Greeks to Today: Cipher Trees and Computer Cryptography.

    ERIC Educational Resources Information Center

    Grady, M. Tim; Brumbaugh, Doug

    1988-01-01

    Explores the use of computers for teaching mathematical models of transposition ciphers. Illustrates the ideas, includes activities and extensions, provides a mathematical model and includes computer programs to implement these topics. (MVL)
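
    A minimal mathematical model of the transposition ciphers discussed above can be sketched as a columnar transposition. This is a classroom illustration only, not the article's own program, and the 'X' padding is an assumption of the sketch:

```python
def columnar_encrypt(plaintext, key):
    """Columnar transposition: write the message row-wise under the key,
    then read out the columns in alphabetical order of the key letters
    (a classroom model, not a secure cipher)."""
    cols = len(key)
    padded = plaintext + "X" * (-len(plaintext) % cols)
    rows = [padded[i:i + cols] for i in range(0, len(padded), cols)]
    order = sorted(range(cols), key=lambda i: key[i])
    return "".join("".join(row[c] for row in rows) for c in order)
```

    For example, `columnar_encrypt("ATTACKATDAWN", "ZEBRA")` pads the message to three rows of five letters and reads the columns in the order A, B, E, R, Z, producing "CAXTTXTANADXAKW".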

  15. Comparison of the Hamstring Muscle Activity and Flexion-Relaxation Ratio between Asymptomatic Persons and Computer Work-related Low Back Pain Sufferers.

    PubMed

    Kim, Min-Hee; Yoo, Won-Gyu

    2013-05-01

    [Purpose] The purpose of this study was to compare the hamstring muscle (HAM) activities and flexion-relaxation ratios of an asymptomatic group and a computer work-related low back pain (LBP) group. [Subjects] For this study, we recruited 10 asymptomatic computer workers and 10 computer workers with work-related LBP. [Methods] We measured the RMS activity of each phase (flexion, full-flexion, and re-extension phase) of trunk flexion and calculated the flexion-relaxation (FR) ratio of the muscle activities of the flexion and full-flexion phases. [Results] In the computer work-related LBP group, the HAM muscle activity increased during the full-flexion phase compared to the asymptomatic group, and the FR ratio was also significantly higher. [Conclusion] We concluded that prolonged sitting by computer workers might cause this change in their HAM muscle activity pattern.

  16. Cloud Computing. Technology Briefing. Number 1

    ERIC Educational Resources Information Center

    Alberta Education, 2013

    2013-01-01

    Cloud computing is Internet-based computing in which shared resources, software and information are delivered as a service that computers or mobile devices can access on demand. Cloud computing is already used extensively in education. Free or low-cost cloud-based services are used daily by learners and educators to support learning, social…

  17. Design of the Digital Sky Survey DA and online system: A case history in the use of computer aided tools for data acquisition system design

    NASA Astrophysics Data System (ADS)

    Petravick, D.; Berman, E.; Nicinski, T.; Rechenmacher, R.; Oleynik, G.; Pordes, R.; Stoughton, C.

    1991-06-01

    As part of its expanding Astrophysics program, Fermilab is participating in the Digital Sky Survey (DSS). Fermilab is part of a collaboration involving the University of Chicago, Princeton University, and the Institute for Advanced Study (at Princeton). The DSS main results will be a photometric imaging survey and a redshift survey of galaxies and color-selected quasars over π steradians of the Northern Galactic Cap. This paper focuses on our use of Computer Aided Software Engineering (CASE) in specifying the data system for DSS. Extensions to standard methodologies were necessary to compensate for tool shortcomings and to improve communication amongst the collaboration members. One such important extension was the incorporation of CASE information into the specification document.

  18. Feasibility study for a numerical aerodynamic simulation facility. Volume 3: FMP language specification/user manual

    NASA Technical Reports Server (NTRS)

    Kenner, B. G.; Lincoln, N. R.

    1979-01-01

    The manual is intended to show the revisions and additions to the current STAR FORTRAN. The changes are made to incorporate an FMP (Flow Model Processor) for use in the Numerical Aerodynamic Simulation Facility (NASF) for the purpose of simulating fluid flow over three-dimensional bodies in wind tunnel environments and in free space. The FORTRAN programming language for the STAR-100 computer contains both CDC and unique STAR extensions to the standard FORTRAN. Several of the STAR FORTRAN extensions to standard FORTRAN allow the FORTRAN user to exploit the vector processing capabilities of the STAR computer. In STAR FORTRAN, vectors can be expressed with an explicit notation, functions are provided that return vector results, and special call statements enable access to any machine instruction.

  19. Extending fields in a level set method by solving a biharmonic equation

    NASA Astrophysics Data System (ADS)

    Moroney, Timothy J.; Lusmore, Dylan R.; McCue, Scott W.; McElwain, D. L. Sean

    2017-08-01

    We present an approach for computing extensions of velocities or other fields in level set methods by solving a biharmonic equation. The approach differs from other commonly used approaches to velocity extension because it deals with the interface fully implicitly through the level set function. No explicit properties of the interface, such as its location or the velocity on the interface, are required in computing the extension. These features lead to a particularly simple implementation using either a sparse direct solver or a matrix-free conjugate gradient solver. Furthermore, we propose a fast Poisson preconditioner that can be used to accelerate the convergence of the latter. We demonstrate the biharmonic extension on a number of test problems that serve to illustrate its effectiveness at producing smooth and accurate extensions near interfaces. A further feature of the method is the natural way in which it deals with symmetry and periodicity, ensuring through its construction that the extension field also respects these symmetries.
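
    A drastically simplified 1-D analogue can illustrate the biharmonic extension idea: pin the field where it is known (near the interface) and fill the remaining nodes by minimising the discrete bending energy ||D2 u||^2, whose minimiser satisfies a discrete biharmonic equation at the free nodes. This is a sketch of the concept under those assumptions, not the authors' level-set implementation:

```python
import numpy as np

def biharmonic_extend_1d(values, known):
    """Fill the unknown entries of 'values' by minimising ||D2 u||^2
    subject to u = values wherever 'known' is True. D2 is the standard
    [1, -2, 1] second-difference stencil, so the minimiser is the
    discrete biharmonic extension of the pinned data."""
    n = len(values)
    D2 = np.zeros((n - 2, n))
    for i in range(n - 2):
        D2[i, i:i + 3] = (1.0, -2.0, 1.0)
    known = np.asarray(known, dtype=bool)
    free = ~known
    u = np.asarray(values, dtype=float).copy()
    # Least-squares solve for the free nodes given the pinned ones.
    A = D2[:, free]
    b = -D2[:, known] @ u[known]
    u[free] = np.linalg.lstsq(A, b, rcond=None)[0]
    return u
```

    With two pinned values the zero-energy solutions are exactly the linear functions, so the extension through, say, u(3)=3 and u(7)=7 reproduces u(i)=i on the whole grid, which mirrors the smoothness the paper reports for its extension fields.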

  20. Use of Technology in the Household: An Exploratory Study

    ERIC Educational Resources Information Center

    Jackson, Barcus C.

    2010-01-01

    Since the 1980s, personal computer ownership has become ubiquitous, and people are increasingly using household technologies for a wide variety of purposes. Extensive research has resulted in useful models to explain workplace technology acceptance and household technology adoption. Studies have also found that the determinants underlying…

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boegh, P.; Hopkirk, R.; Junod, A.

    From international nuclear industries fair; Basel, Switzerland (16 Oct 1972). The extensive environmental studies performed in Switzerland for the cooling towers of the Kaiseraugst and Leibstadt Nuclear Power Plants are presented. The computer program SAUNA for the calculation of the cooling tower plume behavior is briefly described. The main results of the environmental studies are summarized. (8 references) (auth)

  2. The CALL-SLA Interface: Insights from a Second-Order Synthesis

    ERIC Educational Resources Information Center

    Plonsky, Luke; Ziegler, Nicole

    2016-01-01

    The relationship between computer-assisted language learning (CALL) and second language acquisition (SLA) has been studied both extensively, covering numerous subdomains, and intensively, resulting in hundreds of primary studies. It is therefore no surprise that CALL researchers, as in other areas of applied linguistics, have turned in recent…

  3. Force user's manual, revised

    NASA Technical Reports Server (NTRS)

    Jordan, Harry F.; Benten, Muhammad S.; Arenstorf, Norbert S.; Ramanan, Aruna V.

    1987-01-01

    A methodology for writing parallel programs for shared memory multiprocessors has been formalized as an extension to the Fortran language and implemented as a macro preprocessor. The extended language is known as the Force, and this manual describes how to write Force programs and execute them on the Flexible Computer Corporation Flex/32, the Encore Multimax and the Sequent Balance computers. The parallel extension macros are described in detail, but knowledge of Fortran is assumed.

  4. permGPU: Using graphics processing units in RNA microarray association studies.

    PubMed

    Shterev, Ivo D; Jung, Sin-Ho; George, Stephen L; Owzar, Kouros

    2010-06-16

    Many analyses of microarray association studies involve permutation, bootstrap resampling and cross-validation, that are ideally formulated as embarrassingly parallel computing problems. Given that these analyses are computationally intensive, scalable approaches that can take advantage of multi-core processor systems need to be developed. We have developed a CUDA based implementation, permGPU, that employs graphics processing units in microarray association studies. We illustrate the performance and applicability of permGPU within the context of permutation resampling for a number of test statistics. An extensive simulation study demonstrates a dramatic increase in performance when using permGPU on an NVIDIA GTX 280 card compared to an optimized C/C++ solution running on a conventional Linux server. permGPU is available as an open-source stand-alone application and as an extension package for the R statistical environment. It provides a dramatic increase in performance for permutation resampling analysis in the context of microarray association studies. The current version offers six test statistics for carrying out permutation resampling analyses for binary, quantitative and censored time-to-event traits.
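    The permutation-resampling computation that permGPU accelerates can be sketched on the CPU with NumPy (toy data; permGPU's actual test statistics and CUDA kernels are not reproduced here): each permutation of the trait labels is independent of the others, which is what makes the problem embarrassingly parallel.

    ```python
    # CPU sketch of permutation resampling for a microarray association study:
    # a two-sample t statistic per gene, compared against its permutation null.
    import numpy as np

    rng = np.random.default_rng(0)
    n_genes, n_samples = 200, 40
    expr = rng.normal(size=(n_genes, n_samples))     # toy expression matrix
    labels = np.array([0] * 20 + [1] * 20)           # binary trait

    def t_stats(x, y):
        """Vectorized two-sample t statistic for every gene at once."""
        a, b = x[:, y == 0], x[:, y == 1]
        num = a.mean(axis=1) - b.mean(axis=1)
        den = np.sqrt(a.var(axis=1, ddof=1) / a.shape[1] +
                      b.var(axis=1, ddof=1) / b.shape[1])
        return num / den

    observed = t_stats(expr, labels)

    n_perm = 500
    exceed = np.zeros(n_genes)
    for _ in range(n_perm):                          # embarrassingly parallel loop
        perm = rng.permutation(labels)
        exceed += np.abs(t_stats(expr, perm)) >= np.abs(observed)

    p_values = (exceed + 1) / (n_perm + 1)           # permutation p-values
    ```

    On a GPU, each iteration of the loop (or each gene within it) maps naturally onto an independent thread block, which is the source of the speedups reported in the abstract.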

  5. Cloud Based Educational Systems and Its Challenges and Opportunities and Issues

    ERIC Educational Resources Information Center

    Paul, Prantosh Kr.; Lata Dangwal, Kiran

    2014-01-01

    Cloud Computing (CC) is a set of hardware, software, networks, storage, services, and interfaces combined to deliver aspects of computing as a service. Cloud Computing (CC) uses central remote servers to maintain data and applications. Practically, Cloud Computing (CC) is an extension of Grid computing with independency and…

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bonachea, D.; Dickens, P.; Thakur, R.

    There is a growing interest in using Java as the language for developing high-performance computing applications. To be successful in the high-performance computing domain, however, Java must not only be able to provide high computational performance, but also high-performance I/O. In this paper, we first examine several approaches that attempt to provide high-performance I/O in Java - many of which are not obvious at first glance - and evaluate their performance on two parallel machines, the IBM SP and the SGI Origin2000. We then propose extensions to the Java I/O library that address the deficiencies in the Java I/O API and improve performance dramatically. The extensions add bulk (array) I/O operations to Java, thereby removing much of the overhead currently associated with array I/O in Java. We have implemented the extensions in two ways: in a standard JVM using the Java Native Interface (JNI) and in a high-performance parallel dialect of Java called Titanium. We describe the two implementations and present performance results that demonstrate the benefits of the proposed extensions.

  7. Computational open-channel hydraulics for movable-bed problems

    USGS Publications Warehouse

    Lai, Chintu; ,

    1990-01-01

    As a major branch of computational hydraulics, notable advances have been made in numerical modeling of unsteady open-channel flow since the beginning of the computer age. According to the broader definition and scope of 'computational hydraulics,' the basic concepts and technology of modeling unsteady open-channel flow have been systematically studied previously. As a natural extension, computational open-channel hydraulics for movable-bed problems are addressed in this paper. The introduction of the multimode method of characteristics (MMOC) has made the modeling of this class of unsteady flows both practical and effective. New modeling techniques are developed, thereby shedding light on several aspects of computational hydraulics. Some special features of movable-bed channel-flow simulation are discussed here in the same order as given by the author in the fixed-bed case.

  8. Modernization and optimization of a legacy open-source CFD code for high-performance computing architectures

    DOE PAGES

    Gel, Aytekin; Hu, Jonathan; Ould-Ahmed-Vall, ElMoustapha; ...

    2017-03-20

    Legacy codes remain a crucial element of today's simulation-based engineering ecosystem due to the extensive validation process and investment in such software. The rapid evolution of high-performance computing architectures necessitates the modernization of these codes. One approach to modernization is a complete overhaul of the code. However, this could require extensive investments, such as rewriting in modern languages, new data constructs, etc., which will necessitate systematic verification and validation to re-establish the credibility of the computational models. The current study advocates a more incremental approach and is a culmination of several modernization efforts of the legacy code MFIX, which is an open-source computational fluid dynamics code that has evolved over several decades, is widely used in multiphase flows, and is still being developed by the National Energy Technology Laboratory. Two different modernization approaches, 'bottom-up' and 'top-down', are illustrated. Preliminary results show up to 8.5x improvement at the selected kernel level with the first approach, and up to 50% improvement in total simulated time with the second, for the demonstration cases and target HPC systems employed.

  9. Extension of the TDCR model to compute counting efficiencies for radionuclides with complex decay schemes.

    PubMed

    Kossert, K; Cassette, Ph; Carles, A Grau; Jörg, G; Gostomski, Christoph Lierse V; Nähle, O; Wolf, Ch

    2014-05-01

    The triple-to-double coincidence ratio (TDCR) method is frequently used to measure the activity of radionuclides decaying by pure β emission or electron capture (EC). Some radionuclides with more complex decays have also been studied, but accurate calculations of decay branches which are accompanied by many coincident γ transitions have not yet been investigated. This paper describes recent extensions of the model to make efficiency computations for more complex decay schemes possible. In particular, the MICELLE2 program that applies a stochastic approach of the free parameter model was extended. With an improved code, efficiencies for β(-), β(+) and EC branches with up to seven coincident γ transitions can be calculated. Moreover, a new parametrization for the computation of electron stopping powers has been implemented to compute the ionization quenching function of 10 commercial scintillation cocktails. In order to demonstrate the capabilities of the TDCR method, the following radionuclides are discussed: (166m)Ho (complex β(-)/γ), (59)Fe (complex β(-)/γ), (64)Cu (β(-), β(+), EC and EC/γ) and (229)Th in equilibrium with its progenies (decay chain with many α, β and complex β(-)/γ transitions). © 2013 Published by Elsevier Ltd.

  10. Modernization and optimization of a legacy open-source CFD code for high-performance computing architectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gel, Aytekin; Hu, Jonathan; Ould-Ahmed-Vall, ElMoustapha

    Legacy codes remain a crucial element of today's simulation-based engineering ecosystem due to the extensive validation process and investment in such software. The rapid evolution of high-performance computing architectures necessitates the modernization of these codes. One approach to modernization is a complete overhaul of the code. However, this could require extensive investments, such as rewriting in modern languages, new data constructs, etc., which will necessitate systematic verification and validation to re-establish the credibility of the computational models. The current study advocates a more incremental approach and is a culmination of several modernization efforts of the legacy code MFIX, which is an open-source computational fluid dynamics code that has evolved over several decades, is widely used in multiphase flows, and is still being developed by the National Energy Technology Laboratory. Two different modernization approaches, 'bottom-up' and 'top-down', are illustrated. Preliminary results show up to 8.5x improvement at the selected kernel level with the first approach, and up to 50% improvement in total simulated time with the second, for the demonstration cases and target HPC systems employed.

  11. A security mechanism based on evolutionary game in fog computing.

    PubMed

    Sun, Yan; Lin, Fuhong; Zhang, Nan

    2018-02-01

    Fog computing is a distributed computing paradigm at the edge of the network and requires cooperation of users and sharing of resources. When users in fog computing open their resources, their devices are easily intercepted and attacked because they are accessed through wireless network and present an extensive geographical distribution. In this study, a credible third party was introduced to supervise the behavior of users and protect the security of user cooperation. A fog computing security mechanism based on human nervous system is proposed, and the strategy for a stable system evolution is calculated. The MATLAB simulation results show that the proposed mechanism can reduce the number of attack behaviors effectively and stimulate users to cooperate in application tasks positively.
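    The flavor of the evolutionary-game analysis can be sketched with two-strategy replicator dynamics under hypothetical payoffs (the paper's actual payoff model and supervision mechanism are not reproduced): when supervision makes cooperation the dominant strategy, the fraction of cooperating users converges to a stable equilibrium.

    ```python
    # Illustrative replicator dynamics for a cooperate-vs-defect game among
    # fog-computing users; the payoff matrix below is hypothetical.
    def replicator(x, payoff, steps=2000, dt=0.01):
        """Evolve the cooperator fraction x under two-strategy replicator dynamics."""
        for _ in range(steps):
            f_c = payoff[0][0] * x + payoff[0][1] * (1 - x)   # cooperator fitness
            f_d = payoff[1][0] * x + payoff[1][1] * (1 - x)   # defector fitness
            x += dt * x * (1 - x) * (f_c - f_d)
        return x

    # Hypothetical payoffs in which a supervising third party penalizes attacks,
    # so cooperating strictly dominates and x -> 1 from any interior start.
    payoff = [[3.0, 1.0],   # cooperate vs (cooperate, defect)
              [2.0, 0.0]]   # defect    vs (cooperate, defect)
    x_final = replicator(0.1, payoff)
    ```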

  12. Modular space station, phase B extension. Information management advanced development. Volume 4: Data processing assembly

    NASA Technical Reports Server (NTRS)

    Gerber, C. R.

    1972-01-01

    The computation and logical functions which are performed by the data processing assembly of the modular space station are defined. The subjects discussed are: (1) requirements analysis, (2) baseline data processing assembly configuration, (3) information flow study, (4) throughput simulation, (5) redundancy study, (6) memory studies, and (7) design requirements specification.

  13. Case Study of a Small Scale Polytechnic Entrepreneurship Capstone Course Sequence

    ERIC Educational Resources Information Center

    Webster, Rustin D.; Kopp, Richard

    2017-01-01

    A multidisciplinary entrepreneurial senior capstone has been created for engineering technology students at a research I land-grant university statewide extension. The two semester course sequence welcomes students from Mechanical Engineering Technology, Electrical Engineering Technology, Computer Graphics Technology, and Organizational…

  14. Advances in Artificial Neural Networks - Methodological Development and Application

    USDA-ARS?s Scientific Manuscript database

    Artificial neural networks as a major soft-computing technology have been extensively studied and applied during the last three decades. Research on backpropagation training algorithms for multilayer perceptron networks has spurred development of other neural network training algorithms for other ne...

  15. An Extensive X-ray Computed Tomography Evaluation of a Fully Penetrated Encapsulated SiC MMC Ballistic Panel

    DTIC Science & Technology

    2009-04-01

    An Extensive X-ray Computed Tomography Evaluation of a Fully Penetrated Encapsulated SiC MMC Ballistic Panel, by William H. Green and Robert H. Carter, Weapons and Materials Research Directorate, ARL.

  16. Reanalysis, compatibility and correlation in analysis of modified antenna structures

    NASA Technical Reports Server (NTRS)

    Levy, R.

    1989-01-01

    A simple computational procedure is synthesized to process changes in the microwave-antenna pathlength-error measure when there are changes in the antenna structure model. The procedure employs structural modification reanalysis methods combined with new extensions of correlation analysis to provide the revised rms pathlength error. Mainframe finite-element-method processing of the structure model is required only for the initial unmodified structure, and elementary postprocessor computations develop and deal with the effects of the changes. Several illustrative computational examples are included. The procedure adapts readily to processing spectra of changes for parameter studies or sensitivity analyses.

  17. On the use and computation of the Jordan canonical form in system theory

    NASA Technical Reports Server (NTRS)

    Sridhar, B.; Jordan, D.

    1974-01-01

    This paper investigates various aspects of the application of the Jordan canonical form of a matrix in system theory and develops a computational approach to determining the Jordan form for a given matrix. Applications include pole placement, controllability and observability studies, serving as an intermediate step in yielding other canonical forms, and theorem proving. The computational method developed in this paper is both simple and efficient. The method is based on the definition of a generalized eigenvector and a natural extension of Gauss elimination techniques. Examples are included for demonstration purposes.
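    The generalized-eigenvector idea can be illustrated on a small matrix whose Jordan structure is known in advance (the eigenvectors below were found by hand; the paper's Gauss-elimination procedure for the general case is not reproduced):

    ```python
    # Building a Jordan basis from a generalized-eigenvector chain.
    import numpy as np

    # A has eigenvalue 3 with algebraic multiplicity 2 but only one eigenvector,
    # plus a simple eigenvalue 2, so its Jordan form contains one 2x2 block.
    A = np.array([[3.0, 1.0, -1.0],
                  [0.0, 3.0, -1.0],
                  [0.0, 0.0,  2.0]])
    N = A - 3.0 * np.eye(3)

    v1 = np.array([1.0, 0.0, 0.0])               # eigenvector: N v1 = 0
    v2 = np.linalg.lstsq(N, v1, rcond=None)[0]   # generalized eigenvector: N v2 = v1
    v3 = np.array([0.0, 1.0, 1.0])               # eigenvector for eigenvalue 2

    P = np.column_stack([v1, v2, v3])            # Jordan basis
    J = np.linalg.solve(P, A @ P)                # J = P^{-1} A P, the Jordan form
    ```

    Here `J` comes out as the expected block structure with a 2x2 Jordan block for eigenvalue 3; for matrices with clustered or repeated eigenvalues in floating point, the computation is ill-conditioned, which is why the paper's exact elimination-based approach matters.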

  18. Automatic system for computer program documentation

    NASA Technical Reports Server (NTRS)

    Simmons, D. B.; Elliott, R. W.; Arseven, S.; Colunga, D.

    1972-01-01

    Work on a project to design an automatic system of computer program documentation aids was performed to determine what existing programs could be used effectively to document computer programs. Results of the study are included in the form of an extensive bibliography and working papers on appropriate operating systems, text editors, program editors, data structures, standards, decision tables, flowchart systems, and proprietary documentation aids. The preliminary design for an automated documentation system is also included. An actual program has been documented in detail to demonstrate the types of output that can be produced by the proposed system.

  19. Integration of Cloud resources in the LHCb Distributed Computing

    NASA Astrophysics Data System (ADS)

    Úbeda García, Mario; Méndez Muñoz, Víctor; Stagni, Federico; Cabarrou, Baptiste; Rauschmayr, Nathalie; Charpentier, Philippe; Closier, Joel

    2014-06-01

    This contribution describes how Cloud resources have been integrated in the LHCb Distributed Computing. LHCb is using its specific Dirac extension (LHCbDirac) as an interware for its Distributed Computing. So far, it was seamlessly integrating Grid resources and Computer clusters. The cloud extension of DIRAC (VMDIRAC) allows the integration of Cloud computing infrastructures. It is able to interact with multiple types of infrastructures in commercial and institutional clouds, supported by multiple interfaces (Amazon EC2, OpenNebula, OpenStack and CloudStack); it instantiates, monitors and manages Virtual Machines running on this aggregation of Cloud resources. Moreover, specifications for institutional Cloud resources proposed by the Worldwide LHC Computing Grid (WLCG), mainly by the High Energy Physics Unix Information Exchange (HEPiX) group, have been taken into account. Several initiatives and computing resource providers in the eScience environment have already deployed IaaS in production during 2013. Keeping this in mind, pros and cons of a cloud-based infrastructure have been studied in contrast with the current setup. As a result, this work addresses four different use cases which represent a major improvement on several levels of our infrastructure. We describe the solution implemented by LHCb for the contextualisation of the VMs based on the idea of Cloud Site. We report on operational experience of using in production several institutional Cloud resources that are thus becoming an integral part of the LHCb Distributed Computing resources. Furthermore, we also describe the gradual migration of our Service Infrastructure towards a fully distributed architecture following the Service as a Service (SaaS) model.

  20. DYNALIST II : A Computer Program for Stability and Dynamic Response Analysis of Rail Vehicle Systems : Volume 2. User's Manual.

    DOT National Transportation Integrated Search

    1975-02-01

    A methodology and a computer program, DYNALIST II, have been developed for computing the response of rail vehicle systems to sinusoidal or stationary random rail irregularities. The computer program represents an extension of the earlier DYNALIST pro...

  1. DYNALIST II : A Computer Program for Stability and Dynamic Response Analysis of Rail Vehicle Systems : Volume 1. Technical Report.

    DOT National Transportation Integrated Search

    1975-02-01

    A methodology and a computer program, DYNALIST II, have been developed for computing the response of rail vehicle systems to sinusoidal or stationary random rail irregularities. The computer program represents an extension of the earlier DYNALIST pro...

  2. 77 FR 64309 - Notice of Request for an Extension to a Currently Approved Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-19

    ..., fees, books, use of a laptop computer, and Leadership training. The program is conducted in accordance... graduating in food, agriculture, natural resources, and other related fields of study and to offer career... Program guidelines. The program is designed to integrate classroom study in a degreed university program...

  3. The Computational Infrastructure for Geodynamics as a Community of Practice

    NASA Astrophysics Data System (ADS)

    Hwang, L.; Kellogg, L. H.

    2016-12-01

    Computational Infrastructure for Geodynamics (CIG), geodynamics.org, originated in 2005 out of community recognition that the efforts of individual or small groups of researchers to develop scientifically sound software is impossible to sustain, duplicates effort, and makes it difficult for scientists to adopt state-of-the-art computational methods that promote new discovery. As a community of practice, participants in CIG share an interest in computational modeling in geodynamics and work together on open source software to build the capacity to support complex, extensible, scalable, interoperable, reliable, and reusable software in an effort to increase the return on investment in scientific software development and increase the quality of the resulting software. The group interacts regularly to learn from each other and better their practices formally through webinar series, workshops, and tutorials and informally through listservs and hackathons. Over the past decade, we have learned that successful scientific software development requires at a minimum: collaboration between domain-expert researchers, software developers and computational scientists; clearly identified and committed lead developer(s); well-defined scientific and computational goals that are regularly evaluated and updated; well-defined benchmarks and testing throughout development; attention throughout development to usability and extensibility; understanding and evaluation of the complexity of dependent libraries; and managed user expectations through education, training, and support. CIG's code donation standards provide the basis for recently formalized best practices in software development (geodynamics.org/cig/dev/best-practices/). Best practices include use of version control; widely used, open source software libraries; extensive test suites; portable configuration and build systems; extensive documentation internal and external to the code; and structured, human-readable input formats.

  4. The relationship between psychosocial work factors, work stress and computer-related musculoskeletal discomforts among computer users in Malaysia.

    PubMed

    Zakerian, Seyed Abolfazl; Subramaniam, Indra Devi

    2009-01-01

    Increasing numbers of workers use computers for work, so, especially among office workers, there is a high risk of musculoskeletal discomforts. This study examined the associations among 3 factors: psychosocial work factors, work stress and musculoskeletal discomforts. These associations were examined via a questionnaire survey of 30 office workers (at a university in Malaysia) whose jobs required extensive use of computers. The questionnaire was distributed and collected daily for 20 days. While the results indicated a significant relationship among psychosocial work factors, work stress and musculoskeletal discomfort, 3 psychosocial work factors were found to be more important than the others for both work stress and musculoskeletal discomfort: job demands, negative social interaction and computer-related problems. To further develop the study design, it is necessary to investigate industrial and other workers who have experienced musculoskeletal discomforts and work stress.

  5. A glacier runoff extension to the Precipitation Runoff Modeling System

    Treesearch

    A. E. Van Beusekom; R. J. Viger

    2016-01-01

    A module to simulate glacier runoff, PRMSglacier, was added to PRMS (Precipitation Runoff Modeling System), a distributed-parameter, physical-process hydrological simulation code. The extension does not require extensive on-glacier measurements or computational expense but still relies on physical principles over empirical relations as much as is feasible while...

  6. Errors in finite-difference computations on curvilinear coordinate systems

    NASA Technical Reports Server (NTRS)

    Mastin, C. W.; Thompson, J. F.

    1980-01-01

    Curvilinear coordinate systems were used extensively to solve partial differential equations on arbitrary regions. An analysis of truncation error in the computation of derivatives revealed why numerical results may be erroneous. A more accurate method of computing derivatives is presented.

  7. ELEVEN BROADCASTING EXPERIMENTS.

    ERIC Educational Resources Information Center

    PERRATON, HILARY D.

    A REVIEW IS MADE OF EXPERIMENTAL COURSES COMBINING THE USE OF RADIO, TELEVISION, AND CORRESPONDENCE STUDY AND GIVEN BY THE NATIONAL EXTENSION COLLEGE IN ENGLAND. COURSES INCLUDED ENGLISH, MATHEMATICS, SOCIAL WORK, PHYSICS, STATISTICS, AND COMPUTERS. TWO METHODS OF LINKING CORRESPONDENCE COURSES TO BROADCASTS WERE USED--IN MATHEMATICS AND SOCIAL…

  8. Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis :

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Ebeida, Mohamed Salah; Eldred, Michael S.

    The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the Dakota software and provides capability overviews and procedures for software execution, as well as a variety of example studies.

  9. ARDS User Manual

    NASA Technical Reports Server (NTRS)

    Fleming, David P.

    2001-01-01

    Personal computers (PCs) are now used extensively for engineering analysis; their capability exceeds that of mainframe computers of only a few years ago. Programs originally written for mainframes have been ported to PCs to make their use easier. One of these programs is ARDS (Analysis of Rotor Dynamic Systems), which was developed at Arizona State University (ASU) by Nelson et al. to quickly and accurately analyze rotor steady-state and transient response using the method of component mode synthesis. The original ARDS program was ported to the PC in 1995. Several extensions were made at ASU to increase the capability of mainframe ARDS; these extensions have also been incorporated into the PC version of ARDS. Each mainframe extension had its own user manual, generally covering only that extension, so exploiting the full capability of ARDS required a large set of user manuals. Moreover, necessary changes and enhancements for PC ARDS were undocumented. The present document is intended to remedy those problems by combining all pertinent information needed for the use of PC ARDS into one volume.

  10. AIP1OGREN: Aerosol Observing Station Intensive Properties Value-Added Product

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koontz, Annette; Flynn, Connor

    The aip1ogren value-added product (VAP) computes several aerosol intensive properties. It requires as input calibrated, corrected aerosol extensive properties (scattering and absorption coefficients, primarily) from the Aerosol Observing Station (AOS). Aerosol extensive properties depend on both the nature of the aerosol and the amount of the aerosol. We compute several properties as relationships between the various extensive properties. These intensive properties are independent of aerosol amount and instead relate to intrinsic properties of the aerosol itself. Along with the original extensive properties we report aerosol single-scattering albedo, hemispheric backscatter fraction, asymmetry parameter, and Ångström exponent for scattering and absorption with one-minute averaging. An hourly averaged file is produced from the 1-minute files that includes all extensive and intensive properties as well as submicron scattering and submicron absorption fractions. Finally, in both the minutely and hourly files the aerosol radiative forcing efficiency is provided.

  11. Computers and the Primary Curriculum 3-13.

    ERIC Educational Resources Information Center

    Crompton, Rob, Ed.

    This book is a comprehensive and practical guide to the use of computers across a wide age range. Extensive use is made of photographs, illustrations, cartoons, and samples of children's work to demonstrate the versatility of computer use in schools. An introduction by Rob Crompton placing computer use within the educational context of the United…

  12. Extensions and improvements on XTRAN3S

    NASA Technical Reports Server (NTRS)

    Borland, C. J.

    1989-01-01

    Improvements to the XTRAN3S computer program are summarized. Work on this code, for steady and unsteady aerodynamic and aeroelastic analysis in the transonic flow regime has concentrated on the following areas: (1) Maintenance of the XTRAN3S code, including correction of errors, enhancement of operational capability, and installation on the Cray X-MP system; (2) Extension of the vectorization concepts in XTRAN3S to include additional areas of the code for improved execution speed; (3) Modification of the XTRAN3S algorithm for improved numerical stability for swept, tapered wing cases and improved computational efficiency; and (4) Extension of the wing-only version of XTRAN3S to include pylon and nacelle or external store capability.

  13. Recent Theoretical Studies On Excitation and Recombination

    NASA Technical Reports Server (NTRS)

    Pradhan, Anil K.

    2000-01-01

    New advances in the theoretical treatment of atomic processes in plasmas are described. These enable not only an integrated, unified, and self-consistent treatment of important radiative and collisional processes, but also large-scale computation of atomic data with high accuracy. An extension of the R-matrix work, from excitation and photoionization to electron-ion recombination, includes a unified method that subsumes both the radiative and the dielectronic recombination processes in an ab initio manner. The extensive collisional calculations for iron and iron-peak elements under the Iron Project are also discussed.

  14. 20 CFR 704.103 - Removal of certain minimums when computing or paying compensation.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Removal of certain minimums when computing or... PROVISIONS FOR LHWCA EXTENSIONS Defense Base Act § 704.103 Removal of certain minimums when computing or... benefits are to be computed under section 9 of the LHWCA, 33 U.S.C. 909, shall not apply in computing...

  15. The Effects of a Robot Game Environment on Computer Programming Education for Elementary School Students

    ERIC Educational Resources Information Center

    Shim, Jaekwoun; Kwon, Daiyoung; Lee, Wongyu

    2017-01-01

    In the past, computer programming was perceived as a task only carried out by computer scientists; in the 21st century, however, computer programming is viewed as a critical and necessary skill that everyone should learn. In order to improve teaching of problem-solving abilities in a computing environment, extensive research is being done on…

  16. Computational solution of atmospheric chemistry problems

    NASA Technical Reports Server (NTRS)

    Jafri, J.; Ake, R. L.

    1986-01-01

    Extensive studies were performed on problems of interest in atmospheric chemistry. In addition to several minor projects, four major projects were performed and are described: theoretical studies of ground and low-lying excited states of ClO2; ground and excited state potential energy surfaces of the methyl peroxy radical; electronic states of the FO radical; and theoretical studies of SO2(H2O)n.

  17. Using spatial principles to optimize distributed computing for enabling the physical science discoveries

    PubMed Central

    Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing

    2011-01-01

    Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century. PMID:21444779

  18. Using spatial principles to optimize distributed computing for enabling the physical science discoveries.

    PubMed

    Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing

    2011-04-05

    Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century.

  19. Surface electromyogram for the control of anthropomorphic teleoperator fingers.

    PubMed

    Gupta, V; Reddy, N P

    1996-01-01

    The growing importance of telesurgery has led to the need to develop synergistic control of anthropomorphic teleoperators. Synergistic systems can be developed using direct biological control. The purpose of this study was to develop techniques for direct biocontrol of anthropomorphic teleoperators using the surface electromyogram (EMG). A computer model of a two-finger teleoperator was developed and controlled using surface EMG from the flexor digitorum superficialis during flexion-extension of the index finger. The results of the study revealed a linear relationship between the RMS EMG and the flexion-extension of the finger model. Surface EMG can therefore be used for direct biocontrol of teleoperators and in VR applications.
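    The linear RMS-EMG-to-angle relationship this record reports suggests a simple proportional mapping. A minimal sketch in Python; the gain, rest threshold, and signal values are all hypothetical illustrations, not taken from the study:

```python
import math

def rms(window):
    """Root-mean-square amplitude of one EMG sample window."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def emg_to_flexion(window, gain=90.0, rest_rms=0.05, max_angle=90.0):
    """Map RMS EMG linearly to a finger flexion angle in degrees.

    gain, rest_rms, and max_angle are illustrative values; a real
    controller would calibrate them per subject and per electrode site.
    """
    angle = gain * max(rms(window) - rest_rms, 0.0)
    return min(angle, max_angle)

# A quiet window commands ~0 degrees; a stronger burst commands more flexion.
quiet = [0.05 * math.sin(0.3 * i) for i in range(100)]
burst = [0.60 * math.sin(0.3 * i) for i in range(100)]
print(emg_to_flexion(quiet), emg_to_flexion(burst))
```

    In practice the window RMS would be recomputed on a sliding buffer of the rectified surface EMG and low-pass filtered before driving the model.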

  20. An Extension of the Mean Value Theorem for Integrals

    ERIC Educational Resources Information Center

    Khalili, Parviz; Vasiliu, Daniel

    2010-01-01

    In this note we present an extension of the mean value theorem for integrals. The extension we consider is motivated by an older result (here referred as Corollary 2), which is quite classical for the literature of Mathematical Analysis or Calculus. We also show an interesting application for computing the sum of a harmonic series.
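    For context, the classical result that the note extends (referred to there as Corollary 2) is the first mean value theorem for integrals:

```latex
% First mean value theorem for integrals: for f continuous on [a, b],
% there exists c in [a, b] such that
\int_a^b f(x)\,dx \;=\; f(c)\,(b - a).
```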

  1. Identifying Key Events in AOPs for Embryonic Disruption using Computational Toxicology (European Teratology Society - AOP symp.)

    EPA Science Inventory

    Addressing safety aspects of drugs and environmental chemicals relies extensively on animal testing; however, the quantity of chemicals needing assessment and challenges of species extrapolation require alternative approaches to traditional animal studies. Newer in vitro and in s...

  2. Development of Policy on the Telecommunications-Transportation Tradeoff, Final Report.

    ERIC Educational Resources Information Center

    Nilles, Jack M.; And Others

    To identify and evaluate the implications of potential communications and computer technology alternatives to urban transportation, an extensive research study was made of telecommuting--bringing workers together by communication instead of physically. An attempt was made to formulate practical statements on telecommuting network design, policies…

  3. NETL - Chemical Looping Reactor

    ScienceCinema

    None

    2018-02-14

    NETL's Chemical Looping Reactor (CLR) unit implements a high-temperature integrated chemical looping combustion (CLC) process with extensive instrumentation to improve computational simulations. A non-reacting test unit is also used to study solids flow at ambient temperature. The CLR unit circulates approximately 1,000 pounds per hour at temperatures around 1,800 degrees Fahrenheit.
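    For readers working in SI units, the operating point quoted above converts as follows (a quick sketch; only standard conversion factors are used):

```python
def f_to_c(deg_f):
    """Fahrenheit to Celsius."""
    return (deg_f - 32.0) * 5.0 / 9.0

LB_PER_KG = 2.20462  # pounds per kilogram

# Operating point quoted for the CLR unit.
rate_kg_per_hr = 1000.0 / LB_PER_KG   # ~453.6 kg/h solids circulation
temp_c = f_to_c(1800.0)               # ~982.2 degrees Celsius

print(round(rate_kg_per_hr, 1), round(temp_c, 1))
```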

  4. Campus Laptops: What Logistical and Technological Factors Are Perceived Critical?

    ERIC Educational Resources Information Center

    Cutshall, Robert; Changchit, Chuleeporn; Elwood, Susan

    2006-01-01

    This study examined university students' perceptions about a required laptop program. For higher education, providing experiences with computer tools tends to be one of the prerequisites to professional success because employers value extensive experience with information technology. Several universities are initiating laptop programs where all…

  5. Binding-Site Compatible Fragment Growing Applied to the Design of β2-Adrenergic Receptor Ligands.

    PubMed

    Chevillard, Florent; Rimmer, Helena; Betti, Cecilia; Pardon, Els; Ballet, Steven; van Hilten, Niek; Steyaert, Jan; Diederich, Wibke E; Kolb, Peter

    2018-02-08

    Fragment-based drug discovery is intimately linked to fragment extension approaches that can be accelerated using software for de novo design. Although computers allow for the facile generation of millions of suggestions, synthetic feasibility is often neglected. In this study we computationally extended, chemically synthesized, and experimentally assayed new ligands for the β2-adrenergic receptor (β2AR) by growing fragment-sized ligands. In order to address the synthetic tractability issue, our in silico workflow aims at derivatized products based on robust organic reactions. The study started from the predicted binding modes of five fragments. We suggested a total of eight diverse extensions that were easily synthesized, and further assays showed that four products had an improved affinity (up to 40-fold) compared to their respective initial fragment. The described workflow, which we call "growing via merging" and for which the key tools are available online, can improve early fragment-based drug discovery projects, making it a useful creative tool for medicinal chemists during structure-activity relationship (SAR) studies.

  6. Technology's Effect on Achievement in Higher Education: A Stage I Meta-Analysis of Classroom Applications

    ERIC Educational Resources Information Center

    Schmid, Richard F.; Bernard, Robert M.; Borokhovski, Eugene; Tamim, Rana; Abrami, Philip C.; Wade, C. Anne; Surkes, Michael A.; Lowerison, Gretchen

    2009-01-01

    This paper reports the findings of a Stage I meta-analysis exploring the achievement effects of computer-based technology use in higher education classrooms (non-distance education). An extensive literature search revealed more than 6,000 potentially relevant primary empirical studies. Analysis of a representative sample of 231 studies (k = 310)…

  7. Anniversary Paper: History and status of CAD and quantitative image analysis: The role of Medical Physics and AAPM

    PubMed Central

    Giger, Maryellen L.; Chan, Heang-Ping; Boone, John

    2008-01-01

    The roles of physicists in medical imaging have expanded over the years, from the study of imaging systems (sources and detectors) and dose to the assessment of image quality and perception, the development of image processing techniques, and the development of image analysis methods to assist in detection and diagnosis. The latter is a natural extension of medical physicists’ goals in developing imaging techniques to help physicians acquire diagnostic information and improve clinical decisions. Studies indicate that radiologists do not detect all abnormalities on images that are visible on retrospective review, and they do not always correctly characterize abnormalities that are found. Since the 1950s, the potential use of computers had been considered for analysis of radiographic abnormalities. In the mid-1980s, however, medical physicists and radiologists began major research efforts for computer-aided detection or computer-aided diagnosis (CAD), that is, using the computer output as an aid to radiologists—as opposed to a completely automatic computer interpretation—focusing initially on methods for the detection of lesions on chest radiographs and mammograms. Since then, extensive investigations of computerized image analysis for detection or diagnosis of abnormalities in a variety of 2D and 3D medical images have been conducted. The growth of CAD over the past 20 years has been tremendous—from the early days of time-consuming film digitization and CPU-intensive computations on a limited number of cases to its current status in which developed CAD approaches are evaluated rigorously on large clinically relevant databases. 
CAD research by medical physicists includes many aspects—collecting relevant normal and pathological cases; developing computer algorithms appropriate for the medical interpretation task including those for segmentation, feature extraction, and classifier design; developing methodology for assessing CAD performance; validating the algorithms using appropriate cases to measure performance and robustness; conducting observer studies with which to evaluate radiologists in the diagnostic task without and with the use of the computer aid; and ultimately assessing performance with a clinical trial. Medical physicists also have an important role in quantitative imaging, by validating the quantitative integrity of scanners and developing imaging techniques, and image analysis tools that extract quantitative data in a more accurate and automated fashion. As imaging systems become more complex and the need for better quantitative information from images grows, the future includes the combined research efforts from physicists working in CAD with those working on quantitative imaging systems to readily yield information on morphology, function, molecular structure, and more—from animal imaging research to clinical patient care. A historical review of CAD and a discussion of challenges for the future are presented here, along with the extension to quantitative image analysis. PMID:19175137

  8. Anniversary Paper: History and status of CAD and quantitative image analysis: The role of Medical Physics and AAPM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giger, Maryellen L.; Chan, Heang-Ping; Boone, John

    2008-12-15

    The roles of physicists in medical imaging have expanded over the years, from the study of imaging systems (sources and detectors) and dose to the assessment of image quality and perception, the development of image processing techniques, and the development of image analysis methods to assist in detection and diagnosis. The latter is a natural extension of medical physicists' goals in developing imaging techniques to help physicians acquire diagnostic information and improve clinical decisions. Studies indicate that radiologists do not detect all abnormalities on images that are visible on retrospective review, and they do not always correctly characterize abnormalities that are found. Since the 1950s, the potential use of computers had been considered for analysis of radiographic abnormalities. In the mid-1980s, however, medical physicists and radiologists began major research efforts for computer-aided detection or computer-aided diagnosis (CAD), that is, using the computer output as an aid to radiologists--as opposed to a completely automatic computer interpretation--focusing initially on methods for the detection of lesions on chest radiographs and mammograms. Since then, extensive investigations of computerized image analysis for detection or diagnosis of abnormalities in a variety of 2D and 3D medical images have been conducted. The growth of CAD over the past 20 years has been tremendous--from the early days of time-consuming film digitization and CPU-intensive computations on a limited number of cases to its current status in which developed CAD approaches are evaluated rigorously on large clinically relevant databases.
CAD research by medical physicists includes many aspects--collecting relevant normal and pathological cases; developing computer algorithms appropriate for the medical interpretation task including those for segmentation, feature extraction, and classifier design; developing methodology for assessing CAD performance; validating the algorithms using appropriate cases to measure performance and robustness; conducting observer studies with which to evaluate radiologists in the diagnostic task without and with the use of the computer aid; and ultimately assessing performance with a clinical trial. Medical physicists also have an important role in quantitative imaging, by validating the quantitative integrity of scanners and developing imaging techniques, and image analysis tools that extract quantitative data in a more accurate and automated fashion. As imaging systems become more complex and the need for better quantitative information from images grows, the future includes the combined research efforts from physicists working in CAD with those working on quantitative imaging systems to readily yield information on morphology, function, molecular structure, and more--from animal imaging research to clinical patient care. A historical review of CAD and a discussion of challenges for the future are presented here, along with the extension to quantitative image analysis.

  9. Concurrent extensions to the FORTRAN language for parallel programming of computational fluid dynamics algorithms

    NASA Technical Reports Server (NTRS)

    Weeks, Cindy Lou

    1986-01-01

    Experiments were conducted at NASA Ames Research Center to define multi-tasking software requirements for multiple-instruction, multiple-data stream (MIMD) computer architectures. The focus was on specifying solutions for algorithms in the field of computational fluid dynamics (CFD). The program objectives were to allow researchers to produce usable parallel application software as soon as possible after acquiring MIMD computer equipment, to provide researchers with an easy-to-learn and easy-to-use parallel software language which could be implemented on several different MIMD machines, and to enable researchers to list preferred design specifications for future MIMD computer architectures. Analysis of CFD algorithms indicated that extensions of an existing programming language, adaptable to new computer architectures, provided the best solution to meeting program objectives. The CoFORTRAN Language was written in response to these objectives and to provide researchers a means to experiment with parallel software solutions to CFD algorithms on machines with parallel architectures.
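    The data-parallel pattern CoFORTRAN targeted for CFD codes can be illustrated with a modern sketch, here in Python rather than CoFORTRAN: each task relaxes one strip of interior grid points per Jacobi sweep of a 1D Laplace problem. Grid size, worker count, and sweep count are illustrative, not from the report:

```python
from concurrent.futures import ThreadPoolExecutor

def jacobi_chunk(args):
    """One relaxation sweep over a slice of interior points (1D Laplace)."""
    u, lo, hi = args
    return [0.5 * (u[i - 1] + u[i + 1]) for i in range(lo, hi)]

def parallel_jacobi(u, workers=4, sweeps=50):
    """Run Jacobi sweeps, splitting the interior points across worker tasks."""
    n = len(u)
    bounds = [(1 + k * (n - 2) // workers, 1 + (k + 1) * (n - 2) // workers)
              for k in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for _ in range(sweeps):
            chunks = pool.map(jacobi_chunk, [(u, lo, hi) for lo, hi in bounds])
            # Keep the boundary values fixed; stitch the relaxed strips back.
            u = u[:1] + [v for chunk in chunks for v in chunk] + u[-1:]
    return u

# Fixed boundaries 0 and 1; the interior converges toward a linear profile.
u0 = [0.0] + [0.0] * 8 + [1.0]
result = parallel_jacobi(u0)
```

    Each sweep reads the previous iterate and writes a fresh list, so the concurrent strip updates never race; a distributed-memory version would instead exchange halo points between neighbors.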

  10. Redesigning the specificity of protein-DNA interactions with Rosetta.

    PubMed

    Thyme, Summer; Baker, David

    2014-01-01

    Building protein tools that can selectively bind or cleave specific DNA sequences requires efficient technologies for modifying protein-DNA interactions. Computational design is one method for accomplishing this goal. In this chapter, we present the current state of protein-DNA interface design with the Rosetta macromolecular modeling program. The LAGLIDADG endonuclease family of DNA-cleaving enzymes, under study as potential gene therapy reagents, has been the main testing ground for these in silico protocols. At this time, the computational methods are most useful for designing endonuclease variants that can accommodate small numbers of target site substitutions. Attempts to engineer for more extensive interface changes will likely benefit from an approach that uses the computational design results in conjunction with a high-throughput directed evolution or screening procedure. The family of enzymes presents an engineering challenge because their interfaces are highly integrated and there is significant coordination between the binding and catalysis events. Future developments in the computational algorithms depend on experimental feedback to improve understanding and modeling of these complex enzymatic features. This chapter presents both the basic method of design that has been successfully used to modulate specificity and more advanced procedures that incorporate DNA flexibility and other properties that are likely necessary for reliable modeling of more extensive target site changes.

  11. Effect of Preparation Depth on the Marginal and Internal Adaptation of Computer-aided Design/Computer-assisted Manufacture Endocrowns.

    PubMed

    Gaintantzopoulou, M D; El-Damanhoury, H M

    The aim of the study was to evaluate the effect of preparation depth and intraradicular extension on the marginal and internal adaptation of computer-aided design/computer-assisted manufacture (CAD/CAM) endocrown restorations. Standardized preparations were made in resin endodontic tooth models (Nissin Dental), with an intracoronal preparation depth of 2 mm (group H2), or with an extra 1-mm (group H3) or 2-mm (group H4) intraradicular extension in the root canals (n=12). Vita Enamic polymer-infiltrated ceramic-network material endocrowns were fabricated using the CEREC AC CAD/CAM system and were seated on the prepared teeth. Specimens were evaluated by microtomography. Horizontal and vertical tomographic sections were recorded and reconstructed by using the CTSkan software (TView v1.1, Skyscan). The surface/void volume (S/V) in the region of interest was calculated. Marginal gap (MG), absolute marginal discrepancy (MD), and internal marginal gap were measured at various measuring locations and reported in micrometers (μm). Marginal and internal discrepancy data (μm) were analyzed with nonparametric Kruskal-Wallis analysis of variance by ranks with Dunn's post hoc test, whereas S/V data were analyzed by one-way analysis of variance and Bonferroni multiple comparisons (α=0.05). Significant differences were found in MG, MD, and internal gap width values between the groups, with H2 showing the lowest values of all groups. S/V calculations presented significant differences between H2 and the other two groups (H3 and H4) tested, with H2 again showing the lowest values. Increasing the intraradicular extension of endocrown restorations increased their marginal and internal gaps.

  12. In Vivo Patellofemoral Contact Mechanics During Active Extension Using a Novel Dynamic MRI-based Methodology

    PubMed Central

    Borotikar, Bhushan S.; Sheehan, Frances T.

    2017-01-01

    Objectives: To establish an in vivo, normative patellofemoral cartilage contact mechanics database acquired during voluntary muscle control using a novel dynamic magnetic resonance (MR) imaging-based computational methodology, and to validate the contact mechanics sensitivity to the known sub-millimeter methodological inaccuracies. Design: Dynamic cine phase-contrast and multi-plane cine images were acquired while female subjects (n=20, sample of convenience) performed an open kinetic chain (knee flexion-extension) exercise inside a 3-Tesla MR scanner. Static cartilage models were created from high-resolution three-dimensional static MR data and accurately placed in their dynamic pose at each time frame based on the cine-PC data. Cartilage contact parameters were calculated based on the surface overlap. Statistical analysis was performed using a paired t-test and a one-sample repeated-measures ANOVA. The sensitivity of the contact parameters to the known errors in the patellofemoral kinematics was determined. Results: Peak mean patellofemoral contact area was 228.7±173.6 mm2 at a 40° knee angle. During extension, contact centroid and peak strain locations tracked medially on the femoral and patellar cartilage and were not significantly different from each other. At 30°, 35°, and 40° of knee extension, contact area was significantly different. Contact area and centroid locations were insensitive to rotational and translational perturbations. Conclusion: This study is a first step towards unfolding the biomechanical pathways to anterior patellofemoral pain and OA using dynamic, in vivo, and accurate methodologies. The database provides crucial data for future studies and for validation of, or as an input to, computational models. PMID:24012620

  13. Experimental and Computational Study of Sonic and Supersonic Jet Plumes

    NASA Technical Reports Server (NTRS)

    Venkatapathy, E.; Naughton, J. W.; Fletcher, D. G.; Edwards, Thomas A. (Technical Monitor)

    1994-01-01

    Studies of sonic and supersonic jet plumes are relevant to understanding such phenomena as jet noise, plume signatures, and rocket base heating and radiation. Jet plumes are simple to simulate and yet have complex flow structures such as Mach disks, triple points, shear layers, barrel shocks, shock-shear-layer interaction, etc. Experimental and computational simulations of sonic and supersonic jet plumes have been performed for under- and over-expanded, axisymmetric plume conditions. The computational simulations compare very well with the experimental schlieren observations. Experimental data such as temperature measurements with hot-wire probes have yet to be acquired and will be compared with computed values. Extensive analysis of the computational simulations presents a clear picture of how the complex flow structure develops and the conditions under which self-similar flow structures evolve. From the computations, the plume structure can be further classified into many sub-groups. In the proposed paper, detailed results from the experimental and computational simulations for single, axisymmetric, under- and over-expanded, sonic and supersonic plumes will be compared and the fluid dynamic aspects of the flow structures will be discussed.

  14. A Computer Interview for Multivariate Monitoring of Psychiatric Outcome.

    ERIC Educational Resources Information Center

    Stevenson, John F.; And Others

    Application of computer technology to psychiatric outcome measurement offers the promise of coping with increasing demands for extensive patient interviews repeated longitudinally. Described is the development of a cost-effective multi-dimensional tracking device to monitor psychiatric functioning, building on a previous local computer interview…

  15. SURFACE WATER FLOW IN LANDSCAPE MODELS: 1. EVERGLADES CASE STUDY. (R824766)

    EPA Science Inventory

    Many landscape models require extensive computational effort using a large array of grid cells that represent the landscape. The number of spatial cells may be in the thousands and millions, while the ecological component run in each of the cells to account for landscape dynamics...

  16. Tax Wealth in Fifty States.

    ERIC Educational Resources Information Center

    Halstead, D. Kent

    This study presents a scheme for yearly, comparative, computation of state and local government tax capacity and effort. Figures for all states for fiscal year 1975 are presented in extensive tables. The system used is a simplified version of the Representative Tax System, which identifies tax bases, determines national average tax rates for those…

  17. A Model for Teaching an Introductory Programming Course Using ADRI

    ERIC Educational Resources Information Center

    Malik, Sohail Iqbal; Coldwell-Neilson, Jo

    2017-01-01

    High failure and drop-out rates from introductory programming courses continue to be of significant concern to computer science disciplines despite extensive research attempting to address the issue. In this study, we include the three entities of the didactic triangle, instructors, students and curriculum, to explore the learning difficulties…

  18. The topology of the cosmic web in terms of persistent Betti numbers

    NASA Astrophysics Data System (ADS)

    Pranav, Pratyush; Edelsbrunner, Herbert; van de Weygaert, Rien; Vegter, Gert; Kerber, Michael; Jones, Bernard J. T.; Wintraecken, Mathijs

    2017-03-01

    We introduce a multiscale topological description of the Megaparsec web-like cosmic matter distribution. Betti numbers and topological persistence offer a powerful means of describing the rich connectivity structure of the cosmic web and of its multiscale arrangement of matter and galaxies. Emanating from algebraic topology and Morse theory, Betti numbers and persistence diagrams represent an extension and deepening of the cosmologically familiar topological genus measure and the related geometric Minkowski functionals. In addition to a description of the mathematical background, this study presents the computational procedure for computing Betti numbers and persistence diagrams for density field filtrations. The field may be computed starting from a discrete spatial distribution of galaxies or simulation particles. The main emphasis of this study concerns an extensive and systematic exploration of the imprint of different web-like morphologies and different levels of multiscale clustering in the corresponding computed Betti numbers and persistence diagrams. To this end, we use Voronoi clustering models as templates for a rich variety of web-like configurations and the fractal-like Soneira-Peebles models exemplify a range of multiscale configurations. We have identified the clear imprint of cluster nodes, filaments, walls, and voids in persistence diagrams, along with that of the nested hierarchy of structures in multiscale point distributions. We conclude by outlining the potential of persistent topology for understanding the connectivity structure of the cosmic web, in large simulations of cosmic structure formation and in the challenging context of the observed galaxy distribution in large galaxy surveys.
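    The 0-dimensional piece of the Betti/persistence computation described above admits a compact union-find sketch. The setting here is deliberately simplified relative to the paper's 3D density fields: a sublevel-set filtration of a 1D sample (a path graph), where components are born at local minima and die when they merge into an older component (the elder rule):

```python
def persistence_0d(values):
    """0-dimensional persistence pairs for the sublevel-set filtration of a
    1D sequence (path graph). Components are born at local minima; when two
    components merge, the younger one dies (elder rule)."""
    n = len(values)
    parent = list(range(n))
    birth = [None] * n              # birth value, meaningful at component roots

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i

    pairs = []
    for i in sorted(range(n), key=lambda k: values[k]):
        birth[i] = values[i]                # vertex i enters the filtration
        for j in (i - 1, i + 1):
            if 0 <= j < n and birth[j] is not None:
                ri, rj = find(i), find(j)
                if ri == rj:
                    continue
                old, young = (ri, rj) if birth[ri] <= birth[rj] else (rj, ri)
                if values[i] > birth[young]:        # drop zero-persistence pairs
                    pairs.append((birth[young], values[i]))
                parent[young] = old
    pairs.append((min(values), float("inf")))       # one component never dies
    return sorted(pairs)

# Two side valleys merging into the deepest one:
print(persistence_0d([0.0, 2.0, 1.0, 3.0, 0.5]))
# → [(0.0, inf), (0.5, 3.0), (1.0, 2.0)]
```

    The Betti number β0 at threshold t is then the count of pairs with birth ≤ t < death; higher-dimensional Betti numbers require the full boundary-matrix reduction, which this sketch does not attempt.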

  19. Soft Computing Techniques for the Protein Folding Problem on High Performance Computing Architectures.

    PubMed

    Llanes, Antonio; Muñoz, Andrés; Bueno-Crespo, Andrés; García-Valverde, Teresa; Sánchez, Antonia; Arcas-Túnez, Francisco; Pérez-Sánchez, Horacio; Cecilia, José M

    2016-01-01

    The protein-folding problem has been extensively studied during the last fifty years. Understanding the dynamics of a protein's global shape and its influence on biological function can help us discover new and more effective drugs for diseases of pharmacological relevance. Different computational approaches have been developed by different researchers in order to predict the three-dimensional arrangement of the atoms of proteins from their sequences. However, the computational complexity of this problem makes the search for new models, novel algorithmic strategies, and hardware platforms that provide solutions in a reasonable time frame mandatory. In this review we present past and current trends in protein folding simulation from both the hardware and software perspectives. Of particular interest are both the use of inexact solutions to this computationally hard problem and the hardware platforms that have been used to run this kind of Soft Computing technique.

  20. [Computed tomography semiotics of osteonecrosis and sequestration in chronic hematogenic osteomyelitis].

    PubMed

    D'iachkova, G V; Mitina, Iu L

    2007-01-01

    Based on the data of computed tomography, radiography and densitometry in 39 patients, the authors describe in detail the signs of osteonecrosis and sequestration of differing localization and extent.

  1. Using E-Learning and ICT Courses in Educational Environment: A Review

    ERIC Educational Resources Information Center

    Salehi, Hadi; Shojaee, Mohammad; Sattar, Susan

    2015-01-01

    With the quick emergence of computers and related technology, Electronic-learning (E-learning) and Information Communication and Technology (ICT) have been extensively utilized in the education and training field. Miscellaneous methods of integrating computer technology and the context in which computers are used have affected student learning in…

  2. Computer Power: Part 1: Distribution of Power (and Communications).

    ERIC Educational Resources Information Center

    Price, Bennett J.

    1988-01-01

    Discussion of the distribution of power to personal computers and computer terminals addresses options such as extension cords, perimeter raceways, and interior raceways. Sidebars explain: (1) the National Electrical Code; (2) volts, amps, and watts; (3) transformers, circuit breakers, and circuits; and (4) power vs. data wiring. (MES)
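    The "volts, amps, and watts" bookkeeping behind such power-distribution planning reduces to P = V × I. A minimal sketch, with hypothetical device wattages and a 120 V / 15 A branch circuit; the 80% continuous-load factor is a rule of thumb in the spirit of the National Electrical Code the article's sidebar cites, not a quotation of it:

```python
def circuit_load_amps(device_watts, volts=120.0):
    """Total current drawn on one branch circuit: I = P / V."""
    return sum(device_watts) / volts

def is_overloaded(device_watts, breaker_amps=15.0, volts=120.0,
                  continuous_factor=0.8):
    """Rule of thumb: keep continuous load under 80% of the breaker rating.
    All ratings and the 0.8 factor here are illustrative."""
    return circuit_load_amps(device_watts, volts) > continuous_factor * breaker_amps

lab = [300.0, 150.0, 90.0, 60.0]   # hypothetical PC, monitor, terminal, modem
print(circuit_load_amps(lab), is_overloaded(lab))
```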

  3. Computational Participation: Understanding Coding as an Extension of Literacy Instruction

    ERIC Educational Resources Information Center

    Burke, Quinn; O'Byrne, W. Ian; Kafai, Yasmin B.

    2016-01-01

    Understanding the computational concepts on which countless digital applications run offers learners the opportunity to no longer simply read such media but also become more discerning end users and potentially innovative "writers" of new media themselves. To think computationally--to solve problems, to design systems, and to process and…

  4. Miscellaneous Topics in Computer-Aided Drug Design: Synthetic Accessibility and GPU Computing, and Other Topics.

    PubMed

    Fukunishi, Yoshifumi; Mashimo, Tadaaki; Misoo, Kiyotaka; Wakabayashi, Yoshinori; Miyaki, Toshiaki; Ohta, Seiji; Nakamura, Mayu; Ikeda, Kazuyoshi

    2016-01-01

    Computer-aided drug design is still a state-of-the-art process in medicinal chemistry, and the main topics in this field have been extensively studied and well reviewed. These topics include compound databases, ligand-binding pocket prediction, protein-compound docking, virtual screening, target/off-target prediction, physical property prediction, molecular simulation and pharmacokinetics/pharmacodynamics (PK/PD) prediction. Message and Conclusion: However, there are also a number of secondary or miscellaneous topics that have been less well covered. For example, methods for synthesizing and predicting the synthetic accessibility (SA) of designed compounds are important in practical drug development, and hardware/software resources for performing the computations in computer-aided drug design are crucial. Cloud computing and general purpose graphics processing unit (GPGPU) computing have been used in virtual screening and molecular dynamics simulations. Not surprisingly, there is a growing demand for computer systems which combine these resources. In the present review, we summarize and discuss these various topics of drug design.

  5. Miscellaneous Topics in Computer-Aided Drug Design: Synthetic Accessibility and GPU Computing, and Other Topics

    PubMed Central

    Fukunishi, Yoshifumi; Mashimo, Tadaaki; Misoo, Kiyotaka; Wakabayashi, Yoshinori; Miyaki, Toshiaki; Ohta, Seiji; Nakamura, Mayu; Ikeda, Kazuyoshi

    2016-01-01

    Background: Computer-aided drug design is still a state-of-the-art process in medicinal chemistry, and the main topics in this field have been extensively studied and well reviewed. These topics include compound databases, ligand-binding pocket prediction, protein-compound docking, virtual screening, target/off-target prediction, physical property prediction, molecular simulation and pharmacokinetics/pharmacodynamics (PK/PD) prediction. Message and Conclusion: However, there are also a number of secondary or miscellaneous topics that have been less well covered. For example, methods for synthesizing and predicting the synthetic accessibility (SA) of designed compounds are important in practical drug development, and hardware/software resources for performing the computations in computer-aided drug design are crucial. Cloud computing and general purpose graphics processing unit (GPGPU) computing have been used in virtual screening and molecular dynamics simulations. Not surprisingly, there is a growing demand for computer systems which combine these resources. In the present review, we summarize and discuss these various topics of drug design. PMID:27075578

  6. Quantum gates by periodic driving

    PubMed Central

    Shi, Z. C.; Wang, W.; Yi, X. X.

    2016-01-01

    Topological quantum computation has been extensively studied in the past decades due to its robustness against decoherence. One way to realize topological quantum computation is by adiabatic evolutions, but these require a relatively long time to complete a gate, so the speed of quantum computation slows down. In this work, we present a method to realize single-qubit quantum gates by periodic driving. Compared to adiabatic evolution, the single-qubit gates can be realized at a fixed time much shorter than that required by adiabatic evolution. The driving field can be sinusoidal or a square-well field. For the sinusoidal driving field, we derive an expression for the total operation time in the high-frequency limit, and for the square-well driving an exact analytical expression for the evolution operator is given without any approximations. This study suggests that periodic driving could provide a new direction for regulating the operation time in topological quantum computation. PMID:26911900

  7. Quantum gates by periodic driving.

    PubMed

    Shi, Z C; Wang, W; Yi, X X

    2016-02-25

    Topological quantum computation has been extensively studied in the past decades due to its robustness against decoherence. One way to realize topological quantum computation is by adiabatic evolutions, but these require a relatively long time to complete a gate, so the speed of quantum computation slows down. In this work, we present a method to realize single-qubit quantum gates by periodic driving. Compared to adiabatic evolution, the single-qubit gates can be realized at a fixed time much shorter than that required by adiabatic evolution. The driving field can be sinusoidal or a square-well field. For the sinusoidal driving field, we derive an expression for the total operation time in the high-frequency limit, and for the square-well driving an exact analytical expression for the evolution operator is given without any approximations. This study suggests that periodic driving could provide a new direction for regulating the operation time in topological quantum computation.

  8. Using the Microsoft Kinect™ to assess 3-D shoulder kinematics during computer use.

    PubMed

    Xu, Xu; Robertson, Michelle; Chen, Karen B; Lin, Jia-Hua; McGorry, Raymond W

    2017-11-01

    Shoulder joint kinematics has been used as a representative indicator to investigate musculoskeletal symptoms among computer users in office ergonomics studies. The traditional measurement of shoulder kinematics normally requires a laboratory-based motion tracking system, which limits field studies. In the current study, a portable, low-cost, marker-less Microsoft Kinect™ sensor was examined for the feasibility of shoulder kinematics measurement during computer tasks. Eleven healthy participants performed a standardized computer task, and their shoulder kinematics data were measured by a Kinect sensor and a motion tracking system concurrently. The results indicated that placing the Kinect sensor in front of the participants yielded more accurate shoulder kinematics measurements than placing the Kinect sensor 15° or 30° to one side. The results also showed that the Kinect sensor provided a better estimate of shoulder flexion/extension than of shoulder adduction/abduction and shoulder axial rotation. The RMSE of the front-placed Kinect sensor on shoulder flexion/extension was less than 10° for both the right and the left shoulder. The measurement error of the front-placed Kinect sensor on shoulder adduction/abduction was approximately 10° to 15°, and the magnitude of the error was proportional to the magnitude of that joint angle. After calibration, the RMSE on shoulder adduction/abduction was less than 10° based on an independent dataset of 5 additional participants. For shoulder axial rotation, the RMSE of the front-placed Kinect sensor ranged from approximately 15° to 30°. The results of the study suggest that the Kinect sensor can provide some insight into shoulder kinematics for improving office ergonomics. Copyright © 2017 Elsevier Ltd. All rights reserved.
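    The agreement measure reported above, root-mean-square error between two joint-angle traces, can be sketched as follows (the traces below are hypothetical flexion/extension samples, not the study's data):

    ```python
    import math

    def rmse(measured, reference):
        """Root-mean-square error between two equal-length angle series (degrees)."""
        assert len(measured) == len(reference)
        return math.sqrt(sum((m - r) ** 2 for m, r in zip(measured, reference)) / len(measured))

    # Hypothetical traces: Kinect estimate vs. lab motion-capture reference
    kinect = [10.2, 15.1, 21.0, 24.8, 30.5]
    mocap  = [11.0, 14.5, 20.2, 26.0, 29.8]
    print(rmse(kinect, mocap))  # well under the 10-degree level discussed above
    ```
    
    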

  9. A Weibull distribution accrual failure detector for cloud computing.

    PubMed

    Liu, Jiaxi; Wu, Zhibo; Wu, Jin; Dong, Jian; Zhao, Yao; Wen, Dongxin

    2017-01-01

    Failure detectors are a fundamental component for building high-availability distributed systems. To meet the requirements of complicated large-scale distributed systems, accrual failure detectors that can adapt to multiple applications have been studied extensively. However, several implementations of accrual failure detectors do not adapt well to the cloud service environment. To solve this problem, a new accrual failure detector based on the Weibull distribution, called the Weibull Distribution Failure Detector, has been proposed specifically for cloud computing. It can adapt to the dynamic and unexpected network conditions in cloud computing. The performance of the Weibull Distribution Failure Detector is evaluated and compared based on public classical experiment data and cloud computing experiment data. The results show that the Weibull Distribution Failure Detector has better performance in terms of speed and accuracy in unstable scenarios, especially in cloud computing.
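    A minimal sketch of the accrual idea behind such a detector, assuming heartbeat inter-arrival times follow a Weibull distribution (the paper's actual parameter estimation and protocol details are not reproduced here):

    ```python
    import math

    def weibull_cdf(t, shape, scale):
        # F(t) = 1 - exp(-(t/scale)^shape)
        return 1.0 - math.exp(-((t / scale) ** shape))

    def suspicion(t_since_last, shape, scale):
        """Accrual suspicion level phi: -log10 of the probability that the
        monitored node is still alive given the silence observed so far.
        Higher phi means failure is more likely."""
        p_alive = 1.0 - weibull_cdf(t_since_last, shape, scale)
        return -math.log10(max(p_alive, 1e-300))

    # Heartbeats expected roughly every 1 s (scale=1.0); shape > 1 models
    # a hazard rate that grows as the silence lengthens.
    print(suspicion(0.5, shape=2.0, scale=1.0))  # short silence: low suspicion
    print(suspicion(3.0, shape=2.0, scale=1.0))  # long silence: high suspicion
    ```
    
    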

  10. Signal Analysis Techniques for Interpreting Electroencephalograms

    DTIC Science & Technology

    1980-12-01

    investigations by Lansing and Barlow (61). The relation between VER, adaptation attention fatigue, etc., has been studied quite extensively with invasive...in order to restore the highly abnormal EEG to near normal. Anatomical and Neurophysiological Considerations of VER Changes For studies of visual...Computer Analysis of Electroencephalograms, Digest of the 7th International Conf. on Medical and Biological Engineering, Stockholm, pp. 257-260, 1967. 4

  11. A Systematic Replication and Extension of Using Incremental Rehearsal to Improve Multiplication Skills: An Investigation of Generalization

    ERIC Educational Resources Information Center

    Codding, Robin S.; Archer, Jillian; Connell, James

    2010-01-01

    The purpose of this study was to replicate and extend a previous study by Burns ("Education and Treatment of Children" 28: 237-249, 2005) examining the effectiveness of incremental rehearsal on computation performance. A multiple-probe design across multiplication problem sets was employed for one participant to examine digits correct per minute…

  12. Mobile Applications for Extension

    ERIC Educational Resources Information Center

    Drill, Sabrina L.

    2012-01-01

    Mobile computing devices (smart phones, tablets, etc.) are rapidly becoming the dominant means of communication worldwide and are increasingly being used for scientific investigation. This technology can further our Extension mission by increasing our power for data collection, information dissemination, and informed decision-making. Mobile…

  13. The experience of agency in human-computer interactions: a review

    PubMed Central

    Limerick, Hannah; Coyle, David; Moore, James W.

    2014-01-01

    The sense of agency is the experience of controlling both one’s body and the external environment. Although the sense of agency has been studied extensively, there is a paucity of studies in applied “real-life” situations. One applied domain that seems highly relevant is human-computer-interaction (HCI), as an increasing number of our everyday agentive interactions involve technology. Indeed, HCI has long recognized the feeling of control as a key factor in how people experience interactions with technology. The aim of this review is to summarize and examine the possible links between sense of agency and understanding control in HCI. We explore the overlap between HCI and sense of agency for computer input modalities and system feedback, computer assistance, and joint actions between humans and computers. An overarching consideration is how agency research can inform HCI and vice versa. Finally, we discuss the potential ethical implications of personal responsibility in an ever-increasing society of technology users and intelligent machine interfaces. PMID:25191256

  14. A non-iterative extension of the multivariate random effects meta-analysis.

    PubMed

    Makambi, Kepher H; Seung, Hyunuk

    2015-01-01

    Multivariate methods in meta-analysis are becoming popular and more accepted in biomedical research despite computational issues in some of the techniques. A number of approaches, both iterative and non-iterative, have been proposed, including the non-iterative multivariate DerSimonian and Laird method by Jackson et al. (2010). In this study, we propose an extension of the method by Hartung and Makambi (2002) and Makambi (2001) to multivariate situations. A comparison of the bias and mean square error from a simulation study indicates that, in some circumstances, the proposed approach performs better than the multivariate DerSimonian-Laird approach. An example is presented to demonstrate the application of the proposed approach.
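    For orientation, the non-iterative method-of-moments idea underlying the DerSimonian-Laird estimator can be sketched in the univariate case (the record above concerns its multivariate extension; the study data below are hypothetical):

    ```python
    def dersimonian_laird_tau2(effects, variances):
        """Univariate DerSimonian-Laird estimate of the between-study
        variance tau^2: tau^2 = max(0, (Q - (k-1)) / c), where Q is
        Cochran's heterogeneity statistic and c = sum(w) - sum(w^2)/sum(w)."""
        w = [1.0 / v for v in variances]          # inverse-variance weights
        sw = sum(w)
        ybar = sum(wi * yi for wi, yi in zip(w, effects)) / sw
        q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, effects))
        k = len(effects)
        c = sw - sum(wi ** 2 for wi in w) / sw
        return max(0.0, (q - (k - 1)) / c)

    # Hypothetical per-study effect sizes and within-study variances
    print(dersimonian_laird_tau2([0.3, 0.5, 0.1, 0.7], [0.04, 0.05, 0.03, 0.06]))
    ```
    
    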

  15. Fluid-solid interaction: benchmarking of an external coupling of ANSYS with CFX for cardiovascular applications.

    PubMed

    Hose, D R; Lawford, P V; Narracott, A J; Penrose, J M T; Jones, I P

    2003-01-01

    Fluid-solid interaction is a primary feature of cardiovascular flows. There is increasing interest in the numerical solution of these systems as the extensive computational resource required for such studies becomes available. One form of coupling is an external weak coupling of separate solid and fluid mechanics codes. Information about the stress tensor and displacement vector at the wetted boundary is passed between the codes, and an iterative scheme is employed to move towards convergence of these parameters at each time step. This approach has the attraction that separate codes with the most extensive functionality for each of the separate phases can be selected, which might be important in the context of the complex rheology and contact mechanics that often feature in cardiovascular systems. Penrose and Staples describe a weak coupling of CFX for computational fluid mechanics to ANSYS for solid mechanics, based on a simple Jacobi iteration scheme. It is important to validate the coupled numerical solutions. An extensive analytical study of flow in elastic-walled tubes was carried out by Womersley in the late 1950s. This paper describes the performance of the coupling software for the straight elastic-walled tube, and compares the results with Womersley's analytical solutions. It also presents preliminary results demonstrating the application of the coupled software in the context of a stented vessel.

  16. Application of Computer-Assisted Learning Methods in the Teaching of Chemical Spectroscopy.

    ERIC Educational Resources Information Center

    Ayscough, P. B.; And Others

    1979-01-01

    Discusses the application of computer-assisted learning methods to the interpretation of infrared, nuclear magnetic resonance, and mass spectra; and outlines extensions into the area of integrated spectroscopy. (Author/CMV)

  17. Platform-independent method for computer aided schematic drawings

    DOEpatents

    Vell, Jeffrey L [Slingerlands, NY; Siganporia, Darius M [Clifton Park, NY; Levy, Arthur J [Fort Lauderdale, FL

    2012-02-14

    A CAD/CAM method is disclosed for a computer system to capture and interchange schematic drawing and associated design information. The schematic drawing and design information are stored in an extensible, platform-independent format.

  18. A New Soft Computing Method for K-Harmonic Means Clustering.

    PubMed

    Yeh, Wei-Chang; Jiang, Yunzhi; Chen, Yee-Fen; Chen, Zhe

    2016-01-01

    The K-harmonic means clustering algorithm (KHM) is a new clustering method used to group data such that the sum of the harmonic averages of the distances between each entity and all cluster centroids is minimized. Because it is less sensitive to initialization than K-means (KM), many researchers have recently been attracted to studying KHM. In this study, the proposed iSSO-KHM is based on an improved simplified swarm optimization (iSSO) and integrates a variable neighborhood search (VNS) for KHM clustering. As evidence of the utility of the proposed iSSO-KHM, we present extensive computational results on eight benchmark problems. From the computational results, the comparison appears to support the superiority of the proposed iSSO-KHM over previously developed algorithms for all experiments in the literature.
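    The KHM objective described above, the sum over entities of K times the harmonic average of distances to all cluster centroids, can be sketched as follows (toy data; the iSSO and VNS components of the record's method are not sketched here):

    ```python
    def khm_objective(points, centroids, p=2):
        """K-harmonic means objective: for each point, K divided by the sum
        of 1/d^p over all centroids (i.e., K times the harmonic mean of d^p),
        summed over all points. Lower is better."""
        total = 0.0
        k = len(centroids)
        for x in points:
            inv = 0.0
            for c in centroids:
                d = sum((xi - ci) ** 2 for xi, ci in zip(x, c)) ** 0.5
                inv += 1.0 / max(d ** p, 1e-12)  # guard against d == 0
            total += k / inv
        return total

    pts = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0), (5.1, 5.0)]
    good = [(0.05, 0.0), (5.05, 5.0)]   # one centroid per cluster
    bad  = [(2.5, 2.5), (2.6, 2.5)]     # both centroids between clusters
    print(khm_objective(pts, good), khm_objective(pts, bad))
    ```
    
    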

  19. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions.

    PubMed

    Dinov, Ivo D; Siegrist, Kyle; Pearl, Dennis K; Kalinin, Alexandr; Christou, Nicolas

    2016-06-01

    Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome, which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the learning assessment protocols.

  20. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions

    PubMed Central

    Dinov, Ivo D.; Siegrist, Kyle; Pearl, Dennis K.; Kalinin, Alexandr; Christou, Nicolas

    2015-01-01

    Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome, which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the learning assessment protocols. PMID:27158191

  1. Preconditioned upwind methods to solve 3-D incompressible Navier-Stokes equations for viscous flows

    NASA Technical Reports Server (NTRS)

    Hsu, C.-H.; Chen, Y.-M.; Liu, C. H.

    1990-01-01

    A computational method for calculating low-speed viscous flowfields is developed. The method uses the implicit upwind-relaxation finite-difference algorithm with a nonsingular eigensystem to solve the preconditioned, three-dimensional, incompressible Navier-Stokes equations in curvilinear coordinates. The technique of local time stepping is incorporated to accelerate the rate of convergence to a steady-state solution. An extensive study of optimizing the preconditioned system is carried out for two viscous flow problems. Computed results are compared with analytical solutions and experimental data.

  2. Practical Algorithms for the Longest Common Extension Problem

    NASA Astrophysics Data System (ADS)

    Ilie, Lucian; Tinta, Liviu

    The Longest Common Extension problem considers a string s and computes, for each of a number of pairs (i, j), the longest substring of s that starts at both i and j. It appears as a subproblem in many fundamental string problems and can be solved by linear-time preprocessing of the string that allows (worst-case) constant-time computation for each pair. The two known approaches use powerful algorithms: either constant-time computation of the Lowest Common Ancestor in trees or constant-time computation of Range Minimum Queries (RMQ) in arrays. We show here that, from a practical point of view, such complicated approaches are not needed. We give two very simple algorithms for this problem that require no preprocessing. The first needs only the string and is significantly faster than all previous algorithms on average. The second combines the first with a direct RMQ computation on the Longest Common Prefix array. It takes advantage of the superior speed of the cache memory and is the fastest on virtually all inputs.
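    The first of the two simple algorithms, a direct character-by-character scan needing only the string and no preprocessing, can be sketched as:

    ```python
    def lce(s, i, j):
        """Longest Common Extension: length of the longest substring of s
        that starts at both positions i and j, found by scanning forward
        from both positions until a mismatch or the end of the string."""
        n, k = len(s), 0
        while i + k < n and j + k < n and s[i + k] == s[j + k]:
            k += 1
        return k

    s = "abcabcax"
    print(lce(s, 0, 3))  # "abca" starts at both 0 and 3 -> 4
    ```
    
    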

  3. Using Netnography to Explore the Culture of Online Language Teaching Communities

    ERIC Educational Resources Information Center

    Kulavuz-Onal, Derya

    2015-01-01

    Netnography (Kozinets, 2010) is an ethnographic approach to study communities that exist primarily online. Engaging in online participant observation, the netnographer connects to the online community through a computer screen, and the field is located inside the screen. Although it has been used in marketing research extensively, netnography is a…

  4. Generalization of Posture Training to Computer Workstations in an Applied Setting

    ERIC Educational Resources Information Center

    Sigurdsson, Sigurdur O.; Ring, Brandon M.; Needham, Mick; Boscoe, James H.; Silverman, Kenneth

    2011-01-01

    Improving employees' posture may decrease the risk of musculoskeletal disorders. The current paper is a systematic replication and extension of Sigurdsson and Austin (2008), who found that an intervention consisting of information, real-time feedback, and self-monitoring improved participant posture at mock workstations. In the current study,…

  5. The Effects of Beacons, Comments, and Tasks on Program Comprehension Process in Software Maintenance

    ERIC Educational Resources Information Center

    Fan, Quyin

    2010-01-01

    Program comprehension is the most important and frequent process in software maintenance. Extensive research has found that individual characteristics of programmers, differences of computer programs, and differences of task-driven motivations are the major factors that affect the program comprehension results. There is no study specifically…

  6. Knowledge Acquisition at Work. IEE Brief Number 2.

    ERIC Educational Resources Information Center

    Scribner, Sylvia; Sachs, Patricia

    An exploratory investigation attempted to determine how learning at work actually takes place and in what ways learning on the job differs from classroom learning. The study was based on extensive observations and interviews over a 5-year period at two manufacturing plants that implemented a computer-based system known as Manufacturing Resource…

  7. Telecommunication market research processing

    NASA Astrophysics Data System (ADS)

    Dupont, J. F.

    1983-06-01

    The data processing in two telecommunication market investigations is described. One of the studies concerns the office applications of communication and the other the experiences with a videotex terminal. Statistical factorial analysis was performed on a large mass of data. A comparison between utilization intentions and effective utilization is made. Extensive rewriting of statistical analysis computer programs was required.

  8. The prediction in computer color matching of dentistry based on GA+BP neural network.

    PubMed

    Li, Haisheng; Lai, Long; Chen, Li; Lu, Cheng; Cai, Qiang

    2015-01-01

    Although the use of computer color matching can reduce the influence of subjective factors introduced by technicians, matching the color of a natural tooth with a ceramic restoration is still one of the most challenging topics in esthetic prosthodontics. The back propagation neural network (BPNN) has already been introduced into computer color matching in dentistry, but it has disadvantages such as instability and low accuracy. In our study, we adopt a genetic algorithm (GA) to optimize the initial weights and threshold values in the BPNN to improve matching precision. To our knowledge, this is the first study to combine the BPNN with a GA for computer color matching in dentistry. Extensive experiments demonstrate that the proposed method improves the precision and prediction robustness of color matching in restorative dentistry.

  9. Optimistic barrier synchronization

    NASA Technical Reports Server (NTRS)

    Nicol, David M.

    1992-01-01

    Barrier synchronization is a fundamental operation in parallel computation. In many contexts, at the point a processor enters a barrier it knows that it has already processed all the work required of it prior to synchronization. We treat the alternative case, when a processor cannot enter a barrier with the assurance that it has already performed all the necessary pre-synchronization computation. The problem arises when the number of pre-synchronization messages to be received by a processor is unknown, for example, in a parallel discrete simulation or any other computation that is largely driven by an unpredictable exchange of messages. We describe an optimistic O(log^2 P) barrier algorithm for such problems, study its performance on a large-scale parallel system, and consider extensions to general associative reductions as well as associative parallel prefix computations.

  10. Workload Characterization of CFD Applications Using Partial Differential Equation Solvers

    NASA Technical Reports Server (NTRS)

    Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    Workload characterization is used for modeling and evaluation of computing systems at different levels of detail. We present workload characterization for a class of Computational Fluid Dynamics (CFD) applications that solve Partial Differential Equations (PDEs). This workload characterization focuses on three high-performance computing platforms: SGI Origin2000, IBM SP-2, and a cluster of Intel Pentium Pro-based PCs. We execute extensive measurement-based experiments on these platforms to gather statistics of system resource usage, which results in workload characterization. Our workload characterization approach yields a coarse-grain resource utilization behavior that is being applied for performance modeling and evaluation of distributed high-performance metacomputing systems. In addition, this study enhances our understanding of interactions between PDE solver workloads and high-performance computing platforms and is useful for tuning these applications.

  11. Alternate concepts study extension. Volume 2: Part 4: Avionics

    NASA Technical Reports Server (NTRS)

    1971-01-01

    A recommended baseline system is presented along with alternate avionics systems, Mark 2 avionics, booster avionics, and a cost summary. Analyses and discussions are included on the Mark 1 orbiter avionics subsystems, electrical ground support equipment, and the computer programs. Results indicate a need to define all subsystems of the baseline system, an installation study to determine the impact on the crew station, and a study on access for maintenance.

  12. Improving the Acquisition of Basic Technical Surgical Skills with VR-Based Simulation Coupled with Computer-Based Video Instruction.

    PubMed

    Rojas, David; Kapralos, Bill; Dubrowski, Adam

    2016-01-01

    Next to practice, feedback is the most important variable in skill acquisition. Feedback can vary in content and in the way it is delivered. Health professions education research has extensively examined the effects of different feedback methodologies. In this paper we compared two types of knowledge of performance (KP) feedback: video-based KP feedback and computer-generated KP feedback. Results of this study showed that computer-generated performance feedback is more effective than video-based performance feedback. The combination of the two feedback methodologies provides trainees with a better understanding.

  13. Biotelemetry and computer analysis of sleep processes on earth and in space.

    NASA Technical Reports Server (NTRS)

    Adey, W. R.

    1972-01-01

    Developments in biomedical engineering now permit study of states of sleep, wakefulness, and focused attention in man exposed to rigorous environments, including aerospace flight. These new sensing devices, data acquisition systems, and computational methods have also been extensively applied to clinical problems of disordered sleep. A 'library' of EEG data has been prepared for sleep in normal man, and characterized for its group features by computational analysis. Sleep in an astronaut in space flight has been examined for the first and second 'nights' of space flight. Normal 90-min cycles were detected during the second night. Sleep patterns in quadriplegic patients deprived of all sensory inputs below the neck have indicated major deviations.

  14. Development of a computer-assisted learning software package on dental traumatology.

    PubMed

    Tolidis, K; Crawford, P; Stephens, C; Papadogiannis, Y; Plakias, C

    1998-10-01

    The development of computer-assisted learning software packages is a relatively new field of computer application. The progress made in personal computer technology toward more user-friendly operating systems has stimulated the academic community to develop computer-assisted learning for pre- and postgraduate students. The ability of computers to combine audio and visual data in an interactive form provides a powerful educational tool. The purpose of this study was to develop and evaluate a computer-assisted learning package on dental traumatology. This program contains background information on the diagnosis, classification, and management of dental injuries in both the permanent and the deciduous dentitions. It is structured into chapters according to the nature of the injury and whether injury has occurred in the primary or permanent dentition. At the end of each chapter there is a self-assessment questionnaire as well as references to relevant literature. Extensive use of pictures and video provides a comprehensive overview of the subject.

  15. Covering Resilience: A Recent Development for Binomial Checkpointing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walther, Andrea; Narayanan, Sri Hari Krishna

    In terms of computing time, adjoint methods offer a very attractive alternative to compute gradient information, required, e.g., for optimization purposes. However, together with this very favorable temporal complexity result comes a memory requirement that is in essence proportional to the operation count of the underlying function, e.g., if algorithmic differentiation is used to provide the adjoints. For this reason, checkpointing approaches in many variants have become popular. This paper analyzes an extension of the so-called binomial approach to cover also possible failures of the computing systems. Such a measure of precaution is of special interest for massively parallel simulations and adjoint calculations where the mean time between failures of the large-scale computing system is smaller than the time needed to complete the calculation of the adjoint information. We describe the extensions of standard checkpointing approaches required for such resilience, provide a corresponding implementation, and discuss first numerical results.
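    For context, the classical binomial checkpointing result that this work extends can be illustrated: with c checkpoints and at most r forward re-evaluations of each step, the number of time steps whose adjoint can be computed is the binomial coefficient C(c + r, c). A minimal sketch (the paper's resilience extension itself is not reproduced here):

    ```python
    from math import comb

    def max_steps(c, r):
        """Maximum number of time steps reversible with c checkpoints and
        at most r forward repetitions per step: beta(c, r) = C(c + r, c)."""
        return comb(c + r, c)

    # e.g. 10 checkpoints with 5 repetitions already cover 3003 steps
    print(max_steps(10, 5))
    ```

    The binomial growth in both arguments is what makes checkpointing so memory-efficient: logarithmically many checkpoints suffice for a fixed repetition budget.
    
    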

  16. STARS: An integrated general-purpose finite element structural, aeroelastic, and aeroservoelastic analysis computer program

    NASA Technical Reports Server (NTRS)

    Gupta, Kajal K.

    1991-01-01

    The details of an integrated general-purpose finite element structural analysis computer program which is also capable of solving complex multidisciplinary problems is presented. Thus, the SOLIDS module of the program possesses an extensive finite element library suitable for modeling most practical problems and is capable of solving statics, vibration, buckling, and dynamic response problems of complex structures, including spinning ones. The aerodynamic module, AERO, enables computation of unsteady aerodynamic forces for both subsonic and supersonic flow for subsequent flutter and divergence analysis of the structure. The associated aeroservoelastic analysis module, ASE, effects aero-structural-control stability analysis yielding frequency responses as well as damping characteristics of the structure. The program is written in standard FORTRAN to run on a wide variety of computers. Extensive graphics, preprocessing, and postprocessing routines are also available pertaining to a number of terminals.

  17. Using a software-defined computer in teaching the basics of computer architecture and operation

    NASA Astrophysics Data System (ADS)

    Kosowska, Julia; Mazur, Grzegorz

    2017-08-01

    The paper describes the concept and implementation of the SDC_One software-defined computer, designed for experimental and didactic purposes. Equipped with extensive hardware monitoring mechanisms, the device enables students to monitor the computer's operation on a bus-transfer-cycle or instruction-cycle basis, providing a practical illustration of basic aspects of a computer's operation. In the paper, we describe the hardware monitoring capabilities of SDC_One and some scenarios for using it in teaching the basics of computer architecture and microprocessor operation.

  18. Estimating apparent maximum muscle stress of trunk extensor muscles in older adults using subject-specific musculoskeletal models.

    PubMed

    Burkhart, Katelyn A; Bruno, Alexander G; Bouxsein, Mary L; Bean, Jonathan F; Anderson, Dennis E

    2018-01-01

Maximum muscle stress (MMS) is a critical parameter in musculoskeletal modeling, defining the maximum force that a muscle of given size can produce. However, a wide range of MMS values have been reported in the literature, and few studies have estimated MMS in trunk muscles. Due to widespread use of musculoskeletal models in studies of the spine and trunk, there is a need to determine a reasonable magnitude and range of trunk MMS. We measured trunk extension strength in 49 participants over 65 years of age, surveyed participants about low back pain, and acquired quantitative computed tomography (QCT) scans of their lumbar spines. Trunk muscle morphology was assessed from QCT scans and used to create a subject-specific musculoskeletal model for each participant. Model-predicted extension strength was computed using a trunk muscle MMS of 100 N/cm². The MMS of each subject-specific model was then adjusted until the measured strength matched the model-predicted strength (±20 N). We found that measured trunk extension strength was significantly higher in men. With the initial constant MMS value, the musculoskeletal model generally over-predicted trunk extension strength. By adjusting MMS on a subject-specific basis, we found apparent MMS values ranging from 40 to 130 N/cm², with an average of 75.5 N/cm² for both men and women. Subjects with low back pain had lower apparent MMS than subjects with no back pain. This work incorporates a unique approach to estimate subject-specific trunk MMS values via musculoskeletal modeling and provides useful insight into MMS variation. © 2017 Orthopaedic Research Society. Published by Wiley Periodicals, Inc. J Orthop Res 36:498-505, 2018.
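The calibration loop described above (start at 100 N/cm² and adjust MMS until model-predicted strength matches measured strength within ±20 N) can be sketched as follows. The linear strength model and the numeric values are illustrative assumptions for the sketch, not the paper's musculoskeletal model.

```python
# Sketch of subject-specific maximum muscle stress (MMS) calibration:
# start from 100 N/cm^2 and adjust MMS until the model-predicted extension
# strength matches the measured strength within 20 N. The linear model below
# (strength = MMS * gain) is a simplifying assumption, not the paper's model.

def model_strength(mms_n_per_cm2, subject_gain):
    """Hypothetical model: predicted trunk extension strength (N)."""
    return mms_n_per_cm2 * subject_gain

def calibrate_mms(measured_n, subject_gain, initial_mms=100.0, tol_n=20.0):
    """Bisection on MMS until |predicted - measured| <= tol_n."""
    lo, hi = 1.0, 400.0  # plausible search bracket (assumed)
    mms = initial_mms
    for _ in range(100):
        err = model_strength(mms, subject_gain) - measured_n
        if abs(err) <= tol_n:
            return mms
        if err > 0:
            hi = mms
        else:
            lo = mms
        mms = 0.5 * (lo + hi)
    return mms

# Example: a subject whose measured strength implies an apparent MMS near 75 N/cm^2.
apparent = calibrate_mms(measured_n=1510.0, subject_gain=20.0)
print(round(apparent, 1))
```

The bisection bracket and tolerance mirror the ±20 N matching criterion quoted in the abstract; any monotonic root-finding scheme would serve equally well.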

  19. A Weibull distribution accrual failure detector for cloud computing

    PubMed Central

    Wu, Zhibo; Wu, Jin; Zhao, Yao; Wen, Dongxin

    2017-01-01

Failure detectors are a fundamental component used to build high-availability distributed systems. To meet the requirements of complicated large-scale distributed systems, accrual failure detectors that can adapt to multiple applications have been studied extensively. However, several implementations of accrual failure detectors do not adapt well to the cloud service environment. To solve this problem, a new accrual failure detector based on the Weibull distribution, called the Weibull Distribution Failure Detector, has been proposed specifically for cloud computing. It can adapt to the dynamic and unexpected network conditions in cloud computing. The performance of the Weibull Distribution Failure Detector is evaluated and compared based on public classical experiment data and cloud computing experiment data. The results show that the Weibull Distribution Failure Detector has better performance in terms of speed and accuracy in unstable scenarios, especially in cloud computing. PMID:28278229
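An accrual failure detector of this kind outputs a continuously rising suspicion level rather than a binary alive/dead verdict. A minimal sketch, assuming a fixed Weibull shape parameter and a moment-based scale fit (both illustrative choices, not the paper's estimation method):

```python
import math

# Sketch of an accrual failure detector using a Weibull model of heartbeat
# inter-arrival times. phi(t) rises as the silence since the last heartbeat
# becomes increasingly unlikely under the fitted distribution; a process is
# suspected once phi exceeds a threshold. The shape parameter and the
# moment-based scale fit are illustrative assumptions.

def fit_weibull_scale(samples, shape):
    """Method-of-moments scale estimate: mean = scale * Gamma(1 + 1/shape)."""
    mean = sum(samples) / len(samples)
    return mean / math.gamma(1.0 + 1.0 / shape)

def phi(t_since_last, shape, scale):
    """Accrual suspicion level: -log10(1 - F(t)), F = Weibull CDF."""
    survival = math.exp(-((t_since_last / scale) ** shape))
    return -math.log10(max(survival, 1e-300))

intervals = [0.9, 1.1, 1.0, 1.2, 0.8]   # observed heartbeat gaps (seconds)
scale = fit_weibull_scale(intervals, shape=2.0)
print(phi(0.5, 2.0, scale) < phi(3.0, 2.0, scale))  # suspicion grows with silence
```

Because phi is monotone in the silence time, each application can pick its own threshold, which is exactly the property that lets one accrual detector serve multiple applications.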

  20. Two-Level Weld-Material Homogenization for Efficient Computational Analysis of Welded Structure Blast-Survivability

    NASA Astrophysics Data System (ADS)

    Grujicic, M.; Arakere, G.; Hariharan, A.; Pandurangan, B.

    2012-06-01

The introduction of newer joining technologies like so-called friction-stir welding (FSW) into automotive engineering requires knowledge of the joint-material microstructure and properties. Since the development of vehicles (including military vehicles capable of surviving blast and ballistic impacts) nowadays involves extensive use of computational engineering analyses (CEA), robust high-fidelity material models are needed for FSW joints. A two-level material-homogenization procedure is proposed and utilized in this study to help manage the computational cost and computer storage requirements of such CEAs. The method utilizes experimental (microstructure, microhardness, tensile testing, and x-ray diffraction) data to construct: (a) the material model for each weld zone and (b) the material model for the entire weld. The procedure is validated by comparing its predictions with the predictions of more detailed but more costly computational analyses.
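The two-level idea (homogenize within each weld zone, then homogenize across zones) can be sketched with a simple volume-weighted average. The rule-of-mixtures averaging and all numeric values here are illustrative assumptions; the paper's actual zone material models are richer.

```python
# Generic two-level homogenization sketch: level 1 averages measured
# properties within each weld zone, and level 2 combines the zones into a
# single effective weld material weighted by zone volume fraction.
# A volume-weighted (rule-of-mixtures) average is an illustrative assumption.

def homogenize(values, fractions):
    """Volume-fraction-weighted average of a material property."""
    assert abs(sum(fractions) - 1.0) < 1e-9
    return sum(v * f for v, f in zip(values, fractions))

# Level 1: per-zone yield strength (MPa) from, e.g., microhardness sub-regions.
nugget = homogenize([310.0, 295.0], [0.6, 0.4])   # stir zone (assumed data)
haz    = homogenize([260.0, 275.0], [0.5, 0.5])   # heat-affected zone
base   = 345.0                                    # base metal (assumed)

# Level 2: effective weld property from zone volume fractions (assumed values).
weld = homogenize([nugget, haz, base], [0.3, 0.3, 0.4])
print(round(weld, 1))
```

The payoff is that a blast-survivability CEA can assign one effective material to the whole weld line (level 2) or per-zone materials (level 1), trading fidelity against mesh size and storage.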

  1. Computational aeroacoustics and numerical simulation of supersonic jets

    NASA Technical Reports Server (NTRS)

    Morris, Philip J.; Long, Lyle N.

    1996-01-01

    The research project has been a computational study of computational aeroacoustics algorithms and numerical simulations of the flow and noise of supersonic jets. During this study a new method for the implementation of solid wall boundary conditions for complex geometries in three dimensions has been developed. In addition, a detailed study of the simulation of the flow in and noise from supersonic circular and rectangular jets has been conducted. Extensive comparisons have been made with experimental measurements. A summary of the results of the research program are attached as the main body of this report in the form of two publications. Also, the report lists the names of the students who were supported by this grant, their degrees, and the titles of their dissertations. In addition, a list of presentations and publications made by the Principal Investigators and the research students is also included.

  2. Robotics-Centered Outreach Activities: An Integrated Approach

    ERIC Educational Resources Information Center

    Ruiz-del-Solar, Javier

    2010-01-01

    Nowadays, universities are making extensive efforts to attract prospective students to the fields of electrical, electronic, and computer engineering. Thus, outreach is becoming increasingly important, and activities with schoolchildren are being extensively carried out as part of this effort. In this context, robotics is a very attractive and…

  3. Tools for Creating Mobile Applications for Extension

    ERIC Educational Resources Information Center

    Drill, Sabrina L.

    2012-01-01

    Considerations and tools for developing mobile applications for Extension include evaluating the topic, purpose, and audience. Different computing platforms may be used, and apps designed as modified Web pages or implicitly programmed for a particular platform. User privacy is another important consideration, especially for data collection apps.…

  4. Offshore survey provides answers to coastal stability and potential offshore extensions of landslides into Abalone Cove, Palos Verdes peninsula, Calif

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dill, R.F.; Slosson, J.E.

    1993-04-01

The configuration and stability of the present coastline near Abalone Cove, on the south side of the Palos Verdes Peninsula, California, is related to the geology, oceanographic conditions, and recent and ancient landslide activity. This case study utilizes offshore high-resolution seismic profiles, side-scan sonar, diving, and coring to relate marine geology to the stability of a coastal region with known active landslides, utilizing a desktop computer and off-the-shelf software. Electronic navigation provided precise positioning that, when applied to computer-generated charts, permitted correlation of survey data needed to define the offshore geology and sea-floor sediment patterns. A Macintosh desktop computer and commercially available off-the-shelf software provided the analytical tools for constructing a base chart and a means to superimpose template overlays of topography, isopachs or sediment thickness, bottom roughness, and sediment distribution patterns. This composite map of offshore geology and oceanography was then related to an extensive engineering and geological land study of the coastal zone forming Abalone Cove, an area of active landslides. Vibrocoring provided ground-truth sediment data for the high-resolution seismic traverses. This paper details the systems used, presents findings relative to potential landslide movements and coastal erosion, and discusses how conclusions were reached to determine whether or not onshore landslide failures extend offshore.

  5. The Role of the Computer in Education. Proceedings of the Annual Meeting (6th, Arlington Heights, Illinois, February 12-14, 1986).

    ERIC Educational Resources Information Center

    Micro-Ideas, Glenview, IL.

    Fifty-five papers focusing on the role of computer technology in education at all levels are included in the proceedings of this conference, which was designed to model effective and appropriate uses of the computer as an extension of the teacher-based instructional system. The use of the computer as a tool was emphasized, and the word processor…

  6. Projecting Grammatical Features in Nominals: Cognitive Processing Theory & Computational Implementation

    DTIC Science & Technology

    2010-03-01

    functionality and plausibility distinguishes this research from most research in computational linguistics and computational psycholinguistics . The... Psycholinguistic Theory There is extensive psycholinguistic evidence that human language processing is essentially incremental and interactive...challenges of psycholinguistic research is to explain how humans can process language effortlessly and accurately given the complexity and ambiguity that is

  7. Factors Affecting Career Choice: Comparison between Students from Computer and Other Disciplines

    ERIC Educational Resources Information Center

    Alexander, P. M.; Holmner, M.; Lotriet, H. H.; Matthee, M. C.; Pieterse, H. V.; Naidoo, S.; Twinomurinzi, H.; Jordaan, D.

    2011-01-01

    The number of student enrolments in computer-related courses remains a serious concern worldwide with far reaching consequences. This paper reports on an extensive survey about career choice and associated motivational factors amongst new students, only some of whom intend to major in computer-related courses, at two South African universities.…

  8. Modular space station, phase B extension. Information management advanced development. Volume 5: Software assembly

    NASA Technical Reports Server (NTRS)

    Gerber, C. R.

    1972-01-01

    The development of uniform computer program standards and conventions for the modular space station is discussed. The accomplishments analyzed are: (1) development of computer program specification hierarchy, (2) definition of computer program development plan, and (3) recommendations for utilization of all operating on-board space station related data processing facilities.

  9. Relationship between movement time and hip moment impulse in the sagittal plane during sit-to-stand movement: a combined experimental and computer simulation study.

    PubMed

    Inai, Takuma; Takabayashi, Tomoya; Edama, Mutsuaki; Kubo, Masayoshi

    2018-04-27

The association between repetitive hip moment impulse and the progression of hip osteoarthritis is a recently recognized area of study. The sit-to-stand movement is essential for daily life and requires a hip extension moment. Although a change in the sit-to-stand movement time may influence the hip moment impulse in the sagittal plane, this effect has not been examined. The purpose of this study was to clarify the relationship between sit-to-stand movement time and hip moment impulse in the sagittal plane. Twenty subjects performed the sit-to-stand movement at a self-selected natural speed. The hip, knee, and ankle joint angles obtained from experimental trials were used to perform two computer simulations. In the first simulation, the actual sit-to-stand movement time obtained from the experiment was entered. In the second simulation, sit-to-stand movement times ranging from 0.5 to 4.0 s at intervals of 0.25 s were entered. Hip joint moments and hip moment impulses in the sagittal plane during sit-to-stand movements were calculated for both computer simulations. The reliability of the simulation model was confirmed, as indicated by the similarities in the hip joint moment waveforms (r = 0.99) and the hip moment impulses in the sagittal plane between the first computer simulation and the experiment. In the second computer simulation, the hip moment impulse in the sagittal plane decreased with a decrease in the sit-to-stand movement time, although the peak hip extension moment increased with a decrease in the movement time. These findings clarify the association between the sit-to-stand movement time and hip moment impulse in the sagittal plane and may contribute to the prevention of the progression of hip osteoarthritis.
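The study's central quantity, the hip moment impulse, is the time integral of the hip extension moment over the movement. A toy numeric sketch of the second simulation's finding follows; the moment model (a gravitational term plus an inertial term scaling as 1/T²) is a deliberately simplified assumption, not the paper's full-body simulation.

```python
import math

# Toy illustration: for a fixed sit-to-stand motion, shortening the movement
# time T lowers the sagittal hip moment impulse (integral of moment over time)
# even though the peak moment rises. The moment model is an assumption.

def hip_moment(t, T, m_grav=60.0, c_inertia=8.0):
    """Hypothetical hip extension moment (N*m) at time t of a movement of duration T."""
    phase = math.pi * t / T
    return m_grav * math.sin(phase) + (c_inertia / T**2) * math.sin(2 * phase) ** 2

def impulse(T, n=1000):
    """Trapezoidal integral of the moment over the movement (N*m*s)."""
    dt = T / n
    total = 0.0
    for i in range(n):
        t0, t1 = i * dt, (i + 1) * dt
        total += 0.5 * (hip_moment(t0, T) + hip_moment(t1, T)) * dt
    return total

fast, slow = impulse(0.5), impulse(4.0)
print(fast < slow)  # shorter movement time -> smaller impulse
```

In this sketch the gravitational term contributes roughly in proportion to T, so stretching the movement raises the impulse even as the inertial peak shrinks, mirroring the trade-off reported in the abstract.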

  10. Effect of Endocrown Pulp Chamber Extension Depth on Molar Fracture Resistance.

    PubMed

    Hayes, A; Duvall, N; Wajdowicz, M; Roberts, H

The purpose of this study was to evaluate the effect of endocrown pulp chamber extension on mandibular molar fracture resistance. A total of 36 recently extracted mandibular third molars of approximately equal size were sectioned at the facial-lingual height of contour, followed by endodontic access into the pulp chamber. The specimens were then randomly divided into three groups (n=12) and the pulpal and root canal contents removed. Pulp chamber floors were established at 2, 3, and 4 mm from the occlusal table using a three-step etch-and-rinse adhesive and a flowable resin composite. The prepared specimens were then embedded in auto-polymerizing denture base resin, with the surface area available for adhesive bonding determined using a digital recording microscope. Specimens were restored using a standardized template with a chairside computer-aided design/computer-aided manufacturing unit, with the endocrown milled from a lithium disilicate glass-ceramic material. Restoration parameters of occlusal table anatomy and thickness were standardized, with the only parameter difference being the pulp chamber extension depth. The endocrown restorations were luted with a self-adhesive resin luting agent and tested to failure after 24 hours on a universal testing machine, with force applied to the facial cusps at a 45° angle to the long axis of the tooth. The failure load was converted into stress for each specimen using the available surface area for bonding. Mean failure load and stress among the three groups were first subjected to the Shapiro-Wilk and Bartlett tests and then analyzed with an analysis of variance with the Tukey post hoc test at a 95% confidence level (p=0.05). The 2- and 4-mm chamber extension groups demonstrated the highest fracture resistance stress, with the 3-mm group similar to the 2-mm group. The 3- and 4-mm chamber extension group specimens demonstrated nearly universal catastrophic tooth fracture, whereas half the 2-mm chamber extension group displayed nonrestorable root fractures. Under the conditions of this study, mandibular molars restored with the endocrown technique with 2- and 4-mm pulp chamber extensions displayed greater tooth fracture resistance force as well as stress. All groups demonstrated a high number of catastrophic fractures, but these results may not be clinically significant because the fracture force results are higher than normal reported values of masticatory function.

  11. Integrated geophysical and geological study of the tectonic framework of the 38th Parallel Lineament in the vicinity of its intersection with the extension of the New Madrid Fault Zone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Braile, L.W.; Hinze, W.J.; Keller, G.R.

    1978-06-01

Extensive gravity and aeromagnetic surveys have been conducted in critical areas of Kentucky, Illinois, and Indiana centering around the intersection of the 38th Parallel Lineament and the extension of the New Madrid Fault Zone. Available aeromagnetic maps have been digitized and these data have been processed by a suite of computer programs developed for this purpose. Seismic equipment has been prepared for crustal seismic studies and a 150 km long seismic refraction line has been observed along the Wabash River Valley Fault System. Preliminary basement rock and configuration maps have been prepared based on studies of the samples derived from basement drill holes. Interpretation of these data is at a preliminary stage, but studies to date indicate that the 38th Parallel Lineament features extend as far north as 39°N and that a subtle northeasterly striking magnetic and gravity anomaly cuts across Indiana from the southwest corner of the state, roughly on strike with the New Madrid Seismic Zone.

  12. ELAV Links Paused Pol II to Alternative Polyadenylation in the Drosophila Nervous System

    PubMed Central

    Oktaba, Katarzyna; Zhang, Wei; Lotz, Thea Sabrina; Jun, David Jayhyun; Lemke, Sandra Beatrice; Ng, Samuel Pak; Esposito, Emilia; Levine, Michael; Hilgers, Valérie

    2014-01-01

Alternative polyadenylation (APA) has been implicated in a variety of developmental and disease processes. A particularly dramatic form of APA occurs in the developing nervous system of flies and mammals, whereby various developmental genes undergo coordinate 3′ UTR extension. In Drosophila, the RNA-binding protein ELAV inhibits RNA processing at proximal polyadenylation sites, thereby fostering the formation of exceptionally long 3′ UTRs. Here, we present evidence that paused Pol II promotes recruitment of ELAV to extended genes. Replacing promoters of extended genes with heterologous promoters blocks normal 3′ extension in the nervous system, while extension-associated promoters can induce 3′ extension in ectopic tissues expressing ELAV. Computational analyses suggest that promoter regions of extended genes tend to contain paused Pol II and associated cis-regulatory elements such as GAGA. ChIP-Seq assays identify ELAV in the promoter regions of extended genes. Our study provides evidence for a regulatory link between promoter-proximal pausing and APA. PMID:25544561

  13. Extension of nanoconfined DNA: Quantitative comparison between experiment and theory

    NASA Astrophysics Data System (ADS)

    Iarko, V.; Werner, E.; Nyberg, L. K.; Müller, V.; Fritzsche, J.; Ambjörnsson, T.; Beech, J. P.; Tegenfeldt, J. O.; Mehlig, K.; Westerlund, F.; Mehlig, B.

    2015-12-01

    The extension of DNA confined to nanochannels has been studied intensively and in detail. However, quantitative comparisons between experiments and model calculations are difficult because most theoretical predictions involve undetermined prefactors, and because the model parameters (contour length, Kuhn length, effective width) are difficult to compute reliably, leading to substantial uncertainties. Here we use a recent asymptotically exact theory for the DNA extension in the "extended de Gennes regime" that allows us to compare experimental results with theory. For this purpose, we performed experiments measuring the mean DNA extension and its standard deviation while varying the channel geometry, dye intercalation ratio, and ionic strength of the buffer. The experimental results agree very well with theory at high ionic strengths, indicating that the model parameters are reliable. At low ionic strengths, the agreement is less good. We discuss possible reasons. In principle, our approach allows us to measure the Kuhn length and the effective width of a single DNA molecule and more generally of semiflexible polymers in solution.

  14. Extension of research data repository system to support direct compute access to biomedical datasets: enhancing Dataverse to support large datasets.

    PubMed

    McKinney, Bill; Meyer, Peter A; Crosas, Mercè; Sliz, Piotr

    2017-01-01

Access to experimental X-ray diffraction image data is important for validation and reproduction of macromolecular models and indispensable for the development of structural biology processing methods. In response to the evolving needs of the structural biology community, we recently established a diffraction data publication system, the Structural Biology Data Grid (SBDG, data.sbgrid.org), to preserve primary experimental datasets supporting scientific publications. All datasets published through the SBDG are freely available to the research community under a public domain dedication license, with metadata compliant with the DataCite Schema (schema.datacite.org). A proof-of-concept study demonstrated community interest and utility. Publication of large datasets is a challenge shared by several fields, and the SBDG has begun collaborating with the Institute for Quantitative Social Science at Harvard University to extend the Dataverse (dataverse.org) open-source data repository system to structural biology datasets. Several extensions are necessary to support the size and metadata requirements for structural biology datasets. In this paper, we describe one such extension, functionality supporting preservation of file system structure within Dataverse, which is essential for both in-place computation and supporting non-HTTP data transfers. © 2016 New York Academy of Sciences.

  15. Extension, validation and application of the NASCAP code

    NASA Technical Reports Server (NTRS)

    Katz, I.; Cassidy, J. J., III; Mandell, M. J.; Schnuelle, G. W.; Steen, P. G.; Parks, D. E.; Rotenberg, M.; Alexander, J. H.

    1979-01-01

    Numerous extensions were made in the NASCAP code. They fall into three categories: a greater range of definable objects, a more sophisticated computational model, and simplified code structure and usage. An important validation of NASCAP was performed using a new two dimensional computer code (TWOD). An interactive code (MATCHG) was written to compare material parameter inputs with charging results. The first major application of NASCAP was performed on the SCATHA satellite. Shadowing and charging calculation were completed. NASCAP was installed at the Air Force Geophysics Laboratory, where researchers plan to use it to interpret SCATHA data.

  16. Lightweight computational steering of very large scale molecular dynamics simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beazley, D.M.; Lomdahl, P.S.

    1996-09-01

We present a computational steering approach for controlling, analyzing, and visualizing very large scale molecular dynamics simulations involving tens to hundreds of millions of atoms. Our approach relies on extensible scripting languages and an easy to use tool for building extensions and modules. The system is extremely easy to modify, works with existing C code, is memory efficient, and can be used from inexpensive workstations and networks. We demonstrate how we have used this system to manipulate data from production MD simulations involving as many as 104 million atoms running on the CM-5 and Cray T3D. We also show how this approach can be used to build systems that integrate common scripting languages (including Tcl/Tk, Perl, and Python), simulation code, user extensions, and commercial data analysis packages.

  17. Putting Fun Back into Learning.

    ERIC Educational Resources Information Center

    Rao, Srikumar S.

    1995-01-01

    People will learn better if they like what they are learning. Computers offer an extensive library of cases, examples, and stories that are easy to access, fun to work through, and tell students what they want to know. One example is the ASK system, a 15-module, self-study, multimedia program that is fun for trainees to use, which should enhance…

  18. Providing Graduated Corrective Feedback in an Intelligent Computer-Assisted Language Learning Environment

    ERIC Educational Resources Information Center

    Ai, Haiyang

    2017-01-01

    Corrective feedback (CF), a response to linguistic errors made by second language (L2) learners, has received extensive scholarly attention in second language acquisition. While much of the previous research in the field has focused on whether CF facilitates or impedes L2 development, few studies have examined the efficacy of gradually modifying…

  19. Artificial Intelligence and Its Importance in Education.

    ERIC Educational Resources Information Center

    Tilmann, Martha J.

    Artificial intelligence, or the study of ideas that enable computers to be intelligent, is discussed in terms of what it is, what it has done, what it can do, and how it may affect the teaching of tomorrow. An extensive overview of artificial intelligence examines its goals and applications and types of artificial intelligence including (1) expert…

  20. 75 FR 28782 - Extension of Period for Nominations to the National Medal of Technology and Innovation Nomination...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-24

    ... innovation and/or be familiar with the education, training, employment and management of technological... Innovations/Bioengineering and Biomedical Technology; Technology Management/Computing/IT/Manufacturing...] Extension of Period for Nominations to the National Medal of Technology and Innovation Nomination Evaluation...

  1. Teaching XBRL to Graduate Business Students: A Hands-On Approach

    ERIC Educational Resources Information Center

    Pinsker, Robert

    2004-01-01

    EXtensible Business Reporting Language (XBRL) is a non-proprietary, computer language that has many uses. Known primarily as the Extensible Markup Language (XML) for business reporting, XBRL allows entities to report their business information (i.e., financial statements, announcements, etc.) on the Internet and communicate with other entities'…

  2. Confined Turbulent Swirling Recirculating Flow Predictions. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Abujelala, M. T.

    1984-01-01

Turbulent swirling flow, the STARPIC computer code, turbulence modeling of turbulent flows, the k-ε turbulence model and extensions, deduction of turbulence parameters from swirling confined flow measurements, extension of the k-ε model to confined swirling recirculating flows, and general predictions for confined turbulent swirling flow are discussed.

  3. CBES--An Efficient Implementation of the Coursewriter Language.

    ERIC Educational Resources Information Center

    Franks, Edward W.

    An extensive computer based education system (CBES) built around the IBM Coursewriter III program product at Ohio State University is described. In this system, numerous extensions have been added to the Coursewriter III language to provide capabilities needed to implement sophisticated instructional strategies. CBES design goals include lower CPU…

  4. [Development of computer aided forming techniques in manufacturing scaffolds for bone tissue engineering].

    PubMed

    Wei, Xuelei; Dong, Fuhui

    2011-12-01

To review recent advances in the research and application of computer aided forming techniques for constructing bone tissue engineering scaffolds. The literature concerning computer aided forming techniques for constructing bone tissue engineering scaffolds in recent years was reviewed extensively and summarized. Several studies over the last decade have focused on computer aided forming techniques for bone scaffold construction using various scaffold materials, based on computer aided design (CAD) and bone scaffold rapid prototyping (RP). CAD includes medical CAD, STL, and reverse design. Reverse design can fully simulate normal bone tissue and could be very useful for CAD. RP techniques include fused deposition modeling, three dimensional printing, selected laser sintering, three dimensional bioplotting, and low-temperature deposition manufacturing. These techniques provide a new way to construct bone tissue engineering scaffolds with complex internal structures. With the rapid development of molding and forming techniques, computer aided forming techniques are expected to provide ideal bone tissue engineering scaffolds.

  5. Computationally intensive econometrics using a distributed matrix-programming language.

    PubMed

    Doornik, Jurgen A; Hendry, David F; Shephard, Neil

    2002-06-15

    This paper reviews the need for powerful computing facilities in econometrics, focusing on concrete problems which arise in financial economics and in macroeconomics. We argue that the profession is being held back by the lack of easy-to-use generic software which is able to exploit the availability of cheap clusters of distributed computers. Our response is to extend, in a number of directions, the well-known matrix-programming interpreted language Ox developed by the first author. We note three possible levels of extensions: (i) Ox with parallelization explicit in the Ox code; (ii) Ox with a parallelized run-time library; and (iii) Ox with a parallelized interpreter. This paper studies and implements the first case, emphasizing the need for deterministic computing in science. We give examples in the context of financial economics and time-series modelling.
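Level (i), parallelization explicit in the user's code, only serves "deterministic computing in science" if the result is independent of how work is scheduled. A minimal sketch of that property, using Python purely for illustration (the paper extends the Ox language, not Python):

```python
import random

# Sketch of explicit, deterministic parallelization (level (i)): each worker
# chunk gets its own seed derived from the chunk index, so a Monte Carlo
# estimate is identical no matter how chunks are scheduled or ordered.

def chunk_estimate(chunk_id, n):
    """One worker's partial hit count for a pi estimate, seeded by chunk id."""
    rng = random.Random(1234 + chunk_id)  # deterministic per-chunk stream
    return sum(1 for _ in range(n) if rng.random() ** 2 + rng.random() ** 2 < 1.0)

def parallel_pi(chunks, n_per_chunk, order=None):
    """Combine chunk results; 'order' mimics an arbitrary scheduler."""
    ids = list(range(chunks)) if order is None else order
    hits = sum(chunk_estimate(i, n_per_chunk) for i in ids)
    return 4.0 * hits / (chunks * n_per_chunk)

a = parallel_pi(8, 10_000)
b = parallel_pi(8, 10_000, order=[7, 3, 0, 5, 1, 6, 2, 4])
print(a == b)  # same answer under any schedule
```

Per-chunk seeding plus an order-insensitive reduction (here, integer summation) is what makes the explicit-parallel simulation reproducible across cluster configurations.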

  6. Examining Trust, Forgiveness and Regret as Computational Concepts

    NASA Astrophysics Data System (ADS)

    Marsh, Stephen; Briggs, Pamela

    The study of trust has advanced tremendously in recent years, to the extent that the goal of a more unified formalisation of the concept is becoming feasible. To that end, we have begun to examine the closely related concepts of regret and forgiveness and their relationship to trust and its siblings. The resultant formalisation allows computational tractability in, for instance, artificial agents. Moreover, regret and forgiveness, when allied to trust, are very powerful tools in the Ambient Intelligence (AmI) security area, especially where Human Computer Interaction and concrete human understanding are key. This paper introduces the concepts of regret and forgiveness, exploring them from social psychological as well as a computational viewpoint, and presents an extension to Marsh's original trust formalisation that takes them into account. It discusses and explores work in the AmI environment, and further potential applications.

  7. Correlation between Preoperative High Resolution Computed Tomography (CT) Findings with Surgical Findings in Chronic Otitis Media (COM) Squamosal Type.

    PubMed

    Karki, S; Pokharel, M; Suwal, S; Poudel, R

Background The exact role of high resolution computed tomography (HRCT) of the temporal bone in preoperative assessment of chronic suppurative otitis media atticoantral disease still remains controversial. Objective To evaluate the role of high resolution computed tomography of the temporal bone in chronic suppurative otitis media atticoantral disease and to compare preoperative computed tomographic findings with intra-operative findings. Method Prospective, analytical study conducted among 65 patients with chronic suppurative otitis media atticoantral disease in the Department of Radiodiagnosis, Kathmandu University Dhulikhel Hospital, between January 2015 and July 2016. The operative findings were compared with the results of imaging. The parameters of comparison were erosion of the ossicles, scutum, facial canal, lateral semicircular canal, sigmoid and tegmen plate, along with extension of disease to the sinus tympani and facial recess. Sensitivity, specificity, negative predictive value, and positive predictive value were calculated. Result HRCT of the temporal bone offered sensitivity (Se) and specificity (Sp) of 100% for visualization of sigmoid and tegmen plate erosion. The performance of HRCT in detecting malleus (Se=100%, Sp=95.23%), incus (Se=100%, Sp=80.48%), and stapes (Se=96.55%, Sp=71.42%) erosion was excellent. It offered precise information about facial canal erosion (Se=100%, Sp=75%), scutum erosion (Se=100%, Sp=96.87%), and extension of disease to the facial recess and sinus tympani (Se=83.33%, Sp=100%). HRCT showed a specificity of 100% for lateral semicircular canal erosion but with low sensitivity (Se=53.84%). Conclusion The findings of high resolution computed tomography and intra-operative findings were well comparable except for lateral semicircular canal erosion. HRCT of the temporal bone acts as a road map for the surgeon to identify the extent of disease, plan the appropriate procedure, and prepare for potential complications that can be encountered during surgery.

  8. Computer Model Helps Communities Gauge Effects of New Industry.

    ERIC Educational Resources Information Center

    Long, Celeste; And Others

    1987-01-01

    Describes computer Industrial Impact Model used by Texas Agricultural Extension Service rural planners to assess potential benefits and costs of new firms on community private and public sectors. Presents selected data/results for two communities assessing impact of the same plant. (NEC)

  9. 78 FR 57839 - Request for Information on Computer Security Incident Coordination (CSIC)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-20

    ... Institute of Standards and Technology (NIST), United States Department of Commerce. ACTION: Notice, extension of comment period. SUMMARY: NIST is extending the deadline for submitting comments relating to Computer Security Incident Coordination. NIST experienced technical difficulties with receiving email...

  10. Ecological Footprint Analysis (EFA) for the Chicago Metropolitan Area: Initial Estimation - slides

    EPA Science Inventory

    Because of its computational simplicity, Ecological Footprint Analysis (EFA) has been extensively deployed for assessing the sustainability of various environmental systems. In general, EFA aims at capturing the impacts of human activity on the environment by computing the amount...

  11. COMPUTATIONAL TOXICOLOGY - OBJECTIVE 2: DEVELOPING APPROACHES FOR PRIORITIZING CHEMICALS FOR SUBSEQUENT SCREENING AND TESTING

    EPA Science Inventory

    One of the strategic objectives of the Computational Toxicology Program is to develop approaches for prioritizing chemicals for subsequent screening and testing. Approaches currently available for this process require extensive resources. Therefore, less costly and time-extensi...

  12. Predicting chaos for infinite dimensional dynamical systems: The Kuramoto-Sivashinsky equation, a case study

    NASA Technical Reports Server (NTRS)

    Smyrlis, Yiorgos S.; Papageorgiou, Demetrios T.

    1991-01-01

    The results of extensive computations are presented in order to accurately characterize transitions to chaos for the Kuramoto-Sivashinsky equation. In particular, the oscillatory dynamics in a window that supports a complete sequence of period doubling bifurcations preceding chaos is followed. As many as thirteen period doublings are followed and used to compute the Feigenbaum number for the cascade and so enable, for the first time, an accurate numerical evaluation of the theory of universal behavior of nonlinear systems, for an infinite dimensional dynamical system. Furthermore, the dynamics at the threshold of chaos exhibit a fractal behavior which is demonstrated and used to compute a universal scaling factor that enables the self-similar continuation of the solution into a chaotic regime.
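
    The Feigenbaum number mentioned above is universal across period-doubling cascades, so it can be illustrated far more cheaply than with the Kuramoto-Sivashinsky PDE. A minimal sketch (using the logistic map and its approximate, well-known bifurcation parameters, not the paper's computation):

```python
# Estimate the Feigenbaum constant delta from successive period-doubling
# bifurcation parameters. The values below are approximate, well-known
# bifurcation points of the logistic map x -> r*x*(1-x), standing in for
# the far more expensive PDE computation described in the abstract.
r = [3.0, 3.449490, 3.544090, 3.564407, 3.568759]

# delta_n = (r_n - r_{n-1}) / (r_{n+1} - r_n); the ratios converge to
# Feigenbaum's constant ~4.6692 as n grows.
estimates = [(r[i] - r[i - 1]) / (r[i + 1] - r[i]) for i in range(1, len(r) - 1)]
```

The later ratios in `estimates` approach 4.6692; the paper follows up to thirteen doublings of the PDE dynamics to achieve an accurate value.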

  13. Payload/orbiter contamination control requirement study, volume 2, exhibit A

    NASA Technical Reports Server (NTRS)

    Bareiss, L. E.; Hooper, V. W.; Rantanen, R. O.; Ress, E. B.

    1974-01-01

    The computer printout data generated during the Payload/Orbiter Contamination Control Requirement Study are presented. The computer listings of the input surface data matrices, the viewfactor data matrices, and the geometric relationship data matrices for the three orbiter/spacelab configurations analyzed in this study are given. These configurations have been broken up into the geometrical surfaces and nodes necessary to define the principal critical surfaces whether they are contaminant sources, experimental surfaces, or operational surfaces. A numbering scheme was established based upon nodal numbers that relates the various spacelab surfaces to a specific surface material or function. This numbering system was developed for the spacelab configurations such that future extension to a surface mapping capability could be developed as required.

  14. Sorting by Cuts, Joins, and Whole Chromosome Duplications.

    PubMed

    Zeira, Ron; Shamir, Ron

    2017-02-01

    Genome rearrangement problems have been extensively studied due to their importance in biology. Most studied models assumed a single copy per gene. However, in reality, duplicated genes are common, most notably in cancer. In this study, we make a step toward handling duplicated genes by considering a model that allows the atomic operations of cut, join, and whole chromosome duplication. Given two linear genomes, [Formula: see text] with one copy per gene and [Formula: see text] with two copies per gene, we give a linear time algorithm for computing a shortest sequence of operations transforming [Formula: see text] into [Formula: see text] such that all intermediate genomes are linear. We also show that computing an optimal sequence with fewest duplications is NP-hard.

  15. A Monte Carlo study of Weibull reliability analysis for space shuttle main engine components

    NASA Technical Reports Server (NTRS)

    Abernethy, K.

    1986-01-01

    The incorporation of a number of additional capabilities into an existing Weibull analysis computer program and the results of a Monte Carlo simulation study to evaluate the usefulness of the Weibull methods for samples with a very small number of failures and extensive censoring are discussed. Since the censoring mechanism inherent in the Space Shuttle Main Engine (SSME) data is hard to analyze, it was decided to use a random censoring model, generating censoring times from a uniform probability distribution. Some of the statistical techniques and computer programs used in the SSME Weibull analysis are described. The documented methods were supplemented by adding computer calculations of approximate confidence intervals (using iterative methods) for several parameters of interest. These calculations are based on a likelihood ratio statistic which is asymptotically a chi-squared statistic with one degree of freedom. The assumptions built into the computer simulations are described, along with the simulation program and the techniques used in it. Simulation results are tabulated for various combinations of Weibull shape parameters and numbers of failures in the samples.
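
    The random censoring model described above can be sketched directly: draw a Weibull failure time and an independent uniform censoring time for each unit, and record whichever comes first. This is a generic illustration (function name and parameter values are hypothetical, not taken from the SSME program):

```python
import random

random.seed(2718)  # reproducible sketch

def censored_weibull_sample(n, shape, scale, censor_max):
    """Draw n failure times from Weibull(shape, scale) and censor each with
    an independent Uniform(0, censor_max) censoring time, as in the random
    censoring model described in the abstract. Returns (time, observed)
    pairs; observed=False means the unit was censored before failing."""
    sample = []
    for _ in range(n):
        t = random.weibullvariate(scale, shape)  # true failure time
        c = random.uniform(0.0, censor_max)      # random censoring time
        sample.append((min(t, c), t <= c))
    return sample

data = censored_weibull_sample(1000, shape=1.5, scale=1.0, censor_max=2.0)
n_failures = sum(1 for _, observed in data if observed)
```

Shrinking `censor_max` reproduces the heavily censored, few-failure regime the study evaluates.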

  16. Transient Three-Dimensional Side Load Analysis of Out-of-Round Film Cooled Nozzles

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Lin, Jeff; Ruf, Joe; Guidos, Mike

    2010-01-01

    The objective of this study is to investigate the effect of nozzle out-of-roundness on the transient startup side loads. The out-of-roundness could be the result of asymmetric loads induced by hardware attached to the nozzle, asymmetric internal stresses induced by previous tests and/or deformation, such as creep, from previous tests. The rocket engine studied encompasses a regeneratively cooled thrust chamber and a film cooled nozzle extension with film coolant distributed from a turbine exhaust manifold. The computational methodology is based on an unstructured-grid, pressure-based computational fluid dynamics formulation, and a transient inlet history based on an engine system simulation. Transient startup computations were performed with the out-of-roundness achieved by four degrees of ovalization of the nozzle: one perfectly round, one slightly out-of-round, one more out-of-round, and one significantly out-of-round. The computed side load physics caused by the nozzle out-of-roundness and its effect on nozzle side load are reported and discussed.

  17. In Silico Augmentation of the Drug Development Pipeline: Examples from the study of Acute Inflammation.

    PubMed

    An, Gary; Bartels, John; Vodovotz, Yoram

    2011-03-01

    The clinical translation of promising basic biomedical findings, whether derived from reductionist studies in academic laboratories or as the product of extensive high-throughput and -content screens in the biotechnology and pharmaceutical industries, has reached a period of stagnation in which ever higher research and development costs are yielding ever fewer new drugs. Systems biology and computational modeling have been touted as potential avenues by which to break through this logjam. However, few mechanistic computational approaches are utilized in a manner that is fully cognizant of the inherent clinical realities in which the drugs developed through this ostensibly rational process will be ultimately used. In this article, we present a Translational Systems Biology approach to inflammation. This approach is based on the use of mechanistic computational modeling centered on inherent clinical applicability, namely that a unified suite of models can be applied to generate in silico clinical trials, individualized computational models as tools for personalized medicine, and rational drug and device design based on disease mechanism.

  18. Using Predictability for Lexical Segmentation.

    PubMed

    Çöltekin, Çağrı

    2017-09-01

    This study investigates a strategy based on predictability of consecutive sub-lexical units in learning to segment a continuous speech stream into lexical units using computational modeling and simulations. Lexical segmentation is one of the early challenges during language acquisition, and it has been studied extensively through psycholinguistic experiments as well as computational methods. However, despite strong empirical evidence, the explicit use of predictability of basic sub-lexical units in models of segmentation is underexplored. This paper presents an incremental computational model of lexical segmentation for exploring the usefulness of predictability for lexical segmentation. We show that the predictability cue is a strong cue for segmentation. Contrary to earlier reports in the literature, the strategy yields state-of-the-art segmentation performance with an incremental computational model that uses only this particular cue in a cognitively plausible setting. The paper also reports an in-depth analysis of the model, investigating the conditions affecting the usefulness of the strategy. Copyright © 2016 Cognitive Science Society, Inc.

  19. A study of real-time computer graphic display technology for aeronautical applications

    NASA Technical Reports Server (NTRS)

    Rajala, S. A.

    1981-01-01

    The development, simulation, and testing of an algorithm for anti-aliasing vector drawings is discussed. The pseudo anti-aliasing line drawing algorithm is an extension to Bresenham's algorithm for computer control of a digital plotter. The algorithm produces a series of overlapping line segments where the display intensity shifts from one segment to the other in this overlap (transition region). In this algorithm the length of the overlap and the intensity shift are essentially constants because the transition region is an aid to the eye in integrating the segments into a single smooth line.
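
    For reference, the classic integer Bresenham line algorithm that the pseudo anti-aliasing algorithm extends can be sketched as follows (this is the standard textbook version, not the paper's overlapping-segment variant):

```python
def bresenham(x0, y0, x1, y1):
    """Classic all-octant Bresenham line: integer arithmetic only,
    stepping one pixel at a time along the dominant axis."""
    dx = abs(x1 - x0)
    dy = -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx + dy  # accumulated error term
    points = []
    while True:
        points.append((x0, y0))
        if x0 == x1 and y0 == y1:
            break
        e2 = 2 * err
        if e2 >= dy:
            err += dy
            x0 += sx
        if e2 <= dx:
            err += dx
            y0 += sy
    return points

pts = bresenham(0, 0, 5, 3)
```

The anti-aliasing extension replaces each hard single-pixel step with a short overlap region in which intensity shifts gradually between adjacent segments.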

  20. Optimized feature-detection for on-board vision-based surveillance

    NASA Astrophysics Data System (ADS)

    Gond, Laetitia; Monnin, David; Schneider, Armin

    2012-06-01

    The detection and matching of robust features in images is an important step in many computer vision applications. In this paper, the importance of the keypoint detection algorithms and their inherent parameters in the particular context of an image-based change detection system for IED detection is studied. Through extensive application-oriented experiments, we draw an evaluation and comparison of the most popular feature detectors proposed by the computer vision community. We analyze how to automatically adjust these algorithms to changing imaging conditions and suggest improvements in order to achieve more flexibility and robustness in their practical implementation.

  1. A computer technique for detailed analysis of mission radius and maneuverability characteristics of fighter aircraft

    NASA Technical Reports Server (NTRS)

    Foss, W. E., Jr.

    1981-01-01

    A computer technique to determine the mission radius and maneuverability characteristics of combat aircraft was developed. The technique was used to determine critical operational requirements and the areas in which research programs would be expected to yield the most beneficial results. In turn, the results of research efforts were evaluated in terms of aircraft performance on selected mission segments and for complete mission profiles. Extensive use of the technique in evaluation studies indicates that the calculated performance is essentially the same as that obtained by the proprietary programs in use throughout the aircraft industry.

  2. REVEAL: An Extensible Reduced Order Model Builder for Simulation and Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agarwal, Khushbu; Sharma, Poorva; Ma, Jinliang

    2013-04-30

    Many science domains need to build computationally efficient and accurate representations of high fidelity, computationally expensive simulations. These computationally efficient versions are known as reduced-order models. This paper presents the design and implementation of a novel reduced-order model (ROM) builder, the REVEAL toolset. This toolset generates ROMs based on science- and engineering-domain specific simulations executed on high performance computing (HPC) platforms. The toolset encompasses a range of sampling and regression methods that can be used to generate a ROM, automatically quantifies the ROM accuracy, and provides support for an iterative approach to improve ROM accuracy. REVEAL is designed to be extensible in order to utilize the core functionality with any simulator that has published input and output formats. It also defines programmatic interfaces to include new sampling and regression techniques so that users can ‘mix and match’ mathematical techniques to best suit the characteristics of their model. In this paper, we describe the architecture of REVEAL and demonstrate its usage with a computational fluid dynamics model used in carbon capture.

  3. Robust efficient video fingerprinting

    NASA Astrophysics Data System (ADS)

    Puri, Manika; Lubin, Jeffrey

    2009-02-01

    We have developed a video fingerprinting system with robustness and efficiency as the primary and secondary design criteria. In extensive testing, the system has shown robustness to cropping, letter-boxing, sub-titling, blur, drastic compression, frame rate changes, size changes and color changes, as well as to the geometric distortions often associated with camcorder capture in cinema settings. Efficiency is afforded by a novel two-stage detection process in which a fast matching process first computes a number of likely candidates, which are then passed to a second slower process that computes the overall best match with minimal false alarm probability. One key component of the algorithm is a maximally stable volume computation - a three-dimensional generalization of maximally stable extremal regions - that provides a content-centric coordinate system for subsequent hash function computation, independent of any affine transformation or extensive cropping. Other key features include an efficient bin-based polling strategy for initial candidate selection, and a final SIFT feature-based computation for final verification. We describe the algorithm and its performance, and then discuss additional modifications that can provide further improvement to efficiency and accuracy.
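
    The two-stage detection process described above, a fast candidate pass followed by a slower exact pass, can be sketched generically. Here the bin-based polling and SIFT verification of the actual system are replaced by a hash-prefix index and a full Hamming-distance comparison; all names and bit widths are illustrative assumptions:

```python
from collections import defaultdict

PREFIX_BITS = 8  # coarse bucket key: top 8 bits of a 64-bit fingerprint

def build_index(database):
    """Preprocessing: bucket fingerprints by a short prefix so stage 1
    only has to inspect one small bucket per query."""
    index = defaultdict(list)
    for fid, fp in database.items():
        index[fp >> (64 - PREFIX_BITS)].append((fid, fp))
    return index

def match(query, index):
    """Stage 1: cheap candidate lookup by prefix.
       Stage 2: exact Hamming distance over the small candidate set."""
    candidates = index.get(query >> (64 - PREFIX_BITS), [])
    if not candidates:
        return None  # a real system would widen the search here
    return min(candidates, key=lambda item: bin(item[1] ^ query).count("1"))[0]

db = {"clip_a": 0xDEADBEEFCAFEBABE, "clip_b": 0x0123456789ABCDEF}
idx = build_index(db)
```

The point of the split is asymptotic: stage 1 touches only a bucket, so the expensive stage-2 comparison runs over a handful of candidates rather than the whole database.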

  4. METCAN: The metal matrix composite analyzer

    NASA Technical Reports Server (NTRS)

    Hopkins, Dale A.; Murthy, Pappu L. N.

    1988-01-01

    Metal matrix composites (MMC) are the subject of intensive study and are receiving serious consideration for critical structural applications in advanced aerospace systems. MMC structural analysis and design methodologies are studied. Predicting the mechanical and thermal behavior and the structural response of components fabricated from MMC requires the use of a variety of mathematical models. These models relate stresses to applied forces, stress intensities at the tips of cracks to nominal stresses, buckling resistance to applied force, or vibration response to excitation forces. The extensive research in computational mechanics methods for predicting the nonlinear behavior of MMC is described. This research has culminated in the development of the METCAN (METal Matrix Composite ANalyzer) computer code.

  5. AMPS data management concepts. [Atmospheric, Magnetospheric and Plasma in Space experiment

    NASA Technical Reports Server (NTRS)

    Metzelaar, P. N.

    1975-01-01

    Five typical AMPS experiments were formulated to allow simulation studies to verify data management concepts. Design studies were conducted to analyze these experiments in terms of the applicable procedures, data processing and displaying functions. Design concepts for AMPS data management system are presented which permit both automatic repetitive measurement sequences and experimenter-controlled step-by-step procedures. Extensive use is made of a cathode ray tube display, the experimenters' alphanumeric keyboard, and the computer. The types of computer software required by the system and the possible choices of control and display procedures available to the experimenter are described for several examples. An electromagnetic wave transmission experiment illustrates the methods used to analyze data processing requirements.

  6. Hypersonic Research Vehicle (HRV) real-time flight test support feasibility and requirements study. Part 1: Real-time flight experiment support

    NASA Technical Reports Server (NTRS)

    Rediess, Herman A.; Ramnath, Rudrapatna V.; Vrable, Daniel L.; Hirvo, David H.; Mcmillen, Lowell D.; Osofsky, Irving B.

    1991-01-01

    The results are presented of a study to identify potential real time remote computational applications to support monitoring HRV flight test experiments along with definitions of preliminary requirements. A major expansion of the support capability available at Ames-Dryden was considered. The focus is on the use of extensive computation and data bases together with real time flight data to generate and present high level information to those monitoring the flight. Six examples were considered: (1) boundary layer transition location; (2) shock wave position estimation; (3) performance estimation; (4) surface temperature estimation; (5) critical structural stress estimation; and (6) stability estimation.

  7. Shaded-Color Picture Generation of Computer-Defined Arbitrary Shapes

    NASA Technical Reports Server (NTRS)

    Cozzolongo, J. V.; Hermstad, D. L.; Mccoy, D. S.; Clark, J.

    1986-01-01

    SHADE computer program generates realistic color-shaded pictures from computer-defined arbitrary shapes. Objects defined for computer representation displayed as smooth, color-shaded surfaces, including varying degrees of transparency. Results also used for presentation of computational results. By performing color mapping, SHADE colors model surface to display analysis results as pressures, stresses, and temperatures. NASA has used SHADE extensively in the design and analysis of high-performance aircraft. Industry should find applications for SHADE in computer-aided design and computer-aided manufacturing. SHADE written in VAX FORTRAN and MACRO Assembler for either interactive or batch execution.

  8. Dynamics of Numerics & Spurious Behaviors in CFD Computations. Revised

    NASA Technical Reports Server (NTRS)

    Yee, Helen C.; Sweby, Peter K.

    1997-01-01

    The global nonlinear behavior of finite discretizations for constant time steps and fixed or adaptive grid spacings is studied using tools from dynamical systems theory. Detailed analysis of commonly used temporal and spatial discretizations for simple model problems is presented. The role of dynamics in the understanding of long time behavior of numerical integration and the nonlinear stability, convergence, and reliability of using time-marching approaches for obtaining steady-state numerical solutions in computational fluid dynamics (CFD) is explored. The study is complemented with examples of spurious behavior observed in steady and unsteady CFD computations. The CFD examples were chosen to illustrate non-apparent spurious behavior that was difficult to detect without extensive grid and temporal refinement studies and some knowledge from dynamical systems theory. Studies revealed the various possible dangers of misinterpreting numerical simulation of realistic complex flows that are constrained by available computing power. In large scale computations where the physics of the problem under study is not well understood and numerical simulations are the only viable means of solution, extreme care must be taken in both computation and interpretation of the numerical data. The goal of this paper is to explore the important role that dynamical systems theory can play in the understanding of the global nonlinear behavior of numerical algorithms and to aid the identification of the sources of numerical uncertainties in CFD.
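
    A minimal illustration of the spurious behavior discussed (not an example from the paper): explicit Euler applied to the logistic ODE u' = u(1 - u) converges to the true steady state u = 1 for small time steps, but for a large fixed time step the discrete map has a stable spurious oscillation even though the ODE itself has none:

```python
def euler_logistic(u0, dt, steps):
    """Fixed-step explicit Euler for u' = u*(1 - u). The exact ODE flows
    monotonically to the steady state u = 1 from any u0 in (0, 1)."""
    u = u0
    for _ in range(steps):
        u = u + dt * u * (1.0 - u)
    return u

u_small = euler_logistic(0.5, dt=0.5, steps=1000)  # converges to u = 1
u_large = euler_logistic(0.5, dt=2.5, steps=1000)  # spurious bounded oscillation
```

With dt = 2.5 the Euler map is conjugate to a logistic map in its period-doubled regime: the iterates stay bounded and look plausible, yet never approach the true steady state. This is exactly the kind of dynamics-of-numerics artifact the paper warns can be misread as physics.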

  9. Boiler-turbine life extension

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Natzkov, S.; Nikolov, M.

    1995-12-01

    The design life of the main power equipment, boilers and turbines, is about 10^5 working hours. Life extension is possible after normatively regulated control tests. The diagnostics and methodology for remaining-life assessment of boiler and turbine elements are presented, using up-to-date computer programs, destructive and nondestructive control of the metal of key unit elements, and metal creep and low-cycle fatigue calculations. Data on the most common damages and some technical decisions for element life extension are also presented.

  10. Railroads and the Environment : Estimation of Fuel Consumption in Rail Transportation : Volume 3. Comparison of Computer Simulations with Field Measurements

    DOT National Transportation Integrated Search

    1978-09-01

    This report documents comparisons between extensive rail freight service measurements (previously presented in Volume II) and simulations of the same operations using a sophisticated train performance calculator computer program. The comparisons cove...

  11. Assessment of (Computer-Supported) Collaborative Learning

    ERIC Educational Resources Information Center

    Strijbos, J. -W.

    2011-01-01

    Within the (Computer-Supported) Collaborative Learning (CS)CL research community, there has been an extensive dialogue on theories and perspectives on learning from collaboration, approaches to scaffold (script) the collaborative process, and most recently research methodology. In contrast, the issue of assessment of collaborative learning has…

  12. 48 CFR Appendix A to Chapter 2 - Armed Services Board of Contract Appeals

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... EXTENSIONS Rule 33Time, Computation and Extensions EX PARTE COMMUNICATIONS Rule 34Ex parte Communications..., taking into account such factors as the size and complexity of the claim, the contractor may file a... exhibits, post-hearing briefs, and documents which the Board has specifically designated to be made a part...

  13. 48 CFR Appendix A to Chapter 2 - Armed Services Board of Contract Appeals

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... EXTENSIONS Rule 33Time, Computation and Extensions EX PARTE COMMUNICATIONS Rule 34Ex parte Communications..., taking into account such factors as the size and complexity of the claim, the contractor may file a... exhibits, post-hearing briefs, and documents which the Board has specifically designated to be made a part...

  14. 48 CFR Appendix A to Chapter 2 - Armed Services Board of Contract Appeals

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... EXTENSIONS Rule 33Time, Computation and Extensions EX PARTE COMMUNICATIONS Rule 34Ex parte Communications..., taking into account such factors as the size and complexity of the claim, the contractor may file a... exhibits, post-hearing briefs, and documents which the Board has specifically designated to be made a part...

  15. Using R-Project for Free Statistical Analysis in Extension Research

    ERIC Educational Resources Information Center

    Mangiafico, Salvatore S.

    2013-01-01

    One option for Extension professionals wishing to use free statistical software is to use online calculators, which are useful for common, simple analyses. A second option is to use a free computing environment capable of performing statistical analyses, like R-project. R-project is free, cross-platform, powerful, and respected, but may be…

  16. Extending the Binomial Checkpointing Technique for Resilience

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walther, Andrea; Narayanan, Sri Hari Krishna

    In terms of computing time, adjoint methods offer a very attractive alternative to compute gradient information, required, e.g., for optimization purposes. However, together with this very favorable temporal complexity result comes a memory requirement that is in essence proportional to the operation count of the underlying function, e.g., if algorithmic differentiation is used to provide the adjoints. For this reason, checkpointing approaches in many variants have become popular. This paper analyzes an extension of the so-called binomial approach to cover also possible failures of the computing systems. Such a measure of precaution is of special interest for massively parallel simulations and adjoint calculations where the mean time between failures of the large scale computing system is smaller than the time needed to complete the calculation of the adjoint information. We describe the extensions of standard checkpointing approaches required for such resilience, provide a corresponding implementation and discuss numerical results.

  17. Numerical computation of spherical harmonics of arbitrary degree and order by extending exponent of floating point numbers

    NASA Astrophysics Data System (ADS)

    Fukushima, Toshio

    2012-04-01

    By extending the exponent of floating point numbers with an additional integer as the power index of a large radix, we compute fully normalized associated Legendre functions (ALF) by recursion without underflow problem. The new method enables us to evaluate ALFs of extremely high degree as 2^32 = 4,294,967,296, which corresponds to around 1 cm resolution on the Earth's surface. By limiting the application of exponent extension to a few working variables in the recursion, choosing a suitable large power of 2 as the radix, and embedding the contents of the basic arithmetic procedure of floating point numbers with the exponent extension directly in the program computing the recurrence formulas, we achieve the evaluation of ALFs in the double-precision environment at the cost of around 10% increase in computational time per single ALF. This formulation realizes meaningful execution of the spherical harmonic synthesis and/or analysis of arbitrary degree and order.
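
    The extended-exponent idea can be sketched in a few lines: pair a double with an integer power of a large radix, and renormalize after each multiplication so intermediate products never underflow. This is a generic illustration of the representation (names and the specific radix are assumptions, not the paper's implementation):

```python
import math

RADIX = 2.0 ** 60          # a suitable large power of 2, per the abstract
INV_RADIX = 1.0 / RADIX

def normalize(f, e):
    """Keep the significand f in [1/RADIX, RADIX); e counts powers of RADIX."""
    while abs(f) >= RADIX:
        f *= INV_RADIX
        e += 1
    while 0.0 < abs(f) < INV_RADIX:
        f *= RADIX
        e -= 1
    return f, e

def xmul(a, b):
    """Multiply two extended-exponent numbers (f, e) representing f * RADIX**e."""
    return normalize(a[0] * b[0], a[1] + b[1])

# A product of 200 factors of 1e-30 underflows ordinary doubles
# (min normal ~1e-308), but is representable here: true value is 1e-6000.
x = (1.0, 0)
factor = normalize(1e-30, 0)
for _ in range(200):
    x = xmul(x, factor)
log10_x = math.log10(x[0]) + x[1] * math.log10(RADIX)
```

Restricting this machinery to the few working variables of the ALF recursion, as the paper does, keeps the overhead to roughly 10% over plain doubles.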

  18. Fluid Dynamics of Competitive Swimming: A Computational Study

    NASA Astrophysics Data System (ADS)

    Mittal, Rajat; Loebbeck, Alfred; Singh, Hersh; Mark, Russell; Wei, Timothy

    2004-11-01

    The dolphin kick is an important component in competitive swimming and is used extensively by swimmers immediately following the starting dive as well as after turns. In this stroke, the swimmer swims about three feet under the water surface and the stroke is executed by performing an undulating wave-like motion of the body that is quite similar to the anguilliform propulsion mode in fish. Despite the relatively simple kinematics of this stroke, considerable variability in style and performance is observed even among Olympic level swimmers. Motivated by this, a joint experimental-numerical study has been initiated to examine the fluid-dynamics of this stroke. The current presentation will describe the computational portion of this study. The computations employ a sharp interface immersed boundary method (IBM) which allows us to simulate flows with complex moving boundaries on stationary Cartesian grids. 3D body scans of male and female Olympic swimmers have been obtained and these are used in conjunction with high speed videos to recreate a realistic dolphin kick for the IBM solver. Preliminary results from these computations will be presented.

  19. Adaptive DIT-Based Fringe Tracking and Prediction at IOTA

    NASA Technical Reports Server (NTRS)

    Wilson, Edward; Pedretti, Ettore; Bregman, Jesse; Mah, Robert W.; Traub, Wesley A.

    2004-01-01

    An automatic fringe tracking system has been developed and implemented at the Infrared Optical Telescope Array (IOTA). In testing during May 2002, the system successfully minimized the optical path differences (OPDs) for all three baselines at IOTA. Based on sliding window discrete Fourier transform (DFT) calculations that were optimized for computational efficiency and robustness to atmospheric disturbances, the algorithm has also been tested extensively on off-line data. Implemented in ANSI C on the 266 MHz PowerPC processor running the VxWorks real-time operating system, the algorithm runs in approximately 2.0 milliseconds per scan (including all three interferograms), using the science camera and piezo scanners to measure and correct the OPDs. Preliminary analysis on an extension of this algorithm indicates a potential for predictive tracking, although at present, real-time implementation of this extension would require significantly more computational capacity.
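
    A sliding-window DFT of the kind the abstract relies on can be advanced in O(1) per new sample per bin, instead of recomputing the whole window transform. A generic sketch of the standard recurrence (not the IOTA implementation; the test signal is arbitrary):

```python
import cmath

def dft_bin(x, start, N, k):
    """Direct DFT of window x[start:start+N], bin k (used for verification)."""
    return sum(x[start + m] * cmath.exp(-2j * cmath.pi * k * m / N)
               for m in range(N))

def slide(Xk, x_old, x_new, N, k):
    """Sliding-DFT update: advance bin k by one sample in O(1):
       X'_k = (X_k - x_old + x_new) * exp(+2*pi*i*k/N)."""
    return (Xk - x_old + x_new) * cmath.exp(2j * cmath.pi * k / N)

# Slide a 16-sample window along a test signal, tracking bin k = 3.
signal = [((n * 7) % 13) / 13.0 for n in range(64)]  # arbitrary real samples
N, k = 16, 3
Xk = dft_bin(signal, 0, N, k)        # initialize on the first window
for start in range(48):               # slide to the window starting at 48
    Xk = slide(Xk, signal[start], signal[start + N], N, k)
```

This constant per-sample cost is what makes millisecond-scale per-scan updates feasible on a modest embedded processor.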

  20. Wedge sampling for computing clustering coefficients and triangle counts on large graphs

    DOE PAGES

    Seshadhri, C.; Pinar, Ali; Kolda, Tamara G.

    2014-05-08

    Graphs are used to model interactions in a variety of contexts, and there is a growing need to quickly assess the structure of such graphs. Some of the most useful graph metrics are based on triangles, such as those measuring social cohesion. Despite the importance of these triadic measures, algorithms to compute them can be extremely expensive. We discuss the method of wedge sampling. This versatile technique allows for the fast and accurate approximation of various types of clustering coefficients and triangle counts. Furthermore, these techniques are extensible to counting directed triangles in digraphs. Our methods come with provable and practical time-approximation tradeoffs for all computations. We provide extensive results that show our methods are orders of magnitude faster than the state of the art, while providing nearly the accuracy of full enumeration.
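
    The core of wedge sampling for the global clustering coefficient can be sketched briefly: sample a wedge center proportional to its wedge count d(d-1)/2, pick two distinct neighbors uniformly, and count how often the wedge is closed. A minimal sketch (not the paper's optimized implementation):

```python
import random

def clustering_coefficient_wedge(adj, samples, rng):
    """Estimate the global clustering coefficient by wedge sampling:
    the fraction of sampled wedges (paths a-v-b) that are closed,
    i.e. whose endpoints a and b are themselves adjacent."""
    centers = [v for v in adj if len(adj[v]) >= 2]
    weights = [len(adj[v]) * (len(adj[v]) - 1) // 2 for v in centers]
    closed = 0
    for _ in range(samples):
        v = rng.choices(centers, weights=weights)[0]  # center ~ wedge count
        a, b = rng.sample(sorted(adj[v]), 2)          # two distinct neighbors
        closed += b in adj[a]                         # is the wedge closed?
    return closed / samples

# Complete graph K5: every wedge is closed, so the estimate is exactly 1.
k5 = {v: {u for u in range(5) if u != v} for v in range(5)}
est = clustering_coefficient_wedge(k5, samples=500, rng=random.Random(7))
```

Because each sample costs O(1) adjacency checks, the estimate converges at a fixed sample budget regardless of graph size, which is the source of the orders-of-magnitude speedup over full enumeration.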

  1. The Next Frontier in Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sarrao, John

    2016-11-16

    Exascale computing refers to computing systems capable of at least one exaflop, or a billion billion (10^18) calculations per second. That is 50 times faster than the most powerful supercomputers being used today and represents a thousand-fold increase over the first petascale computer that came into operation in 2008. How we use these large-scale simulation resources is the key to solving some of today’s most pressing problems, including clean energy production, nuclear reactor lifetime extension and nuclear stockpile aging.

  2. Early MIMD experience on the CRAY X-MP

    NASA Astrophysics Data System (ADS)

    Rhoades, Clifford E.; Stevens, K. G.

    1985-07-01

    This paper describes some early experience with converting four physics simulation programs to the CRAY X-MP, a current Multiple Instruction, Multiple Data (MIMD) computer consisting of two processors each with an architecture similar to that of the CRAY-1. As a multi-processor, the CRAY X-MP together with the high speed Solid-state Storage Device (SSD) is an ideal machine upon which to study MIMD algorithms for solving the equations of mathematical physics because it is fast enough to run real problems. The computer programs used in this study are all FORTRAN versions of original production codes. They range in sophistication from a one-dimensional numerical simulation of collisionless plasma to a two-dimensional hydrodynamics code with heat flow to a couple of three-dimensional fluid dynamics codes with varying degrees of viscous modeling. Early research with a dual processor configuration has shown speed-ups ranging from 1.55 to 1.98. It has been observed that a few simple extensions to FORTRAN allow a typical programmer to achieve a remarkable level of efficiency. These extensions involve the concept of memory local to a concurrent subprogram and memory common to all concurrent subprograms.

  3. Imaging a non-singular rotating black hole at the center of the Galaxy

    NASA Astrophysics Data System (ADS)

    Lamy, F.; Gourgoulhon, E.; Paumard, T.; Vincent, F. H.

    2018-06-01

    We show that the rotating generalization of Hayward’s non-singular black hole previously studied in the literature is geodesically incomplete, and that its straightforward extension leads to a singular spacetime. We present another extension, which is devoid of any curvature singularity. The obtained metric depends on three parameters and, depending on their values, yields an event horizon or not. These two regimes, named respectively regular rotating Hayward black hole and naked rotating wormhole, are studied both numerically and analytically. In preparation for the upcoming results of the Event Horizon Telescope, the images of an accretion torus around Sgr A*, the supermassive object at the center of the Galaxy, are computed. These images contain, even in the absence of a horizon, a central faint region which bears a resemblance to the shadow of Kerr black holes and emphasizes the difficulty of claiming the existence of an event horizon from the analysis of strong-field images. The frequencies of the co- and contra-rotating orbits at the innermost stable circular orbit (ISCO) in this geometry are also computed, in the hope that quasi-periodic oscillations may permit comparison of this model with the Kerr black hole on observational grounds.
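
    For the Kerr baseline that such models are compared against, the ISCO radius and circular-orbit frequency have closed forms (the standard Bardeen-Press-Teukolsky expressions in geometric units G = c = M = 1); a reference sketch of the Kerr case, not the modified Hayward metric of the paper:

```python
def kerr_isco_radius(a, prograde=True):
    """ISCO radius of a Kerr black hole in units of M, spin parameter |a| <= 1
    (Bardeen-Press-Teukolsky formula)."""
    z1 = 1 + (1 - a * a) ** (1 / 3) * ((1 + a) ** (1 / 3) + (1 - a) ** (1 / 3))
    z2 = (3 * a * a + z1 * z1) ** 0.5
    sign = -1 if prograde else 1
    return 3 + z2 + sign * ((3 - z1) * (3 + z1 + 2 * z2)) ** 0.5

def kerr_orbital_frequency(r, a, prograde=True):
    """Coordinate angular frequency of a circular equatorial orbit, units of 1/M:
    Omega = +/- 1 / (r**1.5 +/- a)."""
    sign = 1 if prograde else -1
    return sign / (r ** 1.5 + sign * a)

# Schwarzschild limit: a = 0 gives the familiar r_isco = 6M.
r6 = kerr_isco_radius(0.0)
omega6 = kerr_orbital_frequency(r6, 0.0)
```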

  4. The CCC system in two teaching hospitals: a progress report.

    PubMed

    Slack, W V; Bleich, H L

    1999-06-01

    Computing systems developed by the Center for Clinical Computing (CCC) have been in operation in Beth Israel and Brigham and Women's hospitals for over 10 years. Designed to be of direct benefit to doctors, nurses, and other clinicians in the care of their patients, the CCC systems give the results of diagnostic studies immediately upon request; offer access to the medical literature; give advice, consultation, alerts, and reminders; assist in the day-to-day practice of medicine; and participate directly in the education of medical students and house officers. The CCC systems are extensively used, even by physicians who are under no obligation to use them. Studies have shown that the systems are well received and that they help clinicians improve the quality of patient care. In addition, the CCC systems have had a beneficial impact on the finances of the two hospitals, and they have cost less than what many hospitals spend for financial computing alone.

  5. Combining destination diversion decisions and critical in-flight event diagnosis in computer aided testing of pilots

    NASA Technical Reports Server (NTRS)

    Rockwell, T. H.; Giffin, W. C.; Romer, D. J.

    1984-01-01

    Rockwell and Giffin (1982) and Giffin and Rockwell (1983) have discussed the use of computer aided testing (CAT) in the study of pilot response to critical in-flight events. The present investigation represents an extension of these earlier studies. In testing pilot responses to critical in-flight events, use is made of a PLATO touch CRT system operating on a menu based format. In a typical diagnostic problem, the pilot was presented with symptoms within a flight scenario; in one problem, the pilot had four minutes to obtain the information needed to diagnose the problem. In the reported research, the diagnosis and diversion scenarios were combined into a single computer aided test. Tests with nine subjects were conducted, and the results and their significance are discussed.

  6. Screening for cognitive impairment in older individuals. Validation study of a computer-based test.

    PubMed

    Green, R C; Green, J; Harrison, J M; Kutner, M H

    1994-08-01

    This study examined the validity of a computer-based cognitive test recently designed to screen the elderly for cognitive impairment. Criterion-related validity was examined by comparing test scores of impaired patients and normal control subjects. Construct-related validity was examined through correlations between computer-based subtests and related conventional neuropsychological subtests. The setting was a university center for memory disorders. Participants were 52 patients with mild cognitive impairment by strict clinical criteria and 50 unimpaired, age- and education-matched control subjects. Control subjects were rigorously screened by neurological, neuropsychological, imaging, and electrophysiological criteria to identify and exclude individuals with occult abnormalities. Using a cut-off total score of 126, this computer-based instrument had a sensitivity of 0.83 and a specificity of 0.96. Using a prevalence estimate of 10%, the positive and negative predictive values were 0.70 and 0.96, respectively. Computer-based subtests correlated significantly with conventional neuropsychological tests measuring similar cognitive domains. Thirteen (17.8%) of 73 volunteers with normal medical histories were excluded from the control group because of unsuspected abnormalities on standard neuropsychological tests, electroencephalograms, or magnetic resonance imaging scans. Computer-based testing is a valid screening methodology for the detection of mild cognitive impairment in the elderly, although this particular test has important limitations. Broader applications of computer-based testing will require extensive population-based validation. Future studies should recognize that normal control subjects without a history of disease, as typically used in validation studies, may have a high incidence of unsuspected abnormalities on neurodiagnostic studies.
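
    The reported predictive values follow from sensitivity, specificity, and prevalence via Bayes' rule; a quick check using the rounded figures quoted above:

```python
def predictive_values(sensitivity, specificity, prevalence):
    """Positive and negative predictive values via Bayes' rule."""
    tp = sensitivity * prevalence              # true positives (per unit population)
    fp = (1 - specificity) * (1 - prevalence)  # false positives
    tn = specificity * (1 - prevalence)        # true negatives
    fn = (1 - sensitivity) * prevalence        # false negatives
    return tp / (tp + fp), tn / (tn + fn)

# Reported inputs: sensitivity 0.83, specificity 0.96, prevalence 10%.
ppv, npv = predictive_values(0.83, 0.96, 0.10)
# ppv evaluates to ~0.70, matching the abstract; npv comes out ~0.98 from these
# rounded inputs (the published 0.96 presumably reflects unrounded values).
```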

  7. An Investigation of High-Order Shock-Capturing Methods for Computational Aeroacoustics

    NASA Technical Reports Server (NTRS)

    Casper, Jay; Baysal, Oktay

    1997-01-01

    Topics covered include: Low-dispersion scheme for nonlinear acoustic waves in nonuniform flow; Computation of acoustic scattering by a low-dispersion scheme; Algorithmic extension of low-dispersion scheme and modeling effects for acoustic wave simulation; The accuracy of shock capturing in two spatial dimensions; Using high-order methods on lower-order geometries; and Computational considerations for the simulation of discontinuous flows.

  8. Computer-Based Training at a Military Medical Center: Understanding Decreased Participation in Training among Staff and Ways to Improve Completion Rates

    ERIC Educational Resources Information Center

    Lavender, Julie

    2013-01-01

    Military health care facilities make extensive use of computer-based training (CBT) for both clinical and non-clinical staff. Despite evidence identifying various factors that may impact CBT, the problem is unclear as to what factors specifically influence employee participation in computer-based training. The purpose of this mixed method case…

  9. PETSc Users Manual Revision 3.7

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balay, Satish; Abhyankar, S.; Adams, M.

    This manual describes the use of PETSc for the numerical solution of partial differential equations and related problems on high-performance computers. The Portable, Extensible Toolkit for Scientific Computation (PETSc) is a suite of data structures and routines that provide the building blocks for the implementation of large-scale application codes on parallel (and serial) computers. PETSc uses the MPI standard for all message-passing communication.

  10. PETSc Users Manual Revision 3.8

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balay, S.; Abhyankar, S.; Adams, M.

    This manual describes the use of PETSc for the numerical solution of partial differential equations and related problems on high-performance computers. The Portable, Extensible Toolkit for Scientific Computation (PETSc) is a suite of data structures and routines that provide the building blocks for the implementation of large-scale application codes on parallel (and serial) computers. PETSc uses the MPI standard for all message-passing communication.

  11. MIADS2 ... an alphanumeric map information assembly and display system for a large computer

    Treesearch

    Elliot L. Amidon

    1966-01-01

    A major improvement and extension of the Map Information Assembly and Display System (MIADS) developed in 1964 is described. Basic principles remain unchanged, but the computer programs have been expanded and rewritten for a large computer, in Fortran IV and MAP languages. The code system is extended from 99 integers to about 2,200 alphanumeric 2-character codes. Hand-...

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alexeev, A. V.; Maltseva, D. V.; Ivanov, V. A., E-mail: ivanov@polly.phys.msu.ru

    We study force-extension curves of a single semiflexible chain consisting of several rigid rods connected by flexible spacers. The atomic force microscopy and laser optical or magnetic tweezers apparatus stretching these rod-coil macromolecules are discussed. In addition, the stretching by an external isotropic force is analyzed. The main attention is focused on computer simulation and analytical results. We demonstrate that the force-extension curves for rod-coil chains composed of two or three rods of equal length differ not only quantitatively but also qualitatively in different probe methods. These curves have an anomalous shape for a chain of two rods. End-to-end distributions of rod-coil chains are calculated by the Monte Carlo method and compared with analytical equations. The influence of the spacer length on the force-extension curves in different probe methods is analyzed. The results can be useful for interpreting experiments on the stretching of rod-coil block-copolymers.
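
    For context on the flexible spacers, the force-extension relation of an ideal freely jointed chain is the usual analytical baseline; a sketch of that textbook formula (not the rod-coil model of the paper):

```python
import math

def fjc_extension(force, kuhn_length, contour_length, kT=1.0):
    """Mean extension of an ideal freely jointed chain under a stretching force f:
    <x> = L * (coth(f*b/kT) - kT/(f*b)), i.e. L times the Langevin function."""
    u = force * kuhn_length / kT
    langevin = 1.0 / math.tanh(u) - 1.0 / u
    return contour_length * langevin

# Extension grows monotonically with force and saturates at the contour length.
xs = [fjc_extension(f, 1.0, 10.0) for f in (0.1, 1.0, 10.0, 100.0)]
```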

  13. Barcode extension for analysis and reconstruction of structures

    NASA Astrophysics Data System (ADS)

    Myhrvold, Cameron; Baym, Michael; Hanikel, Nikita; Ong, Luvena L.; Gootenberg, Jonathan S.; Yin, Peng

    2017-03-01

    Collections of DNA sequences can be rationally designed to self-assemble into predictable three-dimensional structures. The geometric and functional diversity of DNA nanostructures created to date has been enhanced by improvements in DNA synthesis and computational design. However, existing methods for structure characterization typically image the final product or laboriously determine the presence of individual, labelled strands using gel electrophoresis. Here we introduce a new method of structure characterization that uses barcode extension and next-generation DNA sequencing to quantitatively measure the incorporation of every strand into a DNA nanostructure. By quantifying the relative abundances of distinct DNA species in product and monomer bands, we can study the influence of geometry and sequence on assembly. We have tested our method using 2D and 3D DNA brick and DNA origami structures. Our method is general and should be extensible to a wide variety of DNA nanostructures.
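
    The quantification step described above reduces to comparing depth-normalized read counts between the product and monomer bands; a minimal sketch of that bookkeeping (strand names, counts, and the normalization scheme are hypothetical, not the paper's exact pipeline):

```python
def incorporation_fractions(product_counts, monomer_counts):
    """Estimate, per barcoded strand, the fraction incorporated into the assembled
    structure from sequencing read counts in the product band vs. the unassembled
    monomer band. Counts are normalized within each band to remove differences in
    sequencing depth. (This normalization scheme is an illustrative sketch.)"""
    total_p = sum(product_counts.values())
    total_m = sum(monomer_counts.values())
    fractions = {}
    for strand, count in product_counts.items():
        p = count / total_p
        m = monomer_counts.get(strand, 0) / total_m
        fractions[strand] = p / (p + m) if (p + m) > 0 else 0.0
    return fractions

# Hypothetical read counts for three strands; s3 stays mostly in the monomer
# band, flagging it as poorly incorporated.
frac = incorporation_fractions({"s1": 900, "s2": 850, "s3": 50},
                               {"s1": 100, "s2": 150, "s3": 950})
```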

  14. Barcode extension for analysis and reconstruction of structures.

    PubMed

    Myhrvold, Cameron; Baym, Michael; Hanikel, Nikita; Ong, Luvena L; Gootenberg, Jonathan S; Yin, Peng

    2017-03-13

    Collections of DNA sequences can be rationally designed to self-assemble into predictable three-dimensional structures. The geometric and functional diversity of DNA nanostructures created to date has been enhanced by improvements in DNA synthesis and computational design. However, existing methods for structure characterization typically image the final product or laboriously determine the presence of individual, labelled strands using gel electrophoresis. Here we introduce a new method of structure characterization that uses barcode extension and next-generation DNA sequencing to quantitatively measure the incorporation of every strand into a DNA nanostructure. By quantifying the relative abundances of distinct DNA species in product and monomer bands, we can study the influence of geometry and sequence on assembly. We have tested our method using 2D and 3D DNA brick and DNA origami structures. Our method is general and should be extensible to a wide variety of DNA nanostructures.

  15. Barcode extension for analysis and reconstruction of structures

    PubMed Central

    Myhrvold, Cameron; Baym, Michael; Hanikel, Nikita; Ong, Luvena L; Gootenberg, Jonathan S; Yin, Peng

    2017-01-01

    Collections of DNA sequences can be rationally designed to self-assemble into predictable three-dimensional structures. The geometric and functional diversity of DNA nanostructures created to date has been enhanced by improvements in DNA synthesis and computational design. However, existing methods for structure characterization typically image the final product or laboriously determine the presence of individual, labelled strands using gel electrophoresis. Here we introduce a new method of structure characterization that uses barcode extension and next-generation DNA sequencing to quantitatively measure the incorporation of every strand into a DNA nanostructure. By quantifying the relative abundances of distinct DNA species in product and monomer bands, we can study the influence of geometry and sequence on assembly. We have tested our method using 2D and 3D DNA brick and DNA origami structures. Our method is general and should be extensible to a wide variety of DNA nanostructures. PMID:28287117

  16. NASTRAN computer system level 12.1

    NASA Technical Reports Server (NTRS)

    Butler, T. G.

    1971-01-01

    Program uses finite element displacement method for solving linear response of large, three-dimensional structures subject to static, dynamic, thermal, and random loadings. Program adapts to computers of different manufacture, permits updating and extension, allows interchange of output and input information between users, and is extensively documented.

  17. Little Package, Big Deal.

    ERIC Educational Resources Information Center

    Campbell, Joseph K.

    1979-01-01

    Describes New York State's extension experience in using the programmable calculator, a portable pocket-size computer, to solve many of the problems that central computers now handle. Subscription services to programs written for the Texas Instruments TI-59 programmable calculator are provided by both Cornell and Iowa State Universities. (MF)

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cowan, R. D.; Rajnak, K.; Renard, P.

    This is a set of three Fortran IV programs, RCN29, HFMOD7, and RCN229, based on the Herman-Skillman and Charlotte Froese Fischer programs, with extensive modifications and additions. The programs compute self-consistent-field radial wave functions and the various radial integrals involved in the computation of atomic energy levels and spectra.

  19. Introduction to SmartBooks. Report 23-93.

    ERIC Educational Resources Information Center

    Kopec, Danny; Wood, Carol

    Humankind has become accustomed to reading and learning from printed books. The computer offers us the possibility to exploit another medium whose key advantage is flexibility through extensive memory, computational speed, and versatile representational means. Specifically, we have the hypercard application, an integrated piece of software, with…

  20. Calendar Instruments in Retrospective Web Surveys

    ERIC Educational Resources Information Center

    Glasner, Tina; van der Vaart, Wander; Dijkstra, Wil

    2015-01-01

    Calendar instruments incorporate aided recall techniques such as temporal landmarks and visual time lines that aim to reduce response error in retrospective surveys. Those calendar instruments have been used extensively in off-line research (e.g., computer-aided telephone interviews, computer assisted personal interviewing, and paper and pen…

  1. Smisc - A collection of miscellaneous functions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Landon Sego, PNNL

    2015-08-31

    A collection of functions for statistical computing and data manipulation. These include routines for rapidly aggregating heterogeneous matrices, manipulating file names, loading R objects, sourcing multiple R files, formatting datetimes, multi-core parallel computing, stream editing, specialized plotting, etc. The package includes, among others, the following functions:
    - allMissing: identifies missing rows or columns in a data frame or matrix
    - as.numericSilent: silent wrapper for coercing a vector to numeric
    - comboList: produces all possible combinations of a set of linear model predictors
    - cumMax: computes the maximum of the vector up to the current index
    - cumsumNA: computes the cumulative sum of a vector without propagating NAs
    - d2binom, p2binom: probability functions for the sum of two independent binomials
    - dkbinom, pkbinom: probability functions for the sum of k independent binomials
    - dbb, pbb, qbb, rbb: the Beta-Binomial distribution
    - dataIn: a flexible way to import data into R
    - df2list: row-wise conversion of a data frame to a list
    - dfplapply: parallelized single-row processing of a data frame
    - dframeEquiv: examines the equivalence of two data frames or matrices
    - factor2character: converts all factor variables in a data frame to character variables
    - findDepMat: identifies linearly dependent rows or columns in a matrix
    - formatDT: converts date or datetime strings into alternate formats
    - getExtension, getPath, grabLast: filename manipulations (remove or extract the extension or path)
    - ifelse1: non-vectorized version of ifelse
    - integ: simple numerical integration routine
    - interactionPlot: two-way interaction plot with error bars
    - linearMap: linear mapping of a numerical vector or scalar
    - list2df: converts a list to a data frame
    - loadObject: loads and returns the object(s) in an ".Rdata" file
    - more: displays the contents of a file to the R terminal
    - movAvg2: calculates the moving average using a 2-sided window
    - openDevice: opens a graphics device based on the filename extension
    - padZero: pads a vector of numbers with zeros
    - parseJob: parses a collection of elements into (almost) equal sized groups
    - pcbinom: a continuous version of the binomial cdf
    - plapply: simple parallelization of lapply
    - plotFun: plots one or more functions on a single plot
    - PowerData: an example of power data
    - pvar: prints the name and value of one or more objects
    - and numerous others (space limits reporting).
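
    Smisc is an R package, but the documented behavior of small utilities like cumMax and cumsumNA is easy to mirror; a Python sketch for illustration (the NaN placement in the cumsumNA output is an assumption):

```python
import math

def cum_max(xs):
    """Running maximum up to each index (mirrors the documented cumMax)."""
    out, best = [], -math.inf
    for x in xs:
        best = max(best, x)
        out.append(best)
    return out

def cum_sum_skip_nan(xs):
    """Cumulative sum that does not propagate NaNs (mirrors the documented
    cumsumNA behavior; NaN entries are kept in place but do not poison
    subsequent sums)."""
    out, total = [], 0.0
    for x in xs:
        if math.isnan(x):
            out.append(math.nan)
        else:
            total += x
            out.append(total)
    return out
```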

  2. Model-Based Knowing: How Do Students Ground Their Understanding About Climate Systems in Agent-Based Computer Models?

    NASA Astrophysics Data System (ADS)

    Markauskaite, Lina; Kelly, Nick; Jacobson, Michael J.

    2017-12-01

    This paper gives a grounded cognition account of model-based learning of complex scientific knowledge related to socio-scientific issues, such as climate change. It draws on the results from a study of high school students learning about the carbon cycle through computational agent-based models and investigates two questions: First, how do students ground their understanding about the phenomenon when they learn and solve problems with computer models? Second, what are common sources of mistakes in students' reasoning with computer models? Results show that students ground their understanding in computer models in five ways: direct observation, straight abstraction, generalisation, conceptualisation, and extension. Students also incorporate into their reasoning their knowledge and experiences that extend beyond phenomena represented in the models, such as attitudes about unsustainable carbon emission rates, human agency, external events, and the nature of computational models. The most common difficulties of the students relate to seeing the modelled scientific phenomenon and connecting results from the observations with other experiences and understandings about the phenomenon in the outside world. An important contribution of this study is the constructed coding scheme for establishing different ways of grounding, which helps to understand some challenges that students encounter when they learn about complex phenomena with agent-based computer models.

  3. Extension of a Kinetic-Theory Approach for Computing Chemical-Reaction Rates to Reactions with Charged Particles

    NASA Technical Reports Server (NTRS)

    Liechty, Derek S.; Lewis, Mark J.

    2010-01-01

    Recently introduced molecular-level chemistry models that predict equilibrium and nonequilibrium reaction rates using only kinetic theory and fundamental molecular properties (i.e., no macroscopic reaction rate information) are extended to include reactions involving charged particles and electronic energy levels. The proposed extensions include ionization reactions, exothermic associative ionization reactions, endothermic and exothermic charge exchange reactions, and other exchange reactions involving ionized species. The extensions are shown to agree favorably with the measured Arrhenius rates for near-equilibrium conditions.
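
    The Arrhenius rates used as the near-equilibrium benchmark have the standard modified form k(T) = A * T^eta * exp(-Ea / (kB * T)); a generic evaluation sketch (the constants below are illustrative, not taken from the paper):

```python
import math

K_BOLTZMANN = 1.380649e-23  # J/K

def arrhenius_rate(T, A, eta, Ea):
    """Modified Arrhenius rate k(T) = A * T**eta * exp(-Ea / (kB * T)).
    A: pre-exponential factor, eta: temperature exponent, Ea: activation energy (J)."""
    return A * T ** eta * math.exp(-Ea / (K_BOLTZMANN * T))

# Rates rise steeply with temperature for an endothermic channel
# (illustrative parameter values):
k_5000 = arrhenius_rate(5000.0, 1e-16, 0.0, 1.0e-18)
k_10000 = arrhenius_rate(10000.0, 1e-16, 0.0, 1.0e-18)
```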

  4. Numerical simulation of turbulent jet noise, part 2

    NASA Technical Reports Server (NTRS)

    Metcalfe, R. W.; Orszag, S. A.

    1976-01-01

    Results on the numerical simulation of jet flow fields were used to study the radiated sound field, and in addition, to extend and test the capabilities of the turbulent jet simulation codes. The principal result of the investigation was the computation of the radiated sound field from a turbulent jet. In addition, the computer codes were extended to account for the effects of compressibility and eddy viscosity, and the treatment of the nonlinear terms of the Navier-Stokes equations was modified so that they can be computed in a semi-implicit way. A summary of the flow model and a description of the numerical methods used for its solution are presented. Calculations of the radiated sound field are reported. In addition, the extensions that were made to the fundamental dynamical codes are described. Finally, the current state-of-the-art for computer simulation of turbulent jet noise is summarized.

  5. A computer architecture for intelligent machines

    NASA Technical Reports Server (NTRS)

    Lefebvre, D. R.; Saridis, G. N.

    1991-01-01

    The Theory of Intelligent Machines proposes a hierarchical organization for the functions of an autonomous robot based on the Principle of Increasing Precision With Decreasing Intelligence. An analytic formulation of this theory using information-theoretic measures of uncertainty for each level of the intelligent machine has been developed in recent years. A computer architecture that implements the lower two levels of the intelligent machine is presented. The architecture supports an event-driven programming paradigm that is independent of the underlying computer architecture and operating system. Details of Execution Level controllers for motion and vision systems are addressed, as well as the Petri net transducer software used to implement Coordination Level functions. Extensions to UNIX and VxWorks operating systems which enable the development of a heterogeneous, distributed application are described. A case study illustrates how this computer architecture integrates real-time and higher-level control of manipulator and vision systems.

  6. SDA 7: A modular and parallel implementation of the simulation of diffusional association software

    PubMed Central

    Martinez, Michael; Romanowska, Julia; Kokh, Daria B.; Ozboyaci, Musa; Yu, Xiaofeng; Öztürk, Mehmet Ali; Richter, Stefan

    2015-01-01

    The simulation of diffusional association (SDA) Brownian dynamics software package has been widely used in the study of biomacromolecular association. Initially developed to calculate bimolecular protein–protein association rate constants, it has since been extended to study electron transfer rates, to predict the structures of biomacromolecular complexes, to investigate the adsorption of proteins to inorganic surfaces, and to simulate the dynamics of large systems containing many biomacromolecular solutes, allowing the study of concentration‐dependent effects. These extensions have led to a number of divergent versions of the software. In this article, we report the development of the latest version of the software (SDA 7). This release was developed to consolidate the existing codes into a single framework, while improving the parallelization of the code to better exploit modern multicore shared memory computer architectures. It is built using a modular object‐oriented programming scheme, to allow for easy maintenance and extension of the software, and includes new features, such as adding flexible solute representations. We discuss a number of application examples, which describe some of the methods available in the release, and provide benchmarking data to demonstrate the parallel performance. © 2015 The Authors. Journal of Computational Chemistry Published by Wiley Periodicals, Inc. PMID:26123630

  7. Nonlinear transient analysis of multi-mass flexible rotors - theory and applications

    NASA Technical Reports Server (NTRS)

    Kirk, R. G.; Gunter, E. J.

    1973-01-01

    The equations of motion necessary to compute the transient response of multi-mass flexible rotors are formulated to include unbalance, rotor acceleration, and flexible damped nonlinear bearing stations. A method of calculating the unbalance response of flexible rotors from a modified Myklestad-Prohl technique is discussed in connection with the method of solution for the transient response. Several special cases of simplified rotor-bearing systems are presented and analyzed for steady-state response, stability, and transient behavior. These simplified rotor models produce extensive design information necessary to ensure stable performance of elastically mounted rotor-bearing systems under varying levels and forms of excitation. The nonlinear journal bearing force expressions derived from the short bearing approximation are utilized in the study of the stability and transient response of the floating bush squeeze damper support system. Both rigid and flexible rotor models are studied, and results indicate that the stability of flexible rotors supported by journal bearings can be greatly improved by the use of squeeze damper supports. Results from linearized stability studies of flexible rotors indicate that a tuned support system can greatly improve the performance of the units from the standpoint of unbalanced response and impact loading. Extensive stability and design charts may be readily produced for given rotor specifications by the computer codes presented in this analysis.

  8. Visual Environments for CFD Research

    NASA Technical Reports Server (NTRS)

    Watson, Val; George, Michael W. (Technical Monitor)

    1994-01-01

    This viewgraph presentation gives an overview of visual environments for computational fluid dynamics (CFD) research. It includes details on critical needs for the future computing environment, features needed to attain this environment, prospects for changes in and the impact of the visualization revolution on the human-computer interface, human processing capabilities, and the limits of the personal environment and the extension of that environment with computers. Information is given on the need for more 'visual' thinking (including instances of visual thinking), an evaluation of the alternate approaches for and levels of interactive computer graphics, a visual analysis of computational fluid dynamics, and an analysis of visualization software.

  9. Screen-based sedentary behavior and associations with functional strength in 6-15 year-old children in the United States.

    PubMed

    Edelson, Lisa R; Mathias, Kevin C; Fulgoni, Victor L; Karagounis, Leonidas G

    2016-02-04

    Physical strength is associated with improved health outcomes in children. Heavier children tend to have lower functional strength and mobility. Physical activity can increase children's strength, but it is unknown how different types of electronic media use impact physical strength. Data from the NHANES National Youth Fitness Survey (NNYFS) from children ages 6-15 were analyzed in this study. Regression models were conducted to determine if screen-based sedentary behaviors (television viewing time, computer/video game time) were associated with strength measures (grip, leg extensions, modified pull-ups, plank) while controlling for potential confounders including child age, sex, BMI z-score, and days per week with 60+ minutes of physical activity. Grip strength and leg extensions divided by body weight were analyzed to provide measures of relative strength together with pull-ups and plank, which require lifting the body. The results from the regression models showed the hypothesized inverse association between TV time and all strength measures. Computer time was only significantly inversely associated with the ability to do one or more pull-ups. This study shows that television viewing, but not computer/videogames, is inversely associated with measures of child strength while controlling for child characteristics and physical activity. These findings suggest that "screen time" may not be a unified construct with respect to strength outcomes and that further exploration of the potential benefits of reducing television time on children's strength and related mobility is needed.
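
    The models described above are multiple linear regressions of a strength measure on screen time plus covariates; a synthetic sketch of that adjustment with numpy (the data, effect sizes, and variable names are fabricated for illustration only, not NNYFS results):

```python
import numpy as np

# Synthetic illustration: a strength score with a true TV-time effect of -0.5
# and age as a confounder influencing both variables.
rng = np.random.default_rng(0)
n = 500
age = rng.uniform(6, 15, n)
tv_hours = 4.0 - 0.1 * age + rng.normal(0.0, 1.0, n)
strength = 2.0 * age - 0.5 * tv_hours + rng.normal(0.0, 1.0, n)

# Ordinary least squares with an intercept: strength ~ tv_hours + age.
# coef[1] is the TV association adjusted for age, analogous to the paper's
# models controlling for age, sex, BMI z-score, and physical activity.
X = np.column_stack([np.ones(n), tv_hours, age])
coef, *_ = np.linalg.lstsq(X, strength, rcond=None)
```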

  10. Leveraging Social Computing for Personalized Crisis Communication using Social Media.

    PubMed

    Leykin, Dmitry; Aharonson-Daniel, Limor; Lahad, Mooli

    2016-03-24

    The extensive use of social media in modern life redefines social interaction and communication. Communication plays an important role in mitigating, or exacerbating, the psychological and behavioral responses to critical incidents and disasters. As recent disasters demonstrated, people tend to converge to social media during and following emergencies. Authorities can then use this media and other computational methods to gain insights from the public, mainly to enhance situational awareness, but also to improve their communication with the public and public adherence to instructions. The current review presents a conceptual framework for studying psychological aspects of crisis and risk communication using the social media through social computing. Advanced analytical tools can be integrated in the processes and objectives of crisis communication. The availability of the computational techniques can improve communication with the public by a process of Hyper-Targeted Crisis Communication. The review suggests that using advanced computational tools for target-audience profiling and linguistic matching in social media, can facilitate more sensitive and personalized emergency communication.

  11. The numerical approach adopted in toba computer code for mass and heat transfer dynamic analysis of metal hydride hydrogen storage beds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    El Osery, I.A.

    1983-12-01

    Modelling studies of metal hydride hydrogen storage beds are part of an extensive R and D program conducted in Egypt on hydrogen energy. In this context two computer programs, namely RET and RET1, have been developed. In the RET computer program, a cylindrical conduction bed model is considered and an approximate analytical solution is used for the associated mass and heat transfer problem. This problem is solved numerically in the RET1 computer program, allowing more flexibility in operating conditions but still limited to a cylindrical configuration with only two alternatives for heat exchange: either fluid passes through tubes embedded in the solid alloy matrix, or solid rods are surrounded by annular fluid tubes. The present computer code TOBA is more flexible and realistic. It performs the mass and heat transfer dynamic analysis of metal hydride storage beds using a variety of geometrical and operating alternatives.
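
    At its core, the cylindrical conduction problem solved numerically in RET1 is transient radial heat conduction; a generic explicit finite-difference sketch (geometry, material properties, and boundary values are illustrative, not TOBA's actual model):

```python
def step_radial_heat(T, r, dr, dt, alpha, T_wall):
    """One explicit finite-difference step of the radial heat equation
    dT/dt = alpha * (d2T/dr2 + (1/r) * dT/dr) on a cylinder, with a fixed
    wall temperature at the outer radius and symmetry at the axis."""
    n = len(T)
    Tn = T[:]
    for i in range(1, n - 1):
        d2 = (T[i + 1] - 2 * T[i] + T[i - 1]) / dr ** 2
        d1 = (T[i + 1] - T[i - 1]) / (2 * dr * r[i])
        Tn[i] = T[i] + dt * alpha * (d2 + d1)
    Tn[0] = Tn[1]        # symmetry at the axis (zero gradient)
    Tn[-1] = T_wall      # fixed coolant/wall temperature
    return Tn

# Illustrative run: a hot bed cooling toward a 300 K wall.
n, R = 21, 0.05                 # grid points, bed radius (m)
dr = R / (n - 1)
r = [i * dr for i in range(n)]
alpha = 1e-5                    # thermal diffusivity (m^2/s), illustrative
dt = 0.2 * dr * dr / alpha      # respects the explicit stability limit
T = [400.0] * n
T[-1] = 300.0
for _ in range(200):
    T = step_radial_heat(T, r, dr, dt, alpha, 300.0)
```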

  12. Computer animation for minimally invasive surgery: computer system requirements and preferred implementations

    NASA Astrophysics Data System (ADS)

    Pieper, Steven D.; McKenna, Michael; Chen, David; McDowall, Ian E.

    1994-04-01

    We are interested in the application of computer animation to surgery. Our current project, a navigation and visualization tool for knee arthroscopy, relies on real-time computer graphics and the human interface technologies associated with virtual reality. We believe that this new combination of techniques will lead to improved surgical outcomes and decreased health care costs. To meet these expectations in the medical field, the system must be safe, usable, and cost-effective. In this paper, we outline some of the most important hardware and software specifications in the areas of video input and output, spatial tracking, stereoscopic displays, computer graphics models and libraries, mass storage and network interfaces, and operating systems. Since this is a fairly new combination of technologies and a new application, our justifications for these specifications are drawn from the current generation of surgical technology and by analogy to other fields where virtual reality technology has been more extensively applied and studied.

  13. Knowledge of computer among healthcare professionals of India: a key toward e-health.

    PubMed

    Gour, Neeraj; Srivastava, Dhiraj

    2010-11-01

    Information technology has radically changed the way that many people work and think. Over the years, technology has reached a new acme and is no longer confined to developed countries. Developing countries such as India have kept pace with the world in modern technology. Healthcare professionals can no longer ignore the application of information technology to healthcare because they are key to e-health. This study was conducted to illuminate the perspective and implications of computers among healthcare professionals, with the objective of assessing the knowledge, use, and need of computers among healthcare professionals. A cross-sectional study of 240 healthcare professionals, including doctors, nurses, lab technicians, and pharmacists, was conducted. Each participant was interviewed using a pretested, semistructured format. Of 240 healthcare professionals, 57.91% were knowledgeable about computers. Of them, 22.08% had extensive knowledge and 35.83% had partial knowledge. Computer knowledge was greatest in the 20-25 year age group (good knowledge: 43.33%; partial knowledge: 46.66%). Of 99 males, 21.21% were found to have good knowledge and 42.42% had partial knowledge. A majority of doctors and nurses used computers for study purposes. The remaining healthcare professionals used them mainly for entertainment, Internet access, and e-mail. A large majority of healthcare professionals (95.41%) requested computer training, which they felt would brighten their professional future and enhance their knowledge of computers.

  14. Visual management support system

    Treesearch

    Lee Anderson; Jerry Mosier; Geoffrey Chandler

    1979-01-01

    The Visual Management Support System (VMSS) is an extension of an existing computer program called VIEWIT, which has been extensively used by the U. S. Forest Service. The capabilities of this program lie in the rapid manipulation of large amounts of data, specifically operating as a tool to overlay or merge one set of data with another. VMSS was conceived to...

  15. Getting off the Straight and Narrow: Exploiting Non-Linear, Interactive Narrative Structures in Digital Stories for Language Teaching

    ERIC Educational Resources Information Center

    Prosser, Andrew

    2014-01-01

    Digital storytelling is already used extensively in language education. Web documentaries, particularly in terms of design and narrative structure, provide an extension of the digital storytelling concept, specifically in terms of increased interactivity. Using a model of interactive, non-linear storytelling, originally derived from computer game…

  16. The Next Frontier in Computing

    ScienceCinema

    Sarrao, John

    2018-06-13

    Exascale computing refers to computing systems capable of at least one exaflop, or a billion billion (10^18) calculations per second. That is 50 times faster than the most powerful supercomputers in use today and represents a thousand-fold increase over the first petascale computer, which came into operation in 2008. How we use these large-scale simulation resources is the key to solving some of today's most pressing problems, including clean energy production, nuclear reactor lifetime extension and nuclear stockpile aging.
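    The scales quoted above can be checked with elementary arithmetic; the short snippet below uses only the unit definitions, no new data.

```python
# Sanity check of the scales quoted above (unit definitions, no new data).
exaflop = 10**18       # one exaflop: 10^18 calculations per second
petaflop = 10**15      # the first petascale machine (2008) crossed this rate

assert exaflop // petaflop == 1000   # a thousand-fold increase over petascale

# "50 times faster than today's most powerful supercomputers" implies those
# machines run at roughly 2e16 calculations per second (20 petaflops):
assert exaflop / 50 == 2e16
```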

  17. Answer Set Programming and Other Computing Paradigms

    ERIC Educational Resources Information Center

    Meng, Yunsong

    2013-01-01

    Answer Set Programming (ASP) is one of the most prominent and successful knowledge representation paradigms. The success of ASP is due to its expressive non-monotonic modeling language and its efficient computational methods originating from building propositional satisfiability solvers. The wide adoption of ASP has motivated several extensions to…

  18. Improved Adjoint-Operator Learning For A Neural Network

    NASA Technical Reports Server (NTRS)

    Toomarian, Nikzad; Barhen, Jacob

    1995-01-01

    Improved method of adjoint-operator learning reduces amount of computation and associated computational memory needed to make electronic neural network learn temporally varying pattern (e.g., to recognize moving object in image) in real time. Method extension of method described in "Adjoint-Operator Learning for a Neural Network" (NPO-18352).

  19. Elliptic Curve Cryptography with Java

    ERIC Educational Resources Information Center

    Klima, Richard E.; Sigmon, Neil P.

    2005-01-01

    The use of the computer, and specifically the mathematics software package Maple, has played a central role in the authors' abstract algebra course because it provides their students with a way to see realistic examples of the topics they discuss without having to struggle with extensive computations. However, Maple does not provide the computer…

  20. Initiating a Programmatic Assessment Report

    ERIC Educational Resources Information Center

    Berkaliev, Zaur; Devi, Shavila; Fasshauer, Gregory E.; Hickernell, Fred J.; Kartal, Ozgul; Li, Xiaofan; McCray, Patrick; Whitney, Stephanie; Zawojewski, Judith S.

    2014-01-01

    In the context of a department of applied mathematics, a program assessment was conducted to assess the departmental goal of enabling undergraduate students to recognize, appreciate, and apply the power of computational tools in solving mathematical problems that cannot be solved by hand, or would require extensive and tedious hand computation. A…

  1. Quantum Computer Games: Quantum Minesweeper

    ERIC Educational Resources Information Center

    Gordon, Michal; Gordon, Goren

    2010-01-01

    The computer game of quantum minesweeper is introduced as a quantum extension of the well-known classical minesweeper. Its main objective is to teach the unique concepts of quantum mechanics in a fun way. Quantum minesweeper demonstrates the effects of superposition, entanglement and their non-local characteristics. While in the classical…
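    The core quantum notion the game teaches can be sketched in a few lines. The snippet below is a hypothetical illustration (not the authors' game code): a "quantum mine" held in an equal superposition over two board cells, where probabilities are squared amplitude magnitudes and measurement collapses the state to a single cell.

```python
import random

# Hypothetical sketch (not the authors' game code): a quantum mine in an
# equal superposition over board cells 0 and 4. Probabilities are squared
# amplitude magnitudes; measurement collapses the state to one cell.
amplitudes = {0: 1 / 2**0.5, 4: 1 / 2**0.5}

probs = {cell: abs(a)**2 for cell, a in amplitudes.items()}
assert abs(sum(probs.values()) - 1.0) < 1e-12   # state is normalized

def measure(probs, rng=random.random):
    """Collapse the superposition: pick one cell by its probability."""
    r, acc = rng(), 0.0
    for cell, p in probs.items():
        acc += p
        if r < acc:
            return cell
    return cell   # guard against floating-point rounding at the top end

collapsed = measure(probs)
assert collapsed in (0, 4)   # after measurement the mine is in one cell
```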

  2. A Flexible, Extensible Online Testing System for Mathematics

    ERIC Educational Resources Information Center

    Passmore, Tim; Brookshaw, Leigh; Butler, Harry

    2011-01-01

    An online testing system developed for entry-skills testing of first-year university students in algebra and calculus is described. The system combines the open-source computer algebra system "Maxima" with computer scripts to parse student answers, which are entered using standard mathematical notation and conventions. The answers can…
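    The kind of answer checking such a system performs can be approximated very roughly in a few lines. The sketch below is hypothetical (the actual system delegates parsing to the Maxima CAS, not to Python's `eval`): two expressions are treated as equivalent when they agree numerically at several sample points of the variable.

```python
# Hypothetical sketch of answer checking (the real system uses the Maxima
# CAS; this toy evaluator is for illustration only): two expressions are
# judged equivalent if they agree numerically at several sample points.

def equivalent(expr_a, expr_b, var="x", points=(0.3, 1.7, 2.9, -1.1)):
    def ev(expr, val):
        # toy evaluator only; a real system would parse and sanitize input
        return eval(expr, {"__builtins__": {}}, {var: val})
    return all(abs(ev(expr_a, p) - ev(expr_b, p)) < 1e-9 for p in points)

assert equivalent("2*x + 1", "1 + x*2")      # same polynomial, different form
assert not equivalent("x**2", "x*2")         # differ away from x in {0, 2}
```

    Numerical spot-checking sidesteps symbolic simplification entirely, which is why sample points are chosen away from the few values where distinct expressions can coincide.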

  3. Educational Research and Theory Perspectives on Intelligent Computer-Assisted Instruction.

    ERIC Educational Resources Information Center

    Tennyson, Robert D.; Christensen, Dean L.

    This paper defines the next generation of intelligent computer-assisted instructional systems (ICAI) by depicting the elaborations and extensions offered by educational research and theory perspectives to enhance the ICAI environment. The first section describes conventional ICAI systems, which use expert systems methods and have three modules: a…

  4. Quantum Computer Games: Schrodinger Cat and Hounds

    ERIC Educational Resources Information Center

    Gordon, Michal; Gordon, Goren

    2012-01-01

    The quantum computer game "Schrodinger cat and hounds" is the quantum extension of the well-known classical game fox and hounds. Its main objective is to teach the unique concepts of quantum mechanics in a fun way. "Schrodinger cat and hounds" demonstrates the effects of superposition, destructive and constructive interference, measurements and…

  5. Primary School Pupils' Attitudes toward Learning Programming through Visual Interactive Environments

    ERIC Educational Resources Information Center

    Asad, Khaled; Tibi, Moanis; Raiyn, Jamal

    2016-01-01

    New generations are using and playing with mobile and computer applications extensively. These applications are the outcomes of programming work that involves skills such as computational and algorithmic thinking. Learning programming is not easy for young students. In recent years, academic institutions like the Massachusetts Institute of…

  6. Empirical Validation and Application of the Computing Attitudes Survey

    ERIC Educational Resources Information Center

    Dorn, Brian; Elliott Tew, Allison

    2015-01-01

    Student attitudes play an important role in shaping learning experiences. However, few validated instruments exist for measuring student attitude development in a discipline-specific way. In this paper, we present the design, development, and validation of the computing attitudes survey (CAS). The CAS is an extension of the Colorado Learning…

  7. A Benders based rolling horizon algorithm for a dynamic facility location problem

    DOE PAGES

    Marufuzzaman, Mohammad; Gedik, Ridvan; Roni, Mohammad S.

    2016-06-28

    This study presents a well-known capacitated dynamic facility location problem (DFLP) that satisfies the customer demand at minimum cost by determining the time period for opening, closing, or retaining an existing facility in a given location. To solve this challenging NP-hard problem, this paper develops a unique hybrid solution algorithm that combines a rolling horizon algorithm with an accelerated Benders decomposition algorithm. Extensive computational experiments are performed on benchmark test instances to evaluate the hybrid algorithm's efficiency and robustness in solving the DFLP. Computational results indicate that the hybrid Benders based rolling horizon algorithm consistently offers high quality feasible solutions in a much shorter computational time than the standalone rolling horizon and accelerated Benders decomposition algorithms in the experimental range.
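    The rolling-horizon idea at the heart of the hybrid algorithm can be sketched on a toy instance. The costs, demands, and greedy single-facility subproblem below are all assumptions for illustration; the paper's method embeds an accelerated Benders-decomposed MILP, not this heuristic.

```python
# Hypothetical sketch of the rolling-horizon idea (assumed costs/demands;
# the paper's algorithm solves a Benders-decomposed MILP in each window,
# not this greedy single-facility subproblem).

open_cost = {"A": 10.0, "B": 14.0}   # per-period cost of keeping a site open
serve_cost = {"A": 2.0, "B": 1.0}    # per-unit service cost from each site
demand = [1, 2, 8, 9, 2]             # demand in each time period

def best_site(window):
    """Cheapest single site for the demand in this look-ahead window."""
    def cost(site):
        return sum(open_cost[site] + serve_cost[site] * d for d in window)
    return min(open_cost, key=cost)

def rolling_horizon(demand, width=2):
    plan = []
    for t in range(len(demand)):
        window = demand[t:t + width]    # optimize over a short horizon...
        plan.append(best_site(window))  # ...but freeze only period t's choice
    return plan

plan = rolling_horizon(demand)  # low-demand periods favor A, high favor B
```

    The decomposition makes each window's subproblem small; the price is that frozen early decisions may be suboptimal for the full horizon, which is what the paper's Benders acceleration and computational experiments address.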

  8. Domain decomposition for aerodynamic and aeroacoustic analyses, and optimization

    NASA Technical Reports Server (NTRS)

    Baysal, Oktay

    1995-01-01

    The overarching theme was the domain decomposition, which intended to improve the numerical solution technique for the partial differential equations at hand; in the present study, those that governed either the fluid flow, or the aeroacoustic wave propagation, or the sensitivity analysis for a gradient-based optimization. The role of the domain decomposition extended beyond the original impetus of discretizing geometrical complex regions or writing modular software for distributed-hardware computers. It induced function-space decompositions and operator decompositions that offered the valuable property of near independence of operator evaluation tasks. The objectives have gravitated about the extensions and implementations of either the previously developed or concurrently being developed methodologies: (1) aerodynamic sensitivity analysis with domain decomposition (SADD); (2) computational aeroacoustics of cavities; and (3) dynamic, multibody computational fluid dynamics using unstructured meshes.

  9. Convergence of finite difference transient response computations for thin shells.

    NASA Technical Reports Server (NTRS)

    Sobel, L. H.; Geers, T. L.

    1973-01-01

    Numerical studies pertaining to the limits of applicability of the finite difference method in the solution of linear transient shell response problems are performed, and a computational procedure for the use of the method is recommended. It is found that the only inherent limitation of the finite difference method is its inability to reproduce accurately response discontinuities. This is not a serious limitation in view of natural constraints imposed by the extension of Saint Venant's principle to transient response problems. It is also found that the short wavelength limitations of thin shell (Bernoulli-Euler) theory create significant convergence difficulties in computed response to certain types of transverse excitations. These difficulties may be overcome, however, through proper selection of finite difference mesh dimensions and temporal smoothing of the excitation.
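    A toy 1D analogue of these findings can be sketched directly (a hypothetical wave equation with assumed parameters, not the paper's shell model): an explicit finite-difference solver driven by a discontinuous step at one end develops spurious short-wavelength overshoot, while the same load applied through a cosine ramp, i.e. temporally smoothed, stays close to the static value.

```python
import math

# Hypothetical 1D analogue of the convergence issue described above
# (a toy wave equation with assumed parameters, not the shell code):
# compare the peak response to a discontinuous step excitation with the
# response to the same load applied through a short cosine ramp.

def peak_response(excitation, nx=50, nt=200, c=1.0, dx=1.0, dt=0.5):
    r2 = (c * dt / dx) ** 2             # squared Courant number (stable <= 1)
    u_prev, u = [0.0] * nx, [0.0] * nx
    peak = 0.0
    for n in range(nt):
        u_next = [0.0] * nx             # far end held fixed at zero
        u_next[0] = excitation(n * dt)  # prescribed end displacement
        for i in range(1, nx - 1):
            u_next[i] = (2 * u[i] - u_prev[i]
                         + r2 * (u[i + 1] - 2 * u[i] + u[i - 1]))
        u_prev, u = u, u_next
        peak = max(peak, max(abs(v) for v in u))
    return peak

step = lambda t: 1.0                                   # discontinuous load
ramp = lambda t: 1.0 if t >= 10 else 0.5 * (1 - math.cos(math.pi * t / 10))

peak_step = peak_response(step)  # dispersive ripples overshoot the load
peak_ramp = peak_response(ramp)  # smoothed load excites far less overshoot
```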

  10. Reusable Agena study. Volume 1: Executive summary. [space shuttle Agena upper stage tug concept

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The shuttle Agena upper stage interim tug concept is based on a building block approach. These building block concepts are extensions of existing ascent Agena configurations. Several current improvements have been used in developing the shuttle/Agena upper stage concepts. High-density acid is used as the Agena upper stage oxidizer. The baffled injector is used in the main engine. The DF-224 is a fourth generation computer currently in development and will be flight proven in the near future. The Agena upper stage building block concept uses the current Agena as a baseline, adds an 8.5-inch (21.6 cm) extension to the fuel tank for optimum mixture ratio, uses monomethyl hydrazine as fuel, exchanges a 150:1 nozzle extension for the existing 45:1, exchanges an Autonetics DF-224 for the existing Honeywell computer, and adds a star sensor for guidance update. These modifications to the current Agena provide a 5-foot (1.52 m) diameter shuttle/Agena upper stage that will fly all Vandenberg Air Force Base missions in the reusable mode without resorting to a kick motor. The delta-V of the Agena is increased by use of a strap-on propellant tank option. This option provides a shuttle/Agena upper stage with the capability to place almost 3900 pounds (1769 kg) into geosynchronous orbit (24-hour period) without the aid of kick motors.

  11. Characterization of a laboratory model of computer mouse use - applications for studying risk factors for musculoskeletal disorders.

    PubMed

    Flodgren, G; Heiden, M; Lyskov, E; Crenshaw, A G

    2007-03-01

    In the present study, we assessed the wrist kinetics (range of motion, mean position, velocity and mean power frequency in radial/ulnar deviation, flexion/extension, and pronation/supination) associated with performing a mouse-operated computerized task involving painting rectangles on a computer screen. Furthermore, we evaluated the effects of the painting task on subjective perception of fatigue and wrist position sense. The results showed that the painting task required constrained wrist movements, and repetitive movements of about the same magnitude as those performed in mouse-operated design tasks. In addition, the painting task induced a perception of muscle fatigue in the upper extremity (Borg CR-scale: 3.5, p<0.001) and caused a reduction in the position sense accuracy of the wrist (error before: 4.6 degrees, error after: 5.6 degrees, p<0.05). This standardized painting task appears suitable for studying relevant risk factors, and therefore it offers a potential for investigating the pathophysiological mechanisms behind musculoskeletal disorders related to computer mouse use.

  12. An integrated geophysical and geological study of the tectonic framework of the 38th Parallel Lineament in the vicinity of its intersection with the extension of the New Madrid Fault Zone. Geotechnical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Braile, L.W.; Hinze, J.H.; Keller, G.R.

    1978-09-01

    Extensive gravity and aeromagnetic surveys have been conducted in critical areas of Kentucky, Illinois, and Indiana centering around the intersection of the 38th Parallel Lineament and the extension of the New Madrid Fault Zone. Available aeromagnetic maps have been digitized and these data have been processed by a suite of computer programs developed for this purpose. Seismic equipment has been prepared for crustal seismic studies and a 150 km long seismic refraction line has been observed along the Wabash River Valley Fault System. Preliminary basement rock and configuration maps have been prepared based on studies of the samples derived from basement drill holes. Interpretation of these data is only at a preliminary stage, but studies to date indicate that the 38th Parallel Lineament features extend as far north as 39 degrees N and a subtle northeasterly-striking magnetic and gravity anomaly cuts across Indiana from the southwest corner of the state, roughly on strike with the New Madrid Seismic Zone.

  13. The Computer: Extension of the Human Mind III. Proceedings of the Annual Summer Computer Conference (3rd, Eugene, Oregon, August 1-3, 1984).

    ERIC Educational Resources Information Center

    Oregon Univ., Eugene. Center for Advanced Technology in Education.

    The 13 conference presentations in this proceedings are arranged by general and special interest sessions and listed within each session in the order in which they were presented. These papers are: (1) "Key Issues for the Near Future" (David Moursund); (2) "Educating with Computers: Insights from Cognitive Psychology (and Video Games)" (Morton Ann…

  14. PRESAGE: PRivacy-preserving gEnetic testing via SoftwAre Guard Extension.

    PubMed

    Chen, Feng; Wang, Chenghong; Dai, Wenrui; Jiang, Xiaoqian; Mohammed, Noman; Al Aziz, Md Momin; Sadat, Md Nazmus; Sahinalp, Cenk; Lauter, Kristin; Wang, Shuang

    2017-07-26

    Advances in DNA sequencing technologies have prompted a wide range of genomic applications to improve healthcare and facilitate biomedical research. However, privacy and security concerns have emerged as a challenge for utilizing cloud computing to handle sensitive genomic data. We present one of the first implementations of a Software Guard Extension (SGX) based securely outsourced genetic testing framework, which leverages multiple cryptographic protocols and a minimal perfect hash scheme to enable efficient and secure data storage and computation outsourcing. We compared the performance of the proposed PRESAGE framework with the state-of-the-art homomorphic encryption scheme, as well as the plaintext implementation. The experimental results demonstrated significant performance gains over the homomorphic encryption methods and a small computational overhead in comparison to the plaintext implementation. The proposed PRESAGE framework provides an alternative solution for secure and efficient genomic data outsourcing in an untrusted cloud by using a hybrid framework that combines secure hardware and multiple crypto protocols.

  15. "Group IV Nanomembranes, Nanoribbons, and Quantum Dots: Processing, Characterization, and Novel Devices"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    liu, feng

    This theoretical project has been carried out in close interaction with the experimental project at UW-Madison under the same title, led by PI Max Lagally and co-PI Mark Eriksson. Extensive computational studies have been performed to address a broad range of topics, from atomic structure, stability, and mechanical properties to electronic structure, optoelectronic, and transport properties of various nanoarchitectures in the context of Si and other solid nanomembranes. This has been done using combinations of different theoretical and computational approaches, ranging from first-principles calculations and molecular dynamics (MD) simulations to finite-element (FE) analyses and continuum modeling.

  16. Pineal region tumors: computed tomographic-pathologic spectrum

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Futrell, N.N.; Osborn, A.G.; Cheson, B.D.

    While several computed tomographic (CT) studies of posterior third ventricular neoplasms have included descriptions of pineal tumors, few reports have concentrated on these uncommon lesions. Some authors have asserted that the CT appearance of many pineal tumors is virtually pathognomonic. A series of nine biopsy-proved pineal gland tumors and eight other presumed tumors is presented that illustrates their remarkable heterogeneity in both histopathologic and CT appearance. These tumors included germinomas, teratocarcinomas, hamartomas, and other varieties. They had variable margination, attenuation, calcification, and suprasellar extension. Germinomas have the best response to radiation therapy. Biopsy of pineal region tumors is now feasible and is recommended for treatment planning.

  17. Comparing the Performance of Two Dynamic Load Distribution Methods

    NASA Technical Reports Server (NTRS)

    Kale, L. V.

    1987-01-01

    Parallel processing of symbolic computations on a message-passing multi-processor presents one challenge: to effectively utilize the available processors, the load must be distributed uniformly to all the processors. However, the structure of these computations cannot be predicted in advance, so static scheduling methods are not applicable. In this paper, we compare the performance of two dynamic, distributed load balancing methods with extensive simulation studies. The two schemes are: the Contracting Within a Neighborhood (CWN) scheme proposed by us, and the Gradient Model proposed by Lin and Keller. We conclude that although simpler, the CWN is significantly more effective at distributing the work than the Gradient Model.
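    The flavor of such neighborhood-based dynamic schemes can be sketched on a toy ring of processors. This is a hypothetical illustration, not the CWN or Gradient Model implementations: each overloaded node repeatedly hands one task to its least-loaded neighbor, so work initially piled on one processor spreads out with purely local decisions.

```python
# Hypothetical toy of neighborhood-based dynamic load balancing on a ring
# of processors (not the CWN or Gradient Model implementations): every
# overloaded node hands one task per step to its least-loaded neighbor.

def balance_step(loads):
    n = len(loads)
    new = list(loads)
    for i, load in enumerate(loads):
        left, right = (i - 1) % n, (i + 1) % n
        j = min((left, right), key=lambda k: loads[k])
        if load > loads[j] + 1:        # push work only toward lighter nodes
            new[i] -= 1
            new[j] += 1
    return new

loads = [8, 0, 0, 0]                   # all work starts on one processor
for _ in range(20):
    loads = balance_step(loads)
```

    After a handful of steps the ring settles with the work spread across all four nodes, while the total amount of work is conserved; no node ever needed a global view of the load.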

  18. Extended computational kernels in a massively parallel implementation of the Trotter-Suzuki approximation

    NASA Astrophysics Data System (ADS)

    Wittek, Peter; Calderaro, Luca

    2015-12-01

    We extended a parallel and distributed implementation of the Trotter-Suzuki algorithm for simulating quantum systems to study a wider range of physical problems and to make the library easier to use. The new release allows periodic boundary conditions, many-body simulations of non-interacting particles, arbitrary stationary potential functions, and imaginary time evolution to approximate the ground state energy. The new release is more resilient to the computational environment: a wider range of compiler chains and more platforms are supported. To ease development, we provide a more extensive command-line interface, an application programming interface, and wrappers from high-level languages.
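    The imaginary-time evolution mentioned above can be illustrated on a toy 2x2 system (a hypothetical sketch, not the library's kernels): splitting H = A + B and repeatedly applying exp(-A dt) exp(-B dt) with renormalization relaxes any initial state toward the ground state of H.

```python
import math

# Hypothetical toy of imaginary-time evolution with a first-order Trotter
# split (a 2x2 system, not the library's solver): H = A + B with A the
# diagonal part and B = b * sigma_x the off-diagonal coupling.

A = [1.0, 2.0]                    # diagonal part of H
b = 0.5                           # off-diagonal coupling strength
dt = 0.01                         # imaginary-time step

def step(v):
    # exp(-B dt) on v: cosh(b dt) I - sinh(b dt) sigma_x
    ch, sh = math.cosh(b * dt), math.sinh(b * dt)
    v = [ch * v[0] - sh * v[1], ch * v[1] - sh * v[0]]
    # exp(-A dt): elementwise decay from the diagonal part
    v = [math.exp(-A[i] * dt) * v[i] for i in range(2)]
    n = math.hypot(v[0], v[1])    # renormalize after each Trotter step
    return [x / n for x in v]

v = [1.0, 1.0]
for _ in range(5000):
    v = step(v)

energy = A[0]*v[0]**2 + A[1]*v[1]**2 + 2*b*v[0]*v[1]   # <v|H|v>
exact = (3 - math.sqrt(2)) / 2    # ground eigenvalue of [[1, .5], [.5, 2]]
assert abs(energy - exact) < 1e-3
```

    Excited components decay faster under exp(-H t), which is why the normalized state converges to the ground state; the Trotter split only has to reproduce exp(-H dt) to low order in dt.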

  19. Referees Often Miss Obvious Errors in Computer and Electronic Publications

    NASA Astrophysics Data System (ADS)

    de Gloucester, Paul Colin

    2013-05-01

    Misconduct is extensive and damaging. So-called science is prevalent. Articles resulting from so-called science are often cited in other publications. This can have damaging consequences for society and for science. The present work includes a scientometric study of 350 articles (published by the Association for Computing Machinery; Elsevier; The Institute of Electrical and Electronics Engineers, Inc.; John Wiley; Springer; Taylor & Francis; and World Scientific Publishing Co.). A lower bound of 85.4% of the articles are found to be incongruous. Authors cite inherently self-contradictory articles more than valid articles. Incorrect informational cascades ruin the literature's signal-to-noise ratio even for uncomplicated cases.

  20. Referees often miss obvious errors in computer and electronic publications.

    PubMed

    de Gloucester, Paul Colin

    2013-01-01

    Misconduct is extensive and damaging. So-called science is prevalent. Articles resulting from so-called science are often cited in other publications. This can have damaging consequences for society and for science. The present work includes a scientometric study of 350 articles (published by the Association for Computing Machinery; Elsevier; The Institute of Electrical and Electronics Engineers, Inc.; John Wiley; Springer; Taylor & Francis; and World Scientific Publishing Co.). A lower bound of 85.4% of the articles are found to be incongruous. Authors cite inherently self-contradictory articles more than valid articles. Incorrect informational cascades ruin the literature's signal-to-noise ratio even for uncomplicated cases.

  1. Verification and Validation Strategy for LWRS Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carl M. Stoots; Richard R. Schultz; Hans D. Gougar

    2012-09-01

    One intention of the Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) program is to create advanced computational tools for safety assessment that enable more accurate representation of a nuclear power plant safety margin. These tools are to be used to study the unique issues posed by lifetime extension and relicensing of the existing operating fleet of nuclear power plants well beyond their first license extension period. The extent to which new computational models / codes such as RELAP-7 can be used for reactor licensing / relicensing activities depends mainly upon the thoroughness with which they have been verified and validated (V&V). This document outlines the LWRS program strategy by which RELAP-7 code V&V planning is to be accomplished. From the perspective of developing and applying thermal-hydraulic and reactivity-specific models to reactor systems, the US Nuclear Regulatory Commission (NRC) Regulatory Guide 1.203 gives key guidance to numeric model developers and those tasked with the validation of numeric models. By creating Regulatory Guide 1.203 the NRC defined a framework for development, assessment, and approval of transient and accident analysis methods. As a result, this methodology is very relevant and is recommended as the path forward for RELAP-7 V&V. However, the unique issues posed by lifetime extension will require considerations in addition to those addressed in Regulatory Guide 1.203. Some of these include prioritization of which plants / designs should be studied first, coupling modern supporting experiments to the stringent needs of new high fidelity models / codes, and scaling of aging effects.

  2. Quantum chemical calculations of interatomic potentials for computer simulation of solids

    NASA Technical Reports Server (NTRS)

    1977-01-01

    A comprehensive mathematical model by which the collective behavior of a very large number of atoms within a metal or alloy can accurately be simulated was developed. Work was done in order to predict and modify the strength of materials to suit our technological needs. The method developed is useful in studying atomic interactions related to dislocation motion and crack extension.

  3. Model to Test Electric Field Comparisons in a Composite Fairing Cavity

    NASA Technical Reports Server (NTRS)

    Trout, Dawn; Burford, Janessa

    2012-01-01

    Evaluating the impact of radio frequency transmission in vehicle fairings is important to sensitive spacecraft. This study shows cumulative distribution function (CDF) comparisons of composite fairing electromagnetic field data obtained by computational electromagnetic 3D full wave modeling and laboratory testing. This work is an extension of the bare aluminum fairing perfect electric conductor (PEC) model. Test and model data correlation is shown.

  4. Model to Test Electric Field Comparisons in a Composite Fairing Cavity

    NASA Technical Reports Server (NTRS)

    Trout, Dawn H.; Burford, Janessa

    2013-01-01

    Evaluating the impact of radio frequency transmission in vehicle fairings is important to sensitive spacecraft. This study shows cumulative distribution function (CDF) comparisons of composite fairing electromagnetic field data obtained by computational electromagnetic 3D full wave modeling and laboratory testing. This work is an extension of the bare aluminum fairing perfect electric conductor (PEC) model. Test and model data correlation is shown.

  5. Analysis of Satellite Communications Antenna Patterns

    NASA Technical Reports Server (NTRS)

    Rahmat-Samii, Y.

    1985-01-01

    Computer program accurately and efficiently predicts far-field patterns of offset, or symmetric, parabolic reflector antennas. Antenna designer uses program to study effects of varying geometrical and electrical (RF) parameters of parabolic reflector and its feed system. Accurate predictions of far-field patterns help designer predict overall performance of antenna. These reflectors are used extensively in modern communications satellites and in multiple-beam and low side-lobe antenna systems.

  6. Studying Scientific Discovery by Computer Simulation.

    DTIC Science & Technology

    1983-03-30

    Examples studied include Mendel's laws of inheritance, the law of Gay-Lussac for gaseous reactions, the law of Dulong and Petit, and the derivation of atomic weights by Avogadro. Keywords: scientific discovery; intrinsic properties; physical laws; extensive terms; data-driven heuristics; intensive terms; theory-driven heuristics; conservation laws.

  7. A Numerical Study on Microwave Coagulation Therapy

    DTIC Science & Technology

    2013-01-01

    hepatocellular carcinoma (small size liver tumor). Through extensive numerical simulations, we reveal the mathematical relationships between some critical parameters in the therapy, including input power, frequency, temperature, and regions of impact. It is shown that these relationships can be approximated using simple polynomial functions. Compared to solutions of partial differential equations, these functions are significantly easier to compute and simpler to analyze for engineering design and clinical
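    The paper's central observation, that the simulated relationships are well captured by simple polynomial functions, can be illustrated with a least-squares quadratic fit. The data below are assumed toy values, not the study's simulation output.

```python
# Hypothetical sketch of the paper's observation (assumed toy data, not
# the study's simulations): a smooth response, here peak temperature vs.
# input power, is captured by a low-order least-squares polynomial fit.

def polyfit2(xs, ys):
    """Fit y ~ a + b*x + c*x^2 by solving the 3x3 normal equations."""
    n = len(xs)
    s = lambda k: sum(x**k for x in xs)
    t = lambda k: sum(y * x**k for x, y in zip(xs, ys))
    A = [[n, s(1), s(2)], [s(1), s(2), s(3)], [s(2), s(3), s(4)]]
    rhs = [t(0), t(1), t(2)]
    # Gaussian elimination with partial pivoting
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        rhs[col], rhs[piv] = rhs[piv], rhs[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c in range(col, 3):
                A[r][c] -= f * A[col][c]
            rhs[r] -= f * rhs[col]
    coef = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):   # back substitution
        coef[r] = (rhs[r] - sum(A[r][c] * coef[c] for c in range(r + 1, 3))) / A[r][r]
    return coef

powers = [10, 20, 30, 40, 50, 60]              # input power in watts (assumed)
temps = [2 + 0.5 * p + 0.01 * p**2 for p in powers]
a, b_, c = polyfit2(powers, temps)
assert abs(a - 2) < 1e-6 and abs(b_ - 0.5) < 1e-6 and abs(c - 0.01) < 1e-6
```

    Once fitted, evaluating the polynomial is a few multiplications, which is the paper's point: far cheaper than re-solving the underlying partial differential equations for each design query.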

  8. Computed tomographic anatomy of the nasal cavity, paranasal sinuses and tympanic cavity of the koala.

    PubMed

    Hemsley, S; Palmer, H; Canfield, R B; Stewart, M E B; Krockenberger, M B; Malik, R

    2013-09-01

    To use cross-sectional imaging (helical computed tomography (CT)) combined with conventional anatomical dissection to define the normal anatomy of the nasal cavity and bony cavitations of the koala skull. Helical CT scans of the heads of nine adult animals were obtained using a multislice scanner acquiring thin slices reconstructed in the transverse, sagittal and dorsal planes. Subsequent anatomical dissection permitted confirmation of correct identification and further delineation of bony and air-filled structures visible in axial and multiplanar reformatted CT images. The nasal cavity was relatively simple, with little scrolling of nasal conchae, but bony cavitations were complex and extensive. A rostral maxillary recess and ventral conchal, caudal maxillary, frontal and sphenoidal paranasal sinuses were identified and characterised. Extensive temporal bone cavitation was shown to be related to a large epitympanic recess. The detailed anatomical data provided are applicable to future functional and comparative anatomical studies, as well as providing a preliminary atlas for clinical investigation of conditions such as cryptococcal rhinosinusitis, a condition more common in the koala than in many other species. © 2013 Australian Veterinary Association.

  9. Applications of artificial neural networks in medical science.

    PubMed

    Patel, Jigneshkumar L; Goyal, Ramesh K

    2007-09-01

    Computer technology has advanced tremendously, and interest has grown in the potential use of 'Artificial Intelligence (AI)' in medicine and biological research. One of the most interesting and extensively studied branches of AI is 'Artificial Neural Networks (ANNs)'. Basically, ANNs are mathematical algorithms implemented on computers. ANNs learn from standard data and capture the knowledge contained in the data. Trained ANNs approach the functionality of a small biological neural cluster in a very fundamental manner. They are digitized models of the biological brain and can detect complex nonlinear relationships between dependent as well as independent variables in data where the human brain may fail to do so. Nowadays, ANNs are widely used for medical applications in various disciplines of medicine, especially cardiology. ANNs have been extensively applied in diagnosis, electronic signal analysis, medical image analysis and radiology. ANNs have been used by many authors for modeling in medicine and clinical research. Applications of ANNs are increasing in pharmacoepidemiology and medical data mining. In this paper, the authors summarize various applications of ANNs in medical science.
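The nonlinear-relationship point can be made concrete with a minimal sketch: a two-layer perceptron computes XOR, a mapping no single linear model can represent. The weights here are set by hand for determinism; a real medical application would learn them from training data.

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

# Hand-set weights for a tiny multilayer perceptron that computes XOR,
# illustrating the kind of nonlinear dependency ANNs can capture.
W1 = np.array([[1.0, 1.0], [1.0, 1.0]])
b1 = np.array([0.0, -1.0])
W2 = np.array([1.0, -2.0])

def mlp_xor(x):
    h = relu(x @ W1 + b1)   # hidden layer with ReLU activation
    return float(h @ W2)    # output: 1 for XOR-true inputs, 0 otherwise

inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]
outputs = [mlp_xor(np.array(x, dtype=float)) for x in inputs]
# outputs == [0.0, 1.0, 1.0, 0.0]
```

No linear classifier can reproduce this truth table; the hidden layer is what makes the nonlinear mapping possible.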

  10. Discrete Event-based Performance Prediction for Temperature Accelerated Dynamics

    NASA Astrophysics Data System (ADS)

    Junghans, Christoph; Mniszewski, Susan; Voter, Arthur; Perez, Danny; Eidenbenz, Stephan

    2014-03-01

    We present an example of a new class of tools that we call application simulators, parameterized fast-running proxies of large-scale scientific applications using parallel discrete event simulation (PDES). We demonstrate our approach with a TADSim application simulator that models the Temperature Accelerated Dynamics (TAD) method, which is an algorithmically complex member of the Accelerated Molecular Dynamics (AMD) family. The essence of the TAD application is captured without the computational expense and resource usage of the full code. We use TADSim to quickly characterize the runtime performance and algorithmic behavior for the otherwise long-running simulation code. We further extend TADSim to model algorithm extensions to standard TAD, such as speculative spawning of the compute-bound stages of the algorithm, and predict performance improvements without having to implement such a method. Focused parameter scans have allowed us to study algorithm parameter choices over far more scenarios than would be possible with the actual simulation. This has led to interesting performance-related insights into the TAD algorithm behavior and suggested extensions to the TAD method.
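The application-simulator idea can be sketched as a minimal discrete-event model: algorithm stages become timed events drained from a heap, so runtime is predicted without executing the expensive real code. Stage names, durations, and the overlap rule below are illustrative assumptions, not TADSim's actual model.

```python
import heapq

def simulate(stages, speculate=False):
    """Predict the wall-clock makespan of a pipeline of (name, duration) stages.

    Completion events are (finish_time, name) tuples drained from a
    min-heap in timestamp order -- the core loop of a discrete-event
    simulator. With speculate=True, each "dynamics" stage is assumed to
    be spawned concurrently with its predecessor, modeling a speculative
    extension without implementing it in the real code.
    """
    events, clock = [], 0.0
    for name, duration in stages:
        start = clock
        if speculate and name == "dynamics":
            start = max(0.0, clock - duration)  # overlap with the previous stage
        clock = start + duration
        heapq.heappush(events, (clock, name))
    makespan = 0.0
    while events:  # drain completion events in time order
        finish, _ = heapq.heappop(events)
        makespan = max(makespan, finish)
    return makespan

pipeline = [("setup", 1.0), ("dynamics", 5.0), ("analysis", 2.0)]
baseline = simulate(pipeline)                     # stages run back to back: 8.0
speculative = simulate(pipeline, speculate=True)  # "dynamics" overlaps "setup": 7.0
```

Scanning such a model over parameter choices is cheap, which is how focused parameter scans over many scenarios become feasible.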

  11. Evaluation of Cache-based Superscalar and Cacheless Vector Architectures for Scientific Computations

    NASA Technical Reports Server (NTRS)

    Oliker, Leonid; Carter, Jonathan; Shalf, John; Skinner, David; Ethier, Stephane; Biswas, Rupak; Djomehri, Jahed; VanderWijngaart, Rob

    2003-01-01

    The growing gap between sustained and peak performance for scientific applications has become a well-known problem in high performance computing. The recent development of parallel vector systems offers the potential to bridge this gap for a significant number of computational science codes and deliver a substantial increase in computing capabilities. This paper examines the intranode performance of the NEC SX6 vector processor and the cache-based IBM Power3/4 superscalar architectures across a number of key scientific computing areas. First, we present the performance of a microbenchmark suite that examines a full spectrum of low-level machine characteristics. Next, we study the behavior of the NAS Parallel Benchmarks using some simple optimizations. Finally, we evaluate the performance of several numerical codes from key scientific computing domains. Overall results demonstrate that the SX6 achieves high performance on a large fraction of our application suite and in many cases significantly outperforms the RISC-based architectures. However, certain classes of applications are not easily amenable to vectorization and would likely require extensive reengineering of both algorithm and implementation to utilize the SX6 effectively.

  12. Sonic and Supersonic Jet Plumes

    NASA Technical Reports Server (NTRS)

    Venkatapathy, E.; Naughton, J. W.; Flethcher, D. G.; Edwards, Thomas A. (Technical Monitor)

    1994-01-01

    The study of sonic and supersonic jet plumes is relevant to understanding such phenomena as jet noise, plume signatures, and rocket base heating and radiation. Jet plumes are simple to simulate and yet have complex flow structures such as Mach disks, triple points, shear layers, barrel shocks, shock/shear-layer interaction, etc. Experimental and computational simulations of sonic and supersonic jet plumes have been performed for under- and over-expanded, axisymmetric plume conditions. The computational simulations compare very well with experimental schlieren observations. Experimental data such as temperature measurements with hot-wire probes have yet to be obtained and will be compared with computed values. Extensive analysis of the computational simulations presents a clear picture of how the complex flow structure develops and the conditions under which self-similar flow structures evolve. From the computations, the plume structure can be further classified into many sub-groups. In the proposed paper, detailed results from the experimental and computational simulations for single, axisymmetric, under- and over-expanded, sonic and supersonic plumes will be compared and the fluid dynamic aspects of the flow structures will be discussed.

  13. [From fundamental research to clinical development: a review of orthodontics].

    PubMed

    Zhao, Zhi-he; Bai, Ding

    2011-11-01

    In recent years, new approaches to the diagnosis and treatment of malocclusion have emerged. The diagnostic and therapeutic techniques of orthodontics have evolved from two dimensions to five dimensions with the development of computer technology, auto-machining and imaging. Furthermore, interdisciplinary study has become the driving force for the advancement of fundamental research in orthodontics. The mechanisms of malocclusion and orthodontic tooth movement have been studied extensively at the cellular and molecular levels.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Somayaji, Anil B.; Amai, Wendy A.; Walther, Eleanor A.

    This report describes the successful extension of artificial immune systems from the domain of computer security to the domain of real-time control systems for robotic vehicles. A biologically-inspired computer immune system was added to the control system of two different mobile robots. As an additional layer in a multi-layered approach, the immune system is complementary to traditional error detection and error handling techniques. This can be thought of as biologically-inspired defense in depth. We demonstrated that an immune system can be added with very little application developer effort, resulting in little to no performance impact. The methods described here are extensible to any system that processes a sequence of data through a software interface.

  15. Post-analysis report on Chesapeake Bay data processing. [spectral analysis and recognition computer signature extension

    NASA Technical Reports Server (NTRS)

    Thomson, F.

    1972-01-01

    The additional processing performed on data collected over the Rhode River Test Site and Forestry Site in November 1970 is reported. The techniques and procedures used to obtain the processed results are described. Thermal data collected over three approximately parallel lines of the site were contoured, and the results color coded, for the purpose of delineating important scene constituents and to identify trees attacked by pine bark beetles. Contouring work and histogram preparation are reviewed and the important conclusions from the spectral analysis and recognition computer (SPARC) signature extension work are summarized. The SPARC setup and processing records are presented and recommendations are made for future data collection over the site.

  16. [Thin-section computed tomography of the bronchi; 2. Right upper lobe and left upper division].

    PubMed

    Matsuoka, Y; Ookubo, T; Ohtomo, K; Nishikawa, J; Kojima, K; Oyama, K; Yoshikawa, K; Iio, M

    1990-02-01

    Thin (2 mm) section contiguous computed tomographic (CT) scans were obtained through the bronchi of the right upper lobe and the left upper division in 30 patients. All segmental bronchi were identified. The right subsegmental bronchi were identified in 100%, and the left subsegmental bronchi in 97%. The type of the orifice of the right bronchus was trifurcated (53%), the extension of B1 was apicoanterior (50%), and the size of B2b was equal to B3a (63%). The extension of the left B3 was subapicoanterior (38%), and the size of B1+2c was equal to B3a (62%).

  17. A numerical investigation of the effects of the spanwise length on the 3-D wake of a circular cylinder

    NASA Astrophysics Data System (ADS)

    Labbé, D. F. L.; Wilson, P. A.

    2007-11-01

    The numerical prediction of vortex-induced vibrations has been the focus of numerous investigations to date using tools such as computational fluid dynamics. In particular, the flow around a circular cylinder has raised much attention as it is present in critical engineering problems such as marine cables or risers. Limitations due to the computational cost imposed by the solution of a large number of equations have resulted in the study of mostly 2-D flows with only a few exceptions. The discrepancies found between experimental data and 2-D numerical simulations suggested that 3-D instabilities occurred in the wake of the cylinder that affect substantially the characteristics of the flow. The few 3-D numerical solutions available in the literature confirmed such a hypothesis. In the present investigation the effect of the spanwise extension of the solution domain on the 3-D wake of a circular cylinder is investigated for various Reynolds numbers between 40 and 1000. By assessing the minimum spanwise extension required to predict accurately the flow around a circular cylinder, the infinitely long cylinder is reduced to a finite length cylinder, thus making numerical solution an effective way of investigating flows around circular cylinders. Results are presented for three different spanwise extensions, namely πD/2, πD and 2πD. The analysis of the force coefficients obtained for the various Reynolds numbers together with a visualization of the three-dimensionalities in the wake of the cylinder allowed for a comparison between the effects of the three spanwise extensions. Furthermore, by showing the different modes of vortex shedding present in the wake and by analysing the streamwise components of the vorticity, it was possible to estimate the spanwise wavelengths at the various Reynolds numbers and to demonstrate that a finite spanwise extension is sufficient to accurately predict the flow past an infinitely long circular cylinder.

  18. Adapting Extension Food Safety Programming for Vegetable Growers to Accommodate Differences in Ethnicity, Farming Scale, and Other Individual Factors

    ERIC Educational Resources Information Center

    Kline, Terence R.; Kneen, Harold; Barrett, Eric; Kleinschmidt, Andy; Doohan, Doug

    2012-01-01

    Differences in vegetable production methods utilized by American growers create distinct challenges for Extension personnel providing food safety training to producer groups. A program employing computers and projectors will not be accepted by an Amish group that does not accept modern technology. We have developed an outreach program that covers…

  19. Appendices to the model description document for a computer program for the emulation/simulation of a space station environmental control and life support system

    NASA Technical Reports Server (NTRS)

    Yanosy, James L.

    1988-01-01

    A Model Description Document for the Emulation Simulation Computer Model was already published. The model consisted of a detailed model (emulation) of a SAWD CO2 removal subsystem which operated with much less detailed (simulation) models of a cabin, crew, and condensing and sensible heat exchangers. The purpose was to explore the utility of such an emulation simulation combination in the design, development, and test of a piece of ARS hardware, SAWD. Extensions to this original effort are presented. The first extension is an update of the model to reflect changes in the SAWD control logic which resulted from test. Also, slight changes were also made to the SAWD model to permit restarting and to improve the iteration technique. The second extension is the development of simulation models for more pieces of air and water processing equipment. Models are presented for: EDC, Molecular Sieve, Bosch, Sabatier, a new condensing heat exchanger, SPE, SFWES, Catalytic Oxidizer, and multifiltration. The third extension is to create two system simulations using these models. The first system presented consists of one air and one water processing system. The second consists of a potential air revitalization system.

  20. Appendices to the user's manual for a computer program for the emulation/simulation of a space station environmental control and life support system

    NASA Technical Reports Server (NTRS)

    Yanosy, James L.

    1988-01-01

    A user's Manual for the Emulation Simulation Computer Model was published previously. The model consisted of a detailed model (emulation) of a SAWD CO2 removal subsystem which operated with much less detailed (simulation) models of a cabin, crew, and condensing and sensible heat exchangers. The purpose was to explore the utility of such an emulation/simulation combination in the design, development, and test of a piece of ARS hardware - SAWD. Extensions to this original effort are presented. The first extension is an update of the model to reflect changes in the SAWD control logic which resulted from the test. In addition, slight changes were also made to the SAWD model to permit restarting and to improve the iteration technique. The second extension is the development of simulation models for more pieces of air and water processing equipment. Models are presented for: EDC, Molecular Sieve, Bosch, Sabatier, a new condensing heat exchanger, SPE, SFWES, Catalytic Oxidizer, and multifiltration. The third extension is to create two system simulations using these models. The first system presented consists of one air and one water processing system, the second a potential Space Station air revitalization system.

  1. Navier-Stokes Computations of Longitudinal Forces and Moments for a Blended Wing Body

    NASA Technical Reports Server (NTRS)

    Pao, S. Paul; Biedron, Robert T.; Park, Michael A.; Fremaux, C. Michael; Vicroy, Dan D.

    2005-01-01

    The object of this paper is to investigate the feasibility of applying CFD methods to aerodynamic analyses for aircraft stability and control. The integrated aerodynamic parameters used in stability and control, however, are not necessarily those extensively validated in the state of the art CFD technology. Hence, an exploratory study of such applications and the comparison of the solutions to available experimental data will help to assess the validity of the current computation methods. In addition, this study will also examine issues related to wind tunnel measurements such as measurement uncertainty and support interference effects. Several sets of experimental data from the NASA Langley 14x22-Foot Subsonic Tunnel and the National Transonic Facility are presented. Two Navier-Stokes flow solvers, one using structured meshes and the other unstructured meshes, were used to compute longitudinal static stability derivatives for an advanced Blended Wing Body configuration over a wide range of angles of attack. The computations were performed for two different Reynolds numbers and the resulting forces and moments are compared with the above mentioned wind tunnel data.

  2. Navier-Stokes Computations of Longitudinal Forces and Moments for a Blended Wing Body

    NASA Technical Reports Server (NTRS)

    Pao, S. Paul; Biedron, Robert T.; Park, Michael A.; Fremaux, C. Michael; Vicroy, Dan D.

    2004-01-01

    The object of this paper is to investigate the feasibility of applying CFD methods to aerodynamic analyses for aircraft stability and control. The integrated aerodynamic parameters used in stability and control, however, are not necessarily those extensively validated in the state of the art CFD technology. Hence, an exploratory study of such applications and the comparison of the solutions to available experimental data will help to assess the validity of the current computation methods. In addition, this study will also examine issues related to wind tunnel measurements such as measurement uncertainty and support interference effects. Several sets of experimental data from the NASA Langley 14x22-Foot Subsonic Tunnel and the National Transonic Facility are presented. Two Navier-Stokes flow solvers, one using structured meshes and the other unstructured meshes, were used to compute longitudinal static stability derivatives for an advanced Blended Wing Body configuration over a wide range of angles of attack. The computations were performed for two different Reynolds numbers and the resulting forces and moments are compared with the above mentioned wind tunnel data.

  3. In Silico Augmentation of the Drug Development Pipeline: Examples from the study of Acute Inflammation

    PubMed Central

    An, Gary; Bartels, John; Vodovotz, Yoram

    2011-01-01

    The clinical translation of promising basic biomedical findings, whether derived from reductionist studies in academic laboratories or as the product of extensive high-throughput and -content screens in the biotechnology and pharmaceutical industries, has reached a period of stagnation in which ever higher research and development costs are yielding ever fewer new drugs. Systems biology and computational modeling have been touted as potential avenues by which to break through this logjam. However, few mechanistic computational approaches are utilized in a manner that is fully cognizant of the inherent clinical realities in which the drugs developed through this ostensibly rational process will be ultimately used. In this article, we present a Translational Systems Biology approach to inflammation. This approach is based on the use of mechanistic computational modeling centered on inherent clinical applicability, namely that a unified suite of models can be applied to generate in silico clinical trials, individualized computational models as tools for personalized medicine, and rational drug and device design based on disease mechanism. PMID:21552346

  4. Interaction entropy for protein-protein binding

    NASA Astrophysics Data System (ADS)

    Sun, Zhaoxi; Yan, Yu N.; Yang, Maoyou; Zhang, John Z. H.

    2017-03-01

    Protein-protein interactions are at the heart of signal transduction and are central to the function of protein machines in biology. The highly specific protein-protein binding is quantitatively characterized by the binding free energy, whose accurate calculation from first principles is a grand challenge in computational biology. In this paper, we show how the interaction entropy approach, which was recently proposed for protein-ligand binding free energy calculation, can be applied to computing the entropic contribution to the protein-protein binding free energy. An explicit theoretical derivation of the interaction entropy approach for protein-protein interaction systems is given in detail from the basic definition. Extensive computational studies for a dozen realistic protein-protein interaction systems are carried out using the present approach, and comparisons of the results for these protein-protein systems with those from the standard normal mode method are presented. Analysis of the present method for application in protein-protein binding as well as the limitation of the method in numerical computation is discussed. Our study and analysis of the results provide useful information for extracting the correct entropic contribution in protein-protein binding from molecular dynamics simulations.
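The averaged-exponential formula at the core of the interaction entropy method, -T*dS = kT ln<exp(beta*dE_int)> with dE_int the deviation of the interaction energy from its mean, can be evaluated directly from a time series of interaction energies. The energy samples below are invented for illustration; real input would come from MD snapshots.

```python
import math

KT = 0.5961  # kT in kcal/mol at 300 K

def interaction_entropy(e_int, kt=KT):
    """-T*dS from interaction-energy samples via the averaged exponential."""
    beta = 1.0 / kt
    mean_e = sum(e_int) / len(e_int)
    avg_exp = sum(math.exp(beta * (e - mean_e)) for e in e_int) / len(e_int)
    return kt * math.log(avg_exp)  # entropic penalty, >= 0 by Jensen's inequality

# Hypothetical interaction-energy time series (kcal/mol).
samples = [-52.1, -50.3, -53.6, -49.8, -51.9, -50.7]
penalty = interaction_entropy(samples)
```

Because the estimate depends only on fluctuations about the mean, larger energy fluctuations in the trajectory produce a larger entropic penalty.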

  5. Interaction entropy for protein-protein binding.

    PubMed

    Sun, Zhaoxi; Yan, Yu N; Yang, Maoyou; Zhang, John Z H

    2017-03-28

    Protein-protein interactions are at the heart of signal transduction and are central to the function of protein machines in biology. The highly specific protein-protein binding is quantitatively characterized by the binding free energy, whose accurate calculation from first principles is a grand challenge in computational biology. In this paper, we show how the interaction entropy approach, which was recently proposed for protein-ligand binding free energy calculation, can be applied to computing the entropic contribution to the protein-protein binding free energy. An explicit theoretical derivation of the interaction entropy approach for protein-protein interaction systems is given in detail from the basic definition. Extensive computational studies for a dozen realistic protein-protein interaction systems are carried out using the present approach, and comparisons of the results for these protein-protein systems with those from the standard normal mode method are presented. Analysis of the present method for application in protein-protein binding as well as the limitation of the method in numerical computation is discussed. Our study and analysis of the results provide useful information for extracting the correct entropic contribution in protein-protein binding from molecular dynamics simulations.

  6. High-Throughput Computing on High-Performance Platforms: A Case Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oleynik, D; Panitkin, S; Turilli, Matteo

    The computing systems used by LHC experiments have historically consisted of the federation of hundreds to thousands of distributed resources, ranging from small to mid-size. In spite of the impressive scale of the existing distributed computing solutions, the federation of small to mid-size resources will be insufficient to meet projected future demands. This paper is a case study of how the ATLAS experiment has embraced Titan, a DOE leadership computing facility, in conjunction with traditional distributed high-throughput computing to reach sustained production scales of approximately 52M core-hours a year. The three main contributions of this paper are: (i) a critical evaluation of design and operational considerations to support the sustained, scalable and production usage of Titan; (ii) a preliminary characterization of a next generation executor for PanDA to support new workloads and advanced execution modes; and (iii) early lessons for how current and future experimental and observational systems can be integrated with production supercomputers and other platforms in a general and extensible manner.

  7. Developmental Changes in Learning: Computational Mechanisms and Social Influences

    PubMed Central

    Bolenz, Florian; Reiter, Andrea M. F.; Eppinger, Ben

    2017-01-01

    Our ability to learn from the outcomes of our actions and to adapt our decisions accordingly changes over the course of the human lifespan. In recent years, there has been an increasing interest in using computational models to understand developmental changes in learning and decision-making. Moreover, extensions of these models are currently applied to study socio-emotional influences on learning in different age groups, a topic that is of great relevance for applications in education and health psychology. In this article, we aim to provide an introduction to basic ideas underlying computational models of reinforcement learning and focus on parameters and model variants that might be of interest to developmental scientists. We then highlight recent attempts to use reinforcement learning models to study the influence of social information on learning across development. The aim of this review is to illustrate how computational models can be applied in developmental science, what they can add to our understanding of developmental mechanisms and how they can be used to bridge the gap between psychological and neurobiological theories of development. PMID:29250006

  8. OpenARC: Extensible OpenACC Compiler Framework for Directive-Based Accelerator Programming Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Seyong; Vetter, Jeffrey S

    2014-01-01

    Directive-based, accelerator programming models such as OpenACC have arisen as an alternative solution to program emerging Scalable Heterogeneous Computing (SHC) platforms. However, the increased complexity in the SHC systems incurs several challenges in terms of portability and productivity. This paper presents an open-sourced OpenACC compiler, called OpenARC, which serves as an extensible research framework to address those issues in the directive-based accelerator programming. This paper explains important design strategies and key compiler transformation techniques needed to implement the reference OpenACC compiler. Moreover, this paper demonstrates the efficacy of OpenARC as a research framework for directive-based programming study, by proposing and implementing OpenACC extensions in the OpenARC framework to 1) support hybrid programming of the unified memory and separate memory and 2) exploit architecture-specific features in an abstract manner. Porting thirteen standard OpenACC programs and three extended OpenACC programs to CUDA GPUs shows that OpenARC performs similarly to a commercial OpenACC compiler, while it serves as a high-level research framework.

  9. Cyber-workstation for computational neuroscience.

    PubMed

    Digiovanna, Jack; Rattanatamrong, Prapaporn; Zhao, Ming; Mahmoudi, Babak; Hermer, Linda; Figueiredo, Renato; Principe, Jose C; Fortes, Jose; Sanchez, Justin C

    2010-01-01

    A Cyber-Workstation (CW) to study in vivo, real-time interactions between computational models and large-scale brain subsystems during behavioral experiments has been designed and implemented. The design philosophy seeks to directly link the in vivo neurophysiology laboratory with scalable computing resources to enable more sophisticated computational neuroscience investigation. The architecture designed here allows scientists to develop new models and integrate them with existing models (e.g. recursive least-squares regressor) by specifying appropriate connections in a block-diagram. Then, adaptive middleware transparently implements these user specifications using the full power of remote grid-computing hardware. In effect, the middleware deploys an on-demand and flexible neuroscience research test-bed to provide the neurophysiology laboratory extensive computational power from an outside source. The CW consolidates distributed software and hardware resources to support time-critical and/or resource-demanding computing during data collection from behaving animals. This power and flexibility is important as experimental and theoretical neuroscience evolves based on insights gained from data-intensive experiments, new technologies and engineering methodologies. This paper describes briefly the computational infrastructure and its most relevant components. Each component is discussed within a systematic process of setting up an in vivo, neuroscience experiment. Furthermore, a co-adaptive brain machine interface is implemented on the CW to illustrate how this integrated computational and experimental platform can be used to study systems neurophysiology and learning in a behavior task. We believe this implementation is also the first remote execution and adaptation of a brain-machine interface.

  10. Cyber-Workstation for Computational Neuroscience

    PubMed Central

    DiGiovanna, Jack; Rattanatamrong, Prapaporn; Zhao, Ming; Mahmoudi, Babak; Hermer, Linda; Figueiredo, Renato; Principe, Jose C.; Fortes, Jose; Sanchez, Justin C.

    2009-01-01

    A Cyber-Workstation (CW) to study in vivo, real-time interactions between computational models and large-scale brain subsystems during behavioral experiments has been designed and implemented. The design philosophy seeks to directly link the in vivo neurophysiology laboratory with scalable computing resources to enable more sophisticated computational neuroscience investigation. The architecture designed here allows scientists to develop new models and integrate them with existing models (e.g. recursive least-squares regressor) by specifying appropriate connections in a block-diagram. Then, adaptive middleware transparently implements these user specifications using the full power of remote grid-computing hardware. In effect, the middleware deploys an on-demand and flexible neuroscience research test-bed to provide the neurophysiology laboratory extensive computational power from an outside source. The CW consolidates distributed software and hardware resources to support time-critical and/or resource-demanding computing during data collection from behaving animals. This power and flexibility is important as experimental and theoretical neuroscience evolves based on insights gained from data-intensive experiments, new technologies and engineering methodologies. This paper describes briefly the computational infrastructure and its most relevant components. Each component is discussed within a systematic process of setting up an in vivo, neuroscience experiment. Furthermore, a co-adaptive brain machine interface is implemented on the CW to illustrate how this integrated computational and experimental platform can be used to study systems neurophysiology and learning in a behavior task. We believe this implementation is also the first remote execution and adaptation of a brain-machine interface. PMID:20126436

  11. Can the Use of Web-Based Comic Strip Creation Tool Facilitate EFL Learners' Grammar and Sentence Writing?

    ERIC Educational Resources Information Center

    Kilickaya, Ferit; Krajka, Jaroslaw

    2012-01-01

    Both teacher- and learner-made computer visuals are quite extensively reported in Computer-Assisted Language Learning literature, for instance, filming interviews, soap operas or mini-documentaries, creating storyboard projects, authoring podcasts and vodcasts, designing digital stories. Such student-made digital assets are used to present to…

  12. Status of Computer Applications in the Southern Land-Grant Institutions Experiment Stations/Extension Services/Resident Instruction.

    ERIC Educational Resources Information Center

    Rendiero, Jane; Linder, William W.

    This report summarizes the results of a survey of 29 southern land-grant institutions which elicited information on microcomputer capabilities, programming efforts, and computer awareness education for farmers, homemakers, community organizations, planning agencies, and other end users. Five topics were covered by the survey: (1) degree of…

  13. Teachers Must Push Technology's Tidal Wave: District Technology Initiatives Must Put the Teacher in Charge.

    ERIC Educational Resources Information Center

    Guhlin, Miguel

    2002-01-01

    For technology to impact student achievement, teachers must be empowered via extensive staff development. This paper presents building-level technology initiatives (e.g., peer training, super substitutes, and computer clubs) and district- level initiatives (e.g., establish a district technology committee, allow teachers to take computers home over…

  14. Using Computational Text Classification for Qualitative Research and Evaluation in Extension

    ERIC Educational Resources Information Center

    Smith, Justin G.; Tissing, Reid

    2018-01-01

    This article introduces a process for computational text classification that can be used in a variety of qualitative research and evaluation settings. The process leverages supervised machine learning based on an implementation of a multinomial Bayesian classifier. Applied to a community of inquiry framework, the algorithm was used to identify…
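    The multinomial Bayesian classifier at the core of such a text-classification pipeline can be sketched from scratch with Laplace smoothing (a minimal illustration with invented example documents; the article's actual feature pipeline and implementation are not specified here):

```python
import math
from collections import Counter, defaultdict

def train_mnb(docs):
    """docs: list of (token_list, label) pairs. Returns log-priors and
    Laplace-smoothed log-likelihoods for multinomial naive Bayes."""
    label_counts = Counter(label for _, label in docs)
    word_counts = defaultdict(Counter)          # label -> token frequency
    vocab = set()
    for tokens, label in docs:
        word_counts[label].update(tokens)
        vocab.update(tokens)
    log_prior = {c: math.log(n / len(docs)) for c, n in label_counts.items()}
    log_like = {}
    for c in label_counts:
        total = sum(word_counts[c].values())
        log_like[c] = {w: math.log((word_counts[c][w] + 1) / (total + len(vocab)))
                       for w in vocab}
    return log_prior, log_like

def classify(tokens, log_prior, log_like):
    # score each class by log P(c) + sum of log P(w|c); ignore unseen words
    scores = {c: log_prior[c] + sum(log_like[c].get(w, 0.0) for w in tokens)
              for c in log_prior}
    return max(scores, key=scores.get)

docs = [("great helpful workshop".split(), "positive"),
        ("helpful clear session".split(), "positive"),
        ("boring unclear talk".split(), "negative"),
        ("confusing boring slides".split(), "negative")]
prior, like = train_mnb(docs)
label = classify("helpful workshop".split(), prior, like)   # "positive"
```

    In supervised use, the labeled training documents come from hand-coded responses (here, the community-of-inquiry coding the authors describe), and the trained model then labels the remaining corpus.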

  15. Design & Delivery of Training for a State-Wide Data Communication Network.

    ERIC Educational Resources Information Center

    Zacher, Candace M.

    This report describes the process of development of training for agricultural research, teaching, and extension professionals in how to use the Fast Agricultural Communications Terminal (FACTS) computer network at Purdue University (Indiana), which is currently being upgraded in order to utilize the latest computer technology. The FACTS system is…

  16. The Effect of CRT Screen Design on Learning.

    ERIC Educational Resources Information Center

    Grabinger, R. Scott; Albers, Starleen

    Two computer assisted instruction programs tested the effects of plain and enhanced screen designs with or without information about those designs and task-type on time and learning. Subjects were 140 fourth grade students in Lincoln, Nebraska who had extensive prior experience with computers. The enhanced versions used headings, directive cues,…

  17. Probing End-User IT Security Practices--Through Homework

    ERIC Educational Resources Information Center

    Smith, Sean W.

    2004-01-01

    At Dartmouth College, the author teaches a course called "Security and Privacy." Its early position in the overall computer science curriculum means the course needs to be introductory, and the author can't assume the students possess an extensive computer science background. These constraints leave the author with a challenge: to construct…

  18. Computer-Assisted Bilingual/Bicultural Multiskills Project, 1987-1988. OREA Report.

    ERIC Educational Resources Information Center

    Berney, Tomi D.; Carey, Cecilia

    The Computer-Assisted Bilingual/Bicultural Multiskills Project completed its first year of an extension grant. The program used computerized and non-computerized instruction to help 109 native speakers of Haitian Creole/French and Spanish, most of whom were recent immigrants, develop English-as-a-Second-Language (ESL), native-language, and content…

  19. Classroom Talk and Computational Thinking

    ERIC Educational Resources Information Center

    Jenkins, Craig W.

    2017-01-01

    This paper is part of a wider action research project taking place at a secondary school in South Wales, UK. The overarching aim of the project is to examine the potential for aspects of literacy and computational thinking to be developed using extensible 'build your own block' programming activities. This paper examines classroom talk at an…

  20. 45 CFR 1630.13 - Time.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 4 2010-10-01 2010-10-01 false Time. 1630.13 Section 1630.13 Public Welfare... § 1630.13 Time. (a) Computation. Time limits specified in this part shall be computed in accordance with... recipient's written request for good cause, grant an extension of time and shall so notify the recipient in...

  1. 45 CFR 1630.13 - Time.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 45 Public Welfare 4 2012-10-01 2012-10-01 false Time. 1630.13 Section 1630.13 Public Welfare... § 1630.13 Time. (a) Computation. Time limits specified in this part shall be computed in accordance with... recipient's written request for good cause, grant an extension of time and shall so notify the recipient in...

  2. 45 CFR 1630.13 - Time.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 4 2011-10-01 2011-10-01 false Time. 1630.13 Section 1630.13 Public Welfare... § 1630.13 Time. (a) Computation. Time limits specified in this part shall be computed in accordance with... recipient's written request for good cause, grant an extension of time and shall so notify the recipient in...

  3. 45 CFR 1630.13 - Time.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 45 Public Welfare 4 2013-10-01 2013-10-01 false Time. 1630.13 Section 1630.13 Public Welfare... § 1630.13 Time. (a) Computation. Time limits specified in this part shall be computed in accordance with... recipient's written request for good cause, grant an extension of time and shall so notify the recipient in...

  4. 45 CFR 1630.13 - Time.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 4 2014-10-01 2014-10-01 false Time. 1630.13 Section 1630.13 Public Welfare... § 1630.13 Time. (a) Computation. Time limits specified in this part shall be computed in accordance with... recipient's written request for good cause, grant an extension of time and shall so notify the recipient in...

  5. The Disclosure-Intimacy Link in Computer-Mediated Communication: An Attributional Extension of the Hyperpersonal Model

    ERIC Educational Resources Information Center

    Jiang, L. Crystal; Bazarova, Natalie N.; Hancock, Jeffrey T.

    2011-01-01

    The present research investigated whether the attribution process through which people explain self-disclosures differs in text-based computer-mediated interactions versus face to face, and whether differences in causal attributions account for the increased intimacy frequently observed in mediated communication. In the experiment participants…

  6. EXTENSION OF COMPUTER-AIDED PROCESS ENGINEERING APPLICATIONS TO ENVIRONMENTAL LIFE CYCLE ASSESSMENT AND SUPPLY CHAIN MANAGEMENT

    EPA Science Inventory

    The potential of computer-aided process engineering (CAPE) tools to enable process engineers to improve the environmental performance of both their processes and across the life cycle (from cradle-to-grave) has long been proffered. However, this use of CAPE has not been fully ach...

  7. A vectorized Lanczos eigensolver for high-performance computers

    NASA Technical Reports Server (NTRS)

    Bostic, Susan W.

    1990-01-01

    The computational strategies used to implement a Lanczos-based-method eigensolver on the latest generation of supercomputers are described. Several examples of structural vibration and buckling problems are presented that show the effects of using optimization techniques to increase the vectorization of the computational steps. The data storage and access schemes and the tools and strategies that best exploit the computer resources are presented. The method is implemented on the Convex C220, the Cray 2, and the Cray Y-MP computers. Results show that very good computation rates are achieved for the most computationally intensive steps of the Lanczos algorithm and that the Lanczos algorithm is many times faster than other methods extensively used in the past.
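    The Lanczos iteration behind such an eigensolver reduces a symmetric matrix to a small tridiagonal matrix whose eigenvalues (Ritz values) approximate the extremes of the original spectrum. A minimal NumPy sketch with full reorthogonalization for numerical stability (an illustration of the basic algorithm, not the vectorized supercomputer implementation described in the report):

```python
import numpy as np

def lanczos(A, k, seed=0):
    """Run k Lanczos steps on symmetric matrix A; return the k x k
    tridiagonal matrix T whose eigenvalues approximate A's extremes."""
    n = A.shape[0]
    rng = np.random.default_rng(seed)
    Q = np.zeros((n, k))
    q = rng.standard_normal(n)
    q /= np.linalg.norm(q)
    alpha, beta = np.zeros(k), np.zeros(k - 1)
    for j in range(k):
        Q[:, j] = q
        w = A @ q
        alpha[j] = q @ w
        w -= alpha[j] * q
        if j > 0:
            w -= beta[j - 1] * Q[:, j - 1]
        # full reorthogonalization against all previous Lanczos vectors
        w -= Q[:, :j + 1] @ (Q[:, :j + 1].T @ w)
        if j < k - 1:
            beta[j] = np.linalg.norm(w)
            q = w / beta[j]
    return np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)

# Test matrix with known spectrum 1..10
A = np.diag(np.arange(1.0, 11.0))
T = lanczos(A, 10)
ritz = np.linalg.eigvalsh(T)
# extreme Ritz values match the extreme eigenvalues 1 and 10
```

    In structural vibration and buckling problems only a few extreme eigenpairs are needed, so k is kept far smaller than n; the matrix-vector product `A @ q` is the step that dominates cost and is the natural target for vectorization.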

  8. Computer Graphics Research Laboratory Quarterly Progress Report Number 49, July-September 1993

    DTIC Science & Technology

    1993-11-22

    Contents include: Texture Sampling and Strength Guided Motion (Jeffry S. Nimeroff); Radiosity (Min-Zhi Shao); Blended Shape Primitives (Douglas DeCarlo); …placement; extensions of radiosity rendering; a discussion of blended shape primitives and their applications in computer vision and computer…user. Radiosity: an improved version of the radiosity renderer is included; this version uses a fast over-relaxation progressive refinement algorithm.

  9. A survey of computer search service costs in the academic health sciences library.

    PubMed Central

    Shirley, S

    1978-01-01

    The Norris Medical Library, University of Southern California, has recently completed an extensive survey of costs involved in the provision of computer search services beyond vendor charges for connect time and printing. In this survey costs for such items as terminal depreciation, repair contract, personnel time, and supplies are analyzed. Implications of this cost survey are discussed in relation to planning and price setting for computer search services. PMID:708953

  10. A study of computer graphics technology in application of communication resource management

    NASA Astrophysics Data System (ADS)

    Li, Jing; Zhou, Liang; Yang, Fei

    2017-08-01

    With the development of computer technology, computer graphics has come into wide use; in particular, the success of object-oriented and multimedia technologies has driven the growth of graphics technology in computer software systems. Computer graphics theory and application have therefore become an important topic in the computer field, and graphics technology is applied in an ever wider range of domains. In recent years, with the rapid development of information technology, the traditional approach to communication resource management can no longer effectively meet resource management needs: it still relies on the original tools and methods for managing and maintaining equipment, which has caused many problems. Non-professionals find it difficult to understand the equipment and its status, resource utilization is relatively low, and managers cannot quickly and accurately assess resource conditions. To address these problems, this paper proposes introducing computer graphics technology into communication resource management. The introduction of computer graphics not only makes communication resource management more vivid, but also reduces the cost of resource management and improves work efficiency.

  11. Quantum Monte Carlo for atoms and molecules

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnett, R.N.

    1989-11-01

    The diffusion quantum Monte Carlo with fixed nodes (QMC) approach has been employed in studying energy eigenstates for 1--4 electron systems. Previous work employing the diffusion QMC technique yielded energies of high quality for H{sub 2}, LiH, Li{sub 2}, and H{sub 2}O. Here, the range of calculations with this new approach has been extended to include additional first-row atoms and molecules. In addition, improvements in the previously computed fixed-node energies of LiH, Li{sub 2}, and H{sub 2}O have been obtained using more accurate trial functions. All computations were performed within, but are not limited to, the Born-Oppenheimer approximation. In our computations, the effects of variation of Monte Carlo parameters on the QMC solution of the Schroedinger equation were studied extensively. These parameters include the time step, renormalization time, and nodal structure. These studies have been very useful in determining which choices of such parameters will yield accurate QMC energies most efficiently. Generally, very accurate energies (90--100% of the correlation energy) have been computed with single-determinant trial functions multiplied by simple correlation functions. Improvements in accuracy should be readily obtained using more complex trial functions.

  12. Access to pedestrian roads, daily activities, and physical performance of adolescents.

    PubMed

    Sjolie, A N

    2000-08-01

    A cross-sectional study using a questionnaire and physical tests was performed to examine how access to pedestrian roads and daily activities are related to low back strength, low back mobility, and hip mobility in adolescents. Although many authorities express concern about the passive lifestyle of adolescents, little is known about associations between daily activities and physical performance. This study compared 38 youths in a community lacking access to pedestrian roads with 50 youths in a nearby area providing excellent access to pedestrian roads. A standardized questionnaire was used to obtain data about pedestrian roads, school journeys, and activities from the local authorities and the pupils. Low back strength was tested as static endurance strength, low back mobility by modified Schober techniques, and hip mobility by goniometer. For statistical analyses, a P value of 0.05 or less determined significance. In the area using school buses, the pupils had less low back extension, less hamstring flexibility, and less hip abduction, flexion, and extension than pupils in the area with pedestrian roads. Multivariate analyses showed no associations between walking or bicycling to school and anatomic function, but regular walking or bicycling to leisure-time activities was positively associated with low back strength, low back extension, hip flexion, and extension. Distance by school bus was negatively associated with hip abduction, hip flexion, hip extension, and hamstring flexibility (P<0.001). Time spent on television or computer was negatively but insignificantly associated with low back strength, hamstring flexibility, hip abduction, and flexion (P<0.1). The results indicate that access to pedestrian roads and other lifestyle factors are associated with physical performance.

  13. Smaller external notebook mice have different effects on posture and muscle activity.

    PubMed

    Oude Hengel, Karen M; Houwink, Annemieke; Odell, Dan; van Dieën, Jaap H; Dennerlein, Jack T

    2008-07-01

    Extensive computer mouse use is an identified risk factor for computer work-related musculoskeletal disorders; however, notebook computer mouse designs of varying sizes have not been formally evaluated but may affect biomechanical risk factors. Thirty adults performed a set of mouse tasks with five notebook mice, ranging in length from 75 to 105 mm and in width from 35 to 65 mm, and a reference desktop mouse. An electro-magnetic motion analysis system measured index finger (metacarpophalangeal joint), wrist, and forearm postures, and surface electromyography measured muscle activity of three extensor muscles in the forearm and the first dorsal interosseus. The smallest notebook mice were found to promote less neutral postures (up to 3.2 degrees higher metacarpophalangeal joint adduction, 6.5 degrees higher metacarpophalangeal joint flexion, and 2.3 degrees higher wrist extension) and higher muscle activity (up to 4.1% of maximum voluntary contraction higher wrist extensor muscle activity). Participants with smaller hands had overall more non-neutral postures than participants with larger hands (up to 5.6 degrees higher wrist extension and 5.9 degrees higher pronation), while participants with larger hands were more influenced by the smallest notebook mice (up to 3.6 degrees higher wrist extension and 5.5% of maximum voluntary contraction higher wrist extensor values). Self-reported ratings showed that while participants preferred smaller mice for portability, larger mice scored higher on comfort and usability. The smallest notebook mice increased the intensity of biomechanical exposures. Longer term mouse use could enhance these differences, having a potential impact on the prevention of work-related musculoskeletal disorders.

  14. Virtual transplantation in designing a facial prosthesis for extensive maxillofacial defects that cross the facial midline using computer-assisted technology.

    PubMed

    Feng, Zhi-hong; Dong, Yan; Bai, Shi-zhu; Wu, Guo-feng; Bi, Yun-peng; Wang, Bo; Zhao, Yi-min

    2010-01-01

    The aim of this article was to demonstrate a novel approach to designing facial prostheses using the transplantation concept and computer-assisted technology for extensive, large, maxillofacial defects that cross the facial midline. The three-dimensional (3D) facial surface images of a patient and his relative were reconstructed using data obtained through optical scanning. Based on these images, the corresponding portion of the relative's face was transplanted to the patient's where the defect was located, which could not be rehabilitated using mirror projection, to design the virtual facial prosthesis without the eye. A 3D model of an artificial eye that mimicked the patient's remaining one was developed, transplanted, and fit onto the virtual prosthesis. A personalized retention structure for the artificial eye was designed on the virtual facial prosthesis. The wax prosthesis was manufactured through rapid prototyping, and the definitive silicone prosthesis was completed. The size, shape, and cosmetic appearance of the prosthesis were satisfactory and matched the defect area well. The patient's facial appearance was recovered perfectly with the prosthesis, as determined through clinical evaluation. The optical 3D imaging and computer-aided design/computer-assisted manufacturing system used in this study can design and fabricate facial prostheses more precisely than conventional manual sculpturing techniques. The discomfort generally associated with such conventional methods was decreased greatly. The virtual transplantation used to design the facial prosthesis for the maxillofacial defect, which crossed the facial midline, and the development of the retention structure for the eye were both feasible.

  15. QPROT: Statistical method for testing differential expression using protein-level intensity data in label-free quantitative proteomics.

    PubMed

    Choi, Hyungwon; Kim, Sinae; Fermin, Damian; Tsou, Chih-Chiang; Nesvizhskii, Alexey I

    2015-11-03

    We introduce QPROT, a statistical framework and computational tool for differential protein expression analysis using protein intensity data. QPROT is an extension of the QSPEC suite, originally developed for spectral count data, adapted for analysis of continuously measured protein-level intensity data. QPROT offers a new intensity normalization procedure and model-based differential expression analysis, both of which account for missing data. Differential expression of each protein is determined from a standardized Z-statistic computed from the posterior distribution of the log fold change parameter, guided by the false discovery rate estimated by a well-known Empirical Bayes method. We evaluated the classification performance of QPROT using the quantification calibration data from the clinical proteomic technology assessment for cancer (CPTAC) study and a recently published Escherichia coli benchmark dataset, with evaluation of FDR accuracy in the latter. QPROT is a statistical framework with an accompanying software tool for comparative quantitative proteomics analysis. It features various extensions of the QSPEC method originally built for spectral count data analysis, including probabilistic treatment of missing values in protein intensity data. With the increasing popularity of label-free quantitative proteomics, the proposed method and accompanying software suite will be immediately useful for many proteomics laboratories. This article is part of a Special Issue entitled: Computational Proteomics.
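    The last step of such an analysis, turning per-protein statistics into discoveries at a controlled false discovery rate, can be illustrated with the generic Benjamini-Hochberg procedure (a standard textbook sketch on invented p-values; QPROT itself estimates FDR with an empirical Bayes method rather than from raw p-values):

```python
def benjamini_hochberg(pvalues, alpha=0.05):
    """Return indices of hypotheses rejected at FDR level alpha."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    # find the largest rank k with p_(k) <= (k/m) * alpha,
    # then reject the k smallest p-values
    k_max = 0
    for rank, i in enumerate(order, start=1):
        if pvalues[i] <= rank / m * alpha:
            k_max = rank
    return sorted(order[:k_max])

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.060, 0.074, 0.205]
rejected = benjamini_hochberg(pvals, alpha=0.05)   # indices [0, 1]
```

    Note that 0.039 is below alpha yet not rejected: the step-up thresholds (k/m)*alpha are stricter than a per-test cutoff, which is how the procedure bounds the expected fraction of false discoveries rather than the per-test error rate.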

  16. An enhanced beam model for constrained layer damping and a parameter study of damping contribution

    NASA Astrophysics Data System (ADS)

    Xie, Zhengchao; Shepard, W. Steve, Jr.

    2009-01-01

    An enhanced analytical model is presented based on an extension of previous models for constrained layer damping (CLD) in beam-like structures. Most existing CLD models are based on the assumption that shear deformation in the core layer is the only source of damping in the structure. However, previous research has shown that other types of deformation in the core layer, such as deformations from longitudinal extension and transverse compression, can also be important. In the enhanced analytical model developed here, shear, extension, and compression deformations are all included. This model can be used to predict the natural frequencies and modal loss factors. The numerical study shows that compared to other models, this enhanced model is accurate in predicting the dynamic characteristics. As a result, the model can be accepted as a general computation model. With all three types of damping included and the formulation used here, it is possible to study the impact of the structure's geometry and boundary conditions on the relative contribution of each type of damping. To that end, the relative contributions in the frequency domain for a few sample cases are presented.

  17. The electromagnetic modeling of thin apertures using the finite-difference time-domain technique

    NASA Technical Reports Server (NTRS)

    Demarest, Kenneth R.

    1987-01-01

    A technique which computes transient electromagnetic responses of narrow apertures in complex conducting scatterers was implemented as an extension of previously developed Finite-Difference Time-Domain (FDTD) computer codes. Although these apertures are narrow with respect to the wavelengths contained within the power spectrum of excitation, this technique does not require significantly more computer resources to attain the increased resolution at the apertures. In the report, an analytical technique which utilizes Babinet's principle to model the apertures is developed, and an FDTD computer code which utilizes this technique is described.

  18. Computer use at work is associated with self-reported depressive and anxiety disorder.

    PubMed

    Kim, Taeshik; Kang, Mo-Yeol; Yoo, Min-Sang; Lee, Dongwook; Hong, Yun-Chul

    2016-01-01

    With the development of technology, extensive use of computers in the workplace is prevalent and increases efficiency. However, computer users are facing new harmful working conditions with high workloads and longer hours. This study aimed to investigate the association between computer use at work and self-reported depressive and anxiety disorder (DAD) in a nationally representative sample of South Korean workers. This cross-sectional study was based on the third Korean Working Conditions Survey (2011), and 48,850 workers were analyzed. Information about computer use and DAD was obtained from a self-administered questionnaire. We investigated the relation between computer use at work and DAD using logistic regression. The 12-month prevalence of DAD in computer-using workers was 1.46 %. After adjustment for socio-demographic factors, the odds ratio for DAD was higher in workers using computers more than 75 % of their workday (OR 1.69, 95 % CI 1.30-2.20) than in workers using computers less than 50 % of their shift. After stratifying by working hours, computer use for over 75 % of the work time was significantly associated with increased odds of DAD in 20-39, 41-50, 51-60, and over 60 working hours per week. After stratifying by occupation, education, and job status, computer use for more than 75 % of the work time was related with higher odds of DAD in sales and service workers, those with high school and college education, and those who were self-employed and employers. A high proportion of computer use at work may be associated with depressive and anxiety disorder. This finding suggests the necessity of a work guideline to help the workers suffering from high computer use at work.
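    The odds ratios reported above come from logistic regression; for a single binary exposure the unadjusted odds ratio reduces to a 2x2 cross-product with a Wald confidence interval, which can be sketched as follows (the counts below are invented for illustration, not the survey's data):

```python
import math

def odds_ratio(exp_cases, exp_controls, unexp_cases, unexp_controls):
    """Unadjusted odds ratio for a 2x2 table with a 95% Wald CI."""
    or_ = (exp_cases * unexp_controls) / (exp_controls * unexp_cases)
    # standard error of log(OR) from the four cell counts
    se = math.sqrt(1 / exp_cases + 1 / exp_controls
                   + 1 / unexp_cases + 1 / unexp_controls)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

# Hypothetical counts: DAD cases vs. non-cases by heavy computer use
or_, ci = odds_ratio(60, 2400, 40, 2700)
# or_ = (60 * 2700) / (2400 * 40) = 1.6875
```

    The adjusted odds ratios in the study additionally condition on socio-demographic covariates inside the logistic model, so they generally differ from this crude cross-product.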

  19. Reconstructing constructivism: causal models, Bayesian learning mechanisms, and the theory theory.

    PubMed

    Gopnik, Alison; Wellman, Henry M

    2012-11-01

    We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and nontechnical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and the psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists.
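    The Bayesian learning mechanism the authors invoke can be illustrated with a minimal posterior update over two candidate causal hypotheses (a toy sketch; the hypotheses and likelihood numbers below are invented for illustration):

```python
def bayes_update(prior, likelihoods, observation):
    """prior: {hypothesis: P(h)}; likelihoods: {hypothesis: {obs: P(obs|h)}}.
    Returns the normalized posterior after one observation."""
    unnorm = {h: prior[h] * likelihoods[h][observation] for h in prior}
    z = sum(unnorm.values())
    return {h: p / z for h, p in unnorm.items()}

# Two hypotheses about a toy: pressing the button causes the light
# ("causal") or the light blinks on its own ("independent").
prior = {"causal": 0.5, "independent": 0.5}
lik = {"causal":      {"press_and_light": 0.9, "press_no_light": 0.1},
       "independent": {"press_and_light": 0.3, "press_no_light": 0.7}}

post = prior
for obs in ["press_and_light", "press_and_light", "press_no_light"]:
    post = bayes_update(post, lik, obs)
# two confirmations outweigh one disconfirmation, so "causal" stays ahead
```

    Evidence accumulates gracefully rather than all-or-none, which is the property that lets such models capture the variable, progressive theory change the review describes.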

  20. Prosodic analysis by rule

    NASA Astrophysics Data System (ADS)

    Lindsay, D.

    1985-02-01

    Research on the automatic computer analysis of intonation using linguistic knowledge is described. The use of computer programs to analyze and classify fundamental frequency (F0) contours, and work on the psychophysics of British English intonation and on the phonetics of F0 contours, are described. Results suggest that F0 can be conveniently tracked to represent intonation through time, which can subsequently be used by a computer program as the basis for analysis. Nuclear intonation was studied, where the intonational nucleus is the region of auditory prominence, or information focus, found in all spoken sentences. The main mechanism behind such prominence is the perception of an extensive F0 movement on the nuclear syllable. A classification of the nuclear contour shape is a classification of the sentence type, often into categories that cannot be readily determined from only the segmental phonemes of the utterance.

  1. On finite element implementation and computational techniques for constitutive modeling of high temperature composites

    NASA Technical Reports Server (NTRS)

    Saleeb, A. F.; Chang, T. Y. P.; Wilt, T.; Iskovitz, I.

    1989-01-01

    The research work performed during the past year on finite element implementation and computational techniques pertaining to high temperature composites is outlined. In the present research, two main issues are addressed: efficient geometric modeling of composite structures and expedient numerical integration techniques dealing with constitutive rate equations. In the first issue, mixed finite elements for modeling laminated plates and shells were examined in terms of numerical accuracy, locking property and computational efficiency. Element applications include (currently available) linearly elastic analysis and future extension to material nonlinearity for damage predictions and large deformations. On the material level, various integration methods to integrate nonlinear constitutive rate equations for finite element implementation were studied. These include explicit, implicit and automatic subincrementing schemes. In all cases, examples are included to illustrate the numerical characteristics of various methods that were considered.

  2. Disseminated Multi-system Sarcoidosis Mimicking Metastases on 18F-FDG PET/CT.

    PubMed

    Makis, William; Palayew, Mark; Rush, Christopher; Probst, Stephan

    2018-06-07

    A 60-year-old female with no significant medical history presented with hematuria. A computed tomography (CT) scan revealed extensive lymphadenopathy with hypodensities in the liver and spleen, and she was referred for an 18F-fluorodeoxyglucose (18F-FDG) positron emission tomography/CT (PET/CT) study to assess for malignancy of unknown primary. PET/CT revealed extensive 18F-FDG-avid lymphadenopathy as well as innumerable intensely 18F-FDG-avid lung, liver, and splenic nodules, highly concerning for malignancy. A PET-guided bone marrow biopsy of the posterior superior iliac spine revealed several non-necrotizing, well-formed granulomas, consistent with sarcoidosis. The patient was managed conservatively and remained clinically well over the subsequent 9 years of follow-up.

  3. Ab initio study of excited state electronic circular dichroism. Two prototype cases: methyl oxirane and R-(+)-1,1'-bi(2-naphthol).

    PubMed

    Rizzo, Antonio; Vahtras, Olav

    2011-06-28

    A computational approach to the calculation of excited state electronic circular dichroism (ESECD) spectra of chiral molecules is discussed. Frequency dependent quadratic response theory is employed to compute the rotatory strength for transitions between excited electronic states, employing both a magnetic gauge dependent and a (velocity-based) magnetic gauge independent approach. Application is made to the lowest excited states of two prototypical chiral molecules, propylene oxide, also known as 1,2-epoxypropane or methyl oxirane, and R-(+)-1,1'-bi(2-naphthol), or BINOL. The dependence of the rotatory strength for transitions between the lowest three excited states of methyl oxirane upon the quality and extension of the basis set is analyzed by employing a hierarchy of correlation consistent basis sets. Having established that basis sets of at least triple-zeta quality, and at least doubly augmented, are sufficient to ensure converged results, at least at the Hartree-Fock self-consistent field (HF-SCF) level, the rotatory strengths for all transitions between the lowest excited electronic states of methyl oxirane are computed and analyzed employing HF-SCF and density functional theory (DFT) electronic structure models. For DFT, both the popular B3LYP and its recently highly successful CAM-B3LYP extension are exploited. The strong dependence of the spectra upon electron correlation is highlighted. A HF-SCF and DFT study is also carried out for BINOL, a system whose excited states show the typical pairing structure arising from the interaction of the two monomeric moieties, and whose conformational changes following photoexcitation were studied recently via time-resolved CD.

  4. Impact of random discrete dopant in extension induced fluctuation in gate-source/drain underlap FinFET

    NASA Astrophysics Data System (ADS)

    Wang, Yijiao; Huang, Peng; Xin, Zheng; Zeng, Lang; Liu, Xiaoyan; Du, Gang; Kang, Jinfeng

    2014-01-01

    In this work, three-dimensional technology computer-aided design (TCAD) simulations are performed to investigate the impact of random discrete dopant (RDD), including extension-induced fluctuation, in 14 nm silicon-on-insulator (SOI) gate-source/drain (G-S/D) underlap fin field effect transistors (FinFETs). To fully understand the RDD impact in the extension, the RDD effect is evaluated in the channel and the extension separately and together. The statistical variability of FinFET performance parameters including threshold voltage (Vth), subthreshold slope (SS), drain-induced barrier lowering (DIBL), drive current (Ion), and leakage current (Ioff) is analyzed. The results indicate that RDD in the extension can lead to substantial variability, especially for SS, DIBL, and Ion, and should be taken into account together with that in the channel to obtain an accurate estimate of random dopant fluctuation (RDF) effects. Meanwhile, a higher doping concentration in the extension region is suggested from the perspective of overall variability control.

  5. Analytical and experimental vibration studies of a 1/8-scale shuttle orbiter

    NASA Technical Reports Server (NTRS)

    Pinson, L. D.

    1975-01-01

    Natural frequencies and mode shapes for four symmetric vibration modes and four antisymmetric modes are compared with predictions based on NASTRAN finite-element analyses. Initial predictions gave poor agreement with test data; an extensive investigation revealed that the major factors influencing agreement were out-of-plane imperfections in fuselage panels and a soft fin-fuselage connection. Computations with a more refined analysis indicated satisfactory frequency predictions for all modes studied, within 11 percent of experimental values.

  6. Thermal Evolution of the North-Central Gulf Coast

    NASA Astrophysics Data System (ADS)

    Nunn, Jeffrey A.; Scardina, Allan D.; Pilger, Rex H., Jr.

    1984-12-01

    The subsidence history of the North Louisiana Salt Basin, determined from well data, indicates that the region underwent extension during rifting and has since passively subsided due to conductive cooling of the lithosphere. Timing of the rifting event is consistent with opening of the Gulf of Mexico during Late Triassic to Early Jurassic time. Crustal extension by a factor of 1.5 to 2 was computed from "tectonic" subsidence curves. However, data from the early subsidence history are insufficient to distinguish between uniform and nonuniform extension of the lithosphere. The magnitude of extension is in good agreement with total sediment and crustal thicknesses from seismic refraction data in the adjacent Central Mississippi Salt Basin. The temperature distribution within the sediments is calculated using a simple heat conduction model. Temperature and subsidence effects of thermal insulation by overlying sediments are included. The computed temperature distribution is in good agreement with bottom hole temperatures measured in deep wells. Temperature histories predicted for selected stratigraphic horizons within the North Louisiana Salt Basin suggest that thermal conditions have been favorable for hydrocarbon generation in the older strata. Results from a two-dimensional heat conduction model suggest that a probable cause for the early formation of the adjacent uplifts is lateral heat conduction from the basin. Rapid extension of the lithosphere underneath areas with horizontal dimensions of 50-100 km produces extremely rapid early subsidence due to lateral heat conduction. The moderate subsidence rate observed in the North Louisiana Salt Basin during the Jurassic and Early Cretaceous suggests slow extension over a long period of time.
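
The "tectonic subsidence" reasoning follows the standard uniform-stretching picture: after rifting by a factor beta, the lithosphere subsides thermally with a characteristic time constant. A minimal sketch of the post-rift term (the constants here are generic illustrations, not the paper's calibration):

```python
import math

def thermal_subsidence(t_myr, beta, tau=62.8, e0=3200.0):
    """Post-rift thermal subsidence (m) after McKenzie-style uniform
    stretching by a factor beta. tau is the lithospheric thermal time
    constant (Myr) and e0 a lumped amplitude constant (m); both values
    are illustrative defaults, not fitted to the North Louisiana basin."""
    amplitude = e0 * (beta / math.pi) * math.sin(math.pi / beta)
    return amplitude * (1.0 - math.exp(-t_myr / tau))

# subsidence grows with elapsed time and with the stretching factor beta,
# which is why a subsidence curve constrains beta at all
```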

  7. GDA (Geologic Data Assistant), an ArcPad extension for geologic mapping: code, prerequisites, and instructions

    USGS Publications Warehouse


    2006-01-01

    GDA (Geologic Data Assistant) is an extension to ArcPad, a mobile mapping software program by Environmental Systems Research Institute (ESRI) designed to run on personal digital assistant (PDA) computers. GDA and ArcPad allow a PDA to replace the paper notebook and field map traditionally used for geologic mapping. GDA allows easy collection of field data.

  8. Determining causal miRNAs and their signaling cascade in diseases using an influence diffusion model.

    PubMed

    Nalluri, Joseph J; Rana, Pratip; Barh, Debmalya; Azevedo, Vasco; Dinh, Thang N; Vladimirov, Vladimir; Ghosh, Preetam

    2017-08-15

    In recent studies, miRNAs have been found to be extremely influential in many essential biological processes. They exhibit a self-regulatory mechanism through which they act as positive/negative regulators of expression of genes and other miRNAs. This has direct implications for the regulation of various pathophysiological conditions, signaling pathways and different types of cancers. Studying miRNA-disease associations has been an extensive area of research; however, deciphering miRNA-miRNA network regulatory patterns in several diseases remains a challenge. In this study, we use information diffusion theory to quantify the influence diffusion in a miRNA-miRNA regulation network across multiple disease categories. Our proposed methodology determines the critical disease-specific miRNAs which play a causal role in their signaling cascade and hence may regulate disease progression. We extensively validate our framework using existing computational tools from the literature. Furthermore, we implement our framework on a comprehensive miRNA expression data set for alcohol dependence and identify the causal miRNAs for alcohol dependency in patients, validated by the phase shift in their expression scores towards the early stages of the disease. Finally, our computational framework for identifying causal miRNAs implicated in diseases is available as a free online tool for the greater scientific community.
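
The diffusion idea can be sketched as a toy propagation of influence over a weighted, directed miRNA-miRNA graph; the network, weights and damping factor below are hypothetical and stand in for the authors' actual diffusion model:

```python
def diffuse_influence(adj, seed, steps=3, damping=0.5):
    """Toy influence diffusion: at each step every node passes a damped
    fraction of its current influence along its outgoing edges. Ranking
    nodes by accumulated influence suggests causal heads of a cascade.
    A simplified sketch of the diffusion idea, not the paper's model."""
    influence = {node: 0.0 for node in adj}
    influence[seed] = 1.0
    for _ in range(steps):
        new = dict(influence)
        for u, edges in adj.items():
            for v, weight in edges:
                new[v] += damping * weight * influence[u]
        influence = new
    return influence

# hypothetical cascade: miR-a regulates miR-b, which regulates miR-c
net = {"miR-a": [("miR-b", 0.8)], "miR-b": [("miR-c", 0.6)], "miR-c": []}
scores = diffuse_influence(net, "miR-a")
```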

  9. Computer-Managed Instruction: Theory, Application, and Some Key Implementation Issues.

    DTIC Science & Technology

    1984-03-01

    …educators who have endorsed computer technology but fail to adopt it. As one educational consultant claims: "Educators appear to have a deep-set skepticism toward…widespread use. … In the mid-1950's, while still in its infancy, computer technology entered the world of education…to utilize the new technology, and to do it most extensively. Implementation of CMI in a standalone configuration using microcomputers has been…

  10. Computation of repetitions and regularities of biologically weighted sequences.

    PubMed

    Christodoulakis, M; Iliopoulos, C; Mouchard, L; Perdikuri, K; Tsakalidis, A; Tsichlas, K

    2006-01-01

    Biological weighted sequences are used extensively in molecular biology as profiles for protein families, in the representation of binding sites and often for the representation of sequences produced by a shotgun sequencing strategy. In this paper, we address three fundamental problems in the area of biologically weighted sequences: (i) computation of repetitions, (ii) pattern matching, and (iii) computation of regularities. Our algorithms can be used as basic building blocks for more sophisticated algorithms applied on weighted sequences.
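
Problem (ii), pattern matching, has a compact definition: a pattern occurs at position i of a weighted (probabilistic) sequence if the product of the per-position symbol probabilities meets a threshold. A naive O(nm) sketch of that definition (the paper's algorithms are far more efficient):

```python
def occurrences(weighted_seq, pattern, threshold=1 / 16):
    """Positions where `pattern` occurs in a weighted sequence, i.e.
    where the product of per-position symbol probabilities is at least
    `threshold`. Probabilities are <= 1, so the product only shrinks
    and the inner loop can stop early."""
    hits = []
    n, m = len(weighted_seq), len(pattern)
    for i in range(n - m + 1):
        p = 1.0
        for j, symbol in enumerate(pattern):
            p *= weighted_seq[i + j].get(symbol, 0.0)
            if p < threshold:
                break
        if p >= threshold:
            hits.append(i)
    return hits

# each position holds a distribution over symbols (e.g. from shotgun data)
seq = [{"A": 1.0}, {"A": 0.5, "C": 0.5}, {"C": 1.0}, {"A": 0.25, "G": 0.75}]
print(occurrences(seq, "AC"))  # → [0, 1]
```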

  11. Sensorimotor Assessment and Rehabilitative Apparatus

    DTIC Science & Technology

    2017-10-01

    …vestibulo-ocular assessment without measuring eye movements per se. VON uses a head-mounted motion sensor, laptop computer with user…powered laptop computer with extensive processing algorithms. Frequent occlusion of the pupil by…The apparatus consists of a laptop computer, mirror galvanometer, back-projected laser target, data acquisition board, rate sensor, and motion-gain…

  12. Flight instrument and telemetry response and its inversion

    NASA Technical Reports Server (NTRS)

    Weinberger, M. R.

    1971-01-01

    Mathematical models of rate gyros, servo accelerometers, pressure transducers, and telemetry systems were derived and their parameters were obtained from laboratory tests. Analog computer simulations were used extensively to verify model validity for fast and large input signals. An optimal inversion method was derived to reconstruct input signals from noisy output signals, and a computer program was prepared.

  13. Survivability Extensions for Dynamic Ultralog Environments

    DTIC Science & Technology

    2004-12-07

    …discuss survivability as defined in the "bible of computational complexity", namely, the book "Computers and Intractability, a Guide to the Theory of…

  14. Monte Carlo Simulation Using HyperCard and Lotus 1-2-3.

    ERIC Educational Resources Information Center

    Oulman, Charles S.; Lee, Motoko Y.

    Monte Carlo simulation is a computer modeling procedure for mimicking observations on a random variable. A random number generator is used in generating the outcome for the events that are being modeled. The simulation can be used to obtain results that otherwise require extensive testing or complicated computations. This paper describes how Monte…
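
The loop the abstract describes, draw from a random number generator, score the outcome, repeat, is the same whether it runs in HyperCard, Lotus 1-2-3, or any modern language. The classic pi-estimation example shows the structure:

```python
import random

def monte_carlo_pi(n, seed=1):
    """Estimate pi by sampling n random points in the unit square and
    counting the fraction inside the quarter circle: the prototypical
    Monte Carlo simulation loop. A fixed seed makes the run repeatable."""
    rng = random.Random(seed)
    inside = sum(1 for _ in range(n)
                 if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4.0 * inside / n
```

With 100,000 samples the estimate typically lands within a few hundredths of pi, replacing what would otherwise require "extensive testing or complicated computations".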

  15. Social Cognitive Predictors of the Interests and Choices of Computing Majors: Applicability to Underrepresented Students

    ERIC Educational Resources Information Center

    Lent, Robert W.; Lopez, Frederick G.; Sheu, Hung-Bin; Lopez, Antonio M., Jr.

    2011-01-01

    In a replication and extension of earlier research, we examined the explanatory adequacy of the social cognitive choice model (Lent, Brown, & Hackett, 1994) in a sample of 1404 students majoring in a variety of computing disciplines at 23 historically Black and 27 predominantly White universities. Participants completed measures of self-efficacy,…

  16. pyro: Python-based tutorial for computational methods for hydrodynamics

    NASA Astrophysics Data System (ADS)

    Zingale, Michael

    2015-07-01

    pyro is a simple python-based tutorial on computational methods for hydrodynamics. It includes 2-d solvers for advection, compressible, incompressible, and low Mach number hydrodynamics, diffusion, and multigrid. It is written with ease of understanding in mind. An extensive set of notes that is part of the Open Astrophysics Bookshelf project provides details of the algorithms.
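
As a flavour of the simplest solver family such a tutorial covers, here is a first-order upwind step for 1-d linear advection with periodic boundaries; this sketch is generic teaching code, not pyro's actual implementation:

```python
def advect_upwind(u, velocity, dx, dt, steps):
    """First-order upwind update for du/dt + velocity * du/dx = 0 on a
    periodic 1-d grid (u[-1] wraps around). Assumes velocity > 0;
    stable for CFL = velocity*dt/dx <= 1."""
    c = velocity * dt / dx
    assert 0.0 < c <= 1.0, "CFL condition violated"
    for _ in range(steps):
        u = [u[i] - c * (u[i] - u[i - 1]) for i in range(len(u))]
    return u

# when CFL == 1 the scheme shifts the profile exactly one cell per step
u = advect_upwind([0.0, 1.0, 0.0, 0.0], 1.0, 1.0, 1.0, 1)
```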

  17. Analysis of the knowledge and opinions of students and qualified dentists regarding the use of computers.

    PubMed

    Castelló Castañeda, Coral; Ríos Santos, Jose Vicente; Bullón, Pedro

    2008-01-01

    Dentists are currently required to make multiple diagnoses and treatment decisions every day, and the information necessary to achieve this satisfactorily doubles in volume every five years. Knowledge therefore rapidly becomes out of date, so that it is often impossible to remember established information and assimilate new concepts. This may result in a significant lack of knowledge in the future, which would jeopardize the success of treatments. To remedy and prevent this situation, we nowadays have access to modern computing systems, with extensive databases, which help us to retain the information necessary for daily practice and access it instantaneously. The objectives of this study are therefore to determine how widespread the use of computing is in this environment and to determine the opinion of students and qualified dentists as regards its use in Dentistry. 90 people were chosen to take part in the study, divided into three groups: students, newly qualified dentists, and experts. It was demonstrated that a high percentage (93.30%) use a computer, but that their level of computing knowledge is predominantly moderate. The place where a computer is used most is the home, which suggests that the majority own a computer. Analysis of the results obtained for evaluation of computers in teaching showed that the participants thought that they saved a great deal of time and had great potential for providing an image (in terms of marketing), and they considered them a very innovative and stimulating tool.

  18. Research on Extension of Sparql Ontology Query Language Considering the Computation of Indoor Spatial Relations

    NASA Astrophysics Data System (ADS)

    Li, C.; Zhu, X.; Guo, W.; Liu, Y.; Huang, H.

    2015-05-01

    A method suitable for complex indoor semantic queries that takes into account the computation of indoor spatial relations is provided, according to the characteristics of indoor space. This paper designs an ontology model describing the space-related information of humans, events and indoor space objects (e.g. storeys and rooms), as well as their relations, to support indoor semantic queries. The ontology concepts are used in the IndoorSPARQL query language, which extends the SPARQL syntax for representing and querying indoor space. Four specific primitives for indoor queries, "Adjacent", "Opposite", "Vertical" and "Contain", are defined as query functions in IndoorSPARQL to support quantitative spatial computations. A method is also proposed to parse the query language. Finally, this paper adopts the proposed method to realize indoor semantic queries over the study area by constructing an ontology model for the study building. The experimental results show that the method proposed in this paper can effectively support complex indoor semantic queries.
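
The kind of spatial predicate such query functions compute can be sketched over a simple room-connectivity graph; the room names and graph below are illustrative, not from the paper, and a real "Adjacent" function would operate on geometry rather than a hand-built dictionary:

```python
def adjacent(graph, a, b):
    """Sketch of an 'Adjacent'-style predicate: two indoor spaces are
    adjacent if they share an edge (a wall or door) in a connectivity
    graph. Checks both directions since the graph is stored one-way."""
    return b in graph.get(a, ()) or a in graph.get(b, ())

rooms = {"R101": {"R102", "Corridor1"}, "R102": {"Corridor1"}}
print(adjacent(rooms, "R101", "R102"))  # True
print(adjacent(rooms, "R101", "R103"))  # False
```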

  19. Computational biomechanics of bone's responses to dental prostheses - osseointegration, remodeling and resorption

    NASA Astrophysics Data System (ADS)

    Li, Wei; Rungsiyakull, Chaiy; Field, Clarice; Lin, Daniel; Zhang, Leo; Li, Qing; Swain, Michael

    2010-06-01

    Clinical and experimental studies showed that human bone has the ability to remodel itself to better adapt to its biomechanical environment by changing both its material properties and geometry. As a consequence of the rapid development and extensive applications of major dental restorations such as implantation and fixed partial denture (FPD), the effect of bone remodeling on the success of a dental restorative surgery is becoming critical for prosthetic design and pre-surgical assessment. This paper aims to provide a computational biomechanics framework to address dental bone's responses as a result of dental restoration. It explored three important issues of resorption, apposition and osseointegration in terms of remodeling simulation. The published remodeling data in long bones were regulated to drive the computational remodeling prediction for the dental bones by correlating the results to clinical data. It is anticipated that the study will provide a more predictive model of dental bone response and help develop a new design methodology for patient-specific dental prosthetic restoration.

  20. Extensions to PIFCGT: Multirate output feedback and optimal disturbance suppression

    NASA Technical Reports Server (NTRS)

    Broussard, J. R.

    1986-01-01

    New control synthesis procedures for digital flight control systems were developed. The theoretical developments are the solution to the problem of optimal disturbance suppression in the presence of windshear. Control synthesis is accomplished using a linear quadratic cost function, the command generator tracker for trajectory following, and the proportional-integral-filter control structure for practical implementation. Extensions are made to the optimal output feedback algorithm for computing feedback gains so that the multirate and optimal disturbance control designs can be computed and compared for the advanced transport operating system (ATOPS). The performance of the designs is demonstrated by closed-loop poles, frequency-domain multi-input sigma and eigenvalue plots, and detailed nonlinear 6-DOF aircraft simulations in the terminal area in the presence of windshear.

  1. Towards developing robust algorithms for solving partial differential equations on MIMD machines

    NASA Technical Reports Server (NTRS)

    Saltz, Joel H.; Naik, Vijay K.

    1988-01-01

    Methods for efficient computation of numerical algorithms on a wide variety of MIMD machines are proposed. These techniques reorganize the data dependency patterns to improve the processor utilization. The model problem finds the time-accurate solution to a parabolic partial differential equation discretized in space and implicitly marched forward in time. The algorithms are extensions of Jacobi and SOR. The extensions consist of iterating over a window of several timesteps, allowing efficient overlap of computation with communication. The methods increase the degree to which work can be performed while data are communicated between processors. The effect of the window size and of domain partitioning on the system performance is examined both by implementing the algorithm on a simulated multiprocessor system.
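
The windowing idea can be sketched on a 1-d backward-Euler heat equation: all timesteps in the window are relaxed together during each Jacobi sweep, so a processor can exchange boundary data for the whole window at once instead of once per timestep. A serial sketch of the numerics only, assuming fixed ends and a uniform grid (not the paper's full parallel scheme):

```python
def jacobi_window(u0, r, window, sweeps):
    """Window-relaxed Jacobi for the backward-Euler heat equation
    (1 + 2r) * u_new[i] - r * (u_new[i-1] + u_new[i+1]) = u_old[i]
    on a 1-d grid with fixed end values. levels[k] approximates
    timestep k; every sweep relaxes all `window` timesteps together."""
    n = len(u0)
    levels = [list(u0) for _ in range(window + 1)]  # initial guesses
    for _ in range(sweeps):
        for k in range(1, window + 1):
            prev, cur = levels[k - 1], levels[k]
            new = cur[:]
            for i in range(1, n - 1):
                new[i] = (prev[i] + r * (cur[i - 1] + cur[i + 1])) / (1 + 2 * r)
            levels[k] = new
    return levels[-1]
```

Each sweep touches `window` timestep levels, which is what raises the ratio of local work to communication.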

  2. Towards developing robust algorithms for solving partial differential equations on MIMD machines

    NASA Technical Reports Server (NTRS)

    Saltz, J. H.; Naik, V. K.

    1985-01-01

    Methods for efficient computation of numerical algorithms on a wide variety of MIMD machines are proposed. These techniques reorganize the data dependency patterns to improve the processor utilization. The model problem finds the time-accurate solution to a parabolic partial differential equation discretized in space and implicitly marched forward in time. The algorithms are extensions of Jacobi and SOR. The extensions consist of iterating over a window of several timesteps, allowing efficient overlap of computation with communication. The methods increase the degree to which work can be performed while data are communicated between processors. The effect of the window size and of domain partitioning on the system performance is examined both by implementing the algorithm on a simulated multiprocessor system.

  3. Coordination characteristics of uranyl BBP complexes: Insights from an electronic structure analysis

    DOE PAGES

    Pemmaraju, Chaitanya Das; Copping, Roy; Smiles, Danil E.; ...

    2017-03-21

    Here, organic ligand complexes of lanthanide/actinide ions have been studied extensively for applications in nuclear fuel storage and recycling. Several complexes of 2,6-bis(2-benzimidazyl)pyridine (H2BBP) featuring the uranyl moiety have been reported recently, and the present study investigates the coordination characteristics of these complexes using density functional theory-based electronic structure analysis. In particular, with the aid of several computational models, the nonplanar equatorial coordination about uranyl, observed in some of the compounds, is studied and its origin traced to steric effects.

  4. Relativistic Radiative and Auger Rates for Fe XXIV

    NASA Technical Reports Server (NTRS)

    Bautista, M. A.; Mendoza, C.; Kallman, T. R.; Palmeri, P.; White, Nicholas E. (Technical Monitor)

    2002-01-01

    As part of a project to compute improved atomic data for the spectral modeling of iron K lines, we report extensive calculations and comparisons of radiative and Auger rates for transitions involving the K-vacancy states in Fe XXIV. By making use of several computational codes, a detailed study is carried out of orbital representation, configuration interaction, relativistic corrections, cancellation effects, and fine tuning. It is shown that a formal treatment of the Breit interaction is essential to render the important magnetic correlations that take part in the decay pathways of this ion. As a result, the accuracy of the present A-values is firmly ranked at better than 10%, while that of the Auger rates is ranked at 15%.

  5. Performance Evaluation of Remote Memory Access (RMA) Programming on Shared Memory Parallel Computers

    NASA Technical Reports Server (NTRS)

    Jin, Hao-Qiang; Jost, Gabriele; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    The purpose of this study is to evaluate the feasibility of remote memory access (RMA) programming on shared memory parallel computers. We discuss different RMA based implementations of selected CFD application benchmark kernels and compare them to corresponding message passing based codes. For the message-passing implementation we use MPI point-to-point and global communication routines. For the RMA based approach we consider two different libraries supporting this programming model. One is a shared memory parallelization library (SMPlib) developed at NASA Ames, the other is the MPI-2 extensions to the MPI Standard. We give timing comparisons for the different implementation strategies and discuss the performance.

  6. Microscopic description of fission properties for r-process nuclei

    NASA Astrophysics Data System (ADS)

    Giuliani, S. A.; Martínez-Pinedo, G.; Robledo, L. M.

    2018-01-01

    Fission properties of 886 even-even nuclei in the region 84 ≤ Z ≤ 120 and 118 ≤ N ≤ 250 were computed using the Barcelona-Catania-Paris-Madrid energy density functional. An extensive study of both the potential energy surfaces and collective inertias was performed. Spontaneous fission half-lives are computed using the semiclassical Wentzel-Kramers-Brillouin (WKB) formalism. By comparing these quantities we found that the stability of a nucleus against the fission process is driven by the interplay between the potential energy and the collective inertias. In our calculations, nuclei with relatively long half-lives were found in two regions, around Z = 120, N = 182 and Z = 104, N = 222.
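
The semiclassical half-life referred to here is governed, schematically, by the WKB action integral along the fission path (standard form; B(q) is the collective inertia, V(q) the potential energy along the collective coordinate q, E0 the collective ground-state energy, and nu0 an assault frequency):

```latex
T_{1/2} = \frac{\ln 2}{\nu_0}\left[1 + e^{2S}\right],
\qquad
S = \frac{1}{\hbar}\int_{a}^{b} \sqrt{2\,B(q)\,\bigl(V(q) - E_0\bigr)}\;dq,
```

which makes explicit why the potential energy surface and the collective inertia enter the stability against fission on an equal footing.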

  7. Probing Higgs self-coupling of a classically scale invariant model in e+e- → Zhh: Evaluation at physical point

    NASA Astrophysics Data System (ADS)

    Fujitani, Y.; Sumino, Y.

    2018-04-01

    A classically scale invariant extension of the standard model predicts large anomalous Higgs self-interactions. We compute missing contributions in previous studies for probing the Higgs triple coupling of a minimal model using the process e+e- → Zhh. Employing a proper order counting, we compute the total and differential cross sections at the leading order, which incorporate the one-loop corrections between zero external momenta and their physical values. Discovery/exclusion potential of a future e+e- collider for this model is estimated. We also find a unique feature in the momentum dependence of the Higgs triple vertex for this class of models.

  8. Mathematical and Computational Modeling for Tumor Virotherapy with Mediated Immunity.

    PubMed

    Timalsina, Asim; Tian, Jianjun Paul; Wang, Jin

    2017-08-01

    We propose a new mathematical modeling framework based on partial differential equations to study tumor virotherapy with mediated immunity. The model incorporates both innate and adaptive immune responses and represents the complex interaction among tumor cells, oncolytic viruses, and immune systems on a domain with a moving boundary. Using carefully designed computational methods, we conduct extensive numerical simulation to the model. The results allow us to examine tumor development under a wide range of settings and provide insight into several important aspects of the virotherapy, including the dependence of the efficacy on a few key parameters and the delay in the adaptive immunity. Our findings also suggest possible ways to improve the virotherapy for tumor treatment.

  9. Rigorous Numerics for ill-posed PDEs: Periodic Orbits in the Boussinesq Equation

    NASA Astrophysics Data System (ADS)

    Castelli, Roberto; Gameiro, Marcio; Lessard, Jean-Philippe

    2018-04-01

    In this paper, we develop computer-assisted techniques for the analysis of periodic orbits of ill-posed partial differential equations. As a case study, our proposed method is applied to the Boussinesq equation, which has been investigated extensively because of its role in the theory of shallow water waves. The idea is to use the symmetry of the solutions and a Newton-Kantorovich type argument (the radii polynomial approach) to obtain rigorous proofs of existence of the periodic orbits in a weighted ℓ1 Banach space of space-time Fourier coefficients with exponential decay. We present several computer-assisted proofs of the existence of periodic orbits at different parameter values.

  10. Cumulative trauma disorder risk for children using computer products: results of a pilot investigation with a student convenience sample.

    PubMed

    Burke, Adam; Peper, Erik

    2002-01-01

    Cumulative trauma disorder is a major health problem for adults. Despite a growing understanding of adult cumulative trauma disorder, however, little is known about the risks for younger populations. This investigation examined issues related to child/adolescent computer product use and upper body physical discomfort. A convenience sample of 212 students, grades 1-12, was interviewed at their homes by a college-age sibling or relative. One of each child's parents was also interviewed. A 22-item questionnaire was used for data gathering. Questionnaire items included frequency and duration of use, type of computer products/games and input devices used, presence of physical discomfort, and parental concerns related to the child's computer use. Many students experienced physical discomfort attributed to computer use, such as wrist pain (30%) and back pain (15%). Specific computer activities, such as using a joystick or playing noneducational games, were significantly predictive of physical discomfort in a logistic multiple regression. Many parents reported difficulty getting their children off the computer (46%) and that their children spent less time outdoors (35%). Computer product use within this cohort was associated with self-reported physical discomfort. Results suggest a need for more extensive study, including multiyear longitudinal surveys.

  11. Repeated Kicking Actions in Karate: Effect on Technical Execution in Elite Practitioners.

    PubMed

    Quinzi, Federico; Camomilla, Valentina; Di Mario, Alberto; Felici, Francesco; Sbriccoli, Paola

    2016-04-01

    Training in martial arts is commonly performed by repeating a technical action continuously for a given number of times. This study aimed to investigate whether repetition of the task alters proper technical execution, limiting the training efficacy for the technical evaluation made during competition. This aim was pursued by analyzing lower-limb kinematics and muscle activation during repeated roundhouse kicks. Six junior karate practitioners performed 20 consecutive repetitions of the kick. Hip and knee kinematics and sEMG of the vastus lateralis, biceps femoris (BF), and rectus femoris were recorded. For each repetition, hip abduction-adduction and flexion-extension and knee flexion-extension peak angular displacements and velocities, as well as agonist and antagonist muscle activation, were computed. Moreover, to monitor for the presence of myoelectric fatigue, if any, the median frequency of the sEMG was computed. All variables were normalized with respect to their individual maximum observed during the sequence of kicks. Linear regressions were fitted to each normalized parameter to test its relationship with the repetition number. Linear-regression analysis showed that, during the sequence, the athletes modified their technique: knee flexion, BF median frequency, hip abduction, knee-extension angular velocity, and BF antagonist activation significantly decreased. Conversely, hip flexion increased significantly. Since karate combat competitions require proper technical execution, training protocols combining severe fatigue and technical actions should be proposed with caution because of these technique adaptations. Moreover, trainers and karate masters should consider including specific strength exercises for the BF and, more generally, for the knee flexors.

  12. Computer Three-Dimensional Reconstruction of the Atrioventricular Node

    PubMed Central

    Li, Jue; Greener, Ian D.; Inada, Shin; Nikolski, Vladimir P.; Yamamoto, Mitsuru; Hancox, Jules C.; Zhang, Henggui; Billeter, Rudi; Efimov, Igor R.; Dobrzynski, Halina; Boyett, Mark R.

    2009-01-01

    Because of its complexity, the atrioventricular node (AVN) remains one of the least understood regions of the heart. The aim of the study was to construct a detailed anatomic model of the AVN and relate it to AVN function. The electric activity of a rabbit AVN preparation was imaged using voltage-dependent dye. The preparation was then fixed and sectioned. Sixty-five sections at 60- to 340-μm intervals were stained for histology and immunolabeled for neurofilament (marker of nodal tissue) and connexin43 (gap junction protein). This revealed multiple structures within and around the AVN, including transitional tissue, inferior nodal extension, penetrating bundle, His bundle, atrial and ventricular muscle, central fibrous body, tendon of Todaro, and valves. A 3D anatomically detailed mathematical model (≈13 million element array) of the AVN and surrounding atrium and ventricle, incorporating all cell types, was constructed. Comparison of the model with electric activity recorded in experiments suggests that the inferior nodal extension forms the slow pathway, whereas the transitional tissue forms the fast pathway into the AVN. In addition, it suggests the pacemaker activity of the atrioventricular junction originates in the inferior nodal extension. Computer simulation of the propagation of the action potential through the anatomic model shows how, because of the complex structure of the AVN, reentry (slow-fast and fast-slow) can occur. In summary, a mathematical model of the anatomy of the AVN has been generated that allows AVN conduction to be explored. PMID:18309098

  13. Synchrotron Imaging Computations on the Grid without the Computing Element

    NASA Astrophysics Data System (ADS)

    Curri, A.; Pugliese, R.; Borghes, R.; Kourousias, G.

    2011-12-01

    Besides the heavy use of the Grid in the Synchrotron Radiation Facility (SRF) Elettra, additional special requirements from the beamlines had to be satisfied through a novel solution that we present in this work. In the traditional Grid Computing paradigm the computations are performed on the Worker Nodes of the grid element known as the Computing Element. A Grid middleware extension that our team has been working on is that of the Instrument Element. In general it is used to Grid-enable instrumentation, and it can be seen as a neighbouring concept to that of traditional Control Systems. As a further extension, we demonstrate the Instrument Element as the steering mechanism for a series of computations. In our deployment it interfaces a Control System that manages a series of computationally demanding Scientific Imaging tasks in an online manner. The instrument control in Elettra is done through a suitable Distributed Control System, a common approach in the SRF community. The applications that we present are for a beamline working in medical imaging. The solution resulted in a substantial improvement of a Computed Tomography workflow. The near-real-time requirements could not have been easily satisfied by our Grid's middleware (gLite) due to the various latencies that often occurred during the job submission and queuing phases. Moreover, the required deployment of a set of TANGO devices could not have been done in a standard gLite WN. Besides the avoidance of certain core Grid components, the Grid Security infrastructure has been utilised in the final solution.

  14. Methane Adsorption in Zr-Based MOFs: Comparison and Critical Evaluation of Force Fields

    PubMed Central

    2017-01-01

    The search for nanoporous materials that are highly performing for gas storage and separation is one of the contemporary challenges in material design. The computational tools to aid these experimental efforts are widely available, and adsorption isotherms are routinely computed for huge sets of (hypothetical) frameworks. Clearly the computational results depend on the interactions between the adsorbed species and the adsorbent, which are commonly described using force fields. In this paper, an extensive comparison and in-depth investigation of several force fields from literature is reported for the case of methane adsorption in the Zr-based Metal–Organic Frameworks UiO-66, UiO-67, DUT-52, NU-1000, and MOF-808. Significant quantitative differences in the computed uptake are observed when comparing different force fields, but most qualitative features are common which suggests some predictive power of the simulations when it comes to these properties. More insight into the host–guest interactions is obtained by benchmarking the force fields with an extensive number of ab initio computed single molecule interaction energies. This analysis at the molecular level reveals that especially ab initio derived force fields perform well in reproducing the ab initio interaction energies. Finally, the high sensitivity of uptake predictions on the underlying potential energy surface is explored. PMID:29170687

  15. Porting marine ecosystem model spin-up using transport matrices to GPUs

    NASA Astrophysics Data System (ADS)

    Siewertsen, E.; Piwonski, J.; Slawig, T.

    2013-01-01

    We have ported an implementation of the spin-up for marine ecosystem models based on transport matrices to graphics processing units (GPUs). The original implementation was designed for distributed-memory architectures and uses the Portable, Extensible Toolkit for Scientific Computation (PETSc) library that is based on the Message Passing Interface (MPI) standard. The spin-up computes a steady seasonal cycle of ecosystem tracers with climatological ocean circulation data as forcing. Since the transport is linear with respect to the tracers, the resulting operator is represented by matrices. Each iteration of the spin-up involves two matrix-vector multiplications and the evaluation of the used biogeochemical model. The original code was written in C and Fortran. On the GPU, we use the Compute Unified Device Architecture (CUDA) standard, a customized version of PETSc and a commercial CUDA Fortran compiler. We describe the extensions to PETSc and the modifications of the original C and Fortran codes that had to be done. Here we make use of freely available libraries for the GPU. We analyze the computational effort of the main parts of the spin-up for two exemplar ecosystem models and compare the overall computational time to those necessary on different CPUs. The results show that a consumer GPU can compete with a significant number of cluster CPUs without further code optimization.
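
Structurally, each spin-up iteration is two matrix-vector products plus one biogeochemical model evaluation; a dense pure-Python sketch of that structure (real transport matrices are sparse and vastly larger, and on the GPU the products run through CUDA kernels):

```python
def spinup_step(A_exp, A_imp, y, dt, bgc):
    """One transport-matrix timestep:
        y_new = A_imp @ (A_exp @ y + dt * q(y)),
    i.e. explicit transport of the tracer vector, addition of the
    biogeochemical source term q, then implicit transport. Dense,
    serial sketch of the iteration structure only."""
    def matvec(A, v):
        return [sum(a * x for a, x in zip(row, v)) for row in A]
    interim = [ti + dt * qi for ti, qi in zip(matvec(A_exp, y), bgc(y))]
    return matvec(A_imp, interim)

# with identity transport and a zero source term, any state is a fixed point
identity = [[1.0, 0.0], [0.0, 1.0]]
y1 = spinup_step(identity, identity, [1.0, 2.0], 0.1, lambda y: [0.0, 0.0])
```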

  16. PISCES: An environment for parallel scientific computation

    NASA Technical Reports Server (NTRS)

    Pratt, T. W.

    1985-01-01

    The parallel implementation of scientific computing environment (PISCES) is a project to provide high-level programming environments for parallel MIMD computers. Pisces 1, the first of these environments, is a FORTRAN 77 based environment which runs under the UNIX operating system. The Pisces 1 user programs in Pisces FORTRAN, an extension of FORTRAN 77 for parallel processing. The major emphasis in the Pisces 1 design is in providing a carefully specified virtual machine that defines the run-time environment within which Pisces FORTRAN programs are executed. Each implementation then provides the same virtual machine, regardless of differences in the underlying architecture. The design is intended to be portable to a variety of architectures. Currently Pisces 1 is implemented on a network of Apollo workstations and on a DEC VAX uniprocessor via simulation of the task level parallelism. An implementation for the Flexible Computing Corp. FLEX/32 is under construction. An introduction to the Pisces 1 virtual computer and the FORTRAN 77 extensions is presented. An example of an algorithm for the iterative solution of a system of equations is given. The most notable features of the design are the provision for several granularities of parallelism in programs and the provision of a window mechanism for distributed access to large arrays of data.

  17. Meshfree and efficient modeling of swimming cells

    NASA Astrophysics Data System (ADS)

    Gallagher, Meurig T.; Smith, David J.

    2018-05-01

    Locomotion in Stokes flow is an intensively studied problem because it describes important biological phenomena such as the motility of many species' sperm, bacteria, algae, and protozoa. Numerical computations can be challenging, particularly in three dimensions, due to the presence of moving boundaries and complex geometries; methods which combine ease of implementation and computational efficiency are therefore needed. A recently proposed method to discretize the regularized Stokeslet boundary integral equation without the need for a connected mesh is applied to the inertialess locomotion problem in Stokes flow. The mathematical formulation and key aspects of the computational implementation in matlab® or GNU Octave are described, followed by numerical experiments with biflagellate algae and multiple uniflagellate sperm swimming between no-slip surfaces, for which both swimming trajectories and flow fields are calculated. These computational experiments required minutes of time on modest hardware; an extensible implementation is provided in a GitHub repository. The nearest-neighbor discretization dramatically improves convergence and robustness, a key challenge in extending the regularized Stokeslet method to complicated three-dimensional biological fluid problems.
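As a rough illustration of the regularized Stokeslet kernel that underlies this approach, the velocity induced by a single regularized point force (Cortez-type blob) can be evaluated in closed form. The function below is a generic sketch, not the authors' matlab®/Octave implementation:

```python
import numpy as np

def regularized_stokeslet_velocity(x, x0, f, eps, mu=1.0):
    """Velocity at point x due to a regularized point force f located at x0,
    using the classical 3D regularized Stokeslet with blob parameter eps:

        u_i = f_j [ d_ij (r^2 + 2 eps^2) + r_i r_j ] / (8 pi mu (r^2 + eps^2)^(3/2))

    where r = x - x0 and d_ij is the Kronecker delta.
    """
    r = np.asarray(x, dtype=float) - np.asarray(x0, dtype=float)
    f = np.asarray(f, dtype=float)
    r2 = float(r @ r)
    denom = 8.0 * np.pi * mu * (r2 + eps**2) ** 1.5
    return ((r2 + 2.0 * eps**2) * f + (r @ f) * r) / denom
```

Unlike the singular Stokeslet, this velocity stays finite as x approaches x0 (it tends to f / (4 pi mu eps)), which is what permits collocation at the force points themselves without a connected mesh.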

  18. Computation of peak discharge at culverts

    USGS Publications Warehouse

    Carter, Rolland William

    1957-01-01

    Methods for computing peak flood flow through culverts on the basis of a field survey of highwater marks and culvert geometry are presented. These methods are derived from investigations of culvert flow as reported in the literature and on extensive laboratory studies of culvert flow. For convenience in computation, culvert flow has been classified into six types, according to the location of the control section and the relative heights of the head-water and tail-water levels. The type of flow which occurred at any site can be determined from the field data and the criteria given in this report. A discharge equation has been developed for each flow type by combining the energy and continuity equations for the distance between an approach section upstream from the culvert and a terminal section within the culvert barrel. The discharge coefficient applicable to each flow type is listed for the more common entrance geometries. Procedures for computing peak discharge through culverts are outlined in detail for each of the six flow types.
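For one of the flow types (a submerged, orifice-type entrance), combining the energy and continuity equations reduces the discharge equation to a simple closed form. The sketch below is a generic illustration in SI units with hypothetical coefficient and geometry values; it is not a reproduction of the report's tabulated coefficients or flow-type criteria.

```python
import math

def culvert_discharge(C, area, head, g=9.81):
    """Peak discharge for submerged-inlet (orifice-type) culvert flow:

        Q = C * A * sqrt(2 * g * h)

    C    : dimensionless discharge coefficient for the entrance geometry
    area : culvert barrel cross-sectional area (m^2)
    head : headwater elevation above the effective head datum (m)
    """
    return C * area * math.sqrt(2.0 * g * head)
```

In practice the field survey supplies the head from high-water marks, the culvert geometry supplies the area, and the flow-type classification selects the applicable coefficient.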

  19. Performance Evaluation of Counter-Based Dynamic Load Balancing Schemes for Massive Contingency Analysis with Different Computing Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Huang, Zhenyu; Chavarría-Miranda, Daniel

    Contingency analysis is a key function in the Energy Management System (EMS) to assess the impact of various combinations of power system component failures based on state estimation. Contingency analysis is also extensively used in power market operation for feasibility test of market solutions. High performance computing holds the promise of faster analysis of more contingency cases for the purpose of safe and reliable operation of today’s power grids with less operating margin and more intermittent renewable energy sources. This paper evaluates the performance of counter-based dynamic load balancing schemes for massive contingency analysis under different computing environments. Insights from the performance evaluation can be used as guidance for users to select suitable schemes in the application of massive contingency analysis. Case studies, as well as MATLAB simulations, of massive contingency cases using the Western Electricity Coordinating Council power grid model are presented to illustrate the application of high performance computing with counter-based dynamic load balancing schemes.
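A counter-based dynamic load balancing scheme of the kind evaluated here can be sketched with a shared atomic counter from which each worker claims the next contingency case, so faster workers naturally process more cases. The threading-based prototype below is an illustrative assumption; the paper's implementations target HPC clusters, not Python threads.

```python
import itertools
import threading

def run_contingencies(cases, analyze, n_workers=4):
    """Counter-based dynamic load balancing: each worker repeatedly claims the
    next unprocessed case index from a shared counter until all cases are done."""
    counter = itertools.count()          # shared counter; next() is atomic in CPython
    results = [None] * len(cases)

    def worker():
        while True:
            i = next(counter)            # fetch-and-increment to claim a case
            if i >= len(cases):
                return
            results[i] = analyze(cases[i])

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results
```

The key property is that no static partition of cases is needed: load imbalance caused by cases of varying cost is absorbed automatically.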

  20. Effects of Geometric Details on Slat Noise Generation and Propagation

    NASA Technical Reports Server (NTRS)

    Khorrami, Mehdi R.; Lockard, David P.

    2009-01-01

    The relevance of geometric details to the generation and propagation of noise from leading-edge slats is considered. Typically, such details are omitted in computational simulations and model-scale experiments thereby creating ambiguities in comparisons with acoustic results from flight tests. The current study uses two-dimensional, computational simulations in conjunction with a Ffowcs Williams-Hawkings (FW-H) solver to investigate the effects of previously neglected slat "bulb" and "blade" seals on the local flow field and the associated acoustic radiation. The computations show that the presence of the "blade" seal at the cusp in the simulated geometry significantly changes the slat cove flow dynamics, reduces the amplitudes of the radiated sound, and to a lesser extent, alters the directivity beneath the airfoil. Furthermore, the computations suggest that a modest extension of the baseline "blade" seal further enhances the suppression of slat noise. As a side issue, the utility and equivalence of FW-H methodology for calculating far-field noise as opposed to a more direct approach is examined and demonstrated.

  1. ASC FY17 Implementation Plan, Rev. 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hamilton, P. G.

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions.

  2. Computer vision syndrome-A common cause of unexplained visual symptoms in the modern era.

    PubMed

    Munshi, Sunil; Varghese, Ashley; Dhar-Munshi, Sushma

    2017-07-01

    The aim of this study was to assess the evidence and available literature on the clinical, pathogenetic, prognostic, and therapeutic aspects of Computer vision syndrome. Information was collected from Medline, Embase, and the National Library of Medicine over the 30 years up to March 2016. The bibliographies of relevant articles were searched for additional references. Patients with Computer vision syndrome present to a variety of different specialists, including General Practitioners, Neurologists, Stroke physicians, and Ophthalmologists. While the condition is common, awareness of it is poor among the public and health professionals. Recognising this condition in the clinic or in emergency situations such as the TIA clinic is crucial. The implications are potentially huge in view of the extensive and widespread use of computers and visual display units. Greater public awareness of Computer vision syndrome and education of health professionals are vital. Preventive strategies should routinely form part of workplace ergonomics. Prompt and correct recognition is important to allow management and avoid unnecessary treatments. © 2017 John Wiley & Sons Ltd.

  3. Real-time WAMI streaming target tracking in fog

    NASA Astrophysics Data System (ADS)

    Chen, Yu; Blasch, Erik; Chen, Ning; Deng, Anna; Ling, Haibin; Chen, Genshe

    2016-05-01

    Real-time information fusion based on WAMI (Wide-Area Motion Imagery), FMV (Full Motion Video), and text data is highly desired for many mission-critical emergency or security applications. Cloud Computing has been considered promising for achieving big data integration from multi-modal sources. In many mission-critical tasks, however, powerful Cloud technology cannot satisfy the tight latency tolerance, as the servers are allocated far from the sensing platform; indeed, there is no guaranteed connection in emergency situations. Therefore, data processing, information fusion, and decision making are required to be executed on-site (i.e., near the data collection). Fog Computing, a recently proposed extension and complement to Cloud Computing, enables computing on-site without outsourcing jobs to a remote Cloud. In this work, we have investigated the feasibility of processing streaming WAMI in the Fog for real-time, online, uninterrupted target tracking. Using a single-target tracking algorithm, we studied the performance of a Fog Computing prototype. The experimental results are very encouraging and validate the effectiveness of our Fog approach in achieving real-time frame rates.

  4. Leveraging Social Computing for Personalized Crisis Communication using Social Media

    PubMed Central

    Leykin, Dmitry; Aharonson-Daniel, Limor; Lahad, Mooli

    2016-01-01

    Introduction: The extensive use of social media in modern life redefines social interaction and communication. Communication plays an important role in mitigating, or exacerbating, the psychological and behavioral responses to critical incidents and disasters. As recent disasters demonstrated, people tend to converge to social media during and following emergencies. Authorities can then use this media and other computational methods to gain insights from the public, mainly to enhance situational awareness, but also to improve their communication with the public and public adherence to instructions. Methods: The current review presents a conceptual framework for studying psychological aspects of crisis and risk communication using the social media through social computing. Results: Advanced analytical tools can be integrated in the processes and objectives of crisis communication. The availability of the computational techniques can improve communication with the public by a process of Hyper-Targeted Crisis Communication. Discussion: The review suggests that using advanced computational tools for target-audience profiling and linguistic matching in social media, can facilitate more sensitive and personalized emergency communication. PMID:27092290

  5. A distributed computing environment with support for constraint-based task scheduling and scientific experimentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahrens, J.P.; Shapiro, L.G.; Tanimoto, S.L.

    1997-04-01

    This paper describes a computing environment which supports computer-based scientific research work. Key features include support for automatic distributed scheduling and execution and computer-based scientific experimentation. A new flexible and extensible scheduling technique that is responsive to a user's scheduling constraints, such as the ordering of program results and the specification of task assignments and processor utilization levels, is presented. An easy-to-use constraint language for specifying scheduling constraints, based on the relational database query language SQL, is described along with a search-based algorithm for fulfilling these constraints. A set of performance studies show that the environment can schedule and execute program graphs on a network of workstations as the user requests. A method for automatically generating computer-based scientific experiments is described. Experiments provide a concise method of specifying a large collection of parameterized program executions. The environment achieved significant speedups when executing experiments; for a large collection of scientific experiments an average speedup of 3.4 on an average of 5.5 scheduled processors was obtained.

  6. Integrative Analysis of Cancer Diagnosis Studies with Composite Penalization

    PubMed Central

    Liu, Jin; Huang, Jian; Ma, Shuangge

    2013-01-01

    Summary In cancer diagnosis studies, high-throughput gene profiling has been extensively conducted, searching for genes whose expressions may serve as markers. Data generated from such studies have the “large d, small n” feature, with the number of genes profiled much larger than the sample size. Penalization has been extensively adopted for simultaneous estimation and marker selection. Because of small sample sizes, markers identified from the analysis of single datasets can be unsatisfactory. A cost-effective remedy is to conduct integrative analysis of multiple heterogeneous datasets. In this article, we investigate composite penalization methods for estimation and marker selection in integrative analysis. The proposed methods use the minimax concave penalty (MCP) as the outer penalty. Under the homogeneity model, the ridge penalty is adopted as the inner penalty. Under the heterogeneity model, the Lasso penalty and MCP are adopted as the inner penalty. Effective computational algorithms based on coordinate descent are developed. Numerical studies, including simulation and analysis of practical cancer datasets, show satisfactory performance of the proposed methods. PMID:24578589
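The minimax concave penalty (MCP) used here as the outer penalty has a simple closed form. A minimal sketch follows; the regularization parameters lam and gamma below are arbitrary illustrative choices, not values from the paper:

```python
def mcp_penalty(t, lam, gamma):
    """Minimax concave penalty (MCP):

        p(t) = lam * |t| - t^2 / (2 * gamma)   if |t| <= gamma * lam
        p(t) = gamma * lam^2 / 2               otherwise (penalty levels off)

    Unlike the Lasso, MCP applies no extra shrinkage to large coefficients,
    which reduces estimation bias for strong markers.
    """
    a = abs(t)
    if a <= gamma * lam:
        return lam * a - a * a / (2.0 * gamma)
    return 0.5 * gamma * lam * lam
```

In the composite setup described above, a penalty like this is applied at the gene (group) level as the outer function, with a ridge, Lasso, or MCP inner penalty across datasets, and the whole objective is minimized by coordinate descent.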

  7. Beyond computational difficulties: Survey of the two decades from the elaboration to the extensive application of the Hartree-Fock method

    NASA Astrophysics Data System (ADS)

    Martinez, Jean-Philippe

    2017-11-01

    The Hartree-Fock method, one of the first applications of the new quantum mechanics to the many-body problem, was elaborated by Douglas Rayner Hartree in 1928 and Vladimir Fock in 1930. The challenge of its tedious computations was discussed from the outset, and it is well known that the application of the method benefited greatly from the development of computers from the mid-to-late 1950s. However, the years from 1930 to 1950 were by no means years of stagnation, as the method was the object of several considerations related to its mathematical formulation, possible extension, and conceptual understanding. Thus, with a focus on the respective attitudes of Hartree and Fock, in particular with respect to the concept of quantum exchange, the present work puts forward some mathematical and conceptual clarifications, which played an important role in a better understanding of the many-body problem in quantum mechanics.

  8. The Julia programming language: the future of scientific computing

    NASA Astrophysics Data System (ADS)

    Gibson, John

    2017-11-01

    Julia is an innovative new open-source programming language for high-level, high-performance numerical computing. Julia combines the general-purpose breadth and extensibility of Python, the ease-of-use and numeric focus of Matlab, the speed of C and Fortran, and the metaprogramming power of Lisp. Julia uses type inference and just-in-time compilation to compile high-level user code to machine code on the fly. A rich set of numeric types and extensive numerical libraries are built-in. As a result, Julia is competitive with Matlab for interactive graphical exploration and with C and Fortran for high-performance computing. This talk interactively demonstrates Julia's numerical features and benchmarks Julia against C, C++, Fortran, Matlab, and Python on a spectral time-stepping algorithm for a 1d nonlinear partial differential equation. The Julia code is nearly as compact as Matlab and nearly as fast as Fortran. This material is based upon work supported by the National Science Foundation under Grant No. 1554149.

  9. Factors Affecting Career Choice: Comparison Between Students from Computer and Other Disciplines

    NASA Astrophysics Data System (ADS)

    Alexander, P. M.; Holmner, M.; Lotriet, H. H.; Matthee, M. C.; Pieterse, H. V.; Naidoo, S.; Twinomurinzi, H.; Jordaan, D.

    2011-06-01

    The number of student enrolments in computer-related courses remains a serious concern worldwide with far-reaching consequences. This paper reports on an extensive survey about career choice and associated motivational factors amongst new students, only some of whom intend to major in computer-related courses, at two South African universities. The data were analyzed using some components of Social Cognitive Career Theory, namely external influences, self-efficacy beliefs and outcome expectations. The research suggests the need for new strategies for marketing computer-related courses and the avenues through which they are marketed. This can to some extent be achieved by studying strategies used by other (non-computer) university courses, and their professional bodies. However, there are also distinct differences, related to self-efficacy and career outcomes, between the computer majors and the 'other' group, and these need to be explored further in order to find strategies that work well for this group. It is not entirely clear what the underlying reasons are for these differences but it is noteworthy that the perceived importance of "Interest in the career field" when choosing a career remains very high for both groups of students.

  10. Role of Open Source Tools and Resources in Virtual Screening for Drug Discovery.

    PubMed

    Karthikeyan, Muthukumarasamy; Vyas, Renu

    2015-01-01

    Advancement in chemoinformatics research in parallel with availability of high performance computing platform has made handling of large scale multi-dimensional scientific data for high throughput drug discovery easier. In this study we have explored publicly available molecular databases with the help of open-source based integrated in-house molecular informatics tools for virtual screening. The virtual screening literature for past decade has been extensively investigated and thoroughly analyzed to reveal interesting patterns with respect to the drug, target, scaffold and disease space. The review also focuses on the integrated chemoinformatics tools that are capable of harvesting chemical data from textual literature information and transform them into truly computable chemical structures, identification of unique fragments and scaffolds from a class of compounds, automatic generation of focused virtual libraries, computation of molecular descriptors for structure-activity relationship studies, application of conventional filters used in lead discovery along with in-house developed exhaustive PTC (Pharmacophore, Toxicophores and Chemophores) filters and machine learning tools for the design of potential disease specific inhibitors. A case study on kinase inhibitors is provided as an example.

  11. Automatic Multilevel Parallelization Using OpenMP

    NASA Technical Reports Server (NTRS)

    Jin, Hao-Qiang; Jost, Gabriele; Yan, Jerry; Ayguade, Eduard; Gonzalez, Marc; Martorell, Xavier; Biegel, Bryan (Technical Monitor)

    2002-01-01

    In this paper we describe the extension of the CAPO (CAPtools (Computer Aided Parallelization Toolkit) OpenMP) parallelization support tool to support multilevel parallelism based on OpenMP directives. CAPO generates OpenMP directives with extensions supported by the NanosCompiler to allow for directive nesting and definition of thread groups. We report some results for several benchmark codes and one full application that have been parallelized using our system.

  12. Bacterial computing with engineered populations.

    PubMed

    Amos, Martyn; Axmann, Ilka Maria; Blüthgen, Nils; de la Cruz, Fernando; Jaramillo, Alfonso; Rodriguez-Paton, Alfonso; Simmel, Friedrich

    2015-07-28

    We describe strategies for the construction of bacterial computing platforms by describing a number of results from the recently completed bacterial computing with engineered populations project. In general, the implementation of such systems requires a framework containing various components such as intracellular circuits, single cell input/output and cell-cell interfacing, as well as extensive analysis. In this overview paper, we describe our approach to each of these, and suggest possible areas for future research. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  13. A serial digital data communications device. [for real time flight simulation

    NASA Technical Reports Server (NTRS)

    Fetter, J. L.

    1977-01-01

    A general-purpose computer peripheral device is reported which provides a full-duplex, serial, digital data transmission link between a Xerox Sigma computer and a wide variety of external equipment, including computers, terminals, and special-purpose devices. The interface has an extensive set of user-defined options to assist the user in establishing the necessary data links. This report describes those options and other features of the serial communications interface and its performance by discussing its application to a particular problem.

  14. CCSI and the role of advanced computing in accelerating the commercial deployment of carbon capture systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, David; Agarwal, Deborah A.; Sun, Xin

    2011-09-01

    The Carbon Capture Simulation Initiative is developing state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technology. The CCSI Toolset consists of an integrated multi-scale modeling and simulation framework, which includes extensive use of reduced order models (ROMs) and a comprehensive uncertainty quantification (UQ) methodology. This paper focuses on the interrelation among high performance computing, detailed device simulations, ROMs for scale-bridging, UQ and the integration framework.

  16. Scalable Optical-Fiber Communication Networks

    NASA Technical Reports Server (NTRS)

    Chow, Edward T.; Peterson, John C.

    1993-01-01

    Scalable arbitrary fiber extension network (SAFEnet) is a conceptual fiber-optic communication network for passing digital signals among a variety of computers and input/output devices at rates from 200 Mb/s to more than 100 Gb/s. It is intended for use with very-high-speed computers and other data-processing and communication systems in which message-passing delays must be kept short. Its inherent flexibility makes it possible to match the performance of the network to the computers by optimizing the configuration of interconnections. In addition, interconnections are made redundant to provide tolerance to faults.

  17. A novel iris localization algorithm using correlation filtering

    NASA Astrophysics Data System (ADS)

    Pohit, Mausumi; Sharma, Jitu

    2015-06-01

    Fast and efficient segmentation of the iris from eye images is a primary requirement for robust, database-independent iris recognition. In this paper we present a new algorithm for computing the inner and outer boundaries of the iris and locating the pupil centre. The pupil-iris boundary computation is based on a correlation filtering approach, whereas the iris-sclera boundary is determined through one-dimensional intensity mapping. The proposed approach is computationally less intensive than existing algorithms such as the Hough transform.
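As a toy illustration of boundary finding by correlation filtering, a one-dimensional dark-to-bright edge template can be slid along a radial intensity profile and the best-matching offset returned. This sketch is a generic stand-in, assuming a zero-mean cross-correlation score; it is not the authors' filter design:

```python
import numpy as np

def locate_boundary(profile, template):
    """Find the offset where a dark-to-bright edge template best matches a
    1-D radial intensity profile, via zero-mean correlation filtering."""
    p = np.asarray(profile, dtype=float)
    t = np.asarray(template, dtype=float)
    # demean both signals so flat regions score zero, then correlate
    scores = np.correlate(p - p.mean(), t - t.mean(), mode="valid")
    return int(np.argmax(scores))
```

For example, sliding a 10-sample step template along a 50-sample profile with a dark-to-bright transition at index 20 reports offset 15, i.e. the template's internal edge (at sample 5) aligned with the profile's edge.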

  18. Delaminated rotator cuff tear: extension of delamination and cuff integrity after arthroscopic rotator cuff repair.

    PubMed

    Gwak, Heui-Chul; Kim, Chang-Wan; Kim, Jung-Han; Choo, Hye-Jeung; Sagong, Seung-Yeob; Shin, John

    2015-05-01

    The purpose of this study was to evaluate the extension of delamination and the cuff integrity after arthroscopic repair of delaminated rotator cuff tears. Sixty-five patients with delaminated rotator cuff tears were retrospectively reviewed. The delaminated tears were divided into full-thickness delaminated tears and partial-thickness delaminated tears. To evaluate the medial extension, we calculated the coronal size of the delaminated portion. To evaluate the posterior extension, we checked the tendon involved. Cuff integrity was evaluated by computed tomography arthrography. The mean medial extension in the full-thickness and partial-thickness delaminated tears was 18.1 ± 6.0 mm and 22.7 ± 6.3 mm, respectively (P = .0084). The posterior extension into the supraspinatus and the infraspinatus was 36.9% and 32.3%, respectively, in the full-thickness delaminated tears, and it was 27.7% and 3.1%, respectively, in the partial-thickness delaminated tears (P = .0043). With regard to cuff integrity, 35 cases of anatomic healing, 10 cases of partial healing defects, and 17 cases of retear were detected. Among the patients with retear and partial healing of the defect, all the partially healed defects showed delamination. Three retear patients showed delamination, and 14 retear patients did not show delamination; the difference was statistically significant (P = .0001). The full-thickness delaminated tears showed less medial extension and more posterior extension than the partial-thickness delaminated tears. Delamination did not develop in retear patients, but delamination was common in the patients with partially healed defects. Copyright © 2015 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Elsevier Inc. All rights reserved.

  19. Interactive computer training to teach discrete-trial instruction to undergraduates and special educators in Brazil: A replication and extension.

    PubMed

    Higbee, Thomas S; Aporta, Ana Paula; Resende, Alice; Nogueira, Mateus; Goyos, Celso; Pollard, Joy S

    2016-12-01

    Discrete-trial instruction (DTI) is a behavioral method of teaching young children with autism spectrum disorders (ASD) that has received a significant amount of research support. Because of a lack of qualified trainers in many areas of the world, researchers have recently begun to investigate alternative methods of training professionals to implement behavioral teaching procedures. One promising training method is interactive computer training, in which slides with recorded narration, video modeling, and embedded evaluation of content knowledge are used to teach a skill. In the present study, the effectiveness of interactive computer training developed by Pollard, Higbee, Akers, and Brodhead (2014), translated into Brazilian Portuguese, was evaluated with 4 university students (Study 1) and 4 special education teachers (Study 2). We evaluated the effectiveness of training on DTI skills during role-plays with research assistants (Study 1) and during DTI sessions with young children with ASD (Studies 1 and 2) using a multiple baseline design. All participants acquired DTI skills after interactive computer training, although 5 of 8 participants required some form of feedback to reach proficiency. Responding generalized to untaught teaching programs for all participants. We evaluated maintenance with the teachers in Study 2, and DTI skills were maintained with 3 of 4 participants. © 2016 Society for the Experimental Analysis of Behavior.

  20. Stream Lifetimes Against Planetary Encounters

    NASA Technical Reports Server (NTRS)

    Valsecchi, G. B.; Lega, E.; Froeschle, Cl.

    2011-01-01

    We study, both analytically and numerically, the perturbation induced by an encounter with a planet on a meteoroid stream. Our analytical tool is the extension of Öpik's theory of close encounters, that we apply to streams described by geocentric variables. The resulting formulae are used to compute the rate at which a stream is dispersed by planetary encounters into the sporadic background. We have verified the accuracy of the analytical model using a numerical test.

  1. Evaluating painful osteopenia in the elderly

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Norris, M.A.; De Smet, A.A.

    1991-06-01

    Osteopenia is a frequent finding on radiographs of elderly patients. When the exam is performed for skeletal pain, this finding may be significant. The differential diagnosis for osteopenia is extensive, but individualizing the patient work-up begins with a careful clinical history and laboratory studies. Appropriate radiographic exams can then be requested. A plain radiograph is always the starting point, followed by--as indicated--a nuclear bone scan, computed tomography, and magnetic resonance imaging.

  2. Communal Learning versus Individual Learning: An Exploratory Convergent Parallel Mixed-Method Study to Describe How Young African American Novice Programmers Learn Computational Thinking Skills in an Informal Learning Environment

    ERIC Educational Resources Information Center

    Hatley, Leshell April Denise

    2016-01-01

    Today, most young people in the United States (U.S.) live technology-saturated lives. Their educational, entertainment, and career options originate from and demand incredible technological innovations. However, this extensive ownership of and access to technology does not indicate that today's youth know how technology works or how to control and…

  3. Darwin's bee-trap: The kinetics of Catasetum, a new world orchid.

    PubMed

    Nicholson, Charles C; Bales, James W; Palmer-Fortune, Joyce E; Nicholson, Robert G

    2008-01-01

    The orchid genus Catasetum employs a hair-trigger-activated pollen release mechanism, which forcibly attaches pollen sacs onto foraging insects in the New World tropics. This remarkable adaptation was studied extensively by Charles Darwin, who termed the rapid response "sensitiveness." Using high-speed video cameras at a frame rate of 1000 fps, the rapid release was filmed, and from the resulting footage velocity, speed, acceleration, force, and kinetic energy were computed.
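
    The kinematic quantities named in the abstract follow from frame-by-frame position tracking; a minimal sketch of that reduction (every number below is an illustrative assumption, not data from the study):

```python
# Sketch of deriving kinematic quantities from high-speed video tracking.
# All numbers are illustrative assumptions, not data from the study.
DT = 1.0 / 1000.0   # 1000 fps -> 1 ms between frames
MASS = 2.0e-4       # assumed moving mass in kg (hypothetical)

# Hypothetical tracked positions of the pollinarium, one per frame (metres)
positions = [0.000, 0.001, 0.004, 0.009, 0.016]

# Forward differences give per-frame velocity, then acceleration
velocities = [(positions[i + 1] - positions[i]) / DT
              for i in range(len(positions) - 1)]
accelerations = [(velocities[i + 1] - velocities[i]) / DT
                 for i in range(len(velocities) - 1)]

peak_v = max(velocities)                    # 7.0 m/s for these sample points
peak_a = max(accelerations)                 # 2000 m/s^2
force = MASS * peak_a                       # F = m * a
kinetic_energy = 0.5 * MASS * peak_v ** 2   # E = m v^2 / 2
```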

  4. Actor-network Procedures: Modeling Multi-factor Authentication, Device Pairing, Social Interactions

    DTIC Science & Technology

    2011-08-29

    ...unmodifiable properties of your body; or the capabilities that you cannot convey to others, such as your handwriting. An identity can thus be determined by... network, two principals with the same set of secrets but, say, different computational powers can be distinguished by timing their responses. Or they... says that configurations are finite sets. Partially ordered multisets, or pomsets, were introduced and extensively studied by Vaughan Pratt and his...

  5. Ultrasonic Doppler measurement of renal artery blood flow

    NASA Technical Reports Server (NTRS)

    Freund, W. R.; Meindl, J. D.

    1975-01-01

    An extensive evaluation of the practical and theoretical limitations encountered in the use of totally implantable CW Doppler flowmeters is provided. Theoretical analyses, computer models, in-vitro and in-vivo calibration studies describe the sources and magnitudes of potential errors in the measurement of blood flow through the renal artery, as well as larger vessels in the circulatory system. The evaluation of new flowmeter/transducer systems and their use in physiological investigations is reported.

  6. Endogenous Crisis Waves: Stochastic Model with Synchronized Collective Behavior

    NASA Astrophysics Data System (ADS)

    Gualdi, Stanislao; Bouchaud, Jean-Philippe; Cencetti, Giulia; Tarzia, Marco; Zamponi, Francesco

    2015-02-01

    We propose a simple framework to understand commonly observed crisis waves in macroeconomic agent-based models, which is also relevant to a variety of other physical or biological situations where synchronization occurs. We compute exactly the phase diagram of the model and the location of the synchronization transition in parameter space. Many modifications and extensions can be studied, confirming that the synchronization transition is extremely robust against various sources of noise or imperfections.

  7. Extension of a Kinetic Approach to Chemical Reactions to Electronic Energy Levels and Reactions Involving Charged Species with Application to DSMC Simulations

    NASA Technical Reports Server (NTRS)

    Liechty, Derek S.

    2014-01-01

    The ability to compute rarefied, ionized hypersonic flows is becoming more important as missions such as Earth reentry, landing high mass payloads on Mars, and the exploration of the outer planets and their satellites are being considered. Recently introduced molecular-level chemistry models that predict equilibrium and nonequilibrium reaction rates using only kinetic theory and fundamental molecular properties are extended in the current work to include electronic energy level transitions and reactions involving charged particles. These extensions are shown to agree favorably with reported transition and reaction rates from the literature for near-equilibrium conditions. Also, the extensions are applied to the second flight of the Project FIRE flight experiment at 1634 seconds with a Knudsen number of 0.001 at an altitude of 76.4 km. In order to accomplish this, NASA's direct simulation Monte Carlo code DAC was rewritten to include the ability to simulate charge-neutral ionized flows, take advantage of the recently introduced chemistry model, and to include the extensions presented in this work. The 1634 second data point was chosen for comparisons to be made in order to include a CFD solution. The Knudsen number at this point in time is such that the DSMC simulations are still tractable and the CFD computations are at the edge of what is considered valid because, although near-transitional, the flow is still considered to be continuum. It is shown that the inclusion of electronic energy levels in the DSMC simulation is necessary for flows of this nature and is required for comparison to the CFD solution. The flow field solutions are also post-processed by the nonequilibrium radiation code HARA to compute the radiative portion.

  8. Computational simulation of concurrent engineering for aerospace propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Singhal, S. N.

    1992-01-01

    Results are summarized of an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulations methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties - fundamental in developing such methods, is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering for propulsion systems and systems in general. Benefits and facets needing early attention in the development are outlined.

  9. Computational simulation for concurrent engineering of aerospace propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Singhal, S. N.

    1993-01-01

    Results are summarized for an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties--fundamental to develop such methods, is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering of propulsion systems and systems in general. Benefits and issues needing early attention in the development are outlined.

  10. Implementation of DFT application on ternary optical computer

    NASA Astrophysics Data System (ADS)

    Junjie, Peng; Youyi, Fu; Xiaofeng, Zhang; Shuai, Kong; Xinyu, Wei

    2018-03-01

    Because of its characteristic huge number of data bits and low energy consumption, optical computing may be used in applications such as the DFT, which require a large amount of computation and can be implemented in parallel. Accordingly, DFT implementation methods in full parallel as well as in partial parallel are presented. Extensive experiments were carried out based on the resources of a ternary optical computer (TOC). Experimental results show that the proposed schemes are correct and feasible. They provide a foundation for further exploration of applications on the TOC that need a large amount of calculation and can be processed in parallel.
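
    The full-parallel decomposition the abstract refers to is visible in the DFT's definition: each output bin is an independent sum over the input samples. A plain software sketch (the ternary-optical encoding itself is not modeled here):

```python
import cmath

# Each DFT output bin X[k] is an independent sum over the input samples,
# which is exactly the structure a fully parallel implementation exploits:
# all N bins can be computed concurrently.
def dft(x):
    n_pts = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / n_pts)
                for n in range(n_pts))
            for k in range(n_pts)]   # every k is independent -> parallelizable

signal = [1.0, 0.0, -1.0, 0.0]       # simple test signal
spectrum = dft(signal)
# magnitudes for this signal: |X| = [0, 2, 0, 2]
```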

  11. Impact of remote sensing upon the planning, management, and development of water resources

    NASA Technical Reports Server (NTRS)

    Loats, H. L.; Fowler, T. R.; Frech, S. L.

    1974-01-01

    A survey of the principal water resource users was conducted to determine the impact of new remote data streams on hydrologic computer models. The analysis of the responses and direct contact demonstrated that: (1) the majority of water resource effort of the type suitable to remote sensing inputs is conducted by major federal water resources agencies or through federally stimulated research, (2) the federal government develops most of the hydrologic models used in this effort; and (3) federal computer power is extensive. The computers, computer power, and hydrologic models in current use were determined.

  12. Computational simulation for concurrent engineering of aerospace propulsion systems

    NASA Astrophysics Data System (ADS)

    Chamis, C. C.; Singhal, S. N.

    1993-02-01

    Results are summarized for an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties--fundamental to develop such methods, is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering of propulsion systems and systems in general. Benefits and issues needing early attention in the development are outlined.

  13. Future Approach to tier-0 extension

    NASA Astrophysics Data System (ADS)

    Jones, B.; McCance, G.; Cordeiro, C.; Giordano, D.; Traylen, S.; Moreno García, D.

    2017-10-01

    The current tier-0 processing at CERN is done on two managed sites, the CERN computer centre and the Wigner computer centre. With the proliferation of public cloud resources at increasingly competitive prices, we have been investigating how to transparently increase our compute capacity to include these providers. The approach taken has been to integrate these resources using our existing deployment and computer management tools and to provide them in a way that exposes them to users as part of the same site. The paper will describe the architecture, the toolset and the current production experiences of this model.

  14. Arbitrated Quantum Signature with Hamiltonian Algorithm Based on Blind Quantum Computation

    NASA Astrophysics Data System (ADS)

    Shi, Ronghua; Ding, Wanting; Shi, Jinjing

    2018-03-01

    A novel arbitrated quantum signature (AQS) scheme is proposed, motivated by the Hamiltonian algorithm (HA) and blind quantum computation (BQC). The signature generation and verification algorithms are designed based on the HA, which enables the scheme to rely less on computational complexity. Because blind quantum computation is applied, it is unnecessary to recover the original messages when verifying signatures, which improves the simplicity and operability of the scheme. It is proved that the scheme can be deployed securely, and the extended AQS has extensive applications in E-payment systems, E-government, E-business, etc.

  15. Modeling Subsurface Reactive Flows Using Leadership-Class Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mills, Richard T; Hammond, Glenn; Lichtner, Peter

    2009-01-01

    We describe our experiences running PFLOTRAN - a code for simulation of coupled hydro-thermal-chemical processes in variably saturated, non-isothermal, porous media - on leadership-class supercomputers, including initial experiences running on the petaflop incarnation of Jaguar, the Cray XT5 at the National Center for Computational Sciences at Oak Ridge National Laboratory. PFLOTRAN utilizes fully implicit time-stepping and is built on top of the Portable, Extensible Toolkit for Scientific Computation (PETSc). We discuss some of the hurdles to 'at scale' performance with PFLOTRAN and the progress we have made in overcoming them on leadership-class computer architectures.

  16. Arbitrated Quantum Signature with Hamiltonian Algorithm Based on Blind Quantum Computation

    NASA Astrophysics Data System (ADS)

    Shi, Ronghua; Ding, Wanting; Shi, Jinjing

    2018-07-01

    A novel arbitrated quantum signature (AQS) scheme is proposed, motivated by the Hamiltonian algorithm (HA) and blind quantum computation (BQC). The signature generation and verification algorithms are designed based on the HA, which enables the scheme to rely less on computational complexity. Because blind quantum computation is applied, it is unnecessary to recover the original messages when verifying signatures, which improves the simplicity and operability of the scheme. It is proved that the scheme can be deployed securely, and the extended AQS has extensive applications in E-payment systems, E-government, E-business, etc.

  17. Challenges in reducing the computational time of QSTS simulations for distribution system analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deboever, Jeremiah; Zhang, Xiaochen; Reno, Matthew J.

    The rapid increase in penetration of distributed energy resources on the electric power distribution system has created a need for more comprehensive interconnection modelling and impact analysis. Unlike conventional scenario-based studies, quasi-static time-series (QSTS) simulations can realistically model time-dependent voltage controllers and the diversity of potential impacts that can occur at different times of year. However, to accurately model a distribution system with all its controllable devices, a yearlong simulation at 1-second resolution is often required, which could take conventional computers a computational time of 10 to 120 hours when an actual unbalanced distribution feeder is modeled. This computational burden is a clear limitation to the adoption of QSTS simulations in interconnection studies and for determining optimal control solutions for utility operations. Our ongoing research to improve the speed of QSTS simulation has revealed many unique aspects of distribution system modelling and sequential power flow analysis that make fast QSTS a very difficult problem to solve. In this report, the most relevant challenges in reducing the computational time of QSTS simulations are presented: the number of power flows to solve, circuit complexity, time dependence between time steps, multiple valid power flow solutions, controllable element interactions, and extensive accurate simulation analysis.
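
    The computational burden is easy to sanity-check: a yearlong run at 1-second resolution means one sequential power flow per second of the year. A rough sketch (the per-solve times are assumed, not taken from the report):

```python
# Back-of-the-envelope scale of a yearlong QSTS run at 1-second resolution
# (per-solve costs are assumed for illustration).
SECONDS_PER_YEAR = 365 * 24 * 3600   # 31,536,000 time steps
power_flows = SECONDS_PER_YEAR       # one sequential power flow per step

wall_clock_hours = {}
for ms_per_solve in (1, 10):         # assumed per-solve cost range
    wall_clock_hours[ms_per_solve] = power_flows * ms_per_solve / 1000 / 3600

# ~8.8 h at 1 ms/solve and ~88 h at 10 ms/solve -- the same order as the
# 10-to-120-hour range cited for an actual unbalanced feeder.
```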

  18. An Examination of the Impact of Computer-Based Animations and Visualization Sequence on Student Understanding of Hadley Cells in Atmospheric Circulation

    ERIC Educational Resources Information Center

    Harris, Daniel Wyatt

    2012-01-01

    Research examining animation use for student learning has been conducted in the last two decades across a multitude of instructional environments and content areas. The extensive construction and implementation of animations in learning resulted from the availability of powerful computing systems and the perceived advantages the novel medium…

  19. Using Testbanking To Implement Classroom Management/Extension through the Use of Computers.

    ERIC Educational Resources Information Center

    Thommen, John D.

    Testbanking provides teachers with an effective, low-cost, time-saving opportunity to improve the testing aspect of their classes. Testbanking, which involves the use of a testbank program and a computer, allows teachers to develop and generate tests and test-forms with a minimum of effort. Teachers who test using true and false, multiple choice,…

  20. Reaction of formaldehyde at the ortho- and para-positions of phenol: exploration of mechanisms using computational chemistry.

    Treesearch

    Anthony H. Conner; Melissa S. Reeves

    2001-01-01

    Computational chemistry methods can be used to explore the theoretical chemistry behind reactive systems, to compare the relative chemical reactivity of different systems, and, by extension, to predict the reactivity of new systems. Ongoing research has focused on the reactivity of a wide variety of phenolic compounds with formaldehyde using semi-empirical and ab...

  1. Convergence acceleration of viscous flow computations

    NASA Technical Reports Server (NTRS)

    Johnson, G. M.

    1982-01-01

    A multiple-grid convergence acceleration technique introduced for application to the solution of the Euler equations by means of Lax-Wendroff algorithms is extended to treat compressible viscous flow. Computational results are presented for the solution of the thin-layer version of the Navier-Stokes equations using the explicit MacCormack algorithm, accelerated by a convective coarse-grid scheme. Extensions and generalizations are mentioned.

  2. Revised description of index of Florida water data collection active stations and a user's guide for station or site information retrieval computer program FINDEX H578

    USGS Publications Warehouse

    Geiger, Linda H.

    1983-01-01

    The report is an update of U.S. Geological Survey Open-File Report 77-703, which described a retrieval program for an administrative index of active data-collection sites in Florida. Extensive changes to the Findex system have been made since 1977, making the previous report obsolete. The data base and the computer programs available in the Findex system are documented in this report. This system serves a vital need in the administration of the many and diverse water-data collection activities. District offices with extensive data-collection activities will benefit from the documentation of the system. Largely descriptive, the report tells how a file of computer card images has been established which contains entries for all sites in Florida at which there is currently a water-data collection activity. Entries include information such as identification number, station name, location, type of site, county, frequency of data collection, funding, and other pertinent details. The computer program FINDEX selectively retrieves entries and lists them in a format suitable for publication. The index is updated routinely. (USGS)

  3. GOAL-to-HAL translation study

    NASA Technical Reports Server (NTRS)

    Flanders, J. H.; Helmers, C. T.; Stanten, S. F.

    1973-01-01

    This report deals with the feasibility, problems, solutions, and mapping of a GOAL language to HAL language translator. Ground Operations Aerospace Language, or GOAL, is a test-oriented higher order language developed by the John F. Kennedy Space Center to be used in checkout and launch of the space shuttle. HAL is a structured higher order language developed by the Johnson Space Center to be used in writing the flight software for the onboard shuttle computers. Since the onboard computers will extensively support ground checkout of the space shuttle, and since these computers and the software development facilities on the ground use the HAL language as baseline, the translation of GOAL to HAL becomes significant. The issue of feasibility was examined and it was found that a GOAL to HAL translator is feasible. Special problems are identified and solutions proposed. Finally, examples of translation are provided for each category of complete GOAL statement.

  4. Optimizing Integrated Terminal Airspace Operations Under Uncertainty

    NASA Technical Reports Server (NTRS)

    Bosson, Christabelle; Xue, Min; Zelinski, Shannon

    2014-01-01

    In the terminal airspace, integrated departures and arrivals have the potential to increase operations efficiency. Recent research has developed genetic-algorithm-based schedulers for integrated arrival and departure operations under uncertainty. This paper presents an alternate method using a machine job-shop scheduling formulation to model the integrated airspace operations. A multistage stochastic programming approach is chosen to formulate the problem, and candidate solutions are obtained by solving sample average approximation problems with finite sample size. Because approximate solutions are computed, the proposed algorithm incorporates the computation of statistical bounds to estimate the optimality of the candidate solutions. A proof-of-concept study is conducted on a baseline implementation of a simple problem considering a fleet mix of 14 aircraft evolving in a model of the Los Angeles terminal airspace. A more thorough statistical analysis is also performed to evaluate the impact of the number of scenarios considered in the sampled problem. To handle extensive sampling computations, a multithreading technique is introduced.
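
    The sample average approximation idea mentioned above can be sketched on a toy problem: replace the expectation with a finite-sample mean over scenarios and pick the decision minimizing it. Everything below (Gaussian scenarios, quadratic cost, the candidate decisions) is an illustrative assumption, not the paper's scheduler model:

```python
import random

def sample_average_approximation(decisions, cost, n_scenarios, seed=0):
    """Pick the decision minimizing the average cost over a finite sample
    of random scenarios (a stand-in for the stochastic program)."""
    rng = random.Random(seed)
    scenarios = [rng.gauss(0.0, 1.0) for _ in range(n_scenarios)]

    def sampled_cost(d):
        return sum(cost(d, s) for s in scenarios) / n_scenarios

    return min(decisions, key=sampled_cost)

# Toy problem: cost(d, s) = (d - s)^2; the true optimum is d = E[s] = 0.
best = sample_average_approximation(
    decisions=[-1.0, -0.5, 0.0, 0.5, 1.0],
    cost=lambda d, s: (d - s) ** 2,
    n_scenarios=500)
# With enough scenarios the sampled optimum matches the true optimum, d = 0.
```

    In the paper's setting the same finite-sample structure is what makes statistical optimality bounds computable, since each sampled problem can be re-solved under fresh scenario draws.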

  5. Early phase drug discovery: cheminformatics and computational techniques in identifying lead series.

    PubMed

    Duffy, Bryan C; Zhu, Lei; Decornez, Hélène; Kitchen, Douglas B

    2012-09-15

    Early drug discovery processes rely on hit finding procedures followed by extensive experimental confirmation in order to select high priority hit series which then undergo further scrutiny in hit-to-lead studies. The experimental cost and the risk associated with poor selection of lead series can be greatly reduced by the use of many different computational and cheminformatic techniques to sort and prioritize compounds. We describe the steps in typical hit identification and hit-to-lead programs and then describe how cheminformatic analysis assists this process. In particular, scaffold analysis, clustering and property calculations assist in the design of high-throughput screening libraries, the early analysis of hits and then organizing compounds into series for their progression from hits to leads. Additionally, these computational tools can be used in virtual screening to design hit-finding libraries and as procedures to help with early SAR exploration. Copyright © 2012 Elsevier Ltd. All rights reserved.

  6. Fun During Knee Rehabilitation: Feasibility and Acceptability Testing of a New Android-Based Training Device.

    PubMed

    Weber-Spickschen, Thomas Sanjay; Colcuc, Christian; Hanke, Alexander; Clausen, Jan-Dierk; James, Paul Abraham; Horstmann, Hauke

    2017-01-01

    The initial goals of rehabilitation after knee injuries and operations are to achieve full knee extension and to activate the quadriceps muscle. In addition to regular physiotherapy, an Android-based knee training device is designed to help patients achieve these goals and improve compliance in the early rehabilitation period. This knee training device combines the fun of a computer game with muscular training or rehabilitation. Our aim was to test the feasibility and acceptability of this new device. Fifty volunteer subjects enrolled to test the computer-game-aided device. The first game was the high-striker game, which recorded maximum knee extension power. The second game involved controlling quadriceps muscular power to simulate flying an aeroplane, in order to record the accuracy of muscle activation. The subjects evaluated the games by completing a simple questionnaire. No technical problems were encountered during use of the device, and no subjects complained of any discomfort after using it. Measurements including maximum knee extension power, knee muscle activation, and control were recorded successfully. Subjects rated their experience with the device as either excellent or very good and agreed that the device can motivate and monitor the progress of knee rehabilitation training. To the best of our knowledge, this is the first Android-based tool available to fast-track knee rehabilitation training. All subjects gave very positive feedback on this computer-game-aided knee device.

  7. SiMon: Simulation Monitor for Computational Astrophysics

    NASA Astrophysics Data System (ADS)

    Xuran Qian, Penny; Cai, Maxwell Xu; Portegies Zwart, Simon; Zhu, Ming

    2017-09-01

    Scientific discovery via numerical simulations is important in modern astrophysics. This relatively new branch of astrophysics has become possible due to the development of reliable numerical algorithms and the high performance of modern computing technologies. These enable the analysis of large collections of observational data and the acquisition of new data via simulations at unprecedented accuracy and resolution. Ideally, simulations run until they reach some pre-determined termination condition, but often other factors cause extensive numerical approaches to break down at an earlier stage, with processes interrupted by unexpected events in the software or the hardware. In those cases, the scientist handles the interrupt manually, which is time-consuming and prone to errors. We present the Simulation Monitor (SiMon) to automate the farming of large and extensive sets of simulation processes. Our method is light-weight, fully automates the entire workflow management, operates concurrently across multiple platforms, and can be installed in user space. Inspired by the process of crop farming, we perceive each simulation as a crop in the field, and running a simulation becomes analogous to growing crops. With the development of SiMon we relax the technical aspects of simulation management. The initial package was developed for extensive parameter searches in numerical simulations, but it turns out to work equally well for automating the computational processing and reduction of observational data.

  8. Extension of research data repository system to support direct compute access to biomedical datasets: enhancing Dataverse to support large datasets

    PubMed Central

    McKinney, Bill; Meyer, Peter A.; Crosas, Mercè; Sliz, Piotr

    2016-01-01

    Access to experimental X-ray diffraction image data is important for validation and reproduction of macromolecular models and indispensable for the development of structural biology processing methods. In response to the evolving needs of the structural biology community, we recently established a diffraction data publication system, the Structural Biology Data Grid (SBDG, data.sbgrid.org), to preserve primary experimental datasets supporting scientific publications. All datasets published through the SBDG are freely available to the research community under a public domain dedication license, with metadata compliant with the DataCite Schema (schema.datacite.org). A proof-of-concept study demonstrated community interest and utility. Publication of large datasets is a challenge shared by several fields, and the SBDG has begun collaborating with the Institute for Quantitative Social Science at Harvard University to extend the Dataverse (dataverse.org) open-source data repository system to structural biology datasets. Several extensions are necessary to support the size and metadata requirements for structural biology datasets. In this paper, we describe one such extension—functionality supporting preservation of filesystem structure within Dataverse—which is essential for both in-place computation and supporting non-http data transfers. PMID:27862010

  9. Computational approaches for understanding the diagnosis and treatment of Parkinson’s disease

    PubMed Central

    Smith, Stephen L.; Lones, Michael A.; Bedder, Matthew; Alty, Jane E.; Cosgrove, Jeremy; Maguire, Richard J.; Pownall, Mary Elizabeth; Ivanoiu, Diana; Lyle, Camille; Cording, Amy; Elliott, Christopher J.H.

    2015-01-01

    This study describes how the application of evolutionary algorithms (EAs) can be used to study motor function in humans with Parkinson’s disease (PD) and in animal models of PD. Human data is obtained using commercially available sensors via a range of non-invasive procedures that follow conventional clinical practice. EAs can then be used to classify human data for a range of uses, including diagnosis and disease monitoring. New results are presented that demonstrate how EAs can also be used to classify fruit flies with and without genetic mutations that cause Parkinson’s by using measurements of the proboscis extension reflex. The case is made for a computational approach that can be applied across human and animal studies of PD and lays the way for evaluation of existing and new drug therapies in a truly objective way. PMID:26577157
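
    As a minimal illustration of the classification idea, an evolutionary algorithm can evolve even a single decision threshold on one synthetic feature; the study's EAs, features, and clinical data are far richer, and everything below is assumed for illustration:

```python
import random

# Toy evolutionary algorithm evolving a single classification threshold on
# one synthetic feature (purely illustrative).
rng = random.Random(42)
# Hypothetical 1D movement score: label 0 = control, 1 = PD-like
data = ([(rng.gauss(0.0, 1.0), 0) for _ in range(50)] +
        [(rng.gauss(2.0, 1.0), 1) for _ in range(50)])

def fitness(threshold):
    """Accuracy of the rule 'classify as 1 when score > threshold'."""
    correct = sum((x > threshold) == bool(label) for x, label in data)
    return correct / len(data)

# Evolve: rank by fitness, keep the best half, mutate survivors to refill
population = [rng.uniform(-3.0, 5.0) for _ in range(20)]
for _ in range(30):
    population.sort(key=fitness, reverse=True)
    survivors = population[:10]
    population = survivors + [t + rng.gauss(0.0, 0.3) for t in survivors]

best = max(population, key=fitness)
# best settles near the class boundary, with accuracy well above chance
```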

  10. Nonlinear Solver Approaches for the Diffusive Wave Approximation to the Shallow Water Equations

    NASA Astrophysics Data System (ADS)

    Collier, N.; Knepley, M.

    2015-12-01

    The diffusive wave approximation to the shallow water equations (DSW) is a doubly-degenerate, nonlinear, parabolic partial differential equation used to model overland flows. Despite its challenges, the DSW equation has been extensively used to model the overland flow component of various integrated surface/subsurface models. The equation's complications become increasingly problematic when ponding occurs, a feature which becomes pervasive when solving on large domains with realistic terrain. In this talk I discuss the various forms and regularizations of the DSW equation and highlight their effect on the solvability of the nonlinear system. In addition to this analysis, I present results of a numerical study which tests the applicability of a class of composable nonlinear algebraic solvers recently added to the Portable, Extensible, Toolkit for Scientific Computation (PETSc).
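
    One way to see why regularization bears on solvability is a scalar toy problem whose nonlinearity degenerates like the DSW flux term h^(5/3): the unregularized Jacobian vanishes as h approaches zero, while a small linear regularization keeps the slope bounded below. This is an illustrative sketch only, not the PETSc composable solvers discussed in the talk, and all values are assumed:

```python
# Toy illustration of a degenerate nonlinearity like the DSW flux h**(5/3).
def newton(f, df, x0, tol=1e-10, max_it=100):
    """Basic scalar Newton iteration; returns (root, iterations used)."""
    x = x0
    for it in range(max_it):
        fx = f(x)
        if abs(fx) < tol:
            return x, it
        x -= fx / df(x)
    return x, max_it

EPS = 1e-3       # regularization strength (assumed)
TARGET = 1e-6    # small right-hand side near the degenerate point

f_raw = lambda h: h ** (5 / 3) - TARGET
df_raw = lambda h: (5 / 3) * h ** (2 / 3)          # -> 0 as h -> 0
f_reg = lambda h: h ** (5 / 3) + EPS * h - TARGET
df_reg = lambda h: (5 / 3) * h ** (2 / 3) + EPS    # >= EPS everywhere

root_raw, iters_raw = newton(f_raw, df_raw, x0=1.0)
root_reg, iters_reg = newton(f_reg, df_reg, x0=1.0)
# As TARGET -> 0 the unregularized Jacobian at the root vanishes (nearly
# singular solve), while df_reg stays bounded below by EPS.
```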

  11. Practicing evidence based medicine at the bedside: a randomized controlled pilot study in undergraduate medical students assessing the practicality of tablets, smartphones, and computers in clinical life.

    PubMed

    Friederichs, Hendrik; Marschall, Bernhard; Weissenstein, Anne

    2014-12-05

    Practicing evidence-based medicine is an important aspect of providing good medical care. Accessing external information through literature searches on computer-based systems can effectively achieve integration in clinical care. We conducted a pilot study using smartphones, tablets, and stationary computers as search devices at the bedside. The objective was to determine possible differences between the various devices and assess students' internet use habits. In a randomized controlled pilot study, 120 students were divided into three groups. One control group solved clinical problems on a computer and two intervention groups used mobile devices at the bedside. In a questionnaire, students were asked to report their internet use habits as well as their satisfaction with their respective search tool using a 5-point Likert scale. Of 120 surveys, 94 (78.3%) complete data sets were analyzed. The mobility of the tablet (3.90) and the smartphone (4.39) was seen as a significant advantage over the computer (2.38, p < .001). However, for performing an effective literature search at the bedside, the computer (3.22) was rated superior to both tablet computers (2.13) and smartphones (1.68). No significant differences were detected between tablets and smartphones except satisfaction with screen size (tablet 4.10, smartphone 2.00, p < .001). Using a mobile device at the bedside to perform an extensive search is not suitable for students who prefer using computers. However, mobility is regarded as a substantial advantage, and therefore future applications might facilitate quick and simple searches at the bedside.

  12. Evaluating biomechanics of user-selected sitting and standing computer workstation.

    PubMed

    Lin, Michael Y; Barbir, Ana; Dennerlein, Jack T

    2017-11-01

    A standing computer workstation has become a popular modern workplace intervention to reduce sedentary behavior at work. However, users' interactions with a standing computer workstation, and its differences from a sitting workstation, need to be understood to assist in developing recommendations for use and setup. The study compared the differences in upper extremity posture and muscle activity between user-selected sitting and standing workstation setups. Twenty participants (10 females, 10 males) volunteered for the study. 3-D posture, surface electromyography, and user-reported discomfort were measured while completing simulated tasks with each participant's self-selected workstation setups. The sitting workstation was associated with more non-neutral shoulder postures and greater shoulder muscle activity, while the standing workstation induced a greater wrist adduction angle and greater extensor carpi radialis muscle activity. The sitting workstation was also associated with greater shoulder abduction postural variation (90th-10th percentile), while the standing workstation was associated with greater variation in shoulder rotation and wrist extension. Users reported similar overall discomfort levels within the first 10 min of work but had more than twice as much discomfort while standing than sitting after 45 min, with most discomfort reported in the low back for standing and the shoulder for sitting. These measures provide an understanding of users' different interactions with sitting and standing workstations, and alternating between the two configurations in short bouts may be a way of changing the loading pattern on the upper extremity. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Biologically inspired collision avoidance system for unmanned vehicles

    NASA Astrophysics Data System (ADS)

    Ortiz, Fernando E.; Graham, Brett; Spagnoli, Kyle; Kelmelis, Eric J.

    2009-05-01

    In this project, we collaborate with researchers in the neuroscience department at the University of Delaware to develop a Field Programmable Gate Array (FPGA)-based embedded computer, inspired by the brains of small vertebrates (fish). The mechanisms of object detection and avoidance in fish have been extensively studied by our Delaware collaborators. The midbrain optic tectum is a biological multimodal navigation controller capable of processing input from all senses that convey spatial information, including vision, audition, touch, and the lateral line (water current sensing in fish). Unfortunately, computational complexity makes these models too slow for use in real-time applications. These simulations are run offline on state-of-the-art desktop computers, presenting a gap between the application and the target platform: a low-power embedded device. EM Photonics has expertise in developing high-performance computers based on commodity platforms such as graphics cards (GPUs) and FPGAs. FPGAs offer (1) high computational power, low power consumption, and small footprint (in line with typical autonomous vehicle constraints), and (2) the ability to implement massively parallel computational architectures, which can be leveraged to closely emulate biological systems. Combining UD's brain modeling algorithms and the power of FPGAs, this computer enables autonomous navigation in complex environments, and further types of onboard neural processing in future applications.

  14. 3D simulations of early blood vessel formation

    NASA Astrophysics Data System (ADS)

    Cavalli, F.; Gamba, A.; Naldi, G.; Semplice, M.; Valdembri, D.; Serini, G.

    2007-08-01

    Blood vessel networks form by spontaneous aggregation of individual cells migrating toward vascularization sites (vasculogenesis). A successful theoretical model of two-dimensional experimental vasculogenesis has recently been proposed, showing the relevance of percolation concepts and of cell cross-talk (chemotactic autocrine loop) to the understanding of this self-aggregation process. Here we study the natural 3D extension of the computational model proposed earlier, which is relevant for the investigation of the genuinely three-dimensional process of vasculogenesis in vertebrate embryos. The computational model is based on a multidimensional Burgers equation coupled with a reaction-diffusion equation for a chemotactic factor and a mass conservation law. The numerical approximation of the computational model is obtained by high-order relaxed schemes. Space and time discretizations are performed using TVD and IMEX schemes, respectively. Due to the computational costs of realistic simulations, we have implemented the numerical algorithm on a cluster for parallel computation. Starting from initial conditions mimicking the experimentally observed ones, numerical simulations produce network-like structures qualitatively similar to those observed in the early stages of in vivo vasculogenesis. We develop the computation of critical percolative indices as a robust measure of the network geometry as a first step towards the comparison of computational and experimental data.
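
The model class described (mass conservation, Burgers-type cell dynamics, chemoattractant reaction-diffusion) has the following schematic shape; the symbols and coefficients below are illustrative placeholders, not taken from the paper:

```latex
% Schematic sketch of a Burgers-type chemotaxis model (illustrative notation):
\begin{align}
  \partial_t \rho + \nabla\cdot(\rho\,\mathbf{v}) &= 0
    && \text{(mass conservation)} \\
  \partial_t \mathbf{v} + (\mathbf{v}\cdot\nabla)\,\mathbf{v}
    &= \mu\,\nabla c
    && \text{(Burgers dynamics with chemotactic acceleration)} \\
  \partial_t c &= D\,\Delta c + \alpha\,\rho - \frac{c}{\tau}
    && \text{(chemoattractant release, diffusion, degradation)}
\end{align}
```

Here \(\rho\) is cell density, \(\mathbf{v}\) the cell velocity field, and \(c\) the chemotactic factor; \(\mu\), \(D\), \(\alpha\), \(\tau\) are model parameters.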

  15. GPU and APU computations of Finite Time Lyapunov Exponent fields

    NASA Astrophysics Data System (ADS)

    Conti, Christian; Rossinelli, Diego; Koumoutsakos, Petros

    2012-03-01

    We present GPU and APU accelerated computations of Finite-Time Lyapunov Exponent (FTLE) fields. The calculation of FTLEs is computationally intensive, because obtaining the sharp ridges associated with Lagrangian Coherent Structures requires extensive resampling of the flow field. The computational performance of this resampling is limited by the memory bandwidth of the underlying computer architecture. The present technique harnesses data-parallel execution on many-core architectures and relies on fast and accurate evaluations of moment-conserving functions for the mesh-to-particle interpolations. We demonstrate how the computation of FTLEs can be efficiently performed on a GPU and on an APU through OpenCL, and we report over one order of magnitude improvement over multi-threaded executions in FTLE computations of bluff body flows.
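
The FTLE computation itself is standard: from the gradient of the flow map one forms the Cauchy-Green strain tensor and takes the exponential growth rate of its largest eigenvalue. A serial NumPy sketch of that final step (the paper's contribution is accelerating the resampling, not this formula):

```python
import numpy as np

def ftle_field(flow_map_x, flow_map_y, dx, dy, T):
    """FTLE from gridded flow-map components.

    flow_map_x/y give the final particle positions for particles seeded
    on a regular grid with spacings dx, dy and advected for time T.
    """
    # Jacobian entries of the flow map by central differences
    # (np.gradient returns derivatives along axis 0 (y) then axis 1 (x)).
    dphix_dy, dphix_dx = np.gradient(flow_map_x, dy, dx)
    dphiy_dy, dphiy_dx = np.gradient(flow_map_y, dy, dx)

    ftle = np.zeros_like(flow_map_x)
    for i in range(flow_map_x.shape[0]):
        for j in range(flow_map_x.shape[1]):
            F = np.array([[dphix_dx[i, j], dphix_dy[i, j]],
                          [dphiy_dx[i, j], dphiy_dy[i, j]]])
            C = F.T @ F                          # Cauchy-Green tensor
            lam_max = np.linalg.eigvalsh(C)[-1]  # largest eigenvalue
            ftle[i, j] = np.log(np.sqrt(lam_max)) / abs(T)
    return ftle
```

For a uniform stretching flow map phi(x, y) = (x e^{aT}, y e^{-aT}) this returns the constant field a, as expected.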

  16. "Hypothetical" Heavy Particles Dynamics in LES of Turbulent Dispersed Two-Phase Channel Flow

    NASA Technical Reports Server (NTRS)

    Gorokhovski, M.; Chtab, A.

    2003-01-01

    An extensive experimental study of dispersed two-phase turbulent flow in a vertical channel was performed in Eaton's research group in the Mechanical Engineering Department at Stanford University. In Wang & Squires (1996), this study motivated the validation of an LES approach with Lagrangian tracking of round particles governed by drag forces. While the computed flow velocity was predicted relatively well, the computed particle velocity differed strongly from the measured one. Using Monte Carlo simulation of inter-particle collisions, the computation of Yamamoto et al. (2001) was specifically performed to model Eaton's experiment. The results of Yamamoto et al. (2001) improved the particle velocity distribution. At the same time, Vance & Squires (2002) noted that the stochastic simulation of inter-particle collisions is too expensive, requiring significantly more CPU resources than the gas flow computation itself. Therefore, the inter-particle collisions need to be accounted for in a simpler yet still effective way. The main objective of the present paper is to present such a model in the framework of the LES/Lagrangian particle approach, and to compare the calculated results with Eaton's measurements and the modeling of Yamamoto et al.

  17. A novel computational approach "BP-STOCH" to study ligand binding to finite lattice.

    PubMed

    Beshnova, Daria A; Bereznyak, Ekaterina G; Shestopalova, Anna V; Evstigneev, Maxim P

    2011-03-01

    We report a novel computational algorithm, "BP-STOCH", for studying single-type ligand binding to biopolymers of finite length, such as DNA oligonucleotides or oligopeptides. It is based on the idea of representing any type of ligand-biopolymer complex in the form of a binary number, where "0" and "1" bits stand for vacant and engaged monomers of the biopolymer, respectively. Cycling over all binary numbers from the lowest, 0, up to the highest, 2^N - 1, sequentially generates all possible configurations of vacant/engaged monomers, which, after proper filtering, results in a full set of possible types of complexes in solution between the ligand and the N-site lattice. The principal advantage of the BP-STOCH algorithm is the possibility to incorporate into this cycle any conditions on computation of the concentrations and observed experimental parameters of the complexes in solution, and programmatic access to each monomer of the biopolymer within each binding site of every binding configuration. The latter is equivalent to unlimited extension of the basic reaction scheme and allows the BP-STOCH algorithm to be used as an alternative to conventional computational approaches.
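
The enumeration-and-filter idea can be sketched in a few lines. Here the filter (every bound ligand must cover m contiguous sites) is only a hypothetical example, since the algorithm's point is precisely that any filtering rule can be plugged into the cycle:

```python
def enumerate_complexes(N, m):
    """Enumerate ligand-lattice configurations on an N-site lattice.

    Each configuration is a binary number: bit i is 1 if monomer i is
    engaged, 0 if vacant.  The filter used here (each run of engaged
    sites must have length exactly m, i.e. each bound ligand covers m
    adjacent monomers) is an illustrative choice, not the only option.
    """
    valid = []
    for conf in range(2 ** N):            # cycle over all binary numbers
        bits = [(conf >> i) & 1 for i in range(N)]
        # Collect the lengths of all runs of consecutive engaged sites.
        runs, run = [], 0
        for b in bits + [0]:              # trailing 0 flushes the last run
            if b:
                run += 1
            elif run:
                runs.append(run)
                run = 0
        if all(r == m for r in runs):
            valid.append(tuple(bits))
    return valid
```

For example, a dimeric ligand (m = 2) on a 3-site lattice yields three configurations: the empty lattice and the two possible dimer placements.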

  18. Computational analysis of a multistage axial compressor

    NASA Astrophysics Data System (ADS)

    Mamidoju, Chaithanya

    Turbomachines are used extensively in the Aerospace, Power Generation, and Oil & Gas industries. The efficiency of these machines is often an important factor and has led to a continuous effort to improve designs to achieve better efficiency. The axial flow compressor is a major component in a gas turbine, with the turbine's overall performance depending strongly on compressor performance. Traditional analysis of axial compressors involves throughflow calculations, isolated blade passage analysis, quasi-3D blade-to-blade analysis, single-stage (rotor-stator) analysis, and multi-stage analysis involving larger design cycles. In the current study, the detailed flow through a 15-stage axial compressor is analyzed using a 3-D Navier-Stokes CFD solver in a parallel computing environment. A methodology is described for steady state (frozen rotor-stator) analysis of one blade passage per component. Various effects such as mesh type and density, boundary conditions, and tip clearance, and numerical issues such as turbulence model choice, advection model choice, and parallel processing performance are analyzed. A high sensitivity of the predictions to the above was found. Physical explanations of the flow features observed in the computational study are given. The total pressure rise versus mass flow rate was computed.

  19. Flexible structure control experiments using a real-time workstation for computer-aided control engineering

    NASA Technical Reports Server (NTRS)

    Stieber, Michael E.

    1989-01-01

    A Real-Time Workstation for Computer-Aided Control Engineering has been developed jointly by the Communications Research Centre (CRC) and Ruhr-Universitaet Bochum (RUB), West Germany. The system is presently used for the development and experimental verification of control techniques for large space systems with significant structural flexibility. The Real-Time Workstation is essentially an implementation of RUB's extensive Computer-Aided Control Engineering package KEDDC on an INTEL micro-computer running under the RMS real-time operating system. The portable system supports system identification, analysis, control design and simulation, as well as the immediate implementation and test of control systems. The Real-Time Workstation is currently being used by CRC to study control/structure interaction on a ground-based structure called DAISY, whose design was inspired by a reflector antenna. DAISY emulates the dynamics of a large flexible spacecraft with the following characteristics: rigid body modes, many clustered vibration modes with low frequencies, and extremely low damping. The Real-Time Workstation was found to be a very powerful tool for experimental studies, supporting control design and simulation, and conducting and evaluating tests within one integrated environment.

  20. Navier-Stokes Computations With One-Equation Turbulence Model for Flows Along Concave Wall Surfaces

    NASA Technical Reports Server (NTRS)

    Wang, Chi R.

    2005-01-01

    This report presents the use of a time-marching three-dimensional compressible Navier-Stokes equation numerical solver with a one-equation turbulence model to simulate the flow fields developed along concave wall surfaces without and with a downstream extension flat wall surface. The 3-D Navier-Stokes numerical solver came from the NASA Glenn-HT code. The one-equation turbulence model was derived from the Spalart and Allmaras model. The computational approach was first calibrated against computations of the velocity and Reynolds shear stress profiles of a steady flat plate boundary layer flow. The computational approach was then used to simulate developing boundary layer flows along concave wall surfaces without and with a downstream extension wall. The author investigated the computational results of surface friction factors, near-surface velocity components, near-wall temperatures, and a turbulent shear stress component in terms of turbulence modeling, computational mesh configurations, inlet turbulence level, and time iteration step. The computational results were compared with existing measurements of skin friction factors, velocity components, and shear stresses of the developing boundary layer flows. With a fine computational mesh and a one-equation model, the computational approach could accurately predict the skin friction factors, near-surface velocity and temperature, and shear stress within the flows. The computed velocity components and shear stresses also showed the effect of vortices on the velocity variations over a concave wall. The computed eddy viscosities at near-wall locations were also compared with the results from a two-equation turbulence modeling technique. The inlet turbulence length scale was found to have little effect on the eddy viscosities at locations near the concave wall surface. The eddy viscosities from the one-equation and two-equation modeling were comparable at most stream-wise stations. The present one-equation turbulence model is an effective approach for turbulence modeling in the region near the solid wall surface of flow over a concave wall.

  1. Surfer: An Extensible Pull-Based Framework for Resource Selection and Ranking

    NASA Technical Reports Server (NTRS)

    Zolano, Paul Z.

    2004-01-01

    Grid computing aims to connect large numbers of geographically and organizationally distributed resources to increase computational power, resource utilization, and resource accessibility. In order to effectively utilize grids, users need to be connected to the best available resources at any given time. As grids are in constant flux, users cannot be expected to keep up with the configuration and status of the grid; thus they must be provided with automatic resource brokering for selecting and ranking resources meeting the constraints and preferences they specify. This paper presents a new OGSI-compliant resource selection and ranking framework called Surfer that has been implemented as part of NASA's Information Power Grid (IPG) project. Surfer is highly extensible and may be integrated into any grid environment by adding information providers knowledgeable about that environment.

  2. Algorithmic Extensions of Low-Dispersion Scheme and Modeling Effects for Acoustic Wave Simulation. Revised

    NASA Technical Reports Server (NTRS)

    Kaushik, Dinesh K.; Baysal, Oktay

    1997-01-01

    Accurate computation of acoustic wave propagation may be performed more efficiently when the dispersion relations of the waves are considered. Consequently, computational algorithms which attempt to preserve these relations have been gaining popularity in recent years. In the present paper, extensions to one such scheme are discussed. By solving the linearized, 2-D Euler and Navier-Stokes equations with such a method for acoustic wave propagation, several issues were investigated. Among them were higher-order accuracy, the choice of boundary conditions and differencing stencils, effects of viscosity, low-storage time integration, generalized curvilinear coordinates, periodic series, their reflections and interference patterns from a flat wall, and scattering from a circular cylinder. The results were found to be promising en route to the aeroacoustic simulations of realistic engineering problems.
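
Dispersion-preserving schemes are built around the modified wavenumber of the difference stencil, which measures how faithfully the discrete derivative propagates each Fourier mode. A small sketch of that standard analysis (generic central stencils, not the paper's specific scheme):

```python
import numpy as np

def modified_wavenumber(coeffs, kh):
    """Effective (modified) wavenumber k*h of an antisymmetric central
    finite-difference stencil u'(x) ~ (1/h) * sum_j a_j [u(x+jh) - u(x-jh)].

    coeffs: the positive-side coefficients a_1..a_M (antisymmetry assumed).
    Dispersion-relation-preserving schemes choose these coefficients to
    keep k*h close to kh over a wide band of wavenumbers.
    """
    return sum(2.0 * a * np.sin(j * kh)
               for j, a in enumerate(coeffs, start=1))
```

The second-order stencil (a_1 = 1/2) gives k*h = sin(kh), while the fourth-order stencil (a_1 = 2/3, a_2 = -1/12) tracks kh much more closely at small kh.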

  3. An assessment of laser velocimetry in hypersonic flow

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Although extensive progress has been made in computational fluid mechanics, reliable flight vehicle designs and modifications still cannot be made without recourse to extensive wind tunnel testing. Future progress in the computation of hypersonic flow fields is restricted by the need for a reliable mean flow and turbulence modeling data base which could be used to aid in the development of improved empirical models for use in numerical codes. Currently, there are few compressible flow measurements which could be used for this purpose. In this report, the results of experiments designed to assess the potential for laser velocimeter measurements of mean flow and turbulent fluctuations in hypersonic flow fields are presented. Details of a new laser velocimeter system which was designed and built for this test program are described.

  4. Principal facts for a gravity survey of the Gerlach Extension Known Geothermal Resource Area, Pershing County, Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peterson, D.L.; Kaufmann, H.E.

    1978-01-01

    During July 1977, fifty-one gravity stations were obtained in the Gerlach Extension Known Geothermal Resource Area and vicinity, northwestern Nevada. The gravity observations were made with a Worden gravimeter having a scale factor of about 0.5 milligal per division. No terrain corrections have been applied to these data. The earth tide correction was not used in drift reduction. The Geodetic Reference System 1967 formula (International Association of Geodesy, 1967) was used to compute theoretical gravity. Observed gravity is referenced to a base station in Gerlach, Nevada, having a value based on the Potsdam System of 1930. A density of 2.67 g per cm³ was used in computing the Bouguer anomaly.
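
The reduction described (theoretical gravity subtracted from observed gravity, with elevation corrections at an assumed density of 2.67 g/cm³) can be sketched as follows. The free-air and Bouguer-slab constants below are the standard textbook values; the survey's actual processing may differ in detail:

```python
def simple_bouguer_anomaly(g_obs, g_theor, elevation_m, density=2.67):
    """Simple Bouguer anomaly in milligals (illustrative sketch only).

    g_obs, g_theor: observed and theoretical gravity (mGal).
    Free-air correction: +0.3086 mGal per metre of elevation.
    Bouguer slab correction: 2*pi*G*rho*h = 0.04193 * rho mGal per metre,
    with rho in g/cm^3.  No terrain correction is applied, matching the
    survey description.
    """
    free_air = 0.3086 * elevation_m
    bouguer_slab = 0.04193 * density * elevation_m
    return g_obs + free_air - bouguer_slab - g_theor
```

At 100 m elevation and the 2.67 g/cm³ density used here, the net elevation correction is about +19.7 mGal.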

  5. Principal facts for a gravity survey of the Fly Ranch Extension Known Geothermal Resource Area, Pershing County, Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peterson, D.L.; Kaufmann, H.E.

    1978-01-01

    During July 1977, forty-four gravity stations were obtained in the Fly Ranch Extension Known Geothermal Resource Area and vicinity, northwestern Nevada. The gravity observations were made with a Worden gravimeter having a scale factor of about 0.5 milligal per division. No terrain corrections have been applied to these data. The earth tide correction was not used in drift reduction. The Geodetic Reference System 1967 formula (International Association of Geodesy, 1967) was used to compute theoretical gravity. Observed gravity is referenced to a base station in Gerlach, Nevada, having a value based on the Potsdam System of 1930 (fig. 1). A density of 2.67 g per cm³ was used in computing the Bouguer anomaly.

  6. Multiprocessor architectural study

    NASA Technical Reports Server (NTRS)

    Kosmala, A. L.; Stanten, S. F.; Vandever, W. H.

    1972-01-01

    An architectural design study was made of a multiprocessor computing system intended to meet functional and performance specifications appropriate to a manned space station application. Intermetrics' previous experience and accumulated knowledge of the multiprocessor field are used to generate a baseline philosophy for the design of a future SUMC* multiprocessor. Interrupts are defined and the crucial questions of interrupt structure, such as processor selection and response time, are discussed. Memory hierarchy and performance are discussed extensively, with particular attention to the design approach which utilizes a cache memory associated with each processor. The ability of an individual processor to approach its theoretical maximum performance is then analyzed in terms of a hit ratio. Memory management is envisioned as a virtual memory system implemented either through segmentation or paging. Addressing is discussed in terms of the various register designs adopted by current computers and those of advanced design.

  7. A Practical Engineering Approach to Predicting Fatigue Crack Growth in Riveted Lap Joints

    NASA Technical Reports Server (NTRS)

    Harris, Charles E.; Piascik, Robert S.; Newman, James C., Jr.

    1999-01-01

    An extensive experimental database has been assembled from very detailed teardown examinations of fatigue cracks found in rivet holes of fuselage structural components. Based on this experimental database, a comprehensive analysis methodology was developed to predict the onset of widespread fatigue damage in lap joints of fuselage structure. Several computer codes were developed with specialized capabilities to conduct the various analyses that make up the comprehensive methodology. Over the past several years, the authors have interrogated various aspects of the analysis methods to determine the degree of computational rigor required to produce numerical predictions with acceptable engineering accuracy. This study led to the formulation of a practical engineering approach to predicting fatigue crack growth in riveted lap joints. This paper describes the practical engineering approach and compares predictions with the results from several experimental studies.

  8. A Practical Engineering Approach to Predicting Fatigue Crack Growth in Riveted Lap Joints

    NASA Technical Reports Server (NTRS)

    Harris, C. E.; Piascik, R. S.; Newman, J. C., Jr.

    2000-01-01

    An extensive experimental database has been assembled from very detailed teardown examinations of fatigue cracks found in rivet holes of fuselage structural components. Based on this experimental database, a comprehensive analysis methodology was developed to predict the onset of widespread fatigue damage in lap joints of fuselage structure. Several computer codes were developed with specialized capabilities to conduct the various analyses that make up the comprehensive methodology. Over the past several years, the authors have interrogated various aspects of the analysis methods to determine the degree of computational rigor required to produce numerical predictions with acceptable engineering accuracy. This study led to the formulation of a practical engineering approach to predicting fatigue crack growth in riveted lap joints. This paper describes the practical engineering approach and compares predictions with the results from several experimental studies.

  9. Numerical modelling in friction lap joining of aluminium alloy and carbon-fiber-reinforced-plastic sheets

    NASA Astrophysics Data System (ADS)

    Das, A.; Bang, H. S.; Bang, H. S.

    2018-05-01

    Multi-material combinations of aluminium alloy and carbon-fiber-reinforced-plastics (CFRP) have gained attention in the automotive and aerospace industries to enhance fuel efficiency and the strength-to-weight ratio of components. Various limitations of laser beam welding, adhesive bonding and mechanical fasteners make these processes inefficient for joining metal and CFRP sheets. Friction lap joining is an alternative choice for the same. Comprehensive studies of friction lap joining of aluminium to CFRP sheets are essential and scarce in the literature. The present work reports a combined theoretical and experimental study of the joining of AA5052 and CFRP sheets using the friction lap joining process. A three-dimensional finite element based heat transfer model is developed to compute the temperature fields and thermal cycles. The computed results are validated extensively against the corresponding experimentally measured results.

  10. Worm epidemics in wireless ad hoc networks

    NASA Astrophysics Data System (ADS)

    Nekovee, Maziar

    2007-06-01

    A dramatic increase in the number of computing devices with wireless communication capability has resulted in the emergence of a new class of computer worms which specifically target such devices. The most striking feature of these worms is that they do not require Internet connectivity for their propagation but can spread directly from device to device using a short-range radio communication technology, such as WiFi or Bluetooth. In this paper, we develop a new model for epidemic spreading of these worms and investigate their spreading in wireless ad hoc networks via extensive Monte Carlo simulations. Our studies show that the threshold behaviour and dynamics of worm epidemics in these networks are greatly affected by a combination of spatial and temporal correlations which characterize these networks, and are significantly different from the previously studied epidemics in the Internet.
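
As a toy illustration of the kind of spatial Monte Carlo simulation described, the sketch below spreads an SI-type infection over a random geometric graph of wireless devices. It is only a minimal caricature under stated assumptions (uniform placement in the unit square, fixed radio range, synchronous updates), not the paper's model, which also accounts for temporal correlations:

```python
import math
import random

def si_outbreak_size(n, radius, p_transmit, steps, seed=0):
    """Monte Carlo SI worm spread on a random geometric graph.

    n devices are placed uniformly in the unit square; two devices are
    neighbours if within the given radio radius.  Starting from one
    infected device, each step every infected device infects each
    susceptible neighbour with probability p_transmit.
    Returns the final number of infected devices.
    """
    rng = random.Random(seed)
    pos = [(rng.random(), rng.random()) for _ in range(n)]
    # Adjacency: devices within radio range of each other.
    nbrs = [[j for j in range(n)
             if j != i and math.dist(pos[i], pos[j]) <= radius]
            for i in range(n)]
    infected = {0}
    for _ in range(steps):
        new = set()
        for i in infected:
            for j in nbrs[i]:
                if j not in infected and rng.random() < p_transmit:
                    new.add(j)
        infected |= new
    return len(infected)
```

Sweeping the radius (i.e. the network density) in such a toy model exhibits the threshold behaviour the abstract refers to: below a critical connectivity the outbreak stays local, above it the worm reaches most of the network.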

  11. Science in the cloud (SIC): A use case in MRI connectomics

    PubMed Central

    Gorgolewski, Krzysztof J.; Kleissas, Dean; Roncal, William Gray; Litt, Brian; Wandell, Brian; Poldrack, Russel A.; Wiener, Martin; Vogelstein, R. Jacob; Burns, Randal

    2017-01-01

    Modern technologies are enabling scientists to collect extraordinary amounts of complex and sophisticated data across a huge range of scales like never before. With this onslaught of data, we can allow the focal point to shift from data collection to data analysis. Unfortunately, lack of standardized sharing mechanisms and practices often make reproducing or extending scientific results very difficult. With the creation of data organization structures and tools that drastically improve code portability, we now have the opportunity to design such a framework for communicating extensible scientific discoveries. Our proposed solution leverages these existing technologies and standards, and provides an accessible and extensible model for reproducible research, called ‘science in the cloud’ (SIC). Exploiting scientific containers, cloud computing, and cloud data services, we show the capability to compute in the cloud and run a web service that enables intimate interaction with the tools and data presented. We hope this model will inspire the community to produce reproducible and, importantly, extensible results that will enable us to collectively accelerate the rate at which scientific breakthroughs are discovered, replicated, and extended. PMID:28327935

  12. Science in the cloud (SIC): A use case in MRI connectomics.

    PubMed

    Kiar, Gregory; Gorgolewski, Krzysztof J; Kleissas, Dean; Roncal, William Gray; Litt, Brian; Wandell, Brian; Poldrack, Russel A; Wiener, Martin; Vogelstein, R Jacob; Burns, Randal; Vogelstein, Joshua T

    2017-05-01

    Modern technologies are enabling scientists to collect extraordinary amounts of complex and sophisticated data across a huge range of scales like never before. With this onslaught of data, we can allow the focal point to shift from data collection to data analysis. Unfortunately, lack of standardized sharing mechanisms and practices often make reproducing or extending scientific results very difficult. With the creation of data organization structures and tools that drastically improve code portability, we now have the opportunity to design such a framework for communicating extensible scientific discoveries. Our proposed solution leverages these existing technologies and standards, and provides an accessible and extensible model for reproducible research, called 'science in the cloud' (SIC). Exploiting scientific containers, cloud computing, and cloud data services, we show the capability to compute in the cloud and run a web service that enables intimate interaction with the tools and data presented. We hope this model will inspire the community to produce reproducible and, importantly, extensible results that will enable us to collectively accelerate the rate at which scientific breakthroughs are discovered, replicated, and extended. © The Author 2017. Published by Oxford University Press.

  13. Heterogeneous concurrent computing with exportable services

    NASA Technical Reports Server (NTRS)

    Sunderam, Vaidy

    1995-01-01

    Heterogeneous concurrent computing, based on the traditional process-oriented model, is approaching its functionality and performance limits. An alternative paradigm, based on the concept of services, supporting data-driven computation, and built on a lightweight process infrastructure, is proposed to enhance the functional capabilities and the operational efficiency of heterogeneous network-based concurrent computing. TPVM is an experimental prototype system supporting exportable services, thread-based computation, and remote memory operations that is built as an extension of and an enhancement to the PVM concurrent computing system. TPVM offers a significantly different computing paradigm for network-based computing, while maintaining a close resemblance to the conventional PVM model in the interest of compatibility and ease of transition. Preliminary experience has demonstrated that the TPVM framework presents a natural yet powerful concurrent programming interface, while being capable of delivering performance improvements of up to thirty percent.

  14. Large-scale molecular dynamics simulation of DNA: implementation and validation of the AMBER98 force field in LAMMPS.

    PubMed

    Grindon, Christina; Harris, Sarah; Evans, Tom; Novik, Keir; Coveney, Peter; Laughton, Charles

    2004-07-15

    Molecular modelling played a central role in the discovery of the structure of DNA by Watson and Crick. Today, such modelling is done on computers: the more powerful these computers are, the more detailed and extensive can be the study of the dynamics of such biological macromolecules. To fully harness the power of modern massively parallel computers, however, we need to develop and deploy algorithms which can exploit the structure of such hardware. The Large-scale Atomic/Molecular Massively Parallel Simulator (LAMMPS) is a scalable molecular dynamics code including long-range Coulomb interactions, which has been specifically designed to function efficiently on parallel platforms. Here we describe the implementation of the AMBER98 force field in LAMMPS and its validation for molecular dynamics investigations of DNA structure and flexibility against the benchmark of results obtained with the long-established code AMBER6 (Assisted Model Building with Energy Refinement, version 6). Extended molecular dynamics simulations on the hydrated DNA dodecamer d(CTTTTGCAAAAG)₂, which has previously been the subject of extensive dynamical analysis using AMBER6, show that it is possible to obtain excellent agreement in terms of static, dynamic and thermodynamic parameters between AMBER6 and LAMMPS. In comparison with AMBER6, LAMMPS shows greatly improved scalability in massively parallel environments, opening up the possibility of efficient simulations of order-of-magnitude larger systems and/or for order-of-magnitude greater simulation times.

  15. The COMET Sleep Research Platform.

    PubMed

    Nichols, Deborah A; DeSalvo, Steven; Miller, Richard A; Jónsson, Darrell; Griffin, Kara S; Hyde, Pamela R; Walsh, James K; Kushida, Clete A

    2014-01-01

    The Comparative Outcomes Management with Electronic Data Technology (COMET) platform is extensible and designed for facilitating multicenter electronic clinical research. Our research goals were the following: (1) to conduct a comparative effectiveness trial (CET) for two obstructive sleep apnea treatments-positive airway pressure versus oral appliance therapy; and (2) to establish a new electronic network infrastructure that would support this study and other clinical research studies. The COMET platform was created to satisfy the needs of CET with a focus on creating a platform that provides comprehensive toolsets, multisite collaboration, and end-to-end data management. The platform also provides medical researchers the ability to visualize and interpret data using business intelligence (BI) tools. COMET is a research platform that is scalable and extensible, and which, in a future version, can accommodate big data sets and enable efficient and effective research across multiple studies and medical specialties. The COMET platform components were designed for an eventual move to a cloud computing infrastructure that enhances sustainability, overall cost effectiveness, and return on investment.

  16. Efficient 3D kinetic Monte Carlo method for modeling of molecular structure and dynamics.

    PubMed

    Panshenskov, Mikhail; Solov'yov, Ilia A; Solov'yov, Andrey V

    2014-06-30

    Self-assembly of molecular systems is an important and general problem that intertwines physics, chemistry, biology, and materials science. Through understanding of the physical principles of self-organization, it often becomes feasible to control the process and to obtain complex structures with tailored properties, for example, bacterial colonies or nanodevices with desired properties. Theoretical studies and simulations provide an important tool for unraveling the principles of self-organization and have therefore recently gained increasing interest. The present article features an extension of the popular code MBN EXPLORER (MesoBioNano Explorer) aiming to provide a universal approach to studying self-assembly phenomena in biology and nanoscience. In particular, this extension involves a highly parallelized module of MBN EXPLORER that allows simulating stochastic processes using the kinetic Monte Carlo approach in three-dimensional space. We describe the computational side of the developed code, discuss its efficiency, and apply it to studying an exemplary system. Copyright © 2014 Wiley Periodicals, Inc.
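
The core of a kinetic Monte Carlo scheme of the kind this module parallelizes can be sketched in a few lines: pick an event with probability proportional to its rate, then advance the clock by an exponentially distributed waiting time. This is a generic Gillespie-style step, not MBN EXPLORER's actual implementation:

```python
import math
import random

def kmc_step(rates, t, rng):
    """One kinetic Monte Carlo (Gillespie) step.

    rates: list of rate constants for the currently possible events.
    Returns (chosen event index, updated simulation time).
    """
    total = sum(rates)
    # Select an event with probability rate_i / total.
    r = rng.random() * total
    cum, event = 0.0, 0
    for event, k in enumerate(rates):
        cum += k
        if r < cum:
            break
    # Exponential waiting time with mean 1/total; the 1 - random()
    # keeps the log argument in (0, 1].
    dt = -math.log(1.0 - rng.random()) / total
    return event, t + dt
```

Over many steps, event i is chosen with frequency rate_i / total and the mean time increment is 1 / total, which is what makes the stochastic trajectory physically meaningful.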

  17. The COMET Sleep Research Platform

    PubMed Central

    Nichols, Deborah A.; DeSalvo, Steven; Miller, Richard A.; Jónsson, Darrell; Griffin, Kara S.; Hyde, Pamela R.; Walsh, James K.; Kushida, Clete A.

    2014-01-01

    Introduction: The Comparative Outcomes Management with Electronic Data Technology (COMET) platform is extensible and designed for facilitating multicenter electronic clinical research. Background: Our research goals were the following: (1) to conduct a comparative effectiveness trial (CET) for two obstructive sleep apnea treatments—positive airway pressure versus oral appliance therapy; and (2) to establish a new electronic network infrastructure that would support this study and other clinical research studies. Discussion: The COMET platform was created to satisfy the needs of CET with a focus on creating a platform that provides comprehensive toolsets, multisite collaboration, and end-to-end data management. The platform also provides medical researchers the ability to visualize and interpret data using business intelligence (BI) tools. Conclusion: COMET is a research platform that is scalable and extensible, and which, in a future version, can accommodate big data sets and enable efficient and effective research across multiple studies and medical specialties. The COMET platform components were designed for an eventual move to a cloud computing infrastructure that enhances sustainability, overall cost effectiveness, and return on investment. PMID:25848590

  18. Hints for an extension of the early exercise premium formula for American options

    NASA Astrophysics Data System (ADS)

    Bermin, Hans-Peter; Kohatsu-Higa, Arturo; Perelló, Josep

    2005-09-01

    No closed-form formula exists for the American put option price, and non-trivial computations are required to evaluate it. Considerable effort has gone into proposing efficient numerical techniques, but few of them are backed by rigorous mathematical reasoning explaining why they work well. We present an extension of the American put price aimed at exposing weaknesses of the numerical methods that stem from their failure to satisfy the smooth-pasting condition.
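    For reference, the standard numerical baseline the abstract alludes to can be sketched with a Cox-Ross-Rubinstein binomial tree (a textbook method, not the authors' formula): at each node the option value is the maximum of immediate exercise and the discounted continuation value.

```python
import math

def american_put_binomial(S0, K, r, sigma, T, n=500):
    """Price an American put on a CRR binomial tree by backward induction."""
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt))
    d = 1.0 / u
    p = (math.exp(r * dt) - d) / (u - d)   # risk-neutral up probability
    disc = math.exp(-r * dt)
    # Payoffs at maturity for terminal prices S0 * u**j * d**(n - j).
    values = [max(K - S0 * u**j * d**(n - j), 0.0) for j in range(n + 1)]
    # Step backward, checking the early-exercise condition at every node.
    for i in range(n - 1, -1, -1):
        for j in range(i + 1):
            cont = disc * (p * values[j + 1] + (1 - p) * values[j])
            exercise = K - S0 * u**j * d**(i - j)
            values[j] = max(cont, exercise)
    return values[0]

price = american_put_binomial(100, 100, 0.05, 0.2, 1.0)
```

    The early-exercise check `max(cont, exercise)` is exactly where such discrete schemes only approximate the smooth-pasting condition at the exercise boundary, which is the weakness the proposed extension is designed to probe.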

  19. BridgeUP: STEM. Creating Opportunities for Women through Tiered Mentorship

    NASA Astrophysics Data System (ADS)

    Secunda, Amy; Cornelis, Juliette; Ferreira, Denelis; Gomez, Anay; Khan, Ariba; Li, Anna; Soo, Audrey; Mac Low, Mordecai

    2018-01-01

    BridgeUP: STEM is an ambitious and exciting initiative responding to the extensive gender and opportunity gaps that exist in the STEM pipeline for women, girls, and under-resourced youth. BridgeUP: STEM has developed a distinct identity in the landscape of computer science education by embedding programming in the context of scientific research. One of the ways in which this is accomplished is through a tiered mentorship program. Five Helen Fellows are chosen from a pool of female, postbaccalaureate applicants to be mentored by researchers at the American Museum of Natural History in a computational research project. The Helen Fellows then act as mentors to six high school women (Brown Scholars), guiding them through a computational project aligned with their own research. This year, three of the Helen Fellows, and by extension, eighteen Brown Scholars, are performing computational astrophysics research. This poster presents one example of a tiered mentorship team working on modeling the migration of stellar-mass black holes (BH) in active galactic nucleus (AGN) disks. Drawing an analogy from the well-studied migration and formation of planets in protoplanetary disks to the newer field of migration and formation of binary BH in AGN disks, the Helen Fellow is working with her mentors to adapt an N-body code, incorporating migration torques, from the protoplanetary disk case to the AGN disk case to model how binary BH form. The goal is to better understand and make predictions for gravitational wave observations from the Laser Interferometer Gravitational-Wave Observatory (LIGO). The Brown Scholars then run the Helen Fellow's code on a variety of initial stellar-mass BH population distributions that they generate using Python, and produce visualizations of the output to be used in a published paper. Over the course of the project, students develop a basic understanding of the physics related to their project and strengthen their practical computational skills.
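    The population-generation step described for the Brown Scholars can be sketched as follows. The power-law slope, mass bounds, and uniform-in-area radial placement below are illustrative assumptions, not values from the poster:

```python
import math
import random

def sample_bh_population(n, m_min=5.0, m_max=50.0, slope=-2.35,
                         r_max=1.0, rng=None):
    """Draw a toy initial population of stellar-mass black holes.

    Masses follow a truncated power law dN/dm ~ m**slope, sampled by
    inverse-transform sampling; radii assume a uniform surface density,
    so P(r) ~ r. All parameter values here are illustrative.
    """
    rng = rng or random.Random()
    a = slope + 1.0                      # exponent of the power-law CDF
    lo, hi = m_min**a, m_max**a
    masses = [(lo + rng.random() * (hi - lo)) ** (1.0 / a) for _ in range(n)]
    # Uniform in disk area: r = r_max * sqrt(u) for u uniform on [0, 1).
    radii = [r_max * math.sqrt(rng.random()) for _ in range(n)]
    return masses, radii

masses, radii = sample_bh_population(500, rng=random.Random(3))
```

    Such a generator makes it easy to rerun the same N-body setup over many hypothetical initial populations and compare the resulting binary formation statistics.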

  20. Integrated Application of Active Controls (IAAC) technology to an advanced subsonic transport project: Current and advanced ACT control system definition study. Volume 2: Appendices

    NASA Technical Reports Server (NTRS)

    Hanks, G. W.; Shomber, H. A.; Dethman, H. A.; Gratzer, L. B.; Maeshiro, A.; Gangsaas, D.; Blight, J. D.; Buchan, S. M.; Crumb, C. B.; Dorwart, R. J.

    1981-01-01

    The current status of the Active Controls Technology (ACT) for the advanced subsonic transport project is investigated through analysis of the systems technical data. Control systems technologies under examination include computerized reliability analysis, a pitch-axis fly-by-wire actuator, a flaperon actuation system design trade study, control law synthesis and analysis, flutter-mode control and gust-load alleviation analysis, and implementation of alternative ACT systems. Extensive analysis of the computer techniques involved in each system is included.
