78 FR 56871 - Advanced Scientific Computing Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-16
... Germantown Update on Exascale Update from Exascale technical approaches subcommittee Facilities update Report from Applied Math Committee of Visitors Exascale technical talks Public Comment (10-minute rule) Public...
Scientific and technical information output of the Langley Research Center
NASA Technical Reports Server (NTRS)
1984-01-01
Scientific and technical information that the Langley Research Center produced during the calendar year 1983 is compiled. Included are citations for Formal Reports, Quick-Release Technical Memorandums, Contractor Reports, Journal Articles and other Publications, Meeting Presentations, Technical Talks, Computer Programs, Tech Briefs, and Patents.
Scientific and technical information output of the Langley Research Center for calendar year 1982
NASA Technical Reports Server (NTRS)
1983-01-01
A total of 1380 citations are presented, comprising formal reports, quick-release technical memorandums, contractor reports, journal articles and periodical literature, technical talks and meeting presentations, computer programs, tech briefs, and patents produced during 1982. An author index is provided.
Scientific and technical information output of the Langley Research Center for Calendar Year 1985
NASA Technical Reports Server (NTRS)
1986-01-01
A compilation of the scientific and technical information that the Langley Research Center has produced during the calendar year 1985 is presented. Included are citations for Formal Reports, Quick-Release Technical Memorandums, Contractor Reports, Journal Articles and Other Publications, Meeting Presentations, Technical Talks, Computer Programs, Tech Briefs, and Patents.
Scientific and technical information output of the Langley Research Center for calendar year 1984
NASA Technical Reports Server (NTRS)
1985-01-01
The scientific and technical information that the Langley Research Center produced during the calendar year 1984 is compiled. Approximately 1650 citations are included comprising formal reports, quick-release technical memorandums, contractor reports, journal articles and other publications, meeting presentations, technical talks, computer programs, tech briefs, and patents.
NASA Langley Scientific and Technical Information Output: 1996
NASA Technical Reports Server (NTRS)
Stewart, Susan H. (Compiler); Phillips, Marilou S. (Compiler)
1997-01-01
This document is a compilation of the scientific and technical information that the Langley Research Center has produced during the calendar year 1996. Included are citations for Formal Reports, High-Numbered Conference Publications, High-Numbered Technical Memorandums, Contractor Reports, Journal Articles and Other Publications, Meeting Presentations, Technical Talks, Computer Programs, Tech Briefs, and Patents.
77 FR 62231 - DOE/Advanced Scientific Computing Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-12
.... Facilities update. ESnet-5. Early Career technical talks. Co-design. Innovative and Novel Computational Impact on Theory and Experiment (INCITE). Public Comment (10-minute rule). Public Participation: The...
NASA Langley Scientific and Technical Information Output, 1995. Volume 1
NASA Technical Reports Server (NTRS)
Stewart, Susan H. (Compiler); Phillips, Marilou S. (Compiler)
1996-01-01
This document is a compilation of the scientific and technical information that the Langley Research Center has produced during the calendar year 1995. Included are citations for formal reports, high-numbered conference publications, high-numbered technical memorandums, contractor reports, journal articles and other publications, meeting presentations, technical talks, computer programs, tech briefs, and patents.
77 FR 12823 - Advanced Scientific Computing Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-02
... Exascale ARRA projects--Magellan final report, Advanced Networking update Status from Computer Science COV Early Career technical talks Summary of Applied Math and Computer Science Workshops ASCR's new SBIR..., Office of Science. ACTION: Notice of Open Meeting. SUMMARY: This notice announces a meeting of the...
Scientific and technical information output of the Langley Research Center for calendar year 1986
NASA Technical Reports Server (NTRS)
1987-01-01
This document is a compilation of the scientific and technical information that the Langley Research Center has produced during the calendar year 1986. Included are citations for Formal Reports, Quick-Release Technical Memorandums, Contractor Reports, Journal Articles and Other Publications, Meeting Presentations, Technical Talks, Computer Programs, Tech Briefs, and Patents.
NASA Astrophysics Data System (ADS)
Frenkel, Daan
2007-03-01
During the past decade there has been a unique synergy between theory, experiment and simulation in Soft Matter Physics. In colloid science, computer simulations that started out as studies of highly simplified model systems, have acquired direct experimental relevance because experimental realizations of these simple models can now be synthesized. Whilst many numerical predictions concerning the phase behavior of colloidal systems have been vindicated by experiments, the jury is still out on others. In my talk I will discuss some of the recent technical developments, new findings and open questions in computational soft-matter science.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bailey, David H.
In a previous humorous note entitled 'Twelve Ways to Fool the Masses,' I outlined twelve common ways in which performance figures for technical computer systems can be distorted. In this paper and accompanying conference talk, I give a reprise of these twelve 'methods' and give some actual examples that have appeared in peer-reviewed literature in years past. I then propose guidelines for reporting performance, the adoption of which would raise the level of professionalism and reduce the level of confusion, not only in the world of device simulation but also in the larger arena of technical computing.
1986-09-01
COMPUTER-AIDED SYSTEM NEEDS FOR THE TECHNICAL DESIGN SECTION OF THE BASE LEVEL CIVIL ENGINEERING SQUADRON. Thesis, James P. Mitnik, First Lieutenant, USAF, AFIT/GEM/DEM/86S-19. Approved for public release; distribution unlimited.
Nanotechnology Review: Molecular Electronics to Molecular Motors
NASA Technical Reports Server (NTRS)
Srivastava, Deepak; Saini, Subhash (Technical Monitor)
1998-01-01
Reviewing the status of current approaches and future projections, as already published in scientific journals and books, the talk will summarize the direction in which computational and experimental nanotechnologies are progressing. Examples of nanotechnological approaches to the concepts of design and simulation of carbon nanotube based molecular electronic and mechanical devices will be presented. The concepts of nanotube based gears and motors will be discussed. The above is a non-technical review talk which covers long term precompetitive basic research in already published material that has been presented before many US scientific meeting audiences.
What Physicists Should Know About High Performance Computing - Circa 2002
NASA Astrophysics Data System (ADS)
Frederick, Donald
2002-08-01
High Performance Computing (HPC) is a dynamic, cross-disciplinary field that traditionally has involved applied mathematicians, computer scientists, and others primarily from the various disciplines that have been major users of HPC resources - physics, chemistry, engineering, with increasing use by those in the life sciences. There is a technological dynamic that is powered by economic as well as by technical innovations and developments. This talk will discuss practical ideas to be considered when developing numerical applications for research purposes. Even with the rapid pace of development in the field, the author believes that these concepts will not become obsolete for a while, and will be of use to scientists who either are considering, or who have already started down the HPC path. These principles will be applied in particular to current parallel HPC systems, but there will also be references of value to desktop users. The talk will cover such topics as: computing hardware basics, single-cpu optimization, compilers, timing, numerical libraries, debugging and profiling tools and the emergence of Computational Grids.
Magill, Molly; Apodaca, Timothy R.; Borsari, Brian; Gaume, Jacques; Hoadley, Ariel; Gordon, Rebecca E.F.; Tonigan, J. Scott; Moyers, Theresa
2018-01-01
Objective In the present meta-analysis, we test the technical and relational hypotheses of Motivational Interviewing (MI) efficacy. We also propose an a priori conditional process model where heterogeneity of technical path effect sizes should be explained by interpersonal/relational (i.e., empathy, MI Spirit) and intrapersonal (i.e., client treatment seeking status) moderators. Method A systematic review identified k = 58 reports, describing 36 primary studies and 40 effect sizes (N = 3025 participants). Statistical methods calculated the inverse variance-weighted pooled correlation coefficient for the therapist to client and the client to outcome paths across multiple target behaviors (i.e., alcohol use, other drug use, other behavior change). Results Therapist MI-consistent skills were correlated with more client change talk (r = .55, p < .001) as well as more sustain talk (r = .40, p < .001). MI-inconsistent skills were correlated with more sustain talk (r = .16, p < .001), but not change talk. When these indicators were combined into proportions, as recommended in the Motivational Interviewing Skill Code, the overall technical hypothesis was supported. Specifically, proportion MI consistency was related to higher proportion change talk (r = .11, p = .004) and higher proportion change talk was related to reductions in risk behavior at follow up (r = −.16, p < .001). When tested as two independent effects, client change talk was not significant, but sustain talk was positively associated with worse outcome (r = .19, p < .001). Finally, the relational hypothesis was not supported, but heterogeneity in technical hypothesis path effect sizes was partially explained by inter- and intra-personal moderators. Conclusions This meta-analysis provides additional support for the technical hypothesis of MI efficacy; future research on the relational hypothesis should occur in the field rather than in the context of clinical trials. PMID:29265832
Magill, Molly; Apodaca, Timothy R; Borsari, Brian; Gaume, Jacques; Hoadley, Ariel; Gordon, Rebecca E F; Tonigan, J Scott; Moyers, Theresa
2018-02-01
In the present meta-analysis, we test the technical and relational hypotheses of Motivational Interviewing (MI) efficacy. We also propose an a priori conditional process model where heterogeneity of technical path effect sizes should be explained by interpersonal/relational (i.e., empathy, MI Spirit) and intrapersonal (i.e., client treatment seeking status) moderators. A systematic review identified k = 58 reports, describing 36 primary studies and 40 effect sizes (N = 3,025 participants). Statistical methods calculated the inverse variance-weighted pooled correlation coefficient for the therapist to client and the client to outcome paths across multiple target behaviors (i.e., alcohol use, other drug use, other behavior change). Therapist MI-consistent skills were correlated with more client change talk (r = .55, p < .001) as well as more sustain talk (r = .40, p < .001). MI-inconsistent skills were correlated with more sustain talk (r = .16, p < .001), but not change talk. When these indicators were combined into proportions, as recommended in the Motivational Interviewing Skill Code, the overall technical hypothesis was supported. Specifically, proportion MI consistency was related to higher proportion change talk (r = .11, p = .004) and higher proportion change talk was related to reductions in risk behavior at follow up (r = -.16, p < .001). When tested as two independent effects, client change talk was not significant, but sustain talk was positively associated with worse outcome (r = .19, p < .001). Finally, the relational hypothesis was not supported, but heterogeneity in technical hypothesis path effect sizes was partially explained by inter- and intrapersonal moderators. This meta-analysis provides additional support for the technical hypothesis of MI efficacy; future research on the relational hypothesis should occur in the field rather than in the context of clinical trials. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Navy Manpower Planning and Programming: Basis for Systems Examination
1974-10-01
Organizational excerpt listing offices of the Chief of Naval Operations and Naval Material Command headquarters, with branches including the Compensation Branch, Manpower Programming Branch, Journal/Trade Talk Branch, and the Assistant for Computer Sciences, Systems Development Branch. Distribution includes the Assistant Director, Life Sciences, Air Force Office of Scientific Research, and the Technical Library, Air Force Human Resources Laboratory, Lackland Air Force Base.
Guidelines for the planning and preparation of illustrated technical talks
NASA Technical Reports Server (NTRS)
Hubbard, H. H.
1975-01-01
Guidelines are presented for the preparation of illustrated talks which are audience oriented and which are aimed at the efficient transfer of technical information. Early decisions concerning the required number of slides are helpful in initial planning for a good quality talk. Detailed considerations are: the establishment of limited objectives, selection of appropriate slide material, development of a text which is well coordinated with the slides, and accurate timing.
Risk assessments for mixtures: technical methods commonly used in the United States
A brief (20 minute) talk on the technical approaches used by EPA and other US agencies to assess risks posed by combined exposures to one or more chemicals. The talk systemically reviews the methodologies (whole-mixtures and component-based approaches) that are or have been used ...
2004-02-03
KENNEDY SPACE CENTER, FLA. - Astronaut Tim Kopra talks to a technician (off-camera) during Intravehicular Activity (IVA) constraints testing on the Italian-built Node 2, a future element of the International Space Station. The second of three Station connecting modules, the Node 2 attaches to the end of the U.S. Lab and provides attach locations for several other elements. Kopra is currently assigned technical duties in the Space Station Branch of the Astronaut Office, where his primary focus involves the testing of crew interfaces for two future ISS modules as well as the implementation of support computers and operational Local Area Network on ISS. Node 2 is scheduled to launch on mission STS-120, Station assembly flight 10A.
A History of the Andrew File System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brashear, Derrick
2011-02-22
Derrick Brashear and Jeffrey Altman will present a technical history of the evolution of Andrew File System starting with the early days of the Andrew Project at Carnegie Mellon through the commercialization by Transarc Corporation and IBM and a decade of OpenAFS. The talk will be technical with a focus on the various decisions and implementation trade-offs that were made over the course of AFS versions 1 through 4, the development of the Distributed Computing Environment Distributed File System (DCE DFS), and the course of the OpenAFS development community. The speakers will also discuss the various AFS branches developed at the University of Michigan, Massachusetts Institute of Technology and Carnegie Mellon University.
Proceedings of 1981 Western Region Technical Symposium on Electronic Warfare
1981-01-01
Includes "SEEK TALK - A Jam-Resistant Tactical Communication System." Topics cover EW analysis support to commands and other Department of Defense agencies, data base development, computer combat modelling, and communications countermeasures (CCM) exercises involving base radios and surface-to-air missile communications nets.
2004-02-03
KENNEDY SPACE CENTER, FLA. - Astronaut Tim Kopra (second from right) talks with workers in the Space Station Processing Facility about the Intravehicular Activity (IVA) constraints testing on the Italian-built Node 2, a future element of the International Space Station. The second of three Station connecting modules, the Node 2 attaches to the end of the U.S. Lab and provides attach locations for several other elements. Kopra is currently assigned technical duties in the Space Station Branch of the Astronaut Office, where his primary focus involves the testing of crew interfaces for two future ISS modules as well as the implementation of support computers and operational Local Area Network on ISS. Node 2 is scheduled to launch on mission STS-120, Station assembly flight 10A.
NASA Langley Scientific and Technical Information Output: 1998
NASA Technical Reports Server (NTRS)
Machie, Harriet B. (Compiler); Stewart, Susan H. (Compiler)
1999-01-01
This document is a compilation of the scientific and technical information that the Langley Research Center has produced during the calendar year 1998. Included are citations for Technical Publications, Conference Publications, Technical Memorandums, Contractor Reports, Journal Articles and Book Publications, Meeting Presentations, Technical Talks, and Patents.
NASA Langley Scientific and Technical Information Output-2001
NASA Technical Reports Server (NTRS)
Stewart, Susan H. (Compiler)
2002-01-01
This document is a compilation of the scientific and technical information that the Langley Research Center has produced during the 2001 calendar year. Included are citations for Technical Publications, Conference Publications, Technical Memorandums, Contractor Reports, Journal Articles and Book Publications, Meeting Presentations, Technical Talks, and Patents.
NASA Langley Scientific and Technical Information Output-2002
NASA Technical Reports Server (NTRS)
Stewart, Susan H. (Compiler)
2003-01-01
This document is a compilation of the scientific and technical information that the Langley Research Center has produced during the calendar year 2002. Included are citations for Technical Publications, Conference Publications, Technical Memorandums, Contractor Reports, Journal Articles and Book Publications, Meeting Presentations, Technical Talks, and Patents.
NASA Langley Scientific and Technical Information Output 2000
NASA Technical Reports Server (NTRS)
Machie, Harriet B. (Compiler); Stewart, Susan H. (Compiler)
2001-01-01
This document is a compilation of the scientific and technical information that the Langley Research Center has produced during the calendar year 2000. Included are citations for Special Publications, Technical Publications, Conference Publications, Technical Memorandums, Contractor Reports, Journal Articles and Book Publications, Meeting Presentations, Technical Talks, Tech Briefs, and Patents.
NASA Langley Scientific and Technical Information Output-2003
NASA Technical Reports Server (NTRS)
Stewart, Susan H. (Compiler)
2004-01-01
This document is a compilation of the scientific and technical information that the Langley Research Center has produced during the calendar year 2003. Included are citations for Special Publications, Technical Publications, Conference Publications, Technical Memorandums, Contractor Reports, Journal Articles and Book Publications, Meeting Presentations, Technical Talks, and Patents.
NASA Langley Scientific and Technical Information Output: 1999
NASA Technical Reports Server (NTRS)
Stewart, Susan H. (Compiler); Machie, Harriet (Compiler)
2000-01-01
This document is a compilation of the scientific and technical information that the Langley Research Center has produced during the calendar year 1999. Included are citations for Special Publications, Technical Publications, Conference Publications, Technical Memorandums, Contractor Reports, Journal Articles and Book Publications, Meeting Presentations, Technical Talks, Tech Briefs, and Patents.
Aircraft Interior Noise Control Using Distributed Piezoelectric Actuators
NASA Technical Reports Server (NTRS)
Sun, Jian Q.
1996-01-01
Developing a control system that can reduce the noise and structural vibration at the same time is an important task. This talk presents one possible technical approach for accomplishing this task. The target application of the research is for aircraft interior noise control. The emphasis of the present approach is not on control strategies, but rather on the design of actuators for the control system. In the talk, a theory of distributed piezoelectric actuators is introduced. A uniform cylindrical shell is taken as a simplified model of fuselage structures to illustrate the effectiveness of the design theory. The actuators developed are such that they can reduce the tonal structural vibration and interior noise in a wide range of frequencies. Extensive computer simulations have been done to study various aspects of the design theory. Experiments have also been conducted and the test results strongly support the theoretical development.
2017-02-01
Ames Women's Influence Network (WIN) Hidden Figures talk with "Computers" Carolyn Hofstetter and Carol Mead, co-sponsored by the AAAG. Left to right: Barbara Miller, Ames EEO, and Computers Carolyn Hofstetter and Carol Mead; talking to Carolyn Hofstetter is Arlene Spencer.
The Quality of Talk in Children's Joint Activity at the Computer.
ERIC Educational Resources Information Center
Mercer, Neil
1994-01-01
Describes findings of the Spoken Language and New Technology (SLANT) research project which studied the talk of primary school children in the United Kingdom who were working in small groups at computers with various kinds of software. Improvements in the quality of talk and collaboration during computer-based activities are suggested. (Contains…
2004-08-01
The fifth talk summarizes key aspects of XML (Extensible Markup Language), Web Services and their components, and B2B/B2C aspects of those in a technical and economic snapshot. Talk number six discusses the trade-off between quality and cost. Related table-of-contents entries include "How the Internet is Run: A Worldwide Perspective" (Christoph Pauls) and "XML, Web Services and B2C/B2B: A Technical and Economical Snapshot" (Matthias Pitt).
DYAD: A Computer Program for the Analysis of Interpersonal Communication
ERIC Educational Resources Information Center
Fogel, Daniel S.
1978-01-01
A computer program which generates descriptions of conversational patterns of dyads based on sound-silence data is described. Input consists of talk/no-talk designations; output consists of descriptive matrices, histograms, and individual talk parameters. (Author/JKS)
NASA Langley Scientific and Technical Information Output: 1997
NASA Technical Reports Server (NTRS)
Stewart, Susan H. (Compiler); Machie, Harriet B. (Compiler)
1998-01-01
This document is a compilation of the scientific and technical information that the Langley Research Center has produced during the calendar year 1997. Included are citations for Formal Reports, Conference Publications, High-Numbered Technical Memorandums, Contractor Reports, Journal Articles and Book Publications, Meeting Presentations, Technical Talks, and Patents.
ERIC Educational Resources Information Center
Skovholt, Karianne
2018-01-01
This article reports a case study on classroom interaction in teacher education in Norway. It addresses how teacher students in the school subject Norwegian constitute scientific talk in a student-led discussion. First, the analysis reveals tension in the classroom conversation between "mundane talk"--that is, where students make claims…
Talking to, Talking about, Talking with: Language Arts Students in Conversation with Poetic Texts
ERIC Educational Resources Information Center
Emert, Toby
2010-01-01
English teachers share the blame for the lack of imaginative responses from students to the texts they bring to students, given their penchant for focusing on the most technical elements of literature rather than on its emotional resonance. In classrooms, teachers often concentrate too heavily on what Janet Allen calls the "products" of their…
ERIC Educational Resources Information Center
Zhang, Yuejiao
2013-01-01
This article examines the influential Chinese science book "Brush Talks from Dream Brook," written by Shen Kuo in the 11th century. I suggest that "Brush Talks" reveals a tension between institutionalized science and science in the public, and a gap between the making of scientific knowledge and the communication of such…
Soft Skills: The New Curriculum for Hard-Core Technical Professionals
ERIC Educational Resources Information Center
Bancino, Randy; Zevalkink, Claire
2007-01-01
In this article, the authors talk about the importance of soft skills for hard-core technical professionals. In many technical professions, the complete focus of education and training is on technical topics either directly or indirectly related to a career or discipline. Students are generally required to master various mathematics skills,…
Lindqvist, Helena; Forsberg, Lars; Enebrink, Pia; Andersson, Gerhard; Rosendahl, Ingvar
2017-06-01
The technical component of Motivational Interviewing (MI) posits that client language mediates the relationship between counselor techniques and subsequent client behavioral outcomes. The purpose of this study was to examine this hypothesized technical component of MI in smoking cessation treatment in more depth. Secondary analysis of 106 first treatment sessions, derived from the Swedish National Tobacco Quitline, and previously rated using the Motivational Interviewing Sequential Code for Observing Process Exchanges (MI-SCOPE) Coder's Manual and the Motivational Interviewing Treatment Integrity code (MITI) Manual, version 3.1. The outcome measure was self-reported 6-month continuous abstinence at 12-month follow-up. Sequential analyses indicated that clients were significantly more likely than expected by chance to argue for change (change talk) following MI-consistent behaviors and questions and reflections favoring change. Conversely, clients were more likely to argue against change (sustain talk) following questions and reflections favoring status-quo. Parallel mediation analysis revealed that a counselor technique (reflections of client sustain talk) had an indirect effect on smoking outcome at follow-up through client language mediators. The study makes a significant contribution to our understanding of how MI works in smoking cessation treatment and adds further empirical support for the hypothesized technical component in MI. The results emphasize the importance of counselors avoiding unintentional reinforcement of sustain talk and underline the need for a greater emphasis on the direction of questions and reflections in MI trainings and fidelity measures. Copyright © 2017 Elsevier Inc. All rights reserved.
2017-02-01
Ames Women's Influence Network (WIN) Hidden Figures talk with "Computers" Carolyn Hofstetter and Carol Mead, co-sponsored by the AAAG. Jack Boyd talks of working in the same 6ft w.t. group as Carol Mead.
COMPUTATIONAL TOXICOLOGY-WHERE IS THE DATA? ...
This talk will briefly describe the state of the data world for computational toxicology and one approach to improve the situation, called ACToR (Aggregated Computational Toxicology Resource).
Introduction to metrology career day at St. Charles North High School. A Fermilab metrology technical specialist visited St. Charles North High School to talk with students about metrology, agencies, and professional associations.
The Future of the Andrew File System
Brashear, Derrick; Altman, Jeffrey
2018-05-25
The talk will discuss the ten operational capabilities that have made AFS unique in the distributed file system space and how these capabilities are being expanded upon to meet the needs of the 21st century. Derrick Brashear and Jeffrey Altman will present a technical road map of new features and technical innovations that are under development by the OpenAFS community and Your File System, Inc., funded by a U.S. Department of Energy Small Business Innovative Research grant. The talk will end with a comparison of AFS to its modern-day competitors.
The Future of the Andrew File System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brashear, Derrick; Altman, Jeffrey
2011-02-23
The talk will discuss the ten operational capabilities that have made AFS unique in the distributed file system space and how these capabilities are being expanded upon to meet the needs of the 21st century. Derrick Brashear and Jeffrey Altman will present a technical road map of new features and technical innovations that are under development by the OpenAFS community and Your File System, Inc., funded by a U.S. Department of Energy Small Business Innovative Research grant. The talk will end with a comparison of AFS to its modern-day competitors.
Development of a Computer-Based Measure of Listening Comprehension of Science Talk
ERIC Educational Resources Information Center
Lin, Sheau-Wen; Liu, Yu; Chen, Shin-Feng; Wang, Jing-Ru; Kao, Huey-Lien
2015-01-01
The purpose of this study was to develop a computer-based assessment for elementary school students' listening comprehension of science talk within an inquiry-oriented environment. The development procedure had 3 steps: a literature review to define the framework of the test, collecting and identifying key constructs of science talk, and…
Uses of Computer Simulation Models in Ag-Research and Everyday Life
USDA-ARS?s Scientific Manuscript database
When the news media talks about models they could be talking about role models, fashion models, conceptual models like the auto industry uses, or computer simulation models. A computer simulation model is a computer code that attempts to imitate the processes and functions of certain systems. There ...
2017-02-01
Ames Women's Influence Network (WIN) Hidden Figures talk with "Computers" Carolyn Hofstetter and Carol Mead co-sponsored by the AAAG. Left to right Computers Carolyn Hofstetter, Carol Mead and Jack Boyd
Proceedings of the 5. joint Russian-American computational mathematics conference
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1997-12-31
These proceedings contain a record of the talks presented and papers submitted by participants. The conference participants represented three institutions from the United States, Sandia National Laboratories (SNL), Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and two from Russia, Russian Federal Nuclear Center--All Russian Research Institute of Experimental Physics (RFNC-VNIIEF/Arzamas-16), and Russian Federal Nuclear Center--All Russian Research Institute of Technical Physics (RFNC-VNIITF/Chelyabinsk-70). The presentations and papers cover a wide range of applications from radiation transport to materials. Selected papers have been indexed separately for inclusion in the Energy Science and Technology Database.
The Science and Technology of the US National Missile Defense System
NASA Astrophysics Data System (ADS)
Postol, Theodore A.
2010-03-01
The National Missile Defense System utilizes UHF and X-band radars for search, track and discrimination, and interceptors that use long-wave infrared sensors to identify and home on attacking warheads. The radars and infrared sensors in the missile defense system perform at near the theoretical limits predicted by physics. However, in spite of the fantastic technical advances in sensor technology, signal processing, and computational support functions, the National Missile Defense System cannot be expected to ever work in realistic combat environments. This talk will describe why these impressive technologies can never deliver on the promise of a credible defense against long-range ballistic missiles.
Multi-scale soil salinity mapping and monitoring with proximal and remote sensing
USDA-ARS's Scientific Manuscript database
This talk is part of a technical short course on “Soil mapping and process modelling at diverse scales”. In the talk, guidelines, special considerations, protocols, and strengths and limitations are presented for characterizing spatial and temporal variation in soil salinity at several spatial scale...
2017-02-01
Ames Women's Influence Network (WIN) Hidden Figures talk with "Computers" Carolyn Hofstetter and Carol Mead co-sponsored by the AAAG. Left to right Barbara Miller, Ames EEO, Computers Carolyn Hofstetter and Carol Mead
Summary of talks and papers at ISCB-Asia/SCCG 2012
2013-01-01
The second ISCB-Asia conference of the International Society for Computational Biology took place December 17-19, 2012, in Shenzhen, China. The conference was co-hosted by BGI as the first Shenzhen Conference on Computational Genomics (SCCG). Forty-five talks were presented at ISCB-Asia/SCCG 2012. The topics covered included software tools, reproducible computing, next-generation sequencing data analysis, transcription and mRNA regulation, protein structure and function, cancer genomics and personalized medicine. Nine of the proceedings track talks are included as full papers in this supplement. In this report we first give a short overview of the conference by listing some statistics and visualizing the talk abstracts as word clouds. Then we group the talks by topic and briefly summarize each one, providing references to related publications whenever possible. Finally, we close with a few comments on the success of this conference.
Accountable Talk in Reading Comprehension Instruction. CSE Technical Report 670
ERIC Educational Resources Information Center
Wolf, Mikyung Kim; Crosson, Amy C.; Resnick, Lauren B.
2006-01-01
This study examined the relationship between the quality of classroom talk and academic rigor in reading comprehension lessons. In addition, the study aimed to characterize effective questions to support rigorous reading comprehension lessons. The data were collected as a part of the Instructional Quality Assessment (IQA) pilot. The IQA is a…
ERIC Educational Resources Information Center
Sullivan, Patricia; Moore, Kristen
2013-01-01
This article brings together the communication needs and positioning of women in technical areas, and asks "how can technical communication classes contribute to the mentoring of young women engineers at a time when many of those women want to be identified as engineers instead of being spotlighted as women in engineering?" Incorporating…
A Study of Young Children's Metaknowing Talk: Learning Experiences with Computers
ERIC Educational Resources Information Center
Choi, Ji-Young
2010-01-01
This research project was undertaken in a time of increasing emphasis on the exploration of young children's learning and thinking at computers. The purpose of this study was to describe and interpret the characteristics of metaknowing talk that occurred during learning experiences with computers in a kindergarten community of learners. This…
Brief, embedded, spontaneous metacognitive talk indicates thinking like a physicist
NASA Astrophysics Data System (ADS)
Sayre, Eleanor C.; Irving, Paul W.
2015-12-01
[This paper is part of the Focused Collection on Upper Division Physics Courses.] Instructors and researchers think "thinking like a physicist" is important for students' professional development. However, precise definitions and observational markers remain elusive. We reinterpret popular beliefs inventories in physics to indicate what physicists think thinking like a physicist entails. Through discourse analysis of upper-division students' speech in natural settings, we show that students may appropriate or resist these elements. We identify a new element in the physicist speech genre: brief, embedded, spontaneous metacognitive talk (BESM talk). BESM talk communicates students' in-the-moment enacted expectations about physics as a technical field and a cultural endeavor. Students use BESM talk to position themselves as physicists or nonphysicists. Students also use BESM talk to communicate their expectations in four ways: understanding, confusion, spotting inconsistencies, and generalized expectations.
17th Chromosome-Centric Human Proteome Project Symposium in Tehran.
Meyfour, Anna; Pahlavan, Sara; Sobhanian, Hamid; Salekdeh, Ghasem Hosseini
2018-04-01
This report describes the 17th Chromosome-Centric Human Proteome Project which was held in Tehran, Iran, April 27 and 28, 2017. A brief summary of the symposium's talks including new technical and computational approaches for the identification of novel proteins from non-coding genomic regions, physicochemical and biological causes of missing proteins, and the close interactions between Chromosome- and Biology/Disease-driven Human Proteome Project are presented. A synopsis of decisions made on the prospective programs to maintain collaborative works, share resources and information, and establishment of a newly organized working group, the task force for missing protein analysis are discussed. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Are Cloud Environments Ready for Scientific Applications?
NASA Astrophysics Data System (ADS)
Mehrotra, P.; Shackleford, K.
2011-12-01
Cloud computing environments are becoming widely available both in the commercial and government sectors. They provide flexibility to rapidly provision resources in order to meet dynamic and changing computational needs without the customers incurring capital expenses and/or requiring technical expertise. Clouds also provide reliable access to resources even though the end-user may not have in-house expertise for acquiring or operating such resources. Consolidation and pooling in a cloud environment allow organizations to achieve economies of scale in provisioning or procuring computing resources and services. Because of these and other benefits, many businesses and organizations are migrating their business applications (e.g., websites, social media, and business processes) to cloud environments, as evidenced by the commercial success of offerings such as the Amazon EC2. In this paper, we focus on the feasibility of utilizing cloud environments for scientific workloads and workflows particularly of interest to NASA scientists and engineers. There is a wide spectrum of such technical computations. These applications range from small workstation-level computations to mid-range computing requiring small clusters to high-performance simulations requiring supercomputing systems with high bandwidth/low latency interconnects. Data-centric applications manage and manipulate large data sets such as satellite observational data and/or data previously produced by high-fidelity modeling and simulation computations. Most of the applications are run in batch mode with static resource requirements. However, there do exist situations that have dynamic demands, particularly ones with public-facing interfaces providing information to the general public, collaborators and partners, as well as to internal NASA users. In the last few months we have been studying the suitability of cloud environments for NASA's technical and scientific workloads.
We have ported several applications to multiple cloud environments including NASA's Nebula environment, Amazon's EC2, Magellan at NERSC, and SGI's Cyclone system. We critically examined the performance of the applications on these systems. We also collected information on the usability of these cloud environments. In this talk we will present the results of our study focusing on the efficacy of using clouds for NASA's scientific applications.
ERIC Educational Resources Information Center
Murphy, Dennis T.
An empirical study of the way children talk about art experiences is described and the meaning of this talk in terms of the cognition it represents is investigated. The criteria serving as the basis for creation of content analysis categories are subject matter, sensory elements, formal properties, technical competence, expressive elements,…
NASA Technical Reports Server (NTRS)
Klopfer, Goetz H.
1995-01-01
This final report covers the work done on cooperative agreement NCC2-616 over a period of 5 1/2 years. It is broken into three segments of approximately 1 1/2 to 2 years each. The report is a summary report and is not intended to be comprehensive of all the work done under this cooperative agreement. A more complete coverage of the work done is obtained from the papers and reports listed in the 'Papers' section. Additional reporting of significant work was done through 'Technical Highlights' and 'Research and Technical Summaries'. A listing of copies is given in the 'Technical Highlights and R and T' section. The work was also reported in a series of seminars, conference meetings, branch reviews, workshops, and project reviews. A list of these talks is given in the 'Presentation' section. Also during this time three students ranging from high school to graduate level were supervised. A list of the students and the type of work accomplished is given in the 'Mentoring' section. The report concludes with the 'Appendices' sections, which include the three papers produced during the last 1 1/2 years of this cooperative agreement.
2017-02-01
Ames Women's Influence Network (WIN) Hidden Figures talk with "Computers" Carolyn Hofstetter and Carol Mead co-sponsored by the AAAG. Clockwise: Jack Boyd, Miss Mead, daughter of Carol Mead, Carol Mead and Carolyn Hofstetter
ERIC Educational Resources Information Center
Fischman, Josh
2007-01-01
In this article, the author talks about Classroom Presenter, a computer program that aids in student participation during class discussions and makes boring lectures more interactive. The program was created by Richard J. Anderson, a professor of computer science at the University of Washington, in Seattle. Classroom Presenter is now in use in…
ERIC Educational Resources Information Center
Lin, Sheau-Wen; Liu, Yu; Chen, Shin-Feng; Wang, Jing-Ru; Kao, Huey-Lien
2016-01-01
The purpose of this study was to develop a computer-based measure of elementary students' science talk and to report students' benchmarks. The development procedure had three steps: defining the framework of the test, collecting and identifying key reference sets of science talk, and developing and verifying the science talk instrument. The…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bennett, Janine Camille; Day, David Minot; Mitchell, Scott A.
This report summarizes the Combinatorial Algebraic Topology: software, applications & algorithms workshop (CAT Workshop). The workshop was sponsored by the Computer Science Research Institute of Sandia National Laboratories. It was organized by CSRI staff members Scott Mitchell and Shawn Martin. It was held in Santa Fe, New Mexico, August 29-30. The CAT Workshop website has links to some of the talk slides and other information, http://www.cs.sandia.gov/CSRI/Workshops/2009/CAT/index.html. The purpose of the report is to summarize the discussions and recap the sessions. There is a special emphasis on technical areas that are ripe for further exploration, and the plans for follow-up amongst the workshop participants. The intended audiences are the workshop participants, other researchers in the area, and the workshop sponsors.
2006-03-24
KENNEDY SPACE CENTER, FLA. -- Kennedy Space Center Deputy Director Bill Parsons talks to guests at a ribbon-cutting ceremony for the Operations Support Building II (behind him). He and other key Center personnel and guests attended the significant event. The Operations Support Building II is an Agency safety and health initiative project to replace 198,466 square feet of substandard modular housing and trailers in the Launch Complex 39 area at Kennedy Space Center. The five-story building, which sits south of the Vehicle Assembly Building and faces the launch pads, includes 960 office spaces, 16 training rooms, computer and multimedia conference rooms, a Mission Conference Center with an observation deck, technical libraries, an Exchange store, storage, break areas, and parking. Photo credit: NASA/George Shelton
NASA Technical Reports Server (NTRS)
2004-01-01
KENNEDY SPACE CENTER, FLA. Astronaut Tim Kopra (second from right) talks with workers in the Space Station Processing Facility about the Intravehicular Activity (IVA) constraints testing on the Italian-built Node 2, a future element of the International Space Station. The second of three Station connecting modules, the Node 2 attaches to the end of the U.S. Lab and provides attach locations for several other elements. Kopra is currently assigned technical duties in the Space Station Branch of the Astronaut Office, where his primary focus involves the testing of crew interfaces for two future ISS modules as well as the implementation of support computers and an operational Local Area Network on ISS. Node 2 is scheduled to launch on mission STS-120, Station assembly flight 10A.
Research Projects, Technical Reports and Publications
NASA Technical Reports Server (NTRS)
Oliger, Joseph
1996-01-01
The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. A flexible scientific staff is provided through a university faculty visitor program, a post doctoral program, and a student visitor program. Not only does this provide appropriate expertise but it also introduces scientists outside of NASA to NASA problems. A small group of core RIACS staff provides continuity and interacts with an ARC technical monitor and scientific advisory group to determine the RIACS mission. RIACS activities are reviewed and monitored by a USRA advisory council and ARC technical monitor. Research at RIACS is currently being done in the following areas: Advanced Methods for Scientific Computing and High Performance Networks. During this report period Professor Antony Jameson of Princeton University, Professor Wei-Pai Tang of the University of Waterloo, Professor Marsha Berger of New York University, Professor Tony Chan of UCLA, Associate Professor David Zingg of the University of Toronto, Canada, and Assistant Professor Andrew Sohn of the New Jersey Institute of Technology have been visiting RIACS. From January 1, 1996 through September 30, 1996 RIACS had three staff scientists, four visiting scientists, one post-doctoral scientist, three consultants, two research associates and one research assistant. RIACS held a joint workshop with Code 1 on 29-30 July 1996.
The workshop was held to discuss needs and opportunities in basic research in computer science in and for NASA applications. There were 14 talks given by NASA, industry and university scientists and three open discussion sessions. There were approximately fifty participants. Proceedings are being prepared. It is planned to have similar workshops on an annual basis. RIACS technical reports are usually preprints of manuscripts that have been submitted to research journals or conference proceedings. A list of these reports for the period January 1, 1996 through September 30, 1996 is in the Reports and Abstracts section of this report.
ERIC Educational Resources Information Center
Symons, Duncan; Pierce, Robyn
2015-01-01
In this study we examine the use of cumulative and exploratory talk types in a year 5 computer supported collaborative learning environment. The focus for students in this environment was to participate in mathematical problem solving, with the intention of developing the proficiencies of problem solving and reasoning. Findings suggest that…
Classroom Talk and Computational Thinking
ERIC Educational Resources Information Center
Jenkins, Craig W.
2017-01-01
This paper is part of a wider action research project taking place at a secondary school in South Wales, UK. The overarching aim of the project is to examine the potential for aspects of literacy and computational thinking to be developed using extensible 'build your own block' programming activities. This paper examines classroom talk at an…
CrossTalk: The Journal of Defense Software Engineering. Volume 21, Number 5
2008-05-01
...the publisher of CrossTalk, providing both editorial oversight and technical review of the journal. CrossTalk's mission is to encourage the engineering...
Number Talks Build Numerical Reasoning
ERIC Educational Resources Information Center
Parrish, Sherry D.
2011-01-01
"Classroom number talks," five- to fifteen-minute conversations around purposefully crafted computation problems, are a productive tool that can be incorporated into classroom instruction to combine the essential processes and habits of mind of doing math. During number talks, students are asked to communicate their thinking when presenting and…
Cross-Talk in Superconducting Transmon Quantum Computing Architecture
NASA Astrophysics Data System (ADS)
Abraham, David; Chow, Jerry; Corcoles, Antonio; Rothwell, Mary; Keefe, George; Gambetta, Jay; Steffen, Matthias; IBM Quantum Computing Team
2013-03-01
Superconducting transmon quantum computing test structures often exhibit significant undesired cross-talk. For experiments with only a handful of qubits this cross-talk can be quantified and understood, and therefore corrected. As quantum computing circuits become more complex, and thereby contain increasing numbers of qubits and resonators, it becomes more vital that the inadvertent coupling between these elements is minimized. The task of accurately controlling each single qubit to the level of precision required throughout the realization of a quantum algorithm is difficult by itself; coupled with the need to null out leakage signals from neighboring qubits or resonators, it would quickly become impossible. We discuss an approach to solve this critical problem. We acknowledge support from IARPA under contract W911NF-10-1-0324.
Summary Report of Working Group 2: Computation
NASA Astrophysics Data System (ADS)
Stoltz, P. H.; Tsung, R. S.
2009-01-01
The working group on computation addressed three physics areas: (i) plasma-based accelerators (laser-driven and beam-driven), (ii) high gradient structure-based accelerators, and (iii) electron beam sources and transport [1]. Highlights of the talks in these areas included new models of breakdown on the microscopic scale, new three-dimensional multipacting calculations with both finite difference and finite element codes, and detailed comparisons of new electron gun models with standard models such as PARMELA. The group also addressed two areas of advances in computation: (i) new algorithms, including simulation in a Lorentz-boosted frame that can reduce computation time by orders of magnitude, and (ii) new hardware architectures, like graphics processing units and Cell processors, that promise dramatic increases in computing power. Highlights of the talks in these areas included results from the first large-scale parallel finite element particle-in-cell (PIC) code, and a many-order-of-magnitude speedup of, and details of porting, the VPIC code to the Roadrunner supercomputer. The working group featured two plenary talks, one by Brian Albright of Los Alamos National Laboratory on the performance of the VPIC code on the Roadrunner supercomputer, and one by David Bruhwiler of Tech-X Corporation on recent advances in computation for advanced accelerators. Highlights of the talk by Albright included the first one-trillion-particle simulations, a sustained performance of 0.3 petaflops, and an eight times speedup of science calculations, including back-scatter in laser-plasma interaction. Highlights of the talk by Bruhwiler included simulations of 10 GeV laser wakefield accelerator stages including external injection, and new developments in electromagnetic simulations of electron guns using finite difference and finite element approaches.
Summary Report of Working Group 2: Computation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stoltz, P. H.; Tsung, R. S.
2009-01-22
The working group on computation addressed three physics areas: (i) plasma-based accelerators (laser-driven and beam-driven), (ii) high gradient structure-based accelerators, and (iii) electron beam sources and transport [1]. Highlights of the talks in these areas included new models of breakdown on the microscopic scale, new three-dimensional multipacting calculations with both finite difference and finite element codes, and detailed comparisons of new electron gun models with standard models such as PARMELA. The group also addressed two areas of advances in computation: (i) new algorithms, including simulation in a Lorentz-boosted frame that can reduce computation time by orders of magnitude, and (ii) new hardware architectures, like graphics processing units and Cell processors, that promise dramatic increases in computing power. Highlights of the talks in these areas included results from the first large-scale parallel finite element particle-in-cell (PIC) code, and a many-order-of-magnitude speedup of, and details of porting, the VPIC code to the Roadrunner supercomputer. The working group featured two plenary talks, one by Brian Albright of Los Alamos National Laboratory on the performance of the VPIC code on the Roadrunner supercomputer, and one by David Bruhwiler of Tech-X Corporation on recent advances in computation for advanced accelerators. Highlights of the talk by Albright included the first one-trillion-particle simulations, a sustained performance of 0.3 petaflops, and an eight times speedup of science calculations, including back-scatter in laser-plasma interaction. Highlights of the talk by Bruhwiler included simulations of 10 GeV laser wakefield accelerator stages including external injection, and new developments in electromagnetic simulations of electron guns using finite difference and finite element approaches.
Recent Work in Hybrid Radiation Transport Methods with Applications to Commercial Nuclear Power
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kulesza, Joel A.
This talk will begin with an overview of hybrid radiation transport methods followed by a discussion of the author’s work to advance current capabilities. The talk will then describe applications for these methods in commercial nuclear power reactor analyses and techniques for experimental validation. When discussing these analytical and experimental activities, the importance of technical standards such as those created and maintained by ASTM International will be demonstrated.
Arntzen, Erik; Halstadtro, Lill-Beathe; Halstadtro, Monica
2009-01-01
The purpose of the study was to extend the literature on verbal self-regulation by using the "silent dog" method to evaluate the role of verbal regulation over nonverbal behavior in 2 individuals with autism. Participants were required to talk aloud while performing functional computer tasks. Then the effects of distracters with increasing demands on target behavior were evaluated, as well as whether self-talk emitted by Participant 1 could be used to alter Participant 2's performance. Results suggest that participants' tasks seemed to be under the control of self-instructions, and the rules generated from Participant 1's self-talk were effective in teaching computer skills to Participant 2. The silent dog method was useful in evaluating the possible role of self-generated rules in teaching computer skills to participants with autism. PMID:22477428
Arntzen, Erik; Halstadtro, Lill-Beathe; Halstadtro, Monica
2009-01-01
The purpose of the study was to extend the literature on verbal self-regulation by using the "silent dog" method to evaluate the role of verbal regulation over nonverbal behavior in 2 individuals with autism. Participants were required to talk aloud while performing functional computer tasks. Then the effects of distracters with increasing demands on target behavior were evaluated, as well as whether self-talk emitted by Participant 1 could be used to alter Participant 2's performance. Results suggest that participants' tasks seemed to be under the control of self-instructions, and the rules generated from Participant 1's self-talk were effective in teaching computer skills to Participant 2. The silent dog method was useful in evaluating the possible role of self-generated rules in teaching computer skills to participants with autism.
Formulaic Language in Computer-Supported Communication: Theory Meets Reality.
ERIC Educational Resources Information Center
Wray, Alison
2002-01-01
Attempts to validate a psycholinguistic model of language processing. One system designed to provide insight into the model is TALK, developed to promote conversational fluency in non-speaking individuals, primarily people with cerebral palsy and motor neuron disease. TALK is demonstrated to be a viable tool for…
75 FR 37783 - DOE/NSF Nuclear Science Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-30
... Science Foundation's Nuclear Physics Office. Technical Talk on Deep Underground Science and Engineering... Energy's Office of Nuclear Physics Web site for viewing. Rachel Samuel, Deputy Committee Management...
Second Annual Research Center for Optical Physics (RCOP) Forum
NASA Technical Reports Server (NTRS)
Allario, Frank (Editor); Temple, Doyle (Editor)
1995-01-01
The Research Center for Optical Physics (RCOP) held its Second Annual Forum on September 23-24, 1994. The forum consisted of two days of technical sessions with invited talks, submitted talks, and a student poster session. Participants in the technical sessions included students and researchers from CCNY/CUNY, Fisk University, Georgia Institute of Technology, Hampton University, the University of Maryland, the University of Michigan, NASA Langley Research Center, North Carolina A and T University, Stevens Institute of Technology, and NAWC-Warminster. Topics included chaotic lasers, pumped optical filters, nonlinear responses in polythiophene and thiophene-based thin films, crystal growth and spectroscopy, laser-induced photochromic centers, Raman scattering in porphyrin, superradiance, doped fluoride crystals, luminescence of terbium in silicate glass, and radiative and nonradiative transitions in rare-earth ions.
Quantum Computation: Entangling with the Future
NASA Technical Reports Server (NTRS)
Jiang, Zhang
2017-01-01
Commercial applications of quantum computation have become viable due to the rapid progress of the field in recent years. Efficient quantum algorithms have been discovered to cope with the most challenging real-world problems that are too hard for classical computers. Manufactured quantum hardware has reached unprecedented precision and controllability, enabling fault-tolerant quantum computation. Here, I give a brief introduction to the principles in quantum mechanics that promise its unparalleled computational power. I will discuss several important quantum algorithms that achieve exponential or polynomial speedup over any classical algorithm. Building a quantum computer is a daunting task, and I will talk about the criteria and various implementations of quantum computers. I conclude the talk with near-future commercial applications of a quantum computer.
Long-term Stable Conservative Multiscale Methods for Vortex Flows
2017-10-31
Computational and Applied Mathematics and Engineering, ECCOMAS 2016 (Crete, June 2016) - M. A. Olshanskii, Scientific computing seminar of Math ...UMass Dartmouth (October 2015) - L. Rebholz, Applied Math Seminar Talk, University of Alberta (October 2015) - L. Rebholz, Colloquium Talk, Scientific...Colloquium, (November 2016) - L. Rebholz, Joint Math Meetings 2017, Special session on recent advances in numerical analysis of PDEs, Atlanta GA
Rouf, Emran; Whittle, Jeff; Lu, Na; Schwartz, Mark D
2007-01-01
The use of electronic medical records can improve the technical quality of care, but requires a computer in the exam room. This could adversely affect interpersonal aspects of care, particularly when physicians are inexperienced users of exam room computers. To determine whether physician experience modifies the impact of exam room computers on the physician-patient interaction. Cross-sectional surveys of patients and physicians. One hundred fifty five adults seen for scheduled visits by 11 faculty internists and 12 internal medicine residents in a VA primary care clinic. Physician and patient assessment of the effect of the computer on the clinical encounter. Patients seeing residents, compared to those seeing faculty, were more likely to agree that the computer adversely affected the amount of time the physician spent talking to (34% vs 15%, P = 0.01), looking at (45% vs 24%, P = 0.02), and examining them (32% vs 13%, P = 0.009). Moreover, they were more likely to agree that the computer made the visit feel less personal (20% vs 5%, P = 0.017). Few patients thought the computer interfered with their relationship with their physicians (8% vs 8%). Residents were more likely than faculty to report these same adverse effects, but these differences were smaller and not statistically significant. Patients seen by residents more often agreed that exam room computers decreased the amount of interpersonal contact. More research is needed to elucidate key tasks and behaviors that facilitate doctor-patient communication in such a setting.
Videoconferencing via satellite. Opening Congress to the people: Technical report
NASA Technical Reports Server (NTRS)
Wood, F. B.; Coates, V. T.; Chartrand, R. L.; Ericson, R. F.
1978-01-01
The feasibility of using satellite videoconferencing as a mechanism for informed dialogue between Congressmen and constituents to strengthen the legislative process was evaluated. Satellite videoconferencing was defined as a two-way interactive television with the TV signals transmitted by satellite. With videoconferencing, one or more Congressmen in Washington, D. C. can see, hear and talk with groups of citizens at distant locations around the country. Simultaneously, the citizens can see, hear and talk with the Congressmen.
The Use of Non-Specific Comments in a Conversation Aid for Non-Speaking People.
ERIC Educational Resources Information Center
Todman, John; Morrison, Zara
1995-01-01
TALK (Talk Aid using pre-Loaded Knowledge) is a computer system linked to a speech synthesizer which enables nonspeaking people to engage in real-time social conversation. TALK provides categories of general comments that can be used whenever a suitable specific response is unavailable. Results are reported of a study evaluating effectiveness of…
Intelligent Computer-Aided Instruction and Musical Performance Skills. CITE Report No. 18.
ERIC Educational Resources Information Center
Baker, Michael
This paper is a transcription from memory of a short talk that used overhead projector slides, with musical examples played on an Apple Macintosh computer and a Yamaha CX5 synthesizer. The slides appear in the text as reduced "icons" at the point where they would have been used in the talk. The paper concerns ways in which artificial intelligence…
Acceptability of the Talking Touchscreen for Health Literacy Assessment
Yost, Kathleen J.; Webster, Kimberly; Baker, David W.; Jacobs, Elizabeth A.; Anderson, Andy; Hahn, Elizabeth A.
2012-01-01
Self-administration of a multimedia health literacy measure in clinic settings is a novel concept. Demonstrated ease of use and acceptability will help predict the future value of this strategy. We previously demonstrated the acceptability of a “Talking Touchscreen” for health status assessment. For this study, we adapted the touchscreen for self-administration of a new health literacy measure. Primary care patients (n=610) in clinics for underserved populations completed health status and health literacy questions on the Talking Touchscreen and participated in an interview. Participants were 51% female, 10% age 60+, 67% African American, 18% without a high school education, and 14% who had never used a computer. The majority (93%) had no difficulty using the touchscreen, including those who were computer-naïve (87%). Most rated the screen design as very good or excellent (72%), including computer-naïve patients (71%) and older patients (75%). Acceptability of the touchscreen did not differ by health literacy level. The Talking Touchscreen was easy to use and acceptable for self-administration of a new health literacy measure. Self-administration should reduce staff burden and costs, interview bias, and feelings of embarrassment by those with lower literacy. Tools like the Talking Touchscreen may increase exposure of underserved populations to new technologies. PMID:20845195
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bai, Zhaojun; Yang, Chao
What is common among electronic structure calculation, design of MEMS devices, vibrational analysis of high speed railways, and simulation of the electromagnetic field of a particle accelerator? The answer: they all require solving large scale nonlinear eigenvalue problems. In fact, these are just a handful of examples in which solving nonlinear eigenvalue problems accurately and efficiently is becoming increasingly important. Recognizing the importance of this class of problems, an invited minisymposium dedicated to nonlinear eigenvalue problems was held at the 2005 SIAM Annual Meeting. The purpose of the minisymposium was to bring together numerical analysts and application scientists to showcase some of the cutting-edge results from both communities and to discuss the challenges they are still facing. The minisymposium consisted of eight talks divided into two sessions. The first three talks focused on a type of nonlinear eigenvalue problem arising from electronic structure calculations. In this type of problem, the matrix Hamiltonian H depends, in a non-trivial way, on the set of eigenvectors X to be computed. The invariant subspace spanned by these eigenvectors also minimizes a total energy function that is highly nonlinear with respect to X on a manifold defined by a set of orthonormality constraints. In other applications, the nonlinearity of the matrix eigenvalue problem is restricted to the dependency of the matrix on the eigenvalues to be computed. These problems are often called polynomial or rational eigenvalue problems. In the second session, Christian Mehl from Technical University of Berlin described numerical techniques for solving a special type of polynomial eigenvalue problem arising from vibration analysis of rail tracks excited by high-speed trains.
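A polynomial eigenvalue problem of the kind described above can be reduced to a linear one of twice the size by companion linearization. The sketch below (with small illustrative matrices, not data from any of the talks) solves a quadratic eigenvalue problem (λ²M + λC + K)x = 0 with SciPy:

```python
import numpy as np
from scipy.linalg import eig

# Illustrative damped-oscillator matrices (assumed for this sketch).
n = 3
M = np.eye(n)                       # mass
C = np.diag([0.1, 0.2, 0.3])        # damping
K = np.diag([1.0, 4.0, 9.0])        # stiffness

# Companion linearization: A z = lam B z with z = [x; lam x].
I = np.eye(n)
Z = np.zeros((n, n))
A = np.block([[Z, I], [-K, -C]])
B = np.block([[I, Z], [Z, M]])

lam, z = eig(A, B)

# Each eigenpair of the linearization satisfies the quadratic problem:
# the top half of z is the original eigenvector x.
for l, x in zip(lam, z[:n].T):
    residual = (l**2 * M + l * C + K) @ x
    assert np.allclose(residual, 0, atol=1e-8)
```

The linearization doubles the dimension (2n eigenvalues for an n-dimensional quadratic problem), which is the standard trade-off for reusing dense generalized eigensolvers.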
Computational Aspects of Data Assimilation and the ESMF
NASA Technical Reports Server (NTRS)
daSilva, A.
2003-01-01
The scientific challenge of developing advanced data assimilation applications is a daunting task. Independently developed components may have incompatible interfaces or may be written in different computer languages. The high-performance computer (HPC) platforms required by numerically intensive Earth system applications are complex, varied, rapidly evolving and multi-part systems themselves. Since the market for high-end platforms is relatively small, there is little robust middleware available to buffer the modeler from the difficulties of HPC programming. To complicate matters further, the collaborations required to develop large Earth system applications often span initiatives, institutions and agencies, involve geoscience, software engineering, and computer science communities, and cross national borders. The Earth System Modeling Framework (ESMF) project is a concerted response to these challenges. Its goal is to increase software reuse, interoperability, ease of use and performance in Earth system models through the use of a common software framework, developed in an open manner by leaders in the modeling community. The ESMF addresses the technical, and to some extent the cultural, aspects of Earth system modeling, laying the groundwork for addressing the more difficult scientific aspects, such as the physical compatibility of components, in the future. In this talk we will discuss the general philosophy and architecture of the ESMF, focusing on those capabilities useful for developing advanced data assimilation applications.
Teaching Electronic Health Record Communication Skills.
Palumbo, Mary Val; Sandoval, Marie; Hart, Vicki; Drill, Clarissa
2016-06-01
This pilot study investigated nurse practitioner students' communication skills when utilizing the electronic health record during history taking. The nurse practitioner students (n = 16) were videotaped utilizing the electronic health record while taking health histories with standardized patients. The students were videotaped during two separate sessions during one semester. Two observers recorded the time spent (1) typing and talking, (2) typing only, and (3) looking at the computer without talking. Total history taking time, computer placement, and communication skills were also recorded. During the formative session, mean history taking time was 11.4 minutes, with 3.5 minutes engaged with the computer (30.6% of visit). During the evaluative session, mean history taking time was 12.4 minutes, with 2.95 minutes engaged with the computer (24% of visit). The percentage of time individuals spent changed over the two visits: typing and talking, -3.1% (P = .3); typing only, +12.8% (P = .038); and looking at the computer, -9.6% (P = .039). This study demonstrated that time spent engaged with the computer during a patient encounter does decrease with student practice and education. Therefore, students benefit from instruction on electronic health record-specific communication skills, and use of a simple mnemonic to reinforce this is suggested.
The Third NASA Goddard Conference on Mass Storage Systems and Technologies
NASA Technical Reports Server (NTRS)
Kobler, Benjamin (Editor); Hariharan, P. C. (Editor)
1993-01-01
This report contains copies of nearly all of the technical papers and viewgraphs presented at the Goddard Conference on Mass Storage Systems and Technologies held in October 1993. The conference served as an informational exchange forum for topics primarily relating to the ingestion and management of massive amounts of data and the attendant problems involved. Discussion topics include the necessary use of computers in the solution of today's infinitely complex problems, the need for greatly increased storage densities in both optical and magnetic recording media, currently popular storage media and magnetic media storage risk factors, and data archiving standards, including a talk on the current status of the IEEE Storage Systems Reference Model (RM). Additional topics addressed system performance, data storage system concepts, communications technologies, data distribution systems, data compression, and error detection and correction.
GRACE Follow-On and Potential Successors: Mission Options for the Upcoming Decades
NASA Astrophysics Data System (ADS)
Watkins, M. M.
2017-12-01
The GRACE Follow-On mission is currently scheduled to launch in the first quarter of 2018, providing a successor to GRACE for ongoing critical measurements of Earth's time-varying mass distribution for the coming decade. As noted in the literature (and in several talks in this session), there are also several possible mission proposals and technologies to either augment or succeed GRACE FO for extended and improved measurements. Various scientific, programmatic, and technical issues drive each of these potential missions, and these factors will be important in determining which will ultimately be selected for flight. These issues include accuracy requirements based on science goals, technical maturity, cost, and international partnership options. In this talk, we will provide a final detailed update before launch on the GRACE Follow-On status and expectations, and we will outline several of the key options for future missions after GRACE FO.
CrossTalk: The Journal of Defense Software Engineering. Volume 20, Number 3, March 2007
2007-03-01
Capability Maturity Model® Integration (CMMI®). CMU Software Engineering Institute <www.sei.cmu.edu/cmmi>. 5. ISO/IEC 27001:2005. Information Security...international standards bodies – International Organization for Standardization (ISO) and International Electrotechnical Commission (IEC) – are working on a...number of projects that affect software security: • The ISO Technical Management Board (TMB) performs strategic planning and coordination for ISO
Countermeasures to the US National Missile Defense
NASA Astrophysics Data System (ADS)
Gronlund, Lisbeth
2001-04-01
One of the key technical questions about national missile defenses is whether they can be expected to work under real-world conditions if the attacker takes steps to defeat the defense. This talk will discuss steps that an emerging missile state could take to confuse, overwhelm, or otherwise defeat the planned US NMD system developed by the Clinton administration. It will consider three such "countermeasures" that would be within the technical capability of a state that could develop and deploy a long-range missile capable of reaching the United States, which is the threat the NMD system is intended to defend against. The talk will be based on the April 2000 report "Countermeasures: A Technical Evaluation of the Operational Effectiveness of the Planned US National Missile Defense System," which was co-authored by the speaker and 10 other physicists and engineers. Although the talk will refer to the ground-based NMD system under development, the conclusions are applicable to any mid-course NMD system using hit-to-kill infrared-homing interceptors, regardless of their basing mode. The three countermeasures considered are: (1) biological weapons deployed on 100 or more small bomblets, or submunitions, that would be released shortly after the boost phase; (2) nuclear warheads with anti-simulation balloon decoys, in which the attacker disguises the warhead by enclosing it in an aluminum-coated mylar balloon and releasing it along with a large number of otherwise similar but empty balloons; and (3) nuclear warheads with cooled shrouds, in which the attacker foils the kill vehicle's homing process by covering each nuclear warhead with a double-walled cone containing liquid nitrogen.
The Montreal Protocol treaty and its illuminating history of science-policy decision-making
NASA Astrophysics Data System (ADS)
Grady, C.
2017-12-01
The Montreal Protocol on Substances that Deplete the Ozone Layer, hailed as one of the most effective environmental treaties of all time, has a thirty year history of science-policy decision-making. The partnership between Parties to the Montreal Protocol and its technical assessment panels serves as a basis for understanding successes and evaluating stumbles of global environmental decision-making. Real-world environmental treaty negotiations can be highly time-sensitive, politically motivated, and resource constrained; thus, scientists and policymakers alike are often unable to confront the uncertainties associated with the multitude of choices. The science-policy relationship built within the framework of the Montreal Protocol has helped constrain uncertainty and inform policy decisions, but has also highlighted the limitations of the use of scientific understanding in political decision-making. This talk will describe the evolution of the scientist-policymaker relationship over the history of the Montreal Protocol. Examples will illustrate how the Montreal Protocol's technical panels inform decisions of the country governments and will characterize different approaches pursued by different countries, with a particular focus on the recently adopted Kigali Amendment. In addition, this talk will take a deeper dive with an analysis of the historic technical panel assessments on estimating financial resources necessary to enable compliance to the Montreal Protocol, compared to the political financial decisions made through the Protocol's Multilateral Fund replenishment negotiation process. Finally, this talk will describe the useful lessons and challenges from these interactions and how they may be applicable in other environmental management frameworks across multiple scales under changing climatic conditions.
How to Structure University/Industry Cooperation for Maximum Mutual Benefit
NASA Astrophysics Data System (ADS)
Sommer, Klaus H.
2000-03-01
Research in the technical industries has changed dramatically in the past twenty years. As part of the change, many companies have shifted their long-term research from within company labs to university labs using a variety of mechanisms for such "cooperations." This talk focuses on how Bayer Corporation uses contract research, unrestricted funds, consortia, and government contracts to supplement in-house research programs. The talk emphasizes the importance of careful tailoring of these mechanisms in order to achieve maximum success for both the company and its university partners.
Building Science-Relevant Literacy with Technical Writing in High School
DOE Office of Scientific and Technical Information (OSTI.GOV)
Girill, T R
2006-06-02
By drawing on the in-class work of an on-going literacy outreach project, this paper explains how well-chosen technical writing activities can earn time in high-school science courses by enabling underperforming students (including ESL students) to learn science more effectively. We adapted basic research-based text-design and usability techniques into age-appropriate exercises and cases using the cognitive apprenticeship approach. This enabled high-school students, aided by explicit guidelines, to build their cognitive maturity, learn how to craft good instructions and descriptions, and apply those skills to better note taking and technical talks in their science classes.
Screen and nonscreen sedentary behavior and sleep in adolescents.
Brunetti, Vanessa C; O'Loughlin, Erin K; O'Loughlin, Jennifer; Constantin, Evelyn; Pigeon, Étienne
2016-12-01
This study examined the associations between screen (computer, videogame, TV) and nonscreen (talking on the phone, doing homework, reading) sedentary time, and sleep in adolescents. Data were drawn from AdoQuest, a prospective investigation of 1843 grade 5 students aged 10-12 years at inception in the greater Montreal (Canada) area. Data for this cross-sectional analysis on screen and nonscreen sedentary time, sleep duration, and daytime sleepiness were collected in 2008-2009 from 1233 participants (67% of 1843) aged 14-16 years. Computer and videogame use >2 hours per day was associated with 17 and 11 fewer minutes of sleep per night, respectively. Computer use and talking on the phone were both associated with being a short sleeper (<8 hours per night) (odds ratio = 2.2 [1.4-3.4] and 3.0 [1.5-6.2], respectively), whereas TV time was protective (odds ratio = 0.5 [0.3-0.8]). Participants who reported >2 hours of computer use or talking on the phone per day had higher daytime sleepiness scores (11.9 and 13.9, respectively) than participants who reported ≤2 hours per day (9.7 and 10.3, respectively). Computer use and time spent talking on the phone are associated with short sleep and more daytime sleepiness in adolescents. Videogame time is also associated with less sleep. Clinicians, parents, and adolescents should be made aware that sedentary behavior, and especially screen-related sedentary behavior, may affect sleep duration negatively and is possibly associated with daytime sleepiness. Copyright © 2016 National Sleep Foundation. Published by Elsevier Inc. All rights reserved.
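Odds ratios with confidence intervals, like those reported above, come from 2×2 exposure-by-outcome tables. A minimal sketch of the arithmetic, using invented counts (not the AdoQuest data), with a Wald 95% interval on the log-odds scale:

```python
import math

# Hypothetical 2x2 table (counts are illustrative only):
# rows: >2 h/day computer use vs not; columns: short sleeper vs not.
a, b = 40, 160    # exposed: short sleepers, others
c, d = 30, 260    # unexposed: short sleepers, others

odds_ratio = (a * d) / (b * c)

# Wald 95% CI: exponentiate log(OR) +/- 1.96 * SE of log(OR).
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
```

An interval excluding 1.0 corresponds to a statistically significant association at the 5% level, which is how intervals such as [1.4-3.4] in the abstract are read.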
2017-02-01
Ames Women's Influence Network (WIN) Hidden Figures talk with "Computers" Carolyn Hofstetter and Carol Mead, co-sponsored by the AAAG. Group photo, front row, left to right: Carolyn Hofstetter, Jack Boyd, Carol Mead. Middle row: Kathy Lee, Annette Randall, Trincella Lewis, Ann Mead (daughter of Carol Mead), Vanessa Kuroda, Netti Halcomb Roozeboom. Back row: Dr. Barbara Miller, Dr. Wendy Okolo, Denise Snow, Leedjia Svec, Erika Rodriquez, Rhonda Baker, Ray Gilstrap, Glenn Bugos
NASA Astrophysics Data System (ADS)
Spinney, Laura
2017-09-01
Computer scientist Luc Steels uses artificial intelligence to explore the origins and evolution of language. He is best known for his 1999-2001 Talking Heads Experiment, in which robots had to construct a language from scratch to communicate with each other. Now Steels, who works at the Free University of Brussels (VUB), has composed an opera based on the legend of Faust, with a twenty-first-century twist. He talks about Mozart as a nascent computer programmer, how music maps onto language, and the blurred boundaries of a digitized world.
48 CFR 252.227-7027 - Deferred ordering of technical data or computer software.
Code of Federal Regulations, 2013 CFR
2013-10-01
... technical data or computer software. 252.227-7027 Section 252.227-7027 Federal Acquisition Regulations... data or computer software. As prescribed at 227.7103-8(b), use the following clause: Deferred Ordering of Technical Data or Computer Software (APR 1988) In addition to technical data or computer software...
48 CFR 252.227-7027 - Deferred ordering of technical data or computer software.
Code of Federal Regulations, 2011 CFR
2011-10-01
... technical data or computer software. 252.227-7027 Section 252.227-7027 Federal Acquisition Regulations... data or computer software. As prescribed at 227.7103-8(b), use the following clause: Deferred Ordering of Technical Data or Computer Software (APR 1988) In addition to technical data or computer software...
48 CFR 252.227-7027 - Deferred ordering of technical data or computer software.
Code of Federal Regulations, 2012 CFR
2012-10-01
... technical data or computer software. 252.227-7027 Section 252.227-7027 Federal Acquisition Regulations... data or computer software. As prescribed at 227.7103-8(b), use the following clause: Deferred Ordering of Technical Data or Computer Software (APR 1988) In addition to technical data or computer software...
48 CFR 252.227-7027 - Deferred ordering of technical data or computer software.
Code of Federal Regulations, 2014 CFR
2014-10-01
... technical data or computer software. 252.227-7027 Section 252.227-7027 Federal Acquisition Regulations... data or computer software. As prescribed at 227.7103-8(b), use the following clause: Deferred Ordering of Technical Data or Computer Software (APR 1988) In addition to technical data or computer software...
48 CFR 252.227-7027 - Deferred ordering of technical data or computer software.
Code of Federal Regulations, 2010 CFR
2010-10-01
... technical data or computer software. 252.227-7027 Section 252.227-7027 Federal Acquisition Regulations... data or computer software. As prescribed at 227.7103-8(b), use the following clause: Deferred Ordering of Technical Data or Computer Software (APR 1988) In addition to technical data or computer software...
The graphical brain: Belief propagation and active inference
Friston, Karl J.; Parr, Thomas; de Vries, Bert
2018-01-01
This paper considers functional integration in the brain from a computational perspective. We ask what sort of neuronal message passing is mandated by active inference—and what implications this has for context-sensitive connectivity at microscopic and macroscopic levels. In particular, we formulate neuronal processing as belief propagation under deep generative models. Crucially, these models can entertain both discrete and continuous states, leading to distinct schemes for belief updating that play out on the same (neuronal) architecture. Technically, we use Forney (normal) factor graphs to elucidate the requisite message passing in terms of its form and scheduling. To accommodate mixed generative models (of discrete and continuous states), one also has to consider link nodes or factors that enable discrete and continuous representations to talk to each other. When mapping the implicit computational architecture onto neuronal connectivity, several interesting features emerge. For example, Bayesian model averaging and comparison, which link discrete and continuous states, may be implemented in thalamocortical loops. These and other considerations speak to a computational connectome that is inherently state dependent and self-organizing in ways that yield to a principled (variational) account. We conclude with simulations of reading that illustrate the implicit neuronal message passing, with a special focus on how discrete (semantic) representations inform, and are informed by, continuous (visual) sampling of the sensorium. Author Summary: This paper considers functional integration in the brain from a computational perspective. We ask what sort of neuronal message passing is mandated by active inference—and what implications this has for context-sensitive connectivity at microscopic and macroscopic levels. In particular, we formulate neuronal processing as belief propagation under deep generative models that can entertain both discrete and continuous states. This leads to distinct schemes for belief updating that play out on the same (neuronal) architecture. Technically, we use Forney (normal) factor graphs to characterize the requisite message passing, and link this formal characterization to canonical microcircuits and extrinsic connectivity in the brain. PMID:29417960
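The discrete half of the belief propagation described above can be illustrated, in a much-simplified setting that is not the paper's full Forney-graph scheme, by sum-product message passing on a two-variable chain; the factor values below are invented for illustration:

```python
import numpy as np

# Tiny factor graph: x1 -- f12 -- x2, both variables binary.
phi1 = np.array([0.6, 0.4])       # unary factor (local evidence) on x1
phi2 = np.array([0.5, 0.5])       # unary factor on x2
psi = np.array([[0.9, 0.1],
                [0.1, 0.9]])      # pairwise factor f12(x1, x2)

# Sum-product messages through the pairwise factor.
m_1to2 = psi.T @ phi1             # message arriving at x2
m_2to1 = psi @ phi2               # message arriving at x1

# Beliefs: local evidence times incoming message, normalized.
b1 = phi1 * m_2to1; b1 /= b1.sum()
b2 = phi2 * m_1to2; b2 /= b2.sum()

# On a tree, belief propagation is exact: compare with brute-force
# marginals of the joint distribution.
joint = phi1[:, None] * psi * phi2[None, :]
joint /= joint.sum()
assert np.allclose(b1, joint.sum(axis=1))
assert np.allclose(b2, joint.sum(axis=0))
```

The paper's contribution lies in scheduling such messages on mixed discrete-continuous generative models; this sketch only shows the elementary sum-product step that those schemes build on.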
48 CFR 252.227-7016 - Rights in bid or proposal information.
Code of Federal Regulations, 2014 CFR
2014-10-01
... terms “technical data” and “computer software” are defined in the Rights in Technical Data—Noncommercial... delivery of technical data, the term “computer software” is defined in the Rights in Noncommercial Computer... Research Program, the Rights in Noncommercial Technical Data and Computer Software—Small Business...
48 CFR 252.227-7016 - Rights in bid or proposal information.
Code of Federal Regulations, 2010 CFR
2010-10-01
... terms “technical data” and “computer software” are defined in the Rights in Technical Data—Noncommercial... delivery of technical data, the term “computer software” is defined in the Rights in Noncommercial Computer... Research Program, the Rights in Noncommercial Technical Data and Computer Software—Small Business...
48 CFR 252.227-7016 - Rights in bid or proposal information.
Code of Federal Regulations, 2011 CFR
2011-10-01
... terms “technical data” and “computer software” are defined in the Rights in Technical Data—Noncommercial... delivery of technical data, the term “computer software” is defined in the Rights in Noncommercial Computer... Research Program, the Rights in Noncommercial Technical Data and Computer Software—Small Business...
48 CFR 252.227-7016 - Rights in bid or proposal information.
Code of Federal Regulations, 2013 CFR
2013-10-01
... terms “technical data” and “computer software” are defined in the Rights in Technical Data—Noncommercial... delivery of technical data, the term “computer software” is defined in the Rights in Noncommercial Computer... Research Program, the Rights in Noncommercial Technical Data and Computer Software—Small Business...
48 CFR 252.227-7016 - Rights in bid or proposal information.
Code of Federal Regulations, 2012 CFR
2012-10-01
... terms “technical data” and “computer software” are defined in the Rights in Technical Data—Noncommercial... delivery of technical data, the term “computer software” is defined in the Rights in Noncommercial Computer... Research Program, the Rights in Noncommercial Technical Data and Computer Software—Small Business...
Synthesis of Tree-Structured Computing Systems through Use of Closures.
1984-11-29
best hope of achieving subpolynomial running times for typical problems without a degree of interconnection that makes physical implementation... [remainder is a garbled OCR fragment of the report's tree-structured node specification listing: root, interior, and leaf nodes related by TALKS/SENDS, HEARS/USES declarations over leftson, rightson, and parent links]
Code of Federal Regulations, 2012 CFR
2012-10-01
... technical data and computer software-Small Business Innovation Research (SBIR) Program. 252.227-7018 Section... Clauses 252.227-7018 Rights in noncommercial technical data and computer software—Small Business... Noncommercial Technical Data and Computer Software—Small Business Innovation Research (SBIR) Program (MAR 2011...
Exploration Medical System Technical Architecture Overview
NASA Technical Reports Server (NTRS)
Cerro, J.; Rubin, D.; Mindock, J.; Middour, C.; McGuire, K.; Hanson, A.; Reilly, J.; Burba, T.; Urbina, M.
2018-01-01
The Exploration Medical Capability (ExMC) Element Systems Engineering (SE) goals include defining the technical system needed to support medical capabilities for a Mars exploration mission. A draft medical system architecture was developed based on stakeholder needs, system goals, and system behaviors, as captured in an ExMC concept of operations document and a system model. This talk will discuss a high-level view of the medical system, as part of a larger crew health and performance system, both of which will support crew during Deep Space Transport missions. Other mission components, such as the flight system, ground system, caregiver, and patient, will be discussed as aspects of the context because the medical system will have important interactions with each. Additionally, important interactions with other aspects of the crew health and performance system are anticipated, such as health & wellness, mission task performance support, and environmental protection. This talk will highlight areas in which we are working with other disciplines to understand these interactions.
Argonne Out Loud: Computation, Big Data, and the Future of Cities
Catlett, Charlie
2018-01-16
Charlie Catlett, a Senior Computer Scientist at Argonne and Director of the Urban Center for Computation and Data at the Computation Institute of the University of Chicago and Argonne, talks about how he and his colleagues are using high-performance computing, data analytics, and embedded systems to better understand and design cities.
Seventy Years of Computing in the Nuclear Weapons Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Archer, Billy Joe
Los Alamos has continuously been at the forefront of scientific computing since it helped found the field. This talk will explore the rich history of computing in the Los Alamos weapons program. The current status of computing will be discussed, as will the expectations for the near future.
MO-G-12A-01: Quantitative Imaging Metrology: What Should Be Assessed and How?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Giger, M; Petrick, N; Obuchowski, N
The first two symposia in the Quantitative Imaging Track focused on 1) the introduction of quantitative imaging (QI) challenges and opportunities, and QI efforts of agencies and organizations such as the RSNA, NCI, FDA, and NIST, and 2) the techniques, applications, and challenges of QI, with specific examples from CT, PET/CT, and MR. This third symposium in the QI Track will focus on metrology and its importance in successfully advancing the QI field. While the specific focus will be on QI, many of the concepts presented are more broadly applicable to many areas of medical physics research and applications. As such, the topics discussed should be of interest to medical physicists involved in imaging as well as therapy. The first talk of the session will focus on the introduction to metrology and why it is critically important in QI. The second talk will focus on appropriate methods for technical performance assessment. The third talk will address statistically valid methods for algorithm comparison, a common problem not only in QI but also in other areas of medical physics. The final talk in the session will address strategies for publication of results that will allow statistically valid meta-analyses, which is critical for combining results of individual studies with typically small sample sizes in a manner that can best inform decisions and advance the field. Learning Objectives: Understand the importance of metrology in the QI efforts. Understand appropriate methods for technical performance assessment. Understand methods for comparing algorithms with or without reference data (i.e., “ground truth”). Understand the challenges and importance of reporting results in a manner that allows for statistically valid meta-analyses.
Östlund, Ann-Sofi; Wadensten, Barbro; Häggström, Elisabeth; Lindqvist, Helena; Kristofferzon, Marja-Leena
2016-11-01
The aim of this study was to describe what verbal behaviours/kinds of talk occur during recorded motivational interviewing sessions between nurses in primary care and their patients. The aim was also to examine what kinds of nurse talk predict patient change talk, neutral talk and/or sustain talk. Motivational interviewing is a collaborative conversational style. It has been shown to be effective in addressing health behaviours such as diet, exercise, weight loss and chronic disease management. In Sweden, it is one of the approaches to disease prevention conversations with patients recommended in the National Guidelines for Disease Prevention. Research on the mechanisms underlying motivational interviewing is growing, but research on motivational interviewing and disease prevention has also been called for. A descriptive and predictive design was used. Data were collected during 2011-2014. Fifty audio-recorded motivational interviewing sessions between 23 primary care nurses and 50 patients were analysed using Motivational Interviewing Sequential Code for Observing Process Exchanges. The frequency of specific kinds of talk and sequential analysis (to predict patient talk from nurse talk) were computed using the software Generalized Sequential Querier 5. The primary care nurses and patients used neutral talk most frequently. Open and negative questions, complex and positive reflections were significantly more likely to be followed by change talk, and motivational interviewing-inconsistent talk, positive questions and negative reflections by sustain talk. To increase patients' change talk, primary care nurses need to use more open questions, complex reflections and questions and reflections directed towards change. © 2016 John Wiley & Sons Ltd.
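The sequential analysis described above amounts to estimating lag-1 transition probabilities between coded utterances, i.e., how often a given kind of nurse talk is immediately followed by patient change talk or sustain talk. A minimal sketch of that computation, with an invented coded sequence (the codes and counts are hypothetical, not the study's data):

```python
from collections import Counter

# Hypothetical utterance codes: nurse codes OQ (open question),
# CR (complex reflection), NT (neutral talk), MIIN (MI-inconsistent);
# patient codes CT (change talk), ST (sustain talk).
sequence = ["OQ", "CT", "NT", "ST", "CR", "CT", "OQ", "CT", "MIIN", "ST"]
nurse_codes = {"OQ", "CR", "NT", "MIIN"}

# Count lag-1 transitions that start with a nurse utterance.
pairs = Counter(
    (a, b) for a, b in zip(sequence, sequence[1:]) if a in nurse_codes
)

# Conditional probability of patient change talk given an open question.
oq_total = sum(n for (a, _), n in pairs.items() if a == "OQ")
p_ct_given_oq = pairs[("OQ", "CT")] / oq_total
```

Tools such as GSEQ perform this kind of tabulation (plus significance testing of the transitions) over full session transcripts.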
NASA Astrophysics Data System (ADS)
Frame, Michael; Cohen, Nathan
2015-03-01
The Yale University mathematics department hosted a memorial for Benoit on April 29 and 30, 2011. The first day of the meeting consisted of three technical talks on some aspects of fractals, Benoit's principal intellectual legacy. Bernard Sapoval spoke on fractals in physics, Peter Jones on fractals in mathematics, and Nassim Taleb on fractals in finance...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stelson, P.H.
The bulk of the Division's effort concerned nuclear physics and accelerator development, but work in the areas of nuclear data, research applicable to the magnetic fusion project, atomic and molecular physics, and high-energy physics is also recounted. Lists of publications, technical talks, personnel, etc., are included. Individual reports with sufficient data are abstracted separately. (RWR)
The Role of Gestures in a Teacher-Student-Discourse about Atoms
ERIC Educational Resources Information Center
Abels, Simone
2016-01-01
Recent educational research emphasises the importance of analysing talk and gestures to come to an understanding about students' conceptual learning. Gestures are perceived as complex hand movements being equivalent to other language modes. They can convey experienceable as well as abstract concepts. As well as technical language, gestures…
ERIC Educational Resources Information Center
Tremlett, Lewis
1976-01-01
Presents an overview of the relation of nuclear power to human health and the environment, and discusses the advantages and disadvantages of nuclear power as an energy source urging technical educators to inculcate an awareness of the problems associated with the production of energy. Describes the fission reaction process, the hazards of…
Current Investments in the NASA Entry Systems Modeling Project
NASA Technical Reports Server (NTRS)
Wright, Michael; Barnhardt, Michael; Hughes, Monica
2017-01-01
This talk will provide an overview of investments in the Entry Systems Modeling project, along with some context of where the effort sits in the overall Space Technology EDL Portfolio. Technical highlights, particularly with reference to work on Ablation Modeling, will be given. Future directions will be discussed.
This presentation is for a webinar sponsored by the Society of Wetland Scientists. It is tailored to a technical audience with research interests in wetland ecology and management. The talk will introduce the National Aquatic Resource Surveys and then transition to a discussion...
Talk the Talk: Learner-Generated Podcasts as Catalysts for Knowledge Creation
ERIC Educational Resources Information Center
Lee, Mark J. W.; McLoughlin, Catherine; Chan, Anthony
2008-01-01
Podcasting allows audio content from one or more user-selected feeds or channels to be automatically downloaded to one's computer as it becomes available, then later transferred to a portable player for consumption at a convenient time and place. It is enjoying phenomenal growth in mainstream society, alongside other Web 2.0 technologies that…
Talking about Code: Integrating Pedagogical Code Reviews into Early Computing Courses
ERIC Educational Resources Information Center
Hundhausen, Christopher D.; Agrawal, Anukrati; Agarwal, Pawan
2013-01-01
Given the increasing importance of soft skills in the computing profession, there is good reason to provide students with more opportunities to learn and practice those skills in undergraduate computing courses. Toward that end, we have developed an active learning approach for computing education called the "Pedagogical Code Review"…
Braving the Waters, or How To Get your Computing "Feet" Wet.
ERIC Educational Resources Information Center
Miller, Pat
1997-01-01
Suggests ways for educators to become computer literate: (1) admit you don't know about computers; (2) be patient with yourself; (3) read about technology; (4) get a home computer; (5) attend training; (6) experiment; (7) ask for assistance; and (8) talk with others about what works for them. (AEF)
Global stability of plane Couette flow beyond the energy stability limit
NASA Astrophysics Data System (ADS)
Fuentes, Federico; Goluskin, David
2017-11-01
This talk will present computations verifying that the laminar state of plane Couette flow is nonlinearly stable to all perturbations. The Reynolds numbers up to which this global stability is verified are larger than those at which stability can be proven by the energy method, which is the typical method for demonstrating nonlinear stability of a fluid flow. This improvement is achieved by constructing Lyapunov functions that are more general than the energy. These functions are not restricted to being quadratic, and they are allowed to depend explicitly on the spectrum of the velocity field in the eigenbasis of the energy stability operator. The optimal choice of such a Lyapunov function is a convex optimization problem, and it can be constructed with computer assistance by solving a semidefinite program. This general method will be described in a companion talk by David Goluskin; the present talk focuses on its application to plane Couette flow.
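The talk's generalized Lyapunov functions are found by semidefinite programming; the classical baseline they generalize is the quadratic Lyapunov function of a stable linear system. As a minimal illustrative sketch (not the talk's construction, and for a made-up linear system rather than a fluid flow), one can solve the Lyapunov equation A^T P + P A = -Q and check that P is positive definite, certifying that V(x) = x^T P x decays along trajectories:

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Illustrative stable linear system dx/dt = A x (matrix invented for the sketch).
A = np.array([[-1.0, 2.0],
              [0.0, -3.0]])
Q = np.eye(2)

# Solve A^T P + P A = -Q.  A symmetric positive-definite solution P certifies
# that V(x) = x^T P x is a quadratic Lyapunov function for this system.
P = solve_continuous_lyapunov(A.T, -Q)

eigs = np.linalg.eigvalsh(P)           # all positive => P is positive definite
residual = A.T @ P + P @ A + Q         # should vanish if the equation is solved
```

The SDP approach in the talk replaces the fixed quadratic form with a richer parameterized family and lets the solver search for the best certificate, which is what pushes the verified stability threshold beyond the energy method.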
High-End Computing for Incompressible Flows
NASA Technical Reports Server (NTRS)
Kwak, Dochan; Kiris, Cetin
2001-01-01
The objective of the First MIT Conference on Computational Fluid and Solid Mechanics (June 12-14, 2001) is to bring together industry and academia (and government) to nurture the next generation in computational mechanics. The objective of the current talk, 'High-End Computing for Incompressible Flows', is to discuss some of the current issues in large scale computing for mission-oriented tasks.
"I Can Live With This" - Talking To Communities About Natural Hazard Risk: A Story From New Zealand
NASA Astrophysics Data System (ADS)
Saunders, W. S. A.; Kilvington, M.; Van Dissen, R. J.
2016-12-01
Talking to people about a risk they might face in the future as a result of decisions they make in the present is notoriously hard; even when the consequences are quite apparent. Talking to entire communities about the risks of natural hazard events can seem almost impossible. However, the world we live in is changing, and talking to communities about likely hazard events such as greater and more dramatic storm and flood events, sea level rise, or coastal erosion is something that local government agencies have to do more and more. Moreover, as well as communicating what is known about the science of natural hazards, local governments have to talk about the possible impacts and the risk. One of the most important questions local agencies and communities face is "what can we live with and what must we do something about?" In this presentation we share how one local government agency in New Zealand took on the challenge of talking to their community about planning for future land use that takes account of natural hazard risk. They used an innovative process that helped people understand complex risk concepts; reflect on both the consequences and likelihood of hazard events; and then consider the implications for themselves and their community. This process engaged the public imagination and produced a robust response that could be evaluated alongside technical input on risk thresholds and integrated into final (statutory and defendable) land use policy decisions.
Computer Integration into the Early Childhood Curriculum
ERIC Educational Resources Information Center
Mohammad, Mona; Mohammad, Heyam
2012-01-01
Navin and Mark are playing at the computer in their preschool classroom. Like the rest of their classmates, these four-year-old children fearlessly experiment with computer as they navigate through the art program they are using. As they draw and paint on the computer screen, Mark and Navin talk about their creation. "Let's try the stamps" insists…
A Study of the Braille and Talking Book Program in Ohio. Final Report.
ERIC Educational Resources Information Center
Wessells, Michael B.; And Others
This study evaluates user satisfaction and the cost effectiveness of the use of computer systems in the Braille and Talking Book Program of the Ohio Regional Libraries for the Blind and Physically Handicapped, and makes recommendations for patterns of service and funding. This report provides an executive summary as well as a 3-part presentation…
A prototype Knowledge-Based System to Aid Space System Restoration Management.
1986-12-01
Contents (excerpt): Appendix B, Computation of Weights With AHP; Appendix C, ART Code; Appendix D, Test Outputs; Figure 5.1, Earth Coverage With Geosynchronous Satellites; Figure 5.2, Space System Configurations; Figure 5.3, AHP Hierarchy; Figure 5.4, AHP Hierarchy With Weights; Figure 6.1, TALK Schema Structure; Figure 6.2, ART Code for TALK Satellite C
"Computer Science Can Feed a Lot of Dreams"
ERIC Educational Resources Information Center
Educational Horizons, 2014
2014-01-01
Pat Yongpradit is the director of education at Code.org. He leads all education efforts, including professional development and curriculum creation, and he builds relationships with school districts. Pat joined "Educational Horizons" to talk about why it is important to teach computer science--even for non-computer science teachers. This…
Computational toxicity in 21st century safety sciences (China talk - Fuzhou China)
presentation at the Joint Meeting of Analytical Toxicology and Computational Toxicology Committee (Chinese Society of Toxicology) International Workshop on Advanced Chemical Safety Assessment Technologies on 11 May 2016, Fuzhou University, Fuzhou China
48 CFR 252.227-7026 - Deferred delivery of technical data or computer software.
Code of Federal Regulations, 2010 CFR
2010-10-01
... technical data or computer software. 252.227-7026 Section 252.227-7026 Federal Acquisition Regulations... data or computer software. As prescribed at 227.7103-8(a), use the following clause: Deferred Delivery of Technical Data or Computer Software (APR 1988) The Government shall have the right to require, at...
48 CFR 252.227-7026 - Deferred delivery of technical data or computer software.
Code of Federal Regulations, 2012 CFR
2012-10-01
... technical data or computer software. 252.227-7026 Section 252.227-7026 Federal Acquisition Regulations... data or computer software. As prescribed at 227.7103-8(a), use the following clause: Deferred Delivery of Technical Data or Computer Software (APR 1988) The Government shall have the right to require, at...
48 CFR 252.227-7026 - Deferred delivery of technical data or computer software.
Code of Federal Regulations, 2014 CFR
2014-10-01
... technical data or computer software. 252.227-7026 Section 252.227-7026 Federal Acquisition Regulations... data or computer software. As prescribed at 227.7103-8(a), use the following clause: Deferred Delivery of Technical Data or Computer Software (APR 1988) The Government shall have the right to require, at...
48 CFR 252.227-7026 - Deferred delivery of technical data or computer software.
Code of Federal Regulations, 2011 CFR
2011-10-01
... technical data or computer software. 252.227-7026 Section 252.227-7026 Federal Acquisition Regulations... data or computer software. As prescribed at 227.7103-8(a), use the following clause: Deferred Delivery of Technical Data or Computer Software (APR 1988) The Government shall have the right to require, at...
48 CFR 252.227-7026 - Deferred delivery of technical data or computer software.
Code of Federal Regulations, 2013 CFR
2013-10-01
... technical data or computer software. 252.227-7026 Section 252.227-7026 Federal Acquisition Regulations... data or computer software. As prescribed at 227.7103-8(a), use the following clause: Deferred Delivery of Technical Data or Computer Software (APR 1988) The Government shall have the right to require, at...
On Establishing Big Data Wave Breakwaters with Analytics (Invited)
NASA Astrophysics Data System (ADS)
Riedel, M.
2013-12-01
The Research Data Alliance Big Data Analytics (RDA-BDA) Interest Group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs of utilizing large quantities of data. RDA-BDA seeks to analyze different scientific domain applications and their potential use of various big data analytics techniques. A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. These combinations are complex, since a wide variety of different data analysis algorithms exist (e.g. specific algorithms using GPUs for analyzing brain images) that need to work together with multiple analytical tools ranging from simple (iterative) map-reduce methods (e.g. with Apache Hadoop or Twister) to sophisticated higher-level frameworks that leverage machine learning algorithms (e.g. Apache Mahout). These computational analysis techniques are often augmented with visual analytics techniques (e.g. computational steering on large-scale high performance computing platforms) to put the human judgement into the analysis loop, or with new approaches to databases that are designed to support new forms of unstructured or semi-structured data as opposed to the rather traditional structured databases (e.g. relational databases). More recently, data analysis and the underpinning analytics frameworks also have to consider the energy footprints of the underlying resources. To sum up, the aim of this talk is to provide pieces of information to understand big data analytics in the context of science and engineering, using the aforementioned classification as the lighthouse and as the frame of reference for a systematic approach.
This talk will provide insights about big data analytics methods in the context of science within various communities, and offer different views of how approaches based on correlation and causality provide complementary methods to advance science and engineering today. The RDA Big Data Analytics Group seeks to understand which approaches are not only technically feasible, but also scientifically feasible. The lighthouse goal of the RDA Big Data Analytics Group is a classification of clever combinations of various technologies and scientific applications, in order to provide clear recommendations to the scientific community about which approaches are technically and scientifically feasible.
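The "simple (iterative) map-reduce methods" the abstract mentions follow a map / shuffle / reduce pattern that can be sketched in-process, without Hadoop or Twister. The word-count example below is the standard textbook illustration of that pattern, not an RDA-BDA artifact:

```python
from collections import defaultdict

def map_phase(records):
    """Map: emit (key, value) pairs; here, (word, 1) for each word."""
    for record in records:
        for word in record.split():
            yield word.lower(), 1

def shuffle(pairs):
    """Shuffle: group all emitted values by key (what the framework does
    between the map and reduce stages)."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: fold each key's values into a result; here, a sum per word."""
    return {key: sum(values) for key, values in groups.items()}

docs = ["big data analytics", "big data", "data"]
counts = reduce_phase(shuffle(map_phase(docs)))
```

In a real framework the map and reduce functions are the user-supplied parts, while partitioning, shuffling and fault tolerance are handled by the system; iterative variants simply feed the reduce output back into another map round.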
Cultivating Our Own: Helping Dreams Become a Reality
ERIC Educational Resources Information Center
Decken, Grace
2012-01-01
For many students, participating in a Career and Technical Student Organization empowers them with knowledge and skills they find invaluable in college and careers. In this article, the author talks about some of her student success stories and their participation in Health Occupations Students of America. She begins by discussing the role of…
Education for the Other Side of Gaming
ERIC Educational Resources Information Center
Stone, Michael R.
2006-01-01
In this article, the author talks about gambling and how career and technical education can play a role in gaming education. While the growth of gambling fuels the economy, it can also fuel hidden addiction. Identified by the American Psychiatric Association in its Diagnostic and Statistical Manual of the Mental Disorders as pathological gambling,…
What Will It Take to Establish Technology/Engineering Education for All Students?
ERIC Educational Resources Information Center
Technology Teacher, 2008
2008-01-01
The growing national awareness of the importance of technical skills for "all students" opens up a tremendous opportunity for today's technology teachers. In this interview, Cary Sneider of the National Center for Technological Literacy at the Museum of Science, Boston, talks about how technology educators can establish clear standards…
Behavioral Talk-Write as a Method for Teaching Technical Editing.
ERIC Educational Resources Information Center
Gilbertsen, Michael; Killingsworth, M. Jimmie
1987-01-01
Presents a process-oriented method for teachers of stylistic editing workshops that allows them to (1) focus on individual students, (2) start with students' basic repertory of responses and build from there, (3) work with freely emitted behavior, (4) ensure frequent and brief responses, and (5) achieve desired behavior through sequential steps.…
Kodak AMSD Concept Overview and Status (Semi-Rigid Mirror with Sparse Actuators)
NASA Technical Reports Server (NTRS)
Matthews, Gary; Maji, Arup K. (Technical Monitor)
2001-01-01
This talk will review Kodak's current AMSD technical and schedule status. For AMSD, Kodak is fabricating a semi-rigid closed-back egg-crate glass mirror, a graphite composite reaction structure, and 16 force actuators for figure control. The mirror is currently on schedule for cryotesting in early '02.
Simulation As a Tool in Education Research and Development. A Technical Paper. EdTalk.
ERIC Educational Resources Information Center
Hood, Paul
This document introduces simulation as a field of endeavor that has great potential for education research, development, and training. Simulation allows education developers to explore, develop, and test new educational programs and practices before communities, educators, and students are asked to participate in them. Simulation technologies…
A Modest Proposal for Improving the Education of Reading Teachers. Technical Report No. 487.
ERIC Educational Resources Information Center
Anderson, Richard C.; And Others
A gap exists between talk about teaching that is featured in most preservice teacher education and the working knowledge and problem-solving expertise that characterize skilled teaching. This gap exists because typical teacher training does not embody the principles of modeling, coaching, scaffolding, articulation, and reflection. Three methods…
Astrophysical Computation in Research, the Classroom and Beyond
NASA Astrophysics Data System (ADS)
Frank, Adam
2009-03-01
In this talk I review progress in the use of simulations as a tool for astronomical research, for education and public outreach. The talk will include the basic elements of numerical simulations as well as advances in algorithms which have led to recent dramatic progress such as the use of Adaptive Mesh Refinement methods. The scientific focus of the talk will be star formation jets and outflows while the educational emphasis will be on the use of advanced platforms for simulation based learning in lecture and integrated homework. Learning modules for science outreach websites such as DISCOVER magazine will also be highlighted.
Code of Federal Regulations, 2013 CFR
2013-01-01
..., DEPARTMENT OF AGRICULTURE GUIDELINES FOR THE TRANSFER OF EXCESS COMPUTERS OR OTHER TECHNICAL EQUIPMENT..., in writing, an authorized official to approve transfers of excess computers or other technical...) Excess computers or other technical equipment must first be internally screened to ensure it is not...
Code of Federal Regulations, 2014 CFR
2014-01-01
..., DEPARTMENT OF AGRICULTURE GUIDELINES FOR THE TRANSFER OF EXCESS COMPUTERS OR OTHER TECHNICAL EQUIPMENT..., in writing, an authorized official to approve transfers of excess computers or other technical...) Excess computers or other technical equipment must first be internally screened to ensure it is not...
Yost, Kathleen J; Webster, Kimberly; Baker, David W; Choi, Seung W; Bode, Rita K; Hahn, Elizabeth A
2009-06-01
Current health literacy measures are too long, imprecise, or have questionable equivalence of English and Spanish versions. The purpose of this paper is to describe the development and pilot testing of a new bilingual computer-based health literacy assessment tool. We analyzed literacy data from three large studies. Using a working definition of health literacy, we developed new prose, document and quantitative items in English and Spanish. Items were pilot tested on 97 English- and 134 Spanish-speaking participants to assess item difficulty. Items covered topics relevant to primary care patients and providers. English- and Spanish-speaking participants understood the tasks involved in answering each type of question. The English Talking Touchscreen was easy to use and the English and Spanish items provided good coverage of the difficulty continuum. Qualitative and quantitative results provided useful information on computer acceptability and initial item difficulty. After the items have been administered on the Talking Touchscreen (la Pantalla Parlanchina) to 600 English-speaking (and 600 Spanish-speaking) primary care patients, we will develop a computer adaptive test. This health literacy tool will enable clinicians and researchers to more precisely determine the level at which low health literacy adversely affects health and healthcare utilization.
2017-09-06
WASHINGTON, D.C. - S&T Partnership Forum In-Space Assembly Technical Interchange Meeting. On September 6, 2017, many of the United States government's experts on in-space assembly met at the U.S. Naval Research Lab to discuss both technology development and in-space applications that would advance national capabilities in this area. Experts from NASA, USAF, NRO, DARPA, and NRL attended the meeting, which was coordinated by the NASA Headquarters Office of the Chief Technologist. This technical interchange meeting was the second meeting of the members of this Science and Technology Partnership Forum. Glen Henshaw of Code 8231 talks to the group in the Space Robotics Lab.
None
2018-01-24
The finance sector is one of the driving forces for the use of distributed or Grid computing for business purposes. The speakers will review the state of the art of high performance computing in the financial sector, and provide insight into how different types of Grid computing, from local clusters to global networks, are being applied to financial applications. They will also describe the use of software and techniques from physics, such as Monte Carlo simulations, in the financial world. There will be four talks of 20 minutes each; the talk abstracts and speaker bios are listed below. This will be followed by a Q&A panel session with the speakers. From 19:00 onwards there will be a networking cocktail for audience and speakers. This is an EGEE / CERN openlab event organized in collaboration with the regional business network rezonance.ch. A webcast of the event will be made available for subsequent viewing, along with the PowerPoint material presented by the speakers. Attendance is free and open to all. Registration is mandatory via www.rezonance.ch, including for CERN staff. 1. Overview of High Performance Computing in the Financial Industry. Michael Yoo, Managing Director, Head of the Technical Council, UBS. The presentation will describe the key business challenges driving the need for HPC solutions, describe the means by which those challenges are being addressed within UBS (such as GRID) as well as the limitations of some of these solutions, and assess some of the newer HPC technologies which may also play a role in the financial industry in the future. Speaker Bio: Michael originally joined the former Swiss Bank Corporation in 1994 in New York as a developer on a large data warehouse project. In 1996 he left SBC and took a role with Fidelity Investments in Boston. Unable to stay away for long, he returned to SBC in 1997 while working for Perot Systems in Singapore.
Finally, in 1998 he formally returned to UBS in Stamford following the merger with SBC and has remained with UBS for the past 9 years. During his tenure at UBS, he has had a number of leadership roles within IT in development, support and architecture. In 2006 Michael relocated to Switzerland to take up his current role as head of the UBS IB Technical Council, responsible for the overall technology strategy and vision of the Investment Bank. One of Michael's key responsibilities is to manage the UBS High Performance Computing Research Lab, and he has been involved in a number of initiatives in the HPC space. 2. Grid in the Commercial World. Fred Gedling, Chief Technology Officer EMEA and Senior Vice President Global Services, DataSynapse. Grid computing gets mentions in the press for community programs, starting last decade with Seti@Home. Government, national and supranational initiatives in grid receive some press. One of the IT industry's best-kept secrets is the use of grid computing by commercial organizations, with spectacular results. The talk discusses Grid computing and its evolution into application virtualization, and how this is key to the next-generation data center. Speaker Bio: Fred Gedling holds the joint roles of Chief Technology Officer for EMEA and Senior Vice President of Global Services at DataSynapse, a global provider of application virtualisation software. Based in London and working closely with organisations seeking to optimise their IT infrastructures, Fred offers unique insights into the technology of virtualisation as well as the methodology of establishing ROI and rapid deployment to the immediate advantage of the business. Fred has more than fifteen years' experience of enterprise middleware and high-performance infrastructures. Prior to DataSynapse he worked in high performance CRM middleware and was the CTO EMEA for New Era of Networks (NEON) during the rapid growth of Enterprise Application Integration.
His 25-year career in technology also includes management positions at Goldman Sachs and Stratus Computer. Fred holds a First Class BSc (Hons) degree in Physics with Astrophysics from the University of Leeds and had the privilege of being a summer student at CERN. 3. Opportunities for gLite in Finance and Related Industries. Adam Vile, Head of Grid, HPC and Technical Computing, Excelian Ltd. gLite, the Grid software developed by the EGEE project, has been exceedingly successful as an enabling infrastructure, and has been a massive success in bringing together scientific and technical communities to provide the compute power to address previously incomputable problems. Not so in the finance industry. In its current form gLite would be a business disabler. There are other middleware tools that solve the finance community's compute problems much better. Things are moving on, however. There are moves afoot in the open source community to evolve the technology to address other, more sophisticated needs such as utility and interactive computing. In this talk, I will describe how Excelian is providing Grid consultancy services for the finance community and how, through its relationship with the EGEE project, Excelian is helping to identify and exploit opportunities as the research and business worlds converge. Because of the strong third-party presence in the finance industry, such opportunities are few and far between, but they are there, especially as we expand sideways into related verticals such as the smaller hedge funds and energy companies. This talk will give an overview of the barriers to adoption of gLite in the finance industry and highlight some of the opportunities offered in this and related industries as the ideas around Grid mature. Speaker Bio: Dr Adam Vile is a senior consultant and head of the Grid and HPC practice at Excelian, a consultancy that focuses on financial markets professional services.
He has spent many years in investment banking, as a developer, project manager and architect in both front and back office. Before joining Excelian he was senior Grid and HPC architect at Barclays Capital. Prior to joining investment banking, Adam spent a number of years lecturing in IT and mathematics at a UK university, and maintains links with academia through lectures, research, and validation and steering of postgraduate courses. He is a chartered mathematician and was the conference chair of the Institute of Mathematics and its Applications' first conference in computational finance. 4. From Monte Carlo to Wall Street. Daniel Egloff, Head of Financial Engineering Computing Unit, Zürich Cantonal Bank. High performance computing techniques provide new means to solve computationally hard problems in the financial service industry. First I consider Monte Carlo simulation and illustrate how it can be used to implement a sophisticated credit risk management and economic capital framework. From an HPC perspective, basic Monte Carlo simulation is embarrassingly parallel and can be implemented efficiently on distributed memory clusters. Additional difficulties arise for adaptive variance reduction schemes, if the information content in a sample is very small, and if the amount of simulated data becomes so huge that incremental processing algorithms are indispensable. We discuss the business value of an advanced credit risk quantification, which is particularly compelling in these days. While Monte Carlo simulation is a very versatile tool, it is not always the preferred solution for the pricing of complex products like multi-asset options, structured products, or credit derivatives. As a second application I show how operator methods can be used to develop a pricing framework.
The scalability of operator methods relies heavily on optimized dense matrix-matrix multiplications and requires specialized BLAS level-3 implementations, such as those provided by FPGA or GPU boards. Speaker Bio: Daniel Egloff studied mathematics, theoretical physics, and computer science at the University of Zurich and the ETH Zurich. He holds a PhD in Mathematics from the University of Fribourg, Switzerland. After his PhD he started to work for a large Swiss insurance company in the area of asset and liability management. He continued his professional career in the consulting industry. At KPMG and Arthur Andersen he consulted for international clients and implemented quantitative risk management solutions for financial institutions and insurance companies. In 2002 he joined Zurich Cantonal Bank, where he was assigned to develop and implement credit portfolio risk and economic capital methodologies. He built up a competence center for high performance and cluster computing. Currently, Daniel Egloff is heading the Financial Computing unit in the ZKB Financial Engineering division. He and his team engineer and operate high performance cluster applications for computationally intensive problems in financial risk management.
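The "embarrassingly parallel" character of plain Monte Carlo that the final talk describes is easy to see in a minimal sketch. The example below prices a European call under geometric Brownian motion, the standard textbook case, not ZKB's credit risk framework; all parameters are invented for illustration. Because every path is independent, batches of paths can be simulated on separate cluster nodes and their averages combined:

```python
import numpy as np

def mc_call_price(s0, strike, rate, vol, maturity, n_paths, seed=0):
    """Monte Carlo price of a European call under geometric Brownian motion.

    Each simulated path is independent, which is why basic Monte Carlo
    parallelizes trivially: price disjoint batches of paths on separate
    nodes, then average the batch estimates.
    """
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_paths)
    # Terminal stock price under risk-neutral GBM dynamics.
    st = s0 * np.exp((rate - 0.5 * vol**2) * maturity
                     + vol * np.sqrt(maturity) * z)
    payoff = np.maximum(st - strike, 0.0)
    # Discounted expected payoff.
    return np.exp(-rate * maturity) * payoff.mean()

# Illustrative at-the-money parameters (made up for the sketch).
price = mc_call_price(s0=100.0, strike=100.0, rate=0.05, vol=0.2,
                      maturity=1.0, n_paths=200_000)
```

With these parameters the estimate should land close to the Black-Scholes value of roughly 10.45; the variance reduction and incremental-processing difficulties mentioned in the abstract arise once the plain estimator above is no longer efficient enough.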
None
2018-06-20
The finance sector is one of the driving forces for the use of distributed or Grid computing for business purposes. The speakers will review the state-of-the-art of high performance computing in the financial sector, and provide insight into how different types of Grid computing â from local clusters to global networks - are being applied to financial applications. They will also describe the use of software and techniques from physics, such as Monte Carlo simulations, in the financial world. There will be four talks of 20min each. The talk abstracts and speaker bios are listed below. This will be followed by a Q&A; panel session with the speakers. From 19:00 onwards there will be a networking cocktail for audience and speakers. This is an EGEE / CERN openlab event organized in collaboration with the regional business network rezonance.ch. A webcast of the event will be made available for subsequent viewing, along with powerpoint material presented by the speakers. Attendance is free and open to all. Registration is mandatory via www.rezonance.ch, including for CERN staff. 1. Overview of High Performance Computing in the Financial Industry. Michael Yoo, Managing Director, Head of the Technical Council, UBS. Presentation will describe the key business challenges driving the need for HPC solutions, describe the means in which those challenges are being addressed within UBS (such as GRID) as well as the limitations of some of these solutions, and assess some of the newer HPC technologies which may also play a role in the Financial Industry in the future. Speaker Bio: Michael originally joined the former Swiss Bank Corporation in 1994 in New York as a developer on a large data warehouse project. In 1996 he left SBC and took a role with Fidelity Investments in Boston. Unable to stay away for long, he returned to SBC in 1997 while working for Perot Systems in Singapore. 
Finally, in 1998 he formally returned to UBS in Stamford following the merger with SBC and has remained with UBS for the past 9 years. During his tenure at UBS, he has had a number of leadership roles within IT in development, support and architecture. In 2006 Michael relocated to Switzerland to take up his current role as head of the UBS IB Technical Council, responsible for the overall technology strategy and vision of the Investment Bank. One of Michael's key responsibilities is to manage the UBS High Performance Computing Research Lab, and he has been involved in a number of initiatives in the HPC space. 2. Grid in the Commercial World. Fred Gedling, Chief Technology Officer EMEA and Senior Vice President Global Services, DataSynapse. Grid computing gets mentions in the press for community programs, starting last decade with "Seti@Home", and government, national and supranational initiatives in grid also receive some press. One of the IT industry's best-kept secrets, however, is the use of grid computing by commercial organizations, with spectacular results. The talk discusses Grid Computing and its evolution into Application Virtualization, and how this is key to the next-generation data center. Speaker Bio: Fred Gedling holds the joint roles of Chief Technology Officer for EMEA and Senior Vice President of Global Services at DataSynapse, a global provider of application virtualisation software. Based in London and working closely with organisations seeking to optimise their IT infrastructures, Fred offers unique insights into the technology of virtualisation as well as the methodology of establishing ROI and rapid deployment to the immediate advantage of the business. Fred has more than fifteen years' experience of enterprise middleware and high-performance infrastructures. Prior to DataSynapse he worked in high performance CRM middleware and was CTO EMEA for New Era of Networks (NEON) during the rapid growth of Enterprise Application Integration.
His 25-year career in technology also includes management positions at Goldman Sachs and Stratus Computer. Fred holds a First Class BSc (Hons) degree in Physics with Astrophysics from the University of Leeds and had the privilege of being a summer student at CERN. 3. Opportunities for gLite in finance and related industries. Adam Vile, Head of Grid, HPC and Technical Computing, Excelian Ltd. gLite, the Grid software developed by the EGEE project, has been exceedingly successful as an enabling infrastructure, and has been a massive success in bringing together scientific and technical communities to provide the compute power to address previously incomputable problems. Not so in the finance industry. In its current form gLite would be a business disabler: there are other middleware tools that solve the finance community's compute problems much better. Things are moving on, however. There are moves afoot in the open source community to evolve the technology to address other, more sophisticated needs such as utility and interactive computing. In this talk, I will describe how Excelian is providing Grid consultancy services for the finance community and how, through its relationship with the EGEE project, Excelian is helping to identify and exploit opportunities as the research and business worlds converge. Because of the strong third-party presence in the finance industry, such opportunities are few and far between, but they are there, especially as we expand sideways into related verticals such as the smaller hedge funds and energy companies. This talk will give an overview of the barriers to adoption of gLite in the finance industry and highlight some of the opportunities offered in this and related industries as the ideas around Grid mature. Speaker Bio: Dr Adam Vile is a senior consultant and head of the Grid and HPC practice at Excelian, a consultancy that focuses on financial markets professional services.
He has spent many years in investment banking, as a developer, project manager and architect in both front and back office. Before joining Excelian he was senior Grid and HPC architect at Barclays Capital. Prior to joining investment banking, Adam spent a number of years lecturing in IT and mathematics at a UK university, and maintains links with academia through lectures, research, and the validation and steering of postgraduate courses. He is a chartered mathematician and was the conference chair of the Institute of Mathematics and its Applications' first conference in computational finance. 4. From Monte Carlo to Wall Street. Daniel Egloff, Head of Financial Engineering Computing Unit, Zürich Cantonal Bank. High performance computing techniques provide new means to solve computationally hard problems in the financial service industry. First I consider Monte Carlo simulation and illustrate how it can be used to implement a sophisticated credit risk management and economic capital framework. From an HPC perspective, basic Monte Carlo simulation is embarrassingly parallel and can be implemented efficiently on distributed memory clusters. Additional difficulties arise for adaptive variance reduction schemes, if the information content in a sample is very small, and if the amount of simulated data becomes so huge that incremental processing algorithms are indispensable. We discuss the business value of advanced credit risk quantification, which is particularly compelling these days. While Monte Carlo simulation is a very versatile tool, it is not always the preferred solution for the pricing of complex products like multi-asset options, structured products, or credit derivatives. As a second application I show how operator methods can be used to develop a pricing framework.
The scalability of operator methods relies heavily on optimized dense matrix-matrix multiplications and requires specialized BLAS level-3 implementations, such as those provided by FPGA or GPU boards. Speaker Bio: Daniel Egloff studied mathematics, theoretical physics, and computer science at the University of Zurich and ETH Zurich. He holds a PhD in Mathematics from the University of Fribourg, Switzerland. After his PhD he worked for a large Swiss insurance company in the area of asset and liability management, and then continued his professional career in the consulting industry: at KPMG and Arthur Andersen he advised international clients and implemented quantitative risk management solutions for financial institutions and insurance companies. In 2002 he joined Zurich Cantonal Bank, where he developed and implemented credit portfolio risk and economic capital methodologies and built up a competence center for high performance and cluster computing. Currently, Daniel Egloff heads the Financial Computing unit in the ZKB Financial Engineering division. He and his team engineer and operate high performance cluster applications for computationally intensive problems in financial risk management.
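The abstract's claim that basic Monte Carlo simulation is embarrassingly parallel can be sketched in a few lines. This is a toy illustration under invented assumptions (a hypothetical 100-obligor portfolio with a uniform 2% default probability; none of the names or figures come from the talk): each chunk of scenarios draws from an independent random stream, so the chunks could run on separate cluster nodes with no communication until the partial results are combined.

```python
import numpy as np

def simulate_losses(seed, n_scenarios, exposures, pd_default):
    # Each chunk gets its own independent RNG stream, so chunks are
    # fully independent -- the "embarrassingly parallel" property.
    rng = np.random.default_rng(seed)
    defaults = rng.random((n_scenarios, len(exposures))) < pd_default
    return defaults.astype(float) @ exposures   # portfolio loss per scenario

# Hypothetical portfolio: 100 obligors, unit exposure, 2% default probability.
exposures = np.full(100, 1.0)
pd_default = 0.02

# Split the simulation into independent chunks (one per "node").
seeds = np.random.SeedSequence(42).spawn(4)
chunks = [simulate_losses(s, 25_000, exposures, pd_default) for s in seeds]

losses = np.concatenate(chunks)            # combine the partial results
print(losses.mean())                       # close to expected loss 100 * 0.02 = 2
print(np.quantile(losses, 0.999))          # tail quantile, as for economic capital
```

The adaptive variance reduction and incremental processing caveats in the abstract are exactly what this sketch omits: they introduce dependencies between chunks and break the trivial split-and-concatenate pattern shown here.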
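The closing remark about dense matrix-matrix multiplications can likewise be illustrated with a minimal sketch, again under invented assumptions (a random toy matrix stands in for a real discretized pricing operator): propagating a whole batch of payoff vectors through a time step at once turns the work into a single dense GEMM, the BLAS level-3 kernel that optimized CPU and GPU libraries accelerate.

```python
import numpy as np

# Hypothetical discretized pricing operator on an n-point state grid.
# One propagation step maps a batch of payoff vectors to their values one
# time step earlier; batching the vectors makes each step one matrix-matrix
# multiply (BLAS level-3 GEMM) instead of many matrix-vector products.
n, n_products, n_steps = 200, 64, 50
rng = np.random.default_rng(0)

A = np.eye(n) + 0.01 * rng.standard_normal((n, n)) / np.sqrt(n)  # toy operator
V = rng.random((n, n_products))                                  # terminal payoffs

for _ in range(n_steps):
    V = A @ V    # dispatched to the underlying BLAS gemm routine

print(V.shape)   # values of all 64 products on the whole 200-point grid
```

Grouping the products this way is why the scalability of such methods hinges on the quality of the level-3 BLAS implementation: almost all the floating-point work lands in that one kernel.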
None
2018-01-25
The finance sector is one of the driving forces for the use of distributed or Grid computing for business purposes. The speakers will review the state-of-the-art of high performance computing in the financial sector, and provide insight into how different types of Grid computing â from local clusters to global networks - are being applied to financial applications. They will also describe the use of software and techniques from physics, such as Monte Carlo simulations, in the financial world. There will be four talks of 20min each. The talk abstracts and speaker bios are listed below. This will be followed by a Q&A; panel session with the speakers. From 19:00 onwards there will be a networking cocktail for audience and speakers. This is an EGEE / CERN openlab event organized in collaboration with the regional business network rezonance.ch. A webcast of the event will be made available for subsequent viewing, along with powerpoint material presented by the speakers. Attendance is free and open to all. Registration is mandatory via www.rezonance.ch, including for CERN staff. 1. Overview of High Performance Computing in the Financial Industry Michael Yoo, Managing Director, Head of the Technical Council, UBS Presentation will describe the key business challenges driving the need for HPC solutions, describe the means in which those challenges are being addressed within UBS (such as GRID) as well as the limitations of some of these solutions, and assess some of the newer HPC technologies which may also play a role in the Financial Industry in the future. Speaker Bio: Michael originally joined the former Swiss Bank Corporation in 1994 in New York as a developer on a large data warehouse project. In 1996 he left SBC and took a role with Fidelity Investments in Boston. Unable to stay away for long, he returned to SBC in 1997 while working for Perot Systems in Singapore. 
Finally, in 1998 he formally returned to UBS in Stamford following the merger with SBC and has remained with UBS for the past 9 years. During his tenure at UBS, he has had a number of leadership roles within IT in development, support and architecture. In 2006 Michael relocated to Switzerland to take up his current role as head of the UBS IB Technical Council, responsible for the overall technology strategy and vision of the Investment Bank. One of Michael's key responsibilities is to manage the UBS High Performance Computing Research Lab and he has been involved in a number of initiatives in the HPC space. 2. Grid in the Commercial WorldFred Gedling, Chief Technology Officer EMEA and Senior Vice President Global Services, DataSynapse Grid computing gets mentions in the press for community programs starting last decade with Government, national and supranational initiatives in grid receive some press. One of the IT-industries' best-kept secrets is the use of grid computing by commercial organizations with spectacular results. Grid Computing and its evolution into Application Virtualization is discussed and how this is key to the next generation data center. Speaker Bio: Fred Gedling holds the joint roles of Chief Technology Officer for EMEA and Senior Vice President of Global Services at DataSynapse, a global provider of application virtualisation software. Based in London and working closely with organisations seeking to optimise their IT infrastructures, Fred offers unique insights into the technology of virtualisation as well as the methodology of establishing ROI and rapid deployment to the immediate advantage of the business. Fred has more than fifteen years experience of enterprise middleware and high-performance infrastructures. Prior to DataSynapse he worked in high performance CRM middleware and was the CTO EMEA for New Era of Networks (NEON) during the rapid growth of Enterprise Application Integration. 
His 25-year career in technology also includes management positions at Goldman Sachs and Stratus Computer. Fred holds a First Class Bsc (Hons) degree in Physics with Astrophysics from the University of Leeds and had the privilege of being a summer student at CERN.3. Opportunities for gLite in finance and related industries Adam Vile, Head of Grid, HPC and Technical Computing, Excelian Ltd.gLite, the Grid software developed by the EGEE project, has been exceedingly successful as an enabling infrastructure, and has been a massive success in bringing together scientific and technical communities to provide the compute power to address previously incomputable problems. Not so in the finance industry. In its current form gLite would be a business disabler. There are other middleware tools that solve the finance communities compute problems much better. Things are moving on, however. There are moves afoot in the open source community to evolve the technology to address other, more sophisticated needs such as utility and interactive computing. In this talk, I will describe how Excelian is providing Grid consultancy services for the finance community and how, through its relationship to the EGEE project, Excelian is helping to identify and exploit opportunities as the research and business worlds converge. Because of the strong third party presence in the finance industry, such opportunities are few and far between, but they are there, especially as we expand sideways into related verticals such as the smaller hedge funds and energy companies. This talk will give an overview of the barriers to adoption of gLite in the finance industry and highlight some of the opportunities offered in this and related industries as the ideas around Grid mature. Speaker Bio: Dr Adam Vile is a senior consultant and head of the Grid and HPC practice at Excelian, a consultancy that focuses on financial markets professional services. 
He has spent many years in investment banking, as a developer, project manager and architect in both front and back office. Before joining Excelian he was senior Grid and HPC architect at Barclays Capital. Prior to joining investment banking, Adam spent a number of years lecturing in IT and mathematics at a UK University and maintains links with academia through lectures, research and through validation and steering of postgraduate courses. He is a chartered mathematician and was the conference chair of the Institute of Mathematics and its Applications first conference in computational Finance.4. From Monte Carlo to Wall Street Daniel Egloff, Head of Financial Engineering Computing Unit, Zürich Cantonal Bank High performance computing techniques provide new means to solve computationally hard problems in the financial service industry. First I consider Monte Carlo simulation and illustrate how it can be used to implement a sophisticated credit risk management and economic capital framework. From a HPC perspective, basic Monte Carlo simulation is embarrassingly parallel and can be implemented efficiently on distributed memory clusters. Additional difficulties arise for adaptive variance reduction schemes, if the information content in a sample is very small, and if the amount of simulated date becomes huge such that incremental processing algorithms are indispensable. We discuss the business value of an advanced credit risk quantification which is particularly compelling in these days. While Monte Carlo simulation is a very versatile tool it is not always the preferred solution for the pricing of complex products like multi asset options, structured products, or credit derivatives. As a second application I show how operator methods can be used to develop a pricing framework. 
The scalability of operator methods relies heavily on optimized dense matrix-matrix multiplications and requires specialized BLAS level-3 implementations provided by specialized FPGA or GPU boards. Speaker Bio: Daniel Egloff studied mathematics, theoretical physics, and computer science at the University of Zurich and the ETH Zurich. He holds a PhD in Mathematics from University of Fribourg, Switzerland. After his PhD he started to work for a large Swiss insurance company in the area of asset and liability management. He continued his professional career in the consulting industry. At KPMG and Arthur Andersen he consulted international clients and implemented quantitative risk management solutions for financial institutions and insurance companies. In 2002 he joined Zurich Cantonal Bank. He was assigned to develop and implement credit portfolio risk and economic capital methodologies. He built up a competence center for high performance and cluster computing. Currently, Daniel Egloff is heading the Financial Computing unit in the ZKB Financial Engineering division. He and his team is engineering and operating high performance cluster applications for computationally intensive problems in financial risk management.
None
2018-02-02
The finance sector is one of the driving forces for the use of distributed or Grid computing for business purposes. The speakers will review the state-of-the-art of high performance computing in the financial sector, and provide insight into how different types of Grid computing â from local clusters to global networks - are being applied to financial applications. They will also describe the use of software and techniques from physics, such as Monte Carlo simulations, in the financial world. There will be four talks of 20min each. The talk abstracts and speaker bios are listed below. This will be followed by a Q&A; panel session with the speakers. From 19:00 onwards there will be a networking cocktail for audience and speakers. This is an EGEE / CERN openlab event organized in collaboration with the regional business network rezonance.ch. A webcast of the event will be made available for subsequent viewing, along with powerpoint material presented by the speakers. Attendance is free and open to all. Registration is mandatory via www.rezonance.ch, including for CERN staff. 1. Overview of High Performance Computing in the Financial Industry Michael Yoo, Managing Director, Head of the Technical Council, UBS Presentation will describe the key business challenges driving the need for HPC solutions, describe the means in which those challenges are being addressed within UBS (such as GRID) as well as the limitations of some of these solutions, and assess some of the newer HPC technologies which may also play a role in the Financial Industry in the future. Speaker Bio: Michael originally joined the former Swiss Bank Corporation in 1994 in New York as a developer on a large data warehouse project. In 1996 he left SBC and took a role with Fidelity Investments in Boston. Unable to stay away for long, he returned to SBC in 1997 while working for Perot Systems in Singapore. 
Finally, in 1998 he formally returned to UBS in Stamford following the merger with SBC and has remained with UBS for the past 9 years. During his tenure at UBS, he has had a number of leadership roles within IT in development, support and architecture. In 2006 Michael relocated to Switzerland to take up his current role as head of the UBS IB Technical Council, responsible for the overall technology strategy and vision of the Investment Bank. One of Michael's key responsibilities is to manage the UBS High Performance Computing Research Lab and he has been involved in a number of initiatives in the HPC space. 2. Grid in the Commercial WorldFred Gedling, Chief Technology Officer EMEA and Senior Vice President Global Services, DataSynapse Grid computing gets mentions in the press for community programs starting last decade with Government, national and supranational initiatives in grid receive some press. One of the IT-industries' best-kept secrets is the use of grid computing by commercial organizations with spectacular results. Grid Computing and its evolution into Application Virtualization is discussed and how this is key to the next generation data center. Speaker Bio: Fred Gedling holds the joint roles of Chief Technology Officer for EMEA and Senior Vice President of Global Services at DataSynapse, a global provider of application virtualisation software. Based in London and working closely with organisations seeking to optimise their IT infrastructures, Fred offers unique insights into the technology of virtualisation as well as the methodology of establishing ROI and rapid deployment to the immediate advantage of the business. Fred has more than fifteen years experience of enterprise middleware and high-performance infrastructures. Prior to DataSynapse he worked in high performance CRM middleware and was the CTO EMEA for New Era of Networks (NEON) during the rapid growth of Enterprise Application Integration. 
His 25-year career in technology also includes management positions at Goldman Sachs and Stratus Computer. Fred holds a First Class Bsc (Hons) degree in Physics with Astrophysics from the University of Leeds and had the privilege of being a summer student at CERN.3. Opportunities for gLite in finance and related industriesAdam Vile, Head of Grid, HPC and Technical Computing, Excelian Ltd.gLite, the Grid software developed by the EGEE project, has been exceedingly successful as an enabling infrastructure, and has been a massive success in bringing together scientific and technical communities to provide the compute power to address previously incomputable problems. Not so in the finance industry. In its current form gLite would be a business disabler. There are other middleware tools that solve the finance communities compute problems much better. Things are moving on, however. There are moves afoot in the open source community to evolve the technology to address other, more sophisticated needs such as utility and interactive computing. In this talk, I will describe how Excelian is providing Grid consultancy services for the finance community and how, through its relationship to the EGEE project, Excelian is helping to identify and exploit opportunities as the research and business worlds converge. Because of the strong third party presence in the finance industry, such opportunities are few and far between, but they are there, especially as we expand sideways into related verticals such as the smaller hedge funds and energy companies. This talk will give an overview of the barriers to adoption of gLite in the finance industry and highlight some of the opportunities offered in this and related industries as the ideas around Grid mature. Speaker Bio: Dr Adam Vile is a senior consultant and head of the Grid and HPC practice at Excelian, a consultancy that focuses on financial markets professional services. 
He has spent many years in investment banking, as a developer, project manager and architect in both front and back office. Before joining Excelian he was senior Grid and HPC architect at Barclays Capital. Prior to joining investment banking, Adam spent a number of years lecturing in IT and mathematics at a UK University and maintains links with academia through lectures, research and through validation and steering of postgraduate courses. He is a chartered mathematician and was the conference chair of the Institute of Mathematics and its Applications first conference in computational Finance. 4. From Monte Carlo to Wall Street Daniel Egloff, Head of Financial Engineering Computing Unit, Zürich Cantonal Bank High performance computing techniques provide new means to solve computationally hard problems in the financial service industry. First I consider Monte Carlo simulation and illustrate how it can be used to implement a sophisticated credit risk management and economic capital framework. From a HPC perspective, basic Monte Carlo simulation is embarrassingly parallel and can be implemented efficiently on distributed memory clusters. Additional difficulties arise for adaptive variance reduction schemes, if the information content in a sample is very small, and if the amount of simulated date becomes huge such that incremental processing algorithms are indispensable. We discuss the business value of an advanced credit risk quantification which is particularly compelling in these days. While Monte Carlo simulation is a very versatile tool it is not always the preferred solution for the pricing of complex products like multi asset options, structured products, or credit derivatives. As a second application I show how operator methods can be used to develop a pricing framework. 
The scalability of operator methods relies heavily on optimized dense matrix-matrix multiplications and requires specialized BLAS level-3 implementations provided by specialized FPGA or GPU boards. Speaker Bio: Daniel Egloff studied mathematics, theoretical physics, and computer science at the University of Zurich and the ETH Zurich. He holds a PhD in Mathematics from University of Fribourg, Switzerland. After his PhD he started to work for a large Swiss insurance company in the area of asset and liability management. He continued his professional career in the consulting industry. At KPMG and Arthur Andersen he consulted international clients and implemented quantitative risk management solutions for financial institutions and insurance companies. In 2002 he joined Zurich Cantonal Bank. He was assigned to develop and implement credit portfolio risk and economic capital methodologies. He built up a competence center for high performance and cluster computing. Currently, Daniel Egloff is heading the Financial Computing unit in the ZKB Financial Engineering division. He and his team is engineering and operating high performance cluster applications for computationally intensive problems in financial risk management.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
The finance sector is one of the driving forces for the use of distributed or Grid computing for business purposes. The speakers will review the state-of-the-art of high performance computing in the financial sector, and provide insight into how different types of Grid computing – from local clusters to global networks - are being applied to financial applications. They will also describe the use of software and techniques from physics, such as Monte Carlo simulations, in the financial world. There will be four talks of 20min each. The talk abstracts and speaker bios are listed below. This will be followedmore » by a Q&A; panel session with the speakers. From 19:00 onwards there will be a networking cocktail for audience and speakers. This is an EGEE / CERN openlab event organized in collaboration with the regional business network rezonance.ch. A webcast of the event will be made available for subsequent viewing, along with powerpoint material presented by the speakers. Attendance is free and open to all. Registration is mandatory via www.rezonance.ch, including for CERN staff. 1. Overview of High Performance Computing in the Financial Industry Michael Yoo, Managing Director, Head of the Technical Council, UBS Presentation will describe the key business challenges driving the need for HPC solutions, describe the means in which those challenges are being addressed within UBS (such as GRID) as well as the limitations of some of these solutions, and assess some of the newer HPC technologies which may also play a role in the Financial Industry in the future. Speaker Bio: Michael originally joined the former Swiss Bank Corporation in 1994 in New York as a developer on a large data warehouse project. In 1996 he left SBC and took a role with Fidelity Investments in Boston. Unable to stay away for long, he returned to SBC in 1997 while working for Perot Systems in Singapore. 
Finally, in 1998 he formally returned to UBS in Stamford following the merger with SBC and has remained with UBS for the past 9 years. During his tenure at UBS, he has had a number of leadership roles within IT in development, support and architecture. In 2006 Michael relocated to Switzerland to take up his current role as head of the UBS IB Technical Council, responsible for the overall technology strategy and vision of the Investment Bank. One of Michael's key responsibilities is to manage the UBS High Performance Computing Research Lab and he has been involved in a number of initiatives in the HPC space. 2. Grid in the Commercial WorldFred Gedling, Chief Technology Officer EMEA and Senior Vice President Global Services, DataSynapse Grid computing gets mentions in the press for community programs starting last decade with Seti@Home. Government, national and supranational initiatives in grid receive some press. One of the IT-industries' best-kept secrets is the use of grid computing by commercial organizations with spectacular results. Grid Computing and its evolution into Application Virtualization is discussed and how this is key to the next generation data center. Speaker Bio: Fred Gedling holds the joint roles of Chief Technology Officer for EMEA and Senior Vice President of Global Services at DataSynapse, a global provider of application virtualisation software. Based in London and working closely with organisations seeking to optimise their IT infrastructures, Fred offers unique insights into the technology of virtualisation as well as the methodology of establishing ROI and rapid deployment to the immediate advantage of the business. Fred has more than fifteen years experience of enterprise middleware and high-performance infrastructures. Prior to DataSynapse he worked in high performance CRM middleware and was the CTO EMEA for New Era of Networks (NEON) during the rapid growth of Enterprise Application Integration. 
DOE Office of Scientific and Technical Information (OSTI.GOV)
The finance sector is one of the driving forces behind the use of distributed or Grid computing for business purposes. The speakers will review the state of the art of high-performance computing in the financial sector and provide insight into how different types of Grid computing, from local clusters to global networks, are being applied to financial applications. They will also describe the use of software and techniques from physics, such as Monte Carlo simulation, in the financial world. There will be four talks of 20 minutes each; the talk abstracts and speaker bios are listed below. The talks will be followed by a Q&A panel session with the speakers, and from 19:00 onwards there will be a networking cocktail for audience and speakers. This is an EGEE / CERN openlab event organized in collaboration with the regional business network rezonance.ch. A webcast of the event will be made available for subsequent viewing, along with the PowerPoint material presented by the speakers. Attendance is free and open to all. Registration is mandatory via www.rezonance.ch, including for CERN staff.

1. Overview of High Performance Computing in the Financial Industry. Michael Yoo, Managing Director, Head of the Technical Council, UBS.
The presentation will describe the key business challenges driving the need for HPC solutions, describe the means by which those challenges are being addressed within UBS (such as Grid) as well as the limitations of some of these solutions, and assess some of the newer HPC technologies that may also play a role in the financial industry in the future.
Speaker Bio: Michael originally joined the former Swiss Bank Corporation (SBC) in 1994 in New York as a developer on a large data-warehouse project. In 1996 he left SBC and took a role with Fidelity Investments in Boston. Unable to stay away for long, he returned to SBC in 1997 while working for Perot Systems in Singapore. Finally, in 1998 he formally returned to UBS in Stamford following the merger with SBC and has remained with UBS for the past nine years. During his tenure at UBS he has held a number of leadership roles within IT in development, support and architecture. In 2006 Michael relocated to Switzerland to take up his current role as head of the UBS IB Technical Council, responsible for the overall technology strategy and vision of the Investment Bank. One of Michael's key responsibilities is managing the UBS High Performance Computing Research Lab, and he has been involved in a number of initiatives in the HPC space.

2. Grid in the Commercial World. Fred Gedling, Chief Technology Officer EMEA and Senior Vice President Global Services, DataSynapse.
Grid computing gets mentions in the press for community programs, starting last decade with "SETI@home", and government, national and supranational Grid initiatives also receive some coverage. One of the IT industry's best-kept secrets, however, is the use of Grid computing by commercial organizations, with spectacular results. The talk discusses Grid computing and its evolution into application virtualization, and why this is key to the next-generation data center.
Speaker Bio: Fred Gedling holds the joint roles of Chief Technology Officer for EMEA and Senior Vice President of Global Services at DataSynapse, a global provider of application virtualisation software. Based in London and working closely with organisations seeking to optimise their IT infrastructures, Fred offers unique insights into the technology of virtualisation as well as the methodology of establishing ROI and rapid deployment to the immediate advantage of the business. Fred has more than fifteen years' experience of enterprise middleware and high-performance infrastructures. Prior to DataSynapse he worked in high-performance CRM middleware and was CTO EMEA for New Era of Networks (NEON) during the rapid growth of Enterprise Application Integration. His 25-year career in technology also includes management positions at Goldman Sachs and Stratus Computer. Fred holds a First Class BSc (Hons) degree in Physics with Astrophysics from the University of Leeds and had the privilege of being a summer student at CERN.

3. Opportunities for gLite in Finance and Related Industries. Adam Vile, Head of Grid, HPC and Technical Computing, Excelian Ltd.
gLite, the Grid software developed by the EGEE project, has been exceedingly successful as an enabling infrastructure and has been a massive success in bringing together scientific and technical communities to provide the compute power to address previously incomputable problems. Not so in the finance industry: in its current form, gLite would be a business disabler, and other middleware tools solve the finance community's compute problems much better. Things are moving on, however. There are moves afoot in the open-source community to evolve the technology to address other, more sophisticated needs such as utility and interactive computing. In this talk, I will describe how Excelian is providing Grid consultancy services for the finance community and how, through its relationship with the EGEE project, Excelian is helping to identify and exploit opportunities as the research and business worlds converge. Because of the strong third-party presence in the finance industry, such opportunities are few and far between, but they are there, especially as we expand sideways into related verticals such as the smaller hedge funds and energy companies. This talk will give an overview of the barriers to adoption of gLite in the finance industry and highlight some of the opportunities offered in this and related industries as the ideas around Grid mature.
Speaker Bio: Dr Adam Vile is a senior consultant and head of the Grid and HPC practice at Excelian, a consultancy that focuses on professional services for the financial markets. He has spent many years in investment banking as a developer, project manager and architect in both front and back office. Before joining Excelian he was senior Grid and HPC architect at Barclays Capital. Prior to investment banking, Adam spent a number of years lecturing in IT and mathematics at a UK university and maintains links with academia through lectures, research, and the validation and steering of postgraduate courses. He is a chartered mathematician and was conference chair of the Institute of Mathematics and its Applications' first conference in computational finance.

4. From Monte Carlo to Wall Street. Daniel Egloff, Head of Financial Engineering Computing Unit, Zürich Cantonal Bank.
High-performance computing techniques provide new means to solve computationally hard problems in the financial services industry. First, I consider Monte Carlo simulation and illustrate how it can be used to implement a sophisticated credit risk management and economic capital framework. From an HPC perspective, basic Monte Carlo simulation is embarrassingly parallel and can be implemented efficiently on distributed-memory clusters. Additional difficulties arise for adaptive variance-reduction schemes, if the information content in a sample is very small, and if the amount of simulated data becomes so huge that incremental processing algorithms are indispensable. We discuss the business value of advanced credit risk quantification, which is particularly compelling these days. While Monte Carlo simulation is a very versatile tool, it is not always the preferred solution for the pricing of complex products like multi-asset options, structured products or credit derivatives. As a second application, I show how operator methods can be used to develop a pricing framework. The scalability of operator methods relies heavily on optimized dense matrix-matrix multiplications and requires specialized BLAS level-3 implementations, such as those provided by FPGA or GPU boards.
Speaker Bio: Daniel Egloff studied mathematics, theoretical physics and computer science at the University of Zurich and ETH Zurich. He holds a PhD in Mathematics from the University of Fribourg, Switzerland. After his PhD he started to work for a large Swiss insurance company in the area of asset and liability management, and continued his professional career in the consulting industry. At KPMG and Arthur Andersen he advised international clients and implemented quantitative risk-management solutions for financial institutions and insurance companies. In 2002 he joined Zurich Cantonal Bank, where he was assigned to develop and implement credit portfolio risk and economic capital methodologies, and built up a competence center for high-performance and cluster computing. Currently, Daniel Egloff heads the Financial Computing unit in the ZKB Financial Engineering division. He and his team engineer and operate high-performance cluster applications for computationally intensive problems in financial risk management.
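The "embarrassingly parallel" and "incremental processing" points in the Monte Carlo abstract can be sketched in a few lines: each worker draws its paths from an independent random stream and keeps only a running mean, so the full sample set never has to be stored. The loss model below (a fixed default probability and exposure) is a hypothetical stand-in for illustration, not the credit framework described in the talk.

```python
import random

def simulate_losses(args):
    """One worker's job: simulate n_paths losses, return a streaming mean.

    The default probability and exposure are hypothetical placeholders.
    """
    seed, n_paths, default_prob, exposure = args
    rng = random.Random(seed)          # independent stream per worker
    mean = 0.0
    for i in range(1, n_paths + 1):
        loss = exposure if rng.random() < default_prob else 0.0
        mean += (loss - mean) / i      # incremental update: O(1) memory
    return mean

# Four independent jobs; because they share nothing, replacing the built-in
# map with multiprocessing.Pool.map (or an MPI scatter across a cluster)
# parallelizes this without further changes.
jobs = [(seed, 50_000, 0.02, 100.0) for seed in range(4)]
partial_means = list(map(simulate_losses, jobs))
expected_loss = sum(partial_means) / len(partial_means)
print(f"estimated expected loss per unit: {expected_loss:.2f}")
```

The same partitioning carries over to a distributed-memory cluster: only the per-worker means (one float each) travel over the network, which is why basic Monte Carlo scales so well.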
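The remark that operator methods lean on dense matrix-matrix products can be illustrated with a toy backward-induction step: if the terminal payoffs of many products on the same state grid are stacked as columns of one matrix, each time step becomes a single dense multiply, which is exactly the BLAS level-3 (GEMM) kernel the abstract says FPGA or GPU boards accelerate. The 3-state propagator and payoffs below are hypothetical, chosen only to keep the sketch self-contained.

```python
def matmul(a, b):
    """Naive dense matrix-matrix product (a: m x k, b: k x n).

    In production this is the call handed to an optimized BLAS GEMM.
    """
    return [[sum(a[i][p] * b[p][j] for p in range(len(b)))
             for j in range(len(b[0]))]
            for i in range(len(a))]

# Hypothetical one-step propagator on a 3-point state grid
# (rows sum to 1, i.e. a simple Markov transition operator).
A = [[0.9, 0.1, 0.0],
     [0.1, 0.8, 0.1],
     [0.0, 0.1, 0.9]]

# Terminal payoffs of two products, one per column: batching them turns
# every backward step into one matrix-matrix product instead of many
# matrix-vector products.
V = [[0.0, 1.0],
     [1.0, 1.0],
     [2.0, 1.0]]

for _ in range(10):        # roll back 10 time steps in one GEMM each
    V = matmul(A, V)
```

Batching the products this way is the standard trick for reaching level-3 BLAS intensity: the operator matrix is reused across all columns, so the arithmetic-to-memory-traffic ratio rises with the number of products priced together.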
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
The finance sector is one of the driving forces for the use of distributed or Grid computing for business purposes. The speakers will review the state-of-the-art of high performance computing in the financial sector, and provide insight into how different types of Grid computing – from local clusters to global networks - are being applied to financial applications. They will also describe the use of software and techniques from physics, such as Monte Carlo simulations, in the financial world. There will be four talks of 20min each. The talk abstracts and speaker bios are listed below. This will be followedmore » by a Q&A; panel session with the speakers. From 19:00 onwards there will be a networking cocktail for audience and speakers. This is an EGEE / CERN openlab event organized in collaboration with the regional business network rezonance.ch. A webcast of the event will be made available for subsequent viewing, along with powerpoint material presented by the speakers. Attendance is free and open to all. Registration is mandatory via www.rezonance.ch, including for CERN staff. 1. Overview of High Performance Computing in the Financial Industry Michael Yoo, Managing Director, Head of the Technical Council, UBS Presentation will describe the key business challenges driving the need for HPC solutions, describe the means in which those challenges are being addressed within UBS (such as GRID) as well as the limitations of some of these solutions, and assess some of the newer HPC technologies which may also play a role in the Financial Industry in the future. Speaker Bio: Michael originally joined the former Swiss Bank Corporation in 1994 in New York as a developer on a large data warehouse project. In 1996 he left SBC and took a role with Fidelity Investments in Boston. Unable to stay away for long, he returned to SBC in 1997 while working for Perot Systems in Singapore. 
Finally, in 1998 he formally returned to UBS in Stamford following the merger with SBC and has remained with UBS for the past 9 years. During his tenure at UBS, he has had a number of leadership roles within IT in development, support and architecture. In 2006 Michael relocated to Switzerland to take up his current role as head of the UBS IB Technical Council, responsible for the overall technology strategy and vision of the Investment Bank. One of Michael's key responsibilities is to manage the UBS High Performance Computing Research Lab and he has been involved in a number of initiatives in the HPC space. 2. Grid in the Commercial WorldFred Gedling, Chief Technology Officer EMEA and Senior Vice President Global Services, DataSynapse Grid computing gets mentions in the press for community programs starting last decade with Government, national and supranational initiatives in grid receive some press. One of the IT-industries' best-kept secrets is the use of grid computing by commercial organizations with spectacular results. Grid Computing and its evolution into Application Virtualization is discussed and how this is key to the next generation data center. Speaker Bio: Fred Gedling holds the joint roles of Chief Technology Officer for EMEA and Senior Vice President of Global Services at DataSynapse, a global provider of application virtualisation software. Based in London and working closely with organisations seeking to optimise their IT infrastructures, Fred offers unique insights into the technology of virtualisation as well as the methodology of establishing ROI and rapid deployment to the immediate advantage of the business. Fred has more than fifteen years experience of enterprise middleware and high-performance infrastructures. Prior to DataSynapse he worked in high performance CRM middleware and was the CTO EMEA for New Era of Networks (NEON) during the rapid growth of Enterprise Application Integration. 
His 25-year career in technology also includes management positions at Goldman Sachs and Stratus Computer. Fred holds a First Class Bsc (Hons) degree in Physics with Astrophysics from the University of Leeds and had the privilege of being a summer student at CERN.3. Opportunities for gLite in finance and related industriesAdam Vile, Head of Grid, HPC and Technical Computing, Excelian Ltd.gLite, the Grid software developed by the EGEE project, has been exceedingly successful as an enabling infrastructure, and has been a massive success in bringing together scientific and technical communities to provide the compute power to address previously incomputable problems. Not so in the finance industry. In its current form gLite would be a business disabler. There are other middleware tools that solve the finance communities compute problems much better. Things are moving on, however. There are moves afoot in the open source community to evolve the technology to address other, more sophisticated needs such as utility and interactive computing. In this talk, I will describe how Excelian is providing Grid consultancy services for the finance community and how, through its relationship to the EGEE project, Excelian is helping to identify and exploit opportunities as the research and business worlds converge. Because of the strong third party presence in the finance industry, such opportunities are few and far between, but they are there, especially as we expand sideways into related verticals such as the smaller hedge funds and energy companies. This talk will give an overview of the barriers to adoption of gLite in the finance industry and highlight some of the opportunities offered in this and related industries as the ideas around Grid mature. Speaker Bio: Dr Adam Vile is a senior consultant and head of the Grid and HPC practice at Excelian, a consultancy that focuses on financial markets professional services. 
He has spent many years in investment banking, as a developer, project manager and architect in both front and back office. Before joining Excelian he was senior Grid and HPC architect at Barclays Capital. Prior to joining investment banking, Adam spent a number of years lecturing in IT and mathematics at a UK University and maintains links with academia through lectures, research and through validation and steering of postgraduate courses. He is a chartered mathematician and was the conference chair of the Institute of Mathematics and its Applications first conference in computational Finance. 4. From Monte Carlo to Wall Street Daniel Egloff, Head of Financial Engineering Computing Unit, Zürich Cantonal Bank High performance computing techniques provide new means to solve computationally hard problems in the financial service industry. First I consider Monte Carlo simulation and illustrate how it can be used to implement a sophisticated credit risk management and economic capital framework. From a HPC perspective, basic Monte Carlo simulation is embarrassingly parallel and can be implemented efficiently on distributed memory clusters. Additional difficulties arise for adaptive variance reduction schemes, if the information content in a sample is very small, and if the amount of simulated date becomes huge such that incremental processing algorithms are indispensable. We discuss the business value of an advanced credit risk quantification which is particularly compelling in these days. While Monte Carlo simulation is a very versatile tool it is not always the preferred solution for the pricing of complex products like multi asset options, structured products, or credit derivatives. As a second application I show how operator methods can be used to develop a pricing framework. 
The scalability of operator methods relies heavily on optimized dense matrix-matrix multiplications and requires specialized BLAS level-3 implementations provided by specialized FPGA or GPU boards. Speaker Bio: Daniel Egloff studied mathematics, theoretical physics, and computer science at the University of Zurich and the ETH Zurich. He holds a PhD in Mathematics from University of Fribourg, Switzerland. After his PhD he started to work for a large Swiss insurance company in the area of asset and liability management. He continued his professional career in the consulting industry. At KPMG and Arthur Andersen he consulted international clients and implemented quantitative risk management solutions for financial institutions and insurance companies. In 2002 he joined Zurich Cantonal Bank. He was assigned to develop and implement credit portfolio risk and economic capital methodologies. He built up a competence center for high performance and cluster computing. Currently, Daniel Egloff is heading the Financial Computing unit in the ZKB Financial Engineering division. He and his team is engineering and operating high performance cluster applications for computationally intensive problems in financial risk management.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
The finance sector is one of the driving forces for the use of distributed or Grid computing for business purposes. The speakers will review the state-of-the-art of high performance computing in the financial sector, and provide insight into how different types of Grid computing – from local clusters to global networks - are being applied to financial applications. They will also describe the use of software and techniques from physics, such as Monte Carlo simulations, in the financial world. There will be four talks of 20min each. The talk abstracts and speaker bios are listed below. This will be followedmore » by a Q&A; panel session with the speakers. From 19:00 onwards there will be a networking cocktail for audience and speakers. This is an EGEE / CERN openlab event organized in collaboration with the regional business network rezonance.ch. A webcast of the event will be made available for subsequent viewing, along with powerpoint material presented by the speakers. Attendance is free and open to all. Registration is mandatory via www.rezonance.ch, including for CERN staff. 1. Overview of High Performance Computing in the Financial Industry Michael Yoo, Managing Director, Head of the Technical Council, UBS Presentation will describe the key business challenges driving the need for HPC solutions, describe the means in which those challenges are being addressed within UBS (such as GRID) as well as the limitations of some of these solutions, and assess some of the newer HPC technologies which may also play a role in the Financial Industry in the future. Speaker Bio: Michael originally joined the former Swiss Bank Corporation in 1994 in New York as a developer on a large data warehouse project. In 1996 he left SBC and took a role with Fidelity Investments in Boston. Unable to stay away for long, he returned to SBC in 1997 while working for Perot Systems in Singapore. 
Finally, in 1998 he formally returned to UBS in Stamford following the merger with SBC and has remained with UBS for the past 9 years. During his tenure at UBS, he has had a number of leadership roles within IT in development, support and architecture. In 2006 Michael relocated to Switzerland to take up his current role as head of the UBS IB Technical Council, responsible for the overall technology strategy and vision of the Investment Bank. One of Michael's key responsibilities is to manage the UBS High Performance Computing Research Lab and he has been involved in a number of initiatives in the HPC space. 2. Grid in the Commercial WorldFred Gedling, Chief Technology Officer EMEA and Senior Vice President Global Services, DataSynapse Grid computing gets mentions in the press for community programs starting last decade with Government, national and supranational initiatives in grid receive some press. One of the IT-industries' best-kept secrets is the use of grid computing by commercial organizations with spectacular results. Grid Computing and its evolution into Application Virtualization is discussed and how this is key to the next generation data center. Speaker Bio: Fred Gedling holds the joint roles of Chief Technology Officer for EMEA and Senior Vice President of Global Services at DataSynapse, a global provider of application virtualisation software. Based in London and working closely with organisations seeking to optimise their IT infrastructures, Fred offers unique insights into the technology of virtualisation as well as the methodology of establishing ROI and rapid deployment to the immediate advantage of the business. Fred has more than fifteen years experience of enterprise middleware and high-performance infrastructures. Prior to DataSynapse he worked in high performance CRM middleware and was the CTO EMEA for New Era of Networks (NEON) during the rapid growth of Enterprise Application Integration. 
His 25-year career in technology also includes management positions at Goldman Sachs and Stratus Computer. Fred holds a First Class Bsc (Hons) degree in Physics with Astrophysics from the University of Leeds and had the privilege of being a summer student at CERN.3. Opportunities for gLite in finance and related industriesAdam Vile, Head of Grid, HPC and Technical Computing, Excelian Ltd.gLite, the Grid software developed by the EGEE project, has been exceedingly successful as an enabling infrastructure, and has been a massive success in bringing together scientific and technical communities to provide the compute power to address previously incomputable problems. Not so in the finance industry. In its current form gLite would be a business disabler. There are other middleware tools that solve the finance communities compute problems much better. Things are moving on, however. There are moves afoot in the open source community to evolve the technology to address other, more sophisticated needs such as utility and interactive computing. In this talk, I will describe how Excelian is providing Grid consultancy services for the finance community and how, through its relationship to the EGEE project, Excelian is helping to identify and exploit opportunities as the research and business worlds converge. Because of the strong third party presence in the finance industry, such opportunities are few and far between, but they are there, especially as we expand sideways into related verticals such as the smaller hedge funds and energy companies. This talk will give an overview of the barriers to adoption of gLite in the finance industry and highlight some of the opportunities offered in this and related industries as the ideas around Grid mature. Speaker Bio: Dr Adam Vile is a senior consultant and head of the Grid and HPC practice at Excelian, a consultancy that focuses on financial markets professional services. 
He has spent many years in investment banking, as a developer, project manager and architect in both front and back office. Before joining Excelian he was senior Grid and HPC architect at Barclays Capital. Prior to joining investment banking, Adam spent a number of years lecturing in IT and mathematics at a UK University and maintains links with academia through lectures, research and through validation and steering of postgraduate courses. He is a chartered mathematician and was the conference chair of the Institute of Mathematics and its Applications first conference in computational Finance.4. From Monte Carlo to Wall Street Daniel Egloff, Head of Financial Engineering Computing Unit, Zürich Cantonal Bank High performance computing techniques provide new means to solve computationally hard problems in the financial service industry. First I consider Monte Carlo simulation and illustrate how it can be used to implement a sophisticated credit risk management and economic capital framework. From a HPC perspective, basic Monte Carlo simulation is embarrassingly parallel and can be implemented efficiently on distributed memory clusters. Additional difficulties arise for adaptive variance reduction schemes, if the information content in a sample is very small, and if the amount of simulated date becomes huge such that incremental processing algorithms are indispensable. We discuss the business value of an advanced credit risk quantification which is particularly compelling in these days. While Monte Carlo simulation is a very versatile tool it is not always the preferred solution for the pricing of complex products like multi asset options, structured products, or credit derivatives. As a second application I show how operator methods can be used to develop a pricing framework. 
The scalability of operator methods relies heavily on optimized dense matrix-matrix multiplications and requires specialized BLAS level-3 implementations provided by specialized FPGA or GPU boards. Speaker Bio: Daniel Egloff studied mathematics, theoretical physics, and computer science at the University of Zurich and the ETH Zurich. He holds a PhD in Mathematics from University of Fribourg, Switzerland. After his PhD he started to work for a large Swiss insurance company in the area of asset and liability management. He continued his professional career in the consulting industry. At KPMG and Arthur Andersen he consulted international clients and implemented quantitative risk management solutions for financial institutions and insurance companies. In 2002 he joined Zurich Cantonal Bank. He was assigned to develop and implement credit portfolio risk and economic capital methodologies. He built up a competence center for high performance and cluster computing. Currently, Daniel Egloff is heading the Financial Computing unit in the ZKB Financial Engineering division. He and his team is engineering and operating high performance cluster applications for computationally intensive problems in financial risk management.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
The finance sector is one of the driving forces for the use of distributed or Grid computing for business purposes. The speakers will review the state-of-the-art of high performance computing in the financial sector, and provide insight into how different types of Grid computing – from local clusters to global networks - are being applied to financial applications. They will also describe the use of software and techniques from physics, such as Monte Carlo simulations, in the financial world. There will be four talks of 20min each. The talk abstracts and speaker bios are listed below. This will be followedmore » by a Q&A; panel session with the speakers. From 19:00 onwards there will be a networking cocktail for audience and speakers. This is an EGEE / CERN openlab event organized in collaboration with the regional business network rezonance.ch. A webcast of the event will be made available for subsequent viewing, along with powerpoint material presented by the speakers. Attendance is free and open to all. Registration is mandatory via www.rezonance.ch, including for CERN staff. 1. Overview of High Performance Computing in the Financial Industry Michael Yoo, Managing Director, Head of the Technical Council, UBS Presentation will describe the key business challenges driving the need for HPC solutions, describe the means in which those challenges are being addressed within UBS (such as GRID) as well as the limitations of some of these solutions, and assess some of the newer HPC technologies which may also play a role in the Financial Industry in the future. Speaker Bio: Michael originally joined the former Swiss Bank Corporation in 1994 in New York as a developer on a large data warehouse project. In 1996 he left SBC and took a role with Fidelity Investments in Boston. Unable to stay away for long, he returned to SBC in 1997 while working for Perot Systems in Singapore. 
Finally, in 1998 he formally returned to UBS in Stamford following the merger with SBC and has remained with UBS for the past 9 years. During his tenure at UBS, he has had a number of leadership roles within IT in development, support and architecture. In 2006 Michael relocated to Switzerland to take up his current role as head of the UBS IB Technical Council, responsible for the overall technology strategy and vision of the Investment Bank. One of Michael's key responsibilities is to manage the UBS High Performance Computing Research Lab and he has been involved in a number of initiatives in the HPC space. 2. Grid in the Commercial WorldFred Gedling, Chief Technology Officer EMEA and Senior Vice President Global Services, DataSynapse Grid computing gets mentions in the press for community programs starting last decade with Government, national and supranational initiatives in grid receive some press. One of the IT-industries' best-kept secrets is the use of grid computing by commercial organizations with spectacular results. Grid Computing and its evolution into Application Virtualization is discussed and how this is key to the next generation data center. Speaker Bio: Fred Gedling holds the joint roles of Chief Technology Officer for EMEA and Senior Vice President of Global Services at DataSynapse, a global provider of application virtualisation software. Based in London and working closely with organisations seeking to optimise their IT infrastructures, Fred offers unique insights into the technology of virtualisation as well as the methodology of establishing ROI and rapid deployment to the immediate advantage of the business. Fred has more than fifteen years experience of enterprise middleware and high-performance infrastructures. Prior to DataSynapse he worked in high performance CRM middleware and was the CTO EMEA for New Era of Networks (NEON) during the rapid growth of Enterprise Application Integration. 
His 25-year career in technology also includes management positions at Goldman Sachs and Stratus Computer. Fred holds a First Class Bsc (Hons) degree in Physics with Astrophysics from the University of Leeds and had the privilege of being a summer student at CERN.3. Opportunities for gLite in finance and related industries Adam Vile, Head of Grid, HPC and Technical Computing, Excelian Ltd.gLite, the Grid software developed by the EGEE project, has been exceedingly successful as an enabling infrastructure, and has been a massive success in bringing together scientific and technical communities to provide the compute power to address previously incomputable problems. Not so in the finance industry. In its current form gLite would be a business disabler. There are other middleware tools that solve the finance communities compute problems much better. Things are moving on, however. There are moves afoot in the open source community to evolve the technology to address other, more sophisticated needs such as utility and interactive computing. In this talk, I will describe how Excelian is providing Grid consultancy services for the finance community and how, through its relationship to the EGEE project, Excelian is helping to identify and exploit opportunities as the research and business worlds converge. Because of the strong third party presence in the finance industry, such opportunities are few and far between, but they are there, especially as we expand sideways into related verticals such as the smaller hedge funds and energy companies. This talk will give an overview of the barriers to adoption of gLite in the finance industry and highlight some of the opportunities offered in this and related industries as the ideas around Grid mature. Speaker Bio: Dr Adam Vile is a senior consultant and head of the Grid and HPC practice at Excelian, a consultancy that focuses on financial markets professional services. 
He has spent many years in investment banking, as a developer, project manager and architect in both front and back office. Before joining Excelian he was senior Grid and HPC architect at Barclays Capital. Prior to joining investment banking, Adam spent a number of years lecturing in IT and mathematics at a UK university, and he maintains links with academia through lectures, research, and the validation and steering of postgraduate courses. He is a chartered mathematician and was the conference chair of the Institute of Mathematics and its Applications' first conference in computational finance. 4. From Monte Carlo to Wall Street. Daniel Egloff, Head of Financial Engineering Computing Unit, Zürich Cantonal Bank. High performance computing techniques provide new means to solve computationally hard problems in the financial services industry. First I consider Monte Carlo simulation and illustrate how it can be used to implement a sophisticated credit risk management and economic capital framework. From an HPC perspective, basic Monte Carlo simulation is embarrassingly parallel and can be implemented efficiently on distributed-memory clusters. Additional difficulties arise for adaptive variance reduction schemes, if the information content in a sample is very small, and if the amount of simulated data becomes so huge that incremental processing algorithms are indispensable. We discuss the business value of advanced credit risk quantification, which is particularly compelling these days. While Monte Carlo simulation is a very versatile tool, it is not always the preferred solution for the pricing of complex products like multi-asset options, structured products, or credit derivatives. As a second application I show how operator methods can be used to develop a pricing framework.
The scalability of operator methods relies heavily on optimized dense matrix-matrix multiplications and requires specialized BLAS level-3 implementations, such as those provided by FPGA or GPU boards. Speaker Bio: Daniel Egloff studied mathematics, theoretical physics, and computer science at the University of Zurich and ETH Zurich. He holds a PhD in Mathematics from the University of Fribourg, Switzerland. After his PhD he started to work for a large Swiss insurance company in the area of asset and liability management. He continued his professional career in the consulting industry: at KPMG and Arthur Andersen he consulted for international clients and implemented quantitative risk management solutions for financial institutions and insurance companies. In 2002 he joined Zurich Cantonal Bank, where he was assigned to develop and implement credit portfolio risk and economic capital methodologies, and he built up a competence center for high performance and cluster computing. Currently, Daniel Egloff heads the Financial Computing unit in the ZKB Financial Engineering division. He and his team engineer and operate high performance cluster applications for computationally intensive problems in financial risk management.
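Why an optimized BLAS level-3 implementation matters can be seen even on a CPU. The sketch below (matrix size and data are arbitrary) contrasts a naive Python accumulation against NumPy's `@` operator, which dispatches to whatever optimized BLAS the installation links against (OpenBLAS, MKL, ...).

```python
import time
import numpy as np

# Dense matrix-matrix multiplication is the BLAS level-3 kernel (GEMM) that,
# per the talk, operator-method pricing ultimately leans on.
n = 250
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

def naive_gemm(A, B):
    """Row-at-a-time accumulation: the same O(n^3) flops as GEMM, but with
    none of the blocking or cache tuning of an optimized BLAS."""
    C = np.zeros((A.shape[0], B.shape[1]))
    for i in range(A.shape[0]):
        for k in range(A.shape[1]):
            C[i, :] += A[i, k] * B[k, :]
    return C

t0 = time.perf_counter(); C_blas = A @ B; t_blas = time.perf_counter() - t0
t0 = time.perf_counter(); C_naive = naive_gemm(A, B); t_naive = time.perf_counter() - t0
print(f"BLAS GEMM: {t_blas:.4f}s   naive: {t_naive:.4f}s   "
      f"max abs diff: {np.abs(C_blas - C_naive).max():.2e}")
```

The FPGA/GPU boards mentioned in the talk push the same kernel further still; the point here is only that the kernel, not the algorithm around it, dominates the cost.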
Let Documents Talk to Each Other: A Computer Model for Connection of Short Documents.
ERIC Educational Resources Information Center
Chen, Z.
1993-01-01
Discusses the integration of scientific texts through the connection of documents and describes a computer model that can connect short documents. Information retrieval and artificial intelligence are discussed; a prototype system of the model is explained; and the model is compared to other computer models. (17 references) (LRW)
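Chen's model itself is not detailed in the abstract, but the information-retrieval building block behind "connecting" short documents can be sketched as pairwise cosine similarity over term-frequency vectors. The documents, tokenizer, and threshold below are illustrative assumptions, not the prototype system described in the article.

```python
import math
import re
from collections import Counter

def term_vector(text):
    """Bag-of-words term frequencies -- a generic IR representation."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(u, v):
    dot = sum(u[t] * v[t] for t in u if t in v)
    norm = math.sqrt(sum(c * c for c in u.values())) * \
           math.sqrt(sum(c * c for c in v.values()))
    return dot / norm if norm else 0.0

def connect(docs, threshold=0.15):
    """Link every pair of documents whose similarity clears the (illustrative) threshold."""
    vecs = [term_vector(d) for d in docs]
    return [(i, j, round(cosine(vecs[i], vecs[j]), 2))
            for i in range(len(docs))
            for j in range(i + 1, len(docs))
            if cosine(vecs[i], vecs[j]) >= threshold]

docs = [
    "information retrieval connects short scientific documents",
    "artificial intelligence models of scientific text",
    "a recipe for sourdough bread",
]
links = connect(docs)
print(links)   # the two scientific documents link; the recipe stays isolated
```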
Computer-Tutors and a Freshman Writer: A Protocol Study.
ERIC Educational Resources Information Center
Strickland, James
Although there are many retrospective accounts from teachers and professional writers concerning the effect of computers on their writing, there are few real-time accounts of students struggling to simultaneously develop as writers and cope with computers. To fill this void in "testimonial data," a study examining talking-aloud protocols from a…
Computational Physics' Greatest Hits
NASA Astrophysics Data System (ADS)
Bug, Amy
2011-03-01
The digital computer has worked its way so effectively into our profession that now, roughly 65 years after its invention, it is virtually impossible to find a field of experimental or theoretical physics unaided by computational innovation. It is tough to think of another device about which one can make that claim. In the session "What is computational physics?" speakers will distinguish computation within the field of computational physics from this ubiquitous importance across all subfields of physics. This talk will recap the invited session "Great Advances... Past, Present and Future", in which five dramatic areas of discovery (five of our "greatest hits") are chronicled: the physics of many-boson systems via Path Integral Monte Carlo, the thermodynamic behavior of a huge number of diverse systems via Monte Carlo methods, the discovery of new pharmaceutical agents via molecular dynamics, predictive simulations of global climate change via detailed, cross-disciplinary earth system models, and an understanding of the formation of the first structures in our universe via galaxy formation simulations. The talk will also identify "greatest hits" in our field from the teaching and research perspectives of other members of DCOMP, including its Executive Committee.
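As a flavor of the Monte Carlo methods named among the "greatest hits", here is a minimal Metropolis sampler for a toy one-dimensional harmonic potential; it illustrates the algorithm only and is unrelated to any of the production codes mentioned. For E(x) = x^2/2 at temperature T, equipartition gives <x^2> = T, so the estimate is easy to check against the exact answer.

```python
import math
import random

def metropolis_harmonic(T=1.0, steps=200_000, step_size=1.0, seed=1):
    """Metropolis sampling of the Boltzmann weight exp(-E/T) for E(x) = x^2/2.
    Returns the running estimate of <x^2> and the acceptance rate."""
    rng = random.Random(seed)
    x, accepted, sum_x2 = 0.0, 0, 0.0
    for _ in range(steps):
        x_new = x + rng.uniform(-step_size, step_size)
        d_e = 0.5 * (x_new * x_new - x * x)
        # Accept with probability min(1, exp(-dE/T)).
        if d_e <= 0.0 or rng.random() < math.exp(-d_e / T):
            x, accepted = x_new, accepted + 1
        sum_x2 += x * x
    return sum_x2 / steps, accepted / steps

mean_x2, acc_rate = metropolis_harmonic()
print(f"<x^2> = {mean_x2:.3f} (exact: 1.000), acceptance rate = {acc_rate:.2f}")
```

The same accept/reject kernel, scaled up and specialized, underlies both the Path Integral Monte Carlo and the thermodynamic applications the talk chronicles.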
Labacher, Lukas; Mitchell, Claudia
2013-01-01
Young adults often lack access to confidential, long-lasting, and nonjudgmental interactions with sexual health professionals at brick-and-mortar clinics. To ensure that patients return for their STI test results, post-result counseling, and STI-related information, computer-mediated health intervention programming allows them to receive sexual health information through onsite computers, the Internet, and mobile phone calls and text messages. To determine whether young adults (age: M = 21 years) prefer to communicate with health professionals about the status of their sexual health through computer-mediated communication devices, 303 second-year university students (183 from an urban North American university and 120 from a periurban university in South Africa) completed a paper-based survey indicating how they prefer to communicate with doctors and nurses: talking face to face, mobile phone call, text message, Internet chat programs, Facebook, Twitter, or e-mail. Nearly all students, and female students in South Africa in particular, prefer to receive their STI test results, post-results counseling, and STI-related information by talking face to face with doctors and nurses rather than communicating through computers or mobile phones. Results are clarified in relation to gender, availability of various technologies, and prevalence of HIV in Canada and in South Africa.
NASA Astrophysics Data System (ADS)
Mukai, K.; ASTRO-E Guest Observer Facility Team
1998-12-01
The XRS instrument on board ASTRO-E is expected to last about two years before it runs out of cryogen. This leads us to place a particular emphasis on the technical aspects of the observing proposals to maximize the scientific return, more so than for missions/instruments with longer lifetimes. In this talk, we will introduce the tools that we provide for you to write technically sound ASTRO-E XRS proposals. They include PIMMS/W3pimms and xspec/WebSpec for exposure time calculation, simaste for more detailed simulations (particularly of extended sources), and Wasabi, the Web-based observation visualization tool.
Coordinating Council. Seventh Meeting: Acquisitions
NASA Technical Reports Server (NTRS)
1992-01-01
The theme for this NASA Scientific and Technical Information Program Coordinating Council meeting was Acquisitions. In addition to NASA and the NASA Center for AeroSpace Information (CASI) presentations, the report contains fairly lengthy visuals about acquisitions at the Defense Technical Information Center. CASI's acquisitions program and CASI's proactive acquisitions activity were described. There was a presentation on the document evaluation process at CASI. A talk about open literature scope and coverage at the American Institute of Aeronautics and Astronautics was also given. An overview of the STI Program's Acquisitions Experts Committee was given next. Finally, acquisitions initiatives of the NASA STI program were presented.
Scaffolding Collaborative Technical Writing with Procedural Facilitation and Synchronous Discussion
ERIC Educational Resources Information Center
Yeh, Shiou-Wen; Lo, Jia-Jiunn; Huang, Jeng-Jia
2011-01-01
With the advent of computer technology, researchers and instructors are attempting to devise computer support for effective collaborative technical writing. In this study, a computer-supported environment for collaborative technical writing was developed. This system (Process-Writing Wizard) provides process-oriented scaffolds and a synchronous…
Pacific Educational Computer Network Study. Final Report.
ERIC Educational Resources Information Center
Hawaii Univ., Honolulu. ALOHA System.
The Pacific Educational Computer Network Feasibility Study examined technical and non-technical aspects of the formation of an international Pacific Area computer network for higher education. The technical study covered the assessment of the feasibility of a packet-switched satellite and radio ground distribution network for data transmission…
48 CFR 227.7103-7 - Use and non-disclosure agreement.
Code of Federal Regulations, 2010 CFR
2010-10-01
... release, disclosure, or authorized use of technical data or computer software subject to special license... (b) of this subsection, technical data or computer software delivered to the Government with..., display, or disclose technical data subject to limited rights or computer software subject to restricted...
48 CFR 252.204-7012 - Safeguarding of unclassified controlled technical information.
Code of Federal Regulations, 2014 CFR
2014-10-01
.... Cyber incident means actions taken through the use of computer networks that result in an actual or... printed within an information system. Technical information means technical data or computer software, as..., catalog-item identifications, data sets, studies and analyses and related information, and computer...
NASA Astrophysics Data System (ADS)
Fiala, L.; Lokajicek, M.; Tumova, N.
2015-05-01
This volume of the IOP Conference Series is dedicated to scientific contributions presented at the 16th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2014), held under the motto ''bridging disciplines''. The conference took place on September 1-5, 2014, at the Faculty of Civil Engineering, Czech Technical University in Prague, Czech Republic. The 16th edition of ACAT explored the boundaries of computing system architectures, data analysis algorithmics, automatic calculations, and theoretical calculation technologies. It provided a forum for confronting and exchanging ideas among these fields, where new approaches in computing technologies for scientific research were explored and promoted. This year's edition of the workshop brought together over 140 participants from all over the world. The workshop's 16 invited speakers presented key topics on advanced computing and analysis techniques in physics. During the workshop, 60 talks and 40 posters were presented in three tracks: Computing Technology for Physics Research, Data Analysis - Algorithms and Tools, and Computations in Theoretical Physics: Techniques and Methods. The round table enabled discussions on expanding software, knowledge sharing and scientific collaboration in the respective areas. ACAT 2014 was generously sponsored by Western Digital, Brookhaven National Laboratory, Hewlett Packard, DataDirect Networks, M Computers, Bright Computing, Huawei and PDV-Systemhaus. Special appreciation goes to the track liaisons Lorenzo Moneta, Axel Naumann and Grigory Rubtsov for their work on the scientific program and the publication preparation. ACAT's IACC would also like to express its gratitude to all referees for their work on making sure the contributions are published in the proceedings.
Our thanks extend to the conference liaisons Andrei Kataev and Jerome Lauret who worked with the local contacts and made this conference possible as well as to the program coordinator Federico Carminati and the conference chair Denis Perret-Gallix for their global supervision. Further information on ACAT 2014 can be found at http://www.particle.cz/acat2014
De Leo, Gianluca; Gonzales, Carol H; Battagiri, Padmaja; Leroy, Gondy
2011-08-01
Autism is a complex neurobiological disorder that is part of a group of disorders known as autism spectrum disorders (ASD). Today, one in 150 individuals is diagnosed with autism. Lack of social interaction and problems with communication are the main characteristics displayed by children with ASD. The Picture Exchange Communication System (PECS) is a communication system where children exchange visual symbols as a form of communication. The visual symbols are laminated pictures stored in a binder. We have designed, developed and are currently testing a software application, called PixTalk, which works on any Windows Mobile Smart-phone. Teachers and caregivers can access a web site and select from an online library the images to be downloaded onto the Smart-phone. Children can browse and select images to express their intentions, desires, and emotions using PixTalk. Case study results indicate that PixTalk can be used as part of ongoing therapy.
Winning the Global Skills Race: National Centers Prime Students for Success in Emerging Job Markets
ERIC Educational Resources Information Center
Murray, Corey
2007-01-01
This article talks about a joint effort between the National Science Foundation and the nation's community colleges that helps students secure jobs in technical career fields. It describes Advanced Technological Education Program (ATE), National Science Foundation's (NSF's) premier initiative with two-year colleges that was created in response to…
Framework Fuels the Need to Read: Strategies Boost Literacy of Students in Content-Area Classes
ERIC Educational Resources Information Center
Schoenbach, Ruth; Greenleaf, Cynthia L.; Hale, Gina
2010-01-01
Middle and high school teachers across academic disciplines face increased pressure to address the Common Core State Standards (CCSS) for English language arts and for literacy in history/social studies, science, and technical subjects. This means that the responsibility of preparing students to read, write, talk, and think critically about…
Safety and Sex Practices among Nebraska Adolescents. Technical Report 24.
ERIC Educational Resources Information Center
Newman, Ian M.; Perry-Hunnicutt, Christina
This report describes a range of adolescent behaviors related to their safety and the safety of others. The behaviors reported here range from ordinary safety precautions such as only swimming in supervised areas and wearing helmets when riding a motorcycle to less talked about behaviors such as using condoms during sexual intercourse and carrying…
ERIC Educational Resources Information Center
Farkas, Steve; Duffett, Ann
2014-01-01
In 1993, the Kettering Foundation and Public Agenda released a report titled "Divided Within, Besieged Without: The Politics of Education in Four American School Districts." The study's attention to communities was distinct from the conventional focus on the technical issues of school administration and funding, and it reported on what…
ERIC Educational Resources Information Center
Jakee, Keith
2011-01-01
This instructional paper is intended to provide an alternative approach to developing lecture materials, including handouts and PowerPoint slides, successfully developed over several years. The principal objective is to aid in the bridging of traditional "chalk and talk" lecture approaches with more active learning techniques, especially in more…
ERIC Educational Resources Information Center
Konopnicki, Patrick
1996-01-01
After hours of introducing team training, facilitation skills, and Total Quality Management tools, the old classroom practices of "chalk and talk" faded in Virginia Beach schools' technical and career education classes. Academic teachers also improved instruction, using innovative TQM tools such as nominal group voting, course mission…
U.S. Department of Energy facilities needed to advance nuclear power.
Ahearne, John F
2011-01-01
This talk is based upon a November 2008 report by the U.S. Department of Energy (DOE) Nuclear Energy Advisory Committee (NEAC). The report has two parts, a policy section and a technology section. Here extensive material from the Technical Subcommittee section of the NEAC report is used. Copyright © 2010 Health Physics Society
Decision aid use during post-biopsy consultations for localized prostate cancer.
Holmes-Rovner, Margaret; Srikanth, Akshay; Henry, Stephen G; Langford, Aisha; Rovner, David R; Fagerlin, Angela
2018-02-01
Decision Aids (DAs) effectively translate medical evidence for patients but are not routinely used in clinical practice. Little is known about how DAs are used during patient-clinician encounters. To characterize the content and communicative function of high-quality DAs during diagnostic clinic visits for prostate cancer. 252 men newly diagnosed with localized prostate cancer who had received a DA, 45 treating physicians at 4 US Veterans Administration urology clinics. Qualitative analysis of transcribed audio recordings was used to inductively develop categories capturing content and function of all direct references to DAs (booklet talk). The presence or absence of any booklet talk per transcript was also calculated. Booklet talk occurred in 55% of transcripts. Content focused on surgical procedures (36%); treatment choice (22%); and clarifying risk classification (17%). The most common function of booklet talk was patient corroboration of physicians' explanations (42%), followed by either physician or patient acknowledgement that the patient had the booklet. Codes reflected the absence of DA use for shared decision-making. In regression analysis, predictors of booklet talk were fewer years of patient education (P = .027) and more time in the encounter (P = .027). Patient race, DA type, time reading the DA, physician informing quality and physician age did not predict booklet talk. Results show that good decision aids, systematically provided to patients, appeared to function not to open up deliberations about how to balance benefits and harms of competing treatments, but rather to allow patients to ask narrow technical questions about recommended treatments. © 2017 The Authors Health Expectations Published by John Wiley & Sons Ltd.
48 CFR 209.505-4 - Obtaining access to proprietary information.
Code of Federal Regulations, 2014 CFR
2014-10-01
...) For contractors accessing third party proprietary technical data or computer software, non-disclosure... limited rights technical data, commercial technical data, or restricted rights computer software. The...
ERIC Educational Resources Information Center
Thornburg, David; Beane, Pam
1983-01-01
Presents programs for creating animated characters (Atari), random sentences (Logo), and making a triangle (TRS-80 Level III Basic), and suggestions for creative writing and comparison shopping for computers/software. Also includes "Modems for Micros: Your Computer Can Talk on the Phone" (Bill Chalgren) on telecommunications capabilities of…
Theoretical/Computational Studies of High-Temperature Superconductivity from Quantum Magnetism
2016-06-09
Rodriguez, Department of Physics and Astronomy, California State University, Los Angeles, California 90032. Abstract: The symmetry of a single Cooper pair in... 2014. 7. J.P. Rodriguez, “Collective Modes in Iron Superconductors from the Local Moment Limit” (invited talk), Department of Physics and Astronomy... “Are the New Class of Iron-Pnictide Superconductors Doped Mott Insulators?” (invited talk), Department of Physics and Astronomy, California State
NASA Astrophysics Data System (ADS)
Morin, Paul; Porter, Claire; Cloutier, Michael; Howat, Ian; Noh, Myoung-Jong; Willis, Michael; Kramer, WIlliam; Bauer, Greg; Bates, Brian; Williamson, Cathleen
2017-04-01
Surface topography is among the most fundamental data sets for geosciences, essential for disciplines ranging from glaciology to geodynamics. Two new projects are using sub-meter, commercial imagery licensed by the National Geospatial-Intelligence Agency and open source photogrammetry software to produce a time-tagged 2m posting elevation model of the Arctic and an 8m posting reference elevation model for the Antarctic. When complete, this publicly available data will be at higher resolution than any elevation models that cover the entirety of the Western United States. These two polar projects are made possible due to three equally important factors: 1) open-source photogrammetry software, 2) petascale computing, and 3) sub-meter imagery licensed to the United States Government. Our talk will detail the technical challenges of using automated photogrammetry software; the rapid workflow evolution to allow DEM production; the task of deploying the workflow on one of the world's largest supercomputers; the trials of moving massive amounts of data; and the management challenges the team needed to solve in order to meet deadlines. Finally, we will discuss the implications of this type of collaboration for future multi-team use of leadership-class systems such as Blue Waters, and for further elevation mapping.
A Talking Computers System for Persons with Vision and Speech Handicaps. Final Report.
ERIC Educational Resources Information Center
Visek & Maggs, Urbana, IL.
This final report contains a detailed description of six software systems designed to assist individuals with blindness and/or speech disorders in using inexpensive, off-the-shelf computers rather than expensive custom-made devices. The developed software is not written in the native machine language of any particular brand of computer, but in the…
ERIC Educational Resources Information Center
Arnold, Nike
2007-01-01
Many studies (e.g., [Beauvois, M.H., 1998. "E-talk: Computer-assisted classroom discussion--attitudes and motivation." In: Swaffar, J., Romano, S., Markley, P., Arens, K. (Eds.), "Language learning online: Theory and practice in the ESL and L2 computer classroom." Labyrinth Publications, Austin, TX, pp. 99-120; Bump, J., 1990. "Radical changes in…
ERIC Educational Resources Information Center
Trescases, Pierre
A computer system developed as a database access facilitator for the blind is found to have application to foreign language instruction, specifically in teaching French to speakers of English. The computer is programmed to translate symbols from the International Phonetic Alphabet (IPA) into appropriate phonemes for whatever language is being…
Code of Federal Regulations, 2010 CFR
2010-10-01
... technical data and computer software-Small Business Innovation Research (SBIR) Program. 252.227-7018 Section... Innovation Research (SBIR) Program. As prescribed in 227.7104(a), use the following clause: Rights in Noncommercial Technical Data and Computer Software—Small Business Innovation Research (SBIR) Program (JUN 1995...
Code of Federal Regulations, 2014 CFR
2014-10-01
... technical data and computer software-Small Business Innovation Research (SBIR) Program. 252.227-7018 Section... Innovation Research (SBIR) Program. As prescribed in 227.7104(a), use the following clause: Rights in Noncommercial Technical Data and Computer Software—Small Business Innovation Research (SBIR) Program (FEB 2014...
Code of Federal Regulations, 2011 CFR
2011-10-01
... technical data and computer software-Small Business Innovation Research (SBIR) Program. 252.227-7018 Section... Innovation Research (SBIR) Program. As prescribed in 227.7104(a), use the following clause: Rights in Noncommercial Technical Data and Computer Software—Small Business Innovation Research (SBIR) Program (MAR 2011...
Code of Federal Regulations, 2013 CFR
2013-10-01
... technical data and computer software-Small Business Innovation Research (SBIR) Program. 252.227-7018 Section... Innovation Research (SBIR) Program. As prescribed in 227.7104(a), use the following clause: Rights in Noncommercial Technical Data and Computer Software—Small Business Innovation Research (SBIR) Program (MAY 2013...
The Application of Computers to Library Technical Processing
ERIC Educational Resources Information Center
Veaner, Allen B.
1970-01-01
Describes computer applications to acquisitions and technical processing and reports in detail on Stanford's development work in automated technical processing. Author is Assistant Director for Bibliographic Operation, Stanford University Libraries. (JB)
Flight Engineer Budarin uses a laptop computer in the SM during Expedition Six
2003-03-21
ISS006-E-45279 (21 March 2003) --- Cosmonaut Nikolai M. Budarin, Expedition Six flight engineer, uses a computer as he talks on a communication system in the Zvezda Service Module on the International Space Station (ISS). Budarin represents Rosaviakosmos.
ERIC Educational Resources Information Center
Clearing: Nature and Learning in the Pacific Northwest, 1985
1985-01-01
Presents an activity in which students create a computer program capable of recording and projecting paper use at school. Includes instructional strategies and background information such as requirements for pounds of paper/tree, energy needs, water consumption, and paper value at the recycling center. A sample program is included. (DH)
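The sample program itself is not reproduced in the record; a minimal sketch of the kind of paper-use projection program the activity describes might look like the following, where all conversion figures (sheets per pound, pounds per tree, weeks per school year) are illustrative assumptions rather than values from the article:

```python
# Illustrative sketch of a classroom paper-use projection program.
# All conversion constants are assumed values, not figures from the source.

SHEETS_PER_POUND = 100   # assumed: sheets of paper per pound
POUNDS_PER_TREE = 120    # assumed: pounds of paper yielded per tree

def project_paper_use(weekly_sheets, weeks_per_year=36):
    """Project yearly sheets used, pounds of paper, and trees consumed
    from a recorded weekly sheet count."""
    yearly_sheets = weekly_sheets * weeks_per_year
    pounds = yearly_sheets / SHEETS_PER_POUND
    trees = pounds / POUNDS_PER_TREE
    return yearly_sheets, pounds, trees

sheets, pounds, trees = project_paper_use(2500)
print(f"{sheets} sheets/year, about {pounds:.0f} lb, about {trees:.1f} trees")
```

Students could record actual weekly counts from their school and swap in local conversion figures from the background information the activity provides.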
ERIC Educational Resources Information Center
Metz, Rosalyn
2010-01-01
While many talk about the cloud, few actually understand it. Three organizations' definitions come to the forefront when defining the cloud: Gartner, Forrester, and the National Institutes of Standards and Technology (NIST). Although both Gartner and Forrester provide definitions of cloud computing, the NIST definition is concise and uses…
Computing and Systems Applied in Support of Coordinated Energy, Environmental, and Climate Planning
This talk focuses on how Dr. Loughlin is applying Computing and Systems models, tools and methods to more fully understand the linkages among energy systems, environmental quality, and climate change. Dr. Loughlin will highlight recent and ongoing research activities, including: ...
Physics and Robotic Sensing -- the good, the bad, and approaches to making it work
NASA Astrophysics Data System (ADS)
Huff, Brian
2011-03-01
All of the technological advances that have benefited consumer electronics have direct application to robotics. Technological advances have resulted in the dramatic reduction in size, cost, and weight of computing systems, while simultaneously doubling computational speed every eighteen months. The same manufacturing advancements that have enabled this rapid increase in computational power are now being leveraged to produce small, powerful and cost-effective sensing technologies applicable for use in mobile robotics applications. Despite the increase in computing and sensing resources available to today's robotic systems developers, there are sensing problems typically found in unstructured environments that continue to frustrate the widespread use of robotics and unmanned systems. This talk presents how physics has contributed to the creation of the technologies that are making modern robotics possible. The talk discusses theoretical approaches to robotic sensing that appear to suffer when they are deployed in the real world. Finally, the author presents methods being used to make robotic sensing more robust.
Analysis for water conflicts in a changing world
NASA Astrophysics Data System (ADS)
Lund, J. R.
2012-12-01
Like any subject which involves billions of dollars and thousands or millions of people, managing water involves serious conflicts among contending objectives and interest groups. These conflicts usually spill into the technical and scientific analysis of water resources problems and potential solutions. A favorable or unfavorable analytical outcome can be worth millions or cost millions to a stakeholder, so they have a self-interested duty to contend. This talk examines ideas for conducting analysis to improve the technical and scientific quality of public and policy discussions of controversial water problems. More than just solid technical work is needed. Investigators must organize, disseminate, and communicate their work effectively and attentively. Research must often be designed to be effective in informing policy discussions. Several sometimes conflicting strategies are available for this.
Coordination and interpretation of vocal and visible resources: 'trail-off' conjunctions.
Walker, Gareth
2012-03-01
The empirical focus of this paper is a conversational turn-taking phenomenon in which conjunctions produced immediately after a point of possible syntactic and pragmatic completion are treated by co-participants as points of possible completion and transition relevance. The data for this study are audio-video recordings of 5 unscripted face-to-face interactions involving native speakers of US English, yielding 28 'trail-off' conjunctions. Detailed sequential analysis of talk is combined with analysis of visible features (including gaze, posture, gesture and involvement with material objects) and technical phonetic analysis. A range of phonetic and visible features are shown to regularly co-occur in the production of 'trail-off' conjunctions. These features distinguish them from other conjunctions followed by the cessation of talk.
NASA Astrophysics Data System (ADS)
Cho, Young-Ho
2012-09-01
This special section of Journal of Micromechanics and Microengineering features papers selected from the 11th International Workshop on Micro and Nanotechnology for Power Generation and Energy Conversion Applications (PowerMEMS 2011), held at Sejong Hotel in Seoul, Korea during 15-18 November 2011. Since the first PowerMEMS workshop held in Sendai, Japan in 2000, the workshop has developed as the premier forum for reporting research results in micro and nanotechnology for power generation, energy conversion, harvesting and processing applications, including in-depth technical issues on nanostructures and materials for small-scale high-density energy and thermal management. Potential PowerMEMS applications cover not only portable power devices for consumer electronics and remote sensors, but also micro engines, impulsive thrusters and fuel cells for systems ranging from the nanometer to the millimeter scale. The 2011 technical program consists of 1 plenary talk, 4 invited talks and 118 contributed presentations. The 48 oral and 70 poster presentations, selected by 27 Technical Program Committee Members from 131 submitted abstracts, have stimulated lively discussion maximizing the interaction between participants. Among them, this special section includes 9 papers covering micro-scale power generators, energy converters, harvesters, thrusters and thermal coolers. Finally, we are grateful to the members of the International Steering Committee, the Technical Program Committee, and the Local Organizing Committee for their efforts and contributions to PowerMEMS 2011. We also thank the two companies Samsung Electro-Mechanics and LG Elite for technical tour arrangements. Special thanks go to Dr Ian Forbes, the editorial staff of the Journal of Micromechanics and Microengineering, as well as to the staff of IOP Publishing for making this special section possible.
Computer-aided drug design: the next 20 years
NASA Astrophysics Data System (ADS)
Van Drie, John H.
2007-10-01
This perspectives article has been taken from a talk the author gave at the symposium in honor of Yvonne C. Martin's retirement, held at the American Chemical Society spring meeting in Chicago on March 25, 2007. The talk was intended as a somewhat lighthearted attempt to gaze into the future; inevitably, in print, things will come across more seriously than was intended. As we all know—the past is rarely predictive of the future.
Dan Goldin Presentation: Pathway to the Future
NASA Technical Reports Server (NTRS)
1999-01-01
In the "Path to the Future" presentation held at NASA's Langley Center on March 31, 1999, NASA's Administrator Daniel S. Goldin outlined the future direction and strategies of NASA in relation to the general space exploration enterprise. NASA's Vision, Future System Characteristics, Evolutions of Engineering, and Revolutionary Changes are the four main topics of the presentation. In part one, the Administrator talks in detail about NASA's vision in relation to the NASA Strategic Activities that are Space Science, Earth Science, Human Exploration, and Aeronautics & Space Transportation. Topics discussed in this section include: space science for the 21st century, flying in the Mars atmosphere (Mars plane), exploring new worlds, interplanetary internets, earth observation and measurements, distributed information-system-in-the-sky, science enabling understanding and application, space station, microgravity, science and exploration strategies, human Mars mission, advanced space transportation program, general aviation revitalization, and reusable launch vehicles. In part two, he briefly talks about the future system characteristics. He discusses major system characteristics like resiliency, self-sufficiency, high distribution, ultra-efficiency, and autonomy and the necessity to overcome any distance, time, and extreme environment barriers. Part three of Mr. Goldin's talk deals with engineering evolution, mainly evolution in the Computer Aided Design (CAD)/Computer Aided Engineering (CAE) systems. These systems include computer aided drafting, computerized solid models, virtual product development (VPD) systems, networked VPD systems, and knowledge enriched networked VPD systems. In part four, the last part, the Administrator talks about the need for revolutionary changes in communication and networking areas of a system.
According to the administrator, the four major areas that need cultural changes in the creativity process are human-centered computing, an infrastructure for distributed collaboration, rapid synthesis and simulation tools, and life-cycle integration and validation. Mr. Goldin concludes his presentation with the following maxim "Collaborate, Integrate, Innovate or Stagnate and Evaporate." He also answers some questions after the presentation.
The Design and Transfer of Advanced Command and Control (C2) Computer-Based Systems
1980-03-31
TECHNICAL REPORT 80-02. QUARTERLY TECHNICAL REPORT: THE DESIGN AND TRANSFER OF ADVANCED COMMAND AND CONTROL (C2) COMPUTER-BASED SYSTEMS. ARPA... The Tasks/Objectives and/or Purposes of the overall project are connected with the design, development, demonstration and transfer of advanced... command and control (C2) computer-based systems; this report covers work in the computer-based design and transfer areas only. The Technical Problems thus
Non-standard analysis and embedded software
NASA Technical Reports Server (NTRS)
Platek, Richard
1995-01-01
One model for computing in the future is ubiquitous, embedded computational devices analogous to embedded electrical motors. Many of these computers will control physical objects and processes. Such hidden computerized environments introduce new safety and correctness concerns whose treatment goes beyond present Formal Methods. In particular, one has to begin to speak about Real Space software in analogy with Real Time software. By this we mean computerized systems which have to meet requirements expressed in the real geometry of space. How to translate such requirements into ordinary software specifications and how to carry out proofs is a major challenge. In this talk we propose a research program based on the use of non-standard analysis. Much detail remains to be carried out. The purpose of the talk is to inform the Formal Methods community that Non-Standard Analysis provides a possible avenue of attack which we believe will be fruitful.
Commodity Cluster Computing for Remote Sensing Applications using Red Hat LINUX
NASA Technical Reports Server (NTRS)
Dorband, John
2003-01-01
Since 1994, we have been doing research at Goddard Space Flight Center on implementing a wide variety of applications on commodity-based computing clusters. This talk is about these clusters and how they are used in these applications, including ones for remote sensing.
Owens, Mandy D; Rowell, Lauren N; Moyers, Theresa
2017-10-01
Motivational Interviewing (MI) is an evidence-based approach shown to be helpful for a variety of behaviors across many populations. Treatment fidelity is an important tool for understanding how and with whom MI may be most helpful. The Motivational Interviewing Treatment Integrity coding system was recently updated to incorporate new developments in the research and theory of MI, including the relational and technical hypotheses of MI (MITI 4.2). To date, no studies have examined the MITI 4.2 with forensic populations. In this project, twenty-two brief MI interventions with jail inmates were evaluated to test the reliability of the MITI 4.2. Validity of the instrument was explored using regression models to examine the associations between global scores (Empathy, Partnership, Cultivating Change Talk and Softening Sustain Talk) and outcomes. Reliability of this coding system with these data was strong. We found that therapists had lower ratings of Empathy with participants who had more extensive criminal histories. Both Relational and Technical global scores were associated with criminal histories as well as post-intervention ratings of motivation to decrease drug use. Findings indicate that the MITI 4.2 was reliable for coding sessions with jail inmates. Additionally, results provided information related to the relational and technical hypotheses of MI. Future studies can use the MITI 4.2 to better understand the mechanisms behind how MI works with this high-risk group. Published by Elsevier Ltd.
Comprehensive feedback on trainee surgeons’ non-technical skills
Dieckmann, Peter; Beier-Holgersen, Randi; Rosenberg, Jacob; Oestergaard, Doris
2015-01-01
Objectives This study aimed to explore the content of conversations, feedback style, and perceived usefulness of feedback to trainee surgeons when conversations were stimulated by a tool for assessing surgeons’ non-technical skills. Methods Trainee surgeons and their supervisors used the Non-Technical Skills for Surgeons in Denmark tool to stimulate feedback conversations. Audio recordings of post-operation feedback conversations were collected. Trainees and supervisors provided questionnaire responses on the usefulness and comprehensiveness of the feedback. The feedback conversations were qualitatively analyzed for content and feedback style. Usefulness was investigated using a scale from 1 to 5 and written comments were qualitatively analyzed. Results Six trainees and six supervisors participated in eight feedback conversations. Eighty questionnaires (response rate 83 percent) were collected from 13 trainees and 12 supervisors. Conversations lasted a median of eight minutes (range 2-15). Supervisors used the elements and categories in the tool to structure the content of the conversations. Supervisors tended to talk about the trainees’ actions and their own frames rather than attempting to understand the trainees’ perceptions. Supervisors and trainees welcomed the feedback opportunity and agreed that the conversations were useful and comprehensive. Conclusions The content of the feedback conversations reflected the contents of the tool and the feedback was considered useful and comprehensive. However, supervisors talked primarily about their own frames, so in order for the feedback to reach its full potential, supervisors may benefit from training techniques to stimulate a deeper reflection among trainees. PMID:25602262
ERIC Educational Resources Information Center
Newman, Denis; Torzs, Frederic
Arguing that the development of a notion of sense-making is of critical importance to improving science learning, this paper examines science teaching in four Boston (Massachusetts)-area classrooms that participated in an experiment on ways of integrating technology into a sixth-grade science curriculum on the earth's seasons. The task of the…
Overcoming the Challenges of Globalization: Community Colleges and the South's Economic Future.
ERIC Educational Resources Information Center
Edwards, Allen, Ed.
2000-01-01
This newsletter offers the text of an address by David L. Dodson, President of MDC, Inc., to the presidents and officers of the Southern Association of Community, Junior, and Technical Colleges (SACTJC) on December 6, 1999 in Atlanta, Georgia. The talk focuses on MDC's fall 2000 report, the "State of the South." Research for the report…
Kabachinski, Jeff
2005-01-01
In this issue and the next, we're investigating different sections of the DICOM Standard to get a grip on what it's all about. The bottom line is to facilitate communication, and DICOM addresses all the technical aspects that allow compliant OEMs to talk to one another. In part 2, IT World completes its overview of DICOM by exploring UIDs, networking with DICOM, conformance, and conformance statements.
ERIC Educational Resources Information Center
Zuckerman, John V.
In an experiment to determine the most efficient design for the commentary of an instructional film, special consideration was given to three variables concerned with the construction of commentaries: the level of verbalization (the amount of talk), the personal reference of the narrator, and the phase relationship between the commentary and the…
NASA Technical Reports Server (NTRS)
2004-01-01
An interview with William Readdy is presented. Readdy graduated from the United States Naval Academy in 1974. After eleven years of service as a naval aviator and test pilot, he joined NASA in 1986 as a research pilot. His technical assignments to date have included Training and Safety Officer, Orbiter project staff; NASA Director of Operations in Star City, Russia; and Space Shuttle Program Development Manager.
Get College- and Career-Ready at a Vo-Tech High School
ERIC Educational Resources Information Center
Demarest, Kathy K.; Gehrt, Victoria C.
2015-01-01
While talk abounds in the buzzword-happy education arena of what it means to develop students who are college and career ready, the author paints a portrait of a vocational-technical school district in northern Delaware that is actually doing both with its students, and has been for some 40 years. The vo-tech experience is not for students who…
A project in Zambia: talking to children about AIDS.
Baker, K
1988-09-01
Early in 1987, it became clear to this individual that children were a high-priority group for Acquired Immune Deficiency Syndrome (AIDS) education. Preparation for providing AIDS education in Zambia included reading as much as possible about AIDS and AIDS education in schools, contacting the Health Education Unit at the Ministry of Health for their permission and advice, and making posters and preparing a list of 10 basic questions about AIDS. The first talks were at a boys' technical school and a large girls' day school. Following an introduction of the subject, the format included: a 10-minute quiz with students writing down their answers; a 35-40 minute talk, using posters as visual aids; a 20-30 minute open question time; and a repeat of the same quiz as a form of "posttest." The students responded positively, and there was a substantial increase in the percentage of correct answers after each talk. Subsequently, talks were given in other Lusaka secondary schools. After the first few talks, the pretest and posttest were discontinued, as it was considered preferable to spend more time answering the students' questions. The talks varied depending on the audience, but posters were always included as visual aids. Initially, this AIDS education effort was voluntary and unfunded. Subsequently, as the work grew, NORAD funded the project, paying for duplicating and printing as well as a salary on an hourly basis. A booklet on AIDS for secondary schools has been written and duplicated and accepted by the Intersectorial Committee on AIDS Health Education with minor changes. Late in 1987, the booklet was rewritten totally and expanded, with numerous illustrations. Throughout the booklet, human immunodeficiency virus (HIV) is carefully differentiated from AIDS disease. Talks also have been initiated at the Upper Primary School level. The format has been altered somewhat for these younger children as they tend to be noisy and excited.
The primary project planned for 1988 is to talk to teachers and health educators, individually and in groups, informally and formally. Thus far, 58 talks have been given in 22 secondary schools and 11 primary schools along with 29 talks to nonschool groups. Culturally, it is much easier as a medically trained non-Zambian to talk about AIDS and AIDS-related concerns, but the message needs to be given more than once. It must be discussed both in the classroom and at home until it becomes a part of life.
ERIC Educational Resources Information Center
Nikolaidou, Georgia N.
2012-01-01
This exploratory work describes and analyses the collaborative interactions that emerge during computer-based music composition in the primary school. The study draws on socio-cultural theories of learning, originated within Vygotsky's theoretical context, and proposes a new model, namely Computer-mediated Praxis and Logos under Synergy (ComPLuS).…
Equation solvers for distributed-memory computers
NASA Technical Reports Server (NTRS)
Storaasli, Olaf O.
1994-01-01
A large number of scientific and engineering problems require the rapid solution of large systems of simultaneous equations. The performance of parallel computers in this area now dwarfs traditional vector computers by nearly an order of magnitude. This talk describes the major issues involved in parallel equation solvers with particular emphasis on the Intel Paragon, IBM SP-1 and SP-2 processors.
Too much small talk? Medical students' pelvic examination skills falter with pleasant patients.
Posner, Glenn D; Hamstra, Stanley J
2013-12-01
The competent performance of a female pelvic examination requires both technical proficiency and superlative communication skills. However, the ideal medium with which to assess these skills remains to be elucidated. Part-task trainers (PTTs) offer an effective and affordable means of testing technical skills, but may not allow students to demonstrate their communication skills. Hybrids involving standardised patients (SPs) (SP-PTT) offer a more realistic assessment of communication, but students may feel awkward when examining the female genitalia. The objective of this study was to compare the use of PTTs with that of SP-PTT hybrids in the assessment of technical and communication skills in the female pelvic examination. A total of 145 medical students were randomised to one of three conditions during their summative objective structured clinical examination (OSCE) at the completion of clerkship. Students performed the female pelvic examination on: (i) a PTT alone ('plastic' condition); (ii) an SP-PTT hybrid with an SP who did not engage in any superfluous conversation ('perfunctory' condition), or (iii) an SP-PTT hybrid with an SP who was trained to offer small talk and banter, which was judged to better reflect the typical doctor-patient interaction ('pleasant' condition). Communication skills did not differ significantly among the three groups (p = 0.354). There was a significant difference among groups in technical skills scores (p = 0.0018). Students in the 'plastic' condition performed best, followed by those in the 'perfunctory' and 'pleasant' conditions, respectively. Medical students demonstrate equivalent communication skills whether they work with a PTT or an SP-PTT hybrid, but their technical skills suffer in the presence of an SP. Working with the PTT alone does not appear to disadvantage students in terms of communication skills, but may offer better conditions for performing technical aspects of the procedure. 
Whether the 'plastic patient' is the most meaningful and valid means of predicting overall competence in the clinical setting is still a matter for debate. © 2013 John Wiley & Sons Ltd.
Developing Technology Products - A Physicist's Perspective
NASA Astrophysics Data System (ADS)
Burka, Michael
2014-03-01
There are many physicists working in the industrial sector. We rarely have the word physicist in our job title; we are far more commonly called engineers or scientists. But, we are physicists, and we succeed because our training in physics has given us the habits of mind and the technical skills that one needs to solve complex technical challenges. This talk will explore the transition from physics research to technology product development using examples from my own career, first as a postdoctoral fellow and research scientist on the LIGO project, and then developing products in the spectroscopy, telecommunications, and medical device industries. Approaches to identifying and pursuing opportunities in industry will be discussed.
"Ask Argonne" - Charlie Catlett, Computer Scientist, Part 2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Catlett, Charlie
2014-06-17
A few weeks back, computer scientist Charlie Catlett talked a bit about the work he does and invited questions from the public during Part 1 of his "Ask Argonne" video set (http://bit.ly/1joBtzk). In Part 2, he answers some of the questions that were submitted. Enjoy!
Tableau Economique: Teaching Economics with a Tablet Computer
ERIC Educational Resources Information Center
Scott, Robert H., III
2011-01-01
The typical method of instruction in economics is chalk and talk. Economics courses often require writing equations and drawing graphs and charts, which are all best done in freehand. Unlike static PowerPoint presentations, tablet computers create dynamic nonlinear presentations. Wireless technology allows professors to write on their tablets and…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deline, C.
Computer modeling is able to predict the performance of distributed power electronics (microinverters, power optimizers) in PV systems. However, details about partial shade and other mismatch must be known in order to give the model accurate information to go on. This talk will describe recent updates in NREL’s System Advisor Model program to model partial shading losses with and without distributed power electronics, along with experimental validation results.
Pfeffer, J; Sutton, R I
1999-01-01
In today's business world, there's no shortage of know-how. When companies get into trouble, their executives have vast resources at their disposal: their own experiences, colleagues' ideas, reams of computer-generated data, thousands of publications, and consultants armed with the latest managerial concepts and tools. But all too often, even with all that knowledge floating around, companies are plagued with an inertia that comes from knowing too much and doing too little--a phenomenon the authors call the knowing-doing gap. The gap often can be traced to a basic human propensity: the willingness to let talk substitute for action. When confronted with a problem, people act as though discussing it, formulating decisions, and hashing out plans for action are the same as actually fixing it. And after researching organizations of all shapes and sizes, the authors concluded that a particular kind of talk is an especially insidious inhibitor of action: "smart talk." People who can engage in such talk generally sound confident and articulate; they can spout facts and may even have interesting ideas. But such people often exhibit the less benign aspects of smart talk as well: They focus on the negative, and they favor unnecessarily complex or abstract language. The former lapses into criticism for criticism's sake; the latter confuses people. Both tendencies can stop action in its tracks. How can you shut the smart-talk trap and close the knowing-doing gap? The authors lay out five methods that successful companies employ in order to translate the right kind of talk into intelligent action.
Spatio-Temporal Nonlinear Filtering With Applications to Information Assurance and Counter Terrorism
2011-11-14
48 CFR 212.7003 - Technical data and computer software.
Code of Federal Regulations, 2011 CFR
2011-10-01
... computer software. 212.7003 Section 212.7003 Federal Acquisition Regulations System DEFENSE ACQUISITION... data and computer software. For purposes of establishing delivery requirements and license rights for technical data under 227.7102 and for computer software under 227.7202, there shall be a rebuttable...
48 CFR 212.7003 - Technical data and computer software.
Code of Federal Regulations, 2013 CFR
2013-10-01
... computer software. 212.7003 Section 212.7003 Federal Acquisition Regulations System DEFENSE ACQUISITION... data and computer software. For purposes of establishing delivery requirements and license rights for technical data under 227.7102 and for computer software under 227.7202, there shall be a rebuttable...
48 CFR 212.7003 - Technical data and computer software.
Code of Federal Regulations, 2012 CFR
2012-10-01
... computer software. 212.7003 Section 212.7003 Federal Acquisition Regulations System DEFENSE ACQUISITION... data and computer software. For purposes of establishing delivery requirements and license rights for technical data under 227.7102 and for computer software under 227.7202, there shall be a rebuttable...
48 CFR 212.7003 - Technical data and computer software.
Code of Federal Regulations, 2014 CFR
2014-10-01
... computer software. 212.7003 Section 212.7003 Federal Acquisition Regulations System DEFENSE ACQUISITION... data and computer software. For purposes of establishing delivery requirements and license rights for technical data under 227.7102 and for computer software under 227.7202, there shall be a rebuttable...
48 CFR 212.7003 - Technical data and computer software.
Code of Federal Regulations, 2010 CFR
2010-10-01
... computer software. 212.7003 Section 212.7003 Federal Acquisition Regulations System DEFENSE ACQUISITION... data and computer software. For purposes of establishing delivery requirements and license rights for technical data under 227.7102 and for computer software under 227.7202, there shall be a rebuttable...
Technical Writing in the Computer Industry: Job Opportunities for PH.D.'s.
ERIC Educational Resources Information Center
Turnbull, Andrew D.
1981-01-01
Answers questions about the field of technical writing, especially in the computer industry. Explains what "software" and "software documentation" are, what the "software documentation specialist" (technical writer) does, and how to prepare for such a job. (FL)
How to manage without being a manager
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sweeney, M.A.
1997-06-01
In the author's current position at Sandia National Laboratories within the Pulsed Power Sciences Center, much of the author's time is spent composing short (one-page) technical reports and long (>20-page) technical contracts and program plans for transmission to the Department of Energy and to upper management, and also in reviewing long technical documents for accuracy. A major requirement of these efforts is to complete them on a timely basis, often within a few hours or a few days. In this talk, the author reveals some communication "secrets" that have been learned. The idea behind these twelve "secrets" is to get the answers you, as a nonmanager, need quickly from a manager without creating stress either on your part or the manager's part.
Generalizations of polylogarithms for Feynman integrals
NASA Astrophysics Data System (ADS)
Bogner, Christian
2016-10-01
In this talk, we discuss recent progress in the application of generalizations of polylogarithms in the symbolic computation of multi-loop integrals. We briefly review the Maple program MPL which supports a certain approach for the computation of Feynman integrals in terms of multiple polylogarithms. Furthermore we discuss elliptic generalizations of polylogarithms which have shown to be useful in the computation of the massive two-loop sunrise integral.
Francescatto, Margherita; Hermans, Susanne M A; Babaei, Sepideh; Vicedo, Esmeralda; Borrel, Alexandre; Meysman, Pieter
2015-01-01
In this meeting report, we give an overview of the talks, presentations and posters presented at the third European Symposium of the International Society for Computational Biology (ISCB) Student Council. The event was organized as a satellite meeting of the 13th European Conference for Computational Biology (ECCB) and took place in Strasbourg, France on September 6th, 2014.
Underwater sound from the whale's point of view
NASA Astrophysics Data System (ADS)
Arveson, Paul T.
2003-04-01
There have been numerous reports in the recent literature of apparently stressful effects on marine mammals due to sonar experiments. But another man-made source, the radiated noise from ships, contributes significantly to the ocean ambient, nearly everywhere and all the time. The technical basis for this talk is a set of accurate and detailed measurements of the radiated noise of a typical cargo ship [P. Arveson and D. Vendittis, ``Radiated noise characteristics of a large cargo ship,'' J. Acoust. Soc. Am. (2000)]. However, the talk will be a popular-level demonstration and a (necessarily) fictitious narrative of acoustical experiences from a humpback whale's point of view. Room acoustics permitting, the audience should be able to gain an experiential insight into the environmental impact of shipping noise on the life and habits of these creatures.
Ourmazd, Abbas [University of Wisconsin, Milwaukee, Wisconsin, USA
2017-12-09
Ever shattered a valuable vase into 10^6 pieces and tried to reassemble it under a light providing a mean photon count of 10^-2 per detector pixel, with shot noise? If you can do that, you can do single-molecule crystallography. This talk will outline how this can be done in principle. In more technical terms, the talk will describe how the combination of scattering physics and Bayesian algorithms can be used to reconstruct the 3-D diffracted intensity distribution from a collection of individual 2-D diffraction patterns down to a mean photon count of 10^-2 per pixel, the signal level anticipated from the Linac Coherent Light Source, and hence determine the structure of individual macromolecules and nanoparticles.
The Los Alamos Scientific Laboratory - An Isolated Nuclear Research Establishment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bradbury, Norris E.; Meade, Roger Allen
Early in his twenty-five-year career as the Director of the Los Alamos Scientific Laboratory, Norris Bradbury wrote at length about the atomic bomb and the many implications the bomb might have on the world. His themes were both technical and philosophical. In 1963, after nearly twenty years of leading the nation’s first nuclear weapons laboratory, Bradbury took the opportunity to broaden his writing. In a paper delivered to the International Atomic Energy Agency’s symposium on the “Criteria in the Selection of Sites for the Construction of Reactors and Nuclear Research Centers,” Bradbury spoke about the business of nuclear research and the human component of operating a scientific laboratory. This report is the transcript of his talk.
Game Changers: The Quest to Rethink Institutional Roles and Functions at U.S. Community Colleges
ERIC Educational Resources Information Center
Woods, Bob
2014-01-01
When the 10 members of the American Association of Community College's (AACC's) 21st-Century Implementation Team 7 (nine of whom are community college presidents) sat down in 2013 to talk about reforming institutional roles and functions at the nation's two-year career and technical colleges, everyone in the room knew the work before them would be…
Strategic Engagement in Global S&T: Opportunities for Defense Research
2014-01-01
Towards Intelligent Control for Next Generation CESTOL Aircraft
NASA Technical Reports Server (NTRS)
Acosta, Diana Michelle
2008-01-01
This talk will present the motivation, research approach and status of intelligent control research for Next Generation Cruise Efficient Short Take Off and Landing (CESTOL) aircraft. An introduction to the challenges of CESTOL control will be given, leading into an assessment of potential control solutions. The approach of the control research will be discussed, including a brief overview of the technical aspects of the research.
ERIC Educational Resources Information Center
Association for Education in Journalism and Mass Communication.
The Media and Technology section of this collection of conference presentations contains the following 13 papers: "The 'Talking Newspaper': The Technical Virtuosity and Monologic Modality of Audiotex(t)" (George Albert Gladney); "An Historic Opportunity?: Communication Research in the Design of Communication Interfaces and…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grossman, David
An irreverent, non-technical review of the history of surprisingly animate machines, from ancient Egypt to current times. Areas include teleoperators for hazardous environments, assembly systems, medical applications, entertainment, and science fiction. The talk has over 100 slides, covering such varied topics as Memnon son of Dawn, Droz's automata, Vaucanson's duck, cathedral clocks, Von Kempelen's chess player, household robots, Asimov's laws, Disneyland, dinosaurs, and movie droids and cyborgs.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-07
....usda.gov . SUPPLEMENTARY INFORMATION: A. Background A proposed rule was published in the Federal.... Computers or other technical equipment means central processing units, laptops, desktops, computer mouses...
Problematics of different technical maintenance for computers
NASA Technical Reports Server (NTRS)
Dostalek, Z.
1977-01-01
Two modes of operations are used in the technical maintenance of computers: servicing provided by the equipment supplier, and that done by specially trained computer users. The advantages and disadvantages of both modes are discussed. Maintenance downtime is tabulated for two computers serviced by user employees over an eight year period.
Code of Federal Regulations, 2014 CFR
2014-10-01
.... Computer software does not include computer data bases or computer software documentation. Litigation... includes technical data and computer software, but does not include information that is lawfully, publicly available without restriction. Technical data means recorded information, regardless of the form or method...
48 CFR 252.227-7032 - Rights in technical data and computer software (foreign).
Code of Federal Regulations, 2013 CFR
2013-10-01
... and computer software (foreign). 252.227-7032 Section 252.227-7032 Federal Acquisition Regulations... computer software (foreign). As prescribed in 227.7103-17, use the following clause: Rights in Technical Data and Computer Software (Foreign) (JUN 1975) The United States Government may duplicate, use, and...
48 CFR 252.227-7032 - Rights in technical data and computer software (foreign).
Code of Federal Regulations, 2014 CFR
2014-10-01
... and computer software (foreign). 252.227-7032 Section 252.227-7032 Federal Acquisition Regulations... computer software (foreign). As prescribed in 227.7103-17, use the following clause: Rights in Technical Data and Computer Software (Foreign) (JUN 1975) The United States Government may duplicate, use, and...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-08
...; Defense Federal Acquisition Regulation Supplement; Rights in Technical Data and Computer Software (OMB... 227.72, Rights in Computer Software and Computer Software Documentation, and related provisions and... rights in technical data and computer software. DoD needs this information to implement 10 U.S.C. 2320...
48 CFR 252.227-7032 - Rights in technical data and computer software (foreign).
Code of Federal Regulations, 2011 CFR
2011-10-01
... and computer software (foreign). 252.227-7032 Section 252.227-7032 Federal Acquisition Regulations... computer software (foreign). As prescribed in 227.7103-17, use the following clause: Rights in Technical Data and Computer Software (Foreign) (JUN 1975) The United States Government may duplicate, use, and...
48 CFR 252.227-7032 - Rights in technical data and computer software (foreign).
Code of Federal Regulations, 2012 CFR
2012-10-01
... and computer software (foreign). 252.227-7032 Section 252.227-7032 Federal Acquisition Regulations... computer software (foreign). As prescribed in 227.7103-17, use the following clause: Rights in Technical Data and Computer Software (Foreign) (JUN 1975) The United States Government may duplicate, use, and...
48 CFR 252.227-7032 - Rights in technical data and computer software (foreign).
Code of Federal Regulations, 2010 CFR
2010-10-01
... and computer software (foreign). 252.227-7032 Section 252.227-7032 Federal Acquisition Regulations... computer software (foreign). As prescribed in 227.7103-17, use the following clause: Rights in Technical Data and Computer Software (Foreign) (JUN 1975) The United States Government may duplicate, use, and...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-23
... Data and Computer Software AGENCY: Defense Acquisition Regulations System; Department of Defense (DoD... in Technical Data, and Subpart 227.72, Rights in Computer Software and Computer Software... are associated with rights in technical data and computer software. DoD needs this information to...
Making Advanced Computer Science Topics More Accessible through Interactive Technologies
ERIC Educational Resources Information Center
Shao, Kun; Maher, Peter
2012-01-01
Purpose: Teaching advanced technical concepts in a computer science program to students of different technical backgrounds presents many challenges. The purpose of this paper is to present a detailed experimental pedagogy in teaching advanced computer science topics, such as computer networking, telecommunications and data structures using…
Cell phone use while driving and attributable crash risk.
Farmer, Charles M; Braitman, Keli A; Lund, Adrian K
2010-10-01
Prior research has estimated that crash risk is 4 times higher when talking on a cell phone versus not talking. The objectives of this study were to estimate the extent to which drivers talk on cell phones while driving and to compute the implied annual number of crashes that could have been avoided if driver cell phone use were restricted. A national survey of approximately 1200 U.S. drivers was conducted. Respondents were asked to approximate the amount of time spent driving during a given day, number of cell phone calls made or received, and amount of driving time spent talking on a cell phone. Population attributable risk (PAR) was computed for each combination of driver gender, driver age, day of week, and time of day. These were multiplied by the corresponding crash counts to estimate the number of crashes that could have been avoided. On average, drivers were talking on cell phones approximately 7 percent of the time while driving. Rates were higher on weekdays (8%), in the afternoon and evening (8%), and for drivers younger than 30 (16%). Based on these use rates, restricting cell phones while driving could have prevented an estimated 22 percent (i.e., 1.3 million) of the crashes in 2008. Although increased rates of cell phone use while driving should be leading to increased crash rates, crash rates have been declining. Reasons for this paradox are unclear. One possibility is that the increase in cell phone use and crash risk due to cell phone use have been overestimated. Another possibility is that cell phone use has supplanted other driving distractions that were similarly hazardous.
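The calculation the study describes, a population attributable risk (PAR) per stratum multiplied by that stratum's crash count, can be sketched as below. The relative risk of 4 and the exposure rates (7 percent overall, 16 percent for drivers under 30) come from the abstract; the stratum crash counts are hypothetical illustration values, not the study's data:

```python
def attributable_risk(p_exposed: float, relative_risk: float) -> float:
    """Levin's population attributable risk: the fraction of cases that
    would be avoided if the exposure were eliminated."""
    excess = p_exposed * (relative_risk - 1.0)
    return excess / (1.0 + excess)

RR = 4.0  # crash risk while talking on a phone vs. not talking (from the abstract)

# Hypothetical strata: (fraction of driving time spent on the phone, observed crashes)
strata = [
    (0.16, 100_000),  # drivers younger than 30 (higher use rate per the abstract)
    (0.07, 250_000),  # other drivers, weekday
    (0.05, 150_000),  # other drivers, weekend
]

avoidable = sum(attributable_risk(p, RR) * crashes for p, crashes in strata)
total = sum(crashes for _, crashes in strata)
print(f"Estimated avoidable crashes: {avoidable:,.0f} of {total:,} "
      f"({100 * avoidable / total:.0f}%)")
```

Summing stratum-level PAR-weighted counts rather than applying one overall rate is what lets the study's estimate (22 percent) exceed the naive single-rate figure, since high-use groups contribute disproportionately.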
"Small Talk Is Not Cheap": Phatic Computer-Mediated Communication in Intercultural Classes
ERIC Educational Resources Information Center
Maíz-Arévalo, Carmen
2017-01-01
The present study aims to analyse the phatic exchanges performed by a class of nine intercultural Master's students during a collaborative assignment which demanded online discussion using English as a lingua franca (ELF). Prior studies on the use of phatic communication in computer-mediated communication have concentrated on social networking…
Department of Defense High Performance Computing Modernization Program. 2008 Annual Report
2009-04-01
place to another on the network. Without it, a computer could only talk to itself: no email, no web browsing, and no iTunes. Most of the Internet... Your SecurID Card), Ken Renard; Secure Wireless, Rob Scott and Stephen Bowman; Securing Today's Networks, Rich Whittney, Juniper Networks, Federal
This presentation gives a brief introduction to EPA's computational toxicology program and the Athens Lab's role in it. The talk also covers a brief introduction to metabolomics; advantages/disadvantages of metabolomics for toxicity assessment; goals of the EPA Athens metabolomics...
Computing Spacetimes: From Cosmology to Black Holes
NASA Technical Reports Server (NTRS)
Centrella, Joan
2007-01-01
Numerical relativity, the solution of the Einstein equations on a computer, is one of the most challenging and exciting areas of physics. Richard Matzner has played a key role in this subject from its birth, roughly 3 decades ago, to the present. This talk will present some of the highlights of Richard's work in numerical relativity.
Antony Williams is a Computational Chemist at the US Environmental Protection Agency in the National Center for Computational Toxicology. He has been involved in cheminformatics and the dissemination of chemical information for over twenty-five years. He has worked for a Fortune ...
Bimodal Reading: Benefits of a Talking Computer for Average and Less Skilled Readers.
ERIC Educational Resources Information Center
Montali, Julie; Lewandowski, Lawrence
1996-01-01
Eighteen average readers and 18 less-skilled readers (grades 8 and 9) were presented with social studies and science passages via a computer either visually (on screen), auditorily (read by digitized voice), or bimodally (on screen, highlighted while being voiced). Less-skilled readers demonstrated comprehension in the bimodal condition equivalent…
NASA Astrophysics Data System (ADS)
Fedosov, Dmitry
2011-03-01
Computational biophysics is a large and rapidly growing area of computational physics. In this talk, we will focus on a number of biophysical problems related to blood cells and blood flow in health and disease. Blood flow plays a fundamental role in a wide range of physiological processes and pathologies in the organism. To understand and, if necessary, manipulate the course of these processes it is essential to investigate blood flow under realistic conditions including deformability of blood cells, their interactions, and behavior in the complex microvascular network. Using a multiscale cell model we are able to accurately capture red blood cell mechanics, rheology, and dynamics in agreement with a number of single cell experiments. Further, this validated model yields accurate predictions of the blood rheological properties, cell migration, cell-free layer, and hemodynamic resistance in microvessels. In addition, we investigate blood related changes in malaria, which include a considerable stiffening of red blood cells and their cytoadherence to endothelium. For these biophysical problems computational modeling is able to provide new physical insights and capabilities for quantitative predictions of blood flow in health and disease.
Web Delivery of Interactive Laboratories: Comparison of Three Authoring Tools
NASA Astrophysics Data System (ADS)
Silbar, Richard R.
2002-04-01
It is well-known that the more a student interacts with a subject, the better he or she will learn it. This is particularly true in technical subjects. One way to do this is to have computer-based "laboratories" in which the student manipulates objects on the screen with keyboard or mouse and then sees the outcome of those actions. One example of such a laboratory we have built, using Macromedia's Authorware, deals with the addition of two vectors in the geometric approach. The problem with Authorware, however, is that delivering the training over the Web requires the download and installation of a big plug-in. Therefore, as an experiment, I built clones of the Vector Addition Laboratory using Macromedia's Director and Flash, each of which has a smaller plug-in that is often already installed in the user's browser. The Director and Flash versions are similar to (but definitely not the same as) the Authorware version. This talk goes into these differences and demonstrates the techniques used. You can view the three examples on-line at http://www.whistlesoft.com/silbar.
Magnetic Resonance Microscopy of the Lung
NASA Astrophysics Data System (ADS)
Johnson, G. Allan
1999-11-01
The lung presents both challenges and opportunities for study by magnetic resonance imaging (MRI). The technical challenges arise from respiratory and cardiac motion, limited signal from the tissues, and the unique physical structure of the lung. These challenges are heightened in magnetic resonance microscopy (MRM), where the spatial resolution may be up to a million times higher than that of conventional MRI. The development of successful techniques for MRM of the lung presents enormous opportunities for basic studies of lung structure and function, toxicology, environmental stress, and drug discovery by permitting investigators to study this most essential organ nondestructively in the live animal. Over the last 15 years, scientists at the Duke Center for In Vivo Microscopy have developed techniques for MRM in the live animal through an interdisciplinary program of biology, physics, chemistry, electrical engineering, and computer science. This talk will focus on the development of specialized radiofrequency coils for lung imaging, projection encoding methods to limit susceptibility losses, specialized support structures to control and monitor physiologic motion, and the most recent development of hyperpolarized gas imaging with ^3He and ^129Xe.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koniges, A.E.; Craddock, G.G.; Schnack, D.D.
The purpose of the workshop was to assemble workers, both within and outside of the fusion-related computation areas, for discussion of the issues of dynamically adaptive gridding. There were three invited talks related to adaptive gridding application experiences in various related fields of computational fluid dynamics (CFD), and nine short talks reporting on the progress of adaptive techniques in the specific areas of scrape-off-layer (SOL) modeling and magnetohydrodynamic (MHD) stability. Adaptive mesh methods have been successful in a number of diverse fields of CFD for over a decade. The method involves dynamic refinement of computed field profiles in a way that disperses uniformly the numerical errors associated with discrete approximations. Because the process optimizes computational effort, adaptive mesh methods can be used to study otherwise intractable physical problems that involve complex boundary shapes or multiple spatial/temporal scales. Recent results indicate that these adaptive techniques will be required for tokamak fluid-based simulations involving diverted tokamak SOL modeling and MHD simulation problems related to the highest-priority ITER-relevant issues. Individual papers are indexed separately on the energy databases.
Algorithmics - Is There Hope for a Unified Theory?
NASA Astrophysics Data System (ADS)
Hromkovič, Juraj
Computer science was born with the formal definition of the notion of an algorithm. This definition provides clear limits of automatization, separating problems into algorithmically solvable problems and algorithmically unsolvable ones. The second big bang of computer science was the development of the concept of computational complexity. People recognized that problems that do not admit efficient algorithms are not solvable in practice. The search for a reasonable, clear and robust definition of the class of practically solvable algorithmic tasks started with the notion of the class P and of NP-completeness. In spite of the fact that this robust concept is still fundamental for judging the hardness of computational problems, a variety of approaches has been developed for solving instances of NP-hard problems in many applications. Our short, 40-year attempt to fix the fuzzy border between the practically solvable problems and the practically unsolvable ones is partly reminiscent of the never-ending search for the definition of "life" in biology or for the definitions of matter and energy in physics. Can the search for the formal notion of "practical solvability" also become a never-ending story, or is there hope for a well-accepted, robust definition of it? Hopefully, it is not surprising that we are not able to answer this question in this invited talk. But dealing with this question is of crucial importance, because only through enormous effort do scientists get a better and better feeling for what the fundamental notions of science, like life and energy, mean. In the flow of numerous technical results, we must not forget that most of the essential revolutionary contributions to science were made by defining new concepts and notions.
ERIC Educational Resources Information Center
Pieper, Gail W.
1987-01-01
Recommends teaching about the uses of humor in technical writing classes by using computer user manuals. Suggests that humor has a place in technical communication, particularly in computer manuals, where new users' apprehension must be reduced, heavy technical points need clarification, and warnings and cautions should be reinforced. (SKC)
SMAC7; Sequential multi-channel analysis with computer-7; SMA7; Metabolic panel 7; CHEM-7 ... breathing problems, diabetes or diabetes-related complications, and medicine side effects. Talk to your provider about the ...
Debugging Techniques Used by Experienced Programmers to Debug Their Own Code.
1990-09-01
Keywords: code debugging; computer programmers; debugging; programming. ...Davis, and Schultz (1987) also compared experts and novices, but focused on the way a computer program is represented cognitively and how that...of theories in the emerging computer programming domain (Fisher, 1987). In protocol analysis, subjects are asked to talk/think aloud as they solve
2015-01-01
In this meeting report, we give an overview of the talks, presentations and posters presented at the third European Symposium of the International Society for Computational Biology (ISCB) Student Council. The event was organized as a satellite meeting of the 13th European Conference for Computational Biology (ECCB) and took place in Strasbourg, France on September 6th, 2014. PMID:25708611
Career Corner: Pitching Your Contributions at the Right Level
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson-Cook, Christine Michaela
Whether it is in a job interview, a presentation, or in collaborations with colleagues with differing technical backgrounds, effectively conveying your ideas and contributions is at least as important as the content. Daniel Goleman speaks to the importance of emotional intelligence as a key driver of success and advancement. Why does this matter so much? If you dive right into technical details without providing a broader context and motivation for the problem, then the people with whom you are communicating will not appreciate the contribution. If you talk only about your ideas at a high level with insufficient detail, then the weight of your contributions might be undervalued or misinterpreted.
Career Corner: Pitching Your Contributions at the Right Level
Anderson-Cook, Christine Michaela
2017-04-15
Whether it is in a job interview, a presentation, or in collaborations with colleagues with differing technical backgrounds, effectively conveying your ideas and contributions is at least as important as the content. Daniel Goleman speaks to the importance of emotional intelligence as a key driver of success and advancement. Why does this matter so much? If you dive right into technical details without providing a broader context and motivation for the problem, then the people with whom you are communicating will not appreciate the contribution. If you talk only about your ideas at a high level with insufficient detail, then the weight of your contributions might be undervalued or misinterpreted.
Web Services Provide Access to SCEC Scientific Research Application Software
NASA Astrophysics Data System (ADS)
Gupta, N.; Gupta, V.; Okaya, D.; Kamb, L.; Maechling, P.
2003-12-01
Web services offer scientific communities a new paradigm for sharing research codes and communicating results. While there are formal technical definitions of what constitutes a web service, for a user community such as the Southern California Earthquake Center (SCEC), we may conceptually consider a web service to be functionality provided on-demand by an application which is run on a remote computer located elsewhere on the Internet. The value of a web service is that it can (1) run a scientific code without the user needing to install and learn the intricacies of running the code; (2) provide the technical framework which allows a user's computer to talk to the remote computer which performs the service; (3) provide the computational resources to run the code; and (4) bundle several analysis steps and provide the end results in digital or (post-processed) graphical form. Within an NSF-sponsored ITR project coordinated by SCEC, we are constructing web services using architectural protocols and programming languages (e.g., Java). However, because the SCEC community has a rich pool of scientific research software (written in traditional languages such as C and FORTRAN), we also emphasize making existing scientific codes available by constructing web service frameworks which wrap around and directly run these codes. In doing so we attempt to broaden community usage of these codes. Web service wrapping of a scientific code can be done using a "web servlet" construction or by using a SOAP/WSDL-based framework. This latter approach is widely adopted in IT circles although it is subject to rapid evolution. Our wrapping framework attempts to "honor" the original codes with as little modification as is possible. 
For versatility we identify three methods of user access: (A) a web-based GUI (written in HTML and/or Java applets); (B) a Linux/OSX/UNIX command line "initiator" utility (shell-scriptable); and (C) direct access from within any Java application (and with the correct API interface from within C++ and/or C/Fortran). This poster presentation will provide descriptions of the following selected web services and their origin as scientific application codes: 3D community velocity models for Southern California, geocoordinate conversions (latitude/longitude to UTM), execution of GMT graphical scripts, data format conversions (Gocad to Matlab format), and implementation of Seismic Hazard Analysis application programs that calculate hazard curve and hazard map data sets.
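The servlet-style wrapping idea described in this record can be illustrated with a minimal, hypothetical sketch: a legacy command-line scientific code is invoked unchanged by a thin HTTP endpoint that passes through query parameters and returns the output as JSON. The Unix `echo` utility stands in for the legacy code here, and all names are illustrative, not SCEC's actual API.

```python
# Hypothetical sketch of wrapping an unmodified command-line code
# behind a minimal HTTP "web service" endpoint. `echo` stands in
# for the legacy scientific code; the real wrappers described above
# would invoke C/FORTRAN executables in the same pass-through style.
import json
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

def run_legacy_code(args):
    """Invoke the wrapped code unchanged and capture its stdout."""
    result = subprocess.run(
        ["echo"] + list(args), capture_output=True, text=True, check=True
    )
    return result.stdout.strip()

class WrapperHandler(BaseHTTPRequestHandler):
    """GET /?arg=a&arg=b runs the wrapped code with arguments a b."""

    def do_GET(self):
        query = parse_qs(urlparse(self.path).query)
        body = json.dumps({"output": run_legacy_code(query.get("arg", []))})
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body.encode())

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), WrapperHandler).serve_forever()
```

The design point mirrors the record's "honor the original codes" goal: the legacy executable is never modified; only its command line and stdout cross the service boundary.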
Future tense: call for a new generation of artists
NASA Astrophysics Data System (ADS)
Ohlmann, Dietmar
1995-02-01
Some people try hard to educate others about the beauty and technical benefits of holographic applications, but another generation is already waiting to learn more about the media that talk to them about the future. Today the most common questions are 'How can I do holograms with a computer?' and 'Can I do it with an Amiga?' For the MIT specialists these are now very simple questions. We can expect to see the present shape of the holographic laboratory pass into history. I personally like to work with a VHS camera and mix the footage with CAD/CAM images, but computer and video are not the only media that will change the face of holography. The He-Ne laser will be replaced by the diode laser. At a wavelength of 690 nm, some diode lasers deliver 40 mW in single mode and single line, in a package no bigger than your little finger. With such energy in so small a container, the state of the art is rapidly moving toward greater flexibility. Using new media and introducing them into our societies gives us a new responsibility. Would too much media kill the art? I do not think so, because I like the variety of media, which gives new possibilities of expression. The game with new media is the power of creativity, and it will find its meaning by itself.
... in files on a computer or within an electronic medical record (EMR) , or in paper folders. When ... 2014 More on this topic for: Parents Teens Electronic Health Records Talking to Your Child's Doctor How ...
Genomic impact of eukaryotic transposable elements
2012-01-01
The third international conference on the genomic impact of eukaryotic transposable elements (TEs) was held 24 to 28 February 2012 at the Asilomar Conference Center, Pacific Grove, CA, USA. Sponsored in part by the National Institutes of Health grant 5 P41 LM006252, the goal of the conference was to bring together researchers from around the world who study the impact and mechanisms of TEs using multiple computational and experimental approaches. The meeting drew close to 170 attendees and included invited floor presentations on the biology of TEs and their genomic impact, as well as numerous talks contributed by young scientists. The workshop talks were devoted to computational analysis of TEs with additional time for discussion of unresolved issues. Also, there was ample opportunity for poster presentations and informal evening discussions. The success of the meeting reflects the important role of Repbase in comparative genomic studies, and emphasizes the need for close interactions between experimental and computational biologists in the years to come. PMID:23171443
Genomic impact of eukaryotic transposable elements.
Arkhipova, Irina R; Batzer, Mark A; Brosius, Juergen; Feschotte, Cédric; Moran, John V; Schmitz, Jürgen; Jurka, Jerzy
2012-11-21
The third international conference on the genomic impact of eukaryotic transposable elements (TEs) was held 24 to 28 February 2012 at the Asilomar Conference Center, Pacific Grove, CA, USA. Sponsored in part by the National Institutes of Health grant 5 P41 LM006252, the goal of the conference was to bring together researchers from around the world who study the impact and mechanisms of TEs using multiple computational and experimental approaches. The meeting drew close to 170 attendees and included invited floor presentations on the biology of TEs and their genomic impact, as well as numerous talks contributed by young scientists. The workshop talks were devoted to computational analysis of TEs with additional time for discussion of unresolved issues. Also, there was ample opportunity for poster presentations and informal evening discussions. The success of the meeting reflects the important role of Repbase in comparative genomic studies, and emphasizes the need for close interactions between experimental and computational biologists in the years to come.
Pakistan’s Nuclear Weapons: Proliferation and Security Issues
2009-10-15
and technical measures to prevent unauthorized or accidental use of nuclear weapons, as well as contribute to physical security of storage... "Talks On Nuclear Security," The Boston Globe, May 5, 2009. 79 Abdul Mannan, "Preventing Nuclear Terrorism in Pakistan: Sabotage of a Spent Fuel Cask or... a Commercial Irradiation Source in Transport," in Pakistan's Nuclear Future, 2008; Martellini, 2008. 80 Martellini, 2008. 81 For more information
Microwave Semiconductor Research - Materials, Devices, Circuits.
1982-04-30
", C.L. Tang and J-M. Halbout, invited talk, SPIE Technical Symposium, Los Angeles, CA (January 1982). 3. "Observation of light induced refractive index ... index slab its desirable dispersive properties. The relatively poor dispersion characteristics of the uniform dielectric slab can be attributed to the...", January 1982. 2. H. Zmuda completed his M.S. program. Thesis: "Simplified Dispersion Analysis of the Multistep and Graded Index Dielectric Slab Waveguide"
A Research Program in Computer Technology. 1986 Annual Technical Report
1989-08-01
1986 Annual Technical Report, July 1985 - June 1986. A Research Program in Computer Technology, ISI/SR-87-178, USC Information Sciences Institute. Personal authors: ISI Research Staff. Keywords: survivable networks; distributed processing; local networks; personal computers; workstation environment; computer acquisition; Strategic Computing
Introduction to the Use of Computers in Libraries: A Textbook for the Non-Technical Student.
ERIC Educational Resources Information Center
Ogg, Harold C.
This book outlines computing and information science from the perspective of what librarians and educators need to do with computer technology and how it can help them perform their jobs more efficiently. It provides practical explanations and library applications for non-technical users of desktop computers and other library automation tools.…
Talk Across the Oceans: Language and Culture of the Global Internet Community.
ERIC Educational Resources Information Center
Takahashi, Shinji
1996-01-01
Discusses some of the technological difficulties associated with the use of English or other European languages on the Internet, and uses Japanese computing as an example. Examines the linguistic culture of the language with attention to English, how technology limits/expands communication, and the role of languages in the computer domain.…
Higher Inductive Types as Homotopy-Initial Algebras
2016-08-01
Higher Inductive Types as Homotopy-Initial Algebras, Kristina Sojakova, CMU-CS-16-125, August 2016, School of Computer Science, Carnegie Mellon University. ...talk at the Workshop on Logic, Language, Information and Computation (WoLLIC 2011). [38] M. Warren, Homotopy-Theoretic Aspects of Constructive Type Theory, PhD thesis, Carnegie Mellon University, 2008.
Adjustable Spin-Spin Interaction with 171Yb+ ions and Addressing of a Quantum Byte
NASA Astrophysics Data System (ADS)
Wunderlich, Christof
2015-05-01
Trapped atomic ions are a well-advanced physical system for investigating fundamental questions of quantum physics and for quantum information science and its applications. When contemplating the scalability of trapped ions for quantum information science, one notes that the use of laser light for coherent operations gives rise to technical and also physical issues that can be remedied by replacing laser light with microwave (MW) and radio-frequency (RF) radiation in suitably modified ion traps. Magnetic gradient induced coupling (MAGIC) makes it possible to coherently manipulate trapped ions using exclusively MW and RF radiation. After introducing the general concept of MAGIC, I shall report on recent experimental progress using 171Yb+ ions, confined in a suitable Paul trap, as effective spin-1/2 systems interacting via MAGIC. Entangling gates between non-neighbouring ions will be presented. The spin-spin coupling strength is variable and can be adjusted by variation of the secular trap frequency. In general, executing a quantum gate with a single qubit, or a subset of qubits, affects the quantum states of all other qubits. This reduced fidelity of the whole quantum register may preclude scalability. We demonstrate addressing of individual qubits within a quantum byte (eight qubits interacting via MAGIC) using MW radiation and measure the error induced in all non-addressed qubits (cross-talk) associated with the application of single-qubit gates. The measured cross-talk is of the order of 10^-5 and therefore below the threshold commonly agreed to be sufficient to efficiently realize fault-tolerant quantum computing. Furthermore, experimental results on continuous and pulsed dynamical decoupling (DD) for protecting quantum memories and quantum gates against decoherence will be briefly discussed. Finally, I report on using continuous DD to realize a broadband ultrasensitive single-atom magnetometer.
ERIC Educational Resources Information Center
Anderson, Greg; And Others
1996-01-01
Describes the Computer Science Technical Report Project, one of the earliest investigations into the system engineering of digital libraries which pioneered multiinstitutional collaborative research into technical, social, and legal issues related to the development and implementation of a large, heterogeneous, distributed digital library. (LRW)
ERIC Educational Resources Information Center
Wall, Jeffrey D.; Knapp, Janice
2014-01-01
Learning technical computing skills is increasingly important in our technology driven society. However, learning technical skills in information systems (IS) courses can be difficult. More than 20 percent of students in some technical courses may dropout or fail. Unfortunately, little is known about students' perceptions of the difficulty of…
Computer Supported Education at Fox Valley Technical Institute. IBM Application Brief.
ERIC Educational Resources Information Center
International Business Machines Corp., White Plains, NY.
Fox Valley Technical Institute (FVTI) has developed an approach to education which emphasizes competency-based, round-the-clock education entailing short terms, flexible class schedules, and individualized instruction and which has as its focus strong computer support at classroom, technical, and management levels. The college provides 6,000…
Computer Programs for Technical Communicators: The Compelling Curriculum. Working Draft.
ERIC Educational Resources Information Center
Selfe, Cynthia L.; Wahlstrom, Billie J.
A series of computer programs have been developed at Michigan Technological University for use with technical writing and technical communications classes. The first type of program in the series, CURIE II, includes process-based modules, each of which corresponds to one of the following assignments: memoranda, resumes, feasibility reports,…
Sohn, W
2003-11-13
An essential factor for successful sex counseling by the family doctor is an atmosphere of openness and trust between physician and patient. However, few patients will begin to talk about their sexual problems of their own accord. The physician should therefore allow himself sufficient time for such counseling, be aware of his own limitations, and develop an ear attuned to involuntary remarks by the patient. During talks, only sparse use should be made of technical terms, the better to encourage the patient. The problems most commonly described in the doctor's office are functional disorders with a psychosomatic cause, and triggering factors may vary considerably (a high level of stress at the workplace, social or financial crises, monotonous leisure activities). In view of this, a somatic investigation should always be preceded by careful history-taking.
How to Cloud for Earth Scientists: An Introduction
NASA Technical Reports Server (NTRS)
Lynnes, Chris
2018-01-01
This presentation is a tutorial on getting started with cloud computing for work with Earth Observation datasets. We first discuss some of the main advantages that cloud computing can provide for the Earth scientist: copious processing power, immense and affordable data storage, and rapid startup time. We also talk about some of the challenges of getting the most out of cloud computing: re-organizing the way data are analyzed, handling node failures, and attending…
Training Methods to Build Human Terrain Mapping Skills
2010-10-01
confidence in making friends, and talking to strangers. • Language – a few key phrases. • Language training with Arabic teacher (not computer-based)...session to evaluate the lesson content and delivery method. Based on your feedback we will make changes and corrections to the content and the computer...requirement, exemplar training materials were developed. The training materials took the form of a modular computer/web-based and web-deliverable course of
Gender, Lies and Video Games: the Truth about Females and Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klawe, Maria M.
2006-02-22
This talk explores how girls and women differ from boys and men in their uses of and attitudes towards computers and computing. From playing computer games to pursuing computing careers, the participation of females tends to be very low compared to that of males. Why is this? Opinions range from girls wanting to avoid the math and/or the geek image of programming to girls having better things to do with their lives. We discuss research findings on this issue, as well as initiatives designed to increase the participation of females in computing.
Gender, Lies and Video Games: the Truth about Females and Computing
Klawe, Maria M. [Princeton University, Princeton, New Jersey, United States
2017-12-09
This talk explores how girls and women differ from boys and men in their uses of and attitudes towards computers and computing. From playing computer games to pursuing computing careers, the participation of females tends to be very low compared to that of males. Why is this? Opinions range from girls wanting to avoid the math and/or the geek image of programming to girls having better things to do with their lives. We discuss research findings on this issue, as well as initiatives designed to increase the participation of females in computing.
IUTAM Symposium on Hydrodynamic Diffusion of Suspended Particles
NASA Technical Reports Server (NTRS)
Davis, R. H.
1995-01-01
The focus of the symposium was on multiparticle hydrodynamic interactions which lead to fluctuating motion of the particles and resulting particle migration and dispersion or diffusion. Implications of these phenomena were described for sedimentation, fluidization, suspension flows, granular flows, and fiber suspensions. Computer simulation techniques as well as experimental techniques were described. Each session had an invited leadoff talk which overviewed the session topic as well as described the speaker's own related research. Ample time for discussion was included after each talk as well as at the end of each session. The symposium started with a keynote talk on the first evening on What is so puzzling about hydrodynamic diffusion?, which set the tone for the rest of the meeting by emphasizing both recent advances and unanswered issues.
Podcast: The Electronic Crimes Division
Sept 26, 2016. Chris Lukas, the Special Agent in Charge of the Electronic Crimes Division within the OIG's Office of Investigations talks about computer forensics, cybercrime in the EPA and his division's role in criminal investigations.
NASA Technical Reports Server (NTRS)
1981-01-01
Communication is made possible for disabled individuals by means of an electronic system, developed at Stanford University's School of Medicine, which produces highly intelligible synthesized speech. It is familiarly known as the "talking wheelchair" and formally as the Versatile Portable Speech Prosthesis (VPSP). The wheelchair-mounted system consists of a word processor, a video screen, a voice synthesizer, and a computer program which instructs the synthesizer how to produce intelligible sounds in response to user commands. The computer's memory contains 925 words plus a number of common phrases and questions. The memory can also store several thousand other words of the user's choice. Message units are selected by operating a simple switch, joystick, or keyboard. The completed message appears on the video screen; then the user activates the speech synthesizer, which generates a voice with a somewhat mechanical tone. With the keyboard, an experienced user can construct messages as rapidly as 30 words per minute.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tapia, Richard
1998-06-01
In June, the Center for Research on Parallel Computation (CRPC), an NSF-funded Science and Technology Center, hosted the 4th Annual Conference for African-American Researchers in the Mathematical Sciences (CAARMS4) at Rice University. The main goal of this conference was to highlight current work by African-American researchers and graduate students in mathematics. The conference strengthened the mathematical sciences by encouraging increased participation of African-Americans and other underrepresented groups in the field, facilitating working relationships among them, and helping to cultivate their careers. In addition to the talks, there were a graduate student poster session and tutorials on topics in mathematics and computer science. These talks, presentations, and discussions brought a broader perspective to the critical issues involving minority participation in mathematics.
Computer Art--A New Tool in Advertising Graphics.
ERIC Educational Resources Information Center
Wassmuth, Birgit L.
Using computers to produce art began with scientists, mathematicians, and individuals with strong technical backgrounds who used the graphic material as visualizations of data in technical fields. People are using computer art in advertising, as well as in painting; sculpture; music; textile, product, industrial, and interior design; architecture;…
Providing scalable system software for high-end simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greenberg, D.
1997-12-31
Detailed, full-system, complex physics simulations have been shown to be feasible on systems containing thousands of processors. In order to manage these computer systems it has been necessary to create scalable system services. In this talk Sandia's research on scalable systems will be described. The key concepts of low overhead data movement through portals and of flexible services through multi-partition architectures will be illustrated in detail. The talk will conclude with a discussion of how these techniques can be applied outside of the standard monolithic MPP system.
NASA Astrophysics Data System (ADS)
Lamarque, J. F.
2016-12-01
In this talk, we will discuss the upcoming release of CESM2 and the computational and scientific challenges encountered in the process. We will then discuss upcoming new opportunities in development and applications of Earth System Models; in particular, we will discuss additional ways in which the university community can contribute to CESM.
Considerations for Future Climate Data Stewardship
NASA Astrophysics Data System (ADS)
Halem, M.; Nguyen, P. T.; Chapman, D. R.
2009-12-01
In this talk, we will describe the lessons learned from processing and generating a decade of gridded AIRS and MODIS IR sounding data. We describe the challenges faced in accessing and sharing very large data sets, maintaining data provenance under evolving technologies, obtaining access to legacy calibration data, and permanently preserving Earth science data records for on-demand services. These lessons suggest that a new approach to data stewardship will be required for the next decade of hyperspectral instruments combined with cloud-resolving models. It will not be sufficient for stewards of future data centers merely to provide the public with access to archived data; our experience indicates that data need to reside close to computers with ultra-large disk farms and tens of thousands of processors in order to deliver complex services on demand over very high speed networks, much like the offerings of search engines today. Over the first decade of the 21st century, petabyte data records were acquired from the AIRS instrument on Aqua and the MODIS instrument on Aqua and Terra. NOAA data centers also maintain petabytes of operational IR sounder data collected over the past four decades. The UMBC Multicore Computational Center (MC2) developed a Service Oriented Atmospheric Radiance gridding system (SOAR) to allow users to select IR sounding instruments from multiple archives and choose space-time-spectral periods of Level 1B data to download, grid, visualize, and analyze on demand. Providing this service requires high-bandwidth access to the online disks at Goddard. After 10 years, cost-effective disk storage technology finally caught up with the MODIS data volume, making it possible for Level 1B MODIS data to be available online. However, 10GbE fiber optic networks to access large volumes of data are still not available from GSFC to serve the broader community; data transfer rates are well below 10 MB/s, limiting their usefulness for climate studies.
During this decade, processor performance hit a power wall, leading computer vendors to design multicore processor chips. High performance computer systems obtained petaflop performance by clustering tens of thousands of multicore processor chips. Thus, power consumption and autonomic recovery from processor and disk failures have become major cost and technical considerations for future data archives. To address these new architecture requirements, a transparent parallel programming paradigm, the Hadoop MapReduce cloud computing system, became available as an open-source software system. In addition, the Hadoop File System manages the distribution of data to these processors and backs up the processing in the event of any processor or disk failure. However, to employ this paradigm, the data need to be stored on the computer system itself. We conclude this talk with a climate data preservation approach that addresses the scalability crisis posed by exabyte data requirements for the next decade, based on projections of processor, disk density, and bandwidth doubling rates.
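The map/shuffle/reduce pattern this abstract relies on can be sketched without Hadoop itself. The sketch below is illustrative only: the sample values and the 1-degree cell size are made up and are not SOAR's actual gridding scheme. It averages radiance samples into lat/lon grid cells the way a Hadoop mapper and reducer pair would, with the shuffle step done by hand.

```python
from collections import defaultdict

# Each record is (lat, lon, radiance); real samples would be read from
# Level 1B granules stored on the cluster's distributed file system.
samples = [(34.2, -118.1, 0.71), (34.9, -118.8, 0.65),
           (35.4, -117.2, 0.80), (34.1, -118.9, 0.60)]

def map_phase(record, cell_deg=1.0):
    """Emit a (grid-cell key, value) pair, as a Hadoop mapper would."""
    lat, lon, rad = record
    key = (int(lat // cell_deg), int(lon // cell_deg))
    yield key, rad

def reduce_phase(key, values):
    """Combine all values for one key: here, average radiance per cell."""
    vals = list(values)
    return key, sum(vals) / len(vals)

# Shuffle: group mapper output by key (the framework does this in Hadoop).
groups = defaultdict(list)
for rec in samples:
    for k, v in map_phase(rec):
        groups[k].append(v)

grid = dict(reduce_phase(k, vs) for k, vs in groups.items())
```

Because each reducer sees only the values for its own key, the same code parallelizes across cells with no shared state, which is what lets the framework restart a failed partition independently.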
NASA Astrophysics Data System (ADS)
Houlton, H. R.; Ricci, J.; Wilson, C. E.; Keane, C.
2014-12-01
Professional development experiences, such as internships, research presentations, and professional network building, are becoming increasingly important to enhance students' employability post-graduation. The practical, non-technical skills that are important for succeeding during these professional development experiences, such as public speaking, project management, ethical practices, and writing, transfer well and are imperative to the workplace. As a result, graduates who have honed these skills are more competitive candidates for geoscience employment. Fortunately, the geoscience community recognizes the importance of these professional development opportunities and the skills required to successfully complete them, and is giving students the chance to practice non-technical skills while they are still enrolled in academic programs. The American Geosciences Institute has collected data regarding students' professional development experiences, including the preparation they receive in the corresponding non-technical skills. This talk will discuss the findings of two of AGI's survey efforts: the Geoscience Student Exit Survey and the Geoscience Careers Master's Preparation Survey (NSF: 1202707). Specifically, data highlighting the role played by internships, career opportunities, and the complementary non-technical skills will be discussed. As a practical guide, events informed by this research, such as AGI's professional development opportunities, networking luncheons, and internships, will also be included.
Applying and evaluating computer-animated tutors
NASA Astrophysics Data System (ADS)
Massaro, Dominic W.; Bosseler, Alexis; Stone, Patrick S.; Connors, Pamela
2002-05-01
We have developed computer-assisted speech and language tutors for deaf, hard of hearing, and autistic children. Our language-training program utilizes our computer-animated talking head, Baldi, as the conversational agent, who guides students through a variety of exercises designed to teach vocabulary and grammar, to improve speech articulation, and to develop linguistic and phonological awareness. Baldi is an accurate three-dimensional animated talking head appropriately aligned with either synthesized or natural speech. Baldi has a tongue and palate, which can be displayed by making his skin transparent. Two specific language-training programs have been evaluated to determine if they improve word learning and speech articulation. The results indicate that the programs are effective in teaching receptive and productive language. Advantages of utilizing a computer-animated agent as a language tutor are the popularity of computers and embodied conversational agents with autistic kids, the perpetual availability of the program, and individualized instruction. Students enjoy working with Baldi because he offers extreme patience, he doesn't become angry, tired, or bored, and he is in effect a perpetual teaching machine. The results indicate that the psychology and technology of Baldi hold great promise in language learning and speech therapy. [Work supported by NSF Grant Nos. CDA-9726363 and BCS-9905176 and Public Health Service Grant No. PHS R01 DC00236.]
American Conference on Neutron Scattering 2014
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dillen, J. Ardie
2014-12-31
Scientists from around the world converged in Knoxville, TN, to share ideas, present technical information, and contribute to the advancement of neutron scattering. Featuring over 400 oral/poster presentations, ACNS 2014 offered a strong program of plenary, invited, and contributed talks and poster sessions covering topics in soft condensed matter, hard condensed matter, biology, chemistry, energy, and engineering applications in neutron physics, confirming the great diversity of science that is enabled by neutron scattering.
CrossTalk: The Journal of Defense Software Engineering. Volume 19, Number 2
2006-02-01
My favorite technical word is Electrowetting. I am not kidding. This is a legitimate word and is a promising technology for optical switching networks and focusing lasers. Electrowetting (stop giggling) uses electrical fields to modify a dielectric film between hydrophobic and hydrophilic states to shape liquid into an optical lens.
1989-01-25
The United States has plentiful corn at a cheap price, so high-fructose syrup that uses corn as its raw material accounts for 40 percent of sugar consumption in the United States. Coca-Cola, which has been termed a world-class beverage, uses high-fructose syrup as a sweetener. Also covered: Country Sells High-Tech Spectrometers [XINHUA]; Foreign Trade, Technical Talks Conclude in Henan [CEI Database 7 Dec].
ERIC Educational Resources Information Center
Heidar, Davood Mashhadi; Afghari, Akbar
2015-01-01
The present paper concentrates on a web-based inquiry in the synchronous computer-mediated communication (SCMC) via Web 2.0 technologies of Talk and Write and Skype. It investigates EFL learners' socio-cognitive progress through dynamic assessment (DA), which follows Vygotsky's inclination for supportive interchange in the zone of proximal…
ERIC Educational Resources Information Center
Technology & Learning, 2008
2008-01-01
When it comes to IT, there has always been an important link between data center control and client flexibility. As computing power increases, so do the potentially crippling threats to security, productivity and financial stability. This article talks about Dell's On-Demand Desktop Streaming solution which is designed to centralize complete…
What Do We Learn in Smethwick Village? Computer Games, Media Learning and Discursive Confusion
ERIC Educational Resources Information Center
McDougall, Julian
2007-01-01
This article presents findings from research exploring the intervention made by the introduction of computer games as an object of study in Media Studies at AS level in England. The outcome is a range of discursive data in the form of teachers and students from two English colleges talking about their experiences of this curriculum encounter. This…
Writing as Involvement: A Case for Face-to-Face Classroom Talk in a Computer Age.
ERIC Educational Resources Information Center
Berggren, Anne G.
The abandonment of face-to-face voice conversations in favor of the use of electronic conversations in composition classes is an issue to be interrogated. In a recent push to "prepare students for the 21st century," teachers are asked to teach computer applications in the humanities--and composition teachers, who will teach writing in…
ERIC Educational Resources Information Center
Waddick, John
1994-01-01
Compares the effect of a chemistry computer simulation, written by the author, with the effect of an instructor demonstration. The study indicates that, in this particular situation, the operation of a spectrophotometer can be taught effectively by the computer simulation method. The program is written using HyperTalk, the HyperCard programming…
Green Supercomputing at Argonne
Beckman, Pete
2018-02-07
Pete Beckman, head of Argonne's Leadership Computing Facility (ALCF), talks about Argonne National Laboratory's green supercomputing: everything from designing algorithms to use fewer kilowatts per operation to using cold Chicago winter air to cool the machine more efficiently. Argonne was recognized for green computing in the 2009 HPCwire Readers' Choice Awards. More at http://www.anl.gov/Media_Center/News/2009/news091117.html Read more about the Argonne Leadership Computing Facility at http://www.alcf.anl.gov/
2017-09-06
WASHINGTON, D.C. - S&T Partnership Forum In-Space Assembly Technical Interchange Meeting. On September 6, 2017, many of the United States government's experts on in-space assembly met at the U.S. Naval Research Lab to discuss both technology development and in-space applications that would advance national capabilities in this area. Experts from NASA, USAF, NRO, DARPA, and NRL attended the meeting, which was coordinated by the NASA Headquarters Office of the Chief Technologist. This technical interchange meeting was the second meeting of the members of this Science and Technology Partnership Forum. Glen Henshaw of Code 8231 talks to the group in the Space Robotics Lab.
Energy Experiments for STEM Students
NASA Astrophysics Data System (ADS)
Fanchi, John
2011-03-01
Texas Christian University (TCU) is developing an undergraduate program that prepares students to become engineers with an emphasis in energy systems. One of the courses in the program is a technical overview of traditional energy (coal, oil and gas), nuclear energy, and renewable energy that requires as a pre-requisite two semesters of calculus-based physics. Energy experiments are being developed that will facilitate student involvement and provide hands-on learning opportunities. Students participating in the course will improve their understanding of energy systems; be introduced to outstanding scientific and engineering problems; learn about the role of energy in a global and societal context; and evaluate contemporary issues associated with energy. This talk will present the status of experiments being developed for the technical energy survey course.
An Information and Technical Manual for the Computer-Assisted Teacher Training System (CATTS).
ERIC Educational Resources Information Center
Semmel, Melvyn I.; And Others
The manual presents technical information on the computer assisted teacher training system (CATTS) which aims at developing a versatile and economical computer based teacher training system with the capability of providing immediate analysis and feedback of data relevant to teacher pupil transactions in a classroom setting. The physical…
Computer Education Curriculum. Connecticut Vocational Technical School System. Version 4.
ERIC Educational Resources Information Center
Kittell, Linda; Walczak, Joseph
This computer education curriculum is designed specifically for Connecticut's Regional Vocational Technical Schools' grade 9 computer education course. Each of the 24 lessons is expected to cover at least one class period of 50 minutes. Introductory materials include a listing of course goals and objectives, an outline of sequence and scope via…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-22
..., Airborne Automatic Dead Reckoning Computer Equipment Utilizing Aircraft Heading and Doppler Ground Speed.... ACTION: Notice of intent to cancel Technical Standard Order (TSO)-C68a, Airborne automatic dead reckoning... dead reckoning computer equipment utilizing aircraft heading and Doppler ground speed and drift angle...
Code of Federal Regulations, 2010 CFR
2010-10-01
... Software and Noncommercial Computer Software Documentation clause of this contract. (3) For Small Business... Rights in Noncommercial Technical Data and Computer Software—Small Business Innovative Research (SBIR) Program clause of this contract. (b) Technical data or computer software provided to the Contractor as...
48 CFR 252.227-7028 - Technical data or computer software previously delivered to the government.
Code of Federal Regulations, 2010 CFR
2010-10-01
... software previously delivered to the government. 252.227-7028 Section 252.227-7028 Federal Acquisition... computer software previously delivered to the government. As prescribed in 227.7103-6(d), 227.7104(f)(2), or 227.7203-6(e), use the following provision: Technical Data or Computer Software Previously...
48 CFR 252.227-7028 - Technical data or computer software previously delivered to the government.
Code of Federal Regulations, 2012 CFR
2012-10-01
... software previously delivered to the government. 252.227-7028 Section 252.227-7028 Federal Acquisition... computer software previously delivered to the government. As prescribed in 227.7103-6(d), 227.7104(f)(2), or 227.7203-6(e), use the following provision: Technical Data or Computer Software Previously...
48 CFR 252.227-7028 - Technical data or computer software previously delivered to the government.
Code of Federal Regulations, 2014 CFR
2014-10-01
... software previously delivered to the government. 252.227-7028 Section 252.227-7028 Federal Acquisition... computer software previously delivered to the government. As prescribed in 227.7103-6(d), 227.7104(f)(2), or 227.7203-6(e), use the following provision: Technical Data or Computer Software Previously...
48 CFR 227.7104 - Contracts under the Small Business Innovation Research (SBIR) Program.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Data and Computer Software—Small Business Innovation Research (SBIR) Program, when technical data or computer software will be generated during performance of contracts under the SBIR program. (b) Under the clause at 252.227-7018, the Government obtains SBIR data rights in technical data and computer software...
48 CFR 227.7104 - Contracts under the Small Business Innovation Research (SBIR) Program.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Data and Computer Software—Small Business Innovation Research (SBIR) Program, when technical data or computer software will be generated during performance of contracts under the SBIR program. (b) Under the clause at 252.227-7018, the Government obtains SBIR data rights in technical data and computer software...
48 CFR 252.227-7028 - Technical data or computer software previously delivered to the government.
Code of Federal Regulations, 2013 CFR
2013-10-01
... software previously delivered to the government. 252.227-7028 Section 252.227-7028 Federal Acquisition... computer software previously delivered to the government. As prescribed in 227.7103-6(d), 227.7104(f)(2), or 227.7203-6(e), use the following provision: Technical Data or Computer Software Previously...
48 CFR 252.227-7028 - Technical data or computer software previously delivered to the government.
Code of Federal Regulations, 2011 CFR
2011-10-01
... software previously delivered to the government. 252.227-7028 Section 252.227-7028 Federal Acquisition... computer software previously delivered to the government. As prescribed in 227.7103-6(d), 227.7104(f)(2), or 227.7203-6(e), use the following provision: Technical Data or Computer Software Previously...
ALOHA System Technical Reports 16, 19, 24, 28, and 30, 1974.
ERIC Educational Resources Information Center
Hawaii Univ., Honolulu. ALOHA System.
A series of technical reports based on the Aloha System for educational computer programs provide a background on how various countries in the Pacific region developed computer capabilities and describe their current operations, as well as prospects for future expansion. Included are studies on the Japan-Hawaii TELEX and Satellite; computers at…
Quantitative Prediction of Computational Quality (so the S and C Folks will Accept it)
NASA Technical Reports Server (NTRS)
Hemsch, Michael J.; Luckring, James M.; Morrison, Joseph H.
2004-01-01
Our choice of title may seem strange but we mean each word. In this talk, we are not going to be concerned with computations made "after the fact", i.e. those for which data are available and which are being conducted for explanation and insight. Here we are interested in preventing S&C design problems by finding them through computation before data are available. For such a computation to have any credibility with those who absorb the risk, it is necessary to quantitatively PREDICT the quality of the computational results.
Proceedings: Computer Science and Data Systems Technical Symposium, volume 1
NASA Technical Reports Server (NTRS)
Larsen, Ronald L.; Wallgren, Kenneth
1985-01-01
Progress reports and technical updates of programs being performed by NASA centers are covered. Presentations in viewgraph form are included for topics in three categories: computer science, data systems and space station applications.
NASA Astrophysics Data System (ADS)
Granger Morgan, M.
2011-04-01
In a book for the general public published a year before his death, Carl Sagan wrote, "Every time a scientific paper presents a bit of data, it's accompanied by an error bar---a quiet but insistent reminder that no knowledge is complete or perfect." For those of us educated in experimental natural science such an observation seems so obvious as to hardly need saying. Yet when, after completing a PhD in experimental radio physics, I began to work on problems in environmental and energy risk and policy analysis in the early 1970s, I was amazed to find that the characterization and treatment of uncertainty was almost completely lacking in the analysis of that day. In the first part of this talk, I will briefly summarize how I, and a number of my physics-educated colleagues, have worked to rectify this situation. Doctoral education in the Department of Engineering and Public Policy (EPP) at Carnegie Mellon University has also been shaped by a number of ideas and problem-solving styles that derive from physics. These have been strengthened considerably through integration with a number of ideas from experimental social science -- a field that too many in physics ignore or even belittle. In the second part of the talk, I will describe the PhD program in EPP, talk a bit about some of its unique features, and describe a few of the problems we address.
1981 Bibliography of Technical Writing.
ERIC Educational Resources Information Center
Book, Virginia Alm; And Others
1982-01-01
Offers resources on technical writing published in 1981. Arranges the citations under the following categories: bibliographies, books, reviews, and articles on the profession; theory and philosophy; pedagogy; technical speech; research; designing degree programs; technical writing and the computer; writing technical articles and reports;…
Summer Series 2012 - Conversation with Kathy Yelick
Yelick, Kathy, Miller, Jeff
2018-05-11
Jeff Miller, head of Public Affairs, sat down in conversation with Kathy Yelick, Associate Berkeley Lab Director for Computing Sciences, in the second of a series of PowerPoint-free talks, on July 18, 2012, at Berkeley Lab.
Summer Series 2012 - Conversation with Kathy Yelick
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yelick, Kathy, Miller, Jeff
2012-07-23
Jeff Miller, head of Public Affairs, sat down in conversation with Kathy Yelick, Associate Berkeley Lab Director for Computing Sciences, in the second of a series of PowerPoint-free talks, on July 18, 2012, at Berkeley Lab.
Green Supercomputing at Argonne
Pete Beckman
2017-12-09
Pete Beckman, head of Argonne's Leadership Computing Facility (ALCF), talks about Argonne National Laboratory's green supercomputing: everything from designing algorithms to use fewer kilowatts per operation to using cold Chicago winter air to cool the machine more efficiently.
Analog-to-digital clinical data collection on networked workstations with graphic user interface.
Lunt, D
1991-02-01
An innovative respiratory examination system has been developed that combines physiological response measurement, real-time graphic displays, user-driven operating sequences, and networked file archiving and review into a scientific research and clinical diagnosis tool. This newly constructed computer network is being used to enhance the research center's ability to perform patient pulmonary function examinations. Respiratory data are simultaneously acquired and graphically presented during patient breathing maneuvers and rapidly transformed into graphic and numeric reports, suitable for statistical analysis or database access. The environment consists of the hardware (Macintosh computer, MacADIOS converters, analog amplifiers), the software (HyperCard v2.0, HyperTalk, XCMDs), and the network (AppleTalk, fileservers, printers) as building blocks for data acquisition, analysis, editing, and storage. System operation modules include: Calibration, Examination, Reports, On-line Help Library, Graphic/Data Editing, and Network Storage.
1997-12-01
that I'll turn my attention to that computer game we've talked so much about... Dave Van Veldhuizen and Scott Brown (soon-to-be Drs. Van Veldhuizen and...Industry Training Systems Conference. 1988. 37. Van Veldhuizen, D. A. and L. J. Hutson. "A Design Methodology for Domain-Independent Computer...proposed by Van Veldhuizen and Hutson (37), extends the general architecture to support both a domain-independent approach to implementing CGFs and
Comments on the Development of Computational Mathematics in Czechoslovakia and in the USSR.
1987-03-01
The talk is an invited lecture at the ACM Conference on the History of Scientific and Numeric Computations, May 13-15, 1987, Princeton, New Jersey. It presents some basic subjective observations about the history of numerical methods in Czechoslovakia and in the USSR.
Minimum Energy Pathways for Chemical Reactions
NASA Technical Reports Server (NTRS)
Walch, S. P.; Langhoff, S. R. (Technical Monitor)
1995-01-01
Computed potential energy surfaces are often required for computation of such parameters as rate constants as a function of temperature, product branching ratios, and other detailed properties. We have found that computation of the stationary points/reaction pathways using CASSCF/derivative methods, followed by use of the internally contracted CI method to obtain accurate energetics, gives useful results for a number of chemically important systems. The talk will focus on a number of applications to reactions leading to NOx and soot formation in hydrocarbon combustion.
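Barrier heights taken from such computed surfaces are commonly turned into temperature-dependent rate constants with an Arrhenius-type expression, k(T) = A exp(-Ea/RT). The sketch below shows that step only; the pre-exponential factor and barrier height are illustrative placeholders, not values from these CASSCF/CI calculations.

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def arrhenius_rate(A, Ea, T):
    """Arrhenius rate constant k(T) = A * exp(-Ea / (R*T)).

    A: pre-exponential factor (same units as k), Ea in J/mol, T in kelvin.
    """
    return A * math.exp(-Ea / (R * T))

# Illustrative values only: a 150 kJ/mol barrier over combustion temperatures.
A, Ea = 1.0e13, 150e3
rates = {T: arrhenius_rate(A, Ea, T) for T in (1000, 1500, 2000)}
```

Because the barrier enters exponentially, the accuracy of the computed energetics (here, the internally contracted CI energies) dominates the accuracy of any rate predicted this way.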
A Perspective on Computational Human Performance Models as Design Tools
NASA Technical Reports Server (NTRS)
Jones, Patricia M.
2010-01-01
The design of interactive systems, including levels of automation, displays, and controls, is usually based on design guidelines and iterative empirical prototyping. A complementary approach is to use computational human performance models to evaluate designs. An integrated strategy of model-based and empirical test and evaluation activities is particularly attractive as a methodology for verification and validation of human-rated systems for commercial space. This talk will review several computational human performance modeling approaches and their applicability to design of display and control requirements.
ERIC Educational Resources Information Center
Arntzen, Erik; Halstadtro, Lill-Beathe; Halstadtro, Monica
2009-01-01
The purpose of the study was to extend the literature on verbal self-regulation by using the "silent dog" method to evaluate the role of verbal regulation over nonverbal behavior in 2 individuals with autism. Participants were required to talk aloud while performing functional computer tasks. Then the effects of distracters with increasing demands…
1980 Bibliography of Technical Writing.
ERIC Educational Resources Information Center
Book, Virginia Alm; And Others
1981-01-01
Offers resources on technical writing that were published in 1980. Arranges the citations under 12 categories: bibliographies, books, reviews, and articles on theory and philosophy; pedagogy; writing technical articles and reports; research; technical writing and the computer; graphic/visual aids; correspondence; technical speech; and designing…
The Role of NASA Observations in Understanding Earth System Change
NASA Technical Reports Server (NTRS)
Fladeland, Matthew M.
2009-01-01
This presentation will introduce a non-technical audience to NASA Earth science research goals and the technologies used to achieve them. The talk will outline the primary science focus areas and then provide overviews of current and planned missions, in addition to instruments, aircraft, and other technologies that are used to turn data into useful information for scientists and policy-makers. This presentation is part of an Earth Day symposium at the University of Mary.
Reflections on the Conception, Birth, and Childhood of Numerical Weather Prediction
NASA Astrophysics Data System (ADS)
Lorenz, Edward N.
2006-05-01
In recognition of the contributions of Norman Phillips and Joseph Smagorinsky to the field of numerical weather prediction (NWP), a symposium was held in 2003; this account is an amplification of a talk presented there. Ideas anticipating the advent of NWP, the first technically successful numerical weather forecast, and the subsequent progression of NWP to a mature discipline are described, with special emphasis on the work of Phillips and Smagorinsky and their mentor Jule Charney.
Proceedings of the DoD/Industry Technical Information Conference 7-8 December 1982
1983-04-01
Government Productivity in the U.S. / Limited Research Partnerships for R&D, Dr. Jack Williams ... Industry/DoD Relationship, Mr. Daniel Sullivan, President, Frost & Sullivan Incorporated: "The purpose of my talk is to set the tone by..." ... Roger Collins, Washington, D.C. 20305; Dept. 07-30, Bldg. 9-A1, Lockheed; Daniel A. Donovan, P.O. Box 551, Director, Washington Operations, Burbank, CA 91520
Selected Research and Development Topics on Aerospace Communications at NASA Glenn Research Center
NASA Technical Reports Server (NTRS)
Miranda, Felix A.; Romanofsky, Robert R.; Nessel, James A.
2014-01-01
This presentation discusses some of the efforts on communications R&D that have been performed or are currently underway at NASA Glenn Research Center. The primary purpose of this presentation is to outline some R&D topics to serve as talking points for a Technical Interchange Meeting with the Ohio State University. The meeting is scheduled to take place at the ElectroScience Laboratory of the Ohio State University on February 24, 2014.
2011-03-24
Todd May, special technical assistant to the Marshall Center director, and NASA Administrator Charles Bolden talk with Huntsville City Mayor Tommy Battle, center, during the Marshall Small Business Alliance Meeting March 24 at the Davidson Center for Space Exploration in Huntsville. Battle provided opening remarks at the event, and Bolden welcomed guests and presented the Marshall Center with the NASA Small Business Administrator's Cup award for fiscal year 2010 -- the second time in three years Marshall has brought home this particular award for excellence.
Proceedings: Computer Science and Data Systems Technical Symposium, volume 2
NASA Technical Reports Server (NTRS)
Larsen, Ronald L.; Wallgren, Kenneth
1985-01-01
Progress reports and technical updates of programs being performed by NASA centers are covered. Presentations in viewgraph form, along with abstracts, are included for topics in three categories: computer science, data systems, and space station applications.
Psychology of computer use: XXIV. Computer-related stress among technical college students.
Ballance, C T; Rogers, S U
1991-10-01
Hudiburg's Computer Technology Hassles Scale, along with a measure of global stress and a scale on attitudes toward computers, were administered to 186 students in a two-year technical college. Hudiburg's work with the hassles scale as a measure of "technostress" was affirmed. Moderate, but statistically significant, correlations among the three scales are reported. No relationship between the hassles scale and achievement as measured by GPA was detected.
Virtual Facility at Fermilab: Infrastructure and Services Expand to Public Clouds
Timm, Steve; Garzoglio, Gabriele; Cooper, Glenn; ...
2016-02-18
In preparation for its new Virtual Facility Project, Fermilab has launched a program of work to determine the requirements for running a computation facility on-site, in public clouds, or a combination of both. This program builds on the work we have done to successfully run experimental workflows of 1000-VM scale both on an on-site private cloud and on Amazon AWS. To do this at scale we deployed dynamically launched and discovered caching services on the cloud. We are now testing the deployment of more complicated services on Amazon AWS using native load balancing and auto scaling features they provide. The Virtual Facility Project will design and develop a facility including infrastructure and services that can live on the site of Fermilab, off-site, or a combination of both. We expect to need this capacity to meet the peak computing requirements in the future. The Virtual Facility is intended to provision resources on the public cloud on behalf of the facility as a whole instead of having each experiment or Virtual Organization do it on their own. We will describe the policy aspects of a distributed Virtual Facility, the requirements, and plans to make a detailed comparison of the relative cost of the public and private clouds. Furthermore, this talk will present the details of the technical mechanisms we have developed to date, and the plans currently taking shape for a Virtual Facility at Fermilab.
Cloud-based calculators for fast and reliable access to NOAA's geomagnetic field models
NASA Astrophysics Data System (ADS)
Woods, A.; Nair, M. C.; Boneh, N.; Chulliat, A.
2017-12-01
While the Global Positioning System (GPS) provides accurate point locations, it does not provide pointing directions. Therefore, the absolute directional information provided by the Earth's magnetic field is of primary importance for navigation and for the pointing of technical devices such as aircraft, satellites and, lately, mobile phones. The major magnetic sources that affect compass-based navigation are the Earth's core, its magnetized crust and the electric currents in the ionosphere and magnetosphere. The NOAA/CIRES Geomagnetism group (ngdc.noaa.gov/geomag/) develops and distributes models that describe all these important sources to aid navigation. Our geomagnetic models are used in a variety of platforms including airplanes, ships, submarines and smartphones. While the magnetic field from the Earth's core can be described with relatively few parameters and is suitable for offline computation, the magnetic sources from the Earth's crust, ionosphere and magnetosphere require either significant computational resources or real-time capabilities and are not suitable for offline calculation. This is especially important for small navigational devices or embedded systems, where computational resources are limited. Recognizing the need for fast and reliable access to our geomagnetic field models, we developed cloud-based application program interfaces (APIs) for NOAA's ionospheric and magnetospheric magnetic field models. In this paper we will describe the need for reliable magnetic calculators, the challenges faced in running geomagnetic field models in the cloud in real time and the feedback from our user community. We discuss lessons learned harvesting and validating the data which powers our cloud services, as well as our strategies for maintaining near real-time service, including load-balancing, real-time monitoring, and instance cloning.
We will also briefly talk about the progress we achieved on NOAA's Big Earth Data Initiative (BEDI) funded project to develop API interface to our Enhanced Magnetic Model (EMM).
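To illustrate the kind of lightweight client access such cloud APIs make possible, here is a minimal Python sketch of querying a declination endpoint. The URL, parameter names, and response field are hypothetical placeholders, not NOAA's actual interface (see ngdc.noaa.gov/geomag/ for the real calculators):

```python
import json
from urllib import parse, request

# Hypothetical endpoint -- a placeholder, not NOAA's real API.
API_URL = "https://example.invalid/geomag/declination"

def build_query(lat, lon, date):
    """Encode the request parameters for the (hypothetical) endpoint."""
    return parse.urlencode({"lat": lat, "lon": lon, "date": date})

def get_declination(lat, lon, date):
    """Fetch magnetic declination (degrees east of true north) for a
    point and date from the remote service."""
    with request.urlopen(f"{API_URL}?{build_query(lat, lon, date)}") as resp:
        return json.load(resp)["declination"]
```

Offloading the model evaluation to a service in this way is what lets small or embedded navigation devices use the crustal, ionospheric and magnetospheric corrections without carrying the models, or the computational burden, themselves.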
Grid and Cloud for Developing Countries
NASA Astrophysics Data System (ADS)
Petitdidier, Monique
2014-05-01
The European Grid e-infrastructure has shown the capacity to connect geographically distributed, heterogeneous compute resources in a secure way, taking advantage of a robust and fast REN (Research and Education Network). In many countries, as in Africa, the first step has been to implement a REN, with regional organizations such as UbuntuNet, WACREN or ASREN coordinating the development and improvement of the network and its interconnection. Internet connectivity in those countries is still growing rapidly. The second step has been to meet the computing needs of scientists. Even though many of them have their own laptops, multi-core or not, for more and more applications this is not enough, because they face intensive computing demands due to the large amounts of data to be processed and/or complex codes. So far, one solution has been to go abroad, to Europe or America, to run large applications, or simply not to participate in international communities. The Grid is very attractive for connecting geographically distributed heterogeneous resources, aggregating new ones and creating new sites on the REN with secure access. All users have the same services even if they have no resources in their own institute. With faster and more robust Internet connections they will be able to take advantage of the European Grid. There are different initiatives to provide resources and training, such as the UNESCO/HP Brain Gain initiative and EUMEDGrid. Nowadays the Cloud is becoming very attractive, and clouds are starting to be developed in some of these countries. In this talk, the challenges these countries face in implementing such e-infrastructures and in developing, in parallel, scientific and technical research and education in the new technologies will be presented and illustrated with examples.
1991-07-01
authoring systems. Concurrently, great strides in computer-aided design and computer-aided maintenance have contributed to this capability. ... Junod ... J.; William A. Nugent; and L. John Junod. Plan for the Navy/Air Force Test of the Interactive Electronic Technical Manual (IETM) at Cecil Field ... AFHRL Logistics and Human Factors Division, WPAFB, Aug 1990. ... Junod, John L. PY90 Interactive Electronic Technical Manual (IETM) Portable Delivery
An acceptable role for computers in the aircraft design process
NASA Technical Reports Server (NTRS)
Gregory, T. J.; Roberts, L.
1980-01-01
Some of the reasons why the computerization trend is not wholly accepted are explored for two typical cases: computer use in the technical specialties and computer use in aircraft synthesis. The factors that limit acceptance are traced, in part, to the large resources needed to understand the details of computer programs, the inability to include measured data as input to many of the theoretical programs, and the presentation of final results without supporting intermediate answers. Other factors are due solely to technical issues such as limited detail in aircraft synthesis and major simplifying assumptions in the technical specialties. These factors and others can be influenced by the technical specialist and aircraft designer. Some of these factors may become less significant as the computerization process evolves, but some issues, such as understanding large integrated systems, may remain issues in the future. Suggestions for improved acceptance include publishing computer programs so that they may be reviewed, edited, and read. Other mechanisms include extensive modularization of programs and ways to include measured information as part of the input to theoretical approaches.
Mamykina, Lena; Vawdrey, David K.; Hripcsak, George
2016-01-01
Purpose: To understand how much time residents spend using computers as compared with other activities, and what residents use computers for. Method: This time and motion study was conducted in June and July 2010 at NewYork-Presbyterian/Columbia University Medical Center with seven residents (first-, second-, and third-year) on the general medicine service. An experienced observer shadowed residents during a single day shift, captured all their activities using an iPad application, and took field notes. The activities were captured using a validated taxonomy of clinical activities, expanded to describe computer-based activities with a greater level of detail. Results: Residents spent 364.5 minutes (50.6%) of their shift time using computers, compared with 67.8 minutes (9.4%) interacting with patients. In addition, they spent 292.3 minutes (40.6%) talking with others in person, 186.0 minutes (25.8%) handling paper notes, 79.7 minutes (11.1%) in rounds, 80.0 minutes (11.1%) walking or waiting, and 54.0 minutes (7.5%) talking on the phone. Residents spent 685 minutes (59.6%) multitasking. Computer-based documentation activities amounted to 189.9 minutes (52.1%) of all computer-based activities time, with 128.7 minutes (35.3%) spent writing notes and 27.3 minutes (7.5%) reading notes composed by others. Conclusions: The study showed residents spent considerably more time interacting with computers (over 50% of their shift time), than in direct contact with patients (less than 10% of their shift time). Some of this may be due to an increasing reliance on computing systems for access to patient data, further exacerbated by inefficiencies in the design of the electronic health record. PMID:27028026
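As a quick arithmetic check, the minute counts and percentages reported above are mutually consistent with a 720-minute (12-hour) observed shift; the shift length is inferred here, not stated in the abstract:

```python
# Activity minutes and percentages as reported in the abstract.
reported = {
    "computer use": (364.5, 50.6),
    "patient interaction": (67.8, 9.4),
    "talking with others in person": (292.3, 40.6),
    "handling paper notes": (186.0, 25.8),
    "rounds": (79.7, 11.1),
    "walking or waiting": (80.0, 11.1),
    "talking on the phone": (54.0, 7.5),
}

SHIFT_MINUTES = 720  # inferred 12-hour day shift (assumption, not stated)

for activity, (minutes, pct) in reported.items():
    computed = round(100 * minutes / SHIFT_MINUTES, 1)
    assert abs(computed - pct) <= 0.1, (activity, computed, pct)
    print(f"{activity}: {minutes} min -> {computed}% (reported {pct}%)")
```

Note that the percentages sum to well over 100% precisely because 59.6% of the time involved multitasking, so the activity categories overlap.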
PREFACE: New trends in Computer Simulations in Physics and not only in physics
NASA Astrophysics Data System (ADS)
Shchur, Lev N.; Krashakov, Serge A.
2016-02-01
In this volume we have collected papers based on the presentations given at the International Conference on Computer Simulations in Physics and beyond (CSP2015), held in Moscow, September 6-10, 2015. We hope that this volume will be helpful and scientifically interesting for readers. The Conference was organized for the first time through the joint efforts of the Moscow Institute for Electronics and Mathematics (MIEM) of the National Research University Higher School of Economics, the Landau Institute for Theoretical Physics, and the Science Center in Chernogolovka. The name of the Conference emphasizes the multidisciplinary nature of computational physics. Its methods are applied to a broad range of current research in science and society. The choice of venue was motivated by the multidisciplinary character of the MIEM. It is a former independent university, which has recently become part of the National Research University Higher School of Economics. The Conference Computer Simulations in Physics and beyond (CSP) is planned to be organized biannually. This year's Conference featured 99 presentations, including 21 plenary and invited talks ranging from the analysis of Irish myths with recent methods of statistical physics, to computing with the novel quantum computers D-Wave and D-Wave2. This volume covers various areas of computational physics and emerging subjects within the computational physics community. Each section was preceded by invited talks presenting the latest algorithms and methods in computational physics, as well as new scientific results. Both parallel and poster sessions paid special attention to numerical methods, applications and results. For all the abstracts presented at the conference, please follow the link http://csp2015.ac.ru/files/book5x.pdf
Using the General Mission Analysis Tool (GMAT)
NASA Technical Reports Server (NTRS)
Hughes, Steven P.; Conway, Darrel J.; Parker, Joel
2017-01-01
This is a software tutorial and presentation demonstrating the application of the General Mission Analysis Tool (GMAT). These slides will be used to accompany the demonstration. The demonstration discusses GMAT basics, then presents a detailed example of GMAT application to the Transiting Exoplanet Survey Satellite (TESS) mission. This talk is a combination of existing presentations and material: the system user guide and technical documentation; a GMAT basics overview; and technical presentations from the TESS project on their application of GMAT to critical mission design. The GMAT basics slides are taken from the open-source training material. The TESS slides are a streamlined version of the CDR package provided by the project, with SBU and ITAR data removed by the TESS project. Slides for navigation and optimal control are borrowed from system documentation and training material.
Quantum simulations with noisy quantum computers
NASA Astrophysics Data System (ADS)
Gambetta, Jay
Quantum computing is a new computational paradigm that is expected to lie beyond the standard model of computation. This implies a quantum computer can solve problems that can't be solved by a conventional computer with tractable overhead. To fully harness this power we need a universal fault-tolerant quantum computer. However the overhead in building such a machine is high and a full solution appears to be many years away. Nevertheless, we believe that we can build machines in the near term that cannot be emulated by a conventional computer. It is then interesting to ask what these can be used for. In this talk we will present our advances in simulating complex quantum systems with noisy quantum computers. We will show experimental implementations of this on some small quantum computers.
Optical Interconnection Via Computer-Generated Holograms
NASA Technical Reports Server (NTRS)
Liu, Hua-Kuang; Zhou, Shaomin
1995-01-01
Method of free-space optical interconnection developed for data-processing applications like parallel optical computing, neural-network computing, and switching in optical communication networks. In method, multiple optical connections between multiple sources of light in one array and multiple photodetectors in another array made via computer-generated holograms in electrically addressed spatial light modulators (ESLMs). Offers potential advantages of massive parallelism, high space-bandwidth product, high time-bandwidth product, low power consumption, low cross talk, and low time skew. Also offers advantage of programmability with flexibility of reconfiguration, including variation of strengths of optical connections in real time.
Code of Federal Regulations, 2013 CFR
2013-01-01
... described in paragraph (1) of this definition. Computers or other technical equipment means central... 7 Agriculture 15 2013-01-01 2013-01-01 false Definitions. 3203.3 Section 3203.3 Agriculture..., DEPARTMENT OF AGRICULTURE GUIDELINES FOR THE TRANSFER OF EXCESS COMPUTERS OR OTHER TECHNICAL EQUIPMENT...
ERIC Educational Resources Information Center
Landis, Melodee
2001-01-01
Describes the blurring line between the traditional roles of "systems guys" and "hackers" in the world of computer network professionals, creating a "techie 3.0," or computer professional who has experience and training in technical and non-technical fields and contributes both design expertise and creativity. (EV)
Lightning Talks 2015: Theoretical Division
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shlachter, Jack S.
2015-11-25
This document is a compilation of slides from a number of student presentations given to LANL Theoretical Division members. The subjects cover the range of activities of the Division, including plasma physics, environmental issues, materials research, bacterial resistance to antibiotics, and computational methods.
What is Supercomputing? A Conversation with Kathy Yelick
Yelick, Kathy
2017-12-11
In this highlight video, Jeff Miller, head of Public Affairs, sat down in conversation with Kathy Yelick, Associate Berkeley Lab Director, Computing Sciences, in the second of a series of "powerpoint-free" talks on July 18th 2012, at Berkeley Lab.
Effecting IT infrastructure culture change: management by processes and metrics
NASA Technical Reports Server (NTRS)
Miller, R. L.
2001-01-01
This talk describes the processes and metrics used by Jet Propulsion Laboratory to bring about the required IT infrastructure culture change to update and certify, as Y2K compliant, thousands of computers and millions of lines of code.
2016-06-22
NASA MARSHALL SPACE FLIGHT CENTER DIRECTOR TODD MAY TALKS ABOUT HIS VISION FOR THE CENTER DURING AN ALL-HANDS MEETING JUNE 22 IN MORRIS AUDITORIUM, AND BROADCAST CENTERWIDE. ALSO SPEAKING TO THE MARSHALL TEAM AND TAKING QUESTIONS DURING THE EVENT ARE, FROM LEFT, MARSHALL DEPUTY DIRECTOR JODY SINGER, ASSOCIATE DIRECTOR ROBIN HENDERSON AND ASSOCIATE DIRECTOR, TECHNICAL, PAUL MCCONNAUGHEY. "WE'RE IN THE BUSINESS OF MAKING THE IMPOSSIBLE POSSIBLE," SAID MAY, CITING PROGRESS ON THE SPACE LAUNCH SYSTEM AND THE JOURNEY TO MARS AND RECOUNTING HIGHLIGHTS OF MARSHALL'S 56-YEAR HISTORY.
2004-06-08
KENNEDY SPACE CENTER, FLA. - Paul Curto (left), chief technologist with NASA’s Inventions and Contributions Board, learns about research being done in the Space Life Sciences Lab from Jessica Prenger, senior agricultural engineer. Curto is visiting KSC to talk to innovators and encourage workers to submit technologies for future Space Act Awards. The Inventions and Contributions Board, established in 1958, is a major contributor in rewarding outstanding scientific or technical contributions sponsored, adopted, supported or used by NASA that are significant to aeronautics and space activities.
Knowledge-Based Natural Language Understanding: A AAAI-87 Survey Talk
1987-01-01
a close call can be easily transformed into a regrettable mistake (don’t cry over spilt milk) if G is not characterized as a fleeting goal and a ... that technical literature is characterized by very dry and literal language. If there is one place where metaphors might not intrude, it must be when ... question. What, in your opinion, controls the development of this research from the point of view of both evidential support and falsification? I ask
Knowledge-Based Natural Language Understanding: A AAAI-87 Survey Talk
1991-01-01
easily transformed into a regrettable mistake (don’t cry over spilt milk) if G is not characterized as a fleeting goal and a recovery plan therefore... technical literature is characterized by very dry and literal language. If there is one place where metaphors might not intrude, it must be when people... from the point of view of both evidential support and falsification? I ask it because you didn’t say anything about it. A: Well, I think there’s a lot
Global Precipitation Measurement (GPM) Mission
2017-12-08
Art Azarbarzin, NASA Global Precipitation Measurement (GPM) project manager talks during a technical briefing for the launch of the Global Precipitation Measurement (GPM) Core Observatory aboard an H-IIA rocket, Wednesday, Feb. 26, 2014, Tanegashima Space Center, Japan. Launch is scheduled for early in the morning of Feb. 28 Japan time. Once launched, the GPM spacecraft will collect information that unifies data from an international network of existing and future satellites to map global rainfall and snowfall every three hours. Photo Credit: (NASA/Bill Ingalls)
Proceedings of the twentieth annual meeting of the society for organic petrology
Bragg, Linda J.; Lentz, Erika E.; Warwick, Peter D.; Finkelman, Robert B.; Trippi, Michael H.; Karlsen, Alex W.
2004-01-01
The Society for Organic Petrology (TSOP; pronounced "Tee'-sop") was established in 1984 to consolidate and foster the organizational activities of scientists and engineers involved with coal petrology, kerogen petrology, organic geochemistry, and related disciplines. The following report, "Proceedings of the Twentieth Annual Meeting of The Society for Organic Petrology" (ISSN 1060-7250), features technical talks, poster presentations, business meetings, short courses, and field trips from the Fall 2003 annual meeting held in Washington, D.C.
CrossTalk. The Journal of Defense Software Engineering. Volume 25, Number 5. Sep/Oct 2012
2012-10-01
the threat actors it faces (be they nation states, empowered small agents or cyber-criminals), but also to have an actuarial view of the likelihood... systems thinking, which is full of technical jargon and mathematics. He wanted non-expert educators to be able to teach the concepts to K-12 students... able to conjecture mathematically that decreasing the exposure time window will improve the resilience of a SCIT-based system. To adapt SCIT we
2004-06-08
KENNEDY SPACE CENTER, FLA. - Paul Curto (left), chief technologist with NASA’s Inventions and Contributions Board, learns from bioengineer Tony Rector (right) about a wastewater processing project Rector is working on in the Space Life Sciences Lab. Curto is visiting KSC to talk to innovators and encourage workers to submit technologies for future Space Act Awards. The Inventions and Contributions Board, established in 1958, is a major contributor in rewarding outstanding scientific or technical contributions sponsored, adopted, supported or used by NASA that are significant to aeronautics and space activities.
Development of a Pulsed Pressure-Based Technique for Cavitation Damage Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ren, Fei; Wang, Jy-An John; Liu, Yun
2012-01-01
Cavitation occurs in many fluid systems and can lead to severe material damage. To assist the study of cavitation damage, a novel testing method utilizing pulsed pressure was developed. In this talk, the scientific background and the technical approach of this development are presented and preliminary testing results are discussed. It is expected that this technique can be used to evaluate cavitation damage under various testing conditions including harsh environments such as those relevant to geothermal power generation.
ERIC Educational Resources Information Center
Scheffler, F. L.; And Others
A feasibility study examined the capability of a computer-based system's handling of technical information pertinent to the design of instructional systems. Structured interviews were held to assess the information needs of both researchers and practitioners and an investigation was conducted of 10 computer-based information storage and retrieval…
48 CFR 227.7103-7 - Use and non-disclosure agreement.
Code of Federal Regulations, 2011 CFR
2011-10-01
... recipient shall not, for example, enhance, decompile, disassemble, or reverse engineer the software; time... (b) of this subsection, technical data or computer software delivered to the Government with..., display, or disclose technical data subject to limited rights or computer software subject to restricted...
48 CFR 227.7103-7 - Use and non-disclosure agreement.
Code of Federal Regulations, 2014 CFR
2014-10-01
... recipient shall not, for example, enhance, decompile, disassemble, or reverse engineer the software; time... (b) of this subsection, technical data or computer software delivered to the Government with..., display, or disclose technical data subject to limited rights or computer software subject to restricted...
48 CFR 227.7103-7 - Use and non-disclosure agreement.
Code of Federal Regulations, 2012 CFR
2012-10-01
... recipient shall not, for example, enhance, decompile, disassemble, or reverse engineer the software; time... (b) of this subsection, technical data or computer software delivered to the Government with..., display, or disclose technical data subject to limited rights or computer software subject to restricted...
48 CFR 227.7103-7 - Use and non-disclosure agreement.
Code of Federal Regulations, 2013 CFR
2013-10-01
... recipient shall not, for example, enhance, decompile, disassemble, or reverse engineer the software; time... (b) of this subsection, technical data or computer software delivered to the Government with..., display, or disclose technical data subject to limited rights or computer software subject to restricted...
Automatic Publishing of Library Bulletins.
ERIC Educational Resources Information Center
Inbal, Moshe
1980-01-01
Describes the use of a computer to publish library bulletins that list recent accessions of technical reports according to the subject classification scheme of NTIS/SRIM (National Technical Information Service's Scientific Reports in Microfiche). The codes file, the four computer program functions, and costs/economy are discussed. (JD)
Palmer, Rebecca; Enderby, Pam; Paterson, Gail
2013-01-01
Speech and language therapy (SLT) for aphasia can be difficult to access in the later stages of stroke recovery, despite evidence of continued improvement with sufficient therapeutic intensity. Computerized aphasia therapy has been reported to be useful for independent language practice, providing new opportunities for continued rehabilitation. The success of this option depends on its acceptability to patients and carers. To investigate factors that affect the acceptability of independent home computerized aphasia therapy practice. An acceptability study of computerized therapy was carried out alongside a pilot randomized controlled trial of computer aphasia therapy versus usual care for people more than 6 months post-stroke. Following language assessment and computer exercise prescription by a speech and language therapist, participants practised three times a week for 5 months at home with monthly volunteer support. Semi-structured interviews were conducted with 14 participants who received the intervention and ten carers (n = 24). Questions from a topic guide were presented and answered using picture, gesture and written support. Interviews were audio recorded, transcribed verbatim and analysed thematically. Three research SLTs identified and cross-checked themes and subthemes emerging from the data. The key themes that emerged were benefits and disadvantages of computerized aphasia therapy, need for help and support, and comparisons with face-to-face therapy. The independence, flexibility and repetition afforded by the computer was viewed as beneficial and the personalized exercises motivated participants to practise. Participants and carers perceived improvements in word-finding and in confidence when talking. Computer practice could cause fatigue and interference with other commitments. Support from carers or volunteers for motivation and technical assistance was seen as important.
Although some participants preferred face-to-face therapy, using a computer for independent language practice was perceived to be an acceptable alternative. Independent computerized aphasia therapy is acceptable to stroke survivors. Acceptability can be maximized by tailoring exercises to the personal interests of the individual, ensuring access to support and giving consideration to fatigue and lifestyle when recommending practice schedules. © 2013 Royal College of Speech and Language Therapists.
Training to use motivational interviewing techniques for depression: a cluster randomized trial.
Keeley, Robert D; Burke, Brian L; Brody, David; Dimidjian, Sona; Engel, Matthew; Emsermann, Caroline; deGruy, Frank; Thomas, Marshall; Moralez, Ernesto; Koester, Steve; Kaplan, Jessica
2014-01-01
The goal of this study was to assess the effects of training primary care providers (PCPs) to use Motivational Interviewing (MI) when treating depressed patients on providers' MI performance and patients' expressions of interest in depression treatment ("change talk") and short-term treatment adherence. This was a cluster randomized trial in urban primary care clinics (3 intervention, 4 control). We recruited 21 PCPs (10 intervention, 11 control) and 171 English-speaking patients with newly diagnosed depression (85 intervention, 86 control). MI training included a baseline and up to 2 refresher classroom trainings, along with feedback on audiotaped patient encounters. We report summary measures of technical (rate of MI-consistent statements per 10 minutes during encounters) and relational (global rating of "MI Spirit") MI performance, the association between MI performance and number of MI trainings attended (0, 1, 2, or 3), and rates of patient change talk regarding depression treatments (physical activity, antidepressant medication). We report PCP use of physical activity recommendations and antidepressant prescriptions and patients' short-term physical activity level and prescription fill rates. Use of MI-consistent statements was 26% higher for MI-trained versus control PCPs (P = .005). PCPs attending all 3 MI trainings (n = 6) had 38% higher use of MI-consistent statements (P < .001) and were over 5 times more likely to show beginning proficiency in MI Spirit (P = .036) relative to control PCPs. Although PCPs' use of physical activity recommendations and antidepressant prescriptions was not significantly different by randomization arm, patients seen by MI-trained PCPs had more frequent change talk (P = .001). Patients of MI-trained PCPs also expressed change talk about physical activity 3 times more frequently (P = .01) and reported more physical activity (3.05 vs 1.84 days in the week after the visit; P = .007) than their counterparts visiting untrained PCPs. 
Change talk about antidepressant medication and fill rates were similar by randomization arm (P > .05 for both). MI training resulted in improved MI performance, more depression-related patient change talk, and better short-term adherence. © Copyright 2014 by the American Board of Family Medicine.
An Overview of High Performance Computing and Challenges for the Future
Google Tech Talks
2017-12-09
In this talk we examine how high performance computing has changed over the last 10 years and look toward the future in terms of trends. These changes have had and will continue to have a major impact on our software. A new generation of software libraries and algorithms is needed for the effective and reliable use of (wide area) dynamic, distributed and parallel environments. Some of the software and algorithm challenges have already been encountered, such as management of communication and memory hierarchies through a combination of compile-time and run-time techniques, but the increased scale of computation, depth of memory hierarchies, range of latencies, and increased run-time environment variability will make these problems much harder. We will focus on the redesign of software to fit multicore architectures. Speaker: Jack Dongarra, University of Tennessee, Oak Ridge National Laboratory, University of Manchester. Jack Dongarra received a Bachelor of Science in Mathematics from Chicago State University in 1972 and a Master of Science in Computer Science from the Illinois Institute of Technology in 1973. He received his Ph.D. in Applied Mathematics from the University of New Mexico in 1980. He worked at the Argonne National Laboratory until 1989, becoming a senior scientist. He now holds an appointment as University Distinguished Professor of Computer Science in the Electrical Engineering and Computer Science Department at the University of Tennessee, has the position of a Distinguished Research Staff member in the Computer Science and Mathematics Division at Oak Ridge National Laboratory (ORNL), is a Turing Fellow in the Computer Science and Mathematics Schools at the University of Manchester, and is an Adjunct Professor in the Computer Science Department at Rice University. He specializes in numerical algorithms in linear algebra, parallel computing, the use of advanced computer architectures, programming methodology, and tools for parallel computers. 
His research includes the development, testing and documentation of high quality mathematical software. He has contributed to the design and implementation of the following open source software packages and systems: EISPACK, LINPACK, the BLAS, LAPACK, ScaLAPACK, Netlib, PVM, MPI, NetSolve, Top500, ATLAS, and PAPI. He has published approximately 200 articles, papers, reports and technical memoranda and he is coauthor of several books. He was awarded the IEEE Sid Fernbach Award in 2004 for his contributions in the application of high performance computers using innovative approaches. He is a Fellow of the AAAS, ACM, and the IEEE and a member of the National Academy of Engineering.
Preface to advances in numerical simulation of plasmas
NASA Astrophysics Data System (ADS)
Parker, Scott E.; Chacon, Luis
2016-10-01
This Journal of Computational Physics Special Issue, titled "Advances in Numerical Simulation of Plasmas," presents a snapshot of the international state of the art in the field of computational plasma physics. The articles herein are a subset of the topics presented as invited talks at the 24th International Conference on the Numerical Simulation of Plasmas (ICNSP), August 12-14, 2015 in Golden, Colorado. The choice of papers was highly selective. The ICNSP is held every other year and is the premier scientific meeting in the field of computational plasma physics.
Computing the universe: how large-scale simulations illuminate galaxies and dark energy
NASA Astrophysics Data System (ADS)
O'Shea, Brian
2015-04-01
High-performance and large-scale computing is absolutely essential to understanding astronomical objects such as stars, galaxies, and the cosmic web. This is because these are structures that operate on physical, temporal, and energy scales that cannot be reasonably approximated in the laboratory, and whose complexity and nonlinearity often defy analytic modeling. In this talk, I show how the growth of computing platforms over time has facilitated our understanding of astrophysical and cosmological phenomena, focusing primarily on galaxies and large-scale structure in the Universe.
A European Flagship Programme on Extreme Computing and Climate
NASA Astrophysics Data System (ADS)
Palmer, Tim
2017-04-01
In 2016, an outline proposal co-authored by a number of leading climate modelling scientists from around Europe for a (c. 1 billion euro) flagship project on exascale computing and high-resolution global climate modelling was sent to the EU via its Future and Emerging Flagship Technologies Programme. The project is formally entitled "A Flagship European Programme on Extreme Computing and Climate (EPECC)". In this talk I will outline the reasons why I believe such a project is needed and describe the current status of the project. I will leave time for some discussion.
Distributed computer taxonomy based on O/S structure
NASA Technical Reports Server (NTRS)
Foudriat, Edwin C.
1985-01-01
The taxonomy considers the resource structure at the operating system level. It compares a communication based taxonomy with the new taxonomy to illustrate how the latter does a better job when related to the client's view of the distributed computer. The results illustrate the fundamental features and what is required to construct fully distributed processing systems. The problem of using network computers on the space station is addressed. A detailed discussion of the taxonomy is not given here. Information is given in the form of charts and diagrams that were used to illustrate a talk.
48 CFR 204.7301 - Definitions.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 204.7301 Definitions. As used in this subpart— Adequate security means protective measures that are... restrictions. Cyber incident means actions taken through the use of computer networks that result in an actual.... Technical information means technical data or computer software, as those terms are defined in the clause at...
Planning, Using the New Technology in Classrooms.
ERIC Educational Resources Information Center
Barker, Bruce O.
1990-01-01
"Technology talk" among progressive administrators includes more than just computers and VCRs. New telecommunications developments (in satellites, fiber optics, electronic bulletin boards, electronic mail, and two-way interactive instructional delivery) are "hot topics" that both principals and teachers must learn about. Peer interactions and…
Parmitano and Cassidy in U.S. Lab
2013-05-31
ISS036-E-005515 (31 May 2013) --- European Space Agency astronaut Luca Parmitano (left) and NASA astronaut Chris Cassidy talk with fellow human beings on Earth using videoconferencing software and one of their on-board laptop computers in the U.S. lab Destiny.
76 FR 64330 - Advanced Scientific Computing Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-18
... talks on HPC Reliability, Diffusion on Complex Networks, and Reversible Software Execution Systems Report from Applied Math Workshop on Mathematics for the Analysis, Simulation, and Optimization of Complex Systems Report from ASCR-BES Workshop on Data Challenges from Next Generation Facilities Public...
A History of the ARPANET: The First Decade
1981-04-01
10 2.2 Major Technical Problems and Approaches 11-12 2.3 Major Changes in Objectives and Approaches 11-19 3.3 Scientific and Technical Results and...successful projects ever undertaken by DARPA! The program has initiated extensive changes in the Defense Department's use of computers as well as in...the ARPANET project represents a similarly far-reaching change in the use of computers by mankind. The full impact of the technical changes set in
ERIC Educational Resources Information Center
Butler, A. K.; And Others
The performance/design requirements and a detailed technical description for a Computer-Directed Training Subsystem to be integrated into the Air Force Phase II Base Level System are described. The subsystem may be used for computer-assisted lesson construction and has presentation capability for on-the-job training for data automation, staff, and…
A Research Program in Computer Technology. 1987 Annual Technical Report
1990-07-01
TITLE (Include Security Classification) 1987 Annual Technical Report: "A Research Program in Computer Technology" (Unclassified) 12. PERSONAL AUTHOR(S) IS...distributed processing, survivable networks 17. NCE: distributed processing, local networks, personal computers, workstation environment 18. SC Dev...are the authors and should not be interpreted as representing the official opinion or policy of DARPA, the U.S. Government, or any person or agency
Extreme Science (LBNL Science at the Theater)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ajo-Franklin, Caroline; Klein, Spencer; Minor, Andrew
On Feb. 27, 2012 at the Berkeley Repertory Theatre, four Berkeley Lab scientists presented talks related to extreme science - and what it means to you. Topics include: Neutrino hunting in Antarctica. Learn why Spencer Klein goes to the ends of the Earth to search for these ghostly particles. From Chernobyl to Central Asia, Tamas Torok travels the globe to study microbial diversity in extreme environments. Andrew Minor uses the world's most advanced electron microscopes to explore materials at ultrahigh stresses and in harsh environments. And microbes that talk to computers? Caroline Ajo-Franklin is pioneering cellular-electrical connections that could help transform sunlight into fuel.
Quality user support: Supporting quality users
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woolley, T.C.
1994-12-31
During the past decade, fundamental changes have occurred in technical computing in the oil industry. Technical computing systems have moved from local, fragmented quantity, to global, integrated, quality. The compute power available to the average geoscientist at his desktop has grown exponentially. Technical computing applications have increased in integration and complexity. At the same time, there has been a significant change in the work force due to the pressures of restructuring, and the increased focus on international opportunities. The profile of the user of technical computing resources has changed. Users are generally more mature, knowledgeable, and team oriented than their predecessors. In the 1990s, computer literacy is a requirement. This paper describes the steps taken by Oryx Energy Company to address the problems and opportunities created by the explosive growth in computing power and needs, coupled with the contraction of the business. A successful user support strategy will be described. Characteristics of the program include: (1) Client driven support; (2) Empowerment of highly skilled professionals to fill the support role; (3) Routine and ongoing modification to the support plan; (4) Utilization of the support assignment to create highly trained advocates on the line; (5) Integration of the support role to the reservoir management team. Results of the plan include a highly trained work force, stakeholder teams that include support personnel, and global support from a centralized support organization.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-04
..., Airborne Automatic Dead Reckoning Computer Equipment Utilizing Aircraft Heading and Doppler Ground Speed.... ACTION: Notice of cancellation of Technical Standard Order (TSO)-C68a, Airborne Automatic Dead Reckoning... . SUPPLEMENTARY INFORMATION: Background Doppler radar is a semiautomatic self-contained dead reckoning navigation...
Technical Advances and Fifth Grade Reading Comprehension: Do Students Benefit?
ERIC Educational Resources Information Center
Fountaine, Drew
This paper takes a look at some recent studies on utilization of technical tools, primarily personal computers and software, for improving fifth-grade students' reading comprehension. Specifically, the paper asks what benefits an educator can expect students to derive from closed-captioning and computer-assisted reading comprehension products. It…
Word Processing for Technical Writers and Teachers.
ERIC Educational Resources Information Center
Mullins, Carolyn J.; West, Thomas W.
This discussion of the computing network and word processing facilities available to professionals on the Indiana University campuses identifies the word and text processing needs of technical writers and faculty, describes the current computing network, and outlines both long- and short-range objectives, policies, and plans for meeting these…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-28
... information as part of the research needed to write a NIST Special Publication (SP) to help Computer Security.... The NIST SP will identify technical standards, methodologies, procedures, and processes that facilitate prompt and effective response. This RFI requests information regarding technical best practices...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-20
... technical data package, in cases where the Government may have funded only a small portion of the... subcontractor's asserted restrictions on technical data and computer software. DATES: Effective date: September... data and computer software. More specifically, the final rule affects these validation procedures in...
Space lab system analysis: Advanced Solid Rocket Motor (ASRM) communications networks analysis
NASA Technical Reports Server (NTRS)
Ingels, Frank M.; Moorhead, Robert J., II; Moorhead, Jane N.; Shearin, C. Mark; Thompson, Dale R.
1990-01-01
A synopsis of research on computer viruses and computer security is presented. A review of seven technical meetings attended is compiled. A technical discussion on the communication plans for the ASRM facility is presented, with a brief tutorial on the potential local area network media and protocols.
CAD/CAM (Computer-Aided Design/Computer-Aided Manufacturing) Highlights.
1984-10-01
Differential modal Zernike wavefront sensor employing a computer-generated hologram: a proposal.
Mishra, Sanjay K; Bhatt, Rahul; Mohan, Devendra; Gupta, Arun Kumar; Sharma, Anurag
2009-11-20
The process of Zernike mode detection with a Shack-Hartmann wavefront sensor is computationally extensive. A holographic modal wavefront sensor has therefore evolved to process the data optically by use of the concept of equal and opposite phase bias. Recently, a multiplexed computer-generated hologram (CGH) technique was developed in which the output is in the form of bright dots that specify the presence and strength of a specific Zernike mode. We propose a wavefront sensor using the concept of phase biasing in the latter technique such that the output is a pair of bright dots for each mode to be sensed. A normalized difference signal between the intensities of the two dots is proportional to the amplitude of the sensed Zernike mode. In our method the number of holograms to be multiplexed is decreased, thereby reducing the modal cross talk significantly. We validated the proposed method through simulation studies for several cases. The simulation results demonstrate simultaneous wavefront detection of lower-order Zernike modes with a resolution better than λ/50 for the wide measurement range of ±3.5λ with much reduced cross talk at high speed.
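The normalized-difference readout described in this abstract can be sketched in a few lines. The names `i_plus`, `i_minus`, and the calibration factor `k` are hypothetical, since the abstract states only the proportionality, not an interface:

```python
def mode_amplitude(i_plus: float, i_minus: float, k: float = 1.0) -> float:
    """Estimate a Zernike mode amplitude from the intensities of the
    two oppositely biased spots: the normalized difference of the two
    intensities, scaled by a calibration factor k (assumed here),
    is proportional to the sensed mode's amplitude."""
    return k * (i_plus - i_minus) / (i_plus + i_minus)
```

With this reading, equal spot intensities give a zero signal (the mode is absent), and the sign of the signal indicates the sign of the aberration.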
Chitale, Rohan; Ghobrial, George M; Lobel, Darlene; Harrop, James
2013-10-01
The learning and development of technical skills are paramount for neurosurgical trainees. External influences and a need for maximizing efficiency and proficiency have encouraged advancements in simulator-based learning models. To confirm the importance of establishing an educational curriculum for teaching minimally invasive techniques of pedicle screw placement using a computer-enhanced physical model of percutaneous pedicle screw placement with simultaneous didactic and technical components. A 2-hour educational curriculum was created to educate neurosurgical residents on anatomy, pathophysiology, and technical aspects associated with image-guided pedicle screw placement. Predidactic and postdidactic practical and written scores were analyzed and compared. Scores were calculated for each participant on the basis of the optimal pedicle screw starting point and trajectory for both fluoroscopy and computed tomographic navigation. Eight trainees participated in this module. Average mean scores on the written didactic test improved from 78% to 100%. The technical component scores for fluoroscopic guidance improved from 58.8 to 52.9. Technical score for computed tomography-navigated guidance also improved from 28.3 to 26.6. Didactic and technical quantitative scores with a simulator-based educational curriculum improved objectively measured resident performance. A minimally invasive spine simulation model and curriculum may serve a valuable function in the education of neurosurgical residents and outcomes for patients.
Preface: C/NOFS Results and Equatorial Ionospheric Dynamics
NASA Technical Reports Server (NTRS)
Klenzing, J.; de La Beaujardiere, O.; Gentile, L. C.; Retterer, J.; Rodrigues, F. S.; Stoneback, R. A.
2014-01-01
The Communication/Navigation Outage Forecasting System (C/NOFS) satellite was launched into orbit in April 2008 as part of an ongoing effort to understand and identify plasma irregularities that adversely impact the propagation of radio waves in the upper atmosphere. Combined with recent improvements in radar, airglow, and ground-based studies, as well as state-of-the-art modeling techniques, the C/NOFS mission has led to new insights into equatorial ionospheric electrodynamics. In order to document these advances, the C/NOFS Results and Equatorial Dynamics Technical Interchange Meeting was held in Albuquerque, New Mexico from 12 to 14 March 2013. The meeting was a great success with 55 talks and 22 posters, and covered topics including the numerical simulations of plasma irregularities; the effects of atmospheric tides, stratospheric phenomena, and magnetic storms on the upper atmosphere; causes and predictions of scintillation-causing ionospheric irregularities; and current and future instrumentation efforts in the equatorial region. The talks were broken into the following three topical sessions: A. Ambient Ionosphere and Thermosphere B. Transient Phenomena in the Low-Latitude Ionosphere C. New Missions, New Sensors, New Science and Engineering Issues. The following special issue was planned as a follow-up to the meeting. We would like to thank Mike Pinnock, the editors and staff of Copernicus, and our reviewers for their work in bringing this special issue to the scientific community. Our thanks also go to Patricia Doherty and the meeting organizing committee for arranging the C/NOFS Technical Interchange Meeting.
1984-04-01
[Flattened table and figure residue from a 1984 report on pilots with technical degrees; the recoverable fragments are degree-field percentages (Architecture, Computer Science, Math, Meteorology, Physics; Electrical and Aero engineering) and figure captions including "Pilots with Technical Degrees by Major Weapon System", "FIGURE 4 - Pilots with Technical Degrees by Category", and "FIGURE 5 - Regression".]
GenePRIMP: Improving Microbial Gene Prediction Quality
Pati, Amrita
2018-01-24
Amrita Pati of the DOE Joint Genome Institute's Genome Biology group talks about a computational pipeline that evaluates the accuracy of gene models in genomes and metagenomes at different stages of finishing at the "Sequencing, Finishing, Analysis in the Future" meeting in Santa Fe, NM.
Tree fruit orchard of the future: An overview
USDA-ARS?s Scientific Manuscript database
Mechanization has been prevailing in row crops over the past decades, and now gradually in some fruit crops, with integration of innovative computers, robotics, mechanics, and precision orchard management. This talk will give an overview of challenges facing commercial fruit industries and needs of ...
Proving Program Termination With Matrix Weighted Digraphs
NASA Technical Reports Server (NTRS)
Dutle, Aaron
2015-01-01
Program termination analysis is an important task in logic and computer science. While determining if a program terminates is known to be undecidable in general, there has been a significant amount of attention given to finding sufficient and computationally practical conditions to prove termination. One such method takes a program and builds from it a matrix weighted digraph. These are directed graphs whose edges are labeled by square matrices with entries in {-1,0,1}, equipped with a nonstandard matrix multiplication. Certain properties of this digraph are known to imply the termination of the related program. In particular, termination of the program can be determined from the weights of the circuits in the digraph. In this talk, the motivation for addressing termination and how matrix weighted digraphs arise will be briefly discussed. The remainder of the talk will describe an efficient method for bounding the weights of a finite set of the circuits in a matrix weighted digraph, which allows termination of the related program to be deduced.
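As a rough illustration of how edge labels compose along a path in such a digraph, here is one plausible reading of a "nonstandard multiplication" over entries in {-1, 0, 1}; this semantics (-1 = strict decrease, 0 = non-increase, 1 = no known relation, with saturated addition through each intermediate and the strongest relation kept) is an assumption for illustration, not the definition used in the talk:

```python
NO_REL = 1  # sentinel entry: no known relation between the two variables

def compose(A, B):
    """Compose two square edge matrices with entries in {-1, 0, 1}.

    Hypothetical semantics, assumed for illustration: -1 means strict
    decrease, 0 means non-increase, 1 means no known relation.  A
    relation from i to j survives composition if some intermediate k
    relates both sides; entry sums saturate at -1 (a decrease cannot
    get "more strict"), and the strongest (smallest) relation wins.
    """
    n = len(A)
    C = [[NO_REL] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            best = NO_REL
            for k in range(n):
                a, b = A[i][k], B[k][j]
                if a == NO_REL or b == NO_REL:
                    continue  # no relation through this intermediate
                best = min(best, max(a + b, -1))  # saturated addition
            C[i][j] = best
    return C
```

Under this reading, the weight of a circuit is the composition of its edge matrices; a circuit whose weight records a strict decrease on some variable is the kind of evidence a termination argument can use.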
NASA's EOSDIS Cumulus: Ingesting, Archiving, Managing, and Distributing from Commercial Cloud
NASA Astrophysics Data System (ADS)
Baynes, K.; Ramachandran, R.; Pilone, D.; Quinn, P.; Schuler, I.; Gilman, J.; Jazayeri, A.
2017-12-01
NASA's Earth Observing System Data and Information System (EOSDIS) has been working towards a vision of a cloud-based, highly-flexible ingest, archive, management, and distribution system for its ever-growing and evolving data holdings. This system, Cumulus, is emerging from its prototyping stages and is poised to make a huge impact on how NASA manages and disseminates its Earth science data. This talk will outline the motivation for this work, present the achievements and hurdles of the past 18 months, and chart a course for the future expansion of Cumulus. We will explore not just the technical, but also the socio-technical challenges that we face in evolving a system of this magnitude into the cloud, and how we are rising to meet those challenges through open collaboration and intentional stakeholder engagement.
Review of Excess Weapons Plutonium Disposition LLNL Contract Work in Russia-(English)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jardine, L; Borisov, G B
This third meeting of the recently completed and ongoing Russian plutonium immobilization contract work was held at the State Education Center (SEC) in St. Petersburg on January 14-18, 2002. The meeting agenda is reprinted here as Appendix A and the attendance list as Appendix B. The meeting had 58 Russian participants from 21 Russian organizations, including the industrial sites (Mayak, Krasonayarsk-26, Tomsk), scientific institutes (VNIINM, KRI, VNIPIPT, RIAR), design organizations (VNIPIET and GSPI), universities (Nyzhny Novgorod, Urals Technical), Russian Academy of Sciences (Institute of Physical Chemistry or IPhCh, Institute of Ore-Deposit Geology, Petrography, Mineralogy, and Geochemistry or IGEM), Radon-Moscow, S&TC Podol'osk, Kharkov-Ukraine, GAN-SEC-NRS and SNIIChM, the RF Ministry of Atomic Energy (Minatom) and Gosatomnadzor (GAN). This volume, published by LLNL, documents this third annual meeting. Forty-nine technical papers were presented by the Russian participants, and nearly all of these have been collected in this Proceedings. The two objectives for the meeting were to: (1) Bring together the Russian organizations, experts, and managers performing this contract work into one place for four days to review and discuss their work amongst each other. (2) Publish a meeting summary and proceedings of all the excellent Russian plutonium immobilization and other plutonium disposition contract work in one document so that the wide extent of the Russian immobilization activities are documented, referenceable and available for others to use, as were the Proceedings of the two previous meetings. Attendees gave talks describing their LLNL contract work and submitted written papers documenting their contract work (in English and Russian), in both hard copy and on computer disks. Simultaneous translation into Russian and English was used for presentations made at the State Region Educational Center (SEC).
12th IAEA Technical Meeting on Energetic Particles in Magnetic Confinement Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berk, Herbert L.; Breizman, Boris N.
2014-02-21
The 12th IAEA Technical Meeting on Energetic Particles in Magnetic Confinement Systems took place in Austin, Texas (7–11 September 2011). This meeting was organized jointly with the 5th IAEA Technical Meeting on Theory of Plasma Instabilities (5–7 September 2011). The two meetings shared one day (7 September 2011) with presentations relevant to both groups. Some of the work reported at these meetings was then published in a special issue of Nuclear Fusion [Nucl. Fusion 52 (2012)]. Summaries of the Energetic Particle Conference presentations were given by Kazuo Toi and Boris Breizman. They respectively discussed the experimental and theoretical progress presented at the meeting. Highlights of this meeting include the tremendous progress that has been achieved in the development of diagnostics that enables the ‘viewing’ of internal fluctuations and allows comparison with theoretical predictions, as demonstrated, for example, in the talks of P. Lauber and M. Osakabe. The need and development of hardened diagnostics in the severe radiation environment, such as those that will exist in ITER, was discussed in the talks of V. Kiptily and V.A. Kazakhov. In theoretical studies, much of the effort is focused on nonlinear phenomena. For example, detailed comparison of theory and experiment on D-III-D on the n = 0 geodesic mode was reported in separate papers by R. Nazikian and G. Fu. A large number of theoretical papers were presented on wave chirping including a paper by B.N. Breizman, which notes that wave chirping from a single frequency may emanate continuously once marginal stability conditions have been established. Another area of wide interest was the detailed study of alpha orbits in a burning plasma, where losses can come from symmetry breaking due to finite coil number or magnetic field imperfections introduced by diagnostic or test modules. An important area of development, covered by M.A. Hole and D.A. 
Spong, is concerned with the self-consistent treatment of the induced fields that accounts for the toroidally asymmetric MHD response. In addition, a significant number of studies focused on understanding nonlinear behavior by means of computer simulation of energetic-particle-driven instability. An under-represented area of investigation was the study of electron runaway formation during major tokamak disruptions. It was noted in an overview by S. Putvinski that electron energies in the 10–20 MeV range are to be expected during projected major disruptions in ITER and that reliable methods for mitigation of the runaway process need to be developed. Significant recent work in the field of disruption-induced electron runaway, which was reported by J. Riemann, had been submitted to Physics of Plasmas [3]. Overall, it is clear that reliable mitigation of electron runaway is an extremely important topic that is in need of better understanding and solutions.
Proceedings: 1979 Third Annual Practical Conference on Communication
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1981-04-01
Topics covered at the meeting include: nonacademic writing, writer and editor training in technical publications, readability of technical documents, guide for beginning technical editors, a visual aids data base, newsletter publishing, style guide for a project management organization, word processing, computer graphics, text management for technical documentation, and typographical terminology.
Technical Education Outreach in Materials Science and Technology Based on NASA's Materials Research
NASA Technical Reports Server (NTRS)
Jacobs, James A.
2003-01-01
The grant NAG-1-2125, Technical Education Outreach in Materials Science and Technology, based on NASA's Materials Research, involves a collaborative effort among the National Aeronautics and Space Administration's Langley Research Center (NASA-LaRC), Norfolk State University (NSU), national research centers, private industry, technical societies, and colleges and universities. The collaboration aims to strengthen math, science, and technology education by providing outreach related to materials science and technology (MST). The goal of the project is to transfer new developments from LaRC's Center for Excellence for Structures and Materials and other NASA materials research into technical education across the nation to provide educational outreach and strengthen technical education. To achieve this goal we are employing two main strategies: 1) development of the gateway website
IDEA Technical Report No. 2. Description of Data Base, 1976-77.
ERIC Educational Resources Information Center
Cashin, William E.; Slawson, Hugh M.
The data and computational procedures used by the IDEA system at Kansas State University (during the 1976-77 academic year) to interpret ratings of teacher performance are described in this technical report. The computations for each of the seven parts (evaluation, course description, students' self ratings, methods, additional questions,…
IDEA Technical Report No. 3. Description of Data Base, 1977-78.
ERIC Educational Resources Information Center
Cashin, William E.; Slawson, Hugh M.
The data and computational procedures used by the IDEA System during the 1977-78 academic year at Kansas State University to interpret ratings of teacher performance are described in this technical report. The computations for each of the seven parts (evaluation, course description, students' self-ratings, methods, additional questions, diagnostic…
ERIC Educational Resources Information Center
Mitzel, Harold E.; Brandon, George L.
A series of five reports is presented which describes the activities carried out by the Pennsylvania State University group engaged in research in computer-assisted instruction (CAI) in vocational-technical education. The reports cover the period January 1968-June 1968 and deal with: 1) prior knowledge and individualized instruction; 2) numerical…
ERIC Educational Resources Information Center
Merrimack Education Center, Chelmsford, MA.
This report of the Technical Assistance Study provided to the Smithville Public Schools by the Technology Lighthouse of the Merrimack Education Center offers information for use in planning computer technology applications over a 3-year period. It provides specific guidelines and criteria for planning and development, equipment considerations,…
Research in the Classroom: Talk, Texts, and Inquiry.
ERIC Educational Resources Information Center
Donoahue, Zoe, Ed.; And Others
This book presents nine studies conducted by teacher researchers who explore the oral and written discourse of learning communities--communities of students, communities of teachers, and communities in which students and teachers learn together. The studies focus on journal writing, conversation, story telling, geometry, computer technology, and…
Problems in Decentralized Decision making and Computation.
1984-12-01
synthesis being referred to. Findeisen [1982] clarifies this distinction by talking about the "programming" and "execution" phases.) 5. The lower and higher… Findeisen, W. (1982), "Decentralized and Hierarchical Control Under Consistency or Disagreement of Interest," Automatica, Vol. 18.
Natural Language Description of Emotion
ERIC Educational Resources Information Center
Kazemzadeh, Abe
2013-01-01
This dissertation studies how people describe emotions with language and how computers can simulate this descriptive behavior. Although many non-human animals can express their current emotions as social signals, only humans can communicate about emotions symbolically. This symbolic communication of emotion allows us to talk about emotions that we…
Content Analysis: What Are They Talking About?
ERIC Educational Resources Information Center
Strijbos, Jan-Willem; Martens, Rob L.; Prins, Frans J.; Jochems, Wim M. G.
2006-01-01
Quantitative content analysis is increasingly used to surpass surface level analyses in computer-supported collaborative learning (e.g., counting messages), but critical reflection on accepted practice has generally not been reported. A review of CSCL conference proceedings revealed a general vagueness in definitions of units of analysis. In…
Development Communication Report, No. 41, March 1983.
ERIC Educational Resources Information Center
Development Communication Report, 1983
1983-01-01
This newsletter on development projects in developing nations includes the following major articles: (1) "An Insider's Perspective: Dr. Henry Cassirer Talks to DCR about Development Communication and Unesco"; (2) "Comic Books Carry Health Messages to Rural Children in Honduras," by Oscar Vigano; (3) "Computers Come to the…
ERIC Educational Resources Information Center
Manzo, Kathleen Kennedy
2005-01-01
In this article, the author talks about Moore Square Museums Magnet Middle School, where educators are blending technology more heavily into the teaching of arts and music than most other schools. Arts teachers are integrating computer software with traditional instruction in dance, music, theater and visual arts to spark students' creativity. It…
HexSim - A general purpose framework for spatially-explicit, individual-based modeling
HexSim is a framework for constructing spatially-explicit, individual-based computer models designed for simulating terrestrial wildlife population dynamics and interactions. HexSim is useful for a broad set of modeling applications. This talk will focus on a subset of those ap...
ERIC Educational Resources Information Center
Nemirovsky, Ricardo; Tierney, Cornelia; Wright, Tracy
1998-01-01
Analyzed two children's use of a computer-based motion detector to make sense of symbolic expressions (Cartesian graphs). Found three themes: (1) tool perspectives, efforts to understand graphical responses to body motion; (2) fusion, emergent ways of talking and behaving that merge symbols and referents; and (3) graphical spaces, when changing…
NASA Astrophysics Data System (ADS)
Laws, Priscilla
2010-02-01
In June 1986 Ronald Thornton (at the Tufts University Center for Science and Mathematics Teaching) and Priscilla Laws (at Dickinson College) applied independently for grants to develop curricular materials based on both the outcomes of Physics Education Research and the use of Microcomputer Based Laboratory Tools (MBL) developed by Robert Tinker, Ron Thornton and others at Technical Education Research Centers (TERC). Thornton proposed to develop a series of Tools for Scientific Thinking (TST) laboratory exercises to address known learning difficulties using carefully sequenced MBL observations. These TST laboratories were to be beta-tested at several types of institutions. Laws proposed to develop a Workshop Physics Activity Guide for a two-semester calculus-based introductory course sequence centering on MBL-based guided inquiry. Workshop Physics was designed to replace traditional lectures and separate labs in relatively small classes and was to be tested at Dickinson College. In September 1986 a project officer at the Fund for the Improvement of Postsecondary Education (FIPSE) awarded grants to Laws and Thornton provided that they would collaborate. David Sokoloff (at the University of Oregon) joined Thornton to develop and test the TST laboratories. This talk will describe the 23-year collaboration between Thornton, Laws, and Sokoloff that led to the development of a suite of Activity Based Physics curricular materials, new apparatus, and enhanced computer tools for real-time graphing, data collection, and mathematical modeling. The Suite includes TST Labs, the Workshop Physics Activity Guide, RealTime Physics Laboratory Modules, and a series of Interactive Lecture Demonstrations. A textbook and a guide to using the Suite were also developed. The vital importance of obtaining continued grant support, doing continuous research on student learning, collaborating with instructors at other institutions, and forging relationships with vendors and publishers will be described.
[Virtual audiovisual talking heads: articulatory data and models--applications].
Badin, P; Elisei, F; Bailly, G; Savariaux, C; Serrurier, A; Tarabalka, Y
2007-01-01
In the framework of experimental phonetics, our approach to the study of speech production is based on the measurement, the analysis, and the modeling of orofacial articulators such as the jaw, the face and the lips, the tongue, and the velum. We therefore present in this article experimental techniques that allow characterizing the shape and movement of speech articulators (static and dynamic MRI, computed tomodensitometry, electromagnetic articulography, video recording). We then describe the linear models of the various organs that we can construct from speaker-specific articulatory data. We show that these models, which exhibit good geometrical resolution, can be controlled from articulatory data with good temporal resolution and can thus permit the reconstruction of high-quality animation of the articulators. These models, which we have integrated in a virtual talking head, can produce augmented audiovisual speech. In this framework, we have assessed the natural tongue-reading capabilities of human subjects by means of audiovisual perception tests. We conclude by suggesting a number of other applications of talking heads.
The “Common Solutions” Strategy of the Experiment Support group at CERN for the LHC Experiments
NASA Astrophysics Data System (ADS)
Girone, M.; Andreeva, J.; Barreiro Megino, F. H.; Campana, S.; Cinquilli, M.; Di Girolamo, A.; Dimou, M.; Giordano, D.; Karavakis, E.; Kenyon, M. J.; Kokozkiewicz, L.; Lanciotti, E.; Litmaath, M.; Magini, N.; Negri, G.; Roiser, S.; Saiz, P.; Saiz Santos, M. D.; Schovancova, J.; Sciabà, A.; Spiga, D.; Trentadue, R.; Tuckett, D.; Valassi, A.; Van der Ster, D. C.; Shiers, J. D.
2012-12-01
After two years of LHC data taking, processing and analysis and with numerous changes in computing technology, a number of aspects of the experiments’ computing, as well as WLCG deployment and operations, need to evolve. As part of the activities of the Experiment Support group in CERN's IT department, and reinforced by effort from the EGI-InSPIRE project, we present work aimed at common solutions across all LHC experiments. Such solutions allow us not only to optimize development manpower but also offer lower long-term maintenance and support costs. The main areas cover Distributed Data Management, Data Analysis, Monitoring and the LCG Persistency Framework. Specific tools have been developed including the HammerCloud framework, automated services for data placement, data cleaning and data integrity (such as the data popularity service for CMS, the common Victor cleaning agent for ATLAS and CMS and tools for catalogue/storage consistency), the Dashboard Monitoring framework (job monitoring, data management monitoring, File Transfer monitoring) and the Site Status Board. This talk focuses primarily on the strategic aspects of providing such common solutions and how this relates to the overall goals of long-term sustainability and the relationship to the various WLCG Technical Evolution Groups. The success of the service components has given us confidence in the process, and has developed the trust of the stakeholders. We are now attempting to expand the development of common solutions into the more critical workflows. The first is a feasibility study of common analysis workflow execution elements between ATLAS and CMS. We look forward to additional common development in the future.
NASA Astrophysics Data System (ADS)
Bunge, Hans-Peter
2002-08-01
Earth's mantle overturns itself about once every 200 million years (myrs). Prima facie evidence for this overturn is the motion of tectonic plates at the surface of the Earth driving the geologic activity of our planet. Supporting evidence also comes from seismic tomograms of the Earth's interior that reveal the convective currents in remarkable clarity. Much has been learned about the physics of solid-state mantle convection over the past two decades, aided primarily by sophisticated computer simulations. Such simulations are reaching the threshold of fully resolving the convective system globally. In this talk we will review recent progress in mantle dynamics studies. We will then turn our attention to the fundamental question of whether it is possible to explicitly reconstruct mantle flow back in time. This is a classic problem of history matching, amenable to control theory and data assimilation. The technical advances that make such an approach feasible are dramatically increasing compute resources, represented for example through Beowulf clusters, and new observational initiatives, represented for example through the US-Array effort that should lead to an order-of-magnitude improvement in our ability to resolve Earth structure seismically below North America. In fact, new observational constraints on deep Earth structure illustrate the growing importance of improving our data assimilation skills in deep Earth models. We will explore data assimilation through high-resolution global adjoint models of mantle circulation and conclude that it is feasible to reconstruct mantle flow back in time for at least the past 100 myrs.
NASA Technical Reports Server (NTRS)
Olsen, Lola
1992-01-01
In addition to the discussions, Ocean Climate Data Workshop hosts gave participants an opportunity to hear about, see, and test for themselves some of the latest computer tools now available for those studying climate change and the oceans. Six speakers described computer systems and their functions. The introductory talks were followed by demonstrations to small groups of participants and some opportunities for participants to get hands-on experience. After this familiarization period, attendees were invited to return during the course of the Workshop and have one-on-one discussions and further hands-on experience with these systems. Brief summaries or abstracts of introductory presentations are addressed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lucas, Robert; Ang, James; Bergman, Keren
2014-02-10
Exascale computing systems are essential for the scientific fields that will transform the 21st century global economy, including energy, biotechnology, nanotechnology, and materials science. Progress in these fields is predicated on the ability to perform advanced scientific and engineering simulations, and analyze the deluge of data. On July 29, 2013, ASCAC was charged by Patricia Dehmer, the Acting Director of the Office of Science, to assemble a subcommittee to provide advice on exascale computing. This subcommittee was directed to return a list of no more than ten technical approaches (hardware and software) that will enable the development of a system that achieves the Department's goals for exascale computing. Numerous reports over the past few years have documented the technical challenges and the non-viability of simply scaling existing computer designs to reach exascale. The technical challenges revolve around energy consumption, memory performance, resilience, extreme concurrency, and big data. Drawing from these reports and more recent experience, this ASCAC subcommittee has identified the top ten computing technology advancements that are critical to making a capable, economically viable, exascale system.
Aquarius SAC-D Post-Launch Briefing
2011-06-10
Conrado Varotto, CONAE Executive and Technical Director, Buenos Aires, talks during the Aquarius/SAC-D post-launch press conference on Friday, June 10, 2011 at the NASA Resident Office, Vandenberg Air Force Base, Calif. The joint U.S./Argentinian Aquarius/Satélite de Aplicaciones Científicas (SAC)-D mission, launched earlier on Friday June 10, will map the salinity at the ocean surface, information critical to improving our understanding of two major components of Earth's climate system: the water cycle and ocean circulation. Photo Credit: (NASA/Bill Ingalls)
2014 Summer Series - Rusty Schweickart - Dinosaur Syndrome Avoidance Project: How Gozit?
2014-07-17
The 2013 Chelyabinsk meteor demonstrated that grave uncertainties exist pertaining to near-Earth objects (NEOs). Although the impact rate for dangerous asteroids is relatively low, the consequences of such an event are severe. Apollo Astronaut Rusty Schweickart will talk about our prospects of avoiding the same fate as the dinosaurs. He will review the status of the global efforts to protect life on the planet from the devastation of large asteroid impacts. He will also discuss both the technical and geopolitical components of the challenge of preventing future asteroid impacts.
New Concepts for Far-Infrared and Submillimeter Space Astronomy
NASA Technical Reports Server (NTRS)
Benford, Dominic J. (Editor); Leisawitz, David T. (Editor)
2004-01-01
The Second Workshop on New Concepts for Far-Infrared and Submillimeter Space Astronomy aimed to highlight the groundbreaking opportunities available for astronomical investigations in the far-infrared to submillimeter using advanced, space-based telescopes. Held at the University of Maryland on March 7-8, 2002, the Workshop was attended by 130 participants from 50 institutions, and represented scientists and engineers from many countries and with a wide variety of experience. The technical content featured 17 invited talks and 44 contributed posters, complemented by two six-person panels to address questions of astronomy and technology.
Astrometry: Beyond Microarcseconds
NASA Astrophysics Data System (ADS)
Kulkarni, Shrinivas
2009-05-01
The next decade will witness the flowering of astrometry. On the ground we are already reaping the benefits of adaptive optics, interferometry, and digital sky surveys. The precision of GAIA and SIM-Lite will usher in an age of astrometry at the level of tens of microarcseconds to microarcseconds. In this talk (meant to provoke and whet the appetite of the audience) the speaker will explore astrometry in the post-GAIA era. At the sub-microarcsecond level, the Universe is measurably not static. The speaker will address the basic technical and astronomical challenges and, of course, the scientific rewards of sub-microarcsecond astrometry.
Stochastic locality and master-field simulations of very large lattices
NASA Astrophysics Data System (ADS)
Lüscher, Martin
2018-03-01
In lattice QCD and other field theories with a mass gap, the field variables in distant regions of a physically large lattice are only weakly correlated. Accurate stochastic estimates of the expectation values of local observables may therefore be obtained from a single representative field. Such master-field simulations potentially allow very large lattices to be simulated, but require various conceptual and technical issues to be addressed. In this talk, an introduction to the subject is provided and some encouraging results of master-field simulations of the SU(3) gauge theory are reported.
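The estimation strategy the abstract describes, averaging a local observable over the translations of one physically large field and relying on the weak correlation of distant regions for the error estimate, can be illustrated on a toy scalar field. Everything below (the smoothed-noise "field", the observable, the lattice size) is an illustrative stand-in, not lattice QCD code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "master field": one large 2-D lattice with a short correlation
# length, built by smoothing white noise (a stand-in for a gauge field
# in a theory with a mass gap).
L = 512
noise = np.fft.fft2(rng.standard_normal((L, L)))
k = np.fft.fftfreq(L)
k2 = k[:, None]**2 + k[None, :]**2
field = np.real(np.fft.ifft2(noise / (1.0 + 500.0 * k2)))

# Local observable O(x) = phi(x)^2; its expectation value is estimated
# by a translation average over the single field.
O = field**2
estimate = O.mean()

# Error estimate from binning: blocks much larger than the correlation
# length are nearly independent, so the spread of block averages gives
# the statistical error of the single-field estimate.
b = 64
blocks = O.reshape(L // b, b, L // b, b).mean(axis=(1, 3))
error = blocks.std(ddof=1) / np.sqrt(blocks.size)
print(estimate, error)
```

The binned error is the point of "stochastic locality": one sufficiently large field supplies both the estimate and its uncertainty, with no ensemble of configurations required.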
2017-12-01
prepared for pilot-testing. Tasks are described using technical language when possible to facilitate discussion with people with tetraplegia and… you learn 'he's doing that, I'll try that'… We'd all talk about similar issues, similar things that are going on with us… When I see other people, you… caregivers when there are multiple people providing care over the day or week. 1. Set expectations early. "It's also key that you put it all out in front"
"Size Matters": Women in High Tech Start-Ups
NASA Astrophysics Data System (ADS)
Lackritz, Hilary
2001-03-01
For those who want constant excitement, change, and rapid opportunities to have an impact in the technical world, start-up companies offer wonderful challenges. This talk will focus realistically on rewards and risks in the start-up world. An outline of the differences between the high tech start-ups and the academic and consulting worlds from a personal viewpoint will be presented. Size usually does matter, and in this case, small size can equal independence, entrepreneurship, and other advantages that are hard to come by in Dilbert’s corporate world.
NASA Technical Reports Server (NTRS)
2004-01-01
KENNEDY SPACE CENTER, FLA. Dr. Gary Stutte explains to Paul Curto (right), chief technologist with NASA's Inventions and Contributions Board, the research being done in this plant growth chamber in the Space Life Sciences Lab. Stutte is a senior research scientist with Dynamac Corp. Curto is visiting KSC to talk to innovators and encourage workers to submit technologies for future Space Act Awards. The Inventions and Contributions Board, established in 1958, is a major contributor in rewarding outstanding scientific or technical contributions sponsored, adopted, supported, or used by NASA that are significant to aeronautics and space activities.
Wants Talk Psychotherapy but Cannot Talk
Guina, Cathryn
2018-01-01
While post-stroke depression (PSD) is a common sequela of stroke, many stroke survivors also have expressive aphasia (i.e., the inability to produce spoken or written language), which limits or prevents treating depression with talk psychotherapy. Unlike most psychotherapy modalities, eye movement desensitization and reprocessing (EMDR) does not require extensive verbal communication with therapists, which might make EMDR an ideal treatment modality for aphasic patients with mental health concerns. The authors present the first known case reporting EMDR in aphasia, describing the treatment of a 50-year-old woman with a history of depression following a left middle cerebral artery stroke. Left frontal lobe strokes are independently associated with both PSD and expressive aphasia. EMDR began two years following the stroke, at which point the patient continued to have persistent expressive aphasia despite previously completing more than a year of speech therapy. Using the Blind to Therapist Protocol, EMDR successfully led to improvement in depressive symptoms and, surprisingly, improvement in aphasia. This case report suggests that EMDR might be beneficial for those with mental health concerns who have expressive communication impairments that might prevent treatment with other psychotherapy modalities. We discuss potential challenges and technical workarounds with EMDR in aphasia, we speculate about potential biopsychosocial explanations for our results, and we recommend future research on EMDR for PSD and other mental health concerns in the context of aphasia, as well as possibly for aphasia itself. PMID:29497580
ERIC Educational Resources Information Center
Basnet, Kul Bahadur; Kim, Jinsoo
2010-01-01
The purpose of this study was to assess the Diploma in Computer Engineering (DCE) courses offered at affiliated schools of the Council for Technical Education and Vocational Training (CTEVT) with a focus on the goals of the curriculum and employment opportunities. Document analysis, questionnaires, focus group discussions and semi-structured…
ERIC Educational Resources Information Center
Van Norman, Ethan R.; Nelson, Peter M.; Parker, David C.
2017-01-01
Computer adaptive tests (CATs) hold promise to monitor student progress within multitiered systems of support. However, the relationship between how long and how often data are collected and the technical adequacy of growth estimates from CATs has not been explored. Given CAT administration times, it is important to identify optimal data…
ERIC Educational Resources Information Center
Capps, Joan P.
An instructional method using flow-chart symbols to make mathematical abstractions more concrete was implemented for a year in a technical mathematics course. Students received instruction in computer applications and programming in the BASIC language in order to increase motivation and firm the mathematical skills and problem-solving approaches…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-07
... asserted restrictions on technical data and computer software. DATES: Comments on the proposed rule should... restrictions on technical data and computer software. More specifically, the proposed rule affects these...) items (as defined at 41 U.S.C. 431(c)). Since COTS items are a subtype of commercial items, this change...
Spectrum orbit utilization program technical manual SOUP5 Version 3.8
NASA Technical Reports Server (NTRS)
Davidson, J.; Ottey, H. R.; Sawitz, P.; Zusman, F. S.
1984-01-01
The underlying engineering and mathematical models, as well as the computational methods used by the SOUP5 analysis programs, which are part of the R2BCSAT-83 Broadcast Satellite Computational System, are described. Included are the algorithms used to calculate the technical parameters and references to the relevant technical literature. The system provides the following capabilities: requirements file maintenance, data base maintenance, elliptical satellite beam fitting to service areas, plan synthesis from specified requirements, plan analysis, and report generation/query. Each of these functions is briefly described.
ERIC Educational Resources Information Center
Sproull, Lee; And Others
1996-01-01
Demonstrates that college students' responses to a talking-face computer interface differ from their responses to a text-display interface. In reaction to a humanlike interface, subjects attributed some personality traits to it, were more aroused by it, and tended to present themselves more positively. Gender differences in interface reactions…
Worldwide Emerging Environmental Issues Affecting the U.S. Military. May 2009
2009-05-01
devices can disable the power grid of an entire region; and research on computer-mediated telepathy such as Silent Talk might one day be used to intercept… Pentagon Preps Soldier Telepathy Push http://www.wired.com/dangerroom/2009/05/pentagon-preps-soldier-telepathy-push/ Item 2. New International
ERIC Educational Resources Information Center
Portnoy, Sean
2008-01-01
In this article, the author talks about Urban Tech, a New York City-based organization that works with schools to teach students the life skills it believes are necessary for academic achievement and workforce readiness. Its Youth Leadership Academy program uses a variety of computer-based, interactive elements to educate students on such topics…
NASA Technical Reports Server (NTRS)
2004-01-01
The proceedings of this symposium consist of abstracts of talks presented by interns at NASA Glenn Research Center (GRC). The interns assisted researchers at GRC in projects which primarily address the following topics: aircraft engines and propulsion, spacecraft propulsion, fuel cells, thin film photovoltaic cells, aerospace materials, computational fluid dynamics, aircraft icing, management, and computerized simulation.
ERIC Educational Resources Information Center
Akers, Anne Trice Thompson
2009-01-01
This qualitative study examined middle grades preservice language arts teachers' perceptions of young adult literature through the lenses of reader response, new literacy, and activity theory. Undergraduate preservice teachers used synchronous and asynchronous computer-mediated communication (CMC) to respond online to three young adult books with…
Women@Work: Listening to Gendered Relations of Power in Teachers' Talk about New Technologies.
ERIC Educational Resources Information Center
Jenson, Jennifer; Rose, Chloe Brushwood
2003-01-01
Examines teachers' working identities, highlighting gender inequities among teachers, within school systems, and in society, especially in relation to computers. Highlights tensions central to teaching in relation to new technologies, emphasizing gender inequities that structure understandings of teaching. Documents how, for the teachers studied,…
This paper presents the formulation and evaluation of a mechanistic mathematical model of fathead minnow ovarian steroidogenesis. The model presented in the present study was adapted from other models developed as part of an integrated, multi-disciplinary computational toxicolog...
Probing for quantum speedup on D-Wave Two
NASA Astrophysics Data System (ADS)
Rønnow, Troels F.; Wang, Zhihui; Job, Joshua; Isakov, Sergei V.; Boixo, Sergio; Lidar, Daniel; Martinis, John; Troyer, Matthias
2014-03-01
Quantum speedup refers to the advantage quantum devices can have over classical ones in solving classes of computational problems. In this talk we show how to correctly define and measure quantum speedup in experimental devices. We show how to avoid issues that might mask or fake quantum speedup.
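The comparison the talk describes hinges on time-to-solution, which folds each solver's per-run success probability into its wall-clock cost before any speedup ratio is taken; comparing raw run times alone can mask or fake a speedup. A minimal sketch of that bookkeeping, with invented timing and probability numbers purely for illustration:

```python
import math

def time_to_solution(t_run, p_success, target=0.99):
    """Expected total time to find the solution at least once with
    probability `target`, given the per-run success probability."""
    repeats = math.log(1.0 - target) / math.log(1.0 - p_success)
    return t_run * repeats

# Hypothetical scaling data: (per-run time, success probability)
# versus problem size N for a quantum and a classical solver.
sizes = [32, 64, 128, 256]
quantum = [(20e-6, p) for p in (0.9, 0.5, 0.1, 0.01)]
classical = [(1e-6 * n**2, 0.8) for n in sizes]

# Speedup is the ratio of times-to-solution; the meaningful question
# is how this ratio scales with N, not its value at any single size.
for n, (tq, pq), (tc, pc) in zip(sizes, quantum, classical):
    s = time_to_solution(tc, pc) / time_to_solution(tq, pq)
    print(f"N={n}: speedup {s:.3g}")
```

Note that a device that is faster per run but whose success probability collapses with problem size can show a speedup ratio that shrinks as N grows, which is exactly the asymptotic behavior the definition is designed to expose.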
Studies on a Spatialized Audio Interface for Sonar
2011-10-03
The addition of spatialized audio to visual displays for sonar is much akin to the development of talking movies in the early days of cinema and can be… than using the brute-force approach. PCA is one among several techniques that share similarities with the computational architecture of a
Making Cloud Computing Available For Researchers and Innovators (Invited)
NASA Astrophysics Data System (ADS)
Winsor, R.
2010-12-01
High Performance Computing (HPC) facilities exist in most academic institutions but are almost invariably over-subscribed. Access is allocated based on academic merit, the only practical method of assigning valuable finite compute resources. Cloud computing, on the other hand, and particularly commercial clouds, draws flexibly on an almost limitless resource as long as the user has sufficient funds to pay the bill. How can the commercial cloud model be applied to scientific computing? Is there a case to be made for a publicly available research cloud, and how would it be structured? This talk will explore these themes and describe how Cybera, a not-for-profit non-governmental organization in Alberta, Canada, aims to leverage its high-speed research and education network to provide cloud computing facilities for a much wider user base.
Introduction to computers: Reference guide
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ligon, F.V.
1995-04-01
The "Introduction to Computers" program establishes formal partnerships with local school districts and community-based organizations, introduces computer literacy to precollege students and their parents, and encourages students to pursue Scientific, Mathematical, Engineering, and Technical (SET) careers. Hands-on assignments are given in each class, reinforcing the lesson taught. In addition, the program is designed to broaden the knowledge base of teachers in scientific and technical concepts, and Brookhaven National Laboratory continues to act as a liaison, offering educational outreach to diverse community organizations and groups. This manual contains the teacher's lesson plans and the student documentation for this introductory computer course.
NASA Technical Reports Server (NTRS)
Taylor, N. L.
1983-01-01
In response to a need for improved computer-generated plots that are acceptable to the Langley publication process, the LaRC Graphics Output System has been modified to encompass the publication requirements, and a guideline has been established. This guideline deals only with the publication requirements of computer-generated plots. This report explains the capability that authors of NASA technical reports can use to obtain publication-quality computer-generated plots for the Langley publication process. The rules applied in developing this guideline and examples illustrating the rules are included.
Eigenspace perturbations for structural uncertainty estimation of turbulence closure models
NASA Astrophysics Data System (ADS)
Jofre, Lluis; Mishra, Aashwin; Iaccarino, Gianluca
2017-11-01
With the present state of computational resources, a purely numerical resolution of turbulent flows encountered in engineering applications is not viable. Consequently, investigations into turbulence rely on various degrees of modeling. Archetypal amongst these variable-resolution approaches would be RANS models in two-equation closures, and subgrid-scale models in LES. However, owing to the simplifications introduced during model formulation, the fidelity of all such models is limited, and therefore the explicit quantification of the predictive uncertainty is essential. In such a scenario, the ideal uncertainty estimation procedure must be agnostic to modeling resolution, methodology, and the nature or level of the model filter. The procedure should be able to give reliable prediction intervals for different Quantities of Interest, over varied flows and flow conditions, and at diametric levels of modeling resolution. In this talk, we present and substantiate the Eigenspace perturbation framework as an uncertainty estimation paradigm that meets these criteria. Commencing from a broad overview, we outline the details of this framework at different modeling resolutions. Then, using benchmark flows along with engineering problems, the efficacy of this procedure is established. This research was partially supported by NNSA under the Predictive Science Academic Alliance Program (PSAAP) II, and by DARPA under the Enabling Quantification of Uncertainty in Physical Systems (EQUiPS) project (technical monitor: Dr Fariba Fahroo).
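The core operation of eigenspace perturbation can be illustrated with a minimal sketch: the eigenvalues of the symmetric, trace-free Reynolds-stress anisotropy tensor are shifted a fraction of the way toward a limiting state while the eigenvectors are held fixed. The tensor values and the function name below are hypothetical illustrations, not taken from the cited work:

```python
import numpy as np

def perturb_anisotropy(b, delta, target_eigs):
    """Shift the eigenvalues of the anisotropy tensor b a fraction `delta`
    toward a limiting state (e.g. 1-, 2-, or 3-component turbulence),
    keeping the eigenvectors fixed."""
    eigvals, eigvecs = np.linalg.eigh(b)          # eigvals in ascending order
    perturbed = (1 - delta) * eigvals + delta * np.asarray(target_eigs)
    return eigvecs @ np.diag(perturbed) @ eigvecs.T

# A symmetric, trace-free anisotropy tensor (hypothetical values).
b = np.array([[ 0.10,  0.02,  0.00],
              [ 0.02, -0.04,  0.01],
              [ 0.00,  0.01, -0.06]])

# The isotropic (3-component) limiting state has eigenvalues (0, 0, 0);
# perturbing halfway toward it uniformly damps the anisotropy.
b_iso = perturb_anisotropy(b, 0.5, [0.0, 0.0, 0.0])
```

Sweeping `delta` between 0 and 1 toward each limiting state brackets the model prediction, which is how the framework produces interval estimates rather than point values.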
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
This technical note describes the current capabilities and availability of the Automated Dredging and Disposal Alternatives Management System (ADDAMS). The technical note replaces the earlier Technical Note EEDP-06-12, which should be discarded. Planning, design, and management of dredging and dredged material disposal projects often require complex or tedious calculations or involve complex decision-making criteria. In addition, the evaluations often must be done for several disposal alternatives or disposal sites. ADDAMS is a personal computer (PC)-based system developed to assist in making such evaluations in a timely manner. ADDAMS contains a collection of computer programs (applications) designed to assist in managing dredging projects. This technical note describes the system, currently available applications, mechanisms for acquiring and running the system, and provisions for revision and expansion.
Huang, Kun; Liu, Yunlong; Huang, Yufei; Li, Lang; Cooper, Lee; Ruan, Jianhua; Zhao, Zhongming
2016-08-22
We summarize the 2015 International Conference on Intelligent Biology and Medicine (ICIBM 2015) and the editorial report of the supplement to BMC Genomics. The supplement includes 20 research articles selected from the manuscripts submitted to ICIBM 2015. The conference was held on November 13-15, 2015 at Indianapolis, Indiana, USA. It included eight scientific sessions, three tutorials, four keynote presentations, three highlight talks, and a poster session that covered current research in bioinformatics, systems biology, computational biology, biotechnologies, and computational medicine.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Kandler A; Santhanagopalan, Shriram; Yang, Chuanbo
Computer models are helping to accelerate the design and validation of next generation batteries and provide valuable insights not possible through experimental testing alone. Validated 3-D physics-based models exist for predicting electrochemical performance, thermal and mechanical response of cells and packs under normal and abuse scenarios. The talk describes present efforts to make the models better suited for engineering design, including improving their computation speed, developing faster processes for model parameter identification including under aging, and predicting the performance of a proposed electrode material recipe a priori using microstructure models.
A prisoner's dilemma experiment on cooperation with people and human-like computers.
Kiesler, S; Sproull, L; Waters, K
1996-01-01
The authors investigated basic properties of social exchange and interaction with technology in an experiment on cooperation with a human-like computer partner or a real human partner. Talking with a computer partner may trigger social identity feelings or commitment norms. Participants played a prisoner's dilemma game with a confederate or a computer partner. Discussion, inducements to make promises, and partner cooperation varied across trials. On Trial 1, after discussion, most participants proposed cooperation. They kept their promises as much with a text-only computer as with a person, but less with a more human-like computer. Cooperation dropped sharply when any partner avoided discussion. The strong impact of discussion fits a social contract explanation of cooperation following discussion. Participants broke their promises to a computer more than to a person, however, indicating that people make heterogeneous commitments.
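For readers unfamiliar with the game used in the experiment, the one-shot prisoner's dilemma payoff structure can be sketched as follows; these particular payoff values are the conventional textbook choice (T > R > P > S), not necessarily those used in the study:

```python
# Standard prisoner's dilemma payoffs: temptation > reward > punishment > sucker.
PAYOFF = {('C', 'C'): (3, 3),   # mutual cooperation (R, R)
          ('C', 'D'): (0, 5),   # sucker's payoff vs temptation (S, T)
          ('D', 'C'): (5, 0),
          ('D', 'D'): (1, 1)}   # mutual defection (P, P)

def play(a, b):
    """Return the (row, column) payoffs for one round of moves a and b."""
    return PAYOFF[(a, b)]

# Keeping a promise to cooperate pays off only if the partner reciprocates;
# defecting against a cooperator is individually tempting.
assert play('C', 'C')[0] > play('D', 'D')[0]
assert play('D', 'C')[0] > play('C', 'C')[0]
```

The tension between those two inequalities is what makes kept promises, such as those measured in the experiment, informative about commitment rather than mere self-interest.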
Ethics, advertising and the definition of a profession.
Dyer, A R
1985-01-01
In the climate of concern about high medical costs, the relationship between the trade and professional aspects of medical practice is receiving close scrutiny. In the United Kingdom there is talk of increasing privatisation of health services, and in the United States the Federal Trade Commission (FTC) has attempted to define medicine as a trade for the purposes of commercial regulation. The Supreme Court recently upheld the FTC charge that the American Medical Association (AMA) has been in restraint of trade because of ethical strictures against advertising. The concept of profession, as it has been analyzed in sociological, legal, philosophical, and historical perspectives, reveals the importance of an ethic of service as well as technical expertise as defining characteristics of professions. It is suggested that the medical profession should pay more attention to its service ideal at this time when doctors are widely perceived to be technically preoccupied. PMID:4009637
Proprietary Manned Space Flight Proposals, 1973 to 2013, plus
NASA Astrophysics Data System (ADS)
Fisher, Philip
2016-03-01
In 1973 a concept for a manned space flight experiment was submitted to NASA as an unsolicited proprietary proposal.* In the 1998,* 2004,* and 2013* proposals, successively more details were provided. An abbreviation of the 1998 proposal was published. By 2013 the five technical variables of 1998 had increased to over ten. Some technical and management details of the proposals will be presented and updated. The first flight of two could use some hardware now being developed. The experiment seems superior to any mission publicly advocated by NASA, so this talk's purpose is to encourage NASA to delay landing humans on Mars until the first spacecraft can be developed and activated. *Complete proposals are in the Philip C. Fisher papers, Niels Bohr Library and Archives, American Institute of Physics (available one year after the author's death). Work after 1982 supported by successive forms of Ruffner Associates.
NASA Technical Reports Server (NTRS)
Baynes, Katie; Ramachandran, Rahul; Pilone, Dan; Quinn, Patrick; Gilman, Jason; Schuler, Ian; Jazayeri, Alireza
2017-01-01
NASA's Earth Observing System Data and Information System (EOSDIS) has been working toward a vision of a cloud-based, highly flexible ingest, archive, management, and distribution system for its ever-growing and evolving data holdings. This system, Cumulus, is emerging from its prototyping stages and is poised to make a huge impact on how NASA manages and disseminates its Earth science data. This talk will outline the motivation for this work, present the achievements and hurdles of the past 18 months, and chart a course for the future expansion of Cumulus. We will explore not just the technical but also the socio-technical challenges that we face in evolving a system of this magnitude into the cloud, and how we are rising to meet those challenges through open collaboration and intentional stakeholder engagement.
Physical Origins of Space Weather Impacts: Open Physics Questions
NASA Astrophysics Data System (ADS)
Lanzerotti, L. J.
2011-12-01
Beginning with the era of development of electrical telegraph systems in the early 19th century, physical processes in the space environment on the Sun, in the interplanetary medium, and around Earth have influenced the design and operations of increasingly numerous and sophisticated technical systems, both in space and on the ground. Understanding of Earth's space environment has increased enormously in the last century and a half. Nevertheless, many of the physical processes that produced effects on early cable and wireless technologies continue to plague modern-day systems. And as new technologies are developed for improved communications, surveillance, navigation, and conditions for human space flight, the solar-terrestrial environment often offers surprises to their safe, secure, and uninterrupted operations. This talk will address some of the challenges that I see to the successful operation of some modern-day technical systems, posed by significant deficiencies in understanding of physical processes operating from the Sun to the Earth.
48 CFR 209.505-4 - Obtaining access to proprietary information.
Code of Federal Regulations, 2011 CFR
2011-10-01
...) Non-disclosure requirements for contractors accessing third party proprietary technical data or... may be required to enter into non-disclosure agreements directly with the third party asserting restrictions on limited rights technical data, commercial technical data, or restricted rights computer...
ERIC Educational Resources Information Center
Hitchcock, A. Allen
The problem that this practicum attempted to solve was that students in a vocational-technical college tended to underachieve in courses that were mainly cognitive in nature, as evidenced by low overall grade-point course averages and other measures. The researcher designed computer-based simulation/gaming instruction that aimed to increase…
ERIC Educational Resources Information Center
Ausburn, Lynna J.; Ausburn, Floyd B.; Kroutter, Paul J.
2013-01-01
This study used a cross-case analysis methodology to compare four line-of-inquiry studies of desktop virtual environments (DVEs) to examine the relationships of gender and computer gaming experience to learning performance and perceptions. Comparison was made of learning patterns in a general non-technical DVE with patterns in technically complex,…
ERIC Educational Resources Information Center
HANKIN, EDWARD K.; AND OTHERS
This technical progress report covers the first three months of a project to develop computer-assisted prevocational reading and arithmetic courses for disadvantaged youths and adults. During the first month of operation, project personnel concentrated on such administrative matters as training staff and preparing facilities. An arithmetic program…
ERIC Educational Resources Information Center
Tuttle, Francis
Twenty-three instructors participated in an 8-week summer institute to develop their technical competency to teach the second year of a 2-year Technical Education Computer Science Program. Instructional material covered the following areas: (1) compiler languages and systems design, (2) cost studies, (3) business organization, (4) advanced…
IDEA Technical Report No. 4. Description of IDEA Standard Form Data Base.
ERIC Educational Resources Information Center
Cashin, William E.; Perrin, Bruce M.
The data and computational procedures used by the IDEA System to generate IDEA Reports from information collected on the Standard Form of the IDEA Survey Form are described in this technical report. The computations for each of the seven parts of the IDEA Report are explained. The data base used for this 1978-79 Kansas State University study…
ERIC Educational Resources Information Center
Holbrook, M. Cay; MacCuspie, P. Ann
2010-01-01
Braille-reading mathematicians, scientists, and computer scientists were asked to examine the usability of the Unified English Braille Code (UEB) for technical materials. They had little knowledge of the code prior to the study. The research included two reading tasks, a short tutorial about UEB, and a focus group. The results indicated that the…
117. Back side technical facilities S.R. radar transmitter & computer ...
117. Back side technical facilities S.R. radar transmitter & computer building no. 102, "building sections - sheet I" - architectural, AS-BLT AW 35-46-04, sheet 12, dated 23 January, 1961. - Clear Air Force Station, Ballistic Missile Early Warning System Site II, One mile west of mile marker 293.5 on Parks Highway, 5 miles southwest of Anderson, Anderson, Denali Borough, AK
122. Back side technical facilities S.R. radar transmitter & computer ...
122. Back side technical facilities S.R. radar transmitter & computer building no. 102, section II "elevations & details" - structural, AS-BLT AW 35-46-04, sheet 73, dated 23 January, 1961. - Clear Air Force Station, Ballistic Missile Early Warning System Site II, One mile west of mile marker 293.5 on Parks Highway, 5 miles southwest of Anderson, Anderson, Denali Borough, AK
118. Back side technical facilities S.R. radar transmitter & computer ...
118. Back side technical facilities S.R. radar transmitter & computer building no. 102, "building sections - sheet I" - architectural, AS-BLT AW 35-46-04, sheet 13, dated 23 January, 1961. - Clear Air Force Station, Ballistic Missile Early Warning System Site II, One mile west of mile marker 293.5 on Parks Highway, 5 miles southwest of Anderson, Anderson, Denali Borough, AK
121. Back side technical facilities S.R. radar transmitter & computer ...
121. Back side technical facilities S.R. radar transmitter & computer building no. 102, section II "sections & elevations" - structural, AS-BLT AW 35-46-04, sheet 72, dated 23 January, 1961. - Clear Air Force Station, Ballistic Missile Early Warning System Site II, One mile west of mile marker 293.5 on Parks Highway, 5 miles southwest of Anderson, Anderson, Denali Borough, AK
Medrela-Kuder, Ewa
2011-01-01
The aim of the study was to evaluate the dietary habits and physical activity of Physiotherapy and Technical & Computer Science students. The research involved a group of 174 non-full-time students of higher education institutions in Krakow, aged between 22 and 27. Of those surveyed, 81 students studied Physiotherapy at the University of Physical Education, whereas 93 followed a course in Technical & Computer Science at the Pedagogical University. In this project a diagnostic survey method was used. The study revealed that the lifestyle of university youth left much to be desired. Dietary errors were exemplified by irregular meal intake, low consumption of fish, milk, and dairy products, and snacking between meals on high-calorie products with poor nutrient content. With regard to physical activity, Physiotherapy students were characterised by more positive attitudes than those from Technical & Computer Science. Such physical activity forms as swimming, team sports, cycling, and strolling were declared most frequently by those surveyed. Health-oriented education should be introduced in such a way as to improve knowledge pertaining to a health-promoting lifestyle as a means of preventing numerous diseases.
48 CFR 12.212 - Computer software.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 48 Federal Acquisition Regulations System 1 2012-10-01 2012-10-01 false Computer software. 12.212... software. (a) Commercial computer software or commercial computer software documentation shall be acquired... required to— (1) Furnish technical information related to commercial computer software or commercial...
48 CFR 12.212 - Computer software.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Computer software. 12.212... software. (a) Commercial computer software or commercial computer software documentation shall be acquired... required to— (1) Furnish technical information related to commercial computer software or commercial...
48 CFR 12.212 - Computer software.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 48 Federal Acquisition Regulations System 1 2014-10-01 2014-10-01 false Computer software. 12.212... software. (a) Commercial computer software or commercial computer software documentation shall be acquired... required to— (1) Furnish technical information related to commercial computer software or commercial...
48 CFR 12.212 - Computer software.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Computer software. 12.212... software. (a) Commercial computer software or commercial computer software documentation shall be acquired... required to— (1) Furnish technical information related to commercial computer software or commercial...
48 CFR 12.212 - Computer software.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 48 Federal Acquisition Regulations System 1 2013-10-01 2013-10-01 false Computer software. 12.212... software. (a) Commercial computer software or commercial computer software documentation shall be acquired... required to— (1) Furnish technical information related to commercial computer software or commercial...
32 CFR 701.38 - Technical data.
Code of Federal Regulations, 2013 CFR
2013-07-01
... OFFICIAL RECORDS AVAILABILITY OF DEPARTMENT OF THE NAVY RECORDS AND PUBLICATION OF DEPARTMENT OF THE NAVY DOCUMENTS AFFECTING THE PUBLIC FOIA Definitions and Terms § 701.38 Technical data. Recorded information, regardless of form or method of the recording, of a scientific or technical nature (including computer...
32 CFR 701.38 - Technical data.
Code of Federal Regulations, 2012 CFR
2012-07-01
... OFFICIAL RECORDS AVAILABILITY OF DEPARTMENT OF THE NAVY RECORDS AND PUBLICATION OF DEPARTMENT OF THE NAVY DOCUMENTS AFFECTING THE PUBLIC FOIA Definitions and Terms § 701.38 Technical data. Recorded information, regardless of form or method of the recording, of a scientific or technical nature (including computer...
32 CFR 701.38 - Technical data.
Code of Federal Regulations, 2014 CFR
2014-07-01
... OFFICIAL RECORDS AVAILABILITY OF DEPARTMENT OF THE NAVY RECORDS AND PUBLICATION OF DEPARTMENT OF THE NAVY DOCUMENTS AFFECTING THE PUBLIC FOIA Definitions and Terms § 701.38 Technical data. Recorded information, regardless of form or method of the recording, of a scientific or technical nature (including computer...
Management of technical data in Nihon Doro Kodan
NASA Astrophysics Data System (ADS)
Hanada, Jun'ichi
The Nihon Doro Kodan Laboratory has collected and contributed technical data (microfiches, aerial photographs, books, and literature) on the planning, design, construction, and maintenance of the national expressways and the ordinary toll roads since 1968. This work has been computerized so that data can be retrieved and delivered faster. The Laboratory now operates a Technical Data Management System, which manages all technical data, and a Technical Document Management System, which manages technical documents. These systems rest on users' on-line retrieval and on data accumulation via microfiches and optical disks.
Bibliography--Unclassified Technical Reports, Special Reports, and Technical Notes: FY 1982.
1982-11-01
in each category are listed in chronological order under seven areas: manpower management, personnel administration, organization management, education...7633). Technical reports listed that have unlimited distribution can also be obtained from the National Technical Information Service, 5285 Port Royal...simulations of manpower systems. This research exploits the technology of computer-managed large-scale data bases. PERSONNEL ADMINISTRATION The personnel