Public Domain Microcomputer Software for Forestry.
ERIC Educational Resources Information Center
Martin, Les
A project was conducted to develop a computer forestry/forest products bibliography applicable to high school and community college vocational/technical programs. The project director contacted curriculum clearinghouses, computer companies, and high school and community college instructors in order to obtain listings of public domain programs for…
The Computer Revolution and Physical Chemistry.
ERIC Educational Resources Information Center
O'Brien, James F.
1989-01-01
Describes laboratory-oriented software programs that are short and time-saving, eliminate computational errors, and are not found in public domain courseware. Program availability for IBM and Apple microcomputers is included. (RT)
Code of Federal Regulations, 2011 CFR
2011-07-01
... pertaining to computer shareware and donation of public domain computer software. 201.26 Section 201.26... public domain computer software. (a) General. This section prescribes the procedures for submission of legal documents pertaining to computer shareware and the deposit of public domain computer software...
Code of Federal Regulations, 2010 CFR
2010-07-01
... pertaining to computer shareware and donation of public domain computer software. 201.26 Section 201.26... public domain computer software. (a) General. This section prescribes the procedures for submission of legal documents pertaining to computer shareware and the deposit of public domain computer software...
Code of Federal Regulations, 2013 CFR
2013-07-01
... pertaining to computer shareware and donation of public domain computer software. 201.26 Section 201.26... public domain computer software. (a) General. This section prescribes the procedures for submission of legal documents pertaining to computer shareware and the deposit of public domain computer software...
Code of Federal Regulations, 2012 CFR
2012-07-01
... pertaining to computer shareware and donation of public domain computer software. 201.26 Section 201.26... public domain computer software. (a) General. This section prescribes the procedures for submission of legal documents pertaining to computer shareware and the deposit of public domain computer software...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Orvis, W.J.
1993-11-03
The Computer Incident Advisory Capability (CIAC) operates two information servers for the DOE community, FELICIA (formerly FELIX) and IRBIS. FELICIA is a computer Bulletin Board System (BBS) that can be accessed by telephone with a modem. IRBIS is an anonymous ftp server that can be accessed on the Internet. Both of these servers contain all of the publicly available CIAC, CERT, NIST, and DDN bulletins, virus descriptions, the VIRUS-L moderated virus bulletin board, copies of public domain and shareware virus-detection/protection software, and copies of useful public domain and shareware utility programs. This guide describes how to connect to these systems and obtain files from them.
ERIC Educational Resources Information Center
Primich, Tracy
1992-01-01
Discusses computer viruses that attack the Macintosh and describes Symantec AntiVirus for Macintosh (SAM), a commercial program designed to detect and eliminate viruses; sample screen displays are included. SAM is recommended for use in library settings as well as two public domain virus protection programs. (four references) (MES)
NASA Technical Reports Server (NTRS)
Saltsman, James F.
1992-01-01
This manual presents computer programs for characterizing and predicting fatigue and creep-fatigue resistance of metallic materials in the high-temperature, long-life regime for isothermal and nonisothermal fatigue. The programs use the total strain version of Strainrange Partitioning (TS-SRP). An extensive database has also been developed in a parallel effort. This database is probably the largest source of high-temperature, creep-fatigue test data available in the public domain and can be used with other life prediction methods as well. This user's manual, software, and database are all in the public domain and are available through COSMIC (382 East Broad Street, Athens, GA 30602; (404) 542-3265, FAX (404) 542-4807). Two disks accompany this manual. The first disk contains the source code, executable files, and sample output from these programs. The second disk contains the creep-fatigue data in a format compatible with these programs.
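To make the TS-SRP life relation concrete, here is a minimal sketch (in Python, not the distributed Fortran) of inverting a total strainrange versus life curve of the form commonly used in the total-strain version of Strainrange Partitioning. The coefficients are hypothetical placeholders, not values from the manual or database.

```python
# Hedged sketch: TS-SRP is often summarized by a total strainrange of the form
#   d_eps_total = B*N**b + C'*N**c
# where (B, b) describe the elastic line and (C', c) the inelastic line,
# both fitted to creep-fatigue data. All constants below are hypothetical.
from scipy.optimize import brentq

B, b = 0.01, -0.08    # hypothetical elastic-line coefficients
Cp, c = 0.25, -0.60   # hypothetical inelastic-line coefficients

def total_strainrange(N):
    """Total strainrange predicted for a life of N cycles."""
    return B * N**b + Cp * N**c

def predicted_life(d_eps_t):
    """Invert the total-strain life relation by root finding."""
    return brentq(lambda N: total_strainrange(N) - d_eps_t, 1.0, 1e9)

print(predicted_life(0.005))  # cycles to failure at 0.5% total strainrange
```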
Code of Federal Regulations, 2014 CFR
2014-07-01
... pertaining to computer shareware and donation of public domain computer software. 201.26 Section 201.26... of public domain computer software. (a) General. This section prescribes the procedures for... software under section 805 of Public Law 101-650, 104 Stat. 5089 (1990). Documents recorded in the...
The open-source, public domain JUPITER (Joint Universal Parameter IdenTification and Evaluation of Reliability) API (Application Programming Interface) provides conventions and Fortran-90 modules to develop applications (computer programs) for analyzing process models. The input ...
Software For Computing Reliability Of Other Software
NASA Technical Reports Server (NTRS)
Nikora, Allen; Antczak, Thomas M.; Lyu, Michael
1995-01-01
Computer Aided Software Reliability Estimation (CASRE) computer program developed for use in measuring reliability of other software. Easier for non-specialists in reliability to use than many other currently available programs developed for same purpose. CASRE incorporates mathematical modeling capabilities of public-domain Statistical Modeling and Estimation of Reliability Functions for Software (SMERFS) computer program and runs in Windows software environment. Provides menu-driven command interface; enabling and disabling of menu options guides user through (1) selection of set of failure data, (2) execution of mathematical model, and (3) analysis of results from model. Written in C language.
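For orientation, a hedged sketch of the kind of reliability-growth fit such tools perform, using the Goel-Okumoto model on hypothetical failure counts. This is an illustration of the technique, not CASRE's or SMERFS's actual model set or code.

```python
# Hedged sketch: fit the Goel-Okumoto NHPP model m(t) = a*(1 - exp(-b*t))
# to hypothetical cumulative failure counts with a least-squares fit.
import numpy as np
from scipy.optimize import curve_fit

t = np.array([10, 20, 30, 40, 50, 60, 70, 80], dtype=float)   # test hours
k = np.array([ 8, 14, 19, 22, 25, 26, 28, 29], dtype=float)   # cumulative failures

def goel_okumoto(t, a, b):
    return a * (1.0 - np.exp(-b * t))

(a, b), _ = curve_fit(goel_okumoto, t, k, p0=(30.0, 0.05))
print(f"expected total faults a = {a:.1f}, detection rate b = {b:.3f}")
print(f"predicted failure intensity at t=100 h: {a*b*np.exp(-b*100):.3f}/h")
```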
Langevin, Christian D.
2009-01-01
SEAWAT is a MODFLOW-based computer program designed to simulate variable-density groundwater flow coupled with multi-species solute and heat transport. The program has been used for a wide variety of groundwater studies including saltwater intrusion in coastal aquifers, aquifer storage and recovery in brackish limestone aquifers, and brine migration within continental aquifers. SEAWAT is relatively easy to apply because it uses the familiar MODFLOW structure. Thus, most commonly used pre- and post-processors can be used to create datasets and visualize results. SEAWAT is a public domain computer program distributed free of charge by the U.S. Geological Survey.
Hypercluster Parallel Processor
NASA Technical Reports Server (NTRS)
Blech, Richard A.; Cole, Gary L.; Milner, Edward J.; Quealy, Angela
1992-01-01
Hypercluster computer system includes multiple digital processors, operation of which coordinated through specialized software. Configurable according to various parallel-computing architectures of shared-memory or distributed-memory class, including scalar computer, vector computer, reduced-instruction-set computer, and complex-instruction-set computer. Designed as flexible, relatively inexpensive system that provides single programming and operating environment within which one can investigate effects of various parallel-computing architectures and combinations on performance in solution of complicated problems like those of three-dimensional flows in turbomachines. Hypercluster software and architectural concepts are in public domain.
Large Scale Portability of Hospital Information System Software
Munnecke, Thomas H.; Kuhn, Ingeborg M.
1986-01-01
As part of its Decentralized Hospital Computer Program (DHCP) the Veterans Administration installed new hospital information systems in 169 of its facilities during 1984 and 1985. The application software for these systems is based on the ANS MUMPS language, is public domain, and is designed to be operating system and hardware independent. The software, developed by VA employees, is built upon a layered approach, where application packages layer on a common data dictionary which is supported by a Kernel of software. Communications between facilities are based on public domain Department of Defense ARPA net standards for domain naming, mail transfer protocols, and message formats, layered on a variety of communications technologies.
SPREADSHEET BASED SCALING CALCULATIONS AND MEMBRANE PERFORMANCE
Many membrane element manufacturers provide a computer program to aid buyers in the use of their elements. However, to date there are few examples of fully integrated public domain software available for calculating reverse osmosis and nanofiltration system performance. The Total...
NASA Technical Reports Server (NTRS)
Arya, Vinod K.; Halford, Gary R. (Technical Monitor)
2003-01-01
This manual presents the computer programs FLAPS for characterizing and predicting fatigue and creep-fatigue resistance of metallic materials in the high-temperature, long-life regime for isothermal and nonisothermal fatigue. The programs use the Total Strain version of Strainrange Partitioning (TS-SRP), and several other life prediction methods described in this manual. The user should be thoroughly familiar with the TS-SRP and these life prediction methods before attempting to use any of these programs. Improper understanding can lead to incorrect use of the method and erroneous life predictions. An extensive database has also been developed in a parallel effort. The database is probably the largest source of high-temperature, creep-fatigue test data available in the public domain and can be used with other life-prediction methods as well. This users' manual, software, and database are all in the public domain and can be obtained by contacting the author. The Compact Disk (CD) accompanying this manual contains an executable file for the FLAPS program, two datasets required for the example problems in the manual, and the creep-fatigue data in a format compatible with these programs.
Theoretical basis of the DOE-2 building energy use analysis program
NASA Astrophysics Data System (ADS)
Curtis, R. B.
1981-04-01
A user-oriented, public domain computer program was developed that enables architects and engineers to perform design and retrofit studies of the energy use of buildings under realistic weather conditions. DOE-2.1A has been named by the US DOE as the standard evaluation technique for the Congressionally mandated building energy performance standards (BEPS). A number of program design decisions were made that determine the breadth of applicability of DOE-2.1. Such design decisions are intrinsic to all building energy use analysis computer programs and determine the types of buildings or the kinds of HVAC systems that can be modeled. In particular, the weighting factor method used in DOE-2 has both advantages and disadvantages relative to other computer programs.
NASA Astrophysics Data System (ADS)
Reimer, Ashton S.; Cheviakov, Alexei F.
2013-03-01
A Matlab-based finite-difference numerical solver for the Poisson equation for a rectangle and a disk in two dimensions, and a spherical domain in three dimensions, is presented. The solver is optimized for handling an arbitrary combination of Dirichlet and Neumann boundary conditions, and allows for full user control of mesh refinement. The solver routines utilize effective and parallelized sparse vector and matrix operations. Computations exhibit high speeds, numerical stability with respect to mesh size and mesh refinement, and acceptable error values even on desktop computers. Catalogue identifier: AENQ_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AENQ_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public License v3.0 No. of lines in distributed program, including test data, etc.: 102793 No. of bytes in distributed program, including test data, etc.: 369378 Distribution format: tar.gz Programming language: Matlab 2010a. Computer: PC, Macintosh. Operating system: Windows, OSX, Linux. RAM: 8 GB (8,589,934,592 bytes) Classification: 4.3. Nature of problem: To solve the Poisson problem in a standard domain with “patchy surface”-type (strongly heterogeneous) Neumann/Dirichlet boundary conditions. Solution method: Finite difference with mesh refinement. Restrictions: Spherical domain in 3D; rectangular domain or a disk in 2D. Unusual features: Choice between mldivide/iterative solver for the solution of the large systems of linear algebraic equations that arise. Full user control of Neumann/Dirichlet boundary conditions and mesh refinement. Running time: Depending on the number of points taken and the geometry of the domain, the routine may take from less than a second to several hours to execute.
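A minimal sketch of the underlying technique, a 5-point finite-difference Poisson solve with homogeneous Dirichlet conditions, written in Python with SciPy rather than the authors' Matlab, and without the mesh refinement or mixed Neumann/Dirichlet handling of the actual solver.

```python
# Hedged sketch (not the authors' code): 5-point Poisson solve on the unit
# square with homogeneous Dirichlet boundaries, via a sparse direct solve.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

n = 64                                       # interior grid points per side
h = 1.0 / (n + 1)
I = sp.identity(n)
T = sp.diags([1, -2, 1], [-1, 0, 1], shape=(n, n))
A = (sp.kron(I, T) + sp.kron(T, I)) / h**2   # discrete 2-D Laplacian

x = np.arange(1, n + 1) * h
X, Y = np.meshgrid(x, x)
f = -2 * np.pi**2 * np.sin(np.pi * X) * np.sin(np.pi * Y)  # right-hand side

u = spsolve(A.tocsr(), f.ravel()).reshape(n, n)
print(abs(u - np.sin(np.pi * X) * np.sin(np.pi * Y)).max())  # discretization error
```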
DOT National Transportation Integrated Search
1975-12-01
Frequency domain computer programs developed or acquired by TSC for the analysis of rail vehicle dynamics are described in two volumes. Volume 2 contains program listings including subroutines for the four TSC frequency domain programs described in V...
A distributed version of the NASA Engine Performance Program
NASA Technical Reports Server (NTRS)
Cours, Jeffrey T.; Curlett, Brian P.
1993-01-01
Distributed NEPP, a version of the NASA Engine Performance Program, uses the original NEPP code but executes it in a distributed computing environment. Multiple workstations connected by a network increase the program's speed and, more importantly, the complexity of the cases it can handle in a reasonable time. Distributed NEPP uses the public domain software package Parallel Virtual Machine (PVM), allowing it to execute on clusters of machines containing many different architectures. It includes the capability to link with other computers, allowing them to process NEPP jobs in parallel. This paper discusses the design issues and granularity considerations that entered into programming Distributed NEPP and presents the results of timing runs.
NASA Technical Reports Server (NTRS)
Bown, Rodney L.
1991-01-01
Software engineering ethics is reviewed. The following subject areas are covered: lack of a system viewpoint; arrogance of PC DOS software vendors; violation of upward compatibility; internet worm; internet worm revisited; student cheating and company hiring interviews; computing practitioners and the commodity market; new projects and old programming languages; schedule and budget; and recent public domain comments.
DOT National Transportation Integrated Search
1975-12-01
Frequency domain computer programs developed or acquired by TSC for the analysis of rail vehicle dynamics are described in two volumes. Volume I defines the general analytical capabilities required for computer programs applicable to single rail vehi...
A statistical package for computing time and frequency domain analysis
NASA Technical Reports Server (NTRS)
Brownlow, J.
1978-01-01
The spectrum analysis (SPA) program is a general purpose digital computer program designed to aid in data analysis. The program does time and frequency domain statistical analyses as well as some preanalysis data preparation. The capabilities of the SPA program include linear trend removal and/or digital filtering of data, plotting and/or listing of both filtered and unfiltered data, time domain statistical characterization of data, and frequency domain statistical characterization of data.
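A brief sketch of the style of processing described (linear trend removal, digital filtering, then time- and frequency-domain statistical characterization), assuming NumPy/SciPy. This is an analogy for illustration, not the SPA code.

```python
# Hedged sketch of SPA-style preanalysis and characterization.
import numpy as np
from scipy import signal

fs = 200.0                                   # hypothetical sample rate, Hz
t = np.arange(0, 10, 1 / fs)
x = 0.3 * t + np.sin(2 * np.pi * 12 * t) + 0.5 * np.random.randn(t.size)

x = signal.detrend(x, type='linear')         # linear trend removal
b, a = signal.butter(4, 40 / (fs / 2))       # low-pass digital filter
x = signal.filtfilt(b, a, x)

print("mean, std, rms:", x.mean(), x.std(), np.sqrt(np.mean(x**2)))
f, pxx = signal.welch(x, fs=fs, nperseg=512)  # frequency-domain characterization
print("dominant frequency:", f[pxx.argmax()], "Hz")
```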
Epi info - present and future.
Su, Y; Yoon, S S
2003-01-01
Epi Info is a suite of public domain computer programs for public health professionals developed by the Centers for Disease Control and Prevention (CDC). Epi Info is used for rapid questionnaire design, data entry and validation, data analysis including mapping and graphing, and creation of reports. Epi Info was originally created in 1985 using Turbo Pascal. In 1998, the last version of Epi Info for DOS, version 6, was released. Epi Info for DOS is currently supported by CDC but is no longer updated. The current version, Epi Info 2002, is Windows-based software developed using Microsoft Visual Basic. Approximately 300,000 downloads of Epi Info software occurred in 2002 from approximately 130 countries. These numbers make Epi Info probably one of the most widely distributed and used public domain programs in the world. The DOS version of Epi Info was translated into 13 languages, and efforts are underway to translate the Windows version into other major languages. Versions already exist for Spanish, French, Portuguese, Chinese, Japanese, and Arabic.
Software Descriptions. Micro-computers: Atari, Apple, PET, TRS-80.
ERIC Educational Resources Information Center
Olivero, James L.
Each of the more than 200 educational software programs developed by both commercial and public domain sources which are described is intended for use with one of the four microcomputers most frequently used for instructional purposes--Atari, Apple, PET, and TRS-80. These descriptions are offered as a service for those who are just beginning to…
The purpose of this document is to introduce the use of the ground water geohydrology computer program WhAEM for Microsoft Windows (32-bit), or WhAEM2000. WhAEM2000 is a public domain, ground-water flow model designed to facilitate capture zone delineation and protection area map...
Computer Aided Method for System Safety and Reliability Assessments
2008-09-01
program between 1998 and 2003. This tool was not marketed in the public domain after the CRV program ended. The other tool is called eXpress, and it...support Government reviewed and approved analysis methodologies which can then be shared with other government agencies and industry partners...
CALNPS: Computer Analysis Language Naval Postgraduate School Version
1989-06-01
The graphics capabilities were expanded to include hard copy options using the PLOT10 and DISSPLA graphics libraries. The display options are ... Approved for public release; distribution is unlimited. CALNPS Computer Analysis... are now available and the user now has the capability to plot curves from data files from within the CALNPS domain. As CALNPS is a very large program
Public-domain-software solution to data-access problems for numerical modelers
Jenter, Harry; Signell, Richard
1992-01-01
Unidata's network Common Data Form, netCDF, provides users with an efficient set of software for scientific-data-storage, retrieval, and manipulation. The netCDF file format is machine-independent, direct-access, self-describing, and in the public domain, thereby alleviating many problems associated with accessing output from large hydrodynamic models. NetCDF has programming interfaces in both the Fortran and C computer languages, with an interface to C++ planned for release in the future. NetCDF also has an abstract data type that relieves users from understanding details of the binary file structure; data are written and retrieved by an intuitive, user-supplied name rather than by file position. Users are aided further by Unidata's inclusion of the Common Data Language, CDL, a printable text-equivalent of the contents of a netCDF file. Unidata provides numerous operators and utilities for processing netCDF files. In addition, a number of public-domain and proprietary netCDF utilities from other sources are available at this time or will be available later this year. The U.S. Geological Survey has produced and is producing a number of public-domain netCDF utilities.
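The access-by-name idea is easy to illustrate. The sketch below uses the later netCDF4 Python binding (the abstract describes the Fortran and C interfaces); the file and variable names are hypothetical.

```python
# Hedged sketch: write and read a self-describing netCDF variable by name.
from netCDF4 import Dataset
import numpy as np

with Dataset("model_output.nc", "w") as nc:           # hypothetical file name
    nc.createDimension("time", None)                  # unlimited dimension
    nc.createDimension("node", 100)
    eta = nc.createVariable("sea_surface_height", "f8", ("time", "node"))
    eta.units = "meters"                              # self-describing metadata
    eta[0, :] = np.zeros(100)                         # write by name, not file offset

with Dataset("model_output.nc") as nc:                # retrieve by variable name
    print(nc.variables["sea_surface_height"].units)
```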
The purpose of this document is to introduce through a case study the use of the ground water geohydrology computer program WhAEM for Microsoft Windows (32-bit), or WhAEM2000. WhAEM2000 is a public domain, ground-water flow model designed to facilitate capture zone delineation an...
McLanahan, L.O.
1991-01-01
The U.S. Geological Survey (USGS) was established by an act of Congress on March 3, 1879, to provide a permanent Federal agency to conduct the systematic and scientific 'classification of the public lands, and examination of the geological structure, mineral resources, and products of national domain'. Since 1879, the research and fact-finding role of the USGS has grown and has been modified to meet the changing needs of the Nation it serves. Moneys for program operation of the USGS in Pennsylvania come from joint-funding agreements with State and local agencies, transfer of funds from other Federal agencies, and direct Federal allotments to the USGS. Funding is distributed among the following programs: National Water Quality Assessment; water quality programs; surface water programs; groundwater programs; logging and geophysical services; computer services; scientific publication and information; hydrologic investigations; and hydrologic surveillance. (Lantz-PTT)
Crosswords to computers: a critical review of popular approaches to cognitive enhancement.
Jak, Amy J; Seelye, Adriana M; Jurick, Sarah M
2013-03-01
Cognitive enhancement strategies have gained recent popularity and have the potential to benefit clinical and non-clinical populations. As technology advances and the number of cognitively healthy adults seeking methods of improving or preserving cognitive functioning grows, the role of electronic (e.g., computer and video game based) cognitive training becomes more relevant and warrants greater scientific scrutiny. This paper serves as a critical review of empirical evaluations of publicly available electronic cognitive training programs. Many studies have found that electronic training approaches result in significant improvements in trained cognitive tasks. Fewer studies have demonstrated improvements in untrained tasks within the trained cognitive domain, non-trained cognitive domains, or on measures of everyday function. Successful cognitive training programs will elicit effects that generalize to untrained, practical tasks for extended periods of time. Unfortunately, many studies of electronic cognitive training programs are hindered by methodological limitations such as lack of an adequate control group, long-term follow-up and ecologically valid outcome measures. Despite these limitations, evidence suggests that computerized cognitive training has the potential to positively impact one's sense of social connectivity and self-efficacy.
Stream temperature investigations: field and analytic methods
Bartholow, J.M.
1989-01-01
Alternative public domain stream and reservoir temperature models are contrasted with SNTEMP. A distinction is made between steady-flow and dynamic-flow models and their respective capabilities. Regression models are offered as an alternative approach for some situations, with appropriate mathematical formulas suggested. Appendices provide information on State and Federal agencies that are good data sources, vendors for field instrumentation, and small computer programs useful in data reduction.
User interfaces for computational science: A domain specific language for OOMMF embedded in Python
NASA Astrophysics Data System (ADS)
Beg, Marijan; Pepper, Ryan A.; Fangohr, Hans
2017-05-01
Computer simulations are used widely across the engineering and science disciplines, including in the research and development of magnetic devices using computational micromagnetics. In this work, we identify and review different approaches to configuring simulation runs: (i) the re-compilation of source code, (ii) the use of configuration files, (iii) the graphical user interface, and (iv) embedding the simulation specification in an existing programming language to express the computational problem. We identify the advantages and disadvantages of different approaches and discuss their implications on effectiveness and reproducibility of computational studies and results. Following on from this, we design and describe a domain specific language for micromagnetics that is embedded in the Python language, and allows users to define the micromagnetic simulations they want to carry out in a flexible way. We have implemented this micromagnetic simulation description language together with a computational backend that executes the simulation task using the Object Oriented MicroMagnetic Framework (OOMMF). We illustrate the use of this Python interface for OOMMF by solving the micromagnetic standard problem 4. All the code is publicly available and is open source.
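A sketch in the general style of the embedded domain specific language the paper describes, setting up a standard-problem-like micromagnetic simulation. The module and class names below are assumptions for illustration and may differ from the released interface; consult the package documentation.

```python
# Hedged sketch of a Python-embedded micromagnetic DSL driving OOMMF.
# Names (oommfc, discretisedfield, micromagneticmodel) are assumed here.
import oommfc as oc
import discretisedfield as df
import micromagneticmodel as mm

mesh = df.Mesh(region=df.Region(p1=(0, 0, 0), p2=(500e-9, 125e-9, 3e-9)),
               cell=(5e-9, 5e-9, 3e-9))

system = mm.System(name="stdprob4")
system.energy = mm.Exchange(A=1.3e-11) + mm.Demag()   # energy terms compose
system.m = df.Field(mesh, dim=3, value=(1, 0.25, 0.1), norm=8e5)

td = oc.TimeDriver()             # hands the specification to the OOMMF backend
td.drive(system, t=1e-9, n=200)
```

The appeal of this approach, as the abstract argues, is that the full Python language is available for parameter sweeps and post-processing around the simulation specification.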
Public health program capacity for sustainability: a new framework.
Schell, Sarah F; Luke, Douglas A; Schooley, Michael W; Elliott, Michael B; Herbers, Stephanie H; Mueller, Nancy B; Bunger, Alicia C
2013-02-01
Public health programs can only deliver benefits if they are able to sustain activities over time. There is a broad literature on program sustainability in public health, but it is fragmented and there is a lack of consensus on core constructs. The purpose of this paper is to present a new conceptual framework for program sustainability in public health. This developmental study uses a comprehensive literature review, input from an expert panel, and the results of concept-mapping to identify the core domains of a conceptual framework for public health program capacity for sustainability. The concept-mapping process included three types of participants (scientists, funders, and practitioners) from several public health areas (e.g., tobacco control, heart disease and stroke, physical activity and nutrition, and injury prevention). The literature review identified 85 relevant studies focusing on program sustainability in public health. Most of the papers described empirical studies of prevention-oriented programs aimed at the community level. The concept-mapping process identified nine core domains that affect a program's capacity for sustainability: Political Support, Funding Stability, Partnerships, Organizational Capacity, Program Evaluation, Program Adaptation, Communications, Public Health Impacts, and Strategic Planning. Concept-mapping participants further identified 93 items across these domains that have strong face validity: 89% of the individual items composing the framework had specific support in the sustainability literature. The sustainability framework presented here suggests that a number of selected factors may be related to a program's ability to sustain its activities and benefits over time. These factors have been discussed in the literature, but this framework synthesizes and combines the factors and suggests how they may be interrelated with one another. The framework presents domains for public health decision makers to consider when developing and implementing prevention and intervention programs. The sustainability framework will be useful for public health decision makers, program managers, program evaluators, and dissemination and implementation researchers.
Human operator identification model and related computer programs
NASA Technical Reports Server (NTRS)
Kessler, K. M.; Mohr, J. N.
1978-01-01
Four computer programs which provide computational assistance in the analysis of man/machine systems are reported. The programs are: (1) Modified Transfer Function Program (TF); (2) Time Varying Response Program (TVSR); (3) Optimal Simulation Program (TVOPT); and (4) Linear Identification Program (SCIDNT). The TF program converts the time-domain state-variable system representation to a frequency-domain transfer-function system representation. The TVSR program computes time histories of the input/output responses of the human operator model. The TVOPT program is an optimal simulation program and is similar to TVSR in that it produces time histories of system states associated with an operator in the loop system. The differences between the two programs are presented. The SCIDNT program is an open loop identification code which operates on the simulated data from TVOPT (or TVSR) or real operator data from motion simulators.
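The core conversion performed by the TF program can be illustrated with SciPy's state-space-to-transfer-function routine, a modern stand-in with hypothetical dynamics, not the original code.

```python
# Hedged sketch: convert a time-domain state-variable model (A, B, C, D)
# to a frequency-domain transfer function, as the TF program does.
import numpy as np
from scipy.signal import ss2tf

A = np.array([[0.0, 1.0], [-4.0, -0.5]])   # hypothetical operator dynamics
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])
D = np.array([[0.0]])

num, den = ss2tf(A, B, C, D)
print(num, den)   # transfer-function numerator/denominator coefficients
```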
Web-phreeq: a WWW instructional tool for modeling the distribution of chemical species in water
NASA Astrophysics Data System (ADS)
Saini-Eidukat, Bernhardt; Yahin, Andrew
1999-05-01
A WWW-based tool, WEB-PHREEQ, was developed for classroom teaching and for routine calculation of low temperature aqueous speciation. Accessible with any computer that has an internet-connected forms-capable WWW-browser, WEB-PHREEQ provides user interface and other support for modeling, creates a properly formatted input file, passes it to the public domain program PHREEQC and returns the output to the WWW browser. Users can calculate the equilibrium speciation of a solution over a range of temperatures or can react solid minerals or gases with a particular water and examine the resulting chemistry. WEB-PHREEQ is one of a number of interactive distributed-computing programs available on the WWW that are of interest to geoscientists.
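The wrapper pattern described (format an input file from form fields, run the public-domain PHREEQC program, return its output) might look like the following sketch. The command-line invocation and the minimal SOLUTION block are assumptions to be checked against the PHREEQC documentation.

```python
# Hedged sketch of a WEB-PHREEQ-style wrapper around the PHREEQC program.
import os
import subprocess
import tempfile

def run_phreeqc(ph, temp_c, ca_mg_l):
    # Minimal PHREEQC-style input block; exact keywords are assumptions.
    inp = f"""SOLUTION 1
    temp  {temp_c}
    pH    {ph}
    units mg/l
    Ca    {ca_mg_l}
END
"""
    with tempfile.TemporaryDirectory() as d:
        infile = os.path.join(d, "run.pqi")
        outfile = os.path.join(d, "run.pqo")
        with open(infile, "w") as f:
            f.write(inp)
        subprocess.run(["phreeqc", infile, outfile], check=True)  # assumed CLI
        with open(outfile) as f:
            return f.read()   # returned to the browser by the web layer
```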
Gender and stereotypes in motivation to study computer programming for careers in multimedia
NASA Astrophysics Data System (ADS)
Doubé, Wendy; Lang, Catherine
2012-03-01
A multimedia university programme with relatively equal numbers of male and female students in elective programming subjects provided a rare opportunity to investigate female motivation to study and pursue computer programming in a career. The MSLQ was used to survey 85 participants. In common with research into deterrence of females from STEM domains, females displayed significantly lower self-efficacy and expectancy for success. In contrast to research into deterrence of females from STEM domains, both genders placed similar high values on computer programming and shared high extrinsic and intrinsic goal orientation. The authors propose that the stereotype associated with a creative multimedia career could attract female participation in computer programming whereas the stereotype associated with computer science could be a deterrent.
A computer program for helicopter rotor noise using Lowson's formula in the time domain
NASA Technical Reports Server (NTRS)
Parks, C. L.
1975-01-01
A computer program (D3910) was developed to calculate both the far field and near field acoustic pressure signature of a tilted rotor in hover or uniform forward speed. The analysis, carried out in the time domain, is based on Lowson's formulation of the acoustic field of a moving force. The digital computer program is described, including methods used in the calculations, a flow chart, program D3910 source listing, instructions for the user, and two test cases with input and output listings and output plots.
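For orientation, one commonly quoted far-field form of Lowson's result for the acoustic pressure radiated by a point force in arbitrary motion is given below; treat this as an assumption, since the report defines the exact expression (including near-field terms) implemented in D3910.

$$ p'(\mathbf{x},t) = \frac{1}{4\pi c_0}\left[\frac{1}{r\,(1-M_r)^{2}}\left(\frac{\partial F_r}{\partial \tau} + \frac{F_r}{1-M_r}\,\frac{\partial M_r}{\partial \tau}\right)\right]_{\tau = t - r/c_0} $$

where $F_r$ is the component of the force along the source-observer direction, $M_r$ is the source Mach number in that direction, $r$ is the source-observer distance, and the bracket is evaluated at the retarded time.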
Bajorath, Jurgen
2012-01-01
We have generated a number of compound data sets and programs for different types of applications in pharmaceutical research. These data sets and programs were originally designed for our research projects and are made publicly available. Without consulting original literature sources, it is difficult to understand specific features of data sets and software tools, basic ideas underlying their design, and applicability domains. Currently, 30 different entries are available for download from our website. In this data article, we provide an overview of the data and tools we make available and designate the areas of research for which they should be useful. For selected data sets and methods/programs, detailed descriptions are given. This article should help interested readers to select data and tools for specific computational investigations. PMID:24358818
A Summary of the Foundation Research Program, Fiscal Year 1985.
1986-05-12
system in the domain of actuarial science. Publication: T. R. Sivasankaran and M. Jarke, "Coupling Expert Systems and Actuarial Pricing Models," Workshop on Coupling Symbolic and Numerical Computing in Expert Systems, Bellevue, Washington, August 1985. Title: Application...Ramjets", AIAA-85-1177, AIAA/SAE/ASME/ASEE 21st Joint Propulsion Conference, July 8-10, 1985. A. Gany and D. W. Netzer, "Fuel Performance Evaluation
NASA Technical Reports Server (NTRS)
Stocks, Dana R.
1986-01-01
The Dynamic Gas Temperature Measurement System compensation software accepts digitized data from two different diameter thermocouples and computes a compensated frequency response spectrum for one of the thermocouples. Detailed discussions of the physical system, analytical model, and computer software are presented in this volume and in Volume 1 of this report under Task 3. Computer program software restrictions and test cases are also presented. Compensated and uncompensated data may be presented in either the time or frequency domain. Time domain data are presented as instantaneous temperature vs time. Frequency domain data may be presented in several forms such as power spectral density vs frequency.
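As an illustration of frequency-domain compensation with a first-order sensor model: the time constant is simply assumed below, whereas the actual system estimates it from the comparison of the two different-diameter thermocouples.

```python
# Hedged sketch: compensate a lagged reading by dividing its spectrum by a
# first-order sensor response H(f) = 1 / (1 + j*2*pi*f*tau).
import numpy as np

fs, tau = 1000.0, 0.05                       # sample rate (Hz), assumed tau (s)
t = np.arange(0, 2, 1 / fs)
true = np.sin(2 * np.pi * 5 * t)             # "true" gas temperature fluctuation

f = np.fft.rfftfreq(t.size, 1 / fs)
H = 1.0 / (1.0 + 2j * np.pi * f * tau)       # first-order thermocouple response
measured = np.fft.irfft(np.fft.rfft(true) * H, n=t.size)   # synthetic lagged reading

compensated = np.fft.irfft(np.fft.rfft(measured) / H, n=t.size)
print(np.abs(compensated - true).max())      # should be near zero
```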
NASA Technical Reports Server (NTRS)
Hanley, Lionel
1989-01-01
The Ada Software Repository is a public-domain collection of Ada software and information. The Ada Software Repository is one of several repositories located on the SIMTEL20 Defense Data Network host computer at White Sands Missile Range, and available to any host computer on the network since 26 November 1984. This repository provides a free source for Ada programs and information. The Ada Software Repository is divided into several subdirectories. These directories are organized by topic, and their names and a brief overview of their topics are contained. The Ada Software Repository on SIMTEL20 serves two basic roles: to promote the exchange and use (reusability) of Ada programs and tools (including components) and to promote Ada education.
32 CFR 310.52 - Computer matching publication and review requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 32 National Defense 2 2012-07-01 2012-07-01 false Computer matching publication and review... OF DEFENSE (CONTINUED) PRIVACY PROGRAM DOD PRIVACY PROGRAM Computer Matching Program Procedures § 310.52 Computer matching publication and review requirements. (a) DoD Components shall identify the...
32 CFR 310.52 - Computer matching publication and review requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 32 National Defense 2 2014-07-01 2014-07-01 false Computer matching publication and review... OF DEFENSE (CONTINUED) PRIVACY PROGRAM DOD PRIVACY PROGRAM Computer Matching Program Procedures § 310.52 Computer matching publication and review requirements. (a) DoD Components shall identify the...
32 CFR 310.52 - Computer matching publication and review requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 32 National Defense 2 2013-07-01 2013-07-01 false Computer matching publication and review... OF DEFENSE (CONTINUED) PRIVACY PROGRAM DOD PRIVACY PROGRAM Computer Matching Program Procedures § 310.52 Computer matching publication and review requirements. (a) DoD Components shall identify the...
32 CFR 310.52 - Computer matching publication and review requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 32 National Defense 2 2010-07-01 2010-07-01 false Computer matching publication and review... OF DEFENSE (CONTINUED) PRIVACY PROGRAM DOD PRIVACY PROGRAM Computer Matching Program Procedures § 310.52 Computer matching publication and review requirements. (a) DoD Components shall identify the...
32 CFR 310.52 - Computer matching publication and review requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 32 National Defense 2 2011-07-01 2011-07-01 false Computer matching publication and review... OF DEFENSE (CONTINUED) PRIVACY PROGRAM DOD PRIVACY PROGRAM Computer Matching Program Procedures § 310.52 Computer matching publication and review requirements. (a) DoD Components shall identify the...
NASA Astrophysics Data System (ADS)
Pardo-Igúzquiza, Eulogio; Rodríguez-Tovar, Francisco J.
2012-12-01
Many spectral analysis techniques have been designed assuming sequences taken with a constant sampling interval. However, there are empirical time series in the geosciences (sediment cores, fossil abundance data, isotope analysis, …) that do not follow regular sampling because of missing data, gapped data, random sampling or incomplete sequences, among other reasons. In general, interpolating an uneven series in order to obtain a succession with a constant sampling interval alters the spectral content of the series. In such cases it is preferable to follow an approach that works with the uneven data directly, avoiding the need for an explicit interpolation step. The Lomb-Scargle periodogram is a popular choice in such circumstances, as there are programs available in the public domain for its computation. One new computer program for spectral analysis improves the standard Lomb-Scargle periodogram approach in two ways: (1) It explicitly adjusts the statistical significance to any bias introduced by variance reduction smoothing, and (2) it uses a permutation test to evaluate confidence levels, which is better suited than parametric methods when neighbouring frequencies are highly correlated. Another novel program for cross-spectral analysis offers the advantage of estimating the Lomb-Scargle cross-periodogram of two uneven time series defined on the same interval, and it evaluates the confidence levels of the estimated cross-spectra by a non-parametric computer intensive permutation test. Thus, the cross-spectrum, the squared coherence spectrum, the phase spectrum, and the Monte Carlo statistical significance of the cross-spectrum and the squared-coherence spectrum can be obtained. Both of the programs are written in ANSI Fortran 77, in view of its simplicity and compatibility. The program code is in the public domain and is provided on the website of the journal (http://www.iamg.org/index.php/publisher/articleview/frmArticleID/112/). Different examples (with simulated and real data) are described in this paper to corroborate the methodology and the implementation of these two new programs.
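A compact sketch of the approach: a Lomb-Scargle periodogram on unevenly sampled data plus a permutation test for the significance of the spectral peak, in Python rather than the authors' ANSI Fortran 77.

```python
# Hedged sketch: Lomb-Scargle on uneven samples, significance by permutation.
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 100, 200))            # uneven sampling times
y = np.sin(2 * np.pi * 0.1 * t) + rng.normal(0, 1, t.size)
y = y - y.mean()

w = 2 * np.pi * np.linspace(0.01, 0.5, 500)      # angular test frequencies
power = lombscargle(t, y, w)

n_perm, count = 199, 0
for _ in range(n_perm):                          # permutation test on peak power
    count += lombscargle(t, rng.permutation(y), w).max() >= power.max()
print("peak at f =", w[power.argmax()] / (2 * np.pi),
      " p ~", (count + 1) / (n_perm + 1))
```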
Judson, Richard S.; Martin, Matthew T.; Egeghy, Peter; Gangwal, Sumit; Reif, David M.; Kothiya, Parth; Wolf, Maritja; Cathey, Tommy; Transue, Thomas; Smith, Doris; Vail, James; Frame, Alicia; Mosher, Shad; Cohen Hubal, Elaine A.; Richard, Ann M.
2012-01-01
Computational toxicology combines data from high-throughput test methods, chemical structure analyses and other biological domains (e.g., genes, proteins, cells, tissues) with the goals of predicting and understanding the underlying mechanistic causes of chemical toxicity and for predicting toxicity of new chemicals and products. A key feature of such approaches is their reliance on knowledge extracted from large collections of data and data sets in computable formats. The U.S. Environmental Protection Agency (EPA) has developed a large data resource called ACToR (Aggregated Computational Toxicology Resource) to support these data-intensive efforts. ACToR comprises four main repositories: core ACToR (chemical identifiers and structures, and summary data on hazard, exposure, use, and other domains), ToxRefDB (Toxicity Reference Database, a compilation of detailed in vivo toxicity data from guideline studies), ExpoCastDB (detailed human exposure data from observational studies of selected chemicals), and ToxCastDB (data from high-throughput screening programs, including links to underlying biological information related to genes and pathways). The EPA DSSTox (Distributed Structure-Searchable Toxicity) program provides expert-reviewed chemical structures and associated information for these and other high-interest public inventories. Overall, the ACToR system contains information on about 400,000 chemicals from 1100 different sources. The entire system is built using open source tools and is freely available to download. This review describes the organization of the data repository and provides selected examples of use cases. PMID:22408426
Smartphone Microscopy of Parasite Eggs Accumulated into a Single Field of View
Sowerby, Stephen J.; Crump, John A.; Johnstone, Maree C.; Krause, Kurt L.; Hill, Philip C.
2016-01-01
A Nokia Lumia 1020 cellular phone (Microsoft Corp., Auckland, New Zealand) was configured to image the ova of Ascaris lumbricoides converged into a single field of view but on different focal planes. The phone was programmed to acquire images at different distances and, using public domain computer software, composite images were created that brought all the eggs into sharp focus. This proof of concept informs a framework for field-deployable, point of care monitoring of soil-transmitted helminths. PMID:26572870
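The focus-stacking operation described can be sketched as follows; this is only an illustration of the idea, since the study used existing public domain software, not this code.

```python
# Hedged sketch of focus stacking: for each pixel, keep the value from
# whichever frame is locally sharpest (largest smoothed absolute Laplacian).
import numpy as np
from scipy import ndimage

def focus_stack(frames):
    """Merge equal-sized grayscale frames taken at different focal planes."""
    stack = np.stack([np.asarray(f, dtype=float) for f in frames])
    sharp = np.stack([ndimage.uniform_filter(np.abs(ndimage.laplace(f)), size=9)
                      for f in stack])          # local sharpness per frame
    best = sharp.argmax(axis=0)                 # sharpest frame index per pixel
    rows, cols = np.indices(best.shape)
    return stack[best, rows, cols]              # composite, all regions in focus
```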
NASA Technical Reports Server (NTRS)
Bodley, C. S.; Devers, A. D.; Park, A. C.; Frisch, H. P.
1978-01-01
A theoretical development and associated digital computer program system for the dynamic simulation and stability analysis of passive and actively controlled spacecraft are presented. The dynamic system (spacecraft) is modeled as an assembly of rigid and/or flexible bodies not necessarily in a topological tree configuration. The computer program system is used to investigate total system dynamic characteristics, including interaction effects between rigid and/or flexible bodies, control systems, and a wide range of environmental loadings. In addition, the program system is used for designing attitude control systems and for evaluating total dynamic system performance, including time domain response and frequency domain stability analyses.
A Review of Global Health Competencies for Postgraduate Public Health Education
Sawleshwarkar, Shailendra; Negin, Joel
2017-01-01
During the last decade, the literature about global health has grown exponentially. Academic institutions are also exploring the scope of their public health educational programs to meet the demand for a global health professional. This has become more relevant in the context of the sustainable development goals. There have been attempts to describe global health competencies for specific professional groups. The focus of these competencies has been variable with a variety of different themes being described ranging from globalization and health care, analysis and program management, as well as equity and capacity strengthening. This review aims to describe global health competencies and attempts to distill common competency domains to assist in curriculum development and integration in postgraduate public health education programs. A literature search was conducted using relevant keywords with a focus on public health education. This resulted in identification of 13 articles that described global health competencies. All these articles were published between 2005 and 2015 with six from the USA, two each from Canada and Australia, and one each from UK, Europe, and Americas. A range of methods used to describe competency domains included literature review, interviews with experts and employers, surveys of staff and students, and description or review of an academic program. Eleven competency domains were distilled from the selected articles. These competency domains primarily refer to three main aspects: the first focuses on the burden of disease and the determinants of health. A second set focuses on core public health skills including policy development, analysis, and program management. Another set of competency domains could be classified as “soft skills” and includes collaboration, partnering, communication, professionalism, capacity building, and political awareness. This review presents the landscape of defined global health competencies for postgraduate public health education. The discussion about use of “global health,” “international health,” and “global public health” will continue, and academic institutions need to explore ways to integrate these competencies in postgraduate public health programs. This is critical in the post-MDG era, as we prepare the global public health workforce for the challenges of improving health of the “global” population in the context of sustainable development goals. PMID:28373970
Programming Tools: Status, Evaluation, and Comparison
NASA Technical Reports Server (NTRS)
Cheng, Doreen Y.; Cooper, D. M. (Technical Monitor)
1994-01-01
In this tutorial I will first describe the characteristics of scientific applications and their developers, and describe the computing environment in a typical high-performance computing center. I will define the user requirements for tools that support application portability and present the difficulties in satisfying them. These form the basis of the evaluation and comparison of the tools. I will then describe the tools available in the market and the tools available in the public domain. Specifically, I will describe the tools for converting sequential programs, tools for developing portable new programs, tools for debugging and performance tuning, tools for partitioning and mapping, and tools for managing networks of resources. I will introduce the main goals and approaches of the tools, and show main features of a few tools in each category. Meanwhile, I will compare tool usability for real-world application development and compare their different technological approaches. Finally, I will indicate the future directions of the tools in each category.
NASA Astrophysics Data System (ADS)
Mishchenko, Michael I.; Yang, Ping
2018-01-01
In this paper we make practical use of the recently developed first-principles approach to electromagnetic scattering by particles immersed in an unbounded absorbing host medium. Specifically, we introduce an actual computational tool for the calculation of pertinent far-field optical observables in the context of the classical Lorenz-Mie theory. The paper summarizes the relevant theoretical formalism, explains various aspects of the corresponding numerical algorithm, specifies the input and output parameters of a FORTRAN program available at https://www.giss.nasa.gov/staff/mmishchenko/Lorenz-Mie.html, and tabulates benchmark results useful for testing purposes. This public-domain FORTRAN program enables one to solve the following two important problems: (i) simulate theoretically the reading of a remote well-collimated radiometer measuring electromagnetic scattering by an individual spherical particle or a small random group of spherical particles; and (ii) compute the single-scattering parameters that enter the vector radiative transfer equation derived directly from the Maxwell equations.
Designing Educational Games for Computer Programming: A Holistic Framework
ERIC Educational Resources Information Center
Malliarakis, Christos; Satratzemi, Maya; Xinogalos, Stelios
2014-01-01
Computer science has been evolving continuously over the past decades. This has brought forth new knowledge that should be incorporated, and new learning strategies must be adopted, for the successful teaching of all sub-domains. For example, computer programming is a vital knowledge area within computer science with a constantly changing curriculum…
Rhetorical Consequences of the Computer Society: Expert Systems and Human Communication.
ERIC Educational Resources Information Center
Skopec, Eric Wm.
Expert systems are computer programs that solve selected problems by modelling domain-specific behaviors of human experts. These computer programs typically consist of an input/output system that feeds data into the computer and retrieves advice, an inference system using the reasoning and heuristic processes of human experts, and a knowledge…
GMES: A Python package for solving Maxwell’s equations using the FDTD method
NASA Astrophysics Data System (ADS)
Chun, Kyungwon; Kim, Huioon; Kim, Hyounggyu; Jung, Kil Su; Chung, Youngjoo
2013-04-01
This paper describes GMES, a free Python package for solving Maxwell’s equations using the finite-difference time-domain (FDTD) method. The design of GMES follows the object-oriented programming (OOP) approach and adopts a unique design strategy where the voxels in the computational domain are grouped and then updated according to their material type. This piecewise updating scheme ensures that GMES can adopt OOP without losing its simple structure and time-stepping speed. The users can easily add various material types, sources, and boundary conditions into their code using the Python programming language. The key design features, along with the supported material types, excitation sources, boundary conditions and parallel calculations employed in GMES are also described in detail. Catalog identifier: AEOK_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEOK_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: GNU General Public License v3.0 No. of lines in distributed program, including test data, etc.: 17700 No. of bytes in distributed program, including test data, etc.: 89878 Distribution format: tar.gz Programming language: C++, Python. Computer: Any computer with a Unix-like system with a C++ compiler, and a Python interpreter; developed on 2.53 GHz Intel Core i3. Operating system: Any Unix-like system; developed under Ubuntu 12.04 LTS 64 bit. Has the code been vectorized or parallelized?: Yes. Parallelized with MPI directives (optional). RAM: Problem dependent (a simulation with real valued electromagnetic field uses roughly 0.18 KB per Yee cell.) Classification: 10. External routines: SWIG [1], Cython [2], NumPy [3], SciPy [4], matplotlib [5], MPI for Python [6] Nature of problem: Classical electrodynamics Solution method: Finite-difference time-domain (FDTD) method Additional comments: This article describes version 0.9.5. The most recent version can be downloaded at the GMES project homepage [7]. Running time: Problem dependent (a simulation with real valued electromagnetic field takes typically about 0.16 μs per Yee cell per time-step.) SWIG, http://www.swig.org. Cython, http://www.cython.org. NumPy, http://numpy.scipy.org. SciPy, http://www.scipy.org. matplotlib, http://matplotlib.sourceforge.net. MPI for Python, http://mpi4py.scipy.org. GMES, http://sourceforge.net/projects/gmes.
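The piecewise, material-grouped updating scheme can be illustrated with a 1-D toy (not GMES itself): cells are grouped by material once, then each group is updated with vectorized operations instead of dispatching on material inside a per-cell loop.

```python
# Hedged 1-D FDTD toy illustrating material-grouped updates (normalized units).
import numpy as np

nx = 200
ez, hy = np.zeros(nx), np.zeros(nx - 1)
eps = np.ones(nx)
eps[100:150] = 4.0                               # a dielectric slab
groups = {e: np.where(eps == e)[0] for e in np.unique(eps)}

for step in range(500):
    hy += np.diff(ez)                            # H update
    for e, idx in groups.items():                # E update, one pass per material
        inner = idx[(idx > 0) & (idx < nx - 1)]
        ez[inner] += (hy[inner] - hy[inner - 1]) / e
    ez[nx // 2] += np.exp(-((step - 30) / 10) ** 2)  # soft Gaussian source
```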
Transient upset models in computer systems
NASA Technical Reports Server (NTRS)
Mason, G. M.
1983-01-01
Essential factors for the design of transient upset monitors for computers are discussed. The upset is a system-level event that is software dependent. It can occur in the program flow, the opcode set, the opcode address domain, the read address domain, and the write address domain. Most upsets occur in the program flow. It is shown that simple, external monitors functioning transparently relative to system operations can be built if a detailed accounting is made of the characteristics of the faults that can occur. Sample applications are provided for Z-80 and 8085 based systems.
Future perspectives - proposal for Oxford Physiome Project.
Oku, Yoshitaka
2010-01-01
The Physiome Project is an effort to understand living creatures using an "analysis by synthesis" strategy, i.e., by reproducing their behaviors. In order to achieve this goal, sharing developed models between different computer languages and application programs, so that they can be incorporated into integrated models, is critical. To date, several XML-based markup languages have been developed for this purpose. However, source code written in XML-based languages is very difficult to read and edit using text editors. An alternative is to use an object-oriented meta-language, which can be translated to different computer languages and transplanted to different application programs. Object-oriented languages are suitable for describing structural organization through hierarchical classes and for taking advantage of statistical properties to reduce the number of parameters while keeping the complexity of behaviors. Using object-oriented languages to describe each element and posting the result in the public domain should be the next step toward building up integrated models of the respiratory control system.
Smartphone Microscopy of Parasite Eggs Accumulated into a Single Field of View.
Sowerby, Stephen J; Crump, John A; Johnstone, Maree C; Krause, Kurt L; Hill, Philip C
2016-01-01
A Nokia Lumia 1020 cellular phone (Microsoft Corp., Auckland, New Zealand) was configured to image the ova of Ascaris lumbricoides converged into a single field of view but on different focal planes. The phone was programmed to acquire images at different distances and, using public domain computer software, composite images were created that brought all the eggs into sharp focus. This proof of concept informs a framework for field-deployable, point of care monitoring of soil-transmitted helminths. © The American Society of Tropical Medicine and Hygiene.
A learning apprentice for software parts composition
NASA Technical Reports Server (NTRS)
Allen, Bradley P.; Holtzman, Peter L.
1987-01-01
An overview of the knowledge acquisition component of the Bauhaus, a prototype computer aided software engineering (CASE) workstation for the development of domain-specific automatic programming systems (D-SAPS) is given. D-SAPS use domain knowledge in the refinement of a description of an application program into a compilable implementation. The approach to the construction of D-SAPS was to automate the process of refining a description of a program, expressed in an object-oriented domain language, into a configuration of software parts that implement the behavior of the domain objects.
A Framework for Understanding Physics Students' Computational Modeling Practices
ERIC Educational Resources Information Center
Lunk, Brandon Robert
2012-01-01
With the growing push to include computational modeling in the physics classroom, we are faced with the need to better understand students' computational modeling practices. While existing research on programming comprehension explores how novices and experts generate programming algorithms, little of this discusses how domain content…
Mapping University Students' Epistemic Framing of Computational Physics Using Network Analysis
ERIC Educational Resources Information Center
Bodin, Madelen
2012-01-01
Solving physics problems in university physics education using a computational approach requires knowledge and skills in several domains, for example, physics, mathematics, programming, and modeling. These competences are in turn related to students' beliefs about the domains as well as about learning. These knowledge and beliefs components are…
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCaskey, Alexander J.
Hybrid programming models for beyond-CMOS technologies will prove critical for integrating new computing technologies alongside our existing infrastructure. Unfortunately, the software infrastructure required to enable this is lacking or unavailable. XACC is a programming framework for extreme-scale, post-exascale accelerator architectures that integrates alongside existing conventional applications. It is a pluggable framework for programming languages developed for next-generation computing hardware architectures like quantum and neuromorphic computing. It lets computational scientists efficiently off-load classically intractable work to attached accelerators through user-friendly kernel definitions. XACC makes post-exascale hybrid programming approachable for domain computational scientists.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Erez, Mattan; Yelick, Katherine; Sarkar, Vivek
The Dynamic, Exascale Global Address Space programming environment (DEGAS) project will develop the next generation of programming models and runtime systems to meet the challenges of Exascale computing. Our approach is to provide an efficient and scalable programming model that can be adapted to application needs through the use of dynamic runtime features and domain-specific languages for computational kernels. We address the following technical challenges:
Programmability: Rich set of programming constructs based on a Hierarchical Partitioned Global Address Space (HPGAS) model, demonstrated in UPC++.
Scalability: Hierarchical locality control, lightweight communication (extended GASNet), and efficient synchronization mechanisms (Phasers).
Performance Portability: Just-in-time specialization (SEJITS) for generating hardware-specific code and scheduling libraries for domain-specific adaptive runtimes (Habanero).
Energy Efficiency: Communication-optimal code generation to optimize energy efficiency by reducing data movement.
Resilience: Containment Domains for flexible, domain-specific resilience, using state capture mechanisms and lightweight, asynchronous recovery mechanisms.
Interoperability: Runtime and language interoperability with MPI and OpenMP to encourage broad adoption.
Dynamic gas temperature measurement system. Volume 2: Operation and program manual
NASA Technical Reports Server (NTRS)
Purpura, P. T.
1983-01-01
The hot section technology (HOST) dynamic gas temperature measurement system computer program acquires data from two type B thermocouples of different diameters. The analysis method determines the in situ value of an aerodynamic parameter T, containing the heat transfer coefficient, from the transfer function of the two thermocouples. This aerodynamic parameter is used to compute a frequency response spectrum and compensate the dynamic portion of the signal of the smaller thermocouple. The calculations for the aerodynamic parameter and the data compensation technique are discussed. Compensated data are presented in either the time domain, as dynamic temperature versus time, or in the frequency domain.
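The idea of frequency-domain compensation can be illustrated with a deliberately simplified sketch: a single sensor modeled as a first-order lag whose time constant is assumed known. The HOST program instead derives the in situ response from two thermocouple diameters; the model, time constant, and signal below are invented for illustration.

```python
import numpy as np

# Frequency-domain compensation of a slow temperature sensor, using a
# hypothetical first-order lag model H(jw) = 1 / (1 + jw*tau).
fs = 1000.0                                    # sampling rate, Hz
t = np.arange(0, 1.0, 1.0 / fs)
true_gas = np.sin(2 * np.pi * 50 * t)          # "true" fluctuating gas temperature
tau = 0.01                                     # assumed sensor time constant, s

# Simulate the sensor by applying the first-order lag in the frequency domain
f = np.fft.rfftfreq(t.size, 1.0 / fs)
lag = 1.0 / (1.0 + 2j * np.pi * f * tau)
measured = np.fft.irfft(np.fft.rfft(true_gas) * lag, n=t.size)

# Compensate: divide the spectrum by the lag response (band-limited in practice)
compensated = np.fft.irfft(np.fft.rfft(measured) / lag, n=t.size)
print("max reconstruction error:", np.abs(compensated - true_gas).max())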
Third CLIPS Conference Proceedings, volume 2
NASA Technical Reports Server (NTRS)
Riley, Gary (Editor)
1994-01-01
Expert systems are computer programs which emulate human expertise in well defined problem domains. The C Language Integrated Production System (CLIPS) is an expert system building tool, developed at the Johnson Space Center, which provides a complete environment for the development and delivery of rule and/or object based expert systems. CLIPS was specifically designed to provide a low cost option for developing and deploying expert system applications across a wide range of hardware platforms. The development of CLIPS has helped to improve the ability to deliver expert system technology throughout the public and private sectors for a wide range of applications and diverse computing environments. The Third Conference on CLIPS provided a forum for CLIPS users to present and discuss papers relating to CLIPS applications, uses, and extensions.
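The style of rule-based inference that CLIPS provides can be illustrated with a toy forward-chaining engine. The sketch below is ordinary Python, not CLIPS, and its facts and rules are invented; a real CLIPS application would express the same logic as defrule constructs.

```python
# Toy forward-chaining rule engine: fire rules until no new facts appear.
facts = {("duck", "animal"), ("duck", "quacks")}
rules = [
    # (premises, conclusion): fire when every premise is in the fact base
    ([("duck", "animal"), ("duck", "quacks")], ("duck", "is-a-duck")),
    ([("duck", "is-a-duck")], ("duck", "can-swim")),
]

changed = True
while changed:                       # iterate to a fixed point
    changed = False
    for premises, conclusion in rules:
        if all(p in facts for p in premises) and conclusion not in facts:
            facts.add(conclusion)    # assert the newly derived fact
            changed = True

print(sorted(facts))
```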
Information Leakage Analysis by Abstract Interpretation
NASA Astrophysics Data System (ADS)
Zanioli, Matteo; Cortesi, Agostino
Protecting the confidentiality of information stored in a computer system or transmitted over a public network is a relevant problem in computer security. The approach of information flow analysis involves performing a static analysis of the program with the aim of proving that there will not be leaks of sensitive information. In this paper we propose a new domain that combines variable dependency analysis, based on propositional formulas, and variables' value analysis, based on polyhedra. The resulting analysis is strictly more accurate than state-of-the-art abstract interpretation based analyses for information leakage detection. Its modular construction makes it possible to deal with the tradeoff between efficiency and accuracy by tuning the granularity of the abstraction and the complexity of the abstract operators.
Personal Computer and Workstation Operating Systems Tutorial
1994-03-01
…to a RAM area where it is executed by the CPU. The program consists of instructions that perform operations on data. The CPU will perform two basic…
…memory to improve system performance. More often the user will buy a new fixed disk so the computer will hold more programs internally. The trend today…
…MHz. Another way to view how fast the information is going into the register is in a time domain rather than a frequency domain, knowing that time and…
An effective and secure key-management scheme for hierarchical access control in E-medicine system.
Odelu, Vanga; Das, Ashok Kumar; Goswami, Adrijit
2013-04-01
Recently, several hierarchical access control schemes have been proposed in the literature to provide security in e-medicine systems. However, most of them are either insecure against the man-in-the-middle attack or require high storage and computational overheads. Wu and Chen proposed a key management method to solve dynamic access control problems in a user hierarchy based on a hybrid cryptosystem. Though their scheme improves computational efficiency over Nikooghadam et al.'s approach, it suffers from large storage space for public parameters in the public domain and computational inefficiency due to costly elliptic curve point multiplication. Recently, Nikooghadam and Zakerolhosseini showed that Wu-Chen's scheme is vulnerable to the man-in-the-middle attack. To remedy this security weakness in Wu-Chen's scheme, they proposed a secure scheme which is again based on ECC (elliptic curve cryptography) and an efficient one-way hash function. However, their scheme incurs huge computational cost for verification of public information in the public domain, because it uses ECC digital signatures, which are costly compared to a symmetric-key cryptosystem. In this paper, we propose an effective access control scheme in a user hierarchy which is based only on a symmetric-key cryptosystem and an efficient one-way hash function. We show that our scheme significantly reduces the storage space for both public and private domains, as well as the computational complexity, when compared to Wu-Chen's scheme, Nikooghadam-Zakerolhosseini's scheme, and other related schemes. Through informal and formal security analysis, we further show that our scheme is secure against different attacks, including the man-in-the-middle attack. Moreover, dynamic access control problems in our scheme are solved efficiently compared to other related schemes, making our scheme much more suitable for practical applications of e-medicine systems.
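The general idea of hash-based hierarchical key management can be sketched in a few lines. The following is a minimal illustration of top-down key derivation with HMAC, assuming invented class names and a made-up master secret; it is not the exact scheme proposed in the paper.

```python
import hashlib
import hmac

# Hierarchical key derivation with only a one-way function (HMAC): a higher
# security class can recompute the keys of its descendants, but a lower
# class cannot recover its parent's key.
def derive_key(parent_key: bytes, child_id: str) -> bytes:
    return hmac.new(parent_key, child_id.encode(), hashlib.sha256).digest()

root = hashlib.sha256(b"central-authority-master-secret").digest()  # invented
cardiology = derive_key(root, "dept:cardiology")
nurse = derive_key(cardiology, "role:nurse")

# The root reaches the nurse key top-down; the reverse direction would
# require inverting HMAC-SHA256, which is computationally infeasible.
assert nurse == derive_key(derive_key(root, "dept:cardiology"), "role:nurse")
print(nurse.hex())
```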
ERIC Educational Resources Information Center
Wendel, Holly Marie
2016-01-01
The purpose of this study was to determine the relationship each of the mathematics web-based programs, MyMathLab and Assessments and Learning in Knowledge Spaces (ALEKS), has with students' mathematics achievement. In addition, the study examined the relationship between students' affective domain and the type of program as well as student…
Computer program for single input-output, single-loop feedback systems
NASA Technical Reports Server (NTRS)
1976-01-01
Additional work is reported on a completely automatic computer program for the design of single input/output, single loop feedback systems with parameter uncertainty, to satisfy time domain bounds on the system response to step commands and disturbances. The inputs to the program are basically the specified time-domain response bounds, the form of the constrained plant transfer function, and the ranges of the uncertain parameters of the plant. The program output consists of the transfer functions of the two free compensation networks, in the form of the coefficients of the numerator and denominator polynomials, and the data on the prescribed bounds and the extremes actually obtained for the system response to commands and disturbances.
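The core check such a program performs, testing a candidate closed-loop response against time-domain bounds, can be sketched briefly. The plant, compensation, and bounds below are invented purely for illustration; the actual program synthesizes the compensation networks automatically.

```python
import numpy as np
from scipy import signal

# Check a candidate unity-feedback design against step-response bounds.
num, den = [10.0], [1.0, 2.0, 0.0]          # example plant G(s) = 10 / (s^2 + 2s)
clden = np.polyadd(den, num)                # closed loop T(s) = num / (den + num)
t = np.linspace(0.0, 10.0, 1000)
t, y = signal.step((num, clden), T=t)

# Hypothetical specification: the response must stay inside an envelope
lower = np.where(t > 3.0, 0.9, 0.0)         # must reach 0.9 after 3 s
upper = np.full_like(t, 1.4)                # overshoot must stay below 1.4
print("bounds satisfied:", bool(np.all((y >= lower) & (y <= upper))))
```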
A Public Domain Software Library for Reading and Language Arts.
ERIC Educational Resources Information Center
Balajthy, Ernest
A three-year project carried out by the Microcomputers and Reading Committee of the New Jersey Reading Association involved the collection, improvement, and distribution of free microcomputer software (public domain programs) designed to deal with reading and writing skills. Acknowledging that this free software is not without limitations (poor…
Debugging Techniques Used by Experienced Programmers to Debug Their Own Code.
1990-09-01
…Davis, and Schultz (1987) also compared experts and novices, but focused on the way a computer program is represented cognitively and how that…
…of theories in the emerging computer programming domain (Fisher, 1987). In protocol analysis, subjects are asked to talk/think aloud as they solve…
SPA- STATISTICAL PACKAGE FOR TIME AND FREQUENCY DOMAIN ANALYSIS
NASA Technical Reports Server (NTRS)
Brownlow, J. D.
1994-01-01
The need for statistical analysis often arises when data is in the form of a time series. This type of data is usually a collection of numerical observations made at specified time intervals. Two kinds of analysis may be performed on the data. First, the time series may be treated as a set of independent observations using a time domain analysis to derive the usual statistical properties including the mean, variance, and distribution form. Secondly, the order and time intervals of the observations may be used in a frequency domain analysis to examine the time series for periodicities. In almost all practical applications, the collected data is actually a mixture of the desired signal and a noise signal which is collected over a finite time period with a finite precision. Therefore, any statistical calculations and analyses are actually estimates. The Spectrum Analysis (SPA) program was developed to perform a wide range of statistical estimation functions. SPA can provide the data analyst with a rigorous tool for performing time and frequency domain studies. In a time domain statistical analysis the SPA program will compute the mean, variance, standard deviation, mean square, and root mean square. It also lists the data maximum, data minimum, and the number of observations included in the sample. In addition, a histogram of the time domain data is generated, a normal curve is fit to the histogram, and a goodness-of-fit test is performed. These time domain calculations may be performed on both raw and filtered data. For a frequency domain statistical analysis the SPA program computes the power spectrum, cross spectrum, coherence, phase angle, amplitude ratio, and transfer function. The estimates of the frequency domain parameters may be smoothed with the use of Hann-Tukey, Hamming, Bartlett, or moving average windows. Various digital filters are available to isolate data frequency components. Frequency components with periods longer than the data collection interval are removed by least-squares detrending. As many as ten channels of data may be analyzed at one time. Both tabular and plotted output may be generated by the SPA program. This program is written in FORTRAN IV and has been implemented on a CDC 6000 series computer with a central memory requirement of approximately 142K (octal) of 60 bit words. This core requirement can be reduced by segmentation of the program. The SPA program was developed in 1978.
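The kind of output SPA produces can be mirrored with modern libraries. The sketch below computes time-domain statistics and a windowed power spectrum of a synthetic signal using NumPy and SciPy; it illustrates the analyses described above, not SPA itself, and the test signal is invented.

```python
import numpy as np
from scipy import signal

# Time- and frequency-domain statistics of a sampled signal.
fs = 100.0                                   # sampling rate, Hz
t = np.arange(0, 20, 1 / fs)
x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.random.default_rng(0).normal(size=t.size)

# Time-domain estimates
print("mean: %.4f  variance: %.4f  rms: %.4f"
      % (x.mean(), x.var(), np.sqrt(np.mean(x ** 2))))
print("min: %.3f  max: %.3f  n: %d" % (x.min(), x.max(), x.size))

# Frequency-domain estimate: power spectrum with a Hamming window, analogous
# to SPA's windowed spectral options
f, pxx = signal.welch(x, fs=fs, window="hamming", nperseg=512)
print("spectral peak at %.2f Hz" % f[np.argmax(pxx)])
```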
Cloud identification using genetic algorithms and massively parallel computation
NASA Technical Reports Server (NTRS)
Buckles, Bill P.; Petry, Frederick E.
1996-01-01
As a Guest Computational Investigator under the NASA administered component of the High Performance Computing and Communication Program, we implemented a massively parallel genetic algorithm on the MasPar SIMD computer. Experiments were conducted using Earth Science data in the domains of meteorology and oceanography. Results obtained in these domains are competitive with, and in most cases better than, those for similar problems solved using other methods. In the meteorological domain, we chose to identify clouds using AVHRR spectral data. Four cloud speciations were used, although most researchers settle for three. Results were remarkably consistent across all tests (91% accuracy). Refinements of this method may lead to more timely and complete information for Global Circulation Models (GCMs) that are prevalent in weather forecasting and global environment studies. In the oceanographic domain, we chose to identify ocean currents from a spectrometer having similar characteristics to AVHRR. Here the results were mixed (60% to 80% accuracy). If one is willing to run the experiment several times (say, 10), it is acceptable to claim the higher accuracy rating. This problem has never been successfully automated. Therefore, these results are encouraging even though less impressive than the cloud experiment. Successful conclusion of an automated ocean current detection system would impact coastal fishing, naval tactics, and the study of micro-climates. Finally, we contributed to the basic knowledge of GA (genetic algorithm) behavior in parallel environments. We developed better knowledge of the use of subpopulations in the context of shared breeding pools and the migration of individuals. Rigorous experiments were conducted based on quantifiable performance criteria. While much of the work confirmed current wisdom, for the first time we were able to submit conclusive evidence. The software developed under this grant was placed in the public domain. An extensive user's manual was written and distributed nationwide to scientists whose work might benefit from its availability. Several papers, including two journal articles, were produced.
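The selection/crossover/mutation loop at the heart of any genetic algorithm can be shown compactly. The sketch below solves a toy "one-max" problem in serial Python; the cited work ran this kind of loop massively in parallel on SIMD hardware, and the problem and parameters here are invented.

```python
import numpy as np

# Minimal genetic algorithm maximizing the number of 1 bits in a bit string.
rng = np.random.default_rng(1)
pop = rng.integers(0, 2, size=(50, 32))            # 50 bit-string individuals

def fitness(pop):
    return pop.sum(axis=1)                         # "one-max" objective

for generation in range(100):
    f = fitness(pop)
    # Tournament selection: keep the fitter of two random individuals
    i, j = rng.integers(0, len(pop), (2, len(pop)))
    parents = np.where((f[i] > f[j])[:, None], pop[i], pop[j])
    # Single-point crossover between consecutive parents
    cut = rng.integers(1, pop.shape[1])
    children = np.concatenate([parents[:, :cut],
                               np.roll(parents, 1, axis=0)[:, cut:]], axis=1)
    # Bit-flip mutation with low probability
    mask = rng.random(children.shape) < 0.01
    pop = np.where(mask, 1 - children, children)

print("best fitness:", fitness(pop).max(), "of", pop.shape[1])
```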
Teaching Computer Literacy with Freeware and Shareware.
ERIC Educational Resources Information Center
Hobart, R. Dale; And Others
1988-01-01
Describes workshops given at Ferris State University for faculty and staff who want to acquire computer skills. Considered are a computer literacy workshop and a software toolkit, made from public domain/shareware resources, that was distributed to participants. Stresses the benefits of shareware as an educational resource. (CW)
NASA Technical Reports Server (NTRS)
Bodley, C. S.; Devers, D. A.; Park, C. A.
1975-01-01
A theoretical development and an associated digital computer program system are presented. The dynamic system (spacecraft) is modeled as an assembly of rigid and/or flexible bodies not necessarily in a topological tree configuration. The computer program system may be used to investigate total system dynamic characteristics, including interaction effects between rigid and/or flexible bodies, control systems, and a wide range of environmental loadings. Additionally, the program system may be used for design of attitude control systems and for evaluation of total dynamic system performance, including time domain response and frequency domain stability analyses. Volume 1 presents the theoretical developments, including a description of the physical system, the equations of dynamic equilibrium, discussion of kinematics and system topology, a complete treatment of momentum wheel coupling, and a discussion of gravity gradient and environmental effects. Volume 2 is a program users' guide and includes a description of the overall digital program code and individual subroutines, and a description of required program input and generated program output. Volume 3 presents the results of selected demonstration problems that illustrate all program system capabilities.
NASA Technical Reports Server (NTRS)
Sharma, Naveen
1992-01-01
In this paper we briefly describe a combined symbolic and numeric approach for solving mathematical models on parallel computers. An experimental software system, PIER, is being developed in Common Lisp to synthesize computationally intensive and domain formulation dependent phases of finite element analysis (FEA) solution methods. Quantities for domain formulation like shape functions, element stiffness matrices, etc., are automatically derived using symbolic mathematical computations. The problem specific information and derived formulae are then used to generate (parallel) numerical code for FEA solution steps. A constructive approach to specify a numerical program design is taken. The code generator compiles application oriented input specifications into (parallel) FORTRAN77 routines with the help of built-in knowledge of the particular problem, numerical solution methods and the target computer.
Variation in school health policies and programs by demographic characteristics of US schools, 2006.
Balaji, Alexandra B; Brener, Nancy D; McManus, Tim
2010-12-01
To identify whether school health policies and programs vary by demographic characteristics of schools, using data from the School Health Policies and Programs Study (SHPPS) 2006. This study updates a similar study conducted with SHPPS 2000 data and assesses several additional policies and programs measured for the first time in SHPPS 2006. SHPPS 2006 assessed the status of 8 components of the coordinated school health model using a nationally representative sample of public, Catholic, and private schools at the elementary, middle, and high school levels. Data were collected from school faculty and staff using computer-assisted personal interviews and then linked with extant data on school characteristics. Results from a series of regression analyses indicated that a number of school policies and programs varied by school type (public, Catholic, or private), urbanicity, school size, discretionary dollars per pupil, percentage of white students, percentage of students qualifying for free lunch funds, and, among high schools, percentage of college-bound students. Catholic and private schools, smaller schools, and those with low discretionary dollars per pupil did not have as many key school health policies and programs as did schools that were public, larger, and had higher discretionary dollars per pupil. However, no single type of school had all key components of a coordinated school health program in place. Although some categories of schools had fewer policies and programs in place, all had both strengths and weaknesses. Regardless of school characteristics, all schools have the potential to implement a quality school health program. © Published 2010. This article is a US Government work and is in the public domain in the USA.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-07
...; Computer Matching Program (SSA/ Bureau of the Public Debt (BPD))--Match Number 1038 AGENCY: Social Security... as shown above. SUPPLEMENTARY INFORMATION: A. General The Computer Matching and Privacy Protection... containing SSNs extracted from the Supplemental Security Record database. Exchanges for this computer...
Report of the Advisory Panel to the Mathematical and Information Science Directorate
1988-04-01
…how to program S computers so that…
…Engineering, to expand the domain of behaviors we know how to program computers to perform to include more behaviors that previously only humans could do…
…technology? It is not easy to make clear the difference between making an advance in discovering how to program a behavior that no one knew how to program…
CIRCUS--A digital computer program for transient analysis of electronic circuits
NASA Technical Reports Server (NTRS)
Moore, W. T.; Steinbert, L. L.
1968-01-01
Computer program simulates the time domain response of an electronic circuit to an arbitrary forcing function. CIRCUS uses a charge-control parameter model to represent each semiconductor device. Given the primary photocurrent, the transient behavior of a circuit in a radiation environment is determined.
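The flavor of time-domain transient circuit analysis can be shown with a much smaller example than CIRCUS handles. The sketch below integrates a simple RC low-pass circuit driven by a step forcing function; CIRCUS itself uses charge-control semiconductor models, and the circuit and component values here are invented.

```python
import numpy as np

# Transient response of an RC low-pass circuit to a step input.
R, C = 1e3, 1e-6                 # 1 kOhm, 1 uF  ->  tau = RC = 1 ms
dt, tmax = 1e-6, 5e-3
t = np.arange(0.0, tmax, dt)
vin = np.where(t >= 0.5e-3, 5.0, 0.0)   # 5 V step applied at t = 0.5 ms

vout = np.zeros_like(t)
for k in range(1, t.size):
    # Forward-Euler integration of C * dV/dt = (Vin - Vout) / R
    vout[k] = vout[k - 1] + dt * (vin[k - 1] - vout[k - 1]) / (R * C)

print("output at t = %.1f ms: %.3f V" % (t[-1] * 1e3, vout[-1]))
```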
Another Program For Generating Interactive Graphics
NASA Technical Reports Server (NTRS)
Costenbader, Jay; Moleski, Walt; Szczur, Martha; Howell, David; Engelberg, Norm; Li, Tin P.; Misra, Dharitri; Miller, Philip; Neve, Leif; Wolf, Karl;
1991-01-01
VAX/Ultrix version of Transportable Applications Environment Plus (TAE+) computer program provides integrated, portable software environment for developing and running interactive window, text, and graphical-object-based application software systems. Enables programmer or nonprogrammer to construct easily custom software interface between user and application program and to move resulting interface program and its application program to different computers. When used throughout company for wide range of applications, makes both application program and computer seem transparent, with noticeable improvements in learning curve. Available in form suitable for following six different groups of computers: DEC VAX station and other VMS VAX computers, Macintosh II computers running AUX, Apollo Domain Series 3000, DEC VAX and reduced-instruction-set-computer workstations running Ultrix, Sun 3- and 4-series workstations running Sun OS and IBM RT/PC's and PS/2 computers running AIX, and HP 9000 S
Expansion of Tabulated Scattering Matrices in Generalized Spherical Functions
NASA Technical Reports Server (NTRS)
Mishchenko, Michael I.; Geogdzhayev, Igor V.; Yang, Ping
2016-01-01
An efficient way to solve the vector radiative transfer equation for plane-parallel turbid media is to Fourier-decompose it in azimuth. This methodology is typically based on the analytical computation of the Fourier components of the phase matrix and is predicated on the knowledge of the coefficients appearing in the expansion of the normalized scattering matrix in generalized spherical functions. Quite often the expansion coefficients have to be determined from tabulated values of the scattering matrix obtained from measurements or calculated by solving the Maxwell equations. In such cases one needs an efficient and accurate computer procedure converting a tabulated scattering matrix into the corresponding set of expansion coefficients. This short communication summarizes the theoretical basis of this procedure and serves as the user guide to a simple public-domain FORTRAN program.
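A simplified scalar analog of this conversion is the recovery of Legendre expansion coefficients of a tabulated phase function by Gauss-Legendre quadrature. The sketch below illustrates that reduced problem only; the paper's program handles the full generalized-spherical-function expansion of the 4x4 scattering matrix, and the test phase function here is invented.

```python
import numpy as np
from numpy.polynomial import legendre

# Recover coefficients f_l of f(x) = sum_l f_l P_l(x), with x = cos(theta),
# via f_l = (2l+1)/2 * integral of f(x) P_l(x) dx over [-1, 1].
nquad, lmax = 64, 8
x, w = legendre.leggauss(nquad)              # quadrature nodes and weights

def phase(x):
    return 1.0 + 0.5 * x                     # simple test phase function

coeffs = []
for l in range(lmax + 1):
    pl = legendre.legval(x, [0.0] * l + [1.0])          # P_l at the nodes
    coeffs.append((2 * l + 1) / 2.0 * np.sum(w * phase(x) * pl))

print(np.round(coeffs, 6))   # expect 1.0, 0.5, then ~0 for l >= 2
```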
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-09
... (PPRs) to capture quarterly and annual reports for each project type (Infrastructure, Public Computer... Information Collection; Comment Request; Broadband Technology Opportunities Program (BTOP) Quarterly and..., which included competitive grants to expand public computer center capacity and innovative programs to...
Cameron, Roy; Manske, Stephen; Brown, K. Stephen; Jolin, Mari Alice; Murnaghan, Donna; Lovato, Chris
2007-01-01
The Canadian Cancer Society and the National Cancer Institute of Canada have charged their Centre for Behavioral Research and Program Evaluation with contributing to the development of the country’s systemic capacity to link research, policy, and practice related to population-level interventions. Local data collection and feedback systems are integral to this capacity. Canada’s School Health Action Planning and Evaluation System (SHAPES) allows data to be collected from all of a school’s students, and these data are used to produce computer-generated school “health profiles.” SHAPES is being used for intervention planning, evaluation, surveillance, and research across Canada. Strong demand and multipartner investment suggest that SHAPES is adding value in all of these domains. Such systems can contribute substantially to evidence-informed public health practice, public engagement, participatory action research, and relevant, timely population intervention research. PMID:17329662
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hamlet, Jason R.; Keliiaa, Curtis M.
This report assesses current public domain cyber security practices with respect to cyber indications and warnings. It describes cybersecurity industry and government activities, including cybersecurity tools, methods, practices, and international and government-wide initiatives known to be impacting current practice. Of particular note are the U.S. Government's Trusted Internet Connection (TIC) and 'Einstein' programs, which are serving to consolidate the Government's internet access points and to provide some capability to monitor and mitigate cyber attacks. Next, this report catalogs activities undertaken by various industry and government entities. In addition, it assesses the benchmarks of HPC capability and other HPC attributes that may lend themselves to assist in the solution of this problem. This report draws few conclusions, as it is intended to assess current practice in preparation for future work; however, no explicit references to HPC usage for the purpose of analyzing cyber infrastructure in near-real-time were found in the current practice. This report and the related SAND2010-4766 National Cyber Defense High Performance Computing and Analysis: Concepts, Planning and Roadmap report are intended to provoke discussion throughout a broad audience about developing a cohesive HPC centric solution to wide-area cybersecurity problems.
NASA Technical Reports Server (NTRS)
Tranter, W. H.; Ziemer, R. E.; Fashano, M. J.
1975-01-01
This paper reviews the SYSTID technique for performance evaluation of communication systems using time-domain computer simulation. An example program illustrates the language. The inclusion of both Gaussian and impulse noise models makes accurate simulation possible in a wide variety of environments. A very flexible postprocessor makes accurate and efficient performance evaluation possible.
Overview of a Linguistic Theory of Design. AI Memo 383A.
ERIC Educational Resources Information Center
Miller, Mark L.; Goldstein, Ira P.
The SPADE theory, which uses linguistic formalisms to model the planning and debugging processes of computer programming, was simultaneously developed and tested in three separate contexts--computer uses in education, automatic programming (a traditional artificial intelligence arena), and protocol analysis (the domain of information processing…
41 CFR 105-64.110 - When may GSA establish computer matching programs?
Code of Federal Regulations, 2013 CFR
2013-07-01
... computer matching programs? 105-64.110 Section 105-64.110 Public Contracts and Property Management Federal... GSA establish computer matching programs? (a) System managers will establish computer matching... direction of the GSA Data Integrity Board that will be established when and if computer matching programs...
41 CFR 105-64.110 - When may GSA establish computer matching programs?
Code of Federal Regulations, 2012 CFR
2012-01-01
... computer matching programs? 105-64.110 Section 105-64.110 Public Contracts and Property Management Federal... GSA establish computer matching programs? (a) System managers will establish computer matching... direction of the GSA Data Integrity Board that will be established when and if computer matching programs...
41 CFR 105-64.110 - When may GSA establish computer matching programs?
Code of Federal Regulations, 2014 CFR
2014-01-01
... computer matching programs? 105-64.110 Section 105-64.110 Public Contracts and Property Management Federal... GSA establish computer matching programs? (a) System managers will establish computer matching... direction of the GSA Data Integrity Board that will be established when and if computer matching programs...
41 CFR 105-64.110 - When may GSA establish computer matching programs?
Code of Federal Regulations, 2010 CFR
2010-07-01
... computer matching programs? 105-64.110 Section 105-64.110 Public Contracts and Property Management Federal... GSA establish computer matching programs? (a) System managers will establish computer matching... direction of the GSA Data Integrity Board that will be established when and if computer matching programs...
41 CFR 105-64.110 - When may GSA establish computer matching programs?
Code of Federal Regulations, 2011 CFR
2011-01-01
... computer matching programs? 105-64.110 Section 105-64.110 Public Contracts and Property Management Federal... GSA establish computer matching programs? (a) System managers will establish computer matching... direction of the GSA Data Integrity Board that will be established when and if computer matching programs...
Proposal of an Extended Taxonomy of Serious Games for Health Rehabilitation.
Rego, Paula Alexandra; Moreira, Pedro Miguel; Reis, Luís Paulo
2018-06-29
Serious Games is a field of research that has evolved substantially, with valuable contributions to many application domains and areas. Patients often consider traditional rehabilitation approaches to be repetitive and boring, making it difficult for them to maintain their ongoing interest and assure the completion of the treatment program. Since the publication of our first taxonomy of Serious Games for Health Rehabilitation (SGHR), many studies have been published with game prototypes in this area. Based on a literature review, our goal is to propose an updated taxonomy taking into account the works, updates, and innovations in game criteria that have been researched since our first publication in 2010. In addition, we aim to present the validation mechanism used for the proposed extended taxonomy. Based on a literature review in the area and on an analysis of the contributions made by other researchers, we propose an extended taxonomy for SGHR. To validate the taxonomy proposal, a questionnaire was designed for a survey among experts in the area. An extended taxonomy for SGHR was proposed. Because we identified that, in general, beyond the mechanisms associated with the adoption of a given taxonomy, no validation mechanisms were reported for existing proposals, we designed a mechanism to validate our proposal. The mechanism uses a questionnaire addressed to a sample of researchers and professionals with experience and expertise in domains of knowledge interrelated with SGHR, such as Computer Graphics, Game Design, Interaction Design, Computer Programming, and Health Rehabilitation. The extended taxonomy proposal for health rehabilitation serious games provides the research community with a tool to fully characterize serious games. The mechanism designed for validating the taxonomy proposal is another contribution of this work.
Description of research interests and current work related to automating software design
NASA Technical Reports Server (NTRS)
Kaindl, Hermann
1992-01-01
Enclosed is a list of selected and recent publications. Most of these publications concern applied research in the areas of software engineering and human-computer interaction. It is felt that domain-specific knowledge plays a major role in software development. Additionally, it is believed that improvements in the general software development process (e.g., object-oriented approaches) will have to be combined with the use of large domain-specific knowledge bases.
Goals of Government-Funded Public Domain Software Efforts
Rishel, Wesley J.
1980-01-01
The development of public domain software under Federal aegis and support has made possible a broadly competitive field of computer - oriented management information system consulting organizations with high technical competence and the potential for strong user orientation and loyalty. The impact of this assumption of major “front-end costs” by the Federal government has additional spin-off effects in terms of standardization and transportability features as well as reduced capital costs to the user.
Automatic violence detection in digital movies
NASA Astrophysics Data System (ADS)
Fischer, Stephan
1996-11-01
Research on computer-based recognition of violence is scant. We are working on the automatic recognition of violence in digital movies, a first step toward the goal of a computer-assisted system capable of protecting children against TV programs containing a great deal of violence. In the video domain, collision detection and a model mapping to locate human figures are run, while in the audio domain, fingerprints are created and compared to find certain events. This article centers on the recognition of fist-fights in the video domain and on the recognition of shots, explosions, and cries in the audio domain.
Application of a sensitivity analysis technique to high-order digital flight control systems
NASA Technical Reports Server (NTRS)
Paduano, James D.; Downing, David R.
1987-01-01
A sensitivity analysis technique for multiloop flight control systems is studied. This technique uses the scaled singular values of the return difference matrix as a measure of the relative stability of a control system. It then uses the gradients of these singular values with respect to system and controller parameters to judge sensitivity. The sensitivity analysis technique is first reviewed; then it is extended to include digital systems, through the derivation of singular-value gradient equations. Gradients with respect to parameters which do not appear explicitly as control-system matrix elements are also derived, so that high-order systems can be studied. A complete review of the integrated technique is given by way of a simple example: the inverted pendulum problem. The technique is then demonstrated on the X-29 control laws. Results show that linear models of real systems can be analyzed by this sensitivity technique, if it is applied with care. A computer program called SVA was written to implement the singular-value sensitivity analysis technique. Thus computational methods and considerations form an integral part of many of the discussions. A user's guide to the program is included. SVA is a fully public-domain program, running on the NASA/Dryden Elxsi computer.
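The stability measure itself, the minimum singular value of the return difference matrix over frequency, is easy to sketch. The loop transfer matrix below is a made-up 2x2 example, not the X-29 control laws, and no scaling or gradient computation is shown.

```python
import numpy as np

# Minimum singular value of the return difference matrix I + L(jw) as a
# multiloop stability margin indicator.
def loop_tf(s):
    return np.array([[10.0 / (s + 1.0), 1.0 / (s + 5.0)],
                     [0.5 / (s + 2.0), 8.0 / (s + 0.5)]])

omega = np.logspace(-2, 2, 200)
min_sv = []
for w in omega:
    rd = np.eye(2) + loop_tf(1j * w)        # return difference matrix
    min_sv.append(np.linalg.svd(rd, compute_uv=False).min())

# The smallest value over frequency indicates the worst-case margin
k = int(np.argmin(min_sv))
print("worst-case min singular value %.3f at %.3f rad/s" % (min_sv[k], omega[k]))
```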
Program For Generating Interactive Displays
NASA Technical Reports Server (NTRS)
Costenbader, Jay; Moleski, Walt; Szczur, Martha; Howell, David; Engelberg, Norm; Li, Tin P.; Misra, Dharitri; Miller, Philip; Neve, Leif; Wolf, Karl;
1991-01-01
Sun/Unix version of Transportable Applications Environment Plus (TAE+) computer program provides integrated, portable software environment for developing and running interactive window, text, and graphical-object-based application software systems. Enables programmer or nonprogrammer to construct easily custom software interface between user and application program and to move resulting interface program and its application program to different computers. Plus viewed as productivity tool for application developers and application end users, who benefit from resultant consistent and well-designed user interface sheltering them from intricacies of computer. Available in form suitable for following six different groups of computers: DEC VAX station and other VMS VAX computers, Macintosh II computers running AUX, Apollo Domain Series 3000, DEC VAX and reduced-instruction-set-computer workstations running Ultrix, Sun 3- and 4-series workstations running Sun OS and IBM RT/PC and PS/2 compute
ERIC Educational Resources Information Center
Amodeo, Luiza B.; Martin, Jeanette
To a large extent the Southwest can be described as a rural area. Under these circumstances, programs for public understanding of technology become, first of all, exercises in logistics. In 1982, New Mexico State University introduced a program to inform teachers about computer technology. This program takes microcomputers into rural classrooms…
Designing the framework for competency-based master of public health programs in India.
Sharma, Kavya; Zodpey, Sanjay; Morgan, Alison; Gaidhane, Abhay; Syed, Zahiruddin Quazi; Kumar, Rajeev
2013-01-01
Competency in the practice of public health is the implicit goal of education institutions that offer master of public health (MPH) programs. With the expanding number of institutions offering courses in public health in India, it is timely to develop a common framework to ensure that graduates are proficient in critical public health. Steps such as situation assessment, a survey of public health care professionals in India, and national consultation were undertaken to develop a proposed competency-based framework for MPH programs in India. The existing curricula of all 23 Indian MPH courses vary significantly in content with regard to core, concentration, and crosscutting discipline areas and course durations. The competency or learning outcome is not well defined. The findings of the survey suggest that MPH graduates in India should have competencies ranging from monitoring of health problems and epidemics in the community, applying biostatistics in public health, conducting action research, understanding social and community influence on public health, developing indicators and instruments to monitor and evaluate community health programs, and developing proposals, to involving the community in planning, delivery, and monitoring of health programs. Competency statements were framed and mapped to domains including epidemiology, biostatistics, social and behavioral sciences, health care system, policy, planning, and financing, and environmental health sciences, plus a crosscutting domain that includes health communication and informatics, health management and leadership, professionalism, systems thinking, and public health biology. The proposed competency-based framework for Indian MPH programs can be adapted to meet the needs of diverse, unique programs. The framework ensures the uniqueness and diversity of individual MPH programs in India while contributing to measures of overall program success.
32 CFR 701.125 - Computer matching program.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 32 National Defense 5 2012-07-01 2012-07-01 false Computer matching program. 701.125 Section 701... OF THE NAVY DOCUMENTS AFFECTING THE PUBLIC DON Privacy Program § 701.125 Computer matching program. The DPO has responsibility for coordinating the approval of DOD's participation in Computer Matching...
32 CFR 701.125 - Computer matching program.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 32 National Defense 5 2014-07-01 2014-07-01 false Computer matching program. 701.125 Section 701... OF THE NAVY DOCUMENTS AFFECTING THE PUBLIC DON Privacy Program § 701.125 Computer matching program. The DPO has responsibility for coordinating the approval of DOD's participation in Computer Matching...
32 CFR 701.125 - Computer matching program.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 32 National Defense 5 2013-07-01 2013-07-01 false Computer matching program. 701.125 Section 701... OF THE NAVY DOCUMENTS AFFECTING THE PUBLIC DON Privacy Program § 701.125 Computer matching program. The DPO has responsibility for coordinating the approval of DOD's participation in Computer Matching...
32 CFR 701.125 - Computer matching program.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 32 National Defense 5 2010-07-01 2010-07-01 false Computer matching program. 701.125 Section 701... OF THE NAVY DOCUMENTS AFFECTING THE PUBLIC DON Privacy Program § 701.125 Computer matching program. The DPO has responsibility for coordinating the approval of DOD's participation in Computer Matching...
32 CFR 701.125 - Computer matching program.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 32 National Defense 5 2011-07-01 2011-07-01 false Computer matching program. 701.125 Section 701... OF THE NAVY DOCUMENTS AFFECTING THE PUBLIC DON Privacy Program § 701.125 Computer matching program. The DPO has responsibility for coordinating the approval of DOD's participation in Computer Matching...
A Project-Based Learning Setting to Human-Computer Interaction for Teenagers
ERIC Educational Resources Information Center
Geyer, Cornelia; Geisler, Stefan
2012-01-01
Knowledge of the fundamentals of human-computer interaction and usability engineering is becoming increasingly important in technical domains. However, this interdisciplinary field of work and the corresponding degree programs are not broadly known. Therefore, at the Hochschule Ruhr West, University of Applied Sciences, a program was developed to give…
An Object-Oriented Approach to Writing Computational Electromagnetics Codes
NASA Technical Reports Server (NTRS)
Zimmerman, Martin; Mallasch, Paul G.
1996-01-01
Presently, most computer software development in the Computational Electromagnetics (CEM) community employs the structured programming paradigm, particularly using the Fortran language. Other segments of the software community began switching to an Object-Oriented Programming (OOP) paradigm in recent years to help ease design and development of highly complex codes. This paper examines design of a time-domain numerical analysis CEM code using the OOP paradigm, comparing OOP code and structured programming code in terms of software maintenance, portability, flexibility, and speed.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-07
... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2010-0034] Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ Bureau of the Public Debt (BPD))--Match Number 1304 AGENCY: Social Security... as shown above. SUPPLEMENTARY INFORMATION: A. General The Computer Matching and Privacy Protection...
SBION: A Program for Analyses of Salt-Bridges from Multiple Structure Files.
Gupta, Parth Sarthi Sen; Mondal, Sudipta; Mondal, Buddhadev; Islam, Rifat Nawaz Ul; Banerjee, Shyamashree; Bandyopadhyay, Amal K
2014-01-01
Salt-bridges and network salt-bridges are specific electrostatic interactions that contribute to the overall stability of proteins. In the hierarchical protein folding model, these interactions play a crucial role in the nucleation process. The advent and growth of the protein structure database and its availability in the public domain have created an urgent need for context-dependent rapid analysis of salt-bridges. While these analyses are cumbersome and time-consuming for a single protein, batch analyses need efficient software for rapid topological scans of a large number of proteins to extract details on (i) the fraction of salt-bridge residues (acidic and basic), (ii) chain-specific intra-molecular salt-bridges, (iii) inter-molecular salt-bridges (protein-protein interactions) in all possible binary combinations, (iv) network salt-bridges, and (v) the secondary structure distribution of salt-bridge residues. To the best of our knowledge, such efficient software is not available in the public domain. At this juncture, we have developed a program, SBION, which can perform all the above-mentioned computations for any number of proteins with any number of chains at any given ion-pair distance. It is highly efficient, fast, error-free, and user friendly. Finally, we would say that SBION indeed possesses potential for applications in the field of structural and comparative bioinformatics studies. SBION is freely available for non-commercial/academic institutions on formal request to the corresponding author (akbanerjee@biotech.buruniv.ac.in).
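The core geometric test that such software automates, flagging an ion pair when an acidic side-chain oxygen lies within a cutoff distance of a basic side-chain nitrogen, is simple to sketch. The residue names and coordinates below are invented; a real run would parse them from PDB files, and the cutoff is a commonly used choice rather than SBION's documented default.

```python
import numpy as np

# Flag salt bridges as acidic/basic charged-group atom pairs within a cutoff.
CUTOFF = 4.0  # Angstroms, a commonly used ion-pair distance

acidic = {  # residue -> charged-group atom coordinates (invented)
    "ASP12": np.array([1.0, 2.0, 3.0]),
    "GLU45": np.array([8.0, 1.0, 0.0]),
}
basic = {
    "LYS33": np.array([2.5, 3.0, 4.5]),
    "ARG90": np.array([20.0, 5.0, 7.0]),
}

for a_res, a_xyz in acidic.items():
    for b_res, b_xyz in basic.items():
        d = np.linalg.norm(a_xyz - b_xyz)
        if d <= CUTOFF:
            print("salt bridge: %s - %s (%.2f A)" % (a_res, b_res, d))
```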
Computer program for thin-wire structures in a homogeneous conducting medium
NASA Technical Reports Server (NTRS)
Richmond, J. H.
1974-01-01
A computer program is presented for thin-wire antennas and scatterers in a homogeneous conducting medium. The analysis is performed in the real or complex frequency domain. The program handles insulated and bare wires with finite conductivity and lumped loads. The output data includes the current distribution, impedance, radiation efficiency, gain, absorption cross section, scattering cross section, echo area, and the polarization scattering matrix. The program uses sinusoidal bases and Galerkin's method.
32 CFR 505.13 - Computer Matching Agreement Program.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 32 National Defense 3 2013-07-01 2013-07-01 false Computer Matching Agreement Program. 505.13... AUTHORITIES AND PUBLIC RELATIONS ARMY PRIVACY ACT PROGRAM § 505.13 Computer Matching Agreement Program. (a) General provisions. (1) Pursuant to the Privacy Act and this part, DA records may be subject to computer...
32 CFR 505.13 - Computer Matching Agreement Program.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 32 National Defense 3 2012-07-01 2009-07-01 true Computer Matching Agreement Program. 505.13... AUTHORITIES AND PUBLIC RELATIONS ARMY PRIVACY ACT PROGRAM § 505.13 Computer Matching Agreement Program. (a) General provisions. (1) Pursuant to the Privacy Act and this part, DA records may be subject to computer...
32 CFR 505.13 - Computer Matching Agreement Program.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 32 National Defense 3 2014-07-01 2014-07-01 false Computer Matching Agreement Program. 505.13... AUTHORITIES AND PUBLIC RELATIONS ARMY PRIVACY ACT PROGRAM § 505.13 Computer Matching Agreement Program. (a) General provisions. (1) Pursuant to the Privacy Act and this part, DA records may be subject to computer...
32 CFR 505.13 - Computer Matching Agreement Program.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 32 National Defense 3 2011-07-01 2009-07-01 true Computer Matching Agreement Program. 505.13... AUTHORITIES AND PUBLIC RELATIONS ARMY PRIVACY ACT PROGRAM § 505.13 Computer Matching Agreement Program. (a) General provisions. (1) Pursuant to the Privacy Act and this part, DA records may be subject to computer...
1DTempPro: analyzing temperature profiles for groundwater/surface-water exchange.
Voytek, Emily B; Drenkelfuss, Anja; Day-Lewis, Frederick D; Healy, Richard; Lane, John W; Werkema, Dale
2014-01-01
A new computer program, 1DTempPro, is presented for the analysis of vertical one-dimensional (1D) temperature profiles under saturated flow conditions. 1DTempPro is a graphical user interface to the U.S. Geological Survey code Variably Saturated 2-Dimensional Heat Transport (VS2DH), which numerically solves the flow and heat-transport equations. Pre- and postprocessor features allow the user to calibrate VS2DH models to estimate vertical groundwater/surface-water exchange and also hydraulic conductivity for cases where hydraulic head is known. Published 2013. This article is a U.S. Government work and is in the public domain in the USA.
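The forward problem that 1DTempPro calibrates can be sketched as one-dimensional advection-diffusion of heat through a streambed. The explicit finite-difference scheme and all parameter values below are invented for illustration; VS2DH solves the full coupled flow and heat-transport equations.

```python
import numpy as np

# 1D advection-diffusion of heat with a diurnal surface boundary condition.
nz, dz, dt = 50, 0.02, 10.0            # grid cells, cell size (m), time step (s)
kappa = 1e-6                           # thermal diffusivity, m^2/s
v = 1e-6                               # downward thermal front velocity, m/s
T = np.full(nz, 10.0)                  # initial temperature profile, deg C

for step in range(8640):               # one day of 10 s steps
    hour = step * dt / 3600.0
    T[0] = 10.0 + 5.0 * np.sin(2 * np.pi * hour / 24.0)   # diurnal surface signal
    diff = kappa * (T[2:] - 2 * T[1:-1] + T[:-2]) / dz**2
    adv = -v * (T[1:-1] - T[:-2]) / dz                     # upwind advection
    T[1:-1] += dt * (diff + adv)
    T[-1] = T[-2]                       # zero-gradient bottom boundary

print("temperature at depth %.2f m: %.2f degC" % (10 * dz, T[10]))
```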
Song, Hyun-Seob; Goldberg, Noam; Mahajan, Ashutosh; Ramkrishna, Doraiswami
2017-08-01
Elementary (flux) modes (EMs) have served as a valuable tool for investigating structural and functional properties of metabolic networks. Identification of the full set of EMs in genome-scale networks remains challenging due to the combinatorial explosion of EMs in complex networks. Often, however, only a small subset of relevant EMs needs to be known, for which optimization-based sequential computation is a useful alternative. Most of the currently available methods along this line are based on the iterative use of mixed integer linear programming (MILP), the effectiveness of which significantly deteriorates as the number of iterations builds up. To alleviate the computational burden associated with the MILP implementation, we here present a novel optimization algorithm termed alternate integer linear programming (AILP). Our algorithm was designed to iteratively solve a pair of integer programming (IP) and linear programming (LP) problems to compute EMs in a sequential manner. In each step, the IP identifies a minimal subset of reactions, the deletion of which disables all previously identified EMs. Thus, a subsequent LP solution subject to this reaction deletion constraint becomes a distinct EM. In cases where no feasible LP solution is available, IP-derived reaction deletion sets represent minimal cut sets (MCSs). Despite the additional computation of MCSs, AILP achieved significant time reduction in computing EMs, by orders of magnitude. The proposed AILP algorithm not only offers a computational advantage in the EM analysis of genome-scale networks, but also improves the understanding of the linkage between EMs and MCSs. The software is implemented in Matlab and is provided as supplementary information. Contact: hyunseob.song@pnnl.gov. Supplementary data are available at Bioinformatics online. Published by Oxford University Press 2017. This work is written by US Government employees and is in the public domain in the US.
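The LP half of such an iteration can be sketched on a toy network: given a stoichiometric matrix and a set of reactions forced to zero (as the IP step would prescribe), find a steady-state flux vector. The network, the deletion set, and the anchored flux below are invented, and the full AILP bookkeeping (nullifying previous EMs, detecting MCSs) is omitted.

```python
import numpy as np
from scipy.optimize import linprog

# One LP step: find a steady-state flux (S v = 0, v >= 0) with some
# reactions deleted, mimicking the constraint an IP step would impose.
S = np.array([[ 1, -1,  0, -1,  0],       # rows: metabolites, cols: reactions
              [ 0,  1, -1,  0,  0],
              [ 0,  0,  0,  1, -1]])
n = S.shape[1]
deleted = {1}                             # reactions knocked out by the IP step

bounds = [(0, 0) if j in deleted else (0, 10) for j in range(n)]
bounds[0] = (1, 1)                        # anchor: fix the uptake flux to 1
res = linprog(c=np.ones(n), A_eq=S, b_eq=np.zeros(S.shape[0]),
              bounds=bounds, method="highs")

if res.success:
    print("candidate mode flux:", np.round(res.x, 3))
else:
    print("infeasible: the deletion set is a cut set for this anchor")
```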
Third CLIPS Conference Proceedings, volume 1
NASA Technical Reports Server (NTRS)
Riley, Gary (Editor)
1994-01-01
Expert systems are computer programs which emulate human expertise in well defined problem domains. The potential payoff from expert systems is high: valuable expertise can be captured and preserved, repetitive and/or mundane tasks requiring human expertise can be automated, and uniformity can be applied in decision making processes. The C Language Integrated Production System (CLIPS) is an expert system building tool, developed at the Johnson Space Center, which provides a complete environment for the development and delivery of rule and/or object based expert systems. CLIPS was specifically designed to provide a low cost option for developing and deploying expert system applications across a wide range of hardware platforms. The development of CLIPS has helped to improve the ability to deliver expert systems technology throughout the public and private sectors for a wide range of applications and diverse computing environments.
Research in mathematical theory of computation. [computer programming applications
NASA Technical Reports Server (NTRS)
Mccarthy, J.
1973-01-01
Research progress in the following areas is reviewed: (1) a new version of the computer program LCF (logic for computable functions), including a facility to search for proofs automatically; (2) the description of the language PASCAL in terms of both LCF and first order logic; (3) discussion of LISP semantics in LCF and an attempt to prove the correctness of the London compilers in a formal way; (4) design of both special purpose and domain independent proving procedures with program correctness specifically in mind; (5) design of languages for describing such proof procedures; and (6) the embedding of these ideas in the first order checker.
NASA Astrophysics Data System (ADS)
Iwasawa, Masaki; Tanikawa, Ataru; Hosono, Natsuki; Nitadori, Keigo; Muranushi, Takayuki; Makino, Junichiro
2016-08-01
We present the basic idea, implementation, measured performance, and performance model of FDPS (Framework for Developing Particle Simulators). FDPS is an application-development framework which helps researchers to develop simulation programs using particle methods for large-scale distributed-memory parallel supercomputers. A particle-based simulation program for distributed-memory parallel computers needs to perform domain decomposition, exchange of particles which are not in the domain of each computing node, and gathering of the particle information in other nodes which are necessary for interaction calculation. Also, even if distributed-memory parallel computers are not used, in order to reduce the amount of computation, algorithms such as the Barnes-Hut tree algorithm or the Fast Multipole Method should be used in the case of long-range interactions. For short-range interactions, some methods to limit the calculation to neighbor particles are required. FDPS provides all of these functions which are necessary for efficient parallel execution of particle-based simulations as "templates," which are independent of the actual data structure of particles and the functional form of the particle-particle interaction. By using FDPS, researchers can write their programs with the amount of work necessary to write a simple, sequential and unoptimized program of O(N²) calculation cost, and yet the program, once compiled with FDPS, will run efficiently on large-scale parallel supercomputers. A simple gravitational N-body program can be written in around 120 lines. We report the actual performance of these programs and the performance model. The weak scaling performance is very good, and almost linear speed-up was obtained for up to the full system of the K computer. The minimum calculation time per timestep is in the range of 30 ms (N = 10⁷) to 300 ms (N = 10⁹). These are currently limited by the time for the calculation of the domain decomposition and communication necessary for the interaction calculation. We discuss how we can overcome these bottlenecks.
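The "simple, sequential and unoptimized O(N²) program" that serves as the starting point for such a framework looks roughly like the following Python sketch of direct-summation gravity with a leapfrog integrator. It is a generic illustration, not FDPS code (FDPS is C++), and the particle count, units, and softening are arbitrary.

```python
import numpy as np

# Direct-summation O(N^2) gravity with a leapfrog (kick-drift-kick) integrator.
rng = np.random.default_rng(42)
n, dt, eps2 = 256, 1e-3, 1e-4
pos = rng.normal(size=(n, 3))
vel = np.zeros((n, 3))
mass = np.full(n, 1.0 / n)

def accel(pos):
    dx = pos[None, :, :] - pos[:, None, :]          # pairwise separations
    r2 = (dx ** 2).sum(-1) + eps2                   # softened squared distances
    inv_r3 = r2 ** -1.5
    np.fill_diagonal(inv_r3, 0.0)                   # no self-interaction
    return (mass[None, :, None] * dx * inv_r3[:, :, None]).sum(axis=1)

for step in range(100):
    a = accel(pos)
    vel += 0.5 * dt * a
    pos += dt * vel
    vel += 0.5 * dt * accel(pos)

print("kinetic energy:", 0.5 * (mass[:, None] * vel ** 2).sum())
```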
NASA Technical Reports Server (NTRS)
Richmond, J. H.
1974-01-01
A computer program is presented for a thin-wire antenna over a perfect ground plane. The analysis is performed in the frequency domain, and the exterior medium is free space. The antenna may have finite conductivity and lumped loads. The output data includes the current distribution, impedance, radiation efficiency, and gain. The program uses sinusoidal bases and Galerkin's method.
PHoToNs–A parallel heterogeneous and threads oriented code for cosmological N-body simulation
NASA Astrophysics Data System (ADS)
Wang, Qiao; Cao, Zong-Yan; Gao, Liang; Chi, Xue-Bin; Meng, Chen; Wang, Jie; Wang, Long
2018-06-01
We introduce a new code for cosmological simulations, PHoToNs, which incorporates features for performing massive cosmological simulations on heterogeneous high performance computer (HPC) systems and threads oriented programming. PHoToNs adopts a hybrid scheme to compute gravitational force, with the conventional Particle-Mesh (PM) algorithm to compute the long-range force, the Tree algorithm to compute the short-range force, and the direct-summation Particle-Particle (PP) algorithm to compute gravity from very close particles. A self-similar space-filling Peano-Hilbert curve is used to decompose the computing domain. Threads programming is used to flexibly manage the domain communication, PM calculation, and synchronization, as well as Dual Tree Traversal on the CPU+MIC platform. PHoToNs scales well, and the efficiency of the PP kernel achieves 68.6% of peak performance on MIC and 74.4% on CPU platforms. We also test the accuracy of the code against the widely used Gadget-2 and find excellent agreement.
LTPP Computed Parameter: Moisture Content
DOT National Transportation Integrated Search
2008-01-01
A study was conducted to compute in situ soil parameters based on time domain reflectometry (TDR) traces obtained from Long Term Pavement Performance (LTPP) test sections instrumented for the seasonal monitoring program (SMP). Ten TDR sensors were in...
TCP/IP Interface for the Satellite Orbit Analysis Program (SOAP)
NASA Technical Reports Server (NTRS)
Carnright, Robert; Stodden, David; Coggi, John
2009-01-01
The Transmission Control Protocol/Internet Protocol (TCP/IP) interface for the Satellite Orbit Analysis Program (SOAP) provides the means for the software to establish real-time interfaces with other software. Such interfaces can operate between two programs, either on the same computer or on different computers joined by a network. The SOAP TCP/IP module employs a client/server interface in which SOAP is the server and other applications can be clients. Real-time interfaces between programs offer a number of advantages over embedding all of the common functionality within a single program. One advantage is that they allow the computational labor to be divided between the processors or computers running the separate applications. Another is that each program can provide its own domain of expertise, which other programs can then use.
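As an illustration of the client side of such a client/server interface, here is a generic TCP client sketch in Python; the host, port, and line-oriented command format are assumptions, since the abstract does not document SOAP's actual wire protocol.

```python
import socket

# Generic TCP client of the kind a client application of a server such as SOAP
# would implement (sketch; host, port, and the command string are placeholders).
def query_server(host="localhost", port=5000, command="GET_STATE\n"):
    with socket.create_connection((host, port), timeout=5.0) as sock:
        sock.sendall(command.encode("ascii"))     # send one request message
        return sock.recv(4096).decode("ascii")    # read one response message
```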
Applications of Out-of-Domain Knowledge in Students' Reasoning about Computer Program State
ERIC Educational Resources Information Center
Lewis, Colleen Marie
2012-01-01
To meet a growing demand and a projected deficit in the supply of computer professionals (NCWIT, 2009), it is of vital importance to expand students' access to computer science. However, many researchers in the computer science education community unproductively assume that some students lack an innate ability for computer science and…
Ayres, Daniel L; Darling, Aaron; Zwickl, Derrick J; Beerli, Peter; Holder, Mark T; Lewis, Paul O; Huelsenbeck, John P; Ronquist, Fredrik; Swofford, David L; Cummings, Michael P; Rambaut, Andrew; Suchard, Marc A
2012-01-01
Phylogenetic inference is fundamental to our understanding of most aspects of the origin and evolution of life, and in recent years, there has been a concentration of interest in statistical approaches such as Bayesian inference and maximum likelihood estimation. Yet, for large data sets and realistic or interesting models of evolution, these approaches remain computationally demanding. High-throughput sequencing can yield data for thousands of taxa, but scaling to such problems using serial computing often necessitates the use of nonstatistical or approximate approaches. The recent emergence of graphics processing units (GPUs) provides an opportunity to leverage their excellent floating-point computational performance to accelerate statistical phylogenetic inference. A specialized library for phylogenetic calculation would allow existing software packages to make more effective use of available computer hardware, including GPUs. Adoption of a common library would also make it easier for other emerging computing architectures, such as field programmable gate arrays, to be used in the future. We present BEAGLE, an application programming interface (API) and library for high-performance statistical phylogenetic inference. The API provides a uniform interface for performing phylogenetic likelihood calculations on a variety of compute hardware platforms. The library includes a set of efficient implementations and can currently exploit hardware including GPUs using NVIDIA CUDA, central processing units (CPUs) with Streaming SIMD Extensions and related processor supplementary instruction sets, and multicore CPUs via OpenMP. To demonstrate the advantages of a common API, we have incorporated the library into several popular phylogenetic software packages. The BEAGLE library is free open source software licensed under the Lesser GPL and available from http://beagle-lib.googlecode.com. An example client program is available as public domain software.
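The per-node computation that BEAGLE accelerates is the pruning (partial-likelihood) recursion. The sketch below shows that recursion in Python under the Jukes-Cantor substitution model for a single site with a two-child node; it illustrates the algorithm only and does not use the BEAGLE API.

```python
import numpy as np

# Felsenstein pruning step under the Jukes-Cantor model: the per-node
# likelihood computation that libraries such as BEAGLE accelerate on GPUs.
def jc_transition(t):
    p_same = 0.25 + 0.75 * np.exp(-4.0 * t / 3.0)
    P = np.full((4, 4), (1.0 - p_same) / 3.0)
    np.fill_diagonal(P, p_same)
    return P

def node_partials(left, t_left, right, t_right):
    # partial likelihoods: elementwise product of each child's contribution
    return (jc_transition(t_left) @ left) * (jc_transition(t_right) @ right)

tip_a = np.array([1.0, 0, 0, 0])   # observed state A at one tip
tip_c = np.array([0, 1.0, 0, 0])   # observed state C at the other tip
root = node_partials(tip_a, 0.1, tip_c, 0.1)
print(np.log(0.25 * root.sum()))   # site log-likelihood, uniform root frequencies
```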
NASA Technical Reports Server (NTRS)
Nguyen, D. T.; Watson, Willie R. (Technical Monitor)
2005-01-01
The overall objectives of this research work are to formulate and validate efficient parallel algorithms, and to efficiently design and implement computer software for solving large-scale acoustic problems arising from the unified framework of finite element procedures. The adopted parallel Finite Element (FE) Domain Decomposition (DD) procedures should take full advantage of the multiple processing capabilities offered by most modern high-performance computing platforms for efficient parallel computation. To achieve this objective, the formulation needs to integrate efficient sparse (and dense) assembly techniques, hybrid (or mixed) direct and iterative equation solvers, proper preconditioning strategies, unrolling strategies, and effective interprocessor communication schemes. Finally, the numerical performance of the developed parallel finite element procedures will be evaluated by solving a series of structural and acoustic (symmetric and unsymmetric) problems on different computing platforms. Comparisons with existing "commercialized" and/or "public domain" software are also included, whenever possible.
A scoping review of cloud computing in healthcare.
Griebel, Lena; Prokosch, Hans-Ulrich; Köpcke, Felix; Toddenroth, Dennis; Christoph, Jan; Leb, Ines; Engel, Igor; Sedlmayr, Martin
2015-03-19
Cloud computing is a recent and fast-growing area of development in healthcare. Ubiquitous, on-demand access to virtually endless resources in combination with a pay-per-use model allows for new ways of developing, delivering, and using services. Cloud computing is often used in an "OMICS" context, e.g. for computing in genomics, proteomics, and molecular medicine, while other fields of application still seem to be underrepresented. Thus, the objective of this scoping review was to identify the current state and hot topics in research on cloud computing in healthcare beyond this traditional domain. MEDLINE was searched in July 2013 and in December 2014 for publications containing the terms "cloud computing" and "cloud-based". Each journal and conference article was categorized and summarized independently by two researchers, who then consolidated their findings. 102 publications were analyzed and 6 main topics were found: telemedicine/teleconsultation, medical imaging, public health and patient self-management, hospital management and information systems, therapy, and secondary use of data. Commonly used features are broad network access for sharing and accessing data and rapid elasticity to dynamically adapt to computing demands. Eight articles favor the pay-for-use characteristics of cloud-based services, which avoid upfront investments. Nevertheless, while 22 articles present very general potentials of cloud computing in the medical domain and 66 articles describe conceptual or prototypic projects, only 14 articles report on successful implementations. Further, in many articles cloud computing is seen as an analogy to internet-/web-based data sharing, and the characteristics of the particular cloud computing approach are unfortunately not clearly described. Even though cloud computing in healthcare is of growing interest, only a few successful implementations exist so far, and many papers simply use the term "cloud" synonymously with "using virtual machines" or "web-based" with no described benefit of the cloud paradigm. The biggest threat to adoption in the healthcare domain arises from involving external cloud partners: many issues of data safety and security remain to be solved. Until then, cloud computing is favored more for singular, individual features such as elasticity, pay-per-use, and broad network access, rather than as a paradigm in its own right.
1998-01-01
usually written up by Logistics or Maintenance (4790 is the Maintenance "Bible"). If need be, and if resources are available, one could collect all... (Public domain) SATAN (System Administration Tool for Analyzing Networks) (Public Domain) STAT (Security Test and Analysis Tool) (Harris Corporation)... Service-Filtering Tools: 1. TCP/IP wrapper program. Tools to Scan Hosts for Known Vulnerabilities: 1. ISS (Internet Security Scanner) 2. SATAN (Security...
Ontology-Oriented Programming for Biomedical Informatics.
Lamy, Jean-Baptiste
2016-01-01
Ontologies are now widely used in the biomedical domain. However, it is difficult to manipulate ontologies in a computer program and, consequently, it is not easy to integrate ontologies with databases or websites. Two main approaches have been proposed for accessing ontologies in a computer program: the traditional API (Application Programming Interface) approach and ontology-oriented programming, either static or dynamic. In this paper, we review these approaches and discuss their appropriateness for biomedical ontologies. We also present experience feedback on the integration of an ontology into computer software during the VIIIP research project. Finally, we present OwlReady, the solution we developed.
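A minimal taste of the ontology-oriented style, assuming the current Owlready distribution (the owlready2 Python package) and a placeholder local OWL file:

```python
# Ontology-oriented programming sketch (assumes the owlready2 package, the
# current distribution of OwlReady; the OWL file path is a placeholder).
from owlready2 import get_ontology

onto = get_ontology("file:///tmp/example.owl").load()
for cls in onto.classes():        # ontology classes behave like Python classes
    print(cls.name, cls.is_a)     # class name and its direct superclasses
```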
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, MooHyun
2014-08-01
This report presents the development of offshore anchor data sets which are intended to be used to develop a database that allows preliminary selection and sizing of anchors for the conceptual design of floating offshore wind turbines (FOWTs). The study is part of a project entitled "Development of Mooring-Anchor Program in Public Domain for Coupling with Floater Program for FOWTs (Floating Offshore Wind Turbines)", under the direction of Dr. Moo-Hyun Kim at Texas A&M University and with sponsorship from the US Department of Energy (Contract No. DE-EE0005479, CFDA # 81.087 for DE-FOA-0000415, Topic Area 1.3: Subsurface Mooring and Anchoring Dynamics Models).
Sampling for Explosives Residues at Fort Greely, Alaska. Reconnaissance Visit July 2000
2001-11-01
on lands withdrawn from the public domain under the Military Lands Withdrawal Act (PL 106-65). The Army has pledged to implement a program to... from the public domain under the Military Lands Withdrawal Act (Public Law 106-65); the withdrawal of land was recently renewed. As part of the... option fuse; Pellet booster: Comp A5 (RDX (98.5%) and stearic acid (1.5%)), 8 g; Lead charge: PBXN-5 (HMX 95% and binder 5%), 152 mg; Detonator assembly: HMX
NASA Technical Reports Server (NTRS)
Bailey, R. T.; Shih, T. I.-P.; Nguyen, H. L.; Roelke, R. J.
1990-01-01
An efficient computer program, called GRID2D/3D, was developed to generate single and composite grid systems within geometrically complex two- and three-dimensional (2- and 3-D) spatial domains that can deform with time. GRID2D/3D generates single grid systems by using algebraic grid generation methods based on transfinite interpolation in which the distribution of grid points within the spatial domain is controlled by stretching functions. All single grid systems generated by GRID2D/3D can have grid lines that are continuous and differentiable everywhere up to the second-order. Also, grid lines can intersect boundaries of the spatial domain orthogonally. GRID2D/3D generates composite grid systems by patching together two or more single grid systems. The patching can be discontinuous or continuous. For continuous composite grid systems, the grid lines are continuous and differentiable everywhere up to the second-order except at interfaces where different single grid systems meet. At interfaces where different single grid systems meet, the grid lines are only differentiable up to the first-order. For 2-D spatial domains, the boundary curves are described by using either cubic or tension spline interpolation. For 3-D spatial domains, the boundary surfaces are described by using either linear Coon's interpolation, bi-hyperbolic spline interpolation, or a new technique referred to as 3-D bi-directional Hermite interpolation. Since grid systems generated by algebraic methods can have grid lines that overlap one another, GRID2D/3D contains a graphics package for evaluating the grid systems generated. With the graphics package, the user can generate grid systems in an interactive manner with the grid generation part of GRID2D/3D. GRID2D/3D is written in FORTRAN 77 and can be run on any IBM PC, XT, or AT compatible computer. In order to use GRID2D/3D on workstations or mainframe computers, some minor modifications must be made in the graphics part of the program; no modifications are needed in the grid generation part of the program. The theory and method used in GRID2D/3D is described.
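The core of the algebraic method is transfinite (Coons) interpolation of the four boundary curves. A minimal 2-D sketch in Python follows; the quarter-annulus boundary curves are toy examples, not GRID2D/3D's spline machinery or stretching functions.

```python
import numpy as np

# 2-D transfinite (Coons) interpolation, the algebraic method that grid
# generators such as GRID2D/3D build on (sketch; boundaries are toy examples).
def tfi_grid(bottom, top, left, right, ni=11, nj=11):
    grid = np.zeros((ni, nj, 2))
    for i, xi in enumerate(np.linspace(0.0, 1.0, ni)):
        for j, et in enumerate(np.linspace(0.0, 1.0, nj)):
            grid[i, j] = ((1-et)*bottom(xi) + et*top(xi)
                          + (1-xi)*left(et) + xi*right(et)
                          - ((1-xi)*(1-et)*bottom(0.0) + xi*(1-et)*bottom(1.0)
                             + (1-xi)*et*top(0.0) + xi*et*top(1.0)))
    return grid

bottom = lambda s: np.array([np.cos(s*np.pi/2), np.sin(s*np.pi/2)])      # inner arc
top    = lambda s: 2.0*np.array([np.cos(s*np.pi/2), np.sin(s*np.pi/2)])  # outer arc
left   = lambda s: np.array([1.0 + s, 0.0])    # radial edge at xi = 0
right  = lambda s: np.array([0.0, 1.0 + s])    # radial edge at xi = 1
print(tfi_grid(bottom, top, left, right).shape)  # (11, 11, 2)
```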
Public health situation awareness: toward a semantic approach
NASA Astrophysics Data System (ADS)
Mirhaji, Parsa; Richesson, Rachel L.; Turley, James P.; Zhang, Jiajie; Smith, Jack W.
2004-04-01
We propose a knowledge-based public health situation awareness system. The basis for this system is an explicit representation of public health situation awareness concepts and their interrelationships. This representation is based upon the users' (public health decision makers') cognitive model of the world, and optimized towards the efficacy of performance and relevance to the public health situation awareness processes and tasks. In our approach, explicit domain knowledge is the foundation for interpretation of public health data, as opposed to conventional systems where statistical methods are the essence of the processes. Objectives: To develop a prototype knowledge-based system for public health situation awareness and to demonstrate the utility of knowledge-intensive approaches in integrating heterogeneous information, eliminating the effects of incomplete and poor-quality surveillance data, handling uncertainty in syndrome and aberration detection, and visualizing complex information structures in public health surveillance settings, particularly in the context of bioterrorism (BT) preparedness. The system employs the Resource Description Framework (RDF) and additional layers of more expressive languages to explicate the knowledge of domain experts into machine-interpretable and computable problem-solving modules that can then guide users and computer systems in sifting through the most "relevant" data for syndrome and outbreak detection and investigation of the root cause of an event. The Center for Biosecurity and Public Health Informatics Research is developing a prototype knowledge-based system around influenza, which has complex natural disease patterns, many public health implications, and is a potential agent for bioterrorism. The preliminary data from this effort may demonstrate superior performance in information integration, syndrome and aberration detection, information access through information visualization, and cross-domain investigation of the root causes of public health events.
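As a small illustration of the RDF layer such a system builds on, the sketch below records one surveillance fact as triples using the rdflib Python package; the namespace and terms are invented, not the system's actual vocabulary.

```python
# Recording a surveillance observation as RDF triples (sketch; assumes the
# rdflib package, and the namespace and terms are illustrative only).
from rdflib import Graph, Literal, Namespace, RDF

PH = Namespace("http://example.org/publichealth#")
g = Graph()
g.add((PH.case42, RDF.type, PH.InfluenzaLikeIllness))
g.add((PH.case42, PH.reportedBy, PH.clinicA))
g.add((PH.case42, PH.onsetDate, Literal("2004-02-14")))
print(g.serialize(format="turtle"))   # machine-interpretable representation
```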
Hypercube matrix computation task
NASA Technical Reports Server (NTRS)
Calalo, Ruel H.; Imbriale, William A.; Jacobi, Nathan; Liewer, Paulett C.; Lockhart, Thomas G.; Lyzenga, Gregory A.; Lyons, James R.; Manshadi, Farzin; Patterson, Jean E.
1988-01-01
A major objective of the Hypercube Matrix Computation effort at the Jet Propulsion Laboratory (JPL) is to investigate the applicability of a parallel computing architecture to the solution of large-scale electromagnetic scattering problems. Three scattering analysis codes are being implemented and assessed on a JPL/California Institute of Technology (Caltech) Mark 3 Hypercube. The codes, which utilize different underlying algorithms, give a means of evaluating the general applicability of this parallel architecture. The three analysis codes being implemented are a frequency-domain method of moments code, a time-domain finite difference code, and a frequency-domain finite elements code. These analysis capabilities are being integrated into an electromagnetics interactive analysis workstation which can serve as a design tool for the construction of antennas and other radiating or scattering structures. The first two years of work on the Hypercube Matrix Computation effort are summarized. The summary includes both new developments and results as well as work previously reported in the Hypercube Matrix Computation Task: Final Report for 1986 to 1987 (JPL Publication 87-18).
32 CFR 505.13 - Computer Matching Agreement Program.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 32 National Defense 3 2010-07-01 2010-07-01 true Computer Matching Agreement Program. 505.13 Section 505.13 National Defense Department of Defense (Continued) DEPARTMENT OF THE ARMY AID OF CIVIL AUTHORITIES AND PUBLIC RELATIONS ARMY PRIVACY ACT PROGRAM § 505.13 Computer Matching Agreement Program. (a...
MPI implementation of PHOENICS: A general purpose computational fluid dynamics code
NASA Astrophysics Data System (ADS)
Simunovic, S.; Zacharia, T.; Baltas, N.; Spalding, D. B.
1995-03-01
PHOENICS is a suite of computational analysis programs that are used for simulation of fluid flow, heat transfer, and dynamical reaction processes. The parallel version of the solver EARTH for the Computational Fluid Dynamics (CFD) program PHOENICS has been implemented using Message Passing Interface (MPI) standard. Implementation of MPI version of PHOENICS makes this computational tool portable to a wide range of parallel machines and enables the use of high performance computing for large scale computational simulations. MPI libraries are available on several parallel architectures making the program usable across different architectures as well as on heterogeneous computer networks. The Intel Paragon NX and MPI versions of the program have been developed and tested on massively parallel supercomputers Intel Paragon XP/S 5, XP/S 35, and Kendall Square Research, and on the multiprocessor SGI Onyx computer at Oak Ridge National Laboratory. The preliminary testing results of the developed program have shown scalable performance for reasonably sized computational domains.
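The basic message-passing pattern behind such an MPI port is domain decomposition with ghost-cell (halo) exchange. A minimal 1-D sketch using the mpi4py Python bindings follows; the array sizes and neighbor layout are illustrative, not PHOENICS code.

```python
# 1-D domain decomposition with halo exchange, the message-passing pattern an
# MPI port such as PHOENICS-EARTH relies on (sketch; assumes mpi4py and NumPy).
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()
local = np.full(10 + 2, float(rank))          # 10 interior cells plus 2 ghosts
left  = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

# exchange boundary values with neighbors: send each edge cell, fill each ghost
comm.Sendrecv(local[-2:-1], dest=right, recvbuf=local[0:1], source=left)
comm.Sendrecv(local[1:2],   dest=left,  recvbuf=local[-1:], source=right)
# run with, e.g.: mpiexec -n 4 python halo.py
```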
MPI implementation of PHOENICS: A general purpose computational fluid dynamics code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simunovic, S.; Zacharia, T.; Baltas, N.
1995-04-01
PHOENICS is a suite of computational analysis programs that are used for simulation of fluid flow, heat transfer, and dynamical reaction processes. The parallel version of the solver EARTH for the Computational Fluid Dynamics (CFD) program PHOENICS has been implemented using Message Passing Interface (MPI) standard. Implementation of MPI version of PHOENICS makes this computational tool portable to a wide range of parallel machines and enables the use of high performance computing for large scale computational simulations. MPI libraries are available on several parallel architectures making the program usable across different architectures as well as on heterogeneous computer networks. The Intel Paragon NX and MPI versions of the program have been developed and tested on massively parallel supercomputers Intel Paragon XP/S 5, XP/S 35, and Kendall Square Research, and on the multiprocessor SGI Onyx computer at Oak Ridge National Laboratory. The preliminary testing results of the developed program have shown scalable performance for reasonably sized computational domains.
42 CFR 422.152 - Quality improvement program.
Code of Federal Regulations, 2010 CFR
2010-10-01
..., psychosocial, or clinical domains (for example, quality of life indicators, depression scales, or chronic... 42 Public Health 3 2010-10-01 2010-10-01 false Quality improvement program. 422.152 Section 422... (CONTINUED) MEDICARE PROGRAM MEDICARE ADVANTAGE PROGRAM Quality Improvement § 422.152 Quality improvement...
ERIC Educational Resources Information Center
Haberman, Bruria; Yehezkel, Cecile
2008-01-01
The rapid evolvement of the computing domain has posed challenges in attempting to bridge the gap between school and the contemporary world of computing, which is related to content, learning culture, and professional norms. We believe that the interaction of high-school students who major in computer science or software engineering with leading…
Long term pavement performance computed parameter : moisture content
DOT National Transportation Integrated Search
2008-01-01
A study was conducted to compute in situ soil parameters based on time domain reflectometry (TDR) traces obtained from Long Term Pavement Performance (LTPP) test sections instrumented for the seasonal monitoring program (SMP). Ten TDR sensors were in...
Computer automation for feedback system design
NASA Technical Reports Server (NTRS)
1975-01-01
Mathematical techniques and explanations of the various steps used by an automated computer program to design feedback systems are summarized. Special attention was given to refining the automatic evaluation of suboptimal loop transmission and the translation of time-domain specifications to frequency-domain specifications.
Public Domain Software for Education.
ERIC Educational Resources Information Center
Scholastech, Inc., Cambridge, MA.
This report describes a project undertaken by Scholastech, a public charity addressing the development needs of computing educators, which was designed to advocate and support the increased use of free software in the college curriculum. The first of five sections provides a brief overview and statement of project objectives, i.e., to serve…
Regression model for estimating inactivation of microbial aerosols by solar radiation.
Ben-David, Avishai; Sagripanti, Jose-Luis
2013-01-01
The inactivation of pathogenic aerosols by solar radiation is relevant to public health and biodefense. We investigated whether a relatively simple method to calculate solar diffuse and total irradiances could be developed and used in environmental photobiology estimations instead of complex atmospheric radiative transfer computer programs. The second-order regression model that we developed reproduced 13 radiation quantities calculated for equinoxes and solstices at 35° latitude with a computer-intensive and rather complex atmospheric radiative transfer program (MODTRAN) with a mean error <6% (2% for most radiation quantities). Extending the application of the regression model from a reference latitude and date (chosen as 35° latitude for 21 March) to different latitudes and days of the year was accomplished with variable success: usually with a mean error <15% (but as high as 150% for some combinations of latitude and day of year). The accuracy of the methodology proposed here compares favorably to photobiological experiments, where microbial survival is usually measured with an accuracy no better than ±0.5 log10 units. The approach and equations presented in this study should assist in estimating the maximum time during which microbial pathogens remain infectious after accidental or intentional aerosolization in open environments. Published 2013. This article is a U.S. Government work and is in the public domain in the USA. Photochemistry and Photobiology © 2013 The American Society of Photobiology.
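A second-order regression of the kind described can be fitted in a few lines; in the Python sketch below the irradiance-versus-zenith-angle data points are invented for illustration and are not the paper's MODTRAN values.

```python
import numpy as np

# Fitting a second-order (quadratic) regression of relative irradiance against
# solar zenith angle (sketch; the data points are made up for illustration).
zenith = np.array([10., 20., 30., 40., 50., 60., 70.])              # degrees
irradiance = np.array([1.05, 1.00, 0.90, 0.76, 0.60, 0.42, 0.25])   # relative

coeffs = np.polyfit(zenith, irradiance, deg=2)   # [a, b, c] in a*z^2 + b*z + c
model = np.poly1d(coeffs)
errors = 100.0 * np.abs(model(zenith) - irradiance) / irradiance
print(coeffs, f"mean error {errors.mean():.1f}%")
```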
NASA Astrophysics Data System (ADS)
Peckham, S. D.
2017-12-01
Standardized, deep descriptions of digital resources (e.g. data sets, computational models, software tools and publications) make it possible to develop user-friendly software systems that assist scientists with the discovery and appropriate use of these resources. Semantic metadata makes it possible for machines to take actions on behalf of humans, such as automatically identifying the resources needed to solve a given problem, retrieving them and then automatically connecting them (despite their heterogeneity) into a functioning workflow. Standardized model metadata also helps model users to understand the important details that underpin computational models and to compare the capabilities of different models. These details include simplifying assumptions on the physics, governing equations and the numerical methods used to solve them, discretization of space (the grid) and time (the time-stepping scheme), state variables (input or output), and model configuration parameters. This kind of metadata provides a "deep description" of a computational model that goes well beyond other types of metadata (e.g. author, purpose, scientific domain, programming language, digital rights, provenance, execution) and captures the science that underpins a model. A carefully constructed, unambiguous, rules-based schema that addresses this problem, called the Geoscience Standard Names ontology, will be presented; it utilizes Semantic Web best practices and technologies. It has also been designed to work across science domains and to be readable by both humans and machines.
Sharma, Parichit; Mantri, Shrikant S
2014-01-01
The function of a newly sequenced gene can be discovered by determining its sequence homology with known proteins. BLAST is the most extensively used sequence analysis program for sequence similarity search in large databases of sequences. With the advent of next generation sequencing technologies it has now become possible to study genes and their expression at a genome-wide scale through RNA-seq and metagenome sequencing experiments. Functional annotation of all the genes is done by sequence similarity search against multiple protein databases. This annotation task is computationally very intensive and can take days to obtain complete results. The program mpiBLAST, an open-source parallelization of BLAST that achieves superlinear speedup, can be used to accelerate large-scale annotation by using supercomputers and high performance computing (HPC) clusters. Although many parallel bioinformatics applications using the Message Passing Interface (MPI) are available in the public domain, researchers are reluctant to use them due to lack of expertise in the Linux command line and relevant programming experience. With these limitations, it becomes difficult for biologists to use mpiBLAST for accelerating annotation. No web interface is available in the open-source domain for mpiBLAST. We have developed WImpiBLAST, a user-friendly open-source web interface for parallel BLAST searches. It is implemented in Struts 1.3 using a Java backbone and runs atop the open-source Apache Tomcat Server. WImpiBLAST supports script creation and job submission features and also provides a robust job management interface for system administrators. It combines script creation and modification features with job monitoring and management through the Torque resource manager on a Linux-based HPC cluster. Use case information highlights the acceleration of annotation analysis achieved by using WImpiBLAST. Here, we describe the WImpiBLAST web interface features and architecture, explain design decisions, describe workflows and provide a detailed analysis.
Computer Integrated Manufacturing Programs in Higher Education.
ERIC Educational Resources Information Center
International Business Machines Corp., Milford, CT. Academic Information Systems.
This publication focuses on computer integrated manufacturing (CIM) programs at several higher education institutions which teach the use of computing in manufacturing. The document describes programs at the following institutions: University of Alabama (where researchers are investigating CIM techniques with a key focus on transferring their…
Public library computer training for older adults to access high-quality Internet health information
Xie, Bo; Bugg, Julie M.
2010-01-01
An innovative experiment to develop and evaluate a public library computer training program to teach older adults to access and use high-quality Internet health information involved a productive collaboration among public libraries, the National Institute on Aging and the National Library of Medicine of the National Institutes of Health (NIH), and a Library and Information Science (LIS) academic program at a state university. One hundred and thirty-one older adults aged 54–89 participated in the study between September 2007 and July 2008. Key findings include: a) participants had overwhelmingly positive perceptions of the training program; b) after learning about two NIH websites (http://nihseniorhealth.gov and http://medlineplus.gov) from the training, many participants started using these online resources to find high quality health and medical information and, further, to guide their decision-making regarding a health- or medically-related matter; and c) computer anxiety significantly decreased (p < .001) while computer interest and efficacy significantly increased (p = .001 and p < .001, respectively) from pre- to post-training, suggesting statistically significant improvements in computer attitudes between pre- and post-training. The findings have implications for public libraries, LIS academic programs, and other organizations interested in providing similar programs in their communities. PMID:20161649
U.S. Spacesuit Knowledge Capture Accomplishments in Fiscal Years 2012 and 2013
NASA Technical Reports Server (NTRS)
Chullen, Cinda; Oliva, Vladenka R.
2014-01-01
The NASA U.S. Spacesuit Knowledge Capture (KC) program has existed since the beginning of 2008. The program was designed to augment engineers and other technical team members with historical spacesuit information to add to their understanding of the spacesuit, its evolution, its limitations, and its capabilities. Over 40 seminars have captured spacesuit history and knowledge over the six years of the program's existence. Subject matter experts have provided lectures, and some were interviewed, to help bring the spacesuit to life so that lessons learned will never be lost. The program has also reached out to the public and industry by releasing the recorded events into the public domain through the NASA technical library and YouTube. The U.S. Spacesuit KC topics have included lessons learned from some of the most prominent spacesuit experts and spacesuit users, including current and former astronauts. The events have enriched the spacesuit legacy knowledge from the Gemini, Apollo, Skylab, Space Shuttle, and International Space Station Programs. Expert engineers and scientists have likewise shared their challenges and successes to be remembered. As evidenced by the thousands of people who have viewed the recordings online, the last few years have been some of the most successful of the KC program's life, with numerous digital recordings and public releases. This paper reviews the events accomplished and archived over Fiscal Years 2012 and 2013 and highlights a few of the most memorable ones. It also describes ways to access the events, both those available internally on the NASA domain and those released to the public domain.
Seismpol: a Visual-Basic computer program for interactive and automatic earthquake waveform analysis
NASA Astrophysics Data System (ADS)
Patanè, Domenico; Ferrari, Ferruccio
1997-11-01
A Microsoft Visual-Basic computer program for waveform analysis of seismic signals is presented. The program combines interactive and automatic processing of digital signals using data recorded by three-component seismic stations. The analysis procedure can be used in either an interactive earthquake analysis or an automatic on-line processing of seismic recordings. The algorithm works in the time domain using the Covariance Matrix Decomposition method (CMD), so that polarization characteristics may be computed continuously in real time and seismic phases can be identified and discriminated. Visual inspection of the particle motion in orthogonal planes of projection (hodograms) reduces the danger of misinterpretation derived from the application of the polarization filter. The choice of time window and frequency intervals improves the quality of the extracted polarization information. In fact, the program uses a band-pass Butterworth filter to process the signals in the frequency domain by analyzing a selected signal window in a series of narrow frequency bands. Significant results, supported by well-defined polarizations and source azimuth estimates for P and S phases, are also obtained for short-period seismic events (local microearthquakes).
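A rough sketch of the two ingredients named in the abstract, band-pass Butterworth filtering and covariance-matrix polarization analysis, using Python/SciPy; the sampling rate, pass band, synthetic traces, and the rectilinearity formula used here are assumptions, not SEISMPOL's exact implementation.

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Band-pass filtering plus covariance-matrix polarization analysis of a
# three-component record (sketch; fs, band edges, and traces are assumed).
fs = 100.0
b, a = butter(4, [1.0 / (fs / 2), 10.0 / (fs / 2)], btype="band")
zne = np.random.randn(3, 2048)                 # stand-in Z, N, E traces
zne = filtfilt(b, a, zne, axis=1)

window = zne[:, 1000:1100]                     # analysis time window
cov = np.cov(window)                           # 3x3 covariance matrix
evals, evecs = np.linalg.eigh(cov)             # eigenvalues in ascending order
rectilinearity = 1.0 - (evals[0] + evals[1]) / (2.0 * evals[2])
azimuth = np.degrees(np.arctan2(evecs[2, -1], evecs[1, -1]))  # from N, E parts
print(rectilinearity, azimuth)
```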
Digital optical computers at the optoelectronic computing systems center
NASA Technical Reports Server (NTRS)
Jordan, Harry F.
1991-01-01
The Digital Optical Computing Program within the National Science Foundation Engineering Research Center for Opto-electronic Computing Systems has as its specific goal research on optical computing architectures suitable for use at the highest possible speeds. The program can be targeted toward exploiting the time domain because other programs in the Center are pursuing research on parallel optical systems, exploiting optical interconnection and optical devices and materials. Using a general purpose computing architecture as the focus, we are developing design techniques, tools and architecture for operation at the speed of light limit. Experimental work is being done with the somewhat low speed components currently available but with architectures which will scale up in speed as faster devices are developed. The design algorithms and tools developed for a general purpose, stored program computer are being applied to other systems such as optimally controlled optical communication networks.
Federal Public Health Workforce Development: An Evidence-Based Approach for Defining Competencies.
Mumford, Karen; Young, Andrea C; Nawaz, Saira
2016-01-01
This study reports the use of exploratory factor analysis to describe essential skills and knowledge for an important segment of the domestic public health workforce-Centers for Disease Control and Prevention (CDC) project officers-using an evidence-based approach to competency development and validation. A multicomponent survey was conducted. Exploratory factor analysis was used to examine the underlying domains and relationships between competency domains and key behaviors. The Cronbach α coefficient determined the reliability of the overall scale and identified factors. All domestic (US state, tribe, local, and territorial) grantees who received funding from the CDC during fiscal year 2011 to implement nonresearch prevention or intervention programs were invited to participate in a Web-based questionnaire. A total of 34 key behaviors representing knowledge, skills, and abilities, grouped in 7 domains-communication, grant administration and management, public health applied science and knowledge, program planning and development, program management, program monitoring and improvement, and organizational consultation-were examined. There were 795 responses (58% response rate). A total of 6 factors were identified with loadings of 0.40 or more for all 34 behavioral items. The Cronbach α coefficient was 0.95 overall and ranged between 0.73 and 0.91 for the factors. This study provides empirical evidence for the construct validity of 6 competencies and 34 key behaviors important for CDC project officers and serves as an important first step to evidence-driven workforce development efforts in public health.
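The reliability statistic used here, the Cronbach α coefficient, is straightforward to compute from an item-response matrix; in the Python sketch below the response data are randomly generated placeholders, not the survey's data.

```python
import numpy as np

# Cronbach's alpha for a set of survey items, the reliability statistic used
# to check each competency domain (sketch; responses are made-up data).
def cronbach_alpha(items):
    items = np.asarray(items, dtype=float)        # shape (respondents, items)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1.0)) * (1.0 - item_vars / total_var)

responses = np.random.randint(1, 6, size=(795, 34))   # 795 respondents, 34 items
print(round(cronbach_alpha(responses), 2))
```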
Methods and principles for determining task dependent interface content
NASA Technical Reports Server (NTRS)
Shalin, Valerie L.; Geddes, Norman D.; Mikesell, Brian G.
1992-01-01
Computer generated information displays provide a promising technology for offsetting the increasing complexity of the National Airspace System. To realize this promise, however, we must extend and adapt the domain-dependent knowledge that informally guides the design of traditional dedicated displays. In our view, the successful exploitation of computer generated displays revolves around the idea of information management, that is, the identification, organization, and presentation of relevant and timely information in a complex task environment. The program of research that is described leads to methods and principles for information management in the domain of commercial aviation. The multi-year objective of the proposed program of research is to develop methods and principles for determining task dependent interface content.
EMGAN: A computer program for time and frequency domain reduction of electromyographic data
NASA Technical Reports Server (NTRS)
Hursta, W. N.
1975-01-01
An experiment in electromyography utilizing surface electrode techniques was developed for the Apollo-Soyuz test project. This report describes the computer program, EMGAN, which was written to provide first order data reduction for the experiment. EMG signals are produced by the membrane depolarization of muscle fibers during a muscle contraction. Surface electrodes detect a spatially summated signal from a large number of muscle fibers commonly called an interference pattern. An interference pattern is usually so complex that analysis through signal morphology is extremely difficult if not impossible. It has become common to process EMG interference patterns in the frequency domain. Muscle fatigue and certain myopathic conditions are recognized through changes in muscle frequency spectra.
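First-order frequency-domain reduction of an EMG interference pattern amounts to computing a power spectrum and summary statistics such as the mean (centroid) frequency, whose shift is tracked in fatigue studies. A minimal Python sketch, with an assumed sampling rate and a synthetic signal:

```python
import numpy as np

# Power spectrum and mean frequency of an EMG interference pattern (sketch;
# the sampling rate and the stand-in signal are assumptions, not flight data).
fs = 1000.0
emg = np.random.randn(4096)                      # stand-in interference pattern
freqs = np.fft.rfftfreq(emg.size, d=1.0 / fs)
power = np.abs(np.fft.rfft(emg)) ** 2
mean_freq = (freqs * power).sum() / power.sum()  # spectral centroid, in Hz
print(f"mean frequency: {mean_freq:.1f} Hz")
```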
SH2 Ligand Prediction-Guidance for In-Silico Screening.
Li, Shawn S C; Li, Lei
2017-01-01
Systematic identification of binding partners for SH2 domains is important for understanding the biological function of the corresponding SH2 domain-containing proteins. Here, we describe two different web-accessible computer programs, SMALI and DomPep, for predicting binding ligands for SH2 domains. The former was developed using a Scoring Matrix method and the latter based on the Support Vector Machine model.
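A scoring-matrix predictor of the SMALI type reduces to summing position-specific residue scores over a candidate ligand. The sketch below illustrates this in Python with invented matrix values, not SMALI's trained weights.

```python
# Scoring a candidate ligand against a position-specific scoring matrix, the
# core of a Scoring Matrix predictor (sketch; values are invented examples).
pssm = [                      # one dict of scores per position after the pTyr
    {"E": 1.2, "D": 0.9, "A": -0.3},     # position +1
    {"E": 0.4, "I": 1.0, "A": -0.1},     # position +2
    {"I": 1.5, "L": 1.1, "A": -0.4},     # position +3
]

def score(ligand):
    # unseen residues get a default penalty (an assumption of this sketch)
    return sum(col.get(res, -1.0) for col, res in zip(pssm, ligand))

print(score("EEI"))   # total score for the +1..+3 residues of a ligand
```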
ERIC Educational Resources Information Center
Gemignani, Michael
1981-01-01
The concept of computer programs is discussed from many perspectives and shown to be many different things. The ambiguity of the term is reviewed in light of potential ramifications for computer specialists, attorneys, and the general public. (MP)
ERIC Educational Resources Information Center
de la Torre, Jose Garcia; Cifre, Jose G. Hernandez; Martinez, M. Carmen Lopez
2008-01-01
This paper describes a computational exercise at undergraduate level that demonstrates the employment of Monte Carlo simulation to study the conformational statistics of flexible polymer chains, and to predict solution properties. Three simple chain models, including excluded volume interactions, have been implemented in a public-domain computer…
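A minimal version of such an exercise, in Python: Monte Carlo sampling of a freely jointed chain (the simplest chain model, with no excluded-volume check) to estimate the mean squared end-to-end distance.

```python
import numpy as np

# Monte Carlo estimate of the mean squared end-to-end distance of a freely
# jointed chain with unit bond length (sketch; no excluded-volume interactions).
def end_to_end_sq(n_bonds, n_samples=2000, rng=np.random.default_rng(0)):
    acc = 0.0
    for _ in range(n_samples):
        phi = rng.uniform(0, 2 * np.pi, n_bonds)
        cos_t = rng.uniform(-1, 1, n_bonds)      # uniform directions on sphere
        sin_t = np.sqrt(1 - cos_t ** 2)
        bonds = np.stack([sin_t * np.cos(phi), sin_t * np.sin(phi), cos_t], axis=1)
        acc += (bonds.sum(axis=0) ** 2).sum()    # squared end-to-end distance
    return acc / n_samples

print(end_to_end_sq(100))   # ideal-chain theory predicts n * b^2 = 100
```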
Computer-Based Information System Cultivated To Support a College of Education.
ERIC Educational Resources Information Center
Smith, Gary R.
This brief paper discusses four of the computer applications explored at Wayne State University over the past decade to provide alternative solutions to problems commonly encountered in teacher education and in providing support for the classroom teacher. These studies examined only databases that are available in the public domain; obtained…
Computer-Focused Russian Bilingual Instructional Program, 1988-89. OREA Report.
ERIC Educational Resources Information Center
Berney, Tomi D.; Gritzer, Glenn
In its fourth year, the Computer-Focused Russian Bilingual Instructional Program provided instructional and support activities to 276 Russian-speaking students, most of whom were limited English proficient, at 4 public and 2 private high schools in Brooklyn. Instructional activities varied by site. Public school students took English as a Second…
Identifying the impact of G-quadruplexes on Affymetrix 3' arrays using cloud computing.
Memon, Farhat N; Owen, Anne M; Sanchez-Graillet, Olivia; Upton, Graham J G; Harrison, Andrew P
2010-01-15
A tetramer quadruplex structure is formed by four parallel strands of DNA/RNA containing runs of guanine. These quadruplexes are able to form because guanine can Hoogsteen hydrogen bond to other guanines, and a tetrad of guanines can form a stable arrangement. Recently we have discovered that probes on Affymetrix GeneChips that contain runs of guanine do not measure gene expression reliably. We associate this finding with the likelihood that quadruplexes are forming on the surface of GeneChips. In order to cope with the rapidly expanding size of GeneChip array datasets in the public domain, we are exploring the use of cloud computing to replicate our experiments on 3' arrays to look at the effect of the location of G-spots (runs of guanines). Cloud computing is a recently introduced high-performance solution that takes advantage of the computational infrastructure of large organisations such as Amazon and Google. We expect that cloud computing will become widely adopted because it enables bioinformaticians to avoid capital expenditure on expensive computing resources and to pay a cloud computing provider only for what is used. Moreover, as well as being financially efficient, cloud computing is an ecologically friendly technology, it enables efficient data sharing, and we expect it to be faster for development purposes. Here we propose the advantageous use of cloud computing to perform a large data-mining analysis of public domain 3' arrays.
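Identifying affected probes reduces to scanning probe sequences for runs of guanine. A minimal Python sketch, with invented 25-mer sequences standing in for real GeneChip probes:

```python
import re

# Flagging probes that contain a run of four or more guanines (a potential
# G-quadruplex-forming "G-spot"); the probe sequences are invented examples.
probes = ["ATCGGGGTACGTACGTACGTACGTA", "ATCGATCGATCGATCGATCGATCGA"]
for seq in probes:
    runs = re.findall(r"G{4,}", seq)
    print(seq, "->", "G-spot" if runs else "ok", runs)
```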
Radar target classification studies: Software development and documentation
NASA Astrophysics Data System (ADS)
Kamis, A.; Garber, F.; Walton, E.
1985-09-01
Three computer programs were developed to process and analyze calibrated radar returns. The first program, called DATABASE, was developed to create and manage a random-access database. The second program, called FTRAN DB, was developed to process horizontal- and vertical-polarization radar returns into different formats (i.e., time domain, circular polarizations, and polarization parameters). The third program, called RSSE, was developed to simulate a variety of radar systems and to evaluate their ability to identify radar returns. Complete computer listings are included in the appendix volumes.
An Ada programming support environment
NASA Technical Reports Server (NTRS)
Tyrrill, AL; Chan, A. David
1986-01-01
The toolset of an Ada Programming Support Environment (APSE) being developed at North American Aircraft Operations (NAAO) of Rockwell International is described. The APSE is resident on three different hosts and must support developments for the hosts and for embedded targets. Tools and developed software must be freely portable between the hosts. The toolset includes the usual editors, compilers, linkers, debuggers, configuration managers, and documentation tools. Generally, these are being supplied by the host computer vendors. Other tools, for example, a pretty printer, cross referencer, compilation order tool, and management tools, were obtained from public-domain sources, are implemented in Ada, and are being ported to the hosts. Several tools being implemented in-house are of interest; these include an Ada Design Language processor based on compilable Ada. A Standalone Test Environment Generator facilitates test tool construction and partially automates unit-level testing. A Code Auditor/Static Analyzer permits the Ada programs to be evaluated against measures of quality. An Ada Comment Box Generator partially automates generation of header comment boxes.
Banta, E.R.; Hill, M.C.; Poeter, E.; Doherty, J.E.; Babendreier, J.
2008-01-01
The open-source, public domain JUPITER (Joint Universal Parameter IdenTification and Evaluation of Reliability) API (Application Programming Interface) provides conventions and Fortran-90 modules to develop applications (computer programs) for analyzing process models. The input and output conventions allow application users to access various applications and the analysis methods they embody with a minimum of time and effort. Process models simulate, for example, physical, chemical, and (or) biological systems of interest using phenomenological, theoretical, or heuristic approaches. The types of model analyses supported by the JUPITER API include, but are not limited to, sensitivity analysis, data needs assessment, calibration, uncertainty analysis, model discrimination, and optimization. The advantages provided by the JUPITER API for users and programmers allow for rapid programming and testing of new ideas. Application-specific coding can be in languages other than the Fortran-90 of the API. This article briefly describes the capabilities and utility of the JUPITER API, lists existing applications, and uses UCODE_2005 as an example.
Independent Verification and Validation Program
NASA Technical Reports Server (NTRS)
Bailey, Brandon T.
2015-01-01
Presentation to be given to European Space Agency counterparts providing an overview of NASA's IVV Program and the layout and structure of the Software Testing and Research laboratory maintained at IVV. Seeking STI-ITAR review due to the international audience. Most of the information has been presented to public audiences in the past, with some variations on data, or is in the public domain.
White, Mary C; Babcock, Frances; Hayes, Nikki S; Mariotto, Angela B; Wong, Faye L; Kohler, Betsy A; Weir, Hannah K
2017-12-15
Because cancer registry data provide a census of cancer cases, registry data can be used to: 1) define and monitor cancer incidence at the local, state, and national levels; 2) investigate patterns of cancer treatment; and 3) evaluate the effectiveness of public health efforts to prevent cancer cases and improve cancer survival. The purpose of this article is to provide a broad overview of the history of cancer surveillance programs in the United States, and illustrate the expanding ways in which cancer surveillance data are being made available and contributing to cancer control programs. The article describes the building of the cancer registry infrastructure and the successful coordination of efforts among the 2 federal agencies that support cancer registry programs, the Centers for Disease Control and Prevention and the National Cancer Institute, and the North American Association of Central Cancer Registries. The major US cancer control programs also are described, including the National Comprehensive Cancer Control Program, the National Breast and Cervical Cancer Early Detection Program, and the Colorectal Cancer Control Program. This overview illustrates how cancer registry data can inform public health actions to reduce disparities in cancer outcomes and may be instructional for a variety of cancer control professionals in the United States and in other countries. Cancer 2017;123:4969-76. Published 2017. This article is a U.S. Government work and is in the public domain in the USA.
NASA Technical Reports Server (NTRS)
Smith, R. E.; Pitts, J. I.; Lambiotte, J. J., Jr.
1978-01-01
The computer program FLO-22 for analyzing inviscid transonic flow past 3-D swept-wing configurations was modified to use vector operations and run on the STAR-100 computer. The vectorized version described herein was called FLO-22-V1. Vector operations were incorporated into Successive Line Over-Relaxation in the transformed horizontal direction. Vector relational operations and control vectors were used to implement upwind differencing at supersonic points. A high speed of computation and extended grid domain were characteristics of FLO-22-V1. The new program was not the optimal vectorization of Successive Line Over-Relaxation applied to transonic flow; however, it proved that vector operations can readily be implemented to increase the computation rate of the algorithm.
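The "vector relational operations and control vectors" technique can be illustrated with a masked array update in NumPy; the sketch below is a toy 1-D stand-in, not FLO-22's actual successive line over-relaxation scheme.

```python
import numpy as np

# Vectorized relaxation sweep with a boolean mask playing the role of a
# "control vector" to switch to upwind differencing at supersonic points
# (toy 1-D sketch only; not FLO-22's transonic full-potential scheme).
def sweep(phi, supersonic):
    central = 0.5 * (np.roll(phi, 1) + np.roll(phi, -1))     # central average
    upwind = np.roll(phi, 1)                                 # one-sided update
    phi[1:-1] = np.where(supersonic, upwind, central)[1:-1]  # select per point
    return phi

phi = np.linspace(0.0, 1.0, 50)
mask = np.zeros(50, dtype=bool)
mask[30:40] = True            # a pocket of "supersonic" points
phi = sweep(phi, mask)
```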
jInv: A Modular and Scalable Framework for Electromagnetic Inverse Problems
NASA Astrophysics Data System (ADS)
Belliveau, P. T.; Haber, E.
2016-12-01
Inversion is a key tool in the interpretation of geophysical electromagnetic (EM) data. Three-dimensional (3D) EM inversion is very computationally expensive and practical software for inverting large 3D EM surveys must be able to take advantage of high performance computing (HPC) resources. It has traditionally been difficult to achieve those goals in a high level dynamic programming environment that allows rapid development and testing of new algorithms, which is important in a research setting. With those goals in mind, we have developed jInv, a framework for PDE constrained parameter estimation problems. jInv provides optimization and regularization routines, a framework for user defined forward problems, and interfaces to several direct and iterative solvers for sparse linear systems. The forward modeling framework provides finite volume discretizations of differential operators on rectangular tensor product meshes and tetrahedral unstructured meshes that can be used to easily construct forward modeling and sensitivity routines for forward problems described by partial differential equations. jInv is written in the emerging programming language Julia. Julia is a dynamic language targeted at the computational science community with a focus on high performance and native support for parallel programming. We have developed frequency and time-domain EM forward modeling and sensitivity routines for jInv. We will illustrate its capabilities and performance with two synthetic time-domain EM inversion examples. First, in airborne surveys, which use many sources, we achieve distributed memory parallelism by decoupling the forward and inverse meshes and performing forward modeling for each source on small, locally refined meshes. Secondly, we invert grounded source time-domain data from a gradient array style induced polarization survey using a novel time-stepping technique that allows us to compute data from different time-steps in parallel. These examples both show that it is possible to invert large scale 3D time-domain EM datasets within a modular, extensible framework written in a high-level, easy to use programming language.
NASA Technical Reports Server (NTRS)
Shih, T. I.-P.; Bailey, R. T.; Nguyen, H. L.; Roelke, R. J.
1990-01-01
An efficient computer program, called GRID2D/3D was developed to generate single and composite grid systems within geometrically complex two- and three-dimensional (2- and 3-D) spatial domains that can deform with time. GRID2D/3D generates single grid systems by using algebraic grid generation methods based on transfinite interpolation in which the distribution of grid points within the spatial domain is controlled by stretching functions. All single grid systems generated by GRID2D/3D can have grid lines that are continuous and differentiable everywhere up to the second-order. Also, grid lines can intersect boundaries of the spatial domain orthogonally. GRID2D/3D generates composite grid systems by patching together two or more single grid systems. The patching can be discontinuous or continuous. For continuous composite grid systems, the grid lines are continuous and differentiable everywhere up to the second-order except at interfaces where different single grid systems meet. At interfaces where different single grid systems meet, the grid lines are only differentiable up to the first-order. For 2-D spatial domains, the boundary curves are described by using either cubic or tension spline interpolation. For 3-D spatial domains, the boundary surfaces are described by using either linear Coon's interpolation, bi-hyperbolic spline interpolation, or a new technique referred to as 3-D bi-directional Hermite interpolation. Since grid systems generated by algebraic methods can have grid lines that overlap one another, GRID2D/3D contains a graphics package for evaluating the grid systems generated. With the graphics package, the user can generate grid systems in an interactive manner with the grid generation part of GRID2D/3D. GRID2D/3D is written in FORTRAN 77 and can be run on any IBM PC, XT, or AT compatible computer. In order to use GRID2D/3D on workstations or mainframe computers, some minor modifications must be made in the graphics part of the program; no modifications are needed in the grid generation part of the program. This technical memorandum describes the theory and method used in GRID2D/3D.
Parallel mutual information estimation for inferring gene regulatory networks on GPUs
2011-01-01
Background Mutual information is a measure of similarity between two variables. It has been widely used in various application domains including computational biology, machine learning, statistics, image processing, and financial computing. Simple histogram-based mutual information estimators used previously lack the precision of kernel-based methods. The recently introduced B-spline based mutual information estimation method is competitive with kernel-based methods in terms of quality, but at a lower computational complexity. Results We present a new approach to accelerating the B-spline based mutual information estimation algorithm with commodity graphics hardware. To derive an efficient mapping onto this type of architecture, we have used the Compute Unified Device Architecture (CUDA) programming model to design and implement a new parallel algorithm. Our implementation, called CUDA-MI, can achieve speedups of up to 82 using double precision on a single GPU compared to a multi-threaded implementation on a quad-core CPU for large microarray datasets. We have used the results obtained by CUDA-MI to infer gene regulatory networks (GRNs) from microarray data. Comparisons to existing methods, including ARACNE and TINGe, show that CUDA-MI produces GRNs of higher quality in less time. Conclusions CUDA-MI is publicly available open-source software, written in the CUDA and C++ programming languages. It obtains significant speedups over a multi-threaded CPU implementation by fully exploiting the compute capability of commonly used CUDA-enabled low-cost GPUs. PMID:21672264
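For reference, the quantity being estimated is I(X;Y) = Σ p(x,y) log[p(x,y)/(p(x)p(y))]. A minimal histogram-based estimator in Python; the B-spline method replaces the hard bin assignment below with smooth B-spline weights spread over neighbouring bins, and that weighting step is what CUDA-MI parallelizes:

```python
# Mutual information from a joint histogram (the simple estimator the
# B-spline method improves upon).
import numpy as np

def mutual_information(x, y, bins=10):
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()                          # joint probabilities
    px, py = pxy.sum(axis=1), pxy.sum(axis=0) # marginals
    nz = pxy > 0                              # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])))

rng = np.random.default_rng(1)
x = rng.standard_normal(10_000)
print(mutual_information(x, x + 0.5 * rng.standard_normal(10_000)))  # high
print(mutual_information(x, rng.standard_normal(10_000)))            # near 0
```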
Cloud Computing for Complex Performance Codes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Appel, Gordon John; Hadgu, Teklu; Klein, Brandon Thorin
This report describes the use of cloud computing services for running complex public domain performance assessment problems. The work consisted of two phases: Phase 1 demonstrated that complex codes, on several differently configured servers, could run and compute trivial small-scale problems in a commercial cloud infrastructure; Phase 2 focused on proving that non-trivial large-scale problems could be computed in the commercial cloud environment. The cloud computing effort was successfully applied using codes of interest to the geohydrology and nuclear waste disposal modeling community.
Microsoft C#.NET program and electromagnetic depth sounding for large loop source
NASA Astrophysics Data System (ADS)
Prabhakar Rao, K.; Ashok Babu, G.
2009-07-01
A program, in the C# (C Sharp) language with the Microsoft .NET Framework, is developed to compute the normalized vertical magnetic field of a horizontal rectangular loop source placed on the surface of an n-layered earth. The field can be calculated either inside or outside the loop. Five C# classes, with member functions in each class, are designed to compute the kernel, the Hankel transform integral, the coefficients for cubic spline interpolation between computed values, and the normalized vertical magnetic field. The program computes the vertical magnetic field in the frequency domain using integral expressions evaluated by a combination of straightforward numerical integration and the digital filter technique. The code utilizes different object-oriented programming (OOP) features. It finally computes the amplitude and phase of the normalized vertical magnetic field. Computed results are presented for geometric and parametric soundings. The code was developed in Microsoft Visual Studio .NET 2003 and uses various system class libraries.
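The core numerical task is a Hankel-transform integral of the form ∫ K(λ) J0(λr) dλ with a layered-earth kernel K. A Python sketch of the straightforward-quadrature half of the paper's approach, using a toy exponential kernel with a known closed form (the digital filter technique handles the slowly decaying oscillatory kernels this brute force cannot):

```python
# Brute-force evaluation of a J0 Hankel-type integral; adequate for smooth,
# rapidly decaying kernels.
import numpy as np
from scipy.special import j0
from scipy.integrate import quad

def hankel0(kernel, r, lam_max=200.0):
    val, _ = quad(lambda lam: kernel(lam) * j0(lam * r),
                  0.0, lam_max, limit=500)
    return val

# Check against the closed form:
#   int_0^inf exp(-a*lam) J0(lam*r) dlam = 1 / sqrt(a**2 + r**2)
a, r = 1.5, 2.0
print(hankel0(lambda lam: np.exp(-a * lam), r), 1.0 / np.hypot(a, r))
```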
COSMIC: A catalog of selected computer programs
NASA Technical Reports Server (NTRS)
1980-01-01
Information is presented on various computer programs developed in the space program which are now available to the public. Many programs from the Department of Defense and selected software from other government agencies are also offered. Over 1500 programs in almost every technical or managerial discipline are available.
Li, Lixin; Losser, Travis; Yorke, Charles; Piltner, Reinhard
2014-09-03
Epidemiological studies have identified associations between mortality and changes in concentration of particulate matter. These studies have highlighted the public concerns about health effects of particulate air pollution. Modeling fine particulate matter PM2.5 exposure risk and monitoring day-to-day changes in PM2.5 concentration is a critical step for understanding the pollution problem and embarking on the necessary remedy. This research designs, implements and compares two inverse distance weighting (IDW)-based spatiotemporal interpolation methods, in order to assess the trend of daily PM2.5 concentration for the contiguous United States over the year of 2009, at both the census block group level and county level. Traditionally, when handling spatiotemporal interpolation, researchers tend to treat space and time separately and reduce the spatiotemporal interpolation problems to a sequence of snapshots of spatial interpolations. In this paper, PM2.5 data interpolation is conducted in the continuous space-time domain by integrating space and time simultaneously, using the so-called extension approach. Time values are calculated with the help of a factor under the assumption that spatial and temporal dimensions are equally important when interpolating a continuous changing phenomenon in the space-time domain. Various IDW-based spatiotemporal interpolation methods with different parameter configurations are evaluated by cross-validation. In addition, this study explores computational issues (computer processing speed) faced during implementation of spatiotemporal interpolation for huge data sets. Parallel programming techniques and an advanced data structure, named k-d tree, are adapted in this paper to address the computational challenges. Significant computational improvement has been achieved. Finally, a web-based spatiotemporal IDW-based interpolation application is designed and implemented where users can visualize and animate spatiotemporal interpolation results.
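The extension approach is easy to state concretely: scale the time coordinate by a factor c so temporal separation becomes commensurate with spatial distance, then interpolate by ordinary IDW in the combined (x, y, t) domain. A small NumPy sketch (function names and the toy PM2.5 readings are illustrative; the paper's k-d tree and parallelization would replace the brute-force distance matrix):

```python
# Space-time inverse distance weighting via the "extension approach":
# time is treated as one more coordinate, scaled by a factor c.
import numpy as np

def idw_spacetime(xyt_obs, values, xyt_query, c=1.0, power=2.0, eps=1e-12):
    pts = np.asarray(xyt_obs, dtype=float).copy()
    qry = np.asarray(xyt_query, dtype=float).copy()
    pts[:, 2] *= c                    # scale the time axis into "space" units
    qry[:, 2] *= c
    d = np.linalg.norm(qry[:, None, :] - pts[None, :, :], axis=2)
    w = 1.0 / (d ** power + eps)      # inverse-distance weights
    return (w @ np.asarray(values)) / w.sum(axis=1)

# Hypothetical daily readings: (x, y, day) -> concentration.
obs = [(0, 0, 1), (1, 0, 1), (0, 1, 2)]
vals = [10.0, 12.0, 9.0]
print(idw_spacetime(obs, vals, [(0.5, 0.5, 1.5)], c=0.5))
```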
A novel potential/viscous flow coupling technique for computing helicopter flow fields
NASA Technical Reports Server (NTRS)
Summa, J. Michael; Strash, Daniel J.; Yoo, Sungyul
1993-01-01
The primary objective of this work was to demonstrate the feasibility of a new potential/viscous flow coupling procedure for reducing computational effort while maintaining solution accuracy. This closed-loop, overlapped velocity-coupling concept has been developed in a new two-dimensional code, ZAP2D (Zonal Aerodynamics Program - 2D), a three-dimensional code for wing analysis, ZAP3D (Zonal Aerodynamics Program - 3D), and a three-dimensional code for isolated helicopter rotors in hover, ZAPR3D (Zonal Aerodynamics Program for Rotors - 3D). Comparisons with large domain ARC3D solutions and with experimental data for a NACA 0012 airfoil have shown that the required domain size can be reduced to a few tenths of a percent chord for the low Mach and low angle of attack cases and to less than 2-5 chords for the high Mach and high angle of attack cases while maintaining solution accuracies to within a few percent. This represents CPU time reductions by a factor of 2-4 compared with ARC2D. The current ZAP3D calculation for a rectangular plan-form wing of aspect ratio 5 with an outer domain radius of about 1.2 chords represents a speed-up in CPU time over the ARC3D large domain calculation by about a factor of 2.5 while maintaining solution accuracies to within a few percent. A ZAPR3D simulation for a two-bladed rotor in hover with a reduced grid domain of about two chord lengths was able to capture the wake effects and compared accurately with the experimental pressure data. Further development is required in order to substantiate the promise of computational improvements due to the ZAPR3D coupling concept.
76 FR 11435 - Privacy Act of 1974; Computer Matching Program
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-02
... Security Administration. SUMMARY: Pursuant to the Computer Matching and Privacy Protection Act of 1988, Public Law 100-503, the Computer Matching and Privacy Protections Amendments of 1990, Pub. L. 101-508... Interpreting the Provisions of Public Law 100-503, the Computer Matching and Privacy Protection Act of 1988...
Ames Research Center publications: A continuing bibliography, 1980
NASA Technical Reports Server (NTRS)
1981-01-01
This bibliography lists formal NASA publications, journal articles, books, chapters of books, patents, contractor reports, and computer programs that were issued by Ames Research Center and indexed by Scientific and Technical Aerospace Reports, Limited Scientific and Technical Aerospace Reports, International Aerospace Abstracts, and Computer Program Abstracts in 1980. Citations are arranged by directorate, type of publication, and NASA accession numbers. Subject, personal author, corporate source, contract number, and report/accession number indexes are provided.
A Guide to IRUS-II Application Development
1989-09-01
Stallard (editors). Research and Development in Natural Language Understanding as Part of the Strategic Computing Program, chapter 3, pages 27-34...Development in Natural Language Processing in the Strategic Computing Program. Computational Linguistics 12(2):132-136, April-June, 1986. [24] Sidner, C.L...assist developers interested in adapting IRUS-II to new application domains. Chapter 2 provides a general introduction and overview. Chapter 3 describes
An on-line system for hand-printed input
NASA Technical Reports Server (NTRS)
Williams, T. G.; Bebb, J.
1971-01-01
The capability of graphic input/output systems is described. Topics considered are a character recognizer and dictionary building program, an initial flow chart element input program, and a system entitled The Assistant Mathematician, which uses ordinary mathematics to specify numeric computation. All three parts are necessary to allow a user to carry on a mathematical dialogue with the computer in the language and notation of his discipline or problem domain.
Bringing computational science to the public.
McDonagh, James L; Barker, Daniel; Alderson, Rosanna G
2016-01-01
The increasing use of computers in science allows for the scientific analyses of large datasets at an increasing pace. We provided examples and interactive demonstrations at Dundee Science Centre as part of the 2015 Women in Science festival, to present aspects of computational science to the general public. We used low-cost Raspberry Pi computers to provide hands on experience in computer programming and demonstrated the application of computers to biology. Computer games were used as a means to introduce computers to younger visitors. The success of the event was evaluated by voluntary feedback forms completed by visitors, in conjunction with our own self-evaluation. This work builds on the original work of the 4273π bioinformatics education program of Barker et al. (2013, BMC Bioinform. 14:243). 4273π provides open source education materials in bioinformatics. This work looks at the potential to adapt similar materials for public engagement events. It appears, at least in our small sample of visitors (n = 13), that basic computational science can be conveyed to people of all ages by means of interactive demonstrations. Children as young as five were able to successfully edit simple computer programs with supervision. This was, in many cases, their first experience of computer programming. The feedback is predominantly positive, showing strong support for improving computational science education, but also included suggestions for improvement. Our conclusions are necessarily preliminary. However, feedback forms suggest methods were generally well received among the participants; "Easy to follow. Clear explanation" and "Very easy. Demonstrators were very informative." Our event, held at a local Science Centre in Dundee, demonstrates that computer games and programming activities suitable for young children can be performed alongside a more specialised and applied introduction to computational science for older visitors.
Choi, Hyungsuk; Choi, Woohyuk; Quan, Tran Minh; Hildebrand, David G C; Pfister, Hanspeter; Jeong, Won-Ki
2014-12-01
As the size of image data from microscopes and telescopes increases, the need for high-throughput processing and visualization of large volumetric data has become more pressing. At the same time, many-core processors and GPU accelerators are commonplace, making high-performance distributed heterogeneous computing systems affordable. However, effectively utilizing GPU clusters is difficult for novice programmers, and even experienced programmers often fail to fully leverage the computing power of new parallel architectures due to their steep learning curve and programming complexity. In this paper, we propose Vivaldi, a new domain-specific language for volume processing and visualization on distributed heterogeneous computing systems. Vivaldi's Python-like grammar and parallel processing abstractions provide flexible programming tools for non-experts to easily write high-performance parallel computing code. Vivaldi provides commonly used functions and numerical operators for customized visualization and high-throughput image processing applications. We demonstrate the performance and usability of Vivaldi on several examples ranging from volume rendering to image segmentation.
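Purely as an illustration of the kind of per-ray volume operation such a DSL distributes across a GPU cluster, here is a maximum intensity projection in plain NumPy (this is not Vivaldi syntax):

```python
# Collapse a 3-D scalar volume to a 2-D image by taking the brightest sample
# along each axis-aligned ray; a building block of volume rendering.
import numpy as np

def max_intensity_projection(volume, axis=0):
    return volume.max(axis=axis)

vol = np.random.default_rng(2).random((64, 64, 64))
image = max_intensity_projection(vol)   # shape (64, 64)
```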
15 CFR 996.33 - Acceptance of program by non-Federal entities.
Code of Federal Regulations, 2014 CFR
2014-01-01
... ASSURANCE AND CERTIFICATION REQUIREMENTS FOR NOAA HYDROGRAPHIC PRODUCTS AND SERVICES QUALITY ASSURANCE AND CERTIFICATION REQUIREMENTS FOR NOAA HYDROGRAPHIC PRODUCTS AND SERVICES Other Quality Assurance Program Matters... information submitted to NOAA under this Program shall be deemed to be in the public domain, and no...
15 CFR 996.33 - Acceptance of program by non-Federal entities.
Code of Federal Regulations, 2013 CFR
2013-01-01
... ASSURANCE AND CERTIFICATION REQUIREMENTS FOR NOAA HYDROGRAPHIC PRODUCTS AND SERVICES QUALITY ASSURANCE AND CERTIFICATION REQUIREMENTS FOR NOAA HYDROGRAPHIC PRODUCTS AND SERVICES Other Quality Assurance Program Matters... information submitted to NOAA under this Program shall be deemed to be in the public domain, and no...
15 CFR 996.33 - Acceptance of program by non-Federal entities.
Code of Federal Regulations, 2012 CFR
2012-01-01
... ASSURANCE AND CERTIFICATION REQUIREMENTS FOR NOAA HYDROGRAPHIC PRODUCTS AND SERVICES QUALITY ASSURANCE AND CERTIFICATION REQUIREMENTS FOR NOAA HYDROGRAPHIC PRODUCTS AND SERVICES Other Quality Assurance Program Matters... information submitted to NOAA under this Program shall be deemed to be in the public domain, and no...
15 CFR 996.33 - Acceptance of program by non-Federal entities.
Code of Federal Regulations, 2011 CFR
2011-01-01
... ASSURANCE AND CERTIFICATION REQUIREMENTS FOR NOAA HYDROGRAPHIC PRODUCTS AND SERVICES QUALITY ASSURANCE AND CERTIFICATION REQUIREMENTS FOR NOAA HYDROGRAPHIC PRODUCTS AND SERVICES Other Quality Assurance Program Matters... information submitted to NOAA under this Program shall be deemed to be in the public domain, and no...
Application of the control volume mixed finite element method to a triangular discretization
Naff, R.L.
2012-01-01
A two-dimensional control volume mixed finite element method is applied to the elliptic equation. Discretization of the computational domain is based on triangular elements. Shape functions and test functions are formulated on the basis of an equilateral reference triangle with unit edges. A pressure support based on the linear interpolation of elemental edge pressures is used in this formulation. Comparisons are made between results from the standard mixed finite element method and this control volume mixed finite element method. Published 2011. This article is a US Government work and is in the public domain in the USA. © 2012 John Wiley & Sons, Ltd.
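Linear interpolation on a triangle is interpolation in barycentric coordinates; a small Python sketch on the equilateral reference triangle the abstract mentions (this shows the underlying mechanics only, not the full CVMFE pressure-support formulation):

```python
# Barycentric coordinates of a point p in triangle (a, b, c); the three
# weights sum to 1 and give the linear interpolation weights at p.
import numpy as np

def barycentric(p, a, b, c):
    T = np.array([[b[0] - a[0], c[0] - a[0]],
                  [b[1] - a[1], c[1] - a[1]]])
    l1, l2 = np.linalg.solve(T, np.asarray(p, dtype=float) - np.asarray(a))
    return np.array([1.0 - l1 - l2, l1, l2])

# Equilateral reference triangle with unit edges, as in the abstract.
A, B, C = (0.0, 0.0), (1.0, 0.0), (0.5, np.sqrt(3) / 2)
print(barycentric((0.5, 0.3), A, B, C))
```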
Plans for wind energy system simulation
NASA Technical Reports Server (NTRS)
Dreier, M. E.
1978-01-01
A digital computer code and a special-purpose hybrid computer are introduced. The digital computer program, the Root Perturbation Method (RPM), is an implementation of the classic Floquet procedure which circumvents numerical problems associated with the extraction of Floquet roots. The hybrid computer, the Wind Energy System Time-domain simulator (WEST), yields real-time loads and deformation information essential to design and system stability investigations.
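The classic Floquet procedure the RPM builds on can be shown in miniature: integrate the state-transition matrix of the periodic system across one period and read stability off the eigenvalues (Floquet multipliers) of the resulting monodromy matrix. A Python sketch with a toy Mathieu-like equation standing in for rotor dynamics (this is the textbook procedure, not the RPM's perturbation variant):

```python
import numpy as np
from scipy.integrate import solve_ivp

def monodromy(A_of_t, T, n):
    """Propagate each unit vector through one period of x' = A(t) x."""
    cols = []
    for k in range(n):
        e = np.zeros(n)
        e[k] = 1.0
        sol = solve_ivp(lambda t, x: A_of_t(t) @ x, (0.0, T), e, rtol=1e-9)
        cols.append(sol.y[:, -1])
    return np.column_stack(cols)

# Damped oscillator with periodically varying stiffness.
T = 2 * np.pi
A = lambda t: np.array([[0.0, 1.0],
                        [-(1.0 + 0.2 * np.cos(t)), -0.2]])
multipliers = np.linalg.eigvals(monodromy(A, T, 2))
print(np.abs(multipliers))   # all < 1: the periodic system is stable
```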
A knowledge-based approach to automated flow-field zoning for computational fluid dynamics
NASA Technical Reports Server (NTRS)
Vogel, Alison Andrews
1989-01-01
An automated three-dimensional zonal grid generation capability for computational fluid dynamics is shown through the development of a demonstration computer program capable of automatically zoning the flow field of representative two-dimensional (2-D) aerodynamic configurations. The applicability of a knowledge-based programming approach to the domain of flow-field zoning is examined. Several aspects of flow-field zoning make the application of knowledge-based techniques challenging: the need for perceptual information, the role of individual bias in the design and evaluation of zonings, and the fact that the zoning process is modeled as a constructive, design-type task (for which there are relatively few examples of successful knowledge-based systems in any domain). Engineering solutions to the problems arising from these aspects are developed, and a demonstration system is implemented which can design, generate, and output flow-field zonings for representative 2-D aerodynamic configurations.
Managing Watersheds with WMOST (Watershed Management Optimization Support Tool)
EPA’s Green Infrastructure research program and EPA Region 1 recently released a new public-domain software application, WMOST, which supports community applications of Integrated Water Resources Management (IWRM) principles (http://cfpub.epa.gov/si/si_public_record_report....
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khalsa, Siri Sahib; Ho, Clifford Kuofei
2010-04-01
A rigorous computational fluid dynamics (CFD) approach to calculating temperature distributions, radiative and convective losses, and flow fields in a cavity receiver irradiated by a heliostat field is typically limited to the receiver domain alone for computational reasons. A CFD simulation cannot realistically yield a precise solution that includes the details within the vast domain of an entire heliostat field in addition to the detailed processes and features within a cavity receiver. Instead, the incoming field irradiance can be represented as a boundary condition on the receiver domain. This paper describes a program, the Solar Patch Calculator, written in Microsoft Excel VBA to characterize multiple beams emanating from a 'solar patch' located at the aperture of a cavity receiver, in order to represent the incoming irradiance from any field of heliostats as a boundary condition on the receiver domain. This program accounts for cosine losses; receiver location; heliostat reflectivity, areas and locations; field location; and time of day and day of year. This paper also describes the implementation of the boundary conditions calculated by this program into a Discrete Ordinates radiation model using Ansys® FLUENT (www.fluent.com), and compares the results to experimental data and to results generated by the code DELSOL.
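Of the loss terms listed, the cosine loss is the simplest to make concrete: a heliostat's normal must bisect the sun and receiver directions, so its effective mirror area scales with the cosine of the half-angle between them. A short Python sketch (the geometry and conventions are illustrative, not the Solar Patch Calculator's):

```python
import numpy as np

def cosine_efficiency(sun_dir, to_receiver):
    """Cosine-loss factor for a heliostat, given direction vectors toward
    the sun and toward the receiver (any scale; normalized here)."""
    s = np.asarray(sun_dir, dtype=float)
    r = np.asarray(to_receiver, dtype=float)
    s, r = s / np.linalg.norm(s), r / np.linalg.norm(r)
    n = (s + r) / np.linalg.norm(s + r)   # heliostat normal bisects s and r
    return float(n @ s)                   # cos(theta), theta = half-angle

# Sun 45 degrees up in the south, receiver tower due north, slightly elevated.
print(cosine_efficiency([0, -np.sqrt(0.5), np.sqrt(0.5)], [0, 0.995, 0.1]))
```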
CLIPS: The C language integrated production system
NASA Technical Reports Server (NTRS)
Riley, Gary
1994-01-01
Expert systems are computer programs which emulate human expertise in well defined problem domains. The potential payoff from expert systems is high: valuable expertise can be captured and preserved, repetitive and/or mundane tasks requiring human expertise can be automated, and uniformity can be applied in decision making processes. The C Language Integrated Production System (CLIPS) is an expert system building tool, developed at the Johnson Space Center, which provides a complete environment for the development and delivery of rule and/or object based expert systems. CLIPS was specifically designed to provide a low cost option for developing and deploying expert system applications across a wide range of hardware platforms. The commercial potential of CLIPS is vast. Currently, CLIPS is being used by over 5,000 individuals throughout the public and private sector. Because the CLIPS source code is readily available, numerous groups have used CLIPS as the basis for their own expert system tools. To date, three commercially available tools have been derived from CLIPS. In general, the development of CLIPS has helped to improve the ability to deliver expert system technology throughout the public and private sectors for a wide range of applications and diverse computing environments.
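CLIPS programs are written as rules over facts. Purely as a toy illustration of the forward-chaining machinery a tool like CLIPS provides (this is Python, not CLIPS's (defrule ...) syntax, and the facts are hypothetical):

```python
# Minimal forward-chaining inference: apply rules until no new facts appear.
def forward_chain(facts, rules):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)   # rule fires, asserting a new fact
                changed = True
    return facts

rules = [({"thrust anomaly", "sensor ok"}, "engine fault"),
         ({"engine fault"}, "recommend inspection")]
print(forward_chain({"thrust anomaly", "sensor ok"}, rules))
```

A production system like CLIPS adds pattern matching over structured facts and an efficient match algorithm, but the fire-until-quiescent cycle is the same idea.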
Cognitive training in Parkinson disease: cognition-specific vs nonspecific computer training.
Zimmermann, Ronan; Gschwandtner, Ute; Benz, Nina; Hatz, Florian; Schindler, Christian; Taub, Ethan; Fuhr, Peter
2014-04-08
In this study, we compared a cognition-specific computer-based cognitive training program with a motion-controlled computer sports game that is not cognition-specific for their ability to enhance cognitive performance in various cognitive domains in patients with Parkinson disease (PD). Patients with PD were trained with either a computer program designed to enhance cognition (CogniPlus, 19 patients) or a computer sports game with motion-capturing controllers (Nintendo Wii, 20 patients). The effect of training in 5 cognitive domains was measured by neuropsychological testing at baseline and after training. Group differences over all variables were assessed with multivariate analysis of variance, and group differences in single variables were assessed with 95% confidence intervals of mean difference. The groups were similar regarding age, sex, and educational level. Patients with PD who were trained with Wii for 4 weeks performed better in attention (95% confidence interval: -1.49 to -0.11) than patients trained with CogniPlus. In our study, patients with PD derived at least the same degree of cognitive benefit from non-cognition-specific training involving movement as from cognition-specific computerized training. For patients with PD, game consoles may be a less expensive and more entertaining alternative to computer programs specifically designed for cognitive training. This study provides Class III evidence that, in patients with PD, cognition-specific computer-based training is not superior to a motion-controlled computer game in improving cognitive performance.
Lowering industry firewalls: pre-competitive informatics initiatives in drug discovery.
Barnes, Michael R; Harland, Lee; Foord, Steven M; Hall, Matthew D; Dix, Ian; Thomas, Scott; Williams-Jones, Bryn I; Brouwer, Cory R
2009-09-01
Pharmaceutical research and development is facing substantial challenges that have prompted the industry to shift funding from early- to late-stage projects. Among the effects is a major change in the attitude of many companies to their internal bioinformatics resources: the focus has moved from the vigorous pursuit of intellectual property towards exploration of pre-competitive cross-industry collaborations and engagement with the public domain. High-quality, open and accessible data are the foundation of pre-competitive research, and strong public-private partnerships have considerable potential to enhance public data resources, which would benefit everyone engaged in drug discovery. In this article, we discuss the background to these changes and propose new areas of collaboration in computational biology and chemistry between the public domain and the pharmaceutical industry.
45 CFR 79.27 - Computation of time.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 45 Public Welfare 1 2010-10-01 2010-10-01 false Computation of time. 79.27 Section 79.27 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION PROGRAM FRAUD CIVIL REMEDIES § 79.27 Computation of time. (a) In computing any period of time under this part or in an order issued...
CCARES: A computer algorithm for the reliability analysis of laminated CMC components
NASA Technical Reports Server (NTRS)
Duffy, Stephen F.; Gyekenyesi, John P.
1993-01-01
Structural components produced from laminated CMC (ceramic matrix composite) materials are being considered for a broad range of aerospace applications that include various structural components for the national aerospace plane, the space shuttle main engine, and advanced gas turbines. Specifically, these applications include segmented engine liners, small missile engine turbine rotors, and exhaust nozzles. Use of these materials allows for improvements in fuel efficiency due to increased engine temperatures and pressures, which in turn generate more power and thrust. Furthermore, this class of materials offers significant potential for raising the thrust-to-weight ratio of gas turbine engines by tailoring directions of high specific reliability. The emerging composite systems, particularly those with silicon nitride or silicon carbide matrix, can compete with metals in many demanding applications. Laminated CMC prototypes have already demonstrated functional capabilities at temperatures approaching 1400 C, which is well beyond the operational limits of most metallic materials. Laminated CMC material systems have several mechanical characteristics which must be carefully considered in the design process. Test bed software programs are needed that incorporate stochastic design concepts that are user friendly, computationally efficient, and have flexible architectures that readily incorporate changes in design philosophy. The CCARES (Composite Ceramics Analysis and Reliability Evaluation of Structures) program is representative of an effort to fill this need. CCARES is a public domain computer algorithm, coupled to a general purpose finite element program, which predicts the fast fracture reliability of a structural component under multiaxial loading conditions.
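Fast-fracture reliability codes in the CARES family rest on Weibull weakest-link statistics. A minimal sketch of the two-parameter uniaxial form (the multiaxial, laminate-level treatment in CCARES is considerably more involved, and the numbers here are illustrative):

```python
# Two-parameter Weibull survival probability for a uniformly stressed volume:
#   P_s = exp(-V * (sigma / sigma_0)**m)
import numpy as np

def survival_probability(sigma, sigma_0, m, volume=1.0):
    return np.exp(-volume * (sigma / sigma_0) ** m)

# Higher Weibull modulus m means less strength scatter.
for m in (5, 10, 20):
    print(m, survival_probability(sigma=300.0, sigma_0=400.0, m=m))
```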
Over the last 10 years the EPA has invested in analytic elements as a computational method used in public domain software supporting capture zone delineation for source water assessments and wellhead protection. The current release is called WhAEM2000 (wellhead analytic element ...
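The simplest analytic-element result behind capture zone delineation is a single well in uniform regional flow, whose capture zone has closed-form dimensions. A Python sketch of those textbook relations (parameter values illustrative; WhAEM2000 composes many such elements):

```python
# Capture-zone geometry for one well pumping at rate Q from an aquifer of
# thickness B with regional specific discharge q (uniform flow).
import numpy as np

def capture_zone(Q, q, B):
    """Q [m^3/d], q [m/d], B [m]; returns characteristic dimensions [m]."""
    return {
        "stagnation point downgradient": Q / (2 * np.pi * B * q),
        "width at the well": Q / (2 * B * q),
        "asymptotic upgradient width": Q / (B * q),
    }

print(capture_zone(Q=500.0, q=0.05, B=20.0))
```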
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brun, B.
1997-07-01
Computer technology has improved tremendously in recent years, with larger media capacity, more memory, and more computational power. Visual computing with high-performance graphic interfaces and desktop computational power has changed the way engineers accomplish everyday tasks, development work, and safety studies analysis. The emergence of parallel computing will permit simulation over a larger domain. In addition, new development methods, languages and tools have appeared in the last several years.
Concepts of Concurrent Programming
1990-04-01
to the material presented. Carriero89 Carriero, N., and Gelernter, D. "How to Write Parallel Programs: A Guide to the Perplexed." ACM...between the architectures on which programs can be executed and the application domains from which problems are drawn. Our goal is to show how programs ...(Sept. 1989), 251-510. Abstract: There are four papers: 1. Programming Languages for Distributed Computing Systems (52); 2. How to Write Parallel Programs
Chapter 1: Biomedical knowledge integration.
Payne, Philip R O
2012-01-01
The modern biomedical research and healthcare delivery domains have seen an unparalleled increase in the rate of innovation and novel technologies over the past several decades. Catalyzed by paradigm-shifting public and private programs focusing upon the formation and delivery of genomic and personalized medicine, the need for high-throughput and integrative approaches to the collection, management, and analysis of heterogeneous data sets has become imperative. This need is particularly pressing in the translational bioinformatics domain, where many fundamental research questions require the integration of large scale, multi-dimensional clinical phenotype and bio-molecular data sets. Modern biomedical informatics theory and practice has demonstrated the distinct benefits associated with the use of knowledge-based systems in such contexts. A knowledge-based system can be defined as an intelligent agent that employs a computationally tractable knowledge base or repository in order to reason upon data in a targeted domain and reproduce expert performance relative to such reasoning operations. The ultimate goal of the design and use of such agents is to increase the reproducibility, scalability, and accessibility of complex reasoning tasks. Examples of the application of knowledge-based systems in biomedicine span a broad spectrum, from the execution of clinical decision support, to epidemiologic surveillance of public data sets for the purposes of detecting emerging infectious diseases, to the discovery of novel hypotheses in large-scale research data sets. In this chapter, we will review the basic theoretical frameworks that define core knowledge types and reasoning operations with particular emphasis on the applicability of such conceptual models within the biomedical domain, and then go on to introduce a number of prototypical data integration requirements and patterns relevant to the conduct of translational bioinformatics that can be addressed via the design and use of knowledge-based systems.
An innovative approach to compensator design
NASA Technical Reports Server (NTRS)
Mitchell, J. R.
1972-01-01
The primary goal is to present a computer-aided compensator design technique for control systems from a frequency-domain point of view. The thesis for developing this technique is to describe the open-loop frequency response by n discrete frequency points, which result in n functions of the compensator coefficients. Several of these functions are chosen so that the system specifications are properly portrayed; then mathematical programming is used to improve all of those functions which have values below minimum standards. To do this, several definitions regarding measurement of system performance in the frequency domain are given. Next, theorems which govern the number of compensator coefficients necessary to make improvements in a certain number of functions are proved. After this, a mathematical programming tool for aiding in the solution of the problem is developed. Then, for applying the constraint improvement algorithm, generalized gradients for the constraints are derived. Finally, the necessary theory is incorporated in a computer program called CIP (compensator improvement program).
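The setup is easy to sketch: sample the open-loop response at n discrete frequencies, express each specification as a function of the compensator coefficients, and drive the below-standard ones upward. A toy Python version using a general-purpose optimizer in place of CIP's constraint improvement algorithm (the plant, compensator form, and single gain spec are all illustrative):

```python
import numpy as np
from scipy.optimize import minimize

w = np.logspace(-1, 2, 40)                 # n discrete frequency points
G = 1.0 / ((1j * w) * (1j * w + 1.0))      # fixed plant G(jw)

def loop(coeffs):
    k, z, p = coeffs                       # lead compensator k (s+z)/(s+p)
    return k * (1j * w + z) / (1j * w + p) * G

def shortfall(coeffs):
    """Penalty on frequencies where the spec (low-frequency loop gain of at
    least 20 dB below w = 1 rad/s) is violated."""
    gain_db = 20 * np.log10(np.abs(loop(coeffs)))
    return np.sum(np.maximum(0.0, 20.0 - gain_db[w < 1.0]) ** 2)

res = minimize(shortfall, x0=[1.0, 1.0, 10.0], method="Nelder-Mead")
print(res.x, shortfall(res.x))             # shortfall driven to (near) zero
```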
Developing an Undergraduate Public Health Introductory Core Course Series
Nelson-Hurwitz, Denise C.; Tagorda, Michelle; Kehl, Lisa; Buchthal, Opal V.; Braun, Kathryn L.
2018-01-01
The number of undergraduate public health education programs is increasing, but few publications provide examples of introductory public health courses that provide foundational knowledge and meet 2016 Council on Education in Public Health (CEPH) accreditation standards. This article presents the development and testing of a three-course, introductory series in public health at the University of Hawai‘i at Mānoa (UHM). Development was informed by best pedagogical practices in education, web review of existing programs, literature review, key informant interviews, and accreditation standards. Student mastery of required concepts, domains, and competencies is assessed through testing and class assignments. Data from course evaluations, students' exit questionnaires at graduation, and faculty feedback were used to continuously evolve and adapt the curriculum. The three-course series—including Introduction to Public Health, Public Health Issues in Hawai‘i, and Introduction to Global Health—was designed to provide incoming undergraduate public health students with a foundation in local, national, and global public health concepts and domains, while improving their skills in public health communication and information literacy. Data from class assignments, examinations, and later coursework suggest students are mastering the course materials and gaining required competencies. Data from course evaluation and exit questionnaires suggest that the students appreciate the series' approach and the challenge to apply course concepts locally and globally in subsequent courses. This foundational public health series provides a model for an introductory course series that can be implemented with existing resources by most programs, meets the new CEPH requirements, is well-received by students, and prepares students well for upper-division public health courses. PMID:29892596
A Multiphysics and Multiscale Software Environment for Modeling Astrophysical Systems
NASA Astrophysics Data System (ADS)
Portegies Zwart, Simon; McMillan, Steve; O'Nualláin, Breanndán; Heggie, Douglas; Lombardi, James; Hut, Piet; Banerjee, Sambaran; Belkus, Houria; Fragos, Tassos; Fregeau, John; Fuji, Michiko; Gaburov, Evghenii; Glebbeek, Evert; Groen, Derek; Harfst, Stefan; Izzard, Rob; Jurić, Mario; Justham, Stephen; Teuben, Peter; van Bever, Joris; Yaron, Ofer; Zemp, Marcel
We present MUSE, a software framework for tying together existing computational tools for different astrophysical domains into a single multiphysics, multiscale workload. MUSE facilitates the coupling of existing codes written in different languages by providing inter-language tools and by specifying an interface between each module and the framework that represents a balance between generality and computational efficiency. This approach allows scientists to use combinations of codes to solve highly-coupled problems without the need to write new codes for other domains or significantly alter their existing codes. MUSE currently incorporates the domains of stellar dynamics, stellar evolution and stellar hydrodynamics for a generalized stellar systems workload. MUSE has now reached a "Noah's Ark" milestone, with two available numerical solvers for each domain. MUSE can treat small stellar associations, galaxies and everything in between, including planetary systems, dense stellar clusters and galactic nuclei. Here we demonstrate an example calculated with MUSE: the merger of two galaxies. In addition, we demonstrate MUSE running on a distributed computer. The current MUSE code base is publicly available as open source at http://muse.li.
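A hypothetical sketch of the kind of least-common-denominator module interface such a framework specifies, with a shared clock driving operator-split coupling (class and method names are invented for illustration, not MUSE's actual interface):

```python
from abc import ABC, abstractmethod

class PhysicsModule(ABC):
    """One astrophysical domain solver behind a uniform interface."""
    @abstractmethod
    def evolve_until(self, time: float) -> None: ...

class StellarDynamics(PhysicsModule):
    def __init__(self):
        self.t = 0.0
    def evolve_until(self, time):
        self.t = time        # an N-body integration step would go here

class StellarEvolution(PhysicsModule):
    def __init__(self):
        self.t = 0.0
    def evolve_until(self, time):
        self.t = time        # track stellar ages, masses, radii here

modules = [StellarDynamics(), StellarEvolution()]
for t in (1.0, 2.0, 3.0):    # operator splitting over a shared clock
    for m in modules:
        m.evolve_until(t)    # coupled quantities exchanged between steps
```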
VOE Computer Programming: Scope and Sequence.
ERIC Educational Resources Information Center
Nashville - Davidson County Metropolitan Public Schools, TN.
This guide, which was written as an initial step in the development of a systemwide articulated curriculum sequence for all vocational programs within the Metropolitan Nashville Public School System, outlines the suggested scope and sequence of a 3-year program in computer programming. The guide consists of a course description; general course…
ERIC Educational Resources Information Center
Kozbelt, Aaron; Dexter, Scott; Dolese, Melissa; Meredith, Daniel; Ostrofsky, Justin
2015-01-01
We applied computer-based text analyses of regressive imagery to verbal protocols of individuals engaged in creative problem-solving in two domains: visual art (23 experts, 23 novices) and computer programming (14 experts, 14 novices). Percentages of words involving primary process and secondary process thought, plus emotion-related words, were…
Roth, Alexis M; Ackermann, Ronald T; Downs, Stephen M; Downs, Anne M; Zillich, Alan J; Holmes, Ann M; Katz, Barry P; Murray, Michael D; Inui, Thomas S
2010-06-01
In 2003, the Indiana Office of Medicaid Policy and Planning launched the Indiana Chronic Disease Management Program (ICDMP), a programme intended to improve the health and healthcare utilization of 15,000 Aged, Blind and Disabled Medicaid members living with diabetes and/or congestive heart failure in Indiana. Within ICDMP, programme components derived from the Chronic Care Model and education based on an integrated theoretical framework were utilized to create a telephonic care management intervention that was delivered by trained, non-clinical Care Managers (CMs) working under the supervision of a Registered Nurse. CMs utilized computer-assisted health education scripts to address clinically important topics, including medication adherence, diet, exercise and prevention of disease-specific complications. Employing reflective listening techniques, barriers to optimal self-management were assessed and members were encouraged to engage in health-improving actions. ICDMP evaluation results suggest that this low-intensity telephonic intervention shifted utilization and lowered costs. We discuss this patient-centred method for motivating behaviour change, the theoretical constructs underlying the scripts and the branched-logic format that makes them suitable to use as a computer-based application. Our aim is to share these public-domain materials with other programmes.
Frequency-Domain Identification Of Aeroelastic Modes
NASA Technical Reports Server (NTRS)
Acree, C. W., Jr.; Tischler, Mark B.
1991-01-01
Report describes flight measurements and frequency-domain analyses of aeroelastic vibrational modes of wings of XV-15 tilt-rotor aircraft. Begins with description of flight-test methods. Followed by brief discussion of methods of analysis, which include Fourier-transform computations using chirp z-transforms, use of coherence and other spectral functions, and methods and computer programs to obtain frequencies and damping coefficients from measurements. Includes brief description of results of flight tests and comparisons among various experimental and theoretical results. Ends with section on conclusions and recommended improvements in techniques.
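The workflow described—spectral estimates, a coherence check, then frequency and damping from the response—can be sketched with standard signal-processing tools. A Python toy with a simulated 5 Hz, 2%-damped mode standing in for the flight data (all parameters illustrative):

```python
import numpy as np
from scipy import signal

fs = 200.0                                  # sample rate, Hz
t = np.arange(0.0, 60.0, 1.0 / fs)
rng = np.random.default_rng(3)
u = rng.standard_normal(t.size)             # broadband excitation input

# Simulated structural mode: 5 Hz natural frequency, 2% damping, plus noise.
wn, zeta = 2 * np.pi * 5.0, 0.02
num_d, den_d, _ = signal.cont2discrete(
    ([wn**2], [1.0, 2 * zeta * wn, wn**2]), 1.0 / fs)
y = signal.lfilter(np.squeeze(num_d), den_d, u) \
    + 0.1 * rng.standard_normal(t.size)

f, coh = signal.coherence(u, y, fs=fs, nperseg=2048)  # data-quality check
_, Puy = signal.csd(u, y, fs=fs, nperseg=2048)
_, Puu = signal.welch(u, fs=fs, nperseg=2048)
H = np.abs(Puy / Puu)                       # H1 frequency-response estimate
k = np.argmax(H)
print(f"peak at {f[k]:.2f} Hz, coherence {coh[k]:.2f}")
```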
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-21
... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2012-0067] Privacy Act of 1974; Computer Matching... Security Administration (SSA). ACTION: Notice of a renewal of an existing computer matching program... INFORMATION: A. General The Computer Matching and Privacy Protection Act of 1988 (Public Law (Pub. L.) 100-503...
The Generation Challenge Programme Platform: Semantic Standards and Workbench for Crop Science
Bruskiewich, Richard; Senger, Martin; Davenport, Guy; Ruiz, Manuel; Rouard, Mathieu; Hazekamp, Tom; Takeya, Masaru; Doi, Koji; Satoh, Kouji; Costa, Marcos; Simon, Reinhard; Balaji, Jayashree; Akintunde, Akinnola; Mauleon, Ramil; Wanchana, Samart; Shah, Trushar; Anacleto, Mylah; Portugal, Arllet; Ulat, Victor Jun; Thongjuea, Supat; Braak, Kyle; Ritter, Sebastian; Dereeper, Alexis; Skofic, Milko; Rojas, Edwin; Martins, Natalia; Pappas, Georgios; Alamban, Ryan; Almodiel, Roque; Barboza, Lord Hendrix; Detras, Jeffrey; Manansala, Kevin; Mendoza, Michael Jonathan; Morales, Jeffrey; Peralta, Barry; Valerio, Rowena; Zhang, Yi; Gregorio, Sergio; Hermocilla, Joseph; Echavez, Michael; Yap, Jan Michael; Farmer, Andrew; Schiltz, Gary; Lee, Jennifer; Casstevens, Terry; Jaiswal, Pankaj; Meintjes, Ayton; Wilkinson, Mark; Good, Benjamin; Wagner, James; Morris, Jane; Marshall, David; Collins, Anthony; Kikuchi, Shoshi; Metz, Thomas; McLaren, Graham; van Hintum, Theo
2008-01-01
The Generation Challenge programme (GCP) is a global crop research consortium directed toward crop improvement through the application of comparative biology and genetic resources characterization to plant breeding. A key consortium research activity is the development of a GCP crop bioinformatics platform to support GCP research. This platform includes the following: (i) shared, public platform-independent domain models, ontology, and data formats to enable interoperability of data and analysis flows within the platform; (ii) web service and registry technologies to identify, share, and integrate information across diverse, globally dispersed data sources, as well as to access high-performance computational (HPC) facilities for computationally intensive, high-throughput analyses of project data; (iii) platform-specific middleware reference implementations of the domain model integrating a suite of public (largely open-access/-source) databases and software tools into a workbench to facilitate biodiversity analysis, comparative analysis of crop genomic data, and plant breeding decision making. PMID:18483570
ERIC Educational Resources Information Center
Hawaii State Dept. of Education, Honolulu. Office of Instructional Services.
Intended to provide guidance in the selection of the best computer software available to support instruction and to make optimal use of schools' financial resources, this publication provides a listing of computer software programs that have been evaluated according to their currency, relevance, and value to Hawaii's educational programs. The…
An ecological and theoretical deconstruction of a school-based obesity prevention program in Mexico.
Safdie, Margarita; Cargo, Margaret; Richard, Lucie; Lévesque, Lucie
2014-08-10
Ecological intervention programs are recommended to prevent overweight and obesity in children. The National Institute of Public Health (INSP) in Mexico implemented a successful ecological intervention program to promote healthy lifestyle behaviors in school age children. This study assessed the integration of ecological principles and Social Cognitive Theory (SCT) constructs in this effective school-based obesity prevention program implemented in 15 elementary schools in Mexico City. Two coders applied the Intervention Analysis Procedure (IAP) to "map" the program's integration of ecological principles. A checklist gauged the use of SCT theory in program activities. Thirty-two distinct intervention strategies were implemented in one setting (i.e., school) to engage four different target-groups (students, parents, school representatives, government) across two domains (Nutrition and Physical Activity). Overall, 47.5% of the strategies targeted the school infrastructure and/or personnel; 37.5% of strategies targeted a key political actor, the Public Education Secretariat while fewer strategies targeted parents (12.5%) and children (3%). More strategies were implemented in the Nutrition domain (69%) than Physical Activity (31%). The most frequently used SCT construct within both intervention domains was Reciprocal Determinism (e.g., where changes to the environment influence changes in behavior and these behavioral changes influence further changes to the environment); no significant differences were observed in the use of SCT constructs across domains. Findings provide insight into a promising combination of strategies and theoretical constructs that can be used to implement a school-based obesity prevention program. Strategies emphasized school-level infrastructure/personnel change and strong political engagement and were most commonly underpinned by Reciprocal Determinism for both Nutrition and Physical Activity.
CIS Program Redesign Driven by IS2010 Model: A Case Study
ERIC Educational Resources Information Center
Surendran, Ken; Amer, Suhair; Schwieger, Dana
2012-01-01
The release of the IS2010 Model Curriculum has triggered review of existing Information Systems (IS) programs. It also provides an opportunity to replace low enrollment IS programs with flexible ones that focus on specific application domains. In this paper, the authors present a case study of their redesigned Computer Information Systems (CIS)…
NASA Astrophysics Data System (ADS)
Fabien-Ouellet, Gabriel; Gloaguen, Erwan; Giroux, Bernard
2017-03-01
Full Waveform Inversion (FWI) aims at recovering the elastic parameters of the Earth by matching recordings of the ground motion with the direct solution of the wave equation. Modeling the wave propagation for realistic scenarios is computationally intensive, which limits the applicability of FWI. The current hardware evolution brings increasing parallel computing power that can speed up the computations in FWI. However, to take advantage of the diversity of parallel architectures presently available, new programming approaches are required. In this work, we explore the use of OpenCL to develop a portable code that can take advantage of the many parallel processor architectures now available. We present a program called SeisCL for 2D and 3D viscoelastic FWI in the time domain. The code computes the forward and adjoint wavefields using finite differences and outputs the gradient of the misfit function given by the adjoint state method. To demonstrate the code portability on different architectures, the performance of SeisCL is tested on three different devices: Intel CPUs, NVidia GPUs and Intel Xeon PHI. Results show that the use of GPUs with OpenCL can speed up the computations by nearly two orders of magnitude over a single-threaded application on the CPU. Although OpenCL allows code portability, we show that some device-specific optimization is still required to get the best performance out of a specific architecture. Using OpenCL in conjunction with MPI allows the domain decomposition of large models on several devices located on different nodes of a cluster. For large enough models, the speedup of the domain decomposition varies quasi-linearly with the number of devices. Finally, we investigate two different approaches to compute the gradient by the adjoint state method and show the significant advantages of using OpenCL for FWI.
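The adjoint-state method admits a compact generic statement. With a discrete forward problem A(m)u = q, sampling operator P, observed data d, and misfit J(m) = ½‖Pu − d‖², one adjoint solve yields the full gradient (generic notation, not SeisCL's):

```latex
% Generic discrete adjoint-state gradient (not SeisCL-specific notation):
A(m)\,u = q, \qquad
A(m)^{\top}\lambda = -\,P^{\top}\!\left(Pu - d\right), \qquad
\nabla_m J = \lambda^{\top}\,\frac{\partial A(m)}{\partial m}\,u .
```

The cost is one forward and one adjoint wavefield simulation per source, independent of the number of model parameters, which is why the forward and adjoint kernels dominate the runtime that the OpenCL devices accelerate.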
2012-05-01
cloud computing. NASA Nebula Platform • Cloud computing pilot program at NASA Ames • Integrates open-source components into seamless, self... • Mission support • Education and public outreach (NASA Nebula, 2010). NSF Supported Cloud Research • Support for Cloud Computing in... Mell, P. & Grance, T. (2011). The NIST Definition of Cloud Computing. NIST Special Publication 800-145 • NASA Nebula (2010). Retrieved from
ATK Launch Systems Engineering NASA Programs Engineering Examples
NASA Technical Reports Server (NTRS)
Richardson, David
2007-01-01
This presentation provides an overview of the work done at ATK Launch Systems, with an indication of how engineering knowledge can be applied to several real-world problems. All material in the presentation has been screened to meet ITAR restrictions. The information provided is a compilation of general engineering knowledge and material available in the public domain. The presentation provides an overview of ATK Launch Systems and NASA programs. Some discussion is provided about the types of engineering conducted at the Promontory plant, with added detail about RSRM nozzle engineering. Brief examples of nozzle technical issues with regard to adhesives and phenolics are shared. These technical issue discussions are based on material available in the public domain.
Core Competencies for Injury and Violence Prevention
Stephens-Stidham, Shelli; Peek-Asa, Corinne; Bou-Saada, Ingrid; Hunter, Wanda; Lindemer, Kristen; Runyan, Carol
2009-01-01
Efforts to reduce the burden of injury and violence require a workforce that is knowledgeable and skilled in prevention. However, there has been no systematic process to ensure that professionals possess the necessary competencies. To address this deficiency, we developed a set of core competencies for public health practitioners in injury and violence prevention programs. The core competencies address domains including public health significance, data, the design and implementation of prevention activities, evaluation, program management, communication, stimulating change, and continuing education. Specific learning objectives establish goals for training in each domain. The competencies assist in efforts to reduce the burden of injury and violence and can provide benchmarks against which to assess progress in professional capacity for injury and violence prevention. PMID:19197083
Code of Federal Regulations, 2012 CFR
2012-10-01
... 42 Public Health 1 2012-10-01 2012-10-01 false Definitions. 121.2 Section 121.2 Public Health... PROCUREMENT AND TRANSPLANTATION NETWORK § 121.2 Definitions. As used in this part— Act means the Public Health..., transplant recipient, or organ donor. OPTN computer match program means a set of computer-based instructions...
Code of Federal Regulations, 2011 CFR
2011-10-01
... 42 Public Health 1 2011-10-01 2011-10-01 false Definitions. 121.2 Section 121.2 Public Health... PROCUREMENT AND TRANSPLANTATION NETWORK § 121.2 Definitions. As used in this part— Act means the Public Health..., transplant recipient, or organ donor. OPTN computer match program means a set of computer-based instructions...
Advanced public transportation systems : the state of the art update 2000
DOT National Transportation Integrated Search
2000-12-01
This report documents work performed under FTA's Advanced Public Transportation Systems (APTS) Program, a program structured to undertake research and development of innovative applications of advanced navigation, communication, information, computer...
2013-01-01
Background In recent years, there have been numerous initiatives undertaken to describe critical information needs related to the collection, management, analysis, and dissemination of data in support of biomedical research (J Investig Med 54:327-333, 2006); (J Am Med Inform Assoc 16:316–327, 2009); (Physiol Genomics 39:131-140, 2009); (J Am Med Inform Assoc 18:354–357, 2011). A common theme spanning such reports has been the importance of understanding and optimizing people, organizational, and leadership factors in order to achieve the promise of efficient and timely research (J Am Med Inform Assoc 15:283–289, 2008). With the emergence of clinical and translational science (CTS) as a national priority in the United States, and the corresponding growth in the scale and scope of CTS research programs, the acuity of such information needs continues to increase (JAMA 289:1278–1287, 2003); (N Engl J Med 353:1621–1623, 2005); (Sci Transl Med 3:90, 2011). At the same time, systematic evaluations of optimal people, organizational, and leadership factors that influence the provision of data, information, and knowledge management technologies and methods are notably lacking. Methods In response to the preceding gap in knowledge, we have conducted both: 1) a structured survey of domain experts at Academic Health Centers (AHCs); and 2) a subsequent thematic analysis of public-domain documentation provided by those same organizations. The results of these approaches were then used to identify critical factors that may influence access to informatics expertise and resources relevant to the CTS domain. Results A total of 31 domain experts, spanning the Biomedical Informatics (BMI), Computer Science (CS), Information Science (IS), and Information Technology (IT) disciplines, participated in a structured survey process. At a high level, respondents identified notable differences in the access to BMI, CS, and IT expertise and services depending on the establishment of a formal BMI academic unit and the perceived relationship between BMI, CS, IS, and IT leaders. Subsequent thematic analysis of the aforementioned public domain documents demonstrated a discordance between perceived and reported integration across and between BMI, CS, IS, and IT programs and leaders with relevance to the CTS domain. Conclusion Differences in people, organization, and leadership factors do influence the effectiveness of CTS programs, particularly with regard to the ability to access and leverage BMI, CS, IS, and IT expertise and resources. Based on this finding, we believe that the development of a better understanding of how optimal BMI, CS, IS, and IT organizational structures and leadership models are designed and implemented is critical to both the advancement of CTS and, ultimately, to improvements in the quality, safety, and effectiveness of healthcare. PMID:23388243
ERIC Educational Resources Information Center
Chamberlain, Ed
A cost benefit study was conducted to determine the effectiveness of a computer assisted instruction/computer management system (CAI/CMS) as an alternative to conventional methods of teaching reading within Chapter 1 and DPPF funded programs of the Columbus (Ohio) Public Schools. The Chapter 1 funded Compensatory Language Experiences and Reading…
Diamond High Assurance Security Program: Trusted Computing Exemplar
2002-09-01
Fragmentary excerpt describing the program's trusted computing reference component, the Embedded MicroKernel Prototype: a third-party evaluation of the component will be initiated during development; extension to other target technologies and larger projects is a topic for future research; the primary security function of the Embedded MicroKernel will be to enforce process and data-domain separation, while providing primitive…
Application of ubiquitous computing in personal health monitoring systems.
Kunze, C; Grossmann, U; Stork, W; Müller-Glaser, K D
2002-01-01
One way to significantly reduce the costs of public health systems is to make greater use of information technology. The Laboratory for Information Processing Technology (ITIV) at the University of Karlsruhe is developing a personal health monitoring system intended to improve health care and at the same time reduce costs by combining micro-technological smart sensors with personalized, mobile computing systems. In this paper we present how ubiquitous computing theory can be applied in the health-care domain.
ERIC Educational Resources Information Center
Warschauer, Mark; Arada, Kathleen; Zheng, Binbin
2010-01-01
Can daily access to laptop computers help students become better writers? Are such programs affordable? Evidence from the Inspired Writing program in Littleton Public Schools, Colorado, USA, provides a resounding yes to both questions. The program employs student netbooks, open-source software, cloud computing, and social media to help students in…
Advanced public transportation systems : the state of the art update of 1998
DOT National Transportation Integrated Search
1998-01-01
This report documents work performed under FTA's Advanced Public Transportation Systems (APTS) Program, a program structured to undertake research and development of innovative applications of advanced navigation, information, computer, and communica...
ERIC Educational Resources Information Center
Barrera-Osorio, Felipe; Linden, Leigh L.
2009-01-01
This paper presents the evaluation of the program Computers for Education. The program aims to integrate computers, donated by the private sector, into the teaching of language in public schools. The authors conduct a two-year randomized evaluation of the program using a sample of 97 schools and 5,201 children. Overall, the program seems to have…
ERIC Educational Resources Information Center
Veley, Victor F.; And Others
This report presents a master plan for the development of computer science and computer-related programs at Los Angeles Trade-Technical College for 1982 through 1985. Introductory material outlines the main elements of the plan: to analyze existing computer courses, to create new courses in Laser Technology, Genetic Engineering, and Robotics; and…
Computers and Children: Problems and Possibilities.
ERIC Educational Resources Information Center
Siegfried, Pat
1983-01-01
Discusses the use of computers by children, highlighting a definition of computer literacy, computer education in schools, computer software, microcomputers, programming languages, and public library involvement. Seven references and a 40-item bibliography are included. (EJS)
U.S. Spacesuit Knowledge Capture Accomplishments in Fiscal Years 2012 and 2013
NASA Technical Reports Server (NTRS)
Chullen, Cinda; Oliva, Vladenka R.
2014-01-01
The NASA U.S. spacesuit knowledge capture (KC) program has been in operation since the beginning of 2008. The program was designed to provide engineers and others with historical information about spacesuits. A multitude of seminars have captured spacesuit history and knowledge over the six years of the program's existence. Subject matter experts have provided lectures and been interviewed to help bring the spacesuit to life so that lessons learned will never be lost. The program has also concentrated on reaching out to the public and industry by making the recorded events part of the public domain through the NASA technical library via YouTube. The U.S. spacesuit KC topics have included lessons learned from some of the most prominent spacesuit experts and spacesuit users, including current and former astronauts. The events have enriched the spacesuit legacy knowledge from the Gemini, Apollo, Skylab, Space Shuttle, and International Space Station Programs, and expert engineers and scientists have shared their challenges and successes so they will be remembered. The last few years have been some of the most successful of the KC program's life, with numerous recordings and releases to the public, as evidenced by the thousands who have viewed the recordings online. This paper reviews the events accomplished and archived over Fiscal Years 2012 and 2013 and highlights a few of the most memorable ones. It also describes ways to access the events, which are available internally to NASA as well as in the public domain.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 45 Public Welfare 2 2013-10-01 2012-10-01 true Retrospective budgeting; determining eligibility and computing the assistance payment in the initial one or two months. 233.24 Section 233.24 Public Welfare Regulations Relating to Public Welfare OFFICE OF FAMILY ASSISTANCE (ASSISTANCE PROGRAMS...
Code of Federal Regulations, 2010 CFR
2010-10-01
... 45 Public Welfare 2 2010-10-01 2010-10-01 false Retrospective budgeting; determining eligibility and computing the assistance payment in the initial one or two months. 233.24 Section 233.24 Public Welfare Regulations Relating to Public Welfare OFFICE OF FAMILY ASSISTANCE (ASSISTANCE PROGRAMS...
Code of Federal Regulations, 2010 CFR
2010-10-01
... 45 Public Welfare 2 2010-10-01 2010-10-01 false Retrospective budgeting; computing the assistance payment after the initial one or two months. 233.25 Section 233.25 Public Welfare Regulations Relating to Public Welfare OFFICE OF FAMILY ASSISTANCE (ASSISTANCE PROGRAMS), ADMINISTRATION FOR CHILDREN AND...
Code of Federal Regulations, 2013 CFR
2013-10-01
... 45 Public Welfare 2 2013-10-01 2012-10-01 true Retrospective budgeting; computing the assistance payment after the initial one or two months. 233.25 Section 233.25 Public Welfare Regulations Relating to Public Welfare OFFICE OF FAMILY ASSISTANCE (ASSISTANCE PROGRAMS), ADMINISTRATION FOR CHILDREN AND...
ERIC Educational Resources Information Center
Recker, Margaret M.; Pirolli, Peter
Students learning to program recursive LISP functions in a typical school-like lesson on recursion were observed. The typical lesson contains text and examples and involves solving a series of programming problems. The focus of this study is on students' learning strategies in new domains. In this light, a Soar computational model of…
Miller, J.J.
1982-01-01
The spectral analysis and filter program package is written in the BASIC language for the HP-9845T desktop computer. The program's main purpose is to perform spectral analyses on digitized time-domain data. In addition, band-pass filtering of the data can be performed in the time domain. Various other processes, such as autocorrelation, can be performed on the time-domain data in order to precondition them for spectral analysis. The frequency-domain data can also be transformed back into the time domain if desired. Any data can be displayed on the CRT in graphic form using a variety of plot routines. A hard copy can be obtained immediately using the internal thermal printer. Data can also be displayed in tabular form on the CRT or internal thermal printer, or stored permanently on a mass storage device such as tape or disk. A list of the processes performed, in the order in which they occurred, can be displayed at any time.
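The same workflow — band-pass a digitized time series in the time domain, then estimate its spectrum — can be sketched today in a few lines of Python (a hedged modern analogue, not the original HP-9845T BASIC; the sampling rate and filter band are made up):

```python
# Band-pass filtering followed by spectral analysis of a noisy time series.
import numpy as np
from scipy import signal

fs = 1000.0                                    # sampling rate, Hz (assumed)
t = np.arange(0, 2.0, 1.0 / fs)
x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.random.randn(t.size)

# Time-domain band-pass filtering (4th-order Butterworth, 30-70 Hz)
b, a = signal.butter(4, [30, 70], btype="bandpass", fs=fs)
y = signal.filtfilt(b, a, x)

# Spectral estimate of the filtered record (Welch periodogram)
f, pxx = signal.welch(y, fs=fs, nperseg=512)
print(f[np.argmax(pxx)])                       # dominant frequency near 50 Hz
```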
Mekhora, Keerin; Jalayondeja, Wattana; Jalayondeja, Chutima; Bhuanantanondh, Petcharatana; Dusadiisariyavong, Asadang; Upiriyasakul, Rujiret; Anuraktam, Khajornyod
2014-07-01
To develop an online, self-report questionnaire on computer work-related exposure (OSCWE) and to determine the internal consistency and face and content validity of the questionnaire. The online, self-report questionnaire was developed to determine the risk factors related to musculoskeletal disorders in computer users. It comprised five domains: personal, work-related, work environment, physical health, and psychosocial factors. The questionnaire's content was validated by an occupational medical doctor and three physical therapy lecturers involved in ergonomic teaching. Twenty-five lay people examined the feasibility of computer administration and the user-friendliness of the language. The item correlation in each domain was analyzed by internal consistency (Cronbach's alpha). The content of the questionnaire was considered congruent with the testing purposes. Eight hundred and thirty-five computer users at the PTT Exploration and Production Public Company Limited registered to the online self-report questionnaire. The internal consistency of the five domains was: personal (alpha = 0.58), work-related (alpha = 0.348), work environment (alpha = 0.72), physical health (alpha = 0.68), and psychosocial factors (alpha = 0.93). The findings suggested that the OSCWE had acceptable internal consistency for the work environment and psychosocial factor domains. The OSCWE is available for use in population-based survey research among computer office workers.
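For readers unfamiliar with the statistic reported above: Cronbach's alpha for a domain of k items is alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal sketch with synthetic data (not the OSCWE responses):

```python
# Cronbach's alpha, the internal-consistency statistic reported above.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: (n_respondents, k_items) matrix of scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
scores = rng.integers(1, 6, size=(100, 8))   # 100 respondents, 8 Likert items
print(round(cronbach_alpha(scores), 3))
```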
Foresters' Metric Conversions program (version 1.0). [Computer program
Jefferson A. Palmer
1999-01-01
The conversion of scientific measurements has become commonplace in the fields of engineering, research, and forestry. Foresters' Metric Conversions is a Windows-based computer program that quickly converts user-defined measurements from English to metric and from metric to English. Foresters' Metric Conversions was derived from the publication "Metric...
SYSTID - A flexible tool for the analysis of communication systems.
NASA Technical Reports Server (NTRS)
Dawson, C. T.; Tranter, W. H.
1972-01-01
Description of the System Time Domain Simulation (SYSTID) computer-aided analysis program which is specifically structured for communication systems analysis. The SYSTID program is user oriented so that very little knowledge of computer techniques and very little programming ability are required for proper application. The program is designed so that the user can go from a system block diagram to an accurate simulation by simply programming a single English language statement for each block in the system. The mathematical and functional models available in the SYSTID library are presented. An example problem is given which illustrates the ease of modeling communication systems. Examples of the outputs available are presented, and proposed improvements are summarized.
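The "one statement per block" idea can be conveyed with a toy simulation in which each line corresponds to one block of a communication-system diagram (a hedged sketch in Python; SYSTID's actual English-language statements and library models differ):

```python
# Toy block-diagram simulation: one line per block (illustrative only).
import numpy as np

fs = 1e4                                            # sample rate (assumed)
t = np.arange(0, 1e-2, 1 / fs)

source  = np.sign(np.sin(2 * np.pi * 300 * t))      # data source block
carrier = np.cos(2 * np.pi * 2e3 * t)               # oscillator block
tx      = source * carrier                          # modulator block
rx      = tx + 0.1 * np.random.randn(t.size)        # noisy channel block
demod   = rx * carrier                              # coherent demodulator block
msg     = np.convolve(demod, np.ones(25) / 25, "same")  # low-pass filter block
```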
Yang, L. H.; Brooks III, E. D.; Belak, J.
1992-01-01
A molecular dynamics algorithm for performing large-scale simulations using the Parallel C Preprocessor (PCP) programming paradigm on the BBN TC2000, a massively parallel computer, is discussed. The algorithm uses a linked-cell data structure to obtain the near neighbors of each atom as time evolves. Each processor is assigned to a geometric domain containing many subcells, and the storage for that domain is private to the processor. Within this scheme, the interdomain (i.e., interprocessor) communication is minimized.
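The linked-cell idea — bin atoms into subcells no smaller than the cutoff, then scan only the 27 neighboring subcells — can be sketched serially as follows (a hedged, illustrative Python version; the original is a parallel C/PCP code with one geometric domain per processor):

```python
# Linked-cell neighbor search, serial toy version (illustrative only).
import numpy as np
from collections import defaultdict

L, rc = 10.0, 1.0                     # periodic box length, cutoff radius
pos = np.random.rand(500, 3) * L
ncell = int(L // rc)                  # subcells of width >= rc per dimension
width = L / ncell

cells = defaultdict(list)             # linked-cell structure: cell -> atom ids
for i, p in enumerate(pos):
    cells[tuple((p // width).astype(int) % ncell)].append(i)

def neighbors(i):
    """Atoms within rc of atom i, scanning only the 27 adjacent subcells."""
    c = (pos[i] // width).astype(int) % ncell
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            for dz in (-1, 0, 1):
                for j in cells[tuple((c + (dx, dy, dz)) % ncell)]:
                    d = pos[j] - pos[i]
                    d -= L * np.round(d / L)      # minimum-image convention
                    if j != i and np.dot(d, d) < rc * rc:
                        yield j
```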
The Vale rangeland rehabilitation program: the desert repaired in southeastern Oregon.
Harold F. Heady; James Bartolome
1977-01-01
Discusses the initiation, execution, and outcome of an 11-year program of range rehabilitation on public domain lands in southeastern Oregon. Initiated primarily to benefit the livestock industry, the investment of $10 million in range improvements also profoundly affected other multiple uses. The analysis of this large and successful program should serve as a useful...
1980-12-01
Fragmentary excerpt from a security-kernel report. Recoverable references include Communications Corporation, Palo Alto, CA (March 1978), and [Walter et al. 74] Walter, K.G. et al., "Primitive Models for Computer Security", ESD-TR... The discussion is followed by a presentation of the Kernel primitive operations upon these objects. All Kernel objects shall be referenced by a common... set of sizes. All process segments, regardless of domain, shall be manipulated by the same set of Kernel segment primitives. User domain segments...
Ridgely, M Susan; Giard, Julienne; Shern, David; Mulkern, Virginia; Burnam, M Audrey
2002-01-01
Objective To develop an instrument to characterize public sector managed behavioral health care arrangements to capture key differences between managed and “unmanaged” care and among managed care arrangements. Study Design The instrument was developed by a multi-institutional group of collaborators with participation of an expert panel. Included are six domains predicted to have an impact on access, service utilization, costs, and quality. The domains are: characteristics of the managed care plan, enrolled population, benefit design, payment and risk arrangements, composition of provider networks, and accountability. Data are collected at three levels: managed care organization, subcontractor, and network of service providers. Data Collection Methods Data are collected through contract abstraction and key informant interviews. A multilevel coding scheme is used to organize the data into a matrix along key domains, which is then reviewed and verified by the key informants. Principal Findings This instrument can usefully differentiate between and among Medicaid fee-for-service programs and Medicaid managed care plans along key domains of interest. Beyond documenting basic features of the plans and providing contextual information, these data will support the refinement and testing of hypotheses about the impact of public sector managed care on access, quality, costs, and outcomes of care. Conclusions If managed behavioral health care research is to advance beyond simple case study comparisons, a well-conceptualized set of instruments is necessary. PMID:12236386
Using Computer Graphics in the 90's.
ERIC Educational Resources Information Center
Towne, Violet A.
Computer-Aided Design, a hands-on program for public school teachers, was first offered in the summer of 1987 as an outgrowth of a 1986 robotics training program. Area technology teachers needed computer-aided design (CAD) training because of a New York State Education system transition from the industrial arts curriculum to a new curriculum in…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-09
... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA-2009-0077] Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ Office of Personnel Management (OPM))--Match 1307 AGENCY: Social Security... INFORMATION: A. General The Computer Matching and Privacy Protection Act of 1988 (Public Law (Pub. L.) 100-503...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-09
... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA-2009-0066] Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ Internal Revenue Service (IRS))--Match 1305 AGENCY: Social Security... INFORMATION: A. General The Computer Matching and Privacy Protection Act of 1988 (Public Law (Pub. L.) 100-503...
Rocheta, Margarida; Dionísio, F Miguel; Fonseca, Luís; Pires, Ana M
2007-12-01
Paternity analysis using microsatellite information is a well-studied subject. These markers are ideal for parentage studies and fingerprinting, due to their high-discrimination power. This type of data is used to assign paternity, to compute the average selfing and outcrossing rates and to estimate the biparental inbreeding. There are several public domain programs that compute all this information from data. Most of the time, it is necessary to export data to some sort of format, feed it to the program and import the output to an Excel book for further processing. In this article we briefly describe a program referred from now on as Paternity Analysis in Excel (PAE), developed at IST and IBET (see the acknowledgments) that computes paternity candidates from data, and other information, from within Excel. In practice this means that the end user provides the data in an Excel sheet and, by pressing an appropriate button, obtains the results in another Excel sheet. For convenience PAE is divided into two modules. The first one is a filtering module that selects data from the sequencer and reorganizes it in a format appropriate to process paternity analysis, assuming certain conventions for the names of parents and offspring from the sequencer. The second module carries out the paternity analysis assuming that one parent is known. Both modules are written in Excel-VBA and can be obtained at the address (www.math.ist.utl.pt/~fmd/pa/pa.zip). They are free for non-commercial purposes and have been tested with different data and against different software (Cervus, FaMoz, and MLTR).
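The core exclusion logic behind such paternity analysis is simple at a single locus: one offspring allele must be explainable by the mother and the other by the candidate father. A hedged sketch of that logic (PAE itself is Excel-VBA and also estimates selfing/outcrossing rates; all names below are invented):

```python
# Single-locus compatibility test for paternity exclusion (illustrative).
def compatible(offspring, mother, father):
    """Each genotype is a tuple of two alleles at one microsatellite locus."""
    o1, o2 = offspring
    return (o1 in mother and o2 in father) or (o2 in mother and o1 in father)

def possible_fathers(offspring, mother, candidates):
    """Retain candidates not excluded at any locus (loci keyed by name)."""
    return [name for name, geno in candidates.items()
            if all(compatible(offspring[l], mother[l], geno[l])
                   for l in offspring)]

# Example: one locus, mother known, two candidate fathers
off = {"locA": (102, 106)}
mom = {"locA": (102, 104)}
dads = {"d1": {"locA": (106, 108)}, "d2": {"locA": (100, 104)}}
print(possible_fathers(off, mom, dads))   # ['d1']
```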
Description of an Introductory Learning Strategies Course for the Job Skills Educational Program.
ERIC Educational Resources Information Center
Murphy, Debra Ann; Derry, Sharon J.
The Job Skills Educational Program (JSEP), currently under development for the Army Research Institute, embeds learner strategies training within the context of a basic skills computer-assisted instruction curriculum. The curriculum is designed for low-ability soldiers, and consists largely of instruction in the domain of intellectual skills. An…
SLEEC: Semantics-Rich Libraries for Effective Exascale Computation. Final Technical Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Milind, Kulkarni
SLEEC (Semantics-rich Libraries for Effective Exascale Computation) was a project funded by the Department of Energy X-Stack Program, award number DE-SC0008629. The initial project period was September 2012–August 2015. The project was renewed for an additional year, expiring August 2016. Finally, the project received a no-cost extension, leading to a final expiry date of August 2017. Modern applications, especially those intended to run at exascale, are not written from scratch. Instead, they are built by stitching together various carefully-written, hand-tuned libraries. Correctly composing these libraries is difficult, but traditional compilers are unable to effectively analyze and transform across abstraction layers. Domain-specific compilers integrate semantic knowledge into compilers, allowing them to transform applications that use particular domain-specific languages or domain libraries. But they do not help when new domains are developed, or applications span multiple domains. SLEEC aims to fix these problems. To do so, we are building generic compiler and runtime infrastructures that are semantics-aware but not domain-specific. By performing optimizations related to the semantics of a domain library, the same infrastructure can be made generic and apply across multiple domains.
An innovative approach to compensator design
NASA Technical Reports Server (NTRS)
Mitchell, J. R.; Mcdaniel, W. L., Jr.
1973-01-01
The computer-aided design of a compensator for a control system is considered from a frequency-domain point of view. The design technique developed is based on describing the open-loop frequency response by n discrete frequency points, which result in n functions of the compensator coefficients. Several of these functions are chosen so that the system specifications are properly portrayed; then mathematical programming is used to improve all of those functions which have values below minimum standards. To do this, several definitions regarding the measurement of system performance in the frequency domain are given, e.g., relative stability, relative attenuation, proper phasing, etc. Next, theorems which govern the number of compensator coefficients necessary to make improvements in a certain number of functions are proved. After this, a mathematical programming tool for aiding in the solution of the problem is developed, called the constraint improvement algorithm. Then, for applying the constraint improvement algorithm, generalized gradients for the constraints are derived. Finally, the necessary theory is incorporated in a computer program called CIP (Compensator Improvement Program). The practical usefulness of CIP is demonstrated by two large system examples.
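The discrete-frequency-point formulation can be sketched as follows: sample the open-loop response at n frequencies, form one constraint function per point, and use mathematical programming to raise the violated ones. A hedged toy version (the plant, compensator form, and phase floor are invented; CIP's constraint-improvement algorithm is not reproduced):

```python
# One constraint function of the compensator coefficients per frequency point.
import numpy as np
from scipy.optimize import minimize

w = np.logspace(-1, 1.5, 40)                  # n discrete frequency points

def open_loop(c, s):
    """Toy plant 1/(s(s+1)) cascaded with lead compensator k(s+z)/(s+p)."""
    k, z, p = c
    return k * (s + z) / (s + p) / (s * (s + 1))

def constraints(c):
    """Phase margin above a 45-degree 'proper phasing' floor at each point."""
    L = open_loop(c, 1j * w)
    return np.degrees(np.unwrap(np.angle(L))) + 135.0

def violation(c):
    return np.sum(np.minimum(constraints(c), 0.0) ** 2)

res = minimize(violation, x0=[1.0, 1.0, 10.0])
print(res.x, violation(res.x))
```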
Spiral: Automated Computing for Linear Transforms
NASA Astrophysics Data System (ADS)
Püschel, Markus
2010-09-01
Writing fast software has become extraordinarily difficult. For optimal performance, programs and their underlying algorithms have to be adapted to take full advantage of the platform's parallelism, memory hierarchy, and available instruction set. To make things worse, the best implementations are often platform-dependent and platforms are constantly evolving, which quickly renders libraries obsolete. We present Spiral, a domain-specific program generation system for important functionality used in signal processing and communication including linear transforms, filters, and other functions. Spiral completely replaces the human programmer. For a desired function, Spiral generates alternative algorithms, optimizes them, compiles them into programs, and intelligently searches for the best match to the computing platform. The main idea behind Spiral is a mathematical, declarative, domain-specific framework to represent algorithms and the use of rewriting systems to generate and optimize algorithms at a high level of abstraction. Experimental results show that the code generated by Spiral competes with, and sometimes outperforms, the best available human-written code.
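The rewrite rules Spiral manipulates are breakdown identities for transforms; the classic example (a standard identity, shown here only to indicate the kind of rule meant) is the Cooley–Tukey factorization of the DFT:

```latex
\mathrm{DFT}_{km} = (\mathrm{DFT}_k \otimes I_m)\, T^{km}_m\, (I_k \otimes \mathrm{DFT}_m)\, L^{km}_k
```

where $T^{km}_m$ is a diagonal matrix of twiddle factors and $L^{km}_k$ a stride permutation; recursive application of such rules generates the space of alternative algorithms that Spiral searches.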
Knowledge-based public health situation awareness
NASA Astrophysics Data System (ADS)
Mirhaji, Parsa; Zhang, Jiajie; Srinivasan, Arunkumar; Richesson, Rachel L.; Smith, Jack W.
2004-09-01
There have been numerous efforts to create comprehensive databases from multiple sources to monitor the dynamics of public health and, most specifically, to detect potential threats of bioterrorism before widespread dissemination. But there is little evidence for the assertion that these systems are timely and dependable, or that they can reliably distinguish man-made from natural incidents. One must evaluate the value of so-called 'syndromic surveillance systems' along with the costs involved in design, development, implementation, and maintenance of such systems and the costs involved in investigation of the inevitable false alarms. In this article we will introduce a new perspective to the problem domain with a shift in paradigm from 'surveillance' toward 'awareness'. As we conceptualize a rather different approach to tackle the problem, we will introduce a different methodology in the application of information science, computer science, cognitive science, and human-computer interaction concepts to the design and development of so-called 'public health situation awareness systems'. We will share some of our design and implementation concepts for the prototype system that is under development in the Center for Biosecurity and Public Health Informatics Research at the University of Texas Health Science Center at Houston. The system is based on a knowledgebase containing ontologies, with different layers of abstraction from multiple domains, that provide the context for information integration, knowledge discovery, interactive data mining, information visualization, information sharing, and communications. The modular design of the knowledgebase and its knowledge representation formalism enables incremental evolution of the system from a partial system to a comprehensive knowledgebase of 'public health situation awareness' as it acquires new knowledge through interactions with domain experts or automatic discovery of new knowledge.
ERIC Educational Resources Information Center
Conn, Richard
1983-01-01
The two-part ZCPR2 System (system itself and related utilities) runs in place of CP/M 2.2 CCP. These public domain programs provide functions similar to those provided by CCP, but offer many enhancements. ZCPR2 programs, features, documentation, and uses are discussed. Problems in using ZCPR2 are also discussed. (JN)
Wildlife software: procedures for publication of computer software
Samuel, M.D.
1990-01-01
Computers and computer software have become an integral part of the practice of wildlife science. Computers now play an important role in teaching, research, and management applications. Because of the specialized nature of wildlife problems, specific computer software is usually required to address a given problem (e.g., home range analysis). This type of software is not usually available from commercial vendors and therefore must be developed by those wildlife professionals with particular skill in computer programming. Current journal publication practices generally prevent a detailed description of computer software associated with new techniques. In addition, peer review of journal articles does not usually include a review of associated computer software. Thus, many wildlife professionals are usually unaware of computer software that would meet their needs or of major improvements in software they commonly use. Indeed most users of wildlife software learn of new programs or important changes only by word of mouth.
NASA Astrophysics Data System (ADS)
Mishchenko, Michael I.
2017-01-01
The second - revised and enlarged - edition of this popular monograph is co-authored by Michael Kahnert and is published as Volume 145 of the Springer Series in Optical Sciences. As in the first edition, the main emphasis is on the mathematics of electromagnetic scattering and on numerically exact computer solutions of the frequency-domain macroscopic Maxwell equations for particles with complex shapes. The book is largely centered on Green-function solution of relevant boundary value problems and the T-matrix methodology, although other techniques (the method of lines, integral equation methods, and Lippmann-Schwinger equations) are also covered. The first four chapters serve as a thorough overview of key theoretical aspects of electromagnetic scattering intelligible to readers with undergraduate training in mathematics. A separate chapter provides an instructive analysis of the Rayleigh hypothesis which is still viewed by many as a highly controversial aspect of electromagnetic scattering by nonspherical objects. Another dedicated chapter introduces basic quantities serving as optical observables in practical applications. A welcome extension of the first edition is the new chapter on group theoretical aspects of electromagnetic scattering by particles with discrete symmetries. An essential part of the book is the penultimate chapter describing in detail popular public-domain computer programs mieschka and Tsym which can be applied to a wide range of particle shapes. The final chapter provides a general overview of available literature on electromagnetic scattering by particles and gives useful reading advice.
Usage of Thin-Client/Server Architecture in Computer Aided Education
ERIC Educational Resources Information Center
Cimen, Caghan; Kavurucu, Yusuf; Aydin, Halit
2014-01-01
With the advances of technology, thin-client/server architecture has become popular in multi-user/single network environments. Thin-client is a user terminal in which the user can login to a domain and run programs by connecting to a remote server. Recent developments in network and hardware technologies (cloud computing, virtualization, etc.)…
Computer-Aided Design/Manufacturing (CAD/M) for High-Speed Interconnect.
1981-10-01
are frequency sensitive and hence lend themselves to frequency-domain analysis. Most of the classical microwave analysis is handled in the frequency... capability integrated into a time-domain analysis program. This approach allows determination of frequency-dependent transmission line (interconnect... Among the items to consider in any interconnect study is the frequency range of interest. This determines whether the interconnections must be treated
ERIC Educational Resources Information Center
Smyth, Carol B.; Grannell, Dorothy S.; Moore, Miriam
The Literacy Resource Center project, a program of the Wayne Township Public Library also known as the Morrisson-Reeves Library (Richmond, Indiana), involved recruitment, retention, coalition building, public awareness, training, basic literacy, collection development, tutoring, computer-assisted, other technology, employment oriented,…
ERIC Educational Resources Information Center
Cole, Lucy; Fraser, Ruth
The Columbia County Public Library (Lake City, Florida) conducted a project that involved recruitment, retention, public awareness, training, basic literacy, collection development, tutoring, computer-assisted, other technology, intergenerational/family, and English as a Second Language (ESL) programs. The project served a community of…
Campbell, Vincent
2009-03-01
Extinct animals have always been popular subjects for the media, in both fiction, and factual output. In recent years, a distinctive new type of factual television program has emerged in which computer generated imagery is used extensively to bring extinct animals back to life. Such has been the commercial audience success of these programs that they have generated some public and academic debates about their relative status as science, documentary, and entertainment, as well as about their reflection of trends in factual television production, and the aesthetic tensions in the application of new media technologies. Such discussions ignore a crucial contextual feature of computer generated extinct animal programs, namely the established tradition of paleoimagery. This paper examines a selection of extinct animal shows in terms of the dominant frames of the paleoimagery genre. The paper suggests that such an examination has two consequences. First, it allows for a more context-sensitive evaluation of extinct animal programs, acknowledging rather than ignoring relevant representational traditions. Second, it allows for an appraisal and evaluation of public and critical reception of extinct animal programs above and beyond the traditional debates about tensions between science, documentary, entertainment, and public understanding.
Weather Research and Forecasting Model with Vertical Nesting Capability
DOE Office of Scientific and Technical Information (OSTI.GOV)
2014-08-01
The Weather Research and Forecasting (WRF) model with vertical nesting capability is an extension of the WRF model, which is available in the public domain from www.wrf-model.org. The new code modifies the nesting procedure, which passes lateral boundary conditions between computational domains in the WRF model. Previously, the same vertical grid was required on all domains, while the new code allows different vertical grids to be used on concurrently run domains. This new functionality improves WRF's ability to produce high-resolution simulations of the atmosphere by allowing a wider range of scales to be efficiently resolved and more accurate lateral boundary conditions to be provided through the nesting procedure.
NASA Astrophysics Data System (ADS)
Liu, Shaoyong; Gu, Hanming; Tang, Yongjie; Bingkai, Han; Wang, Huazhong; Liu, Dingjin
2018-04-01
Angle-domain common image-point gathers (ADCIGs) can alleviate the limitations of common image-point gathers in the offset domain, and have been widely used for velocity inversion and amplitude variation with angle (AVA) analysis. We propose an effective algorithm for generating ADCIGs in transversely isotropic (TI) media based on the gradient of traveltime by Kirchhoff pre-stack depth migration (KPSDM), as the dynamic programming method for computing the traveltime in TI media does not suffer from the limitations of shadow zones and traveltime interpolation. We also present a specific implementation strategy for ADCIG extraction via KPSDM. Three major steps are included in the presented strategy: (1) traveltime computation using a dynamic programming approach in TI media; (2) slowness vector calculation from the gradient of the previously computed traveltime table; (3) construction of illumination vectors and subsurface angles in the migration process. Numerical examples demonstrate the effectiveness of our approach and its potential for subsequent tomographic velocity inversion and AVA analysis.
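Steps (2) and (3) lend themselves to a short numerical sketch: the slowness vector is the spatial gradient of the traveltime table, and the subsurface opening angle follows from the angle between source- and receiver-side slowness vectors. A hedged toy example with constant-velocity isotropic tables (the paper's TI dynamic-programming traveltimes are not reproduced):

```python
# Slowness vectors and opening angles from traveltime tables (toy example).
import numpy as np

dz = dx = 10.0                                    # grid spacing, m (assumed)
z, x = np.meshgrid(np.arange(0.0, 1000, dz),
                   np.arange(0.0, 2000, dx), indexing="ij")
v = 2000.0                                        # constant velocity, m/s
t_src = np.hypot(z, x - 500.0) / v                # stand-ins for KPSDM tables
t_rec = np.hypot(z, x - 1500.0) / v

p_src = np.stack(np.gradient(t_src, dz, dx))      # slowness = grad(traveltime)
p_rec = np.stack(np.gradient(t_rec, dz, dx))

cosang = ((p_src * p_rec).sum(axis=0) /
          (np.linalg.norm(p_src, axis=0) * np.linalg.norm(p_rec, axis=0)
           + 1e-12))
opening = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
half = opening / 2.0      # reflection (half-opening) angle binned into ADCIGs
```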
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-16
... assessment of 15-year-olds which focuses on assessing students' science, mathematics, and reading literacy... domain. The field test will also include computer-based assessments in reading, mathematics, and...
Microcomputers and the future of epidemiology.
Dean, A G
1994-01-01
The Workshop on Microcomputers and the Future of Epidemiology was held March 8-9, 1993, at the Turner Conference Center, Atlanta, GA, with 130 public health professionals participating. The purpose of the workshop was to define microcomputer needs in epidemiology and to propose future initiatives. Thirteen groups representing public health disciplines defined their needs for better and more useful data, development of computer technology appropriate to epidemiology, user support and human infrastructure development, and global communication and planning. Initiatives proposed were demonstration of health surveillance systems, new software and hardware, computer-based training, projects to establish or improve data bases and community access to data bases, improved international communication, conferences on microcomputer use in particular disciplines, a suggestion to encourage competition in the production of public-domain software, and long-range global planning for epidemiologic computing and data management. Other interested groups are urged to study, modify, and implement those ideas. PMID:7910692
Nature-Computer Camp 1991. Chapter 2 Program Evaluation Report.
ERIC Educational Resources Information Center
District of Columbia Public Schools, Washington, DC. Dept. of Research and Evaluation.
The District of Columbia Public Schools Nature Computer Camp (NCC) is an environmental/computer program which has been operating in the Catoctin Mountain Park (Maryland) since 1983. The camp operates for five one-week sessions serving a total of 406 regular sixth-grade students representing 84 elementary schools with an average of 81 students per…
The Nature-Computer Camp. Final Evaluation Report, 1982-1983. E.C.I.A. Chapter 2.
ERIC Educational Resources Information Center
District of Columbia Public Schools, Washington, DC. Div. of Quality Assurance.
This report presents a description and evaluation of the Nature-Computer Camp (NCC), an environmental and computer science program designed for sixth grade students in the District of Columbia public schools. Among the major components of the program were: planning for administration of operating the camp and for instruction in environmental…
NASA Technical Reports Server (NTRS)
Brentner, K. S.
1986-01-01
A computer program has been developed at the Langley Research Center to predict the discrete frequency noise of conventional and advanced helicopter rotors. The program, called WOPWOP, uses the most advanced subsonic formulation of Farassat that is less sensitive to errors and is valid for nearly all helicopter rotor geometries and flight conditions. A brief derivation of the acoustic formulation is presented along with a discussion of the numerical implementation of the formulation. The computer program uses realistic helicopter blade motion and aerodynamic loadings, input by the user, for noise calculation in the time domain. A detailed definition of all the input variables, default values, and output data is included. A comparison with experimental data shows good agreement between prediction and experiment; however, accurate aerodynamic loading is needed.
DOT National Transportation Integrated Search
2018-01-01
With the release of the Bitcoin concept into the public domain in late 2008, the world of cryptocurrency (electronic currency such as Bitcoin, Ethereum, and hundreds of others) and distributed computing gained a new kind of trust protocol called b...
Paranoia.Ada: A diagnostic program to evaluate Ada floating-point arithmetic
NASA Technical Reports Server (NTRS)
Hjermstad, Chris
1986-01-01
Many essential software functions in the mission-critical computer resource application domain depend on floating point arithmetic. Numerically intensive functions associated with the Space Station project, such as ephemeris generation or the implementation of Kalman filters, are likely to employ the floating point facilities of Ada. Paranoia.Ada appears to be a valuable program to ensure that Ada environments and their underlying hardware exhibit the precision and correctness required to satisfy mission computational requirements. As a diagnostic tool, Paranoia.Ada reveals many essential characteristics of an Ada floating point implementation. Equipped with such knowledge, programmers need not tremble before the complex task of floating point computation.
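The flavor of the probing such diagnostics perform can be conveyed by the classic machine-epsilon loop, sketched here in Python for brevity (Paranoia.Ada exercises an Ada implementation far more thoroughly, checking radix, rounding, over/underflow, and guard digits):

```python
# Classic probe for the machine epsilon of the host's floating point.
def machine_epsilon() -> float:
    eps = 1.0
    while 1.0 + eps / 2.0 != 1.0:   # halve until adding it no longer matters
        eps /= 2.0
    return eps

print(machine_epsilon())            # ~2.220446e-16 on IEEE-754 binary64
```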
Study of basic computer competence among public health nurses in Taiwan.
Yang, Kuei-Feng; Yu, Shu; Lin, Ming-Sheng; Hsu, Chia-Ling
2004-03-01
Rapid advances in information technology and media have made distance learning on the Internet possible. This new model of learning allows greater efficiency and flexibility in knowledge acquisition. Since basic computer competence is a prerequisite for this new learning model, this study was conducted to examine the basic computer competence of public health nurses in Taiwan and explore factors influencing computer competence. A national cross-sectional randomized study was conducted with 329 public health nurses. A questionnaire was used to collect data and was delivered by mail. Results indicate that the basic computer competence of public health nurses in Taiwan still needs improvement (mean = 57.57 ± 2.83; possible score range 26–130). Among the five most frequently used software programs, nurses were most knowledgeable about Word and least knowledgeable about PowerPoint. Stepwise multiple regression analysis revealed eight variables (weekly number of hours spent online at home, weekly amount of time spent online at work, weekly frequency of computer use at work, previous computer training, computer at workplace and Internet access, job position, education level, and age) that significantly influenced computer competence, accounting for 39.0% of the variance. In conclusion, greater computer competence, broader educational programs regarding computer technology, and a greater emphasis on computers at work are necessary to increase the usefulness of distance learning via the Internet in Taiwan. Building a user-friendly environment is important in developing this new media model of learning for the future.
A multiphysics and multiscale software environment for modeling astrophysical systems
NASA Astrophysics Data System (ADS)
Portegies Zwart, Simon; McMillan, Steve; Harfst, Stefan; Groen, Derek; Fujii, Michiko; Nualláin, Breanndán Ó.; Glebbeek, Evert; Heggie, Douglas; Lombardi, James; Hut, Piet; Angelou, Vangelis; Banerjee, Sambaran; Belkus, Houria; Fragos, Tassos; Fregeau, John; Gaburov, Evghenii; Izzard, Rob; Jurić, Mario; Justham, Stephen; Sottoriva, Andrea; Teuben, Peter; van Bever, Joris; Yaron, Ofer; Zemp, Marcel
2009-05-01
We present MUSE, a software framework for combining existing computational tools for different astrophysical domains into a single multiphysics, multiscale application. MUSE facilitates the coupling of existing codes written in different languages by providing inter-language tools and by specifying an interface between each module and the framework that represents a balance between generality and computational efficiency. This approach allows scientists to use combinations of codes to solve highly coupled problems without the need to write new codes for other domains or significantly alter their existing codes. MUSE currently incorporates the domains of stellar dynamics, stellar evolution and stellar hydrodynamics for studying generalized stellar systems. We have now reached a "Noah's Ark" milestone, with (at least) two available numerical solvers for each domain. MUSE can treat multiscale and multiphysics systems in which the time- and size-scales are well separated, like simulating the evolution of planetary systems, small stellar associations, dense stellar clusters, galaxies and galactic nuclei. In this paper we describe three examples calculated using MUSE: the merger of two galaxies, the merger of two evolving stars, and a hybrid N-body simulation. In addition, we demonstrate an implementation of MUSE on a distributed computer which may also include special-purpose hardware, such as GRAPEs or GPUs, to accelerate computations. The current MUSE code base is publicly available as open source at http://muse.li.
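The stated balance between generality and efficiency comes from putting each domain solver behind a small common interface. A hedged sketch of the coupling idea (interface and names invented here, not MUSE's actual API):

```python
# Minimal coupling driver: each domain module meets a tiny common contract.
class Module:
    """Stand-in for a stellar dynamics / evolution / hydrodynamics solver."""
    def __init__(self, name):
        self.name, self.t = name, 0.0
    def evolve(self, t_end):          # advance internal state to time t_end
        self.t = t_end

def couple(modules, t_end, dt):
    """Advance well-separated physics in lockstep macro-steps of dt."""
    t = 0.0
    while t < t_end:
        t = min(t + dt, t_end)
        for m in modules:             # exchange state here if domains couple
            m.evolve(t)

couple([Module("dynamics"), Module("evolution"), Module("hydro")], 1.0, 0.1)
```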
NASA Astrophysics Data System (ADS)
Harfst, S.; Portegies Zwart, S.; McMillan, S.
2008-12-01
We present MUSE, a software framework for combining existing computational tools from different astrophysical domains into a single multi-physics, multi-scale application. MUSE facilitates the coupling of existing codes written in different languages by providing inter-language tools and by specifying an interface between each module and the framework that represents a balance between generality and computational efficiency. This approach allows scientists to use combinations of codes to solve highly-coupled problems without the need to write new codes for other domains or significantly alter their existing codes. MUSE currently incorporates the domains of stellar dynamics, stellar evolution and stellar hydrodynamics for studying generalized stellar systems. We have now reached a ``Noah's Ark'' milestone, with (at least) two available numerical solvers for each domain. MUSE can treat multi-scale and multi-physics systems in which the time- and size-scales are well separated, like simulating the evolution of planetary systems, small stellar associations, dense stellar clusters, galaxies and galactic nuclei. In this paper we describe two examples calculated using MUSE: the merger of two galaxies and an N-body simulation with live stellar evolution. In addition, we demonstrate an implementation of MUSE on a distributed computer which may also include special-purpose hardware, such as GRAPEs or GPUs, to accelerate computations. The current MUSE code base is publicly available as open source at http://muse.li.
The CompTox Chemistry Dashboard: a community data resource for environmental chemistry.
Williams, Antony J; Grulke, Christopher M; Edwards, Jeff; McEachran, Andrew D; Mansouri, Kamel; Baker, Nancy C; Patlewicz, Grace; Shah, Imran; Wambaugh, John F; Judson, Richard S; Richard, Ann M
2017-11-28
Despite an abundance of online databases providing access to chemical data, there is increasing demand for high-quality, structure-curated, open data to meet the various needs of the environmental sciences and computational toxicology communities. The U.S. Environmental Protection Agency's (EPA) web-based CompTox Chemistry Dashboard is addressing these needs by integrating diverse types of relevant domain data through a cheminformatics layer, built upon a database of curated substances linked to chemical structures. These data include physicochemical, environmental fate and transport, exposure, usage, in vivo toxicity, and in vitro bioassay data, surfaced through an integration hub with link-outs to additional EPA data and public domain online resources. Batch searching allows for direct chemical identifier (ID) mapping and downloading of multiple data streams in several different formats. This facilitates fast access to available structure, property, toxicity, and bioassay data for collections of chemicals (hundreds to thousands at a time). Advanced search capabilities are available to support, for example, non-targeted analysis and identification of chemicals using mass spectrometry. The contents of the chemistry database, presently containing ~ 760,000 substances, are available as public domain data for download. The chemistry content underpinning the Dashboard has been aggregated over the past 15 years by both manual and auto-curation techniques within EPA's DSSTox project. DSSTox chemical content is subject to strict quality controls to enforce consistency among chemical substance-structure identifiers, as well as list curation review to ensure accurate linkages of DSSTox substances to chemical lists and associated data. The Dashboard, publicly launched in April 2016, has expanded considerably in content and user traffic over the past year. It is continuously evolving with the growth of DSSTox into high-interest or data-rich domains of interest to EPA, such as chemicals on the Toxic Substances Control Act listing, while providing the user community with a flexible and dynamic web-based platform for integration, processing, visualization and delivery of data and resources. The Dashboard provides support for a broad array of research and regulatory programs across the worldwide community of toxicologists and environmental scientists.
ERIC Educational Resources Information Center
Hess, Therese M.
The Martinsburg-Berkeley County Public Library (West Virginia) conducted a project that involved recruitment, retention, coalition building, public awareness, training, basic literacy, collection development, tutoring, computer assisted, other technology, and English as a Second Language (ESL) programs. The project served a three-county community…
Generic, Type-Safe and Object Oriented Computer Algebra Software
NASA Astrophysics Data System (ADS)
Kredel, Heinz; Jolly, Raphael
Advances in computer science, in particular object oriented programming, and software engineering have had little practical impact on computer algebra systems in the last 30 years. The software design of existing systems is still dominated by ad-hoc memory management, weakly typed algorithm libraries, and proprietary domain-specific interactive expression interpreters. We discuss a modular approach to computer algebra software: usage of state-of-the-art memory management and run-time systems (e.g., the JVM); usage of strongly typed, generic, object-oriented programming languages (e.g., Java); and usage of general-purpose, dynamic interactive expression interpreters (e.g., Python). To illustrate the workability of this approach, we have implemented and studied computer algebra systems in Java and Scala. In this paper we report on the current state of this work by presenting new examples.
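The "strongly typed, generic" idea can be hinted at even in Python's type system (the systems described use Java/Scala generics; the names below are invented for illustration):

```python
# Generic algorithm over any ring-like element type (illustrative sketch).
from typing import Protocol, TypeVar

C = TypeVar("C", bound="RingElem")

class RingElem(Protocol):
    def __add__(self: C, other: C) -> C: ...
    def __mul__(self: C, other: C) -> C: ...

def power(x: C, n: int, one: C) -> C:
    """Square-and-multiply exponentiation, typed over the element type."""
    r = one
    while n > 0:
        if n & 1:
            r = r * x
        x = x * x
        n >>= 1
    return r

print(power(3, 10, 1))        # works for ints, Fractions, matrices, ...
```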
The Effects of Learning a Computer Programming Language on the Logical Reasoning of School Children.
ERIC Educational Resources Information Center
Seidman, Robert H.
The research reported in this paper explores the syntactical and semantic link between computer programming statements and logical principles, and addresses the effects of learning a programming language on logical reasoning ability. Fifth grade students in a public school in Syracuse, New York, were randomly selected as subjects, and then…
Laptops and the Gender Gap: An Investigation of a High School Core Curriculum Program
ERIC Educational Resources Information Center
Wade, Melanie
2010-01-01
Girls and women continue to be underrepresented in high school Advanced Placement computer science courses, undergraduate and graduate computer science programs at colleges and universities, and engineering programs and related careers. This is not to suggest that public schools train students to fulfill specific job needs, yet it is evident that…
TBGG- INTERACTIVE ALGEBRAIC GRID GENERATION
NASA Technical Reports Server (NTRS)
Smith, R. E.
1994-01-01
TBGG, Two-Boundary Grid Generation, applies an interactive algebraic grid generation technique in two dimensions. The program incorporates mathematical equations that relate the computational domain to the physical domain. TBGG has application to a variety of problems using finite difference techniques, such as computational fluid dynamics. Examples include the creation of a C-type grid about an airfoil and a nozzle configuration in which no left or right boundaries are specified. The underlying two-boundary technique of grid generation is based on Hermite cubic interpolation between two fixed, nonintersecting boundaries. The boundaries are defined by two ordered sets of points, referred to as the top and bottom. Left and right side boundaries may also be specified, and call upon linear blending functions to conform interior interpolation to the side boundaries. Spacing between physical grid coordinates is determined as a function of boundary data and uniformly spaced computational coordinates. Control functions relating computational coordinates to parametric intermediate variables that affect the distance between grid points are embedded in the interpolation formulas. A versatile control function technique with smooth cubic spline functions is also presented. The TBGG program is written in FORTRAN 77. It works best in an interactive graphics environment where computational displays and user responses are quickly exchanged. The program has been implemented on a CDC Cyber 170 series computer using NOS 2.4 operating system, with a central memory requirement of 151,700 (octal) 60 bit words. TBGG requires a Tektronix 4015 terminal and the DI-3000 Graphics Library of Precision Visuals, Inc. TBGG was developed in 1986.
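The underlying interpolation is easy to sketch: with the top and bottom boundaries given as ordered point sets, Hermite cubic blending in the marching coordinate fills the interior. A hedged toy version with zero specified end derivatives (TBGG's control functions and side-boundary blending are omitted):

```python
# Two-boundary grid generation by Hermite cubic blending (illustrative only).
import numpy as np

n_xi, n_eta = 21, 11
xi = np.linspace(0, 1, n_xi)
eta = np.linspace(0, 1, n_eta)

bottom = np.stack([xi, 0.1 * np.sin(np.pi * xi)], axis=1)   # lower boundary
top    = np.stack([xi, np.ones_like(xi)], axis=1)           # upper boundary

# Hermite cubic blending functions of the marching coordinate eta
h00 = 2 * eta**3 - 3 * eta**2 + 1
h01 = -2 * eta**3 + 3 * eta**2

# Interior points: blend the two fixed, nonintersecting boundaries
grid = (h00[None, :, None] * bottom[:, None, :]
        + h01[None, :, None] * top[:, None, :])
print(grid.shape)        # (21, 11, 2): physical (x, y) at each (xi, eta)
```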
Structural-Vibration-Response Data Analysis
NASA Technical Reports Server (NTRS)
Smith, W. R.; Hechenlaible, R. N.; Perez, R. C.
1983-01-01
Computer program developed as structural-vibration-response data analysis tool for use in dynamic testing of Space Shuttle. Program provides fast and efficient time-domain least-squares curve-fitting procedure for reducing transient response data to obtain structural model frequencies and dampings from free-decay records. Procedure simultaneously identifies frequencies, damping values, and participation factors for noisy multiple-response records.
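The core operation — a time-domain least-squares fit of a damped sinusoid to a free-decay record to recover frequency and damping — can be sketched as follows (synthetic data and hypothetical parameter names, not the program's actual interface):

```python
# Fit a damped sinusoid to a free-decay record (illustrative sketch).
import numpy as np
from scipy.optimize import curve_fit

def decay(t, A, sigma, wd, phi):
    """Free-decay model: A * exp(-sigma*t) * cos(wd*t + phi)."""
    return A * np.exp(-sigma * t) * np.cos(wd * t + phi)

rng = np.random.default_rng(0)
t = np.linspace(0, 2, 2000)
y = decay(t, 1.0, 0.8, 2 * np.pi * 5, 0.3) + 0.01 * rng.standard_normal(t.size)

p, _ = curve_fit(decay, t, y, p0=[1, 1, 2 * np.pi * 4.5, 0])
freq = p[2] / (2 * np.pi)        # modal frequency, Hz
zeta = p[1] / p[2]               # damping ratio (approx., light damping)
print(freq, zeta)
```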
ERIC Educational Resources Information Center
Scott, Michael J.; Ghinea, Gheorghita
2014-01-01
Deliberate practice is important in many areas of learning, including that of learning to program computers. However, beliefs about the nature of personal traits, known as "mindsets," can have a profound impact on such practice. Previous research has shown that those with a "fixed mindset" believe their traits cannot change;…
ERIC Educational Resources Information Center
McGuire-Kuletz, Maureen; Hergenrather, Kenneth C.
2008-01-01
The Council on Rehabilitation Education (CORE) revised the standards for rehabilitation counseling master's degree program accreditation in 2004. These standards seek to promote effective rehabilitation services to persons with disabilities in both private and public programs (CORE, 2008). This article focuses on the new CORE standard…
HIGHLIGHTS OF THE RUSSIAN HEALTH STUDIES PROGRAM AND UPDATED RESEARCH FINDINGS.
Fountos, Barrett N
2017-04-01
Recognized for conducting cutting-edge science in the field of radiation health effects research, the Department of Energy's (DOE) Russian Health Studies Program has continued to generate excitement and enthusiasm throughout its 23-year mission to assess worker and public health risks from radiation exposure resulting from nuclear weapons production activities in the former Soviet Union. The three goals of the Program are to: (1) clarify the relationship between health effects and chronic, low-to-medium dose radiation exposure; (2) estimate the cancer risks from exposure to gamma, neutron, and alpha radiation; and (3) provide information to the national and international organizations that determine radiation protection standards and practices. Research sponsored by DOE's Russian Health Studies Program is conducted under the authority of the Joint Coordinating Committee for Radiation Effects Research (JCCRER), a bi-national committee representing Federal agencies in the United States and the Russian Federation. Signed in 1994, the JCCRER Agreement established the legal basis for the collaborative research between USA and Russian scientists to determine the risks associated with working at or living near Russian former nuclear weapons production sites. The products of the Program are peer-reviewed publications on cancer risk estimates from worker and community exposure to ionizing radiation following the production of nuclear weapons in Russia. The scientific return on investment has been substantial. Through 31 December 2015, JCCRER researchers have published 299 peer-reviewed publications. To date, the research has focused on the Mayak Production Association (Mayak) in Ozersk, Russia, which is the site of the first Soviet nuclear weapons production facility, and people in surrounding communities along the Techa River. There are five current projects in the Russian Health Studies Program: two radiation epidemiology studies; two historical dose reconstruction studies and a worker biorepository. National and international standard-setting organizations use cancer risk estimates computed from epidemiological and historical dose reconstruction studies to validate or revise radiation protection standards. An overview of the most important research results will be presented. Published by Oxford University Press 2016. This work is written by (a) US Government employee(s) and is in the public domain in the US.
Physicist's simple access to protein structures: the computer program WHAT IF
NASA Astrophysics Data System (ADS)
Altenberg-Greulich, Brigitte; Zech, Stephan G.; Stehlik, Dietmar; Vriend, Gert
2001-06-01
We describe the computer program WHAT IF and its application to two physical examples. For the DNA-binding protein OCT-1 (POU domain), the location of amino acids with a sidechain amino group is shown. Such knowledge is required when staining this molecule with a fluorescent dye, which binds chemically to the amino terminus as well as to amino groups in sidechains. The program shows that most sidechain amino groups are protected when DNA is bound to OCT-1, allowing selective staining of the amino-terminal NH2 group. A protein stained this way can be used in fluorescence spectroscopic studies on functional aspects of OCT-1.
NASA Technical Reports Server (NTRS)
Ortega, J. M.
1984-01-01
Several short summaries of the work performed during this reporting period are presented. Topics discussed in this document include: (1) resilient seeded errors via simple techniques; (2) knowledge representation for engineering design; (3) analysis of faults in a multiversion software experiment; (4) implementation of a parallel programming environment; (5) symbolic execution of concurrent programs; (6) two computer graphics systems for visualization of pressure distribution and convective density particles; (7) design of a source code management system; (8) vectorizing incomplete conjugate gradient on the Cyber 203/205; (9) extensions of domain testing theory; and (10) a performance analyzer for the PISCES system.
Adaptive Fuzzy Systems in Computational Intelligence
NASA Technical Reports Server (NTRS)
Berenji, Hamid R.
1996-01-01
In recent years, interest in computational intelligence techniques, which currently include neural networks, fuzzy systems, and evolutionary programming, has grown significantly, and a number of their applications have been developed in government and industry. In the future, an essential element in these systems will be fuzzy systems that can learn from experience by using neural networks to refine their performance. The GARIC architecture, introduced earlier, is an example of a fuzzy reinforcement learning system which has been applied in several control domains such as cart-pole balancing, simulation of Space Shuttle orbital operations, and tether control. A number of examples from GARIC's applications in these domains will be demonstrated.
Levine, Rachel B; González-Fernández, Marlís; Bodurtha, Joann; Skarupski, Kimberly A; Fivush, Barbara
2015-05-01
Women continue to be underrepresented in top leadership roles in academic medicine. Leadership training programs for women are designed to enhance women's leadership skills and confidence and increase overall leadership diversity. The authors present a description and evaluation of a longitudinal, cohort-based, experiential leadership program for women faculty at the Johns Hopkins University School of Medicine. We compared pre- and post-program self-assessed ratings of 11 leadership skills and specific negotiation behaviors from 3 cohorts of leadership program participants (n=134) from 2010 to 2013. Women reported significant improvements in skills across 11 domains with the exceptions of 2 domains, Public Speaking and Working in Teams, both of which received high scores in the pre-program assessment. The greatest improvement in rankings occurred within the domain of negotiation skills. Although women reported an increase in their negotiation skills, we were not able to demonstrate an increase in the number of times that women negotiated for salary, space, or promotion following participation in the program. The Johns Hopkins School of Medicine Leadership Program for Women Faculty has demonstrable value for the professional development of participants and addresses institutional strategies to enhance leadership diversity and the advancement of women.
COMPUTER SIMULATOR (BEST) FOR DESIGNING SULFATE-REDUCING BACTERIA FIELD BIOREACTORS
BEST (bioreactor economics, size and time of operation) is a spreadsheet-based model that is used in conjunction with public domain software, PhreeqcI. BEST is used in the design process of sulfate-reducing bacteria (SRB) field bioreactors to passively treat acid mine drainage (A...
Bibliography of Marine Education Software.
ERIC Educational Resources Information Center
McLamb, Skip; Walton, Susan
A bibliography of marine-oriented commercial and public domain courseware has been maintained by the Computer Education Committee of the Mid-Atlantic Marine Education Association for several years. This compilation is provided to interested persons by an established network with the following purposes: (1) to review and critique commercial and…
DESIGNING SULFATE-REDUCING BACTERIA FIELD BIOREACTORS USING THE BEST MODEL
BEST (bioreactor economics, size and time of operation) is a spreadsheet-based model that is used in conjunction with a public domain computer software package, PHREEQCI. BEST is intended to be used in the design process of sulfate-reducing bacteria (SRB)field bioreactors to pas...
Harvesting costs for management planning for ponderosa pine plantations.
Roger D. Fight; Alex Gicqueau; Bruce R. Hartsough
1999-01-01
The PPHARVST computer application is Windows-based, public-domain software used to estimate harvesting costs for management planning for ponderosa pine (Pinus ponderosa Dougl. ex Laws.) plantations. The equipment production rates were developed from existing studies. Equipment cost rates were based on 1996 prices for new...
The Ever-Present Demand for Public Computing Resources. CDS Spotlight
ERIC Educational Resources Information Center
Pirani, Judith A.
2014-01-01
This Core Data Service (CDS) Spotlight focuses on public computing resources, including lab/cluster workstations in buildings, virtual lab/cluster workstations, kiosks, laptop and tablet checkout programs, and workstation access in unscheduled classrooms. The findings are derived from 758 CDS 2012 participating institutions. A dataset of 529…
Yun, Young Ho; Sim, Jin Ah; Lim, Ye Jin; Lim, Cheol Il; Kang, Sung-Choon; Kang, Joon-Ho; Park, Jun Dong; Noh, Dong Young
2016-06-01
The objective of this study was to develop the Worksite Health Index (WHI) and validate its psychometric properties. The development of the WHI questionnaire included item generation, item construction, and field testing. To assess the instrument's reliability and validity, we recruited 30 different Korean worksites. We developed the WHI questionnaire of 136 items categorized into five domains, namely Governance and Infrastructure, Need Assessment and Planning, Health Prevention and Promotion Program, Occupational Safety, and Monitoring and Feedback. All WHI domains demonstrated a high reliability with good internal consistency. The total WHI scores differentiated worksite groups effectively according to firm size. Each domain was associated significantly with employees' health status, absence, and financial outcome. The WHI can assess comprehensive worksite health programs. This tool is publicly available for addressing the growing need for worksite health programs.
The JPL Library information retrieval system
NASA Technical Reports Server (NTRS)
Walsh, J.
1975-01-01
The development, capabilities, and products of the computer-based retrieval system of the Jet Propulsion Laboratory Library are described. The system handles books and documents, produces a book catalog, and provides a machine search capability. Programs and documentation are available to the public through NASA's computer software dissemination program.
A Mentoring Model for Technology Education for Teachers.
ERIC Educational Resources Information Center
MacArthur, Charles A.; And Others
1996-01-01
Describes a computer mentor program that was developed by the University of Maryland and the Prince George's County Public Schools (Maryland) to help teachers learn to integrate computers into their instruction. Topics include a course for the mentors, school administrative support, and results of program evaluations. (LRW)
Poeppel, David
2014-01-01
We outline what an integrated approach to language research that connects experimental, theoretical, and neurobiological domains of inquiry would look like, and ask to what extent unification is possible across domains. At the center of the program is the idea that computational/representational (CR) theories of language must be used to investigate its neurobiological (NB) foundations. We consider different ways in which CR and NB might be connected. These are (1) A Correlational way, in which NB computation is correlated with the CR theory; (2) An Integrated way, in which NB data provide crucial evidence for choosing among CR theories; and (3) an Explanatory way, in which properties of NB explain why a CR theory is the way it is. We examine various questions concerning the prospects for Explanatory connections in particular, including to what extent it makes sense to say that NB could be specialized for particular computations. PMID:25914888
Embick, David; Poeppel, David
2015-05-01
We outline what an integrated approach to language research that connects experimental, theoretical, and neurobiological domains of inquiry would look like, and ask to what extent unification is possible across domains. At the center of the program is the idea that computational/representational (CR) theories of language must be used to investigate its neurobiological (NB) foundations. We consider different ways in which CR and NB might be connected. These are (1) A Correlational way, in which NB computation is correlated with the CR theory; (2) An Integrated way, in which NB data provide crucial evidence for choosing among CR theories; and (3) an Explanatory way, in which properties of NB explain why a CR theory is the way it is. We examine various questions concerning the prospects for Explanatory connections in particular, including to what extent it makes sense to say that NB could be specialized for particular computations.
Joint Services Electronics Program.
1980-05-01
Approved for public release; distribution unlimited. Joint Services Electronics Program Annual Progress Report (covering period...). Among the research results, the temperature dependence of the (dispersive transport) trap-limited mobility has shown interesting new effects. Publications of the research include "Low-Cost Laboratory Computer Interface System" (scheduled for publication May 1980, Review of Scientific Instruments).
Several advances in the analytic element method have been made to enhance its performance and facilitate three-dimensional ground-water flow modeling in a regional aquifer setting. First, a new public domain modular code (ModAEM) has been developed for modeling ground-water flow ...
Image Algebra Matlab language version 2.3 for image processing and compression research
NASA Astrophysics Data System (ADS)
Schmalz, Mark S.; Ritter, Gerhard X.; Hayden, Eric
2010-08-01
Image algebra is a rigorous, concise notation that unifies linear and nonlinear mathematics in the image domain. Image algebra was developed under DARPA and US Air Force sponsorship at the University of Florida for over 15 years beginning in 1984. Image algebra has been implemented in a variety of programming languages designed specifically to support the development of image processing and computer vision algorithms and software. The University of Florida has been associated with image algebra implementations in the languages FORTRAN, Ada, Lisp, and C++. The latter implementation involved a class library, iac++, that supported image algebra programming in C++. Since image processing and computer vision are generally performed with operands that are array-based, the Matlab™ programming language is ideal for implementing the common subset of image algebra. Objects include sets and set operations, images and operations on images, as well as templates and image-template convolution operations. This implementation, called Image Algebra Matlab (IAM), has been found to be useful for research in data, image, and video compression, as described herein. Due to the widespread acceptance of the Matlab programming language in the computing community, IAM offers exciting possibilities for supporting a large group of users. The control over an object's computational resources provided to the algorithm designer by Matlab means that IAM programs can employ versatile representations for the operands and operations of the algebra, which are supported by the underlying libraries written in Matlab. In a previous publication, we showed how the functionality of iac++ could be carried forth into a Matlab implementation, and provided practical details of a prototype implementation called IAM Version 1. In this paper, we further elaborate the purpose and structure of image algebra, then present a maturing implementation of Image Algebra Matlab called IAM Version 2.3, which extends the previous implementation of IAM to include polymorphic operations over different point sets, as well as recursive convolution operations and functional composition. We also show how image algebra and IAM can be employed in image processing and compression research, as well as algorithm development and analysis.
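To illustrate the image-template convolution operation at the core of image algebra, here is a minimal sketch; it is written in Python/numpy as a stand-in for the Matlab toolbox and is not IAM source code:

```python
# Sketch of an image algebra style image-template operation: a generic
# convolution of an image with a translation-invariant 3x3 template,
# evaluated over the linear (+, *) semiring.
import numpy as np

def image_template_product(image, template):
    """Generalized convolution of an image with a small template."""
    th, tw = template.shape
    ph, pw = th // 2, tw // 2
    padded = np.pad(image, ((ph, ph), (pw, pw)), mode="constant")
    out = np.zeros_like(image, dtype=float)
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            window = padded[i:i + th, j:j + tw]
            out[i, j] = np.sum(window * template)   # accumulate over template support
    return out

smooth = np.full((3, 3), 1.0 / 9.0)                  # averaging template
img = np.arange(25, dtype=float).reshape(5, 5)
print(image_template_product(img, smooth))
```

Replacing the sum/product pair with max/plus or min/plus yields the nonlinear (lattice) operations that image algebra unifies under the same notation.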
Physics Problem Workbook, Instructor Manual.
ERIC Educational Resources Information Center
Jones, John L.
This publication of Computer Oriented Materials Production for Undergraduate Teaching (COMPUTe), is intended to aid in the development of an autotutorial program for college-level undergraduate physics. Particularly in the area of mechanics, the author feels there is a need for a tutorial program which enables students to use a variety of…
ERIC Educational Resources Information Center
Fisher, Patience; And Others
This guide was prepared to help teachers of the Lincoln Public School's introductory computer programming course in BASIC to make the necessary adjustments for changes made in the course since the purchase of microcomputers and such peripheral devices as television monitors and disk drives, and the addition of graphics. Intended to teach a…
Generalized three-dimensional experimental lightning code (G3DXL) user's manual
NASA Technical Reports Server (NTRS)
Kunz, Karl S.
1986-01-01
Information concerning the programming, maintenance and operation of the G3DXL computer program is presented and the theoretical basis for the code is described. The program computes time domain scattering fields and surface currents and charges induced by a driving function on and within a complex scattering object which may be perfectly conducting or a lossy dielectric. This is accomplished by modeling the object with cells within a three-dimensional, rectangular problem space, enforcing the appropriate boundary conditions and differencing Maxwell's equations in time. In the present version of the program, the driving function can be either the field radiated by a lightning strike or a direct lightning strike. The F-106 B aircraft is used as an example scattering object.
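A one-dimensional analogue of the code's method, differencing Maxwell's equations in time on a grid of cells with a time-domain driving function, can be sketched as follows; this is an illustrative toy in normalized units, not G3DXL itself:

```python
# Minimal 1D finite-difference time-domain (FDTD) sketch: update E and H
# fields alternately from each other's spatial differences, drive the grid
# with a Gaussian pulse, and enforce perfectly conducting boundaries.
import numpy as np

nx, nt = 200, 400
Ez = np.zeros(nx)          # electric field at integer grid points
Hy = np.zeros(nx - 1)      # magnetic field at half grid points

for n in range(nt):
    Hy += np.diff(Ez)                      # update H from the curl of E
    Ez[1:-1] += np.diff(Hy)                # update E from the curl of H
    Ez[50] += np.exp(-0.5 * ((n - 60) / 15.0) ** 2)   # Gaussian source pulse
    Ez[0] = Ez[-1] = 0.0                   # perfectly conducting walls

print("peak |Ez| after propagation:", np.abs(Ez).max())
```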
DOE Office of Scientific and Technical Information (OSTI.GOV)
Summers, Michael S
2017-11-08
HPC software for ab-initio, condensed-matter physics, quantum mechanics calculations needs to be built on top of well-tested libraries, some of which address requirements unique to the programming domain. During the development of the DCA++ code that we use in our research, we have developed a collection of libraries that may be of use to other computational scientists working in the same or similar domains. The libraries include: a) a pythonic input-language system, b) tensors whose shape is constructed from generalized dimension objects such as time domains, frequency domains, momentum domains, vertex domains, and so on, and c) linear algebra operations that resolve to BLAS/LAPACK operations when possible. This supports the implementation of Green's functions and operations on them such as are used in condensed matter physics.
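The idea of tensors shaped by generalized dimension objects can be sketched as below; the class and method names are hypothetical and do not reflect the actual DCA++ library API:

```python
# Sketch of a tensor whose axes are labeled by named domain objects, so
# that contractions can be requested by domain name rather than axis index.
import numpy as np

class Domain:
    def __init__(self, name, size):
        self.name, self.size = name, size

class DomainTensor:
    """Dense tensor whose axes are labeled by Domain objects."""
    def __init__(self, *domains):
        self.domains = domains
        self.data = np.zeros([d.size for d in domains])

    def axis(self, name):
        return next(i for i, d in enumerate(self.domains) if d.name == name)

    def sum_over(self, name):
        """Contract (sum) the tensor over one named domain."""
        return self.data.sum(axis=self.axis(name))

time = Domain("time", 64)
momentum = Domain("momentum", 16)
G = DomainTensor(time, momentum)          # e.g. a Green's function G(t, k)
G.data[:] = 1.0
print(G.sum_over("momentum").shape)       # -> (64,)
```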
Cotter, Elizabeth W; Hamilton, Natia S; Kelly, Nichole R; Harney, Megan B; Greene, LaShaun; White, Kelly A; Mazzeo, Suzanne E
2016-09-01
Although African American families are at particular risk for obesity and its associated health comorbidities, few interventions have directly targeted low-income members of this group living in subsidized public housing. Using a consensual qualitative research approach, we conducted 11 interviews with African American mothers living in two public housing communities to enhance understanding of their perceived barriers and facilitators to health. Five primary domains emerged, including barriers (access, financial, personal, and neighborhood concerns), resources (personal and community), current behaviors (diet, physical activity, and program participation), definition of health (mental well-being, physical well-being, and health behaviors), and needs/interests in programming (health behavior-specific programs, non-health-related programs, child-focused programming, and qualities of programs and their leaders). Results demonstrate the complex interaction among social, environmental, and personal factors on health behaviors for this priority population, and highlight the need for community members' involvement in the development of community-based obesity prevention programming. © 2016 Society for Public Health Education.
Owens, Douglas K; Lohr, Kathleen N; Atkins, David; Treadwell, Jonathan R; Reston, James T; Bass, Eric B; Chang, Stephanie; Helfand, Mark
2010-05-01
To establish guidance on grading strength of evidence for the Evidence-based Practice Center (EPC) program of the U.S. Agency for Healthcare Research and Quality. Authors reviewed authoritative systems for grading strength of evidence, identified domains and methods that should be considered when grading bodies of evidence in systematic reviews, considered public comments on an earlier draft, and discussed the approach with representatives of the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) working group. The EPC approach is conceptually similar to the GRADE system of evidence rating; it requires assessment of four domains: risk of bias, consistency, directness, and precision. Additional domains to be used when appropriate include dose-response association, presence of confounders that would diminish an observed effect, strength of association, and publication bias. Strength of evidence receives a single grade: high, moderate, low, or insufficient. We give definitions, examples, mechanisms for scoring domains, and an approach for assigning strength of evidence. EPCs should grade strength of evidence separately for each major outcome and, for comparative effectiveness reviews, all major comparisons. We will collaborate with the GRADE group to address ongoing challenges in assessing the strength of evidence.
Modeling and Analysis of Power Processing Systems (MAPPS), initial phase 2
NASA Technical Reports Server (NTRS)
Yu, Y.; Lee, F. C.; Wangenheim, H.; Warren, D.
1977-01-01
The overall objective of the program is to provide the engineering tools to reduce the analysis, design, and development effort, and thus the cost, in achieving the required performances for switching regulators and dc-dc converter systems. The program was both tutorial and application oriented. Various analytical methods were described in detail and supplemented with examples, and those with standardization appeals were reduced into computer-based subprograms. Major program efforts included those concerning small and large signal control-dependent performance analysis and simulation, control circuit design, power circuit design and optimization, system configuration study, and system performance simulation. Techniques including discrete time domain, conventional frequency domain, Lagrange multiplier, nonlinear programming, and control design synthesis were employed in these efforts. To enhance interactive conversation between the modeling and analysis subprograms and the user, a working prototype of the Data Management Program was also developed to facilitate expansion as future subprogram capabilities increase.
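As an illustration of the discrete time-domain analysis the abstract mentions, a state-space-averaged dc-dc converter can be stepped forward in time in a few lines; the averaged buck model and component values below are illustrative assumptions, not MAPPS code:

```python
# Discrete time-domain simulation of a state-space-averaged buck converter:
#   L di/dt = d*Vin - v,   C dv/dt = i - v/R
# integrated with forward Euler. All component values are hypothetical.
Vin, L, C, R, d = 28.0, 330e-6, 220e-6, 10.0, 0.5   # duty cycle d
dt, steps = 1e-6, 50000                             # 1 us step, 50 ms run
iL, vC = 0.0, 0.0
for _ in range(steps):
    diL = (d * Vin - vC) / L          # averaged inductor equation
    dvC = (iL - vC / R) / C           # capacitor / load equation
    iL += dt * diL
    vC += dt * dvC
print(f"steady-state output = {vC:.2f} V (ideal d*Vin = {d * Vin:.2f} V)")
```

The same loop, with the duty cycle perturbed, gives the kind of large-signal transient response that frequency-domain small-signal models cannot capture.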
Technology to the People: Kirsten Thompson--Milwaukee Public Library
ERIC Educational Resources Information Center
Library Journal, 2004
2004-01-01
Considering Kirsten Thompson's career in the Milwaukee Public Library (MPL), it is no surprise that as an undergraduate she studied both computer programming and psychology. The combination is the driving force behind her multiple projects at MPL, which have all focused on bringing technology to users. "Having the computers is great, but you…
Coupling between Current and Dynamic Magnetization : from Domain Walls to Spin Waves
NASA Astrophysics Data System (ADS)
Lucassen, M. E.
2012-05-01
So far, we have derived some general expressions for domain-wall motion and the spin motive force. We have seen that the β parameter plays a large role in both subjects. In all chapters of this thesis, there is an emphasis on the determination of this parameter. We also know how to incorporate thermal fluctuations for rigid domain walls, as shown above. In Chapter 2, we study a different kind of fluctuation: shot noise. This noise is caused by the fact that an electric current consists of electrons, and therefore has fluctuations. In the process, we also compute transmission and reflection coefficients for a rigid domain wall, and from them the linear momentum transfer. More work on fluctuations is done in Chapter 3. Here, we consider an (extrinsically pinned) rigid domain wall under the influence of thermal fluctuations that induces a current via the spin motive force. We compute how the resulting noise in the current is related to the β parameter. In Chapter 4 we look in more detail into the spin motive forces from field-driven domain walls. Using micromagnetic simulations, we compute the spin motive force due to vortex domain walls explicitly. As mentioned before, this gives qualitatively different results than for a rigid domain wall. The final subject in Chapter 5 is the application of the general expression for spin motive forces to magnons. Although this might seem to be unrelated to domain-wall motion, this calculation allows us to relate the β parameter to macroscopic transport coefficients. This work was supported by Stichting voor Fundamenteel Onderzoek der Materie (FOM), the Netherlands Organization for Scientific Research (NWO), and by the European Research Council (ERC) under the Seventh Framework Program (FP7).
CDD/SPARCLE: functional classification of proteins via subfamily domain architectures.
Marchler-Bauer, Aron; Bo, Yu; Han, Lianyi; He, Jane; Lanczycki, Christopher J; Lu, Shennan; Chitsaz, Farideh; Derbyshire, Myra K; Geer, Renata C; Gonzales, Noreen R; Gwadz, Marc; Hurwitz, David I; Lu, Fu; Marchler, Gabriele H; Song, James S; Thanki, Narmada; Wang, Zhouxi; Yamashita, Roxanne A; Zhang, Dachuan; Zheng, Chanjuan; Geer, Lewis Y; Bryant, Stephen H
2017-01-04
NCBI's Conserved Domain Database (CDD) aims at annotating biomolecular sequences with the location of evolutionarily conserved protein domain footprints, and functional sites inferred from such footprints. An archive of pre-computed domain annotation is maintained for proteins tracked by NCBI's Entrez database, and live search services are offered as well. CDD curation staff supplements a comprehensive collection of protein domain and protein family models, which have been imported from external providers, with representations of selected domain families that are curated in-house and organized into hierarchical classifications of functionally distinct families and sub-families. CDD also supports comparative analyses of protein families via conserved domain architectures, and a recent curation effort focuses on providing functional characterizations of distinct subfamily architectures using SPARCLE: Subfamily Protein Architecture Labeling Engine. CDD can be accessed at https://www.ncbi.nlm.nih.gov/Structure/cdd/cdd.shtml. Published by Oxford University Press on behalf of Nucleic Acids Research 2016. This work is written by (a) US Government employee(s) and is in the public domain in the US.
76 FR 13984 - Cloud Computing Forum & Workshop III
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-15
... DEPARTMENT OF COMMERCE National Institute of Standards and Technology Cloud Computing Forum... public workshop. SUMMARY: NIST announces the Cloud Computing Forum & Workshop III to be held on April 7... provide information on the NIST strategic and tactical Cloud Computing program, including progress on the...
Banerjee, Shyamashree; Gupta, Parth Sarthi Sen; Nayek, Arnab; Das, Sunit; Sur, Vishma Pratap; Seth, Pratyay; Islam, Rifat Nawaz Ul; Bandyopadhyay, Amal K
2015-01-01
Automated genome sequencing procedures are enriching the sequence database very fast. To achieve a balance between the entry of sequences into the database and their analyses, efficient software is required. To this end PHYSICO2, compared to the earlier PHYSICO and other public domain tools, is most efficient in that it i] extracts physicochemical, window-dependent and homologous position-based substitution (PWS) properties including positional and BLOCK-specific diversity and conservation, ii] provides users with optional flexibility in setting relevant input parameters, iii] helps users to prepare a BLOCK-FASTA file by the use of the Automated Block Preparation Tool of the program, iv] performs fast, accurate and user-friendly analyses and v] redirects itemized outputs in Excel format along with detailed methodology. The program package contains documentation describing the application of the methods. Overall the program acts as an efficient PWS-analyzer and finds application in sequence bioinformatics. PHYSICO2 is freely available at http://sourceforge.net/projects/physico2/ along with its documentation at https://sourceforge.net/projects/physico2/files/Documentation.pdf/download for all users.
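One example of a window-dependent physicochemical property of the kind PHYSICO2 extracts is a sliding-window Kyte-Doolittle hydropathy profile; the sketch below is illustrative only and is not PHYSICO2 source code:

```python
# Sliding-window Kyte-Doolittle hydropathy: the mean hydropathy of the
# residues in a window centered on each position of the sequence.
KD = {"A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5, "Q": -3.5,
      "E": -3.5, "G": -0.4, "H": -3.2, "I": 4.5, "L": 3.8, "K": -3.9,
      "M": 1.9, "F": 2.8, "P": -1.6, "S": -0.8, "T": -0.7, "W": -0.9,
      "Y": -1.3, "V": 4.2}

def hydropathy_profile(seq, window=9):
    """Mean hydropathy over a sliding window centered on each residue."""
    half = window // 2
    prof = []
    for i in range(half, len(seq) - half):
        segment = seq[i - half:i + half + 1]
        prof.append(sum(KD[aa] for aa in segment) / window)
    return prof

print(hydropathy_profile("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ")[:5])
```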
Banerjee, Shyamashree; Gupta, Parth Sarthi Sen; Nayek, Arnab; Das, Sunit; Sur, Vishma Pratap; Seth, Pratyay; Islam, Rifat Nawaz Ul; Bandyopadhyay, Amal K
2015-01-01
Automated genome sequencing procedures are enriching the sequence database very fast. To achieve a balance between the entry of sequences into the database and their analyses, efficient software is required. To this end PHYSICO2, compared to the earlier PHYSICO and other public domain tools, is most efficient in that it i] extracts physicochemical, window-dependent and homologous position-based substitution (PWS) properties including positional and BLOCK-specific diversity and conservation, ii] provides users with optional flexibility in setting relevant input parameters, iii] helps users to prepare a BLOCK-FASTA file by the use of the Automated Block Preparation Tool of the program, iv] performs fast, accurate and user-friendly analyses and v] redirects itemized outputs in Excel format along with detailed methodology. The program package contains documentation describing the application of the methods. Overall the program acts as an efficient PWS-analyzer and finds application in sequence bioinformatics. Availability: PHYSICO2 is freely available at http://sourceforge.net/projects/physico2/ along with its documentation at https://sourceforge.net/projects/physico2/files/Documentation.pdf/download for all users. PMID:26339154
Development and Applications of a Modular Parallel Process for Large Scale Fluid/Structures Problems
NASA Technical Reports Server (NTRS)
Guruswamy, Guru P.; Byun, Chansup; Kwak, Dochan (Technical Monitor)
2001-01-01
A modular process that can efficiently solve large scale multidisciplinary problems using massively parallel supercomputers is presented. The process integrates disciplines with diverse physical characteristics by retaining the efficiency of individual disciplines. Computational domain independence of individual disciplines is maintained using a meta-programming approach. The process integrates disciplines without affecting the combined performance. Results are demonstrated for large scale aerospace problems on several supercomputers. The scalability and portability of the approach are demonstrated on several parallel computers.
Gener: a minimal programming module for chemical controllers based on DNA strand displacement
Kahramanoğulları, Ozan; Cardelli, Luca
2015-01-01
Summary: Gener is a development module for programming chemical controllers based on DNA strand displacement. Gener is developed with the aim of providing a simple interface that minimizes the opportunities for programming errors: Gener allows the user to test the computations of the DNA programs based on a simple two-domain strand displacement algebra, the minimal available so far. The tool allows the user to perform stepwise computations with respect to the rules of the algebra as well as exhaustive search of the computation space with different options for exploration and visualization. Gener can be used in combination with existing tools, and in particular, its programs can be exported to Microsoft Research’s DSD tool as well as to LaTeX. Availability and implementation: Gener is available for download at the Cosbi website at http://www.cosbi.eu/research/prototypes/gener as a Windows executable that can be run on Mac OS X and Linux by using Mono. Contact: ozan@cosbi.eu PMID:25957353
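The stepwise and exhaustive-search modes described above can be illustrated with a toy rewrite system; the displacement rule below is a deliberately simplified stand-in, not Gener's actual two-domain algebra:

```python
# Exhaustive exploration of a computation space under a toy strand
# displacement rule: an invader strand releases the incumbent of a
# matching gate. States are sets of species; search is breadth-first.
from itertools import combinations

def apply_rules(state):
    """Yield every state reachable in one displacement step."""
    for s1, s2 in combinations(sorted(state), 2):
        for invader, gate in ((s1, s2), (s2, s1)):
            if gate.startswith("gate:" + invader + ":"):   # toy matching rule
                released = gate.split(":", 2)[2]           # incumbent strand
                new = list(state)
                new.remove(invader)
                new.remove(gate)
                yield frozenset(new + [released])

def computation_space(initial):
    """Breadth-first search of all reachable states."""
    seen, frontier = {initial}, [initial]
    while frontier:
        nxt = [s for state in frontier for s in apply_rules(state) if s not in seen]
        seen.update(nxt)
        frontier = nxt
    return seen

start = frozenset({"x", "gate:x:y", "gate:y:z"})   # x triggers a y, then a z
for state in computation_space(start):
    print(sorted(state))
```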
Gener: a minimal programming module for chemical controllers based on DNA strand displacement.
Kahramanoğulları, Ozan; Cardelli, Luca
2015-09-01
Gener is a development module for programming chemical controllers based on DNA strand displacement. Gener is developed with the aim of providing a simple interface that minimizes the opportunities for programming errors: Gener allows the user to test the computations of the DNA programs based on a simple two-domain strand displacement algebra, the minimal available so far. The tool allows the user to perform stepwise computations with respect to the rules of the algebra as well as exhaustive search of the computation space with different options for exploration and visualization. Gener can be used in combination with existing tools, and in particular, its programs can be exported to Microsoft Research's DSD tool as well as to LaTeX. Gener is available for download at the Cosbi website at http://www.cosbi.eu/research/prototypes/gener as a Windows executable that can be run on Mac OS X and Linux by using Mono. ozan@cosbi.eu. © The Author 2015. Published by Oxford University Press.
A new approach in the design of an interactive environment for teaching Hamiltonian digraphs
NASA Astrophysics Data System (ADS)
Iordan, A. E.; Panoiu, M.
2014-03-01
In this article the authors present the necessary steps in the object-oriented design of an interactive environment dedicated to the assimilation of knowledge in the domain of Hamiltonian graph theory, especially the simulation of algorithms which determine Hamiltonian trails and circuits. The modelling of the interactive environment is achieved through specific UML diagrams representing the steps of analysis, design and implementation. This interactive environment is very useful for both students and professors, because the computer programming domain, and digraph theory in particular, is comprehended and assimilated with difficulty by students.
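The kind of algorithm such an environment simulates, a backtracking search for Hamiltonian trails and circuits, can be sketched as follows (illustrative, not the article's implementation):

```python
# Backtracking enumeration of Hamiltonian trails (vertex-spanning paths)
# in a digraph given as an adjacency list; a trail whose last vertex has
# an arc back to its first vertex is a Hamiltonian circuit.
def hamiltonian_paths(graph, path):
    """Yield every Hamiltonian path extending `path` (a non-empty list)."""
    if len(path) == len(graph):
        yield list(path)
        return
    for nxt in graph[path[-1]]:
        if nxt not in path:
            path.append(nxt)
            yield from hamiltonian_paths(graph, path)
            path.pop()

digraph = {0: [1, 2], 1: [2, 3], 2: [3], 3: [0]}
for start in digraph:
    for p in hamiltonian_paths(digraph, [start]):
        closes = p[0] in digraph[p[-1]]            # closing arc exists?
        print(p, "(Hamiltonian circuit)" if closes else "(trail only)")
```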
DNET: A communications facility for distributed heterogeneous computing
NASA Technical Reports Server (NTRS)
Tole, John; Nagappan, S.; Clayton, J.; Ruotolo, P.; Williamson, C.; Solow, H.
1989-01-01
This document describes DNET, a heterogeneous data communications networking facility. DNET allows programs operating on hosts on dissimilar networks to communicate with one another without concern for computer hardware, network protocol, or operating system differences. The overall DNET network is defined as the collection of host machines/networks on which the DNET software is operating. Each underlying network is considered a DNET 'domain'. Data communications service is provided between any two processes on any two hosts on any of the networks (domains) that may be reached via DNET. DNET provides protocol-transparent, reliable, streaming data transmission between hosts (restricted initially to DECnet and TCP/IP networks). DNET also provides variable-length datagram service with optional return receipts.
NASA Astrophysics Data System (ADS)
Tang, William M., Dr.
2006-01-01
The second annual Scientific Discovery through Advanced Computing (SciDAC) Conference was held from June 25-29, 2006 at the new Hyatt Regency Hotel in Denver, Colorado. This conference showcased outstanding SciDAC-sponsored computational science results achieved during the past year across many scientific domains, with an emphasis on science at scale. Exciting computational science that has been accomplished outside of the SciDAC program both nationally and internationally was also featured to help foster communication between SciDAC computational scientists and those funded by other agencies. This was illustrated by many compelling examples of how domain scientists collaborated productively with applied mathematicians and computer scientists to effectively take advantage of terascale computers (capable of performing trillions of calculations per second) not only to accelerate progress in scientific discovery in a variety of fields but also to show great promise for being able to utilize the exciting petascale capabilities in the near future. The SciDAC program was originally conceived as an interdisciplinary computational science program based on the guiding principle that strong collaborative alliances between domain scientists, applied mathematicians, and computer scientists are vital to accelerated progress and associated discovery on the world's most challenging scientific problems. Associated verification and validation are essential in this successful program, which was funded by the US Department of Energy Office of Science (DOE OS) five years ago. As is made clear in many of the papers in these proceedings, SciDAC has fundamentally changed the way that computational science is now carried out in response to the exciting challenge of making the best use of the rapid progress in the emergence of more and more powerful computational platforms. In this regard, Dr. Raymond Orbach, Energy Undersecretary for Science at the DOE and Director of the OS has stated: `SciDAC has strengthened the role of high-end computing in furthering science. It is defining whole new fields for discovery.' (SciDAC Review, Spring 2006, p8). Application domains within the SciDAC 2006 conference agenda encompassed a broad range of science including: (i) the DOE core mission of energy research involving combustion studies relevant to fuel efficiency and pollution issues faced today and magnetic fusion investigations impacting prospects for future energy sources; (ii) fundamental explorations into the building blocks of matter, ranging from quantum chromodynamics - the basic theory that describes how quarks make up the protons and neutrons of all matter - to the design of modern high-energy accelerators; (iii) the formidable challenges of predicting and controlling the behavior of molecules in quantum chemistry and the complex biomolecules determining the evolution of biological systems; (iv) studies of exploding stars for insights into the nature of the universe; and (v) integrated climate modeling to enable realistic analysis of earth's changing climate. Associated research has made it quite clear that advanced computation is often the only means by which timely progress is feasible when dealing with these complex, multi-component physical, chemical, and biological systems operating over huge ranges of temporal and spatial scales. 
Working with the domain scientists, applied mathematicians and computer scientists have continued to develop the discretizations of the underlying equations and the complementary algorithms to enable improvements in solutions on modern parallel computing platforms as they evolve from the terascale toward the petascale regime. Moreover, the associated tremendous growth of data generated from the terabyte to the petabyte range demands not only the advanced data analysis and visualization methods to harvest the scientific information but also the development of efficient workflow strategies which can deal with the data input/output, management, movement, and storage challenges. If scientific discovery is expected to keep apace with the continuing progression from tera- to petascale platforms, the vital alliance between domain scientists, applied mathematicians, and computer scientists will be even more crucial. During the SciDAC 2006 Conference, some of the future challenges and opportunities in interdisciplinary computational science were emphasized in the Advanced Architectures Panel and by Dr. Victor Reis, Senior Advisor to the Secretary of Energy, who gave a featured presentation on `Simulation, Computation, and the Global Nuclear Energy Partnership.' Overall, the conference provided an excellent opportunity to highlight the rising importance of computational science in the scientific enterprise and to motivate future investment in this area. As Michael Strayer, SciDAC Program Director, has noted: `While SciDAC may have started out as a specific program, Scientific Discovery through Advanced Computing has become a powerful concept for addressing some of the biggest challenges facing our nation and our world.' Looking forward to next year, the SciDAC 2007 Conference will be held from June 24-28 at the Westin Copley Plaza in Boston, Massachusetts. Chairman: David Keyes, Columbia University. The Organizing Committee for the SciDAC 2006 Conference would like to acknowledge the individuals whose talents and efforts were essential to the success of the meeting. Special thanks go to Betsy Riley for her leadership in building the infrastructure support for the conference, for identifying and then obtaining contributions from our corporate sponsors, for coordinating all media communications, and for her efforts in organizing and preparing the conference proceedings for publication; to Tim Jones for handling the hotel scouting, subcontracts, and exhibits and stage production; to Angela Harris for handling supplies, shipping, and tracking, poster sessions set-up, and for her efforts in coordinating and scheduling the promotional activities that took place during the conference; to John Bui and John Smith for their superb wireless networking and A/V set-up and support; to Cindy Latham for Web site design, graphic design, and quality control of proceedings submissions; and to Pamelia Nixon-Hartje of Ambassador for budget and quality control of catering. We are grateful for the highly professional dedicated efforts of all of these individuals, who were the cornerstones of the SciDAC 2006 Conference. Thanks also go to Angela Beach of the ORNL Conference Center for her efforts in executing the contracts with the hotel, Carolyn James of Colorado State for on-site registration supervision, Lora Wolfe and Brittany Hagen for administrative support at ORNL, and Dami Rich and Andrew Sproles for graphic design and production. 
We are also most grateful to the Oak Ridge National Laboratory, especially Jeff Nichols, and to our corporate sponsors, Data Direct Networks, Cray, IBM, SGI, and Institute of Physics Publishing for their support. We especially express our gratitude to the featured speakers, invited oral speakers, invited poster presenters, session chairs, and advanced architecture panelists and chair for their excellent contributions on behalf of SciDAC 2006. We would like to express our deep appreciation to Lali Chatterjee, Graham Douglas, Margaret Smith, and the production team of Institute of Physics Publishing, who worked tirelessly to publish the final conference proceedings in a timely manner. Finally, heartfelt thanks are extended to Michael Strayer, Associate Director for OASCR and SciDAC Director, and to the DOE program managers associated with SciDAC for their continuing enthusiasm and strong support for the annual SciDAC Conferences as a special venue to showcase the exciting scientific discovery achievements enabled by the interdisciplinary collaborations championed by the SciDAC program.
Public health engineering education in India: current scenario, opportunities and challenges.
Hussain, Mohammad Akhtar; Sharma, Kavya; Zodpey, Sanjay
2011-01-01
Public health engineering can play an important and significant role in solving environmental health issues. In order to confront public health challenges emerging out of environmental problems, we need adequately trained public health engineers / environmental engineers. Considering the current burden of disease attributable to environmental factors and the expansion in scope of applications of public health / environmental engineering science, it is essential to understand the present scenario of teaching, training and capacity building programs in these areas. Against this background the present research was carried out to document the current teaching and training programs in public health engineering and related disciplines in India and to understand the potential opportunities and challenges available. A systematic, predefined approach was used to collect and assemble the data related to various teaching and training programs in public health engineering / environmental engineering in India. Public health engineering / environmental engineering education and training in the country is mainly offered through engineering institutions, as pre-service and in-service training. Pre-service programs include diploma, degree (graduate) and post-graduate courses affiliated to various state technical boards, institutes and universities, whereas in-service training is mainly provided by Government of India recognized engineering and public health training institutes. Though trainees of these programs acquire skills related to engineering sciences, they significantly lack public health skills. The teaching and training of public health engineering / environmental engineering is limited as a part of public health programs (MD Community Medicine, MPH, DPH) in India. There is a need to develop teaching and training of public health engineering or environmental engineering as an interdisciplinary subject. Public health institutes can play an important and significant role in this regard by engaging themselves in initiating specialized programs in this domain.
NASA Astrophysics Data System (ADS)
Kudryavtsev, Alexey N.; Kashkovsky, Alexander V.; Borisov, Semyon P.; Shershnev, Anton A.
2017-10-01
In the present work a computer code RCFS for the numerical simulation of chemically reacting compressible flows on hybrid CPU/GPU supercomputers is developed. It solves the 3D unsteady Euler equations for multispecies chemically reacting flows in general curvilinear coordinates using shock-capturing TVD schemes. Time advancement is carried out using explicit Runge-Kutta TVD schemes. The program implementation uses the CUDA application programming interface to perform GPU computations. Data between GPUs are distributed via a domain decomposition technique. The developed code is verified on a number of test cases including supersonic flow over a cylinder.
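A one-dimensional illustration of a shock-capturing TVD scheme with explicit TVD Runge-Kutta time advancement is sketched below; it is a Python stand-in for the CUDA implementation, applied to linear advection rather than the Euler equations, and is not the RCFS code:

```python
# MUSCL reconstruction with a minmod limiter and a two-stage TVD (SSP)
# Runge-Kutta step for u_t + u_x = 0 on a periodic domain: the square
# wave is advected one full period without creating new extrema.
import numpy as np

def minmod(a, b):
    return np.where(a * b > 0, np.sign(a) * np.minimum(np.abs(a), np.abs(b)), 0.0)

def rhs(u, dx):
    du = minmod(np.roll(u, -1) - u, u - np.roll(u, 1))   # limited slopes
    u_left = u + 0.5 * du            # reconstructed state at each right face
    flux = u_left                    # upwind flux for advection speed a = 1
    return -(flux - np.roll(flux, 1)) / dx

nx, cfl = 200, 0.4
x = np.linspace(0, 1, nx, endpoint=False)
dx = x[1] - x[0]
u = np.where((x > 0.3) & (x < 0.5), 1.0, 0.0)            # square wave
dt = cfl * dx
for _ in range(int(1.0 / dt)):                           # one full period
    u1 = u + dt * rhs(u, dx)                             # SSP-RK2 stage 1
    u = 0.5 * u + 0.5 * (u1 + dt * rhs(u1, dx))          # SSP-RK2 stage 2
print("min/max after one period:", u.min(), u.max())     # stays within [0, 1]
```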
NASA Astrophysics Data System (ADS)
Jogesh Babu, G.
2017-01-01
A year-long research program (Aug 2016-May 2017) on `Statistical, Mathematical and Computational Methods for Astronomy (ASTRO)' is well under way at the Statistical and Applied Mathematical Sciences Institute (SAMSI), a National Science Foundation research institute in Research Triangle Park, NC. This program has brought together astronomers, computer scientists, applied mathematicians and statisticians. The main aims of this program are: to foster cross-disciplinary activities; to accelerate the adoption of modern statistical and mathematical tools into modern astronomy; and to develop new tools needed for important astronomical research problems. The program provides multiple avenues for cross-disciplinary interactions, including several workshops, long-term visitors, and regular teleconferences, so participants can continue collaborations, even if they can only spend limited time in residence at SAMSI. The main program is organized around five working groups: i) Uncertainty Quantification and Astrophysical Emulation; ii) Synoptic Time Domain Surveys; iii) Multivariate and Irregularly Sampled Time Series; iv) Astrophysical Populations; v) Statistics, Computation, and Modeling in Cosmology. A brief description of the work under way by each of these groups will be given. Overlaps among the various working groups will also be highlighted. How the wider astronomy community can both participate in and benefit from the activities will be briefly mentioned.
PDS: A Performance Database Server
Berry, Michael W.; Dongarra, Jack J.; Larose, Brian H.; ...
1994-01-01
The process of gathering, archiving, and distributing computer benchmark data is a cumbersome task usually performed by computer users and vendors with little coordination. Most important, there is no publicly available central depository of performance data for all ranges of machines from personal computers to supercomputers. We present an Internet-accessible performance database server (PDS) that can be used to extract current benchmark data and literature. As an extension to the X-Windows-based user interface (Xnetlib) to the Netlib archival system, PDS provides an on-line catalog of public domain computer benchmarks such as the LINPACK benchmark, Perfect benchmarks, and the NAS parallel benchmarks. PDS does not reformat or present the benchmark data in any way that conflicts with the original methodology of any particular benchmark; it is thereby devoid of any subjective interpretations of machine performance. We believe that all branches (research laboratories, academia, and industry) of the general computing community can use this facility to archive performance metrics and make them readily available to the public. PDS can provide a more manageable approach to the development and support of a large dynamic database of published performance metrics.
NASA Astrophysics Data System (ADS)
Pevenstein, Jack Edward
This dissertation presents 18 alternative models for computing the social rate of return (SRR) of the joint Department of Energy (DOE)-National Institute of Standards and Technology (NIST) Energy-Related Inventions Program (ERIP) from 1975 to 1995. The models differ in the choice of societal benefit, adjustments made to the benefits, and accounting for initial investments in ERIP and annual program appropriations. Alternative quantitative measures of societal benefit include annual gross market sales of successfully commercialized ERIP-supported inventions, annual energy savings resulting from the use of such inventions, and pollution-remediation cost reductions due to decreased carbon emissions from greenhouse gases associated with more efficient energy generation. SRR computation employs the net present value (NPV) model, with the SRR being the discount rate that reduces the NPV of a stream of societal benefits to zero over a period of n years given an initial investment and annual program appropriations. The SRR is the total rate of return to the nation from public investment in ERIP. The data used for computation were assembled by Dr. Marilyn A. Brown and her staff at Oak Ridge National Laboratory under contract to DOE since 1985. Other data on energy use and carbon emission from greenhouse gas production come from official publications of DOE's Energy Information Administration. Mean ERIP SRR = 412.7% with standard deviation = +/-426.5%. The population of the SRR sample is accepted as normally distributed at an alpha = 0.05, using the Kolmogorov-Smirnov test. These SRRs, which appear reasonable in comparison with those computed by Professor Edwin Mansfield (Wharton School) for inventions and by Dr. Gregory Tassey (NIST Chief Economist) for NIST programs supporting innovations in measurement technology, show a significant underinvestment in public service technology innovation evaluation programs for independent inventors and small technology-oriented businesses. Moreover, it is argued that ERIP [with its participants] is a good representation of a larger community of independent inventors and innovators comprising a resource the writer calls the "national innovation infrastructure." This national innovation infrastructure, like ERIP, is underinvested in terms of public support. Thus, the nation would benefit from a large-scale, value-adding, public-service innovative technology evaluation program modeled on ERIP. Further, support of such technology evaluation programs at both state and Federal levels should be an important priority of public technology policy.
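The SRR defined here is the internal rate of return of the net benefit stream: the discount rate at which the net present value of the benefits, net of the initial investment and annual appropriations, equals zero. A bisection sketch with hypothetical cash flows (not ERIP data) makes the computation concrete:

```python
# SRR as the root of the NPV function, found by bisection.
def npv(rate, flows):
    """flows[t] is the net societal benefit in year t (flows[0] < 0)."""
    return sum(f / (1.0 + rate) ** t for t, f in enumerate(flows))

def social_rate_of_return(flows, lo=0.0, hi=50.0, tol=1e-9):
    """Bisect for the rate where NPV crosses zero (assumes one sign change)."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if npv(mid, flows) > 0:
            lo = mid          # NPV still positive: the rate can go higher
        else:
            hi = mid
    return 0.5 * (lo + hi)

flows = [-10.0, 25.0, 40.0, 60.0, 80.0, 90.0]   # investment, then benefits
print(f"SRR = {social_rate_of_return(flows):.1%}")
```

With these illustrative flows the SRR is roughly 300%, the same order of magnitude as the mean ERIP SRR the abstract reports.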
GPR data processing computer software for the PC
Lucius, Jeffrey E.; Powers, Michael H.
2002-01-01
The computer software described in this report is designed for processing ground penetrating radar (GPR) data on Intel-compatible personal computers running the MS-DOS operating system or MS Windows 3.x/95/98/ME/2000. The earliest versions of these programs were written starting in 1990. At that time, commercially available GPR software did not meet the processing and display requirements of the USGS. Over the years, the programs were refined and new features and programs were added. The collection of computer programs presented here can perform all basic processing of GPR data, including velocity analysis and generation of CMP stacked sections and data volumes, as well as create publication quality data images.
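One representative processing step named in the report, normal-moveout correction and CMP stacking, can be sketched as follows; the synthetic gather and parameter values are hypothetical, and this is not the USGS software:

```python
# NMO correction and CMP stack: a reflection recorded at antenna offset x
# arrives at t(x) = sqrt(t0**2 + (x/v)**2); correcting each trace back to
# its zero-offset time t0 and summing aligns the event. Times in ns.
import numpy as np

dt, nt = 1.0, 512                                # 1 ns sampling, 512 samples
offsets = np.array([0.0, 0.5, 1.0, 1.5, 2.0])    # antenna offsets, m
v, t0 = 0.1, 150.0                               # 0.1 m/ns velocity, 150 ns event

gather = np.zeros((offsets.size, nt))
for k, x in enumerate(offsets):                  # synthetic spike per trace
    t_x = np.sqrt(t0**2 + (x / v) ** 2)          # hyperbolic moveout
    gather[k, int(round(t_x / dt))] = 1.0

stack = np.zeros(nt)                             # NMO-correct and stack
for k, x in enumerate(offsets):
    for i0 in range(nt):
        t_x = np.sqrt((i0 * dt) ** 2 + (x / v) ** 2)
        j = int(round(t_x / dt))
        if j < nt:
            stack[i0] += gather[k, j]
print("stack peaks at sample", int(stack.argmax()), "(expect 150)")
```

Scanning the stack amplitude over trial velocities is the essence of the velocity analysis the report describes.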
Provenance Challenges for Earth Science Dataset Publication
NASA Technical Reports Server (NTRS)
Tilmes, Curt
2011-01-01
Modern science is increasingly dependent on computational analysis of very large data sets. Organizing, referencing, publishing those data has become a complex problem. Published research that depends on such data often fails to cite the data in sufficient detail to allow an independent scientist to reproduce the original experiments and analyses. This paper explores some of the challenges related to data identification, equivalence and reproducibility in the domain of data intensive scientific processing. It will use the example of Earth Science satellite data, but the challenges also apply to other domains.
Dislocation dynamics in non-convex domains using finite elements with embedded discontinuities
NASA Astrophysics Data System (ADS)
Romero, Ignacio; Segurado, Javier; LLorca, Javier
2008-04-01
The standard strategy developed by Van der Giessen and Needleman (1995 Modelling Simul. Mater. Sci. Eng. 3 689) to simulate dislocation dynamics in two-dimensional finite domains was modified to account for the effect of dislocations leaving the crystal through a free surface in the case of arbitrary non-convex domains. The new approach incorporates the displacement jumps across the slip segments of the dislocations that have exited the crystal within the finite element analysis carried out to compute the image stresses on the dislocations due to the finite boundaries. This is done in a simple computationally efficient way by embedding the discontinuities in the finite element solution, a strategy often used in the numerical simulation of crack propagation in solids. Two academic examples are presented to validate and demonstrate the extended model and its implementation within a finite element program is detailed in the appendix.
ERIC Educational Resources Information Center
Wood, Eileen; Gottardo, Alexandra; Grant, Amy; Evans, Mary Ann; Phillips, Linda; Savage, Robert
2012-01-01
As computers become an increasingly ubiquitous part of young children's lives there is a need to examine how best to harness digital technologies to promote learning in early childhood education contexts. The development of emergent literacy skills is 1 domain for which numerous software programs are available for young learners. In this study, we…
11 CFR 100.94 - Uncompensated Internet activity by individuals that is not a contribution.
Code of Federal Regulations, 2012 CFR
2012-01-01
... of another person's Web site; and any other form of communication distributed over the Internet. (c... not limited to: Computers, software, Internet domain names, Internet Service Providers (ISP), and any... definition of contribution: (1) Any payment for a public communication (as defined in 11 CFR 100.26) other...
11 CFR 100.94 - Uncompensated Internet activity by individuals that is not a contribution.
Code of Federal Regulations, 2013 CFR
2013-01-01
... of another person's Web site; and any other form of communication distributed over the Internet. (c... not limited to: Computers, software, Internet domain names, Internet Service Providers (ISP), and any... definition of contribution: (1) Any payment for a public communication (as defined in 11 CFR 100.26) other...
11 CFR 100.155 - Uncompensated Internet activity by individuals that is not an expenditure.
Code of Federal Regulations, 2013 CFR
2013-01-01
... another person's website; and any other form of communication distributed over the Internet. (c) Equipment... limited to: Computers, software, Internet domain names, Internet Service Providers (ISP), and any other...: (1) Any payment for a public communication (as defined in 11 CFR 100.26) other than a nominal fee; (2...
11 CFR 100.155 - Uncompensated Internet activity by individuals that is not an expenditure.
Code of Federal Regulations, 2012 CFR
2012-01-01
... another person's website; and any other form of communication distributed over the Internet. (c) Equipment... limited to: Computers, software, Internet domain names, Internet Service Providers (ISP), and any other...: (1) Any payment for a public communication (as defined in 11 CFR 100.26) other than a nominal fee; (2...
BEST (bioreactor economics, size and time of operation) is an Excel™ spreadsheet-based model that is used in conjunction with the public domain geochemical modeling software, PHREEQCI. The BEST model is used in the design process of sulfate-reducing bacteria (SRB) field bioreacto...
Information Security Awareness On-Line Materials Design with Knowledge Maps
ERIC Educational Resources Information Center
Shaw, Ruey-Shiang; Keh, Huan-Chao; Huang, Nan-Ching; Huang, Tien-Chuan
2011-01-01
Although Information Security Awareness is known as a primary and important issue in the domain of Information Security, the CSI computer crime and security survey showed poor security awareness training in the public and private sectors. In many studies, the authors have found that the usage of knowledge maps helps the process of learning and conception…
A Summary of the Naval Postgraduate School Research Program and Recent Publications
1990-09-01
Research summaries include principles to divide the spectrum of a wide-band spread-spectrum signal into sub-bands, implemented in a MATLAB computer program on a 386-type computer; because of the high rf frequency of the original signal, a large data sample was required and an extended version of MATLAB was used, and effects due to the fiber optic pickup array were examined. MATLAB was also applied in other areas, such as orbital mechanics and weather prediction, and Professor Gragg has developed numerous MATLAB programs for linear programming problems.
How a Climbing Wall Became Part of a NEW Physical Education Program
ERIC Educational Resources Information Center
Cook, Gordon; Boyan, Al; Mendelsohn, Alice; Green, Alison; Woolvett, Colleen
2007-01-01
The introduction of a NEW physical education (PE) program at Ancaster Senior Public School had, at its root, the desire to make physical activity an inclusive domain for both athletic students and those not so inclined. With the growing concerns over the rapid and consistent rise in childhood obesity rates it was evident that the current model of…
Code of Federal Regulations, 2010 CFR
2010-07-01
... such as, but not limited to, the exercise of police powers affecting (or relating to) the health, safety, and welfare of the affected population; taxation; and the exercise of the power of eminent domain... existing environmental or public health programs administered by the tribal governing body and a copy of...
Intelligent automated surface grid generation
NASA Technical Reports Server (NTRS)
Yao, Ke-Thia; Gelsey, Andrew
1995-01-01
The goal of our research is to produce a flexible, general grid generator for automated use by other programs, such as numerical optimizers. The current trend in the gridding field is toward interactive gridding, which more readily taps into the spatial reasoning abilities of the human user through the use of a graphical interface with a mouse. However, a sometimes fruitful approach to generating new designs is to apply an optimizer with shape modification operators to improve an initial design. In order for this approach to be useful, the optimizer must be able to automatically grid and evaluate the candidate designs. This paper describes an intelligent gridder that is capable of analyzing the topology of the spatial domain and predicting approximate physical behaviors based on the geometry of the spatial domain to automatically generate grids for computational fluid dynamics simulators. Typically, gridding programs are given a partitioning of the spatial domain to assist the gridder. Our gridder is capable of performing this partitioning, which enables it to automatically grid spatial domains of a wide range of configurations.
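A minimal example of automated structured-grid construction is transfinite interpolation, which fills a domain from its four boundary curves; the sketch below is a toy stand-in for the gridder described above:

```python
# Coons-patch transfinite interpolation (TFI): blend the four boundary
# polylines of a 2D domain into an interior structured grid.
import numpy as np

def tfi(bottom, top, left, right):
    """Each boundary is an (n,2) or (m,2) polyline with matching corners."""
    n, m = bottom.shape[0], left.shape[0]
    s = np.linspace(0, 1, n)[None, :, None]   # parameter along bottom/top
    t = np.linspace(0, 1, m)[:, None, None]   # parameter along left/right
    grid = ((1 - t) * bottom[None, :, :] + t * top[None, :, :]
            + (1 - s) * left[:, None, :] + s * right[:, None, :]
            - (1 - s) * (1 - t) * bottom[0] - s * (1 - t) * bottom[-1]
            - (1 - s) * t * top[0] - s * t * top[-1])  # subtract corner terms
    return grid                                # shape (m, n, 2)

n, m = 21, 11
u = np.linspace(0, 1, n)
bottom = np.stack([u, 0.1 * np.sin(np.pi * u)], axis=1)   # curved lower wall
top = np.stack([u, np.ones_like(u)], axis=1)
v = np.linspace(0, 1, m)
left = np.stack([np.zeros_like(v), v], axis=1)
right = np.stack([np.ones_like(v), v], axis=1)
print(tfi(bottom, top, left, right).shape)     # -> (11, 21, 2)
```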
Arc_Mat: a Matlab-based spatial data analysis toolbox
NASA Astrophysics Data System (ADS)
Liu, Xingjian; Lesage, James
2010-03-01
This article presents an overview of Arc_Mat, a Matlab-based spatial data analysis software package whose source code has been placed in the public domain. An earlier version of the Arc_Mat toolbox was developed to extract map polygon and database information from ESRI shapefiles and provide high quality mapping in the Matlab software environment. We discuss revisions to the toolbox that: utilize enhanced computing and graphing capabilities of more recent versions of Matlab, restructure the toolbox with object-oriented programming features, and provide more comprehensive functions for spatial data analysis. The Arc_Mat toolbox functionality includes basic choropleth mapping; exploratory spatial data analysis that provides exploratory views of spatial data through various graphs, for example, histogram, Moran scatterplot, three-dimensional scatterplot, density distribution plot, and parallel coordinate plots; and more formal spatial data modeling that draws on the extensive Spatial Econometrics Toolbox functions. A brief review of the design aspects of the revised Arc_Mat is described, and we provide some illustrative examples that highlight representative uses of the toolbox. Finally, we discuss programming with and customizing the Arc_Mat toolbox functionalities.
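To make the Moran scatterplot mentioned above concrete, a minimal NumPy sketch follows. It illustrates the statistic itself, not Arc_Mat's MATLAB code, and assumes a row-standardized spatial weights matrix; the five-region example data are invented.

```python
import numpy as np

def moran_scatter(values, W):
    """Return standardized values z and their spatial lags Wz -- the two
    axes of a Moran scatterplot. With a row-standardized weights matrix W,
    Moran's I reduces to the slope (z @ Wz) / (z @ z)."""
    z = (values - values.mean()) / values.std()
    Wz = W @ z
    return z, Wz, (z @ Wz) / (z @ z)

# Toy example: five regions along a line, each bordering its neighbors.
W = np.array([[0, 1, 0, 0, 0],
              [.5, 0, .5, 0, 0],
              [0, .5, 0, .5, 0],
              [0, 0, .5, 0, .5],
              [0, 0, 0, 1, 0]])
y = np.array([5.0, 4.0, 3.0, 2.0, 1.0])   # smoothly varying attribute
z, Wz, I = moran_scatter(y, W)
print(f"Moran's I = {I:.2f}")             # 0.60: positive autocorrelation
```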
Molecular scaffold analysis of natural products databases in the public domain.
Yongye, Austin B; Waddell, Jacob; Medina-Franco, José L
2012-11-01
Natural products represent important sources of bioactive compounds in drug discovery efforts. In this work, we compiled five natural products databases available in the public domain and performed a comprehensive chemoinformatic analysis focused on the content and diversity of the scaffolds with an overview of the diversity based on molecular fingerprints. The natural products databases were compared with each other and with a set of molecules obtained from in-house combinatorial libraries, and with a general screening commercial library. It was found that publicly available natural products databases have different scaffold diversity. In contrast to the common concept that larger libraries have the largest scaffold diversity, the largest natural products collection analyzed in this work was not the most diverse. The general screening library showed, overall, the highest scaffold diversity. However, considering the most frequent scaffolds, the general reference library was the least diverse. In general, natural products databases in the public domain showed low molecule overlap. In addition to benzene and acyclic compounds, flavones, coumarins, and flavanones were identified as the most frequent molecular scaffolds across the different natural products collections. The results of this work have direct implications in the computational and experimental screening of natural product databases for drug discovery. © 2012 John Wiley & Sons A/S.
1992-02-01
universities and industry who have resident appointments for limited periods of time, and by consultants. Members of NASA's research staff also may be…Submitted to Journal of Computational Physics. Banks, H. T., G. Propst, and R. J. Silcox: A comparison of time domain boundary conditions for acoustic…2, pp. 117-145, 1991. Nicol, David M.: The cost of conservative synchronization in parallel discrete event simulations. ICASE Report No. 90-20, May
Ketoff, Serge; Khonsari, Roman Hossein; Schouman, Thomas; Bertolus, Chloé
2014-11-01
Handling 3-dimensional reconstructions of computed tomographic scans on portable devices is problematic because of the size of the Digital Imaging and Communications in Medicine (DICOM) stacks. The authors provide a user-friendly method allowing the production, transfer, and sharing of good-quality 3-dimensional reconstructions on smartphones and tablets. Copyright © 2014 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.
Granich, Reuben; Gupta, Somya; Hall, Irene; Aberle-Grasse, John; Hader, Shannon; Mermin, Jonathan
2017-04-01
In 2014, the Joint United Nations Program on HIV/AIDS (UNAIDS) issued treatment goals for human immunodeficiency virus (HIV). The 90-90-90 target specifies that by 2020, 90% of individuals living with HIV will know their HIV status, 90% of people with diagnosed HIV infection will receive antiretroviral treatment (ART), and 90% of those taking ART will be virally suppressed. Consistent methods and routine reporting in the public domain will be necessary for tracking progress towards the 90-90-90 target. For the period 2010-2016, we searched PubMed, UNAIDS country progress reports, World Health Organization (WHO), UNAIDS reports, national surveillance and program reports, United States President's Emergency Plan for AIDS Relief (PEPFAR) Country Operational Plans, and conference presentations and/or abstracts for the latest available national HIV care continuum in the public domain. Continua of care included the number and proportion of people living with HIV (PLHIV) who are diagnosed, on ART, and virally suppressed out of the estimated number of PLHIV. We ranked the described methods for indicators to derive high-, medium-, and low-quality continuum. For 2010-2016, we identified 53 national care continua with viral suppression estimates representing 19.7 million (54%) of the 2015 global estimate of PLHIV. Of the 53, 6 (with 2% of global burden) were high quality, using standard surveillance methods to derive an overall denominator and program data from national cohorts for estimating steps in the continuum. Only nine countries in sub-Saharan Africa had care continua with viral suppression estimates. Of the 53 countries, the average proportion of the aggregate of PLHIV from all countries on ART was 48%, and the proportion of PLHIV who were virally suppressed was 40%. Seven countries (Sweden, Cambodia, United Kingdom, Switzerland, Denmark, Rwanda, and Namibia) were within 12% and 10% of achieving the 90-90-90 target for "on ART" and for "viral suppression," respectively. The limitations to consider when interpreting the results include significant variation in methods used to determine national continua and the possibility that complete continua were not available through our comprehensive search of the public domain. Relatively few complete national continua of care are available in the public domain, and there is considerable variation in the methods for determining progress towards the 90-90-90 target. Despite bearing the highest HIV burden, national care continua from sub-Saharan Africa were less likely to be in the public domain. A standardized monitoring and evaluation approach could improve the use of scarce resources to achieve 90-90-90 through improved transparency, accountability, and efficiency.
Sumpradit, Nithima; Suttajit, Siritree; Hunnangkul, Saowalak; Wisaijohn, Thunthita; Putthasri, Weerasak
2014-01-01
Introduction Thai pharmacy education consists of two undergraduate programs, a 5-year Bachelor of Science in Pharmacy (BScPsci and BScPcare) degree and a 6-year Doctor of Pharmacy (Pharm D). Pharmacy students who wish to serve in the public sector need to enroll in the public service program. This study aims to compare the perception of professional competency among new pharmacy graduates from the three different pharmacy programs available in 2013 who enrolled in the public service program. Methods A cross-sectional survey was conducted among new pharmacy graduates in 2013 using a self-administered, structured, close-ended questionnaire. The questionnaire consisted of respondents’ characteristics and perception of professional competencies. The competency questions consisted of 13 items with a 5-point scale. Data collection was conducted during Thailand’s annual health professional meeting on April 2, 2013 for workplace selection of pharmacy graduates. Results A total of 266 new pharmacy graduates responded to the questionnaire (response rate 49.6%). There were no significant differences in sex and admission modes across the three pharmacy programs. Pharm D graduates reported highest competency in acute care services, medication reconciliation services, and primary care services among the other two programs. BScPsci graduates reported more competence in consumer health protection and herbal and alternative medicines than BScPcare graduates. There were significant differences in three competency domains: patient care, consumer protection and community health services, and drug review and information, but no significant differences in the health administration and communication domain among three pharmacy programs. Conclusion Despite a complete change into a 6-year Pharm D program in 2014, pharmacy education in Thailand should continue evolving to be responsive to the needs of the health system. An annual survey of new pharmacy graduates should be continued, to monitor changes of professional competency across different program tracks and other factors which may influence their contribution to the health service system. Likewise, a longitudinal monitoring of their competencies in the graduate cohort should be conducted. PMID:25337000
Sumpradit, Nithima; Suttajit, Siritree; Hunnangkul, Saowalak; Wisaijohn, Thunthita; Putthasri, Weerasak
2014-01-01
Thai pharmacy education consists of two undergraduate programs, a 5-year Bachelor of Science in Pharmacy (BScPsci and BScPcare) degree and a 6-year Doctor of Pharmacy (Pharm D). Pharmacy students who wish to serve in the public sector need to enroll in the public service program. This study aims to compare the perception of professional competency among new pharmacy graduates from the three different pharmacy programs available in 2013 who enrolled in the public service program. A cross-sectional survey was conducted among new pharmacy graduates in 2013 using a self-administered, structured, close-ended questionnaire. The questionnaire consisted of respondents' characteristics and perception of professional competencies. The competency questions consisted of 13 items with a 5-point scale. Data collection was conducted during Thailand's annual health professional meeting on April 2, 2013 for workplace selection of pharmacy graduates. A total of 266 new pharmacy graduates responded to the questionnaire (response rate 49.6%). There were no significant differences in sex and admission modes across the three pharmacy programs. Pharm D graduates reported highest competency in acute care services, medication reconciliation services, and primary care services among the other two programs. BScPsci graduates reported more competence in consumer health protection and herbal and alternative medicines than BScPcare graduates. There were significant differences in three competency domains: patient care, consumer protection and community health services, and drug review and information, but no significant differences in the health administration and communication domain among three pharmacy programs. Despite a complete change into a 6-year Pharm D program in 2014, pharmacy education in Thailand should continue evolving to be responsive to the needs of the health system. An annual survey of new pharmacy graduates should be continued, to monitor changes of professional competency across different program tracks and other factors which may influence their contribution to the health service system. Likewise, a longitudinal monitoring of their competencies in the graduate cohort should be conducted.
Simulation tools for robotics research and assessment
NASA Astrophysics Data System (ADS)
Fields, MaryAnne; Brewer, Ralph; Edge, Harris L.; Pusey, Jason L.; Weller, Ed; Patel, Dilip G.; DiBerardino, Charles A.
2016-05-01
The Robotics Collaborative Technology Alliance (RCTA) program focuses on four overlapping technology areas: Perception, Intelligence, Human-Robot Interaction (HRI), and Dexterous Manipulation and Unique Mobility (DMUM). In addition, the RCTA program has a requirement to assess progress of this research in standalone as well as integrated form. Since the research is evolving and the robotic platforms with unique mobility and dexterous manipulation are in the early development stage and very expensive, an alternate approach is needed for efficient assessment. Simulation of robotic systems, platforms, sensors, and algorithms, is an attractive alternative to expensive field-based testing. Simulation can provide insight during development and debugging unavailable by many other means. This paper explores the maturity of robotic simulation systems for applications to real-world problems in robotic systems research. Open source (such as Gazebo and Moby), commercial (Simulink, Actin, LMS), government (ANVEL/VANE), and the RCTA-developed RIVET simulation environments are examined with respect to their application in the robotic research domains of Perception, Intelligence, HRI, and DMUM. Tradeoffs for applications to representative problems from each domain are presented, along with known deficiencies and disadvantages. In particular, no single robotic simulation environment adequately covers the needs of the robotic researcher in all of the domains. Simulation for DMUM poses unique constraints on the development of physics-based computational models of the robot, the environment and objects within the environment, and the interactions between them. Most current robot simulations focus on quasi-static systems, but dynamic robotic motion places an increased emphasis on the accuracy of the computational models. In order to understand the interaction of dynamic multi-body systems, such as limbed robots, with the environment, it may be necessary to build component-level computational models to provide the necessary simulation fidelity for accuracy. However, the Perception domain remains the most problematic for adequate simulation performance due to the often cartoon nature of computer rendering and the inability to model realistic electromagnetic radiation effects, such as multiple reflections, in real-time.
Progress in protein crystallography.
Dauter, Zbigniew; Wlodawer, Alexander
2016-01-01
Macromolecular crystallography evolved enormously from the pioneering days, when structures were solved by "wizards" performing all complicated procedures almost by hand. In the current situation crystal structures of large systems can be often solved very effectively by various powerful automatic programs in days or hours, or even minutes. Such progress is to a large extent coupled to the advances in many other fields, such as genetic engineering, computer technology, availability of synchrotron beam lines and many other techniques, creating the highly interdisciplinary science of macromolecular crystallography. Due to this unprecedented success crystallography is often treated as one of the analytical methods and practiced by researchers interested in structures of macromolecules, but not highly competent in the procedures involved in the process of structure determination. One should therefore take into account that the contemporary, highly automatic systems can produce results almost without human intervention, but the resulting structures must be carefully checked and validated before their release into the public domain.
NASA Technical Reports Server (NTRS)
Elrad, Tzilla (Editor); Filman, Robert E. (Editor); Bader, Atef (Editor)
2001-01-01
Computer science has experienced an evolution in programming languages and systems from the crude assembly and machine codes of the earliest computers through concepts such as formula translation, procedural programming, structured programming, functional programming, logic programming, and programming with abstract data types. Each of these steps in programming technology has advanced our ability to achieve clear separation of concerns at the source code level. Currently, the dominant programming paradigm is object-oriented programming - the idea that one builds a software system by decomposing a problem into objects and then writing the code of those objects. Such objects abstract together behavior and data into a single conceptual and physical entity. Object-orientation is reflected in the entire spectrum of current software development methodologies and tools - we have OO methodologies, analysis and design tools, and OO programming languages. Writing complex applications such as graphical user interfaces, operating systems, and distributed applications while maintaining comprehensible source code has been made possible with OOP. Success at developing simpler systems leads to aspirations for greater complexity. Object orientation is a clever idea, but has certain limitations. We are now seeing that many requirements do not decompose neatly into behavior centered on a single locus. Object technology has difficulty localizing concerns involving global constraints and pandemic behaviors, appropriately segregating concerns, and applying domain-specific knowledge. Post-object programming (POP) mechanisms that look to increase the expressiveness of the OO paradigm are a fertile arena for current research. Examples of POP technologies include domain-specific languages, generative programming, generic programming, constraint languages, reflection and metaprogramming, feature-oriented development, views/viewpoints, and asynchronous message brokering. (Czarnecki and Eisenecker's book includes a good survey of many of these technologies).
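A small Python sketch of the post-object idea: a cross-cutting concern (timing/logging) kept separate from the class it instruments, in the spirit of aspect-oriented programming. This is an illustrative analogue, not code from the proceedings; the class and names are invented.

```python
import functools
import time

def traced(fn):
    """Cross-cutting 'aspect': timing/logging woven around any method
    without editing the method body itself."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        t0 = time.perf_counter()
        result = fn(*args, **kwargs)
        print(f"{fn.__qualname__} took {time.perf_counter() - t0:.6f}s")
        return result
    return wrapper

class Account:
    def __init__(self, balance):
        self.balance = balance

    @traced                       # the concern is applied declaratively
    def withdraw(self, amount):
        self.balance -= amount
        return self.balance

Account(100).withdraw(30)         # prints a timing line, returns 70
```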
Preconditioned implicit solvers for the Navier-Stokes equations on distributed-memory machines
NASA Technical Reports Server (NTRS)
Ajmani, Kumud; Liou, Meng-Sing; Dyson, Rodger W.
1994-01-01
The GMRES method is parallelized, and combined with local preconditioning to construct an implicit parallel solver to obtain steady-state solutions for the Navier-Stokes equations of fluid flow on distributed-memory machines. The new implicit parallel solver is designed to preserve the convergence rate of the equivalent 'serial' solver. A static domain-decomposition is used to partition the computational domain amongst the available processing nodes of the parallel machine. The SPMD (Single-Program Multiple-Data) programming model is combined with message-passing tools to develop the parallel code on a 32-node Intel Hypercube and a 512-node Intel Delta machine. The implicit parallel solver is validated for internal and external flow problems, and is found to produce solutions identical to those obtained on a Cray Y-MP/8. A peak computational speed of 2300 MFlops/sec has been achieved on 512 nodes of the Intel Delta machine for a problem size of 1024 K equations (256 K grid points).
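The core idea, GMRES accelerated by a local (incomplete-factorization) preconditioner, can be sketched serially with SciPy. This is not the authors' parallel Navier-Stokes solver; the sparse test matrix below is a stand-in, and SciPy is an assumed dependency.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import gmres, spilu, LinearOperator

# Nonsymmetric tridiagonal system as a stand-in for a flow Jacobian.
n = 200
main = 2.0 * np.ones(n)
lower = -1.0 * np.ones(n - 1)
upper = -0.7 * np.ones(n - 1)               # asymmetry -> GMRES territory
A = sp.diags([lower, main, upper], [-1, 0, 1], format="csc")
b = np.ones(n)

# Local preconditioner: incomplete LU, wrapped as a linear operator.
ilu = spilu(A)
M = LinearOperator((n, n), matvec=ilu.solve)

x_prec, info = gmres(A, b, M=M)             # preconditioned solve
x_raw, info_raw = gmres(A, b)               # unpreconditioned, for contrast
print(info, info_raw, np.linalg.norm(A @ x_prec - b))
```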
ERIC Educational Resources Information Center
Bergerson, Peter J., Ed.
The 16 chapters of this book offer innovative instructional techniques used to train public managers. It presents public management concepts along with such subtopics as organizational theory and ethics, research skills, program evaluation, financial management, computers and communication skills in public administration, comparative public…
Cumulative reports and publications through December 31, 1991
NASA Technical Reports Server (NTRS)
1992-01-01
A reports and publications list is given from the Institute for Computer Applications in Science and Engineering (ICASE) through December 31, 1991. The major categories of the current ICASE research program are; numerical methods, control and parameter identification problems, computational problems in engineering and the physical sciences, and computer systems and software. Since ICASE reports are intended to be preprints of articles that will appear in journals or conference proceedings, the published reference is included when available.
Qi, Hong; Qiao, Yao-Bin; Ren, Ya-Tao; Shi, Jing-Wen; Zhang, Ze-Yu; Ruan, Li-Ming
2016-10-17
Sequential quadratic programming (SQP) is used as an optimization algorithm to reconstruct the optical parameters based on the time-domain radiative transfer equation (TD-RTE). Numerous time-resolved measurement signals are obtained using the TD-RTE as the forward model. For high computational efficiency, the gradient of the objective function is calculated using an adjoint equation technique. The SQP algorithm is employed to solve the inverse problem, and a regularization term based on the generalized Gaussian Markov random field (GGMRF) model is used to overcome the ill-posedness of the problem. Simulation results show that the proposed reconstruction scheme performs efficiently and accurately.
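A toy sketch of such a regularized reconstruction loop using SciPy's SLSQP (an SQP implementation): a linear forward model stands in for the TD-RTE solver and a simple roughness penalty stands in for the GGMRF regularizer. All names and values are invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy linear forward model y = F @ mu, standing in for the TD-RTE solver.
n_meas, n_par = 40, 10
F = rng.normal(size=(n_meas, n_par))
mu_true = np.linspace(0.5, 1.5, n_par)              # "optical parameters"
y_obs = F @ mu_true + 0.01 * rng.normal(size=n_meas)

lam = 1e-2                                          # regularization weight

def objective(mu):
    misfit = F @ mu - y_obs                         # data-fit term
    rough = np.diff(mu)                             # smoothness penalty,
    return misfit @ misfit + lam * (rough @ rough)  # stand-in for GGMRF

res = minimize(objective, x0=np.full(n_par, 1.0), method="SLSQP",
               bounds=[(0.1, 5.0)] * n_par)         # keep parameters physical
print(res.success, np.round(res.x, 2))
```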
ERIC Educational Resources Information Center
District of Columbia Public Schools, Washington, DC.
Designed for use by vendors, this guide provides an overview of the objectives for the 5-year computer literacy program to be implemented in the District of Columbia Public Schools; outlines requirements which are mandatory elements of vendors' bids unless explicitly designated "desirable"; and details specifications for computing…
The Coded Schoolhouse: One-to-One Tablet Computer Programs and Urban Education
ERIC Educational Resources Information Center
Crooks, Roderic N.
2016-01-01
Using a South Los Angeles charter school of approximately 650 students operated by a non-profit charter management organization (CMO) as the primary field site, this two-year, ethnographic research project examines the implementation of a one-to-one tablet computer program in a public high school. This dissertation examines the variety of ways…
Imagine, Invent, Program, Share: A Library-Hosted Computer Club Promotes 21st Century Skills
ERIC Educational Resources Information Center
Myers, Brian
2009-01-01
During at least one afternoon each month, Wilmette (Illinois) Public Library (WPL) hosts a local group of computer programmers, designers, and artists, who meet to discuss digital projects and resources, technical challenges, and successful design or programming strategies. WPL's Game Design Club, now in its third year, owes its existence to a…
What do computer scientists tweet? Analyzing the link-sharing practice on Twitter.
Schmitt, Marco; Jäschke, Robert
2017-01-01
Twitter communication has permeated every sphere of society. To highlight and share small pieces of information with possibly vast audiences or small circles of the interested has some value in almost any aspect of social life. But what is the value exactly for a scientific field? We perform a comprehensive study of computer scientists using Twitter and their tweeting behavior concerning the sharing of web links. Discerning the domains, hosts and individual web pages being tweeted and the differences between computer scientists and a Twitter sample enables us to look in depth at the Twitter-based information sharing practices of a scientific community. Additionally, we aim at providing a deeper understanding of the role and impact of altmetrics in computer science and give a glance at the publications mentioned on Twitter that are most relevant for the computer science community. Our results show a link sharing culture that concentrates more heavily on public and professional quality information than the Twitter sample does. The results also show a broad variety in linked sources and especially in linked publications, with some publications clearly related to community-specific interests of computer scientists, while others relate strongly to attention mechanisms in social media. This refers to the observation that Twitter is a hybrid form of social media between an information service and a social network service. Overall the computer scientists' style of usage seems to be more on the information-oriented side and to some degree also on professional usage. Therefore, altmetrics are of considerable use in analyzing computer science.
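The host-level part of such an analysis reduces to parsing links out of tweet text and counting their network locations; a minimal Python sketch follows. The tweets and URLs below are made up for illustration.

```python
from collections import Counter
from urllib.parse import urlparse

tweets = [
    "Great read: https://arxiv.org/abs/1234.5678",
    "Slides at http://www.example.edu/talk.pdf and code on https://github.com/user/repo",
    "https://github.com/another/project looks useful",
]

def shared_hosts(texts):
    """Count the hosts of all http(s) links appearing in a list of tweets."""
    hosts = Counter()
    for text in texts:
        for token in text.split():
            if token.startswith(("http://", "https://")):
                hosts[urlparse(token).netloc.lower()] += 1
    return hosts

print(shared_hosts(tweets).most_common())
# [('github.com', 2), ('arxiv.org', 1), ('www.example.edu', 1)]
```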
What do computer scientists tweet? Analyzing the link-sharing practice on Twitter
Schmitt, Marco
2017-01-01
Twitter communication has permeated every sphere of society. To highlight and share small pieces of information with possibly vast audiences or small circles of the interested has some value in almost any aspect of social life. But what is the value exactly for a scientific field? We perform a comprehensive study of computer scientists using Twitter and their tweeting behavior concerning the sharing of web links. Discerning the domains, hosts and individual web pages being tweeted and the differences between computer scientists and a Twitter sample enables us to look in depth at the Twitter-based information sharing practices of a scientific community. Additionally, we aim at providing a deeper understanding of the role and impact of altmetrics in computer science and give a glance at the publications mentioned on Twitter that are most relevant for the computer science community. Our results show a link sharing culture that concentrates more heavily on public and professional quality information than the Twitter sample does. The results also show a broad variety in linked sources and especially in linked publications, with some publications clearly related to community-specific interests of computer scientists, while others relate strongly to attention mechanisms in social media. This refers to the observation that Twitter is a hybrid form of social media between an information service and a social network service. Overall the computer scientists’ style of usage seems to be more on the information-oriented side and to some degree also on professional usage. Therefore, altmetrics are of considerable use in analyzing computer science. PMID:28636619
Scott, Kristin M; Barbarin, Oscar A; Brown, Jeffrey M
2013-01-01
This study examines the relations of higher order (i.e., abstract) thinking (HOT) skills to specific domains of social competence in Black boys (n = 108) attending publicly sponsored prekindergarten (pre-K) programs. Data for the study were collected as part of the National Center for Early Development and Learning (NCEDL) Multi-State Study, a national, longitudinal study examining the quality and outcomes in a representative sample of publicly sponsored pre-K programs in six states (N = 240). Pre-K and kindergarten teachers rated randomly selected children on measures of abstract thinking, self-regulation, and social functioning at the beginning and end of each school year. Applying structural equation modeling, compared with earlier time points, HOT measured in the fall of kindergarten significantly predicted each of the domains of social competence in the spring of kindergarten, with the exception of peer social skills, while controlling for general cognitive ability. Results suggest that early intervention to improve HOT may be an effective and more focused approach to address concerns about Black boys' early social competencies in specific domains and potentially reduce the risk of later social difficulties. © 2013 American Orthopsychiatric Association.
Heo, Moonseong; Irvin, Erica; Ostrovsky, Natania; Isasi, Carmen; Hayes, Shawn; Wylie-Rosett, Judith
2015-01-01
BACKGROUND HealthCorps provides school wellness programming using curricula to promote changes in nutrition, mental health and physical activity behaviors. The research objective was to evaluate effects of implementing its curricula on nutrition, mental health and physical activity knowledge and behavior. METHODS Pre- and post-survey data were collected (N = 2255) during the 2012-13 academic year from 14 New York City public high schools. An 18-item knowledge questionnaire addressed 3 domains; 26 behavioral items were analyzed by factor analysis to identify 6 behavior domains, breakfast being a seventh one-item domain. We examined the effects stratified by sex, applying mixed-effects models to take into account clustering effects of schools and participants adjusted for age. RESULTS The HealthCorps program significantly increased all 3 knowledge domains (p < .05), and significantly changed several key behavioral domains. Boys significantly increased fruits/vegetables intake (p = .03). Girls increased acceptance of new fruits/vegetables (p = .03) and breakfast consumption (p = .04), and decreased sugar-sweetened beverages and energy dense food intake (p = .03). The associations between knowledge and behavior were stronger in boys than girls. CONCLUSION The HealthCorps program significantly increased participants’ knowledge on nutrition, mental health and physical activity. It also improved several key behavioral domains, which are targets of the 2010 Dietary Guidelines to address obesity in youth. PMID:26762819
Heo, Moonseong; Irvin, Erica; Ostrovsky, Natania; Isasi, Carmen; Blank, Arthur E; Lounsbury, David W; Fredericks, Lynn; Yom, Tiana; Ginsberg, Mindy; Hayes, Shawn; Wylie-Rosett, Judith
2016-02-01
HealthCorps provides school wellness programming using curricula to promote changes in nutrition, mental health, and physical activity behaviors. The research objective was to evaluate effects of implementing its curricula on nutrition, mental health, and physical activity knowledge and behavior. Pre- and postsurvey data were collected (N = 2255) during the 2012-2013 academic year from 14 New York City public high schools. An 18-item knowledge questionnaire addressed 3 domains; 26 behavioral items were analyzed by factor analysis to identify 6 behavior domains, breakfast being a seventh 1-item domain. We examined the effects stratified by sex, applying mixed-effects models to take into account clustering effects of schools and participants adjusted for age. The HealthCorps program significantly increased all 3 knowledge domains (p < .05), and significantly changed several key behavioral domains. Boys significantly increased fruits/vegetables intake (p = .03). Girls increased acceptance of new fruits/vegetables (p = .03) and breakfast consumption (p = .04), and decreased sugar-sweetened beverages and energy dense food intake (p = .03). The associations between knowledge and behavior were stronger in boys than girls. The HealthCorps program significantly increased participants' knowledge on nutrition, mental health, and physical activity. It also improved several key behavioral domains, which are targets of the 2010 Dietary Guidelines to address obesity in youth. © 2016, American School Health Association.
An interactive program for computer-aided map design, display, and query: EMAPKGS2
Pouch, G.W.
1997-01-01
EMAPKGS2 is a user-friendly, PC-based electronic mapping tool for use in hydrogeologic exploration and appraisal. EMAPKGS2 allows the analyst to construct maps interactively from data stored in a relational database, perform point-oriented spatial queries such as locating all wells within a specified radius, perform geographic overlays, and export the data to other programs for further analysis. EMAPKGS2 runs under Microsoft Windows 3.1 and compatible operating systems. EMAPKGS2 is a public domain program available from the Kansas Geological Survey. EMAPKGS2 is the centerpiece of WHEAT, the Windows-based Hydrogeologic Exploration and Appraisal Toolkit, a suite of user-friendly Microsoft Windows programs for natural resource exploration and management. The principal goals in development of WHEAT have been ease of use, hardware independence, low cost, and end-user extensibility. WHEAT's native data format is a Microsoft Access database. WHEAT stores a feature's geographic coordinates as attributes so they can be accessed easily by the user. The WHEAT programs are designed to be used in conjunction with other Microsoft Windows software to allow the natural resource scientist to perform work easily and effectively. WHEAT and EMAPKGS have been used at several of Kansas' Groundwater Management Districts and the Kansas Geological Survey on groundwater management operations, groundwater modeling projects, and geologic exploration projects. © 1997 Elsevier Science Ltd.
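The point-oriented radius query described above amounts to a great-circle distance filter over well coordinates. A minimal Python sketch, with hypothetical well IDs and coordinates:

```python
import math

wells = {  # hypothetical site id -> (longitude, latitude), degrees
    "W-101": (-98.51, 38.02),
    "W-102": (-98.47, 38.05),
    "W-200": (-97.90, 38.60),
}

def wells_within_radius(center, radius_km, sites):
    """Return ids of wells within radius_km of center (lon, lat),
    using the haversine great-circle distance."""
    lon0, lat0 = map(math.radians, center)
    hits = []
    for site_id, (lon, lat) in sites.items():
        lon1, lat1 = math.radians(lon), math.radians(lat)
        a = (math.sin((lat1 - lat0) / 2) ** 2
             + math.cos(lat0) * math.cos(lat1)
               * math.sin((lon1 - lon0) / 2) ** 2)
        d = 2 * 6371.0 * math.asin(math.sqrt(a))   # Earth radius ~6371 km
        if d <= radius_km:
            hits.append(site_id)
    return hits

print(wells_within_radius((-98.50, 38.03), 5.0, wells))  # ['W-101', 'W-102']
```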
An Ada Based Expert System for the Ada Version of SAtool II. Volume 1 and 2
1991-06-06
Integrated Computer-Aided Manufacturing (ICAM) (20). In fact, IDEF0 stands for ICAM Definition Method Zero. IDEF0 defines a subset of SA that omits…reasoning that has been programmed). An expert's knowledge is specific to one problem domain, as opposed to knowledge about general problem-solving…techniques. General problem domains are medicine, finance, science or engineering, and so forth, in which an expert can solve specific problems very well
A Roadmap for caGrid, an Enterprise Grid Architecture for Biomedical Research
Saltz, Joel; Hastings, Shannon; Langella, Stephen; Oster, Scott; Kurc, Tahsin; Payne, Philip; Ferreira, Renato; Plale, Beth; Goble, Carole; Ervin, David; Sharma, Ashish; Pan, Tony; Permar, Justin; Brezany, Peter; Siebenlist, Frank; Madduri, Ravi; Foster, Ian; Shanbhag, Krishnakant; Mead, Charlie; Hong, Neil Chue
2012-01-01
caGrid is a middleware system which combines the Grid computing, the service oriented architecture, and the model driven architecture paradigms to support development of interoperable data and analytical resources and federation of such resources in a Grid environment. The functionality provided by caGrid is an essential and integral component of the cancer Biomedical Informatics Grid (caBIG™) program. This program is established by the National Cancer Institute as a nationwide effort to develop enabling informatics technologies for collaborative, multi-institutional biomedical research with the overarching goal of accelerating translational cancer research. Although the main application domain for caGrid is cancer research, the infrastructure provides a generic framework that can be employed in other biomedical research and healthcare domains. The development of caGrid is an ongoing effort, adding new functionality and improvements based on feedback and use cases from the community. This paper provides an overview of potential future architecture and tooling directions and areas of improvement for caGrid and caGrid-like systems. This summary is based on discussions at a roadmap workshop held in February with participants from biomedical research, Grid computing, and high performance computing communities. PMID:18560123
A roadmap for caGrid, an enterprise Grid architecture for biomedical research.
Saltz, Joel; Hastings, Shannon; Langella, Stephen; Oster, Scott; Kurc, Tahsin; Payne, Philip; Ferreira, Renato; Plale, Beth; Goble, Carole; Ervin, David; Sharma, Ashish; Pan, Tony; Permar, Justin; Brezany, Peter; Siebenlist, Frank; Madduri, Ravi; Foster, Ian; Shanbhag, Krishnakant; Mead, Charlie; Chue Hong, Neil
2008-01-01
caGrid is a middleware system which combines the Grid computing, the service oriented architecture, and the model driven architecture paradigms to support development of interoperable data and analytical resources and federation of such resources in a Grid environment. The functionality provided by caGrid is an essential and integral component of the cancer Biomedical Informatics Grid (caBIG) program. This program is established by the National Cancer Institute as a nationwide effort to develop enabling informatics technologies for collaborative, multi-institutional biomedical research with the overarching goal of accelerating translational cancer research. Although the main application domain for caGrid is cancer research, the infrastructure provides a generic framework that can be employed in other biomedical research and healthcare domains. The development of caGrid is an ongoing effort, adding new functionality and improvements based on feedback and use cases from the community. This paper provides an overview of potential future architecture and tooling directions and areas of improvement for caGrid and caGrid-like systems. This summary is based on discussions at a roadmap workshop held in February with participants from biomedical research, Grid computing, and high performance computing communities.
Clarke, M G; Kennedy, K P; MacDonagh, R P
2009-01-01
To develop a clinical prediction model enabling the calculation of an individual patient's life expectancy (LE) and survival probability based on age, sex, and comorbidity for use in the joint decision-making process regarding medical treatment. A computer software program was developed with a team of 3 clinicians, 2 professional actuaries, and 2 professional computer programmers. This incorporated statistical spreadsheet and database access design methods. Data sources included life insurance industry actuarial rating factor tables (public and private domain), Government Actuary Department UK life tables, professional actuarial sources, and evidence-based medical literature. The main outcome measures were numerical and graphical display of comorbidity-adjusted LE; 5-, 10-, and 15-year survival probability; in addition to generic UK population LE. Nineteen medical conditions, which impacted significantly on LE in actuarial terms and were commonly encountered in clinical practice, were incorporated in the final model. Numerical and graphical representations of statistical predictions of LE and survival probability were successfully generated for patients with either no comorbidity or a combination of the 19 medical conditions included. Validation and testing, including actuarial peer review, confirmed consistency with the data sources utilized. The evidence-based actuarial data utilized in this computer program design represent a valuable resource for use in the clinical decision-making process, where an accurate objective assessment of patient LE can so often make the difference between patients being offered or denied medical and surgical treatment. Ongoing development to incorporate additional comorbidities and enable Web-based access will enhance its use further.
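The underlying actuarial calculation, a curtate life expectancy derived from a table of one-year death probabilities, can be sketched briefly. The toy mortality table and the single hazard multiplier below are illustrative stand-ins; the program described above uses proprietary actuarial rating-factor tables, not this simplification.

```python
def life_expectancy(qx, age, hazard_multiplier=1.0):
    """Curtate life expectancy at `age` from a table of one-year death
    probabilities qx (indexed by age): sum of cumulative survival
    probabilities. hazard_multiplier > 1 crudely models extra mortality
    from comorbidity (illustrative only)."""
    survival, expect = 1.0, 0.0
    for a in range(age, len(qx)):
        q = min(1.0, qx[a] * hazard_multiplier)
        survival *= 1.0 - q
        expect += survival
    return expect

# Hypothetical toy table: mortality roughly doubling every nine years.
qx = [min(1.0, 0.0001 * 2 ** (a / 9)) for a in range(121)]
print(round(life_expectancy(qx, 65), 1))        # baseline LE at age 65
print(round(life_expectancy(qx, 65, 2.0), 1))   # with doubled hazard
```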
Merritt, M.L.
1977-01-01
A computerized index of water-data collection activities, and retrieval software to generate a publication list of this information, was developed for Florida. This system serves a vital need in the administration of the many and diverse water-data collection activities. Previously, the needed data were very difficult to assemble for use in program planning or project implementation. Largely descriptive, the report tells how a file of computer card images has been established which contains entries for all sites in Florida at which there is currently a water-data-collection activity. Entries include information such as identification number, station name, location, type of site, county, information about data collection, funding, and other pertinent details. The computer program FINDEX selectively retrieves entries and lists them in a format suitable for publication. Updating the index is done routinely. (Woodard-USGS)
Gps-Denied Geo-Localisation Using Visual Odometry
NASA Astrophysics Data System (ADS)
Gupta, Ashish; Chang, Huan; Yilmaz, Alper
2016-06-01
The primary method for geo-localization is based on GPS which has issues of localization accuracy, power consumption, and unavailability. This paper proposes a novel approach to geo-localization in a GPS-denied environment for a mobile platform. Our approach has two principal components: public domain transport network data available in GIS databases or OpenStreetMap; and a trajectory of a mobile platform. This trajectory is estimated using visual odometry and 3D view geometry. The transport map information is abstracted as a graph data structure, where various types of roads are modelled as graph edges and typically intersections are modelled as graph nodes. A search for the trajectory in real time in the graph yields the geo-location of the mobile platform. Our approach uses a simple visual sensor and it has a low memory and computational footprint. In this paper, we demonstrate our method for trajectory estimation and provide examples of geolocalization using public-domain map data. With the rapid proliferation of visual sensors as part of automated driving technology and continuous growth in public domain map data, our approach has the potential to completely augment, or even supplant, GPS based navigation since it functions in all environments.
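A minimal sketch of the trajectory-to-graph matching idea: abstract the road network as a graph, summarize the odometry-derived trajectory as a sequence of turn angles, and search the graph for paths whose turns match. The map, coordinates, and observed sequence below are invented, and this toy search omits the scale and noise handling a real system needs.

```python
import math

nodes = {"A": (0, 0), "B": (1, 0), "C": (2, 0), "D": (1, 1), "E": (2, 1)}
edges = {"A": ["B"], "B": ["A", "C", "D"], "C": ["B", "E"],
         "D": ["B", "E"], "E": ["C", "D"]}

def heading(u, v):
    (x0, y0), (x1, y1) = nodes[u], nodes[v]
    return math.atan2(y1 - y0, x1 - x0)

def turns(path):
    """Signed turn angles (degrees) at each interior node of a path."""
    out = []
    for a, b, c in zip(path, path[1:], path[2:]):
        d = math.degrees(heading(b, c) - heading(a, b))
        out.append((d + 180) % 360 - 180)   # wrap to (-180, 180]
    return out

def match(observed, tol=20.0):
    """Depth-first search for node paths whose turn sequence matches
    the visually estimated one within tol degrees."""
    hits = []
    def extend(path):
        k = len(path) - 2                   # number of turns so far
        if k > 0 and abs(turns(path)[-1] - observed[k - 1]) > tol:
            return
        if k == len(observed):
            hits.append(path)
            return
        for nxt in edges[path[-1]]:
            if nxt != path[-2]:             # no immediate backtracking
                extend(path + [nxt])
    for u in nodes:
        for v in edges[u]:
            extend([u, v])
    return hits

print(match([90.0, -90.0]))                 # e.g. [['A', 'B', 'D', 'E']]
```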
2012-12-01
[Report documentation fragment; glossary: SIMD — single instruction, multiple datastream parallel computing; Scala — a byte-compiled programming language featuring dynamic type…; title fragment "…Specific Languages"; contract number FA8750-10-1-0191; program element 61101E; author Armando Fox.] …application performance, but usually must rely on efficiency programmers who are experts in explicit parallel programming to achieve it. Since such efficiency…
David N. Bengston; David P. Fan
2002-01-01
Analyzes trends in favorable and unfavorable attitudes toward the Recreational Fee Demonstration Program (RFDP) in the national forests, updating an earlier study using computer content analysis of the public debate. About 65 percent of the attitudes toward the RFDP were favorable, comparable to the findings of survey research.
WhAEM2000 is a computer program that solves steady-state ground-water flow and advective streamlines in homogeneous, single-layer aquifers. The program was designed for capture zone delineation in support of protection of the source water area surrounding public water supply well...
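For a sense of scale, the classical single-well capture-zone dimensions for a well in uniform regional flow (after Javandel and Tsang, 1986) can be computed directly. WhAEM2000's analytic-element method is far more general than this; the numbers below are hypothetical.

```python
import math

def capture_zone(Q, K, i, b):
    """Capture-zone dimensions for one well pumping Q in uniform flow.
      Q : pumping rate [m^3/d]      K : hydraulic conductivity [m/d]
      i : hydraulic gradient [-]    b : aquifer thickness [m]"""
    U = K * i                                    # regional Darcy flux [m/d]
    return {
        "stagnation_point_m": Q / (2 * math.pi * b * U),  # downgradient
        "width_at_well_m":    Q / (2 * b * U),            # total width
        "max_width_m":        Q / (b * U),                # far upgradient
    }

# Hypothetical inputs: 500 m^3/d well, K = 10 m/d, i = 0.001, b = 20 m.
for name, val in capture_zone(500, 10, 0.001, 20).items():
    print(f"{name}: {val:,.0f}")   # ~398 m, 1,250 m, 2,500 m
```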
The Layer-Oriented Approach to Declarative Languages for Biological Modeling
Raikov, Ivan; De Schutter, Erik
2012-01-01
We present a new approach to modeling languages for computational biology, which we call the layer-oriented approach. The approach stems from the observation that many diverse biological phenomena are described using a small set of mathematical formalisms (e.g. differential equations), while at the same time different domains and subdomains of computational biology require that models are structured according to the accepted terminology and classification of that domain. Our approach uses distinct semantic layers to represent the domain-specific biological concepts and the underlying mathematical formalisms. Additional functionality can be transparently added to the language by adding more layers. This approach is specifically concerned with declarative languages, and throughout the paper we note some of the limitations inherent to declarative approaches. The layer-oriented approach is a way to specify explicitly how high-level biological modeling concepts are mapped to a computational representation, while abstracting away details of particular programming languages and simulation environments. To illustrate this process, we define an example language for describing models of ionic currents, and use a general mathematical notation for semantic transformations to show how to generate model simulation code for various simulation environments. We use the example language to describe a Purkinje neuron model and demonstrate how the layer-oriented approach can be used for solving several practical issues of computational neuroscience model development. We discuss the advantages and limitations of the approach in comparison with other modeling language efforts in the domain of computational biology and outline some principles for extensible, flexible modeling language design. We conclude by describing in detail the semantic transformations defined for our language. PMID:22615554
The layer-oriented approach to declarative languages for biological modeling.
Raikov, Ivan; De Schutter, Erik
2012-01-01
We present a new approach to modeling languages for computational biology, which we call the layer-oriented approach. The approach stems from the observation that many diverse biological phenomena are described using a small set of mathematical formalisms (e.g. differential equations), while at the same time different domains and subdomains of computational biology require that models are structured according to the accepted terminology and classification of that domain. Our approach uses distinct semantic layers to represent the domain-specific biological concepts and the underlying mathematical formalisms. Additional functionality can be transparently added to the language by adding more layers. This approach is specifically concerned with declarative languages, and throughout the paper we note some of the limitations inherent to declarative approaches. The layer-oriented approach is a way to specify explicitly how high-level biological modeling concepts are mapped to a computational representation, while abstracting away details of particular programming languages and simulation environments. To illustrate this process, we define an example language for describing models of ionic currents, and use a general mathematical notation for semantic transformations to show how to generate model simulation code for various simulation environments. We use the example language to describe a Purkinje neuron model and demonstrate how the layer-oriented approach can be used for solving several practical issues of computational neuroscience model development. We discuss the advantages and limitations of the approach in comparison with other modeling language efforts in the domain of computational biology and outline some principles for extensible, flexible modeling language design. We conclude by describing in detail the semantic transformations defined for our language.
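A minimal Python sketch of the layering idea: a declarative, domain-level description of an ionic current (using the standard Hodgkin-Huxley potassium-channel rate functions) is mapped by a small "compiler" to executable update rules. This only illustrates the concept; the paper's language and semantic transformations are far more elaborate.

```python
import math

# Domain layer: the biological concept, stated declaratively.
K_CURRENT = {
    "kind": "ionic-current", "name": "K_dr",
    "gbar": 36.0, "e_rev": -77.0, "gate_power": 4,
    # Math layer: standard HH rate functions the gate variable obeys.
    "alpha": lambda v: 0.01 * (v + 55) / (1 - math.exp(-(v + 55) / 10)),
    "beta":  lambda v: 0.125 * math.exp(-(v + 65) / 80),
}

def compile_current(spec):
    """Map the declarative spec to simulation code: closures implementing
    dn/dt = alpha*(1-n) - beta*n  and  I = gbar * n^p * (V - E)."""
    a, b, p = spec["alpha"], spec["beta"], spec["gate_power"]
    def step(n, v, dt):
        return n + dt * (a(v) * (1 - n) - b(v) * n)   # forward Euler
    def current(n, v):
        return spec["gbar"] * n ** p * (v - spec["e_rev"])
    return step, current

step, current = compile_current(K_CURRENT)
n, v, dt = 0.32, -40.0, 0.01          # voltage clamp at -40 mV
for _ in range(1000):                  # 10 ms of gate relaxation
    n = step(n, v, dt)
print(round(n, 3), round(current(n, v), 2))
```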
Synfuel program analysis. Volume 2: VENVAL users manual
NASA Astrophysics Data System (ADS)
Muddiman, J. B.; Whelan, J. W.
1980-07-01
This volume is intended for program analysts and is a users manual for the VENVAL model. It contains specific explanations as to input data requirements and programming procedures for the use of this model. VENVAL is a generalized computer program to aid in evaluation of prospective private sector production ventures. The program can project interrelated values of installed capacity, production, sales revenue, operating costs, depreciation, investment, debt, earnings, taxes, return on investment, depletion, and cash flow measures. It can also compute related public sector and other external costs and revenues if unit costs are furnished.
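A toy sketch of the kind of projection such a model performs, with straight-line depreciation, taxes, and a discounted cash-flow summary. The parameter names and values are hypothetical illustrations, not VENVAL's actual inputs or logic.

```python
def project_cash_flow(capacity, price, opex_per_unit, capex, life_years,
                      tax_rate=0.35, discount=0.10):
    """Project yearly after-tax cash flow and net present value for a
    simple production venture (straight-line depreciation)."""
    dep = capex / life_years
    npv = -capex
    rows = []
    for year in range(1, life_years + 1):
        revenue = capacity * price
        ebit = revenue - capacity * opex_per_unit - dep
        taxes = max(0.0, tax_rate * ebit)
        cash = ebit - taxes + dep        # add back non-cash depreciation
        npv += cash / (1 + discount) ** year
        rows.append((year, revenue, taxes, cash))
    return rows, npv

rows, npv = project_cash_flow(capacity=1_000, price=50.0,
                              opex_per_unit=30.0, capex=80_000,
                              life_years=10)
print(f"NPV at 10%: {npv:,.0f}")
```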
On the utility of threads for data parallel programming
NASA Technical Reports Server (NTRS)
Fahringer, Thomas; Haines, Matthew; Mehrotra, Piyush
1995-01-01
Threads provide a useful programming model for asynchronous behavior because of their ability to encapsulate units of work that can then be scheduled for execution at runtime, based on the dynamic state of a system. Recently, the threaded model has been applied to the domain of data parallel scientific codes, and initial reports indicate that the threaded model can produce performance gains over non-threaded approaches, primarily through the use of overlapping useful computation with communication latency. However, overlapping computation with communication is possible without the benefit of threads if the communication system supports asynchronous primitives, and this comparison has not been made in previous papers. This paper provides a critical look at the utility of lightweight threads as applied to data parallel scientific programming.
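The overlap being compared can be sketched with ordinary threads: start a (simulated) blocking communication, do the computation that needs no remote data, then join. A minimal Python illustration, with the communication latency faked by a sleep:

```python
import queue
import threading
import time

def exchange_halo(data, out):
    """Stand-in for a blocking communication call (e.g. a boundary
    exchange); the sleep simulates network latency."""
    time.sleep(0.05)
    out.put(sum(data) / len(data))      # pretend this came from a neighbor

def compute_interior(grid):
    return [x * 0.25 for x in grid]     # work needing no remote data

grid = list(range(100_000))
result_q = queue.Queue()

t0 = time.perf_counter()
comm = threading.Thread(target=exchange_halo, args=(grid[:10], result_q))
comm.start()                            # communication proceeds...
interior = compute_interior(grid)       # ...while we compute locally
comm.join()
halo_value = result_q.get()
print(f"overlapped time: {time.perf_counter() - t0:.3f}s")
```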
Code of Federal Regulations, 2010 CFR
2010-10-01
... 45 Public Welfare 2 2010-10-01 2010-10-01 false Computing the assistance payment under... FINANCIAL ASSISTANCE PROGRAMS § 233.35 Computing the assistance payment under retrospective budgeting after... shall be computed retrospectively, i.e., shall be based on income and other relevant circumstances in...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-03
... 1021 AGENCY: Social Security Administration (SSA). ACTION: Notice of a renewal of an existing computer.... SUPPLEMENTARY INFORMATION: A. General The Computer Matching and Privacy Protection Act of 1988 (Public Law (Pub... computer matching involving the Federal government could be performed and adding certain protections for...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-15
... RAILROAD RETIREMENT BOARD Computer Matching and Privacy Protection Act of 1988; Report of Matching... Railroad Retirement Act. SUMMARY: As required by the Computer Matching and Privacy Protection Act of [[Page...: Under certain circumstances, the Computer Matching and Privacy Protection Act of 1988, Public Law 100...
Reusable design: A proposed approach to Public Health Informatics system design
2011-01-01
Background Since it was first defined in 1995, Public Health Informatics (PHI) has become a recognized discipline, with a research agenda, defined domain-specific competencies and a specialized corpus of technical knowledge. Information systems form a cornerstone of PHI research and implementation, representing significant progress for the nascent field. However, PHI does not advocate or incorporate standard, domain-appropriate design methods for implementing public health information systems. Reusable design is generalized design advice that can be reused in a range of similar contexts. We propose that PHI create and reuse information design knowledge by taking a systems approach that incorporates design methods from the disciplines of Human-Computer Interaction, Interaction Design and other related disciplines. Discussion Although PHI operates in a domain with unique characteristics, many design problems in public health correspond to classic design problems, suggesting that existing design methods and solution approaches are applicable to the design of public health information systems. Among the numerous methodological frameworks used in other disciplines, we identify scenario-based design and participatory design as two widely-employed methodologies that are appropriate for adoption as PHI standards. We make the case that these methods show promise to create reusable design knowledge in PHI. Summary We propose the formalization of a set of standard design methods within PHI that can be used to pursue a strategy of design knowledge creation and reuse for cost-effective, interoperable public health information systems. We suggest that all public health informaticians should be able to use these design methods and the methods should be incorporated into PHI training. PMID:21333000
Building a Data Science capability for USGS water research and communication
NASA Astrophysics Data System (ADS)
Appling, A.; Read, E. K.
2015-12-01
Interpreting and communicating water issues in an era of exponentially increasing information requires a blend of domain expertise, computational proficiency, and communication skills. The USGS Office of Water Information has established a Data Science team to meet these needs, providing challenging careers for diverse domain scientists and innovators in the fields of information technology and data visualization. Here, we detail the experience of building a Data Science capability as a bridging element between traditional water resources analyses and modern computing tools and data management techniques. This approach includes four major components: 1) building reusable research tools, 2) documenting data-intensive research approaches in peer reviewed journals, 3) communicating complex water resources issues with interactive web visualizations, and 4) offering training programs for our peers in scientific computing. These components collectively improve the efficiency, transparency, and reproducibility of USGS data analyses and scientific workflows.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kok Yan Chan, G.; Sclavounos, P. D.; Jonkman, J.
2015-04-02
A hydrodynamics computer module was developed for the evaluation of the linear and nonlinear loads on floating wind turbines using a new fluid-impulse formulation for coupling with the FAST program. The recently developed formulation allows the computation of linear and nonlinear loads on floating bodies in the time domain and avoids the computationally intensive evaluation of temporal and nonlinear free-surface problems; efficient methods are derived for its computation. The body's instantaneous wetted surface is approximated by a panel mesh, and the discretization of the free surface is circumvented by using the Green function. The evaluation of the nonlinear loads is based on explicit expressions derived by the fluid-impulse theory, which can be computed efficiently. Computations are presented of the linear and nonlinear loads on the MIT/NREL tension-leg platform. Comparisons were carried out with frequency-domain linear and second-order methods. Emphasis was placed on modeling accuracy of the magnitude of nonlinear low- and high-frequency wave loads in a sea state. Although fluid-impulse theory is applied to floating wind turbines in this paper, the theory is applicable to other offshore platforms as well.
Economic impact of large public programs: The NASA experience
NASA Technical Reports Server (NTRS)
Ginzburg, E.; Kuhn, J. W.; Schnee, J.; Yavitz, B.
1976-01-01
The economic impact of NASA programs on weather forecasting and the computer and semiconductor industries is discussed. Contributions to the advancement of the science of astronomy are also considered.
Expert Systems for Libraries at SCIL [Small Computers in Libraries]'88.
ERIC Educational Resources Information Center
Kochtanek, Thomas R.; And Others
1988-01-01
Six brief papers on expert systems for libraries cover (1) a knowledge-based approach to database design; (2) getting started in expert systems; (3) using public domain software to develop a business reference system; (4) a music cataloging inquiry system; (5) linguistic analysis of reference transactions; and (6) a model of a reference librarian.…
NASA Technical Standards Program
NASA Technical Reports Server (NTRS)
Gill, Paul S.; Vaughan, William W.; Parker, Nelson C. (Technical Monitor)
2002-01-01
The NASA Technical Standards Program was officially established in 1997 as the result of a directive issued by the Administrator. It is responsible for Agency-wide technical standards development, adoption (endorsement), and conversion of Center-unique standards for Agency-wide use. One major element of the Program is the review of NASA technical standards products and their replacement with non-Government Voluntary Consensus Standards in accordance with directions issued by the Office of Management and Budget. As part of the Program's function, it developed a NASA Integrated Technical Standards Initiative that consists of an Agency-wide full-text system, a standards update notification system, and a lessons learned-standards integration system. The Program maintains a 'one-stop shop' Website for technical standards and related information on aerospace materials, etc. This paper provides information on the development, current status, and plans for the NASA Technical Standards Program along with metrics on the utility of the products provided to both users within the nasa.gov Domain and the Public Domain.
NASA Technical Standards Program
NASA Technical Reports Server (NTRS)
Gill, Paul S.; Vaughan, WIlliam W.
2003-01-01
The NASA Technical Standards Program was officially established in 1997 as the result of a directive issued by the Administrator. It is responsible for Agency-wide technical standards development, adoption (endorsement), and conversion of Center-unique standards for Agency-wide use. One major element of the Program is the review of NASA technical standards products and their replacement with non-Government Voluntary Consensus Standards in accordance with directions issued by the Office of Management and Budget. As part of the Program's function, it developed a NASA Integrated Technical Standards Initiative that consists of an Agency-wide full-text system, a standards update notification system, and a lessons learned-standards integration system. The Program maintains a "one-stop shop" Website for technical standards and related information on aerospace materials, etc. This paper provides information on the development, current status, and plans for the NASA Technical Standards Program along with metrics on the utility of the products provided to both users within the nasa.gov Domain and the Public Domain.
Khomtchouk, Bohdan B; Weitz, Edmund; Karp, Peter D; Wahlestedt, Claes
2018-01-01
We present a rationale for expanding the presence of the Lisp family of programming languages in bioinformatics and computational biology research. Put simply, Lisp-family languages enable programmers to more quickly write programs that run faster than in other languages. Languages such as Common Lisp, Scheme and Clojure facilitate the creation of powerful and flexible software that is required for complex and rapidly evolving domains like biology. We will point out several important key features that distinguish languages of the Lisp family from other programming languages, and we will explain how these features can aid researchers in becoming more productive and creating better code. We will also show how these features make these languages ideal tools for artificial intelligence and machine learning applications. We will specifically stress the advantages of domain-specific languages (DSLs): languages that are specialized to a particular area, and thus not only facilitate easier research problem formulation, but also aid in the establishment of standards and best programming practices as applied to the specific research field at hand. DSLs are particularly easy to build in Common Lisp, the most comprehensive Lisp dialect, which is commonly referred to as the ‘programmable programming language’. We are convinced that Lisp grants programmers unprecedented power to build increasingly sophisticated artificial intelligence systems that may ultimately transform machine learning and artificial intelligence research in bioinformatics and computational biology. PMID:28040748
Khomtchouk, Bohdan B; Weitz, Edmund; Karp, Peter D; Wahlestedt, Claes
2018-05-01
We present a rationale for expanding the presence of the Lisp family of programming languages in bioinformatics and computational biology research. Put simply, Lisp-family languages enable programmers to more quickly write programs that run faster than in other languages. Languages such as Common Lisp, Scheme and Clojure facilitate the creation of powerful and flexible software that is required for complex and rapidly evolving domains like biology. We will point out several important key features that distinguish languages of the Lisp family from other programming languages, and we will explain how these features can aid researchers in becoming more productive and creating better code. We will also show how these features make these languages ideal tools for artificial intelligence and machine learning applications. We will specifically stress the advantages of domain-specific languages (DSLs): languages that are specialized to a particular area, and thus not only facilitate easier research problem formulation, but also aid in the establishment of standards and best programming practices as applied to the specific research field at hand. DSLs are particularly easy to build in Common Lisp, the most comprehensive Lisp dialect, which is commonly referred to as the 'programmable programming language'. We are convinced that Lisp grants programmers unprecedented power to build increasingly sophisticated artificial intelligence systems that may ultimately transform machine learning and artificial intelligence research in bioinformatics and computational biology.
A general numerical analysis program for the superconducting quasiparticle mixer
NASA Technical Reports Server (NTRS)
Hicks, R. G.; Feldman, M. J.; Kerr, A. R.
1986-01-01
A user-oriented computer program SISCAP (SIS Computer Analysis Program) for analyzing SIS mixers is described. The program allows arbitrary impedance terminations to be specified at all LO harmonics and sideband frequencies. It is therefore able to treat a much more general class of SIS mixers than the widely used three-frequency analysis, for which the harmonics are assumed to be short-circuited. An additional program, GETCHI, provides the necessary input data to program SISCAP. The SISCAP program performs a nonlinear analysis to determine the SIS junction voltage waveform produced by the local oscillator. The quantum theory of mixing is used in its most general form, treating the large signal properties of the mixer in the time domain. A small signal linear analysis is then used to find the conversion loss and port impedances. The noise analysis includes thermal noise from the termination resistances and shot noise from the periodic LO current. Quantum noise is not considered. Many aspects of the program have been adequately verified and found accurate.
Peterson, H.V.; Melin, K.R.
1979-01-01
The passage of the Taylor Grazing Act in 1934 marked the end of an era in the land policies of the United States in that disposal of the public lands by homesteading was terminated except under rigidly prescribed procedures, and the remaining public lands covering about 175 million acres in the western conterminous states were brought under regulatory authority for grazing use. In 1934 the lands were mostly in a severe state of deterioration as a result of overgrazing and drought. In addition to reducing numbers of livestock using the lands, successive programs of conservation practices were established, of which the Soil and Moisture Conservation Program of the Department of the Interior is of particular interest here. The services of the Geological Survey, in an investigational and advisory capacity, were enlisted in this program. The work of the Geological Survey has consisted of the collection of hydrologic data and investigations of range-water supplies to facilitate management and provide information for the design of structures and land-treatment measures. Appraisal of the effects of treatment practices has also been an important activity. Conservation on the public domain involves mainly growing vegetation for forage and reducing erosion. The two elements are intimately related--accomplishment in one is usually reflected by an improvement in the other. Erosion is a serious problem on most of the public domain, but particularly in the Colorado River and Rio Grande basins where, despite low annual water yields, the public domain and similar lands on the Indian reservations contribute the major part of the sediment measured at the downstream gaging stations. In parts of the Missouri River basin also, erosion is obviously very active, but the sediment yield contributed by the public domain cannot be as readily isolated. The reasons for the erosion are generally evident--the erodibility of the rock and soils and the sparsity of vegetation as a result of low precipitation, unfavorable soils, or past land use. How much is due to land use is still controversial, resulting in many questions relative to planning corrective measures. The problem facing the early administrators of the Taylor Grazing Act, that of bringing about proper use and conservation of the public domain, was a difficult one because of the lack of records on actual grazing use, in animal-unit months, by the qualified allottees and the lack of data on treatment practices in an arid area. Reduction of grazing was imperative in some localities, but generally, it could not be brought about as rapidly as it should have been. Numbers of animal units in the grazing districts were reduced from about 3.6 million in 1941 to about 3.2 million in 1964, whereas the area included in districts increased by about 3 percent. Reductions are still being made in certain areas where deterioration is evident. One of the earliest activities connected with management of the range was the development of water supplies to facilitate the distribution of grazing. The investigations needed for such development formed a large part of the early work in the Soil and Moisture program of the Geological Survey and have continued to be a major activity to the present time. Most of the work has involved investigations of sites for wells but has also included the investigation of proposed spring developments and the collection of hydrologic data for use in reservoir design.
Well-site investigations have been of two general types: (1) the investigation of a site selected by the land administration agency, and (2) an areal investigation covering entire grazing districts or units thereof. In each type of investigation, a study is made of the geology and the recharge conditions. Reports are prepared giving estimates of the depth of drilling required, the depth to water, the yield, and the quality of the water, together with other information on drilling conditions and development. Springs are a significant so
Kratzer, Markus; Lasnik, Michael; Röhrig, Sören; Teichert, Christian; Deluca, Marco
2018-01-11
Lead zirconate titanate (PZT) is one of the prominent materials used in polycrystalline piezoelectric devices. Since the ferroelectric domain orientation is the most important parameter affecting the electromechanical performance, analyzing the domain orientation distribution is of great importance for the development and understanding of improved piezoceramic devices. Here, vector piezoresponse force microscopy (vector-PFM) has been applied in order to reconstruct the ferroelectric domain orientation distribution function of polished sections of device-ready polycrystalline lead zirconate titanate (PZT) material. A measurement procedure and a computer program based on the software Mathematica have been developed to automatically evaluate the vector-PFM data for reconstructing the domain orientation function. The method is tested on differently in-plane and out-of-plane poled PZT samples, and the results reveal the expected domain patterns and allow determination of the polarization orientation distribution function at high accuracy.
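To illustrate the reconstruction step described above, a minimal sketch follows, written in Python rather than the authors' Mathematica implementation; the three channel names and the shared-calibration assumption are hypothetical, not taken from the paper.

    import numpy as np

    def polarization_angles(v_oop, v_ip_x, v_ip_y):
        """Estimate the polar and azimuthal angles of the local polarization
        vector from one out-of-plane and two orthogonal in-plane PFM signal
        components (arbitrary units). Assumes all three channels share a
        common calibration factor, which is an idealization."""
        azimuth = np.arctan2(v_ip_y, v_ip_x)       # in-plane direction
        in_plane = np.hypot(v_ip_x, v_ip_y)        # in-plane magnitude
        polar = np.arctan2(in_plane, v_oop)        # tilt from the surface normal
        return polar, azimuth

    # Example: a domain with mostly out-of-plane polarization
    print(polarization_angles(1.0, 0.1, 0.05))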
Predicting Flows of Rarefied Gases
NASA Technical Reports Server (NTRS)
LeBeau, Gerald J.; Wilmoth, Richard G.
2005-01-01
DSMC Analysis Code (DAC) is a flexible, highly automated, easy-to-use computer program for predicting flows of rarefied gases -- especially flows of upper-atmospheric, propulsion, and vented gases impinging on spacecraft surfaces. DAC implements the direct simulation Monte Carlo (DSMC) method, which is widely recognized as standard for simulating flows at densities so low that the continuum-based equations of computational fluid dynamics are invalid. DAC enables users to model complex surface shapes and boundary conditions quickly and easily. The discretization of a flow field into computational grids is automated, thereby relieving the user of a traditionally time-consuming task while ensuring (1) appropriate refinement of grids throughout the computational domain, (2) determination of optimal settings for temporal discretization and other simulation parameters, and (3) satisfaction of the fundamental constraints of the method. In so doing, DAC ensures an accurate and efficient simulation. In addition, DAC can utilize parallel processing to reduce computation time. The domain decomposition needed for parallel processing is completely automated, and the software employs a dynamic load-balancing mechanism to ensure optimal parallel efficiency throughout the simulation.
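As a flavor of the particle-based approach that DSMC codes such as DAC automate, here is a minimal free-molecular sketch in Python; the slab geometry, specular walls, and parameter values are illustrative assumptions, and the stochastic collision step that defines DSMC proper is only indicated in a comment.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 10000                        # simulated molecules
    x = rng.uniform(0.0, 1.0, n)     # positions in a 1 m slab
    v = rng.normal(0.0, 300.0, n)    # thermal velocities, m/s
    dt = 1e-6                        # time step, s

    for _ in range(100):
        x += v * dt                          # free-molecular motion
        hit_lo = x < 0.0                     # molecules crossing the walls
        hit_hi = x > 1.0
        x[hit_lo] = -x[hit_lo]               # specular reflection at x = 0
        x[hit_hi] = 2.0 - x[hit_hi]          # specular reflection at x = 1
        v[hit_lo | hit_hi] *= -1.0
        # A full DSMC step would now sort molecules into cells and perform
        # stochastic binary collisions within each cell.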
The Future Is Kids and Computers.
ERIC Educational Resources Information Center
Personal Computing, 1982
1982-01-01
Describes a project which produced educational computer programs for PET microcomputers and use of computers in money management, in a filter company, and in a certified public accountant firm (which cancelled a contract for a time-sharing service). Also describes a computerized eye information network for ophthalmologists. (JN)
Computer Applications in Assessment and Counseling.
ERIC Educational Resources Information Center
Veldman, Donald J.; Menaker, Shirley L.
Public school counselors and psychologists can expect valuable assistance from computer-based assessment and counseling techniques within a few years, as programs now under development become generally available for the typical computers now used by schools for grade-reporting and class-scheduling. Although routine information-giving and gathering…
NASA Astrophysics Data System (ADS)
Lengert, Wolfgang; Farres, Jordi; Lanari, Riccardo; Casu, Francesco; Manunta, Michele; Lassalle-Balier, Gerard
2014-05-01
Helix Nebula has established a growing public-private partnership of more than 30 commercial cloud providers, SMEs, and publicly funded research organisations and e-infrastructures. The Helix Nebula strategy is to establish a federated cloud service across Europe. Three high-profile flagships, sponsored by CERN (high energy physics), EMBL (life sciences) and ESA/DLR/CNES/CNR (earth science), have been deployed and extensively tested within this federated environment. The commitments behind these initial flagships have created a critical mass that attracts suppliers and users to the initiative, to work together towards an "Information as a Service" market place. Significant progress has been achieved in implementing the following four programmatic goals (as outlined in the Strategic Plan, Ref. 1):
• Goal #1: Establish a Cloud Computing Infrastructure for the European Research Area (ERA) serving as a platform for innovation and evolution of the overall infrastructure.
• Goal #2: Identify and adopt suitable policies for trust, security and privacy that can be provided on a European level by the European Cloud Computing framework and infrastructure.
• Goal #3: Create a light-weight governance structure for the future European Cloud Computing Infrastructure that involves all the stakeholders and can evolve over time as the infrastructure, services and user-base grow.
• Goal #4: Define a funding scheme involving the three stakeholder groups (service suppliers, users, EC and national funding agencies) in a Public-Private-Partnership model to implement a Cloud Computing Infrastructure that delivers a sustainable business environment adhering to European-level policies.
Now in 2014 a first version of this generic cross-domain e-infrastructure is ready to go into operations, building on a federation of European industry and contributors (data, tools, knowledge, ...). This presentation describes how Helix Nebula is being used in the domain of earth science, focusing on geohazards. The so-called "Supersite Exploitation Platform" (SSEP) provides scientists with an overarching federated e-infrastructure with very fast access to (i) large volumes of data (EO/non-space data), (ii) computing resources (e.g. hybrid cloud/grid), (iii) processing software (e.g. toolboxes, RTMs, retrieval baselines, visualization routines), and (iv) general platform capabilities (e.g. user management and access control, accounting, information portal, collaborative tools, social networks etc.). In this federation each data provider remains in full control of the implementation of its data policy. This presentation outlines the Architecture (technical and services) supporting very heterogeneous science domains as well as the procedures for newcomers to join the Helix Nebula Market Place. Ref. 1: http://cds.cern.ch/record/1374172/files/CERN-OPEN-2011-036.pdf
NASA Astrophysics Data System (ADS)
Jamie, Majid
2016-11-01
Singh and Mogi (2003) presented a forward modeling (FWD) program coded in FORTRAN 77, called "EMLCLLER", which is capable of computing the frequency-domain electromagnetic (EM) response of a large circular loop, in terms of the vertical magnetic component (Hz), over 1D layered earth models; computations in this program can be performed assuming variable transmitter-receiver configurations and incorporating both conduction and displacement currents. Integral equations in this program are computed through digital linear filters based on the Hankel transforms together with analytic solutions based on hyper-geometric functions. Despite the capabilities of EMLCLLER, there are some mistakes in this program that make its FWD results unreliable. The mistakes in EMLCLLER arise from using a wrong algorithm for computing the reflection coefficient of the EM wave in TE-mode (rTE), and from using flawed algorithms for computing the phase and normalized phase values relating to Hz; in this paper the corrected forms of these mistakes are presented. Moreover, in order to illustrate how these mistakes can affect FWD results, EMLCLLER and the corrected version of the program presented in this paper, titled "EMLCLLER_Corr", are run on different two- and three-layered earth models; afterwards their FWD results in terms of the real and imaginary parts of Hz, its normalized amplitude, and the corresponding normalized phase curves are plotted versus frequency and compared to each other. In addition, Singh and Mogi (2003) also presented extra derivations for computing the radial component of the magnetic field (Hr) and the angular component of the electric field (Eϕ), where the numerical solution presented for Hr is incorrect; in this paper the correct numerical solution for this derivation is also presented.
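For context, a commonly used form of the TE-mode reflection coefficient for an N-layer earth is sketched below in LaTeX, following standard treatments (e.g., Ward and Hohmann); the notation and layer indexing are assumptions of this sketch, not a transcription of EMLCLLER_Corr.

    r_{TE} = \frac{Y_0 - \hat{Y}_1}{Y_0 + \hat{Y}_1}, \qquad
    Y_n = \frac{u_n}{i \omega \mu_n}, \qquad
    u_n = \sqrt{\lambda^2 - k_n^2}, \qquad
    k_n^2 = \omega^2 \mu_n \varepsilon_n - i \omega \mu_n \sigma_n

    \hat{Y}_n = Y_n \, \frac{\hat{Y}_{n+1} + Y_n \tanh(u_n h_n)}{Y_n + \hat{Y}_{n+1} \tanh(u_n h_n)}, \qquad
    \hat{Y}_N = Y_N

Here h_n is the thickness of layer n, \lambda is the Hankel-transform variable, the admittance recursion starts at the terminating half-space N, and the \omega^2 \mu \varepsilon term retains the displacement currents that EMLCLLER incorporates.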
One-to-One Computing in Public Schools: Lessons from "Laptops for All" Programs
ERIC Educational Resources Information Center
Abell Foundation, 2008
2008-01-01
The basic tenet of one-to-one computing is that the student and teacher have Internet-connected, wireless computing devices in the classroom and optimally at home as well. Also known as "ubiquitous computing," this strategy assumes that every teacher and student has her own computing device and obviates the need for moving classes to…
Improving student retention in computer engineering technology
NASA Astrophysics Data System (ADS)
Pierozinski, Russell Ivan
The purpose of this research project was to improve student retention in the Computer Engineering Technology program at the Northern Alberta Institute of Technology by reducing the number of dropouts and increasing the graduation rate. This action research project utilized a mixed methods approach of a survey and face-to-face interviews. The participants were male and female, with a large majority ranging from 18 to 21 years of age. The research found that participants recognized their skills and capability, but their capacity to remain in the program was dependent on understanding and meeting the demanding pace and rigour of the program. The participants recognized that curriculum delivery along with instructor-student interaction had an impact on student retention. To be successful in the program, students required support in four domains: academic, learning management, career, and social.
Extended Cost-Effectiveness Analysis for Health Policy Assessment: A Tutorial.
Verguet, Stéphane; Kim, Jane J; Jamison, Dean T
2016-09-01
Health policy instruments such as the public financing of health technologies (e.g., new drugs, vaccines) entail consequences in multiple domains. Fundamentally, public health policies aim at increasing the uptake of effective and efficient interventions and subsequently at leading to better health benefits (e.g., premature mortality and morbidity averted). In addition, public health policies can provide non-health benefits beyond the sole well-being of populations and beyond the health sector. For instance, public policies such as social and health insurance programs can prevent illness-related impoverishment and procure financial risk protection. Furthermore, public policies can improve the distribution of health in the population and promote the equalization of health among individuals. Extended cost-effectiveness analysis was developed to address health policy assessment, specifically to evaluate the health and financial consequences of public policies in four domains: (1) the health gains; (2) the financial risk protection benefits; (3) the total costs to the policy makers; and (4) the distributional benefits. Here, we present a tutorial that describes both the intent of extended cost-effectiveness analysis and the keys to its easy implementation for health policy assessment.
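A minimal sketch of the bookkeeping behind the four domains follows; every figure is a hypothetical placeholder chosen for illustration, not a value from the tutorial.

    # Hypothetical extended cost-effectiveness tally across income quintiles.
    quintiles = ["Q1", "Q2", "Q3", "Q4", "Q5"]       # poorest to richest
    cases = [2000, 1800, 1500, 1200, 1000]           # annual cases per quintile
    uptake = [0.30, 0.40, 0.50, 0.60, 0.70]          # policy-induced coverage
    deaths_averted_per_case = 0.02
    oop_cost_per_case = 150.0                        # out-of-pocket spending, USD
    subsidy_per_case = 100.0                         # cost to the payer, USD

    for q, n, u in zip(quintiles, cases, uptake):
        health_gain = n * u * deaths_averted_per_case   # domain 1: health gains
        frp = n * u * oop_cost_per_case                 # domain 2: OOP spending averted
        cost = n * u * subsidy_per_case                 # domain 3: cost to policy maker
        print(q, round(health_gain, 1), frp, cost)      # domain 4: the distribution is
                                                        # read off the per-quintile rows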
Utilizing Modern Technology in Adult and Continuing Education Programs.
ERIC Educational Resources Information Center
New York State Education Dept., Albany. Bureau of Curriculum Development.
This publication, designed as a supplement to the manual entitled "Managing Programs for Adults" (1983), provides guidelines for establishing or expanding the use of video and computers by administration and staff of adult education programs. The first section presents the use of video technology for program promotion, instruction, and staff…
WELLHEAD ANALYTIC ELEMENT MODEL FOR WINDOWS
WhAEM2000 (wellhead analytic element model for Win 98/00/NT/XP) is a public domain, ground-water flow model designed to facilitate capture zone delineation and protection area mapping in support of the State's and Tribe's Wellhead Protection Programs (WHPP) and Source Water Asses...
Introducing Seismic Tomography with Computational Modeling
NASA Astrophysics Data System (ADS)
Neves, R.; Neves, M. L.; Teodoro, V.
2011-12-01
Learning seismic tomography principles and techniques involves advanced physical and computational knowledge. In-depth learning of such computational skills is a difficult cognitive process that requires a strong background in physics, mathematics and computer programming. The corresponding learning environments and pedagogic methodologies should then involve sets of computational modelling activities with computer software systems which allow students the possibility to improve their mathematical or programming knowledge and simultaneously focus on the learning of seismic wave propagation and inverse theory. To reduce the level of cognitive opacity associated with mathematical or programming knowledge, several computer modelling systems have already been developed (Neves & Teodoro, 2010). Among such systems, Modellus is particularly well suited to achieve this goal because it is a domain-general environment for explorative and expressive modelling with the following main advantages: 1) an easy and intuitive creation of mathematical models using just standard mathematical notation; 2) the simultaneous exploration of images, tables, graphs and object animations; 3) the attribution of mathematical properties expressed in the models to animated objects; and finally 4) the computation and display of mathematical quantities obtained from the analysis of images and graphs. Here we describe virtual simulations and educational exercises which give students an easy grasp of the fundamentals of seismic tomography. The simulations make the lecture more interactive and allow students the possibility to overcome their lack of advanced mathematical or programming knowledge and focus on the learning of seismological concepts and processes, taking advantage of basic scientific computation methods and tools.
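As one example of the kind of exercise described, a toy straight-ray travel-time tomography problem can be set up and inverted in a few lines; this Python sketch (not a Modellus model) assumes a 2x2 grid of 1 km cells and synthetic, noise-free travel times.

    import numpy as np

    # G[i, j] is the path length of ray i inside cell j.
    r2 = np.sqrt(2.0)
    G = np.array([[1.0, 1.0, 0.0, 0.0],     # ray through the top row
                  [0.0, 0.0, 1.0, 1.0],     # ray through the bottom row
                  [1.0, 0.0, 1.0, 0.0],     # ray through the left column
                  [0.0, 1.0, 0.0, 1.0],     # ray through the right column
                  [r2,  0.0, 0.0, r2 ]])    # diagonal ray, breaks the ambiguity
    s_true = np.array([0.50, 0.40, 0.40, 0.55])   # slowness, s/km
    t = G @ s_true                                 # synthetic travel times

    s_est, *_ = np.linalg.lstsq(G, t, rcond=None)  # least-squares tomogram
    print(np.round(s_est, 3))                      # recovers s_true

Without the diagonal ray the four row/column sums leave one slowness combination unresolved, which is itself a useful classroom lesson about non-uniqueness in inverse problems.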
The Teaching and Learning Environment SAIDA: Some Features and Lessons.
ERIC Educational Resources Information Center
Grandbastien, Monique; Morinet-Lambert, Josette
Written in ADA language, SAIDA, a Help System for Data Implementation, is an experimental teaching and learning environment which uses artificial intelligence techniques to teach a computer science course on abstract data representations. The application domain is teaching advanced programming concepts which have not received much attention from…
Processes and Knowledge in Designing Instruction.
ERIC Educational Resources Information Center
Greeno, James G.; And Others
Results from a study of problem solving in the domain of instructional design are presented. Subjects were eight teacher trainees who were recent graduates of or were enrolled in the Stanford Teacher Education Program at Stanford University (California). Subjects studied a computer-based tutorial--the VST2000--about a fictitious vehicle. The…
3D Printing of Protein Models in an Undergraduate Laboratory: Leucine Zippers
ERIC Educational Resources Information Center
Meyer, Scott C.
2015-01-01
An upper-division undergraduate laboratory experiment is described that explores the structure/function relationship of protein domains, namely leucine zippers, through a molecular graphics computer program and physical models fabricated by 3D printing. By generating solvent accessible surfaces and color-coding hydrophobic, basic, and acidic amino…
Intelligent Tutoring Systems and Learning Outcomes: A Meta-Analysis
ERIC Educational Resources Information Center
Ma, Wenting; Adesope, Olusola O.; Nesbit, John C.; Liu, Qing
2014-01-01
Intelligent Tutoring Systems (ITS) are computer programs that model learners' psychological states to provide individualized instruction. They have been developed for diverse subject areas (e.g., algebra, medicine, law, reading) to help learners acquire domain-specific, cognitive and metacognitive knowledge. A meta-analysis was conducted on…
Pedagogical Strategies for Human and Computer Tutoring.
ERIC Educational Resources Information Center
Reiser, Brian J.
The pedagogical strategies of human tutors in problem solving domains are described and the possibility of incorporating these techniques into computerized tutors is examined. GIL (Graphical Instruction in LISP), an intelligent tutoring system for LISP programming, is compared to human tutors teaching the same material in order to identify how the…
Activity Theory and Qualitative Research in Digital Domains
ERIC Educational Resources Information Center
Sam, Cecile
2012-01-01
Understanding the interactions between people, computer-mediated communication, and online life requires that researchers appropriate a set of methodological tools that would be best suited for capturing and analyzing the phenomenon. However, these tools are not limited to relevant technological forms of data collections and analysis programs; it…
Large-scale parallel lattice Boltzmann-cellular automaton model of two-dimensional dendritic growth
NASA Astrophysics Data System (ADS)
Jelinek, Bohumir; Eshraghi, Mohsen; Felicelli, Sergio; Peters, John F.
2014-03-01
An extremely scalable lattice Boltzmann (LB)-cellular automaton (CA) model for simulations of two-dimensional (2D) dendritic solidification under forced convection is presented. The model incorporates effects of phase change, solute diffusion, melt convection, and heat transport. The LB model represents the diffusion, convection, and heat transfer phenomena. The dendrite growth is driven by a difference between actual and equilibrium liquid composition at the solid-liquid interface. The CA technique is deployed to track the new interface cells. The computer program was parallelized using the Message Passing Interface (MPI) technique. Parallel scaling of the algorithm was studied and major scalability bottlenecks were identified. Efficiency loss attributable to the high memory bandwidth requirement of the algorithm was observed when using multiple cores per processor. Parallel writing of the output variables of interest was implemented in the binary Hierarchical Data Format 5 (HDF5) to improve the output performance and to simplify visualization. Calculations were carried out in single-precision arithmetic without significant loss in accuracy, resulting in a 50% reduction of memory and computational time requirements. The presented solidification model shows very good scalability up to centimeter-size domains, including more than ten million dendrites.
Catalogue identifier: AEQZ_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEQZ_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, UK
Licensing provisions: Standard CPC license, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 29,767
No. of bytes in distributed program, including test data, etc.: 3,131,367
Distribution format: tar.gz
Programming language: Fortran 90
Computer: Linux PC and clusters
Operating system: Linux
Has the code been vectorized or parallelized?: Yes. Program is parallelized using MPI. Number of processors used: 1-50,000
RAM: Memory requirements depend on the grid size
Classification: 6.5, 7.7
External routines: MPI (http://www.mcs.anl.gov/research/projects/mpi/), HDF5 (http://www.hdfgroup.org/HDF5/)
Nature of problem: Dendritic growth in undercooled Al-3 wt% Cu alloy melt under forced convection.
Solution method: The lattice Boltzmann model solves the diffusion, convection, and heat transfer phenomena. The cellular automaton technique is deployed to track the solid/liquid interface.
Restrictions: Heat transfer is calculated uncoupled from the fluid flow. Thermal diffusivity is constant.
Unusual features: A novel technique, utilizing periodic duplication of a pre-grown "incubation" domain, is applied for the scale-up test.
Running time: Running time varies from minutes to days depending on the domain size and number of computational cores.
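For readers unfamiliar with the method, a minimal single-node D2Q9 lattice Boltzmann step (BGK collision plus periodic streaming) can be sketched in Python; this illustrates only the flow solver, not the solute, thermal, CA, or MPI components of the published Fortran 90 code, and the grid size and relaxation time are arbitrary choices.

    import numpy as np

    w = np.array([4/9] + [1/9]*4 + [1/36]*4)            # D2Q9 lattice weights
    c = np.array([[0,0],[1,0],[0,1],[-1,0],[0,-1],
                  [1,1],[-1,1],[-1,-1],[1,-1]])         # lattice velocities
    nx, ny, tau = 64, 64, 0.8
    f = np.ones((9, nx, ny)) * w[:, None, None]         # fluid initially at rest

    def step(f):
        rho = f.sum(axis=0)                             # macroscopic density
        ux = (f * c[:, 0, None, None]).sum(axis=0) / rho
        uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
        feq = np.empty_like(f)
        for i in range(9):                              # equilibrium distribution
            cu = c[i, 0]*ux + c[i, 1]*uy
            feq[i] = w[i]*rho*(1 + 3*cu + 4.5*cu**2 - 1.5*(ux**2 + uy**2))
        f += -(f - feq)/tau                             # BGK collision
        for i in range(9):                              # periodic streaming
            f[i] = np.roll(np.roll(f[i], c[i, 0], axis=0), c[i, 1], axis=1)
        return f

    f = step(f)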
Evaluation of the Nature-Computer Camp: Summer 1993.
ERIC Educational Resources Information Center
Negero, Arega
The purpose of the Nature Computer Camp (NCC) is to provide sixth-graders in District of Columbia Public Schools an opportunity to explore and appreciate nature in its natural setting. The program also aspires to develop computer proficiency, enhance students' social and interpersonal skills, stimulate group interaction, and strengthen students'…
Cloudgene: A graphical execution platform for MapReduce programs on private and public clouds
2012-01-01
Background: The MapReduce framework enables scalable processing and analysis of large datasets by distributing the computational load on connected computer nodes, referred to as a cluster. In Bioinformatics, MapReduce has already been adopted to various case scenarios such as mapping next generation sequencing data to a reference genome, finding SNPs from short read data or matching strings in genotype files. Nevertheless, tasks like installing and maintaining MapReduce on a cluster system, importing data into its distributed file system or executing MapReduce programs require advanced knowledge in computer science and could thus prevent scientists from usage of currently available and useful software solutions. Results: Here we present Cloudgene, a freely available platform to improve the usability of MapReduce programs in Bioinformatics by providing a graphical user interface for the execution, the import and export of data and the reproducibility of workflows on in-house (private clouds) and rented clusters (public clouds). The aim of Cloudgene is to build a standardized graphical execution environment for currently available and future MapReduce programs, which can all be integrated by using its plug-in interface. Since Cloudgene can be executed on private clusters, sensitive datasets can be kept in house at all time and data transfer times are therefore minimized. Conclusions: Our results show that MapReduce programs can be integrated into Cloudgene with little effort and without adding any computational overhead to existing programs. This platform gives developers the opportunity to focus on the actual implementation task and provides scientists a platform with the aim to hide the complexity of MapReduce. In addition to MapReduce programs, Cloudgene can also be used to launch predefined systems (e.g. Cloud BioLinux, RStudio) in public clouds. Currently, five different bioinformatic programs using MapReduce and two systems are integrated and have been successfully deployed. Cloudgene is freely available at http://cloudgene.uibk.ac.at. PMID:22888776
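The map/reduce contract that such platforms wrap can be illustrated with a tiny pure-Python stand-in (no Hadoop cluster involved); the word-count-style example and record contents are hypothetical.

    from itertools import groupby
    from operator import itemgetter

    def mapper(line):
        """Map phase: emit (key, 1) for every word in an input record."""
        for word in line.split():
            yield word.lower(), 1

    def reducer(key, values):
        """Reduce phase: combine all counts emitted for one key."""
        return key, sum(values)

    records = ["GATTACA GATTACA", "gattaca CCGT"]
    pairs = sorted(kv for line in records for kv in mapper(line))  # shuffle/sort
    counts = [reducer(k, (v for _, v in g))
              for k, g in groupby(pairs, key=itemgetter(0))]
    print(counts)    # [('ccgt', 1), ('gattaca', 3)]

In a real MapReduce deployment the mapper and reducer run on many nodes and the framework performs the shuffle/sort between them; the two-function contract is unchanged.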
Condition Recognition for a Program Synthesizer.
1981-06-01
suited for this type of work since it neither complains of boredom nor wanders from its assigned task. The machine meticulously sequences through a series...natural language understanding is a difficult problem that can be solved only in limited domains. The use of natural language in programming has been...and output behavior. For example, if someone wanted to describe a program to compute the Fibonacci numbers then he could supply the input-output pairs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Solar-Lezama, Armando
The goal of the project was to develop a programming model that would significantly improve productivity in the high-performance computing domain by bringing together three components: a) Automated equivalence checking, b) Sketch-based program synthesis, and c) Autotuning. The report provides an executive summary of the research accomplished through this project. At the end of the report is appended a paper that describes in more detail the key technical accomplishments from this project, and which was published in SC 2014.
Solving the "Hidden Line" Problem
NASA Technical Reports Server (NTRS)
1984-01-01
David Hedgley Jr., a mathematician at Dryden Flight Research Center, has developed an accurate computer program that considers whether a line in a graphic model of a three-dimensional object should or should not be visible. The Hidden Line Computer Code program automatically removes superfluous lines and permits the computer to display an object from specific viewpoints, just as the human eye would see it. Users include Rowland Institute for Science in Cambridge, MA, several departments of Lockheed Georgia Co., and Nebraska Public Power District (NPPD).
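Hedgley's published algorithm is considerably more general, but the elementary test underlying hidden-line computations can be sketched as follows; the triangle winding convention and viewpoint in this Python illustration are assumptions, not details of the Hidden Line Computer Code.

    import numpy as np

    def back_facing(triangle, viewpoint):
        """Return True if a triangle (3x3 array of vertices, ordered
        counterclockwise when seen from outside) faces away from the
        viewpoint, so its edges are candidates for hidden-line removal."""
        a, b, c = triangle
        normal = np.cross(b - a, c - a)            # outward surface normal
        return np.dot(normal, viewpoint - a) <= 0.0

    tri = np.array([[0.0, 0.0, 0.0],
                    [1.0, 0.0, 0.0],
                    [0.0, 1.0, 0.0]])              # normal points toward +z
    print(back_facing(tri, np.array([0.0, 0.0, 5.0])))   # False: visible side

A full hidden-line solution must additionally handle lines occluded by other faces, which is where the general-viewpoint bookkeeping of programs like Hedgley's comes in.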
Use of the Computer for Research on Instruction and Student Understanding in Physics.
NASA Astrophysics Data System (ADS)
Grayson, Diane Jeanette
This dissertation describes an investigation of how the computer may be utilized to perform research on instruction and on student understanding in physics. The research was conducted within three content areas: kinematics, waves and dynamics. The main focus of the research on instruction was the determination of factors needed for a computer program to be instructionally effective. The emphasis in the research on student understanding was the identification of specific conceptual and reasoning difficulties students encounter with the subject matter. Most of the research was conducted using the computer-based interview, a technique developed during the early part of the work, conducted within the domain of kinematics. In a computer-based interview, a student makes a prediction about how a particular system will behave under given circumstances, observes a simulation of the event on a computer screen, and then is asked by an interviewer to explain any discrepancy between prediction and observation. In the course of the research, a model was developed for producing educational software. The model has three important components: (i) research on student difficulties in the content area to be addressed, (ii) observations of students using the computer program, and (iii) consequent program modification. This model was used to guide the development of an instructional computer program dealing with graphical representations of transverse pulses. Another facet of the research involved the design of a computer program explicitly for the purposes of research. A computer program was written that simulates a modified Atwood's machine. The program was then used in computer-based interviews and proved to be an effective means of probing student understanding of dynamics concepts. In order to ascertain whether or not the student difficulties identified were peculiar to the computer, laboratory-based interviews with real equipment were also conducted. The laboratory-based interviews were designed to parallel the computer-based interviews as closely as possible. The results of both types of interviews are discussed in detail. The dissertation concludes with a discussion of some of the benefits of using the computer in physics instruction and physics education research. Attention is also drawn to some of the limitations of the computer as a research instrument or instructional device.
ERIC Educational Resources Information Center
MacCabe, Bruce
The Literacy Learning Center Project, a project of the Meriden Public Library (Connecticut), targeted the educationally underserved and functionally illiterate, and involved recruitment, retention, space renovation, coalition building, public awareness, training, basic literacy, collection development, tutoring, computer assisted services, and…
Realising the Mass Public Benefit of Evidence-Based Psychological Therapies: The IAPT Program
Clark, David M
2018-01-01
Empirically supported psychological therapies have been developed for many mental health conditions. However, in most countries only a small proportion of the public benefit from these advances. The English Improving Access to Psychological Therapies (IAPT) program aims to bridge the gap between research and practice by training over 10,500 new psychological therapists in empirically supported treatments and deploying them in new services for the treatment of depression and anxiety disorders. Currently IAPT treats over 560,000 patients per year, obtains clinical outcome data on 98.5% of these individuals and places this information in the public domain. Around 50% of patients treated in IAPT services recover and two-thirds show worthwhile benefits. The clinical and economic arguments on which IAPT is based are presented, along with details of the service model, how the program was implemented, and recent findings about service organization. Limitations and future directions are outlined. PMID:29350997
Computation of Southern Pine Site Index Using a TI-59 Calculator
Robert M. Farrar
1983-01-01
A program is described that permits computation of site index in the field using a Texas Instruments model TI-59 programmable, hand-held, battery-powered calculator. Based on a series of equations developed by R.M. Farrar, Jr., for the site index curves in USDA Miscellaneous Publication 50, the program can accommodate any index base age, tree age, and height within...
Do you BEHAVE? - Application of the BehavePlus fire modeling system
Pat Andrews
2010-01-01
The BehavePlus fire modeling system is the successor to BEHAVE, which was first used in the field in 1984. It is public domain software, available for free use on personal computers. Information on user communities and fire management applications can be useful in designing next generation systems. Several sources of information about BehavePlus are summarized to...
ERIC Educational Resources Information Center
Palmer, Loretta
A basic algebra unit was developed at Utah Valley State College to emphasize applications of mathematical concepts in the work world, using video and computer-generated graphics to integrate textual material. The course was implemented in three introductory algebra sections involving 80 students and taught algebraic concepts using such areas as…
Users guide for STHARVEST: software to estimate the cost of harvesting small timber.
Roger D. Fight; Xiaoshan Zhang; Bruce R. Hartsough
2003-01-01
The STHARVEST computer application is Windows-based, public-domain software used to estimate costs for harvesting small-diameter stands or the small-diameter component of a mixed-sized stand. The equipment production rates were developed from existing studies. Equipment operating cost rates were based on November 1998 prices for new equipment and wage rates for the...
Digital Dome versus Desktop Display: Learning Outcome Assessments by Domain Experts
ERIC Educational Resources Information Center
Jacobson, Jeffery
2013-01-01
In previous publications, the author reported that students learned about Egyptian architecture and society by playing an educational game based on a virtual representation of a temple. Students played the game in a digital dome or on a standard desktop computer, and (each) then recorded a video tour of the temple. Those who had used the dome…
HPCCP/CAS Workshop Proceedings 1998
NASA Technical Reports Server (NTRS)
Schulbach, Catherine; Mata, Ellen (Editor); Schulbach, Catherine (Editor)
1999-01-01
This publication is a collection of extended abstracts of presentations given at the HPCCP/CAS (High Performance Computing and Communications Program/Computational Aerosciences Project) Workshop held on August 24-26, 1998, at NASA Ames Research Center, Moffett Field, California. The objective of the Workshop was to bring together the aerospace high performance computing community, consisting of airframe and propulsion companies, independent software vendors, university researchers, and government scientists and engineers. The Workshop was sponsored by the HPCCP Office at NASA Ames Research Center. The Workshop consisted of over 40 presentations, including an overview of NASA's High Performance Computing and Communications Program and the Computational Aerosciences Project; ten sessions of papers representative of the high performance computing research conducted within the Program by the aerospace industry, academia, NASA, and other government laboratories; two panel sessions; and a special presentation by Mr. James Bailey.
Mentorship and competencies for applied chronic disease epidemiology.
Lengerich, Eugene J; Siedlecki, Jennifer C; Brownson, Ross; Aldrich, Tim E; Hedberg, Katrina; Remington, Patrick; Siegel, Paul Z
2003-01-01
To understand the potential and establish a framework for mentoring as a method to develop professional competencies of state-level applied chronic disease epidemiologists, model mentorship programs were reviewed, specific competencies were identified, and competencies were then matched to essential public health services. Although few existing mentorship programs in public health were identified, common themes in other professional mentorship programs support the potential of mentoring as an effective means to develop capacity for applied chronic disease epidemiology. Proposed competencies for chronic disease epidemiologists in a mentorship program include planning, analysis, communication, basic public health, informatics and computer knowledge, and cultural diversity. Mentoring may constitute a viable strategy to build chronic disease epidemiology capacity, especially in public health agencies where resource and personnel system constraints limit opportunities to recruit and hire new staff.
Exploring quantum computing application to satellite data assimilation
NASA Astrophysics Data System (ADS)
Cheung, S.; Zhang, S. Q.
2015-12-01
This is an exploratory study of the potential application of quantum computing to a scientific data optimization problem. On classical computational platforms, the physical domain of a satellite data assimilation problem is represented by a discrete variable transform, and classical minimization algorithms are employed to find the optimal solution of the analysis cost function. The computation becomes intensive and time-consuming when the problem involves large numbers of variables and data. The new quantum computer opens a very different approach, both in conceptual programming and in hardware architecture, for solving optimization problems. In order to explore whether we can utilize the quantum computing machine architecture, we formulate a satellite data assimilation experimental case in the form of a quadratic programming optimization problem. We find a transformation of the problem to map it into the Quadratic Unconstrained Binary Optimization (QUBO) framework. The Binary Wavelet Transform (BWT) will be applied to the data assimilation variables for its invertible decomposition, and all calculations in BWT are performed by Boolean operations. The transformed problem will then be solved experimentally as QUBO instances defined on the Chimera graphs of the quantum computer.
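The QUBO form itself is easy to state: minimize E(x) = x^T Q x over binary vectors x. A brute-force Python sketch on a hypothetical 3-variable instance (not one derived from the assimilation cost function in the abstract) is:

    import itertools
    import numpy as np

    # Hypothetical upper-triangular QUBO matrix for three binary variables.
    Q = np.array([[-1.0,  2.0,  0.0],
                  [ 0.0, -1.0,  2.0],
                  [ 0.0,  0.0, -1.0]])

    # Enumerate all bitstrings and keep the lowest-energy one; a quantum
    # annealer searches this same landscape without explicit enumeration.
    best = min((np.array(x) @ Q @ np.array(x), x)
               for x in itertools.product((0, 1), repeat=3))
    print(best)    # (-2.0, (1, 0, 1)) for this instance

Brute force is only feasible for toy sizes; the exponential growth of the search space with the number of binary variables is precisely what motivates annealing hardware.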
PSYCHE: An Object-Oriented Approach to Simulating Medical Education
Mullen, Jamie A.
1990-01-01
Traditional approaches to computer-assisted instruction (CAI) do not provide realistic simulations of medical education, in part because they do not utilize heterogeneous knowledge bases for their source of domain knowledge. PSYCHE, a CAI program designed to teach hypothetico-deductive psychiatric decision-making to medical students, uses an object-oriented implementation of an intelligent tutoring system (ITS) to model the student, domain expert, and tutor. It models the transactions between the participants in complex transaction chains, and uses heterogeneous knowledge bases to represent both domain and procedural knowledge in clinical medicine. This object-oriented approach is a flexible and dynamic approach to modeling, and represents a potentially valuable tool for the investigation of medical education and decision-making.
RISC Processors and High Performance Computing
NASA Technical Reports Server (NTRS)
Bailey, David H.; Saini, Subhash; Craw, James M. (Technical Monitor)
1995-01-01
This tutorial will discuss the top five RISC microprocessors and the parallel systems in which they are used. It will provide a unique cross-machine comparison not available elsewhere. The effective performance of these processors will be compared by citing standard benchmarks in the context of real applications. The latest NAS Parallel Benchmarks, both absolute performance and performance per dollar, will be listed. The next generation of the NPB will be described. The tutorial will conclude with a discussion of future directions in the field. Technology Transfer Considerations: All of these computer systems are commercially available internationally. Information about these processors is available in the public domain, mostly from the vendors themselves. The NAS Parallel Benchmarks and their results have been previously approved numerous times for public release, beginning back in 1991.
Characterization of attacks on public telephone networks
NASA Astrophysics Data System (ADS)
Lorenz, Gary V.; Manes, Gavin W.; Hale, John C.; Marks, Donald; Davis, Kenneth; Shenoi, Sujeet
2001-02-01
The U.S. Public Telephone Network (PTN) is a massively connected distributed information system, much like the Internet. PTN signaling, transmission and operations functions must be protected from physical and cyber attacks to ensure the reliable delivery of telecommunications services. The increasing convergence of PTNs with wireless communications systems, computer networks and the Internet itself poses serious threats to our nation's telecommunications infrastructure. Legacy technologies and advanced services harbor well-known and as yet undiscovered vulnerabilities that render them susceptible to cyber attacks. This paper presents a taxonomy of cyber attacks on PTNs in converged environments that synthesizes exploits in the computer and communications network domains. The taxonomy provides an opportunity for the systematic exploration of mitigative and preventive strategies, as well as for the identification and classification of emerging threats.
Lederer, Alyssa M; King, Mindy H; Sovinski, Danielle; Seo, Dong-Chul; Kim, Nayoung
2015-01-01
Curtailing childhood obesity is a public health imperative. Although multicomponent school-based programs reduce obesity among children, less is known about the implementation fidelity of these interventions. This study examines process evaluation findings for the Healthy, Energetic Ready, Outstanding, Enthusiastic, Schools (HEROES) Initiative, a tri-state school-based childhood obesity prevention intervention based on the coordinated school health (CSH) model. Site visits were conducted that included key stakeholder interviews, observation, and document review. Scores were given for 8 domains, and a total implementation score was calculated. Two-way analyses of variance were conducted to examine the relationship of 4 school-level characteristics: elementary vs. middle/high schools, public vs. private schools, district vs. building level implementation, and socioeconomic status on each implementation area. Overall, schools had high fidelity scores, although some domains were implemented more successfully than others. Three school-level characteristics were associated with 1 or more domains, with elementary schools and schools implementing at the building level consistently having higher implementation scores than their counterparts. Process evaluation findings provide insight into successes and challenges schools implementing the CSH approach may encounter. Although preliminary, these findings on school-level characteristics establish a new area of research related to school-based childhood obesity prevention programs' implementation fidelity. © 2014, American School Health Association.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-15
... Information Collection; Comment Request; Broadband Technology Opportunities Program Post-Award Quarterly and... (Recovery Act) establishes and provides $4.7 billion for the Broadband Technology Opportunities Program... million will be made available for competitive grants to expand public computer center capacity; at least...
Development of a core competency model for the master of public health degree.
Calhoun, Judith G; Ramiah, Kalpana; Weist, Elizabeth McGean; Shortell, Stephen M
2008-09-01
Core competencies have been used to redefine curricula across the major health professions in recent decades. In 2006, the Association of Schools of Public Health identified core competencies for the master of public health degree in graduate schools and programs of public health. We provide an overview of the model development process and a listing of 12 core domains and 119 competencies that can serve as a resource for faculty and students for enhancing the quality and accountability of graduate public health education and training. The primary vision for the initiative is the graduation of professionals who are more fully prepared for the many challenges and opportunities in public health in the forthcoming decade.
Translational bioinformatics in the cloud: an affordable alternative
2010-01-01
With the continued exponential expansion of publicly available genomic data and access to low-cost, high-throughput molecular technologies for profiling patient populations, computational technologies and informatics are becoming vital considerations in genomic medicine. Although cloud computing technology is being heralded as a key enabling technology for the future of genomic research, available case studies are limited to applications in the domain of high-throughput sequence data analysis. The goal of this study was to evaluate the computational and economic characteristics of cloud computing in performing a large-scale data integration and analysis representative of research problems in genomic medicine. We find that the cloud-based analysis compares favorably in both performance and cost in comparison to a local computational cluster, suggesting that cloud computing technologies might be a viable resource for facilitating large-scale translational research in genomic medicine. PMID:20691073
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-17
...: Social Security Administration (SSA). ACTION: Notice of a renewal of an existing computer-matching... INFORMATION: A. General The Computer Matching and Privacy Protection Act of 1988 (Public Law (Pub. L.) 100-503... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2012-0021] Privacy Act of 1974, as Amended...
Dastane, A; Vaidyanathan, T K; Vaidyanathan, J; Mehra, R; Hesby, R
1996-01-01
It is necessary to visualize and reconstruct tissue anatomic surfaces accurately for a variety of oral rehabilitation applications, such as surface wear characterization, automated fabrication of dental restorations, and assessment of the accuracy of reproduction of impression and die materials. In this investigation, a 3-D digitization and computer-graphic system was developed for surface characterization. The hardware consists of a profiler assembly for digitization in an MTS biomechanical test system with an artificial mouth, an IBM PS/2 computer model 70 for data processing and a Hewlett-Packard laser printer for hardcopy outputs. The software used includes the commercially available Surfer 3-D graphics package, a public domain data-fitting alignment software and an in-house Pascal program for intercommunication plus some other limited tasks. Surfaces were digitized before and after rotation by angular displacement, the digital data were interpolated by Surfer to provide a data grid, and the surfaces were computer-graphically reconstructed. Misaligned surfaces were aligned by the data-fitting alignment software under different choices of parameters. The effect of different interpolation parameters (e.g. grid size, method of interpolation) and extent of rotation on the alignment accuracy was determined. The results indicate that improved alignment accuracy results from optimization of interpolation parameters and minimization of the initial misorientation between the digitized surfaces. The method provides important advantages for surface reconstruction and visualization, such as overlay of sequentially generated surfaces and accurate alignment of pairs of surfaces with small misalignment.
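The study's alignment step used third-party data-fitting software; a standard rigid alignment of two digitized point sets can, however, be sketched with the Kabsch algorithm, shown below in Python as a generic least-squares method, not necessarily the one used in the cited package.

    import numpy as np

    def kabsch_align(P, Q):
        """Return rotation R and translation t that best map point set P
        onto Q in the least-squares sense (Kabsch algorithm)."""
        Pc, Qc = P - P.mean(axis=0), Q - Q.mean(axis=0)
        U, _, Vt = np.linalg.svd(Pc.T @ Qc)
        d = np.sign(np.linalg.det(U @ Vt))        # guard against reflections
        R = (U @ np.diag([1.0, 1.0, d]) @ Vt).T
        t = Q.mean(axis=0) - R @ P.mean(axis=0)
        return R, t

    # Example: recover a known small rotation of a digitized surface patch.
    rng = np.random.default_rng(1)
    P = rng.random((20, 3))
    th = np.radians(5.0)
    Rz = np.array([[np.cos(th), -np.sin(th), 0.0],
                   [np.sin(th),  np.cos(th), 0.0],
                   [0.0, 0.0, 1.0]])
    Q = P @ Rz.T + np.array([0.1, 0.0, 0.0])
    R, t = kabsch_align(P, Q)
    print(np.allclose(R, Rz), np.round(t, 3))    # True [0.1 0. 0.]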
High-Speed GPU-Based Fully Three-Dimensional Diffuse Optical Tomographic System
Saikia, Manob Jyoti; Kanhirodan, Rajan; Mohan Vasu, Ram
2014-01-01
We have developed a graphics processor unit (GPU-) based high-speed fully 3D system for diffuse optical tomography (DOT). The reduction in execution time of 3D DOT algorithm, a severely ill-posed problem, is made possible through the use of (1) an algorithmic improvement that uses Broyden approach for updating the Jacobian matrix and thereby updating the parameter matrix and (2) the multinode multithreaded GPU and CUDA (Compute Unified Device Architecture) software architecture. Two different GPU implementations of DOT programs are developed in this study: (1) conventional C language program augmented by GPU CUDA and CULA routines (C GPU), (2) MATLAB program supported by MATLAB parallel computing toolkit for GPU (MATLAB GPU). The computation time of the algorithm on host CPU and the GPU system is presented for C and Matlab implementations. The forward computation uses finite element method (FEM) and the problem domain is discretized into 14610, 30823, and 66514 tetrahedral elements. The reconstruction time, so achieved for one iteration of the DOT reconstruction for 14610 elements, is 0.52 seconds for a C based GPU program for 2-plane measurements. The corresponding MATLAB based GPU program took 0.86 seconds. The maximum number of reconstructed frames so achieved is 2 frames per second. PMID:24891848
High-Speed GPU-Based Fully Three-Dimensional Diffuse Optical Tomographic System.
Saikia, Manob Jyoti; Kanhirodan, Rajan; Mohan Vasu, Ram
2014-01-01
We have developed a graphics processor unit (GPU-) based high-speed fully 3D system for diffuse optical tomography (DOT). The reduction in execution time of 3D DOT algorithm, a severely ill-posed problem, is made possible through the use of (1) an algorithmic improvement that uses Broyden approach for updating the Jacobian matrix and thereby updating the parameter matrix and (2) the multinode multithreaded GPU and CUDA (Compute Unified Device Architecture) software architecture. Two different GPU implementations of DOT programs are developed in this study: (1) conventional C language program augmented by GPU CUDA and CULA routines (C GPU), (2) MATLAB program supported by MATLAB parallel computing toolkit for GPU (MATLAB GPU). The computation time of the algorithm on host CPU and the GPU system is presented for C and Matlab implementations. The forward computation uses finite element method (FEM) and the problem domain is discretized into 14610, 30823, and 66514 tetrahedral elements. The reconstruction time, so achieved for one iteration of the DOT reconstruction for 14610 elements, is 0.52 seconds for a C based GPU program for 2-plane measurements. The corresponding MATLAB based GPU program took 0.86 seconds. The maximum number of reconstructed frames so achieved is 2 frames per second.
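The Broyden step mentioned in both records admits a compact sketch: a rank-1 correction that enforces the secant condition so the Jacobian need not be recomputed at every iteration. The Python version below is generic, and the array shapes are illustrative rather than those of the actual FEM problem.

    import numpy as np

    def broyden_update(J, dx, df):
        """Rank-1 'good' Broyden update: correct the Jacobian estimate J
        so that the secant condition J_new @ dx = df holds."""
        dx = dx.reshape(-1, 1)
        df = df.reshape(-1, 1)
        return J + (df - J @ dx) @ dx.T / (dx.T @ dx)

    # Usage inside a reconstruction loop (shapes are illustrative):
    # J   m x n sensitivity (Jacobian) matrix from the previous iterate
    # dx  n-vector change in the optical parameters
    # df  m-vector change in the modeled boundary measurements
    J = np.zeros((4, 3)); dx = np.ones(3); df = np.arange(4.0)
    J = broyden_update(J, dx, df)
    print(J @ dx)    # equals df: the secant condition holds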
Kogien, Moisés; Cedaro, José Juliano
2014-01-01
Objectives: to determine the psychosocial factors of work related to harm caused in the physical domain of the quality of life of nursing professionals working in a public emergency department. Method: cross-sectional, descriptive study addressing 189 nursing professionals. The Job Stress Scale and the short version of an instrument from the World Health Organization to assess quality of life were used to collect data. Robert Karasek's Demand-Control Model was the reference for the analysis of the psychosocial configuration. The risk for damage was computed with a confidence interval of 95%. Results: in regard to the psychosocial environment, the largest proportion of workers reported low psychological demands (66.1%) and low social support (52.4%), while 60.9% of the professionals experienced work situations with a greater potential for harm: high demand job (22.8%) and passive work (38.1%). Conclusions: low intellectual discernment, low social support and experiencing a high demand job or a passive job were the main risk factors for damage in the physical domain of quality of life. PMID:24553703
Kogien, Moisés; Cedaro, José Juliano
2014-01-01
to determine the psychosocial factors of work related to harm caused in the physical domain of the quality of life of nursing professionals working in a public emergency department. cross-sectional, descriptive study addressing 189 nursing professionals. The Job Stress Scale and the short version of an instrument from the World Health Organization to assess quality of life were used to collect data. Robert Karasek's Demand-Control Model was the reference for the analysis of the psychosocial configuration. The risk for damage was computed with a confidence interval of 95%. In regard to the psychosocial environment, the largest proportion of workers reported low psychological demands (66.1%) and low social support (52.4%), while 60.9% of the professionals experienced work situations with a greater potential for harm: high demand job (22.8%) and passive work (38.1%). low intellectual discernment, low social support and experiencing a high demand job or a passive job were the main risk factors for damage in the physical domain of quality of life.
Measuring the Impact of Programs that Challenge the Public Stigma of Mental Illness
Corrigan, Patrick W.; Shapiro, Jenessa R.
2010-01-01
Public stigma robs people with mental illnesses of rightful opportunities related to work and other important life goals. Advocates have developed anti-stigma programs meant to address the prejudice and discrimination associated with these conditions. Evidence is now needed to make sense of program impact; this paper looks at measurement issues related to stigma change. Community-based participatory research is central to this research and includes the involvement of a diverse collection of stakeholders in all phases of evaluation. Investigators should be cautious about measures vis-à-vis social desirability effects and should be directed by the social validity of targeted audiences. Conceptual domains with some research support that correspond with assessments include behavior, penetration, psychological perspective, knowledge, and physiological/information processes. These issues are summarized as ten recommendations for the evaluation of anti-stigma programs. PMID:20674114
A PC based time domain reflectometer for space station cable fault isolation
NASA Technical Reports Server (NTRS)
Pham, Michael; McClean, Marty; Hossain, Sabbir; Vo, Peter; Kouns, Ken
1994-01-01
Significant problems are faced by astronauts on orbit in the Space Station when trying to locate electrical faults in multi-segment avionics and communication cables. These problems necessitate the development of an automated portable device that will detect and locate cable faults using the pulse-echo technique known as time domain reflectometry. A breadboard time domain reflectometer (TDR) circuit board was designed and developed at NASA-JSC. The TDR board works in conjunction with a GRiD laptop computer to automate the fault detection and isolation process. A software program was written to automatically display the nature and location of any possible faults. The breadboard system can isolate open-circuit and short-circuit faults to within two feet in a typical Space Station cable configuration. Follow-on efforts planned for 1994 will produce a compact, portable prototype Space Station TDR capable of automated switching in multi-conductor cables for high-fidelity evaluation. This device has many possible commercial applications, including commercial and military aircraft avionics, cable TV, telephone, communication, information, and computer network systems. This paper describes the principle of time domain reflectometry and the methodology for on-orbit avionics utility distribution system repair, utilizing the newly developed device called the Space Station Time Domain Reflectometer (SSTDR).
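To make the pulse-echo calculation concrete, the distance to a fault follows from the round-trip delay of the reflected pulse and the cable's velocity of propagation. This is a generic textbook computation, not the SSTDR firmware; the velocity factor below is an assumed typical value.

```python
# Generic TDR fault-location calculation (illustrative only).
C = 299_792_458.0  # speed of light in vacuum, m/s

def fault_distance(round_trip_s, velocity_factor=0.66):
    """Distance to a fault given the echo's round-trip delay in seconds.

    velocity_factor: fraction of c at which the pulse travels in the cable
    dielectric (about 0.66 for typical coax; an assumption here).
    """
    # The pulse travels to the fault and back, hence the division by 2.
    return C * velocity_factor * round_trip_s / 2.0

# Example: a 30 ns echo delay places the fault at roughly 3 m.
print(f"{fault_distance(30e-9):.2f} m")
```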
Parallel computation and the Basis system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, G.R.
1992-12-16
A software package has been written that can facilitate efforts to develop powerful, flexible, and easy-to-use programs that can run in single-processor, massively parallel, and distributed computing environments. Particular attention has been given to the difficulties posed by a program consisting of many science packages that represent subsystems of a complicated, coupled system. Methods have been found to maintain independence of the packages by hiding data structures without increasing the communication costs in a parallel computing environment. Concepts developed in this work are demonstrated by a prototype program that uses library routines from two existing software systems, Basis and Parallel Virtual Machine (PVM). Most of the details of these libraries have been encapsulated in routines and macros that could be rewritten for alternative libraries that possess certain minimum capabilities. The prototype software uses a flexible master-and-slaves paradigm for parallel computation and supports domain decomposition with message passing for partitioning work among slaves. Facilities are provided for accessing variables that are distributed among the memories of slaves assigned to subdomains. The software is named PROTOPAR.
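The master-and-slaves decomposition described above can be sketched as follows. This is our own minimal illustration using mpi4py as a stand-in for the Basis/PVM routines the report encapsulates; none of the PROTOPAR names appear here.

```python
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()
N = 1_000_000  # global 1-D domain size (arbitrary example)

if rank == 0:
    # Master: partition the domain into index ranges, one per process.
    bounds = np.linspace(0, N, size + 1, dtype=int)
    chunks = [(int(bounds[i]), int(bounds[i + 1])) for i in range(size)]
else:
    chunks = None

lo, hi = comm.scatter(chunks, root=0)
# Each slave works on its subdomain; here a trivial local reduction.
local_sum = float(np.arange(lo, hi, dtype=np.float64).sum())
total = comm.reduce(local_sum, op=MPI.SUM, root=0)
if rank == 0:
    print("global sum:", total)
```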
Parallel computation and the basis system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, G.R.
1993-05-01
A software package has been written that can facilitate efforts to develop powerful, flexible, and easy-to-use programs that can run in single-processor, massively parallel, and distributed computing environments. Particular attention has been given to the difficulties posed by a program consisting of many science packages that represent subsystems of a complicated, coupled system. Methods have been found to maintain independence of the packages by hiding data structures without increasing the communication costs in a parallel computing environment. Concepts developed in this work are demonstrated by a prototype program that uses library routines from two existing software systems, Basis and Parallel Virtual Machine (PVM). Most of the details of these libraries have been encapsulated in routines and macros that could be rewritten for alternative libraries that possess certain minimum capabilities. The prototype software uses a flexible master-and-slaves paradigm for parallel computation and supports domain decomposition with message passing for partitioning work among slaves. Facilities are provided for accessing variables that are distributed among the memories of slaves assigned to subdomains. The software is named PROTOPAR.
Development and Applications of a Modular Parallel Process for Large Scale Fluid/Structures Problems
NASA Technical Reports Server (NTRS)
Guruswamy, Guru P.; Kwak, Dochan (Technical Monitor)
2002-01-01
A modular process that can efficiently solve large-scale multidisciplinary problems using massively parallel supercomputers is presented. The process integrates disciplines with diverse physical characteristics while retaining the efficiency of individual disciplines. Computational domain independence of individual disciplines is maintained using a meta-programming approach. The process integrates disciplines without affecting the combined performance. Results are demonstrated for large-scale aerospace problems on several supercomputers. The high scalability and portability of the approach are demonstrated on several parallel computers.
Colorado Children's Budget 2011
ERIC Educational Resources Information Center
Colorado Children's Campaign, 2011
2011-01-01
"Colorado Children's Budget 2011" tallies up Colorado's public investments during FY 2007-08 through FY 2011-12 for programs and services that enhance the well-being of children across four domains--Early Childhood, K-12 Education, Health, and Other Supports. It is intended to be a resource guide for policymakers and advocates who are…
Application of Component Scoring to a Complicated Cognitive Domain.
ERIC Educational Resources Information Center
Tatsuoka, Kikumi K.; Yamamoto, Kentaro
This study used the Montague-Riley Test to introduce a new scoring procedure that revealed errors in cognitive processes occurring at subcomponents of an electricity problem. The test, consisting of four parts with 36 open-ended problems each, was administered to 250 high school students. A computer program, ELTEST, was written applying a…
NASA Technical Reports Server (NTRS)
Farassat, F.; Succi, G. P.
1980-01-01
A review of propeller noise prediction technology is presented, highlighting developments in the field from Gutin's first successful attempt to current sophisticated techniques. Two methods for the prediction of discrete-frequency noise from conventional and advanced propellers in forward flight are described. These methods, developed at MIT and NASA Langley Research Center, are based on different time-domain formulations. Brief descriptions of the computer algorithms based on these formulations are given. The output of these two programs, which is the acoustic pressure signature, is Fourier-analyzed to obtain the acoustic pressure spectrum. The main difference between the programs as they are coded now is that the Langley program can handle propellers with supersonic tip speeds, while the MIT program is limited to subsonic tip speeds. Comparisons of the calculated and measured acoustic data for a conventional and an advanced propeller show good agreement in general.
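The Fourier-analysis step mentioned above amounts to transforming the computed time-domain pressure signature into a spectrum. A generic sketch follows (not the MIT or Langley code; the sampling rate and tone frequencies are invented for the example):

```python
import numpy as np

fs = 20_000.0                    # sampling rate, Hz (assumed)
t = np.arange(0, 0.1, 1 / fs)    # 0.1 s of acoustic pressure signature
# Synthetic signature: a blade-passage tone at 120 Hz plus one harmonic.
p = np.sin(2 * np.pi * 120 * t) + 0.3 * np.sin(2 * np.pi * 240 * t)

spectrum = np.abs(np.fft.rfft(p)) / len(p)   # one-sided amplitude spectrum
freqs = np.fft.rfftfreq(len(p), d=1 / fs)
print(f"dominant tone: {freqs[np.argmax(spectrum)]:.0f} Hz")  # -> 120 Hz
```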
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arafat, Humayun; Dinan, James; Krishnamoorthy, Sriram
Task parallelism is an attractive approach to automatically load balance the computation in a parallel system and adapt to dynamism exhibited by parallel systems. Exploiting task parallelism through work stealing has been extensively studied in shared and distributed-memory contexts. In this paper, we study the design of a system that uses work stealing for dynamic load balancing of task-parallel programs executed on hybrid distributed-memory CPU-graphics processing unit (GPU) systems in a global-address space framework. We take into account the unique nature of the accelerator model employed by GPUs, the significant performance difference between GPU and CPU execution as a function of problem size, and the distinct CPU and GPU memory domains. We consider various alternatives in designing a distributed work stealing algorithm for CPU-GPU systems, while taking into account the impact of task distribution and data movement overheads. These strategies are evaluated using microbenchmarks that capture various execution configurations as well as the state-of-the-art CCSD(T) application module from the computational chemistry domain.
Work stealing for GPU-accelerated parallel programs in a global address space framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arafat, Humayun; Dinan, James; Krishnamoorthy, Sriram
Task parallelism is an attractive approach to automatically load balance the computation in a parallel system and adapt to dynamism exhibited by parallel systems. Exploiting task parallelism through work stealing has been extensively studied in shared and distributed-memory contexts. In this paper, we study the design of a system that uses work stealing for dynamic load balancing of task-parallel programs executed on hybrid distributed-memory CPU-graphics processing unit (GPU) systems in a global-address space framework. We take into account the unique nature of the accelerator model employed by GPUs, the significant performance difference between GPU and CPU execution as a function of problem size, and the distinct CPU and GPU memory domains. We consider various alternatives in designing a distributed work stealing algorithm for CPU-GPU systems, while taking into account the impact of task distribution and data movement overheads. These strategies are evaluated using microbenchmarks that capture various execution configurations as well as the state-of-the-art CCSD(T) application module from the computational chemistry domain.
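The core idea of work stealing (each worker pops its own tasks from one end of a double-ended queue while idle workers steal from the other end) can be shown with a toy shared-memory scheduler. This sketch only illustrates the concept; the system described above is distributed-memory, GPU-aware, and global-address-space based, which this is not.

```python
import collections, random, threading

NUM_WORKERS = 4
deques = [collections.deque(range(i * 25, (i + 1) * 25)) for i in range(NUM_WORKERS)]
locks = [threading.Lock() for _ in range(NUM_WORKERS)]
done = [0] * NUM_WORKERS

def worker(me):
    while True:
        task = None
        with locks[me]:
            if deques[me]:
                task = deques[me].pop()              # own work: LIFO end
        if task is None:
            victim = random.randrange(NUM_WORKERS)   # choose a victim
            with locks[victim]:
                if deques[victim]:
                    task = deques[victim].popleft()  # steal: FIFO end
        if task is None:
            if not any(deques):   # crude termination test; no new tasks arrive
                return
            continue
        done[me] += 1             # "execute" the task

threads = [threading.Thread(target=worker, args=(i,)) for i in range(NUM_WORKERS)]
for th in threads:
    th.start()
for th in threads:
    th.join()
print("tasks executed per worker:", done)
```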
Vertical Interaction in Open Software Engineering Communities
2009-03-01
Program in CASOS (NSF, DGE-9972762), the Office of Naval Research under the Dynamic Network Analysis program (N00014-02-1-0973), the Air Force Office of...W91WAW07C0063) for research in the area of dynamic network analysis. Additional support was provided by CASOS, the center for Computational Analysis of Social...methods across the domain. For a given project, developers can choose from dozens of models, tools, platforms, and languages for specification, design
Poster Project to Emphasize Public Health in the Pharmacy Curriculum
Werremeyer, Amy B.
2011-01-01
Objective: To implement and assess a required public health poster project in a doctor of pharmacy (PharmD) program. Design: Third-year PharmD students collaborated in pairs to research a public health topic relating to pharmacy practice. Each student group prepared an informational poster, while receiving feedback from a faculty mentor at each stage of the project. The students presented their completed posters at a statewide pharmacy conference. Assessment: Faculty members evaluated the posters with a grading rubric, and students completed a survey instrument that assessed the overall experience. In general, faculty members rated the class highly across all domains of the grading rubric. The class generally agreed that the poster project increased their awareness of public health issues related to pharmacy practice, overall knowledge of public health, and presentation skills. Conclusion: The implementation of a poster project was well received by students and faculty members as an effective method for enhancing public health instruction in the PharmD program at North Dakota State University. PMID:21451754
Poster project to emphasize public health in the pharmacy curriculum.
Kelsch, Michael P; Werremeyer, Amy B
2011-02-10
To implement and assess a required public health poster project in a doctor of pharmacy (PharmD) program. Third-year PharmD students collaborated in pairs to research a public health topic relating to pharmacy practice. Each student group prepared an informational poster, while receiving feedback from a faculty mentor at each stage of the project. The students presented their completed posters at a statewide pharmacy conference. Faculty members evaluated the posters with a grading rubric, and students completed a survey instrument that assessed the overall experience. In general, faculty members rated the class highly across all domains of the grading rubric. The class generally agreed that the poster project increased their awareness of public health issues related to pharmacy practice, overall knowledge of public health, and presentation skills. The implementation of a poster project was well received by students and faculty members as an effective method for enhancing public health instruction in the PharmD program at North Dakota State University.
IEDA: Making Small Data BIG Through Interdisciplinary Partnerships Among Long-tail Domains
NASA Astrophysics Data System (ADS)
Lehnert, K. A.; Carbotte, S. M.; Arko, R. A.; Ferrini, V. L.; Hsu, L.; Song, L.; Ghiorso, M. S.; Walker, D. J.
2014-12-01
The Big Data world in the Earth Sciences so far exists primarily for disciplines that generate massive volumes of observational or computed data using large-scale, shared instrumentation such as global sensor networks, satellites, or high-performance computing facilities. These data are typically managed and curated by well-supported community data facilities that also provide the tools for exploring the data through visualization or statistical analysis. In many other domains, especially those where data are primarily acquired by individual investigators or small teams (known as 'Long-tail data'), data are poorly shared and integrated, lacking a community-based data infrastructure that ensures persistent access, quality control, standardization, and integration of data, as well as appropriate tools to fully explore and mine the data within the context of broader Earth Science datasets. IEDA (Integrated Earth Data Applications, www.iedadata.org) is a data facility funded by the US NSF to develop and operate data services that support data stewardship throughout the full life cycle of observational data in the solid earth sciences, with a focus on the data management needs of individual researchers. IEDA builds on a strong foundation of mature disciplinary data systems for marine geology and geophysics, geochemistry, and geochronology. These systems have dramatically advanced data resources in those long-tail Earth science domains. IEDA has strengthened these resources by establishing a consolidated, enterprise-grade infrastructure that is shared by the domain-specific data systems, and implementing joint data curation and data publication services that follow community standards. In recent years, other domain-specific data efforts have partnered with IEDA to take advantage of this infrastructure and improve data services to their respective communities with formal data publication, long-term preservation of data holdings, and better sustainability. IEDA hopes to foster such partnerships with streamlined data services, including user-friendly, single-point interfaces for data submission, discovery, and access across the partner systems to support interdisciplinary science.
A verification library for multibody simulation software
NASA Technical Reports Server (NTRS)
Kim, Sung-Soo; Haug, Edward J.; Frisch, Harold P.
1989-01-01
A multibody dynamics verification library that maintains and manages test and validation data is proposed, based on RRC robot arm and CASE backhoe validations and a comparative study of DADS, DISCOS, and CONTOPS, which are existing public-domain and commercial multibody dynamics simulation programs. Using simple representative problems, simulation results from each program are cross-checked, and the validation results are presented. Functionalities of the verification library are defined in order to automate the validation procedure.
Dynamics of domain coverage of the protein sequence universe.
Rekapalli, Bhanu; Wuichet, Kristin; Peterson, Gregory D; Zhulin, Igor B
2012-11-16
The currently known protein sequence space consists of millions of sequences in public databases and is rapidly expanding. Assigning sequences to families leads to a better understanding of protein function and the nature of the protein universe. However, a large portion of the current protein space remains unassigned and is referred to as its "dark matter". Here we suggest that the true size of the "dark matter" is much larger than stated by current definitions. We propose an approach to reducing the size of the "dark matter" by identifying and subtracting regions in protein sequences that are not likely to contain any domain. Recent improvements in computational domain modeling result in a slow decrease in the relative size of the "dark matter"; however, its absolute size increases substantially with the growth of sequence data.
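The subtraction the authors propose reduces to interval bookkeeping: merge the domain hits on each sequence and count the residues left uncovered. A minimal sketch follows (the interval data are invented; real inputs would come from domain-model scans):

```python
def uncovered(length, hits):
    """Residues not covered by any domain hit; hits is a list of
    1-based inclusive (start, end) intervals."""
    merged = []
    for s, e in sorted(hits):
        if merged and s <= merged[-1][1] + 1:
            merged[-1] = (merged[-1][0], max(merged[-1][1], e))  # overlap: extend
        else:
            merged.append((s, e))
    covered = sum(e - s + 1 for s, e in merged)
    return length - covered

print(uncovered(400, [(10, 120), (100, 180), (250, 300)]))  # -> 178
```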
Developing parenting programs to prevent child health risk behaviors: a practice model
Jackson, Christine; Dickinson, Denise M.
2009-01-01
Research indicates that developing public health programs to modify parenting behaviors could lead to multiple beneficial health outcomes for children. Developing feasible effective parenting programs requires an approach that applies a theory-based model of parenting to a specific domain of child health and engages participant representatives in intervention development. This article describes this approach to intervention development in detail. Our presentation emphasizes three points that provide key insights into the goals and procedures of parenting program development. These are a generalized theoretical model of parenting derived from the child development literature, an established eight-step parenting intervention development process and an approach to integrating experiential learning methods into interventions for parents and children. By disseminating this framework for a systematic theory-based approach to developing parenting programs, we aim to support the program development efforts of public health researchers and practitioners who recognize the potential of parenting programs to achieve primary prevention of health risk behaviors in children. PMID:19661165
Summary of Research 1997, Department of Computer Science.
1999-01-01
Contains summaries of research projects in the Department of Computer Science. A list of recent publications is also included which consists of conference...parallel programming. Recently, in a joint research project between NPS and the Russian Academy of Sciences Systems Programming Institute in Moscow
Fuad, Anis; Sanjaya, Guardian Yoki; Lazuardi, Lutfan; Rahmanti, Annisa Ristya; Hsu, Chien-Yeh
2013-01-01
Public health informatics has been defined as the systematic application of information and computer science and technology to public health practice, research, and learning [1]. Unfortunately, limited reports exist concerning capacity-building strategies to improve the public health informatics workforce in limited-resource settings. In Indonesia, only three universities, including Universitas Gadjah Mada (UGM), offer a master degree program in a related public health informatics discipline. UGM started a new dedicated master program on Health Management Information Systems in 2005, under the auspices of the Graduate Program of Public Health at the Faculty of Medicine. This is the first tracer study of the alumni, aiming to (a) identify the gaps between the curriculum and their current jobs and (b) describe their perceptions of public health informatics competencies. We distributed questionnaires to 114 alumni with a 36.84% response rate. Despite the low response rate, this study provides valuable resources for setting up appropriate competencies, curricula, and capacity-building strategies for the public health informatics workforce in Indonesia.
Teen Area, Solon Branch, Cuyahoga County Public Library, Solon, Ohio.
ERIC Educational Resources Information Center
Voice of Youth Advocates, 2003
2003-01-01
Describes the teen area of the Solon Branch library in Cuyahoga County (Ohio). Highlights include the collection; catalog computers; hours and teen traffic; planning the space, including extra display units of bulletin boards; teen programming, including monthly programs and a summer reading program; and the teen advisory group. (LRW)
Hahn, P; Dullweber, F; Unglaub, F; Spies, C K
2014-06-01
Searching for relevant publications is becoming more difficult with the increasing number of scientific articles. Text mining, as a specific form of computer-based data analysis, may be helpful in this context. Highlighting relations between authors and finding relevant publications concerning a specific subject using text-analysis programs are illustrated graphically by two worked examples. © Georg Thieme Verlag KG Stuttgart · New York.
Charon Toolkit for Parallel, Implicit Structured-Grid Computations: Functional Design
NASA Technical Reports Server (NTRS)
VanderWijngaart, Rob F.; Kutler, Paul (Technical Monitor)
1997-01-01
In a previous report the design concepts of Charon were presented. Charon is a toolkit that aids engineers in developing scientific programs for structured-grid applications to be run on MIMD parallel computers. It constitutes an augmentation of the general-purpose MPI-based message-passing layer, and provides the user with a hierarchy of tools for rapid prototyping and validation of parallel programs, and subsequent piecemeal performance tuning. Here we describe the implementation of the domain decomposition tools used for creating data distributions across sets of processors. We also present the hierarchy of parallelization tools that allows smooth translation of legacy code (or a serial design) into a parallel program. Along with the actual tool descriptions, we will present the considerations that led to the particular design choices. Many of these are motivated by the requirement that Charon must be useful within the traditional computational environments of Fortran 77 and C. Only the Fortran 77 syntax will be presented in this report.
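The basic operation behind such data distributions is computing which slice of the global grid each processor owns. A small stand-in illustration follows (our own; Charon's actual Fortran 77/C interface is not reproduced here):

```python
def block_extents(n_global, n_procs, rank):
    """1-D block decomposition: the half-open index range [lo, hi) owned
    by `rank`, with remainder cells spread over the first processors."""
    base, extra = divmod(n_global, n_procs)
    lo = rank * base + min(rank, extra)
    hi = lo + base + (1 if rank < extra else 0)
    return lo, hi

# A structured 2-D grid distributes as the product of two 1-D decompositions.
for r in range(4):
    print(r, block_extents(103, 4, r))   # (0,26) (26,52) (52,78) (78,103)
```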
Adaptation of a Control Center Development Environment for Industrial Process Control
NASA Technical Reports Server (NTRS)
Killough, Ronnie L.; Malik, James M.
1994-01-01
In the control center, raw telemetry data is received for storage, display, and analysis. This raw data must be combined and manipulated in various ways by mathematical computations to facilitate analysis, provide diversified fault detection mechanisms, and enhance display readability. A development tool called the Graphical Computation Builder (GCB) has been implemented which provides flight controllers with the capability to implement computations for use in the control center. The GCB provides a language that contains both general programming constructs and language elements specifically tailored for the control center environment. The GCB concept allows staff who are not skilled in computer programming to author and maintain computer programs. The GCB user is isolated from the details of external subsystem interfaces and has access to high-level functions such as matrix operators, trigonometric functions, and unit conversion macros. The GCB provides a high level of feedback during computation development that improves upon the often cryptic errors produced by computer language compilers. An equivalent need can be identified in the industrial data acquisition and process control domain: that of an integrated graphical development tool tailored to the application to hide the operating system, computer language, and data acquisition interface details. The GCB features a modular design which makes it suitable for technology transfer without significant rework. Control center-specific language elements can be replaced by elements specific to industrial process control.
Delaine, Khaya
2011-01-01
Background: Despite recent recognition of the need for preventive sexual health materials for people with intellectual disability (ID), there have been remarkably few health-based interventions designed for people with mild to moderate ID. The purpose of this study was to evaluate the effects of a computer-based interactive multimedia (CBIM) program to teach HIV/AIDS knowledge, skills, and decision-making. Methods: Twenty-five women with mild to moderate intellectual disability evaluated the program. The study used a quasi-experimental within-subjects design to assess the efficacy of the CBIM program. Research participants completed five qualitative and quantitative instruments that assessed HIV knowledge and decision-making skills regarding HIV prevention practices and condom application skills (i.e., demonstration of skills opening a condom and putting it on a model penis). In addition, 18 service providers who work with women with ID reviewed the program and completed a demographics questionnaire and a professional customer satisfaction survey. Results: Women with ID showed statistically significant increases from pretest to posttest in all knowledge and skill domains. Furthermore, the statistical gains were accompanied by medium to large effect sizes. Overall, service providers rated the program highly on several outcome measures (stimulation, relevance, and usability). Conclusions: The results of this study indicate the CBIM program was effective in increasing HIV/AIDS knowledge and skills among women with ID, who live both semi-independently and independently, in a single-session intervention. Since the CBIM program is not dependent on staff for instructional delivery, it is a highly efficient teaching tool, and CBIM is an efficacious means to provide behavioral health content, compensating for the dearth of available health promotion materials for people with ID. As such, it has potential for broad distribution and implementation by medical practitioners and public health offices. People with ID are part of our society, yet continue to be overlooked, particularly in the area of health promotion. Special tools need to be developed in order to address the health disparities experienced by people with ID. PMID:21917052
A Joint Method of Envelope Inversion Combined with Hybrid-domain Full Waveform Inversion
NASA Astrophysics Data System (ADS)
CUI, C.; Hou, W.
2017-12-01
Full waveform inversion (FWI) aims to construct high-precision subsurface models by fully using the information in seismic records, including amplitude, travel time, phase, and so on. However, high non-linearity and the absence of low-frequency information in seismic data lead to the well-known cycle-skipping problem and make the inversion fall easily into local minima. In addition, 3D inversion methods based on the acoustic approximation ignore the elastic effects present in real seismic fields, making inversion harder. As a result, the accuracy of the final inversion result relies heavily on the quality of the initial model. To improve the stability and quality of inversion results, multi-scale inversion, which reconstructs the subsurface model from low to high frequencies, is applied; but the absence of very low frequencies (below 3 Hz) in field data is still a bottleneck in FWI. By extracting ultra-low-frequency data from field data, envelope inversion is able to recover a low-wavenumber model with a demodulation operator (envelope operator), even though such low-frequency data do not really exist in the field records. To improve the efficiency and viability of the inversion, in this study we propose a joint method of envelope inversion combined with hybrid-domain FWI. First, we developed 3D elastic envelope inversion and derived the misfit function and the corresponding gradient operator. Then we performed hybrid-domain FWI with the envelope inversion result as the initial model, which provides the low-wavenumber component of the model. Here, forward modeling is implemented in the time domain and inversion in the frequency domain. To accelerate the inversion, we adopt CPU/GPU heterogeneous computing techniques with two levels of parallelism. At the first level, the inversion tasks are decomposed and assigned to each computation node by shot number. At the second level, GPU multithreaded programming is used for the computation tasks in each node, including forward modeling, envelope extraction, DFT (discrete Fourier transform) calculation, and gradient calculation. Numerical tests demonstrated that the combined envelope inversion + hybrid-domain FWI obtains a much more faithful and accurate result than conventional hybrid-domain FWI, and the CPU/GPU heterogeneous parallel computation significantly improves performance.
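The demodulation (envelope) operator at the heart of envelope inversion can be sketched with a Hilbert transform, together with a simple L2 envelope misfit. This mirrors the idea only, not the authors' 3D elastic implementation; the traces below are synthetic.

```python
import numpy as np
from scipy.signal import hilbert

def envelope(trace):
    """Instantaneous amplitude (envelope) of a trace via the analytic signal."""
    return np.abs(hilbert(trace))

def envelope_misfit(synthetic, observed):
    # The envelope varies slowly, so its misfit retains ultra-low-frequency
    # model information that the raw oscillatory traces lack.
    r = envelope(synthetic) - envelope(observed)
    return 0.5 * float(r @ r)

t = np.linspace(0.0, 1.0, 1000)
obs = np.sin(2 * np.pi * 30 * t) * np.exp(-5 * t)       # synthetic "observed"
syn = np.sin(2 * np.pi * 30 * t + 0.5) * np.exp(-4 * t)
print(envelope_misfit(syn, obs))
```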
Synthesizing 3D Surfaces from Parameterized Strip Charts
NASA Technical Reports Server (NTRS)
Robinson, Peter I.; Gomez, Julian; Morehouse, Michael; Gawdiak, Yuri
2004-01-01
We believe 3D information visualization has the power to unlock new levels of productivity in the monitoring and control of complex processes. Our goal is to provide visual methods that allow rapid human insight into systems consisting of thousands to millions of parameters. We explore this hypothesis in two complex domains: NASA program management and NASA International Space Station (ISS) spacecraft computer operations. We seek to extend a common form of visualization called the strip chart from 2D to 3D. A strip chart can display the time-series progression of a parameter and allows trends and events to be identified. Strip charts can be overlaid when multiple parameters need to be visualized in order to correlate their events. When many parameters are involved, the direct overlaying of strip charts can become confusing and may not fully utilize the graphing area to convey the relationships between the parameters. We provide a solution to this problem by generating 3D surfaces from parameterized strip charts. The 3D surface utilizes significantly more screen area to illustrate the differences between the parameters and the overlaid strip charts, and it can rapidly be scanned by humans to gain insight. The selection of the third dimension must be a parallel or parameterized homogeneous resource in the target domain, defined using a finite, ordered, enumerated type, and not a heterogeneous type. We demonstrate our concepts with examples from the NASA program management domain (assessing the state of many plans) and the computers of the ISS (assessing the state of many computers). We identify 2D strip charts in each domain and show how to construct the corresponding 3D surfaces. The user can navigate the surface, zooming in on regions of interest, setting a mark and drilling down to source documents from which the data points have been derived. We close by discussing design issues, related work, and implementation challenges.
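Stacking the parameterized strip charts row by row yields the surface directly. A minimal illustration of the construction (our own; the data and the enumerated parameter are invented):

```python
import numpy as np
import matplotlib.pyplot as plt

t = np.linspace(0, 10, 200)   # time axis shared by all strip charts
params = np.arange(8)         # the finite, ordered, enumerated third dimension
# One strip chart per parameter, e.g. load on 8 identical ISS computers.
Z = np.array([np.sin(t + 0.4 * p) + 0.1 * p for p in params])

T, P = np.meshgrid(t, params)
ax = plt.figure().add_subplot(projection="3d")
ax.plot_surface(T, P, Z, cmap="viridis")
ax.set_xlabel("time"); ax.set_ylabel("parameter"); ax.set_zlabel("value")
plt.show()
```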
Simonyan, Vahan; Chumakov, Konstantin; Dingerdissen, Hayley; Faison, William; Goldweber, Scott; Golikov, Anton; Gulzar, Naila; Karagiannis, Konstantinos; Vinh Nguyen Lam, Phuc; Maudru, Thomas; Muravitskaja, Olesja; Osipova, Ekaterina; Pan, Yang; Pschenichnov, Alexey; Rostovtsev, Alexandre; Santana-Quintero, Luis; Smith, Krista; Thompson, Elaine E.; Tkachenko, Valery; Torcivia-Rodriguez, John; Wan, Quan; Wang, Jing; Wu, Tsung-Jung; Wilson, Carolyn; Mazumder, Raja
2016-01-01
The High-performance Integrated Virtual Environment (HIVE) is a distributed storage and compute environment designed primarily to handle next-generation sequencing (NGS) data. This multicomponent cloud infrastructure provides secure web access for authorized users to deposit, retrieve, annotate and compute on NGS data, and to analyse the outcomes using web interface visual environments appropriately built in collaboration with research and regulatory scientists and other end users. Unlike many massively parallel computing environments, HIVE uses a cloud control server which virtualizes services, not processes. It is both very robust and flexible due to the abstraction layer introduced between computational requests and operating system processes. The novel paradigm of moving computations to the data, instead of moving data to computational nodes, has proven to be significantly less taxing for both hardware and network infrastructure. The honeycomb data model developed for HIVE integrates metadata into an object-oriented model. Its distinction from other object-oriented databases is in the additional implementation of a unified application program interface to search, view and manipulate data of all types. This model simplifies the introduction of new data types, thereby minimizing the need for database restructuring and streamlining the development of new integrated information systems. The honeycomb model employs a highly secure hierarchical access control and permission system, allowing determination of data access privileges in a finely granular manner without flooding the security subsystem with a multiplicity of rules. HIVE infrastructure will allow engineers and scientists to perform NGS analysis in a manner that is both efficient and secure. HIVE is actively supported in public and private domains, and project collaborations are welcomed. Database URL: https://hive.biochemistry.gwu.edu PMID:26989153
Simonyan, Vahan; Chumakov, Konstantin; Dingerdissen, Hayley; Faison, William; Goldweber, Scott; Golikov, Anton; Gulzar, Naila; Karagiannis, Konstantinos; Vinh Nguyen Lam, Phuc; Maudru, Thomas; Muravitskaja, Olesja; Osipova, Ekaterina; Pan, Yang; Pschenichnov, Alexey; Rostovtsev, Alexandre; Santana-Quintero, Luis; Smith, Krista; Thompson, Elaine E; Tkachenko, Valery; Torcivia-Rodriguez, John; Voskanian, Alin; Wan, Quan; Wang, Jing; Wu, Tsung-Jung; Wilson, Carolyn; Mazumder, Raja
2016-01-01
The High-performance Integrated Virtual Environment (HIVE) is a distributed storage and compute environment designed primarily to handle next-generation sequencing (NGS) data. This multicomponent cloud infrastructure provides secure web access for authorized users to deposit, retrieve, annotate and compute on NGS data, and to analyse the outcomes using web interface visual environments appropriately built in collaboration with research and regulatory scientists and other end users. Unlike many massively parallel computing environments, HIVE uses a cloud control server which virtualizes services, not processes. It is both very robust and flexible due to the abstraction layer introduced between computational requests and operating system processes. The novel paradigm of moving computations to the data, instead of moving data to computational nodes, has proven to be significantly less taxing for both hardware and network infrastructure. The honeycomb data model developed for HIVE integrates metadata into an object-oriented model. Its distinction from other object-oriented databases is in the additional implementation of a unified application program interface to search, view and manipulate data of all types. This model simplifies the introduction of new data types, thereby minimizing the need for database restructuring and streamlining the development of new integrated information systems. The honeycomb model employs a highly secure hierarchical access control and permission system, allowing determination of data access privileges in a finely granular manner without flooding the security subsystem with a multiplicity of rules. HIVE infrastructure will allow engineers and scientists to perform NGS analysis in a manner that is both efficient and secure. HIVE is actively supported in public and private domains, and project collaborations are welcomed. Database URL: https://hive.biochemistry.gwu.edu. © The Author(s) 2016. Published by Oxford University Press.
NASA Technical Reports Server (NTRS)
Brown, Robert L.; Doyle, Dee; Haines, Richard F.; Slocum, Michael
1989-01-01
As part of the Telescience Testbed Pilot Program, the Universities Space Research Association/Research Institute for Advanced Computer Science (USRA/RIACS) proposed to support remote communication by providing a network of human/machine interfaces, computer resources, and experimental equipment which allows remote science, collaboration, technical exchange, and multimedia communication. The telescience workstation is intended to provide a local computing environment for telescience. The purposes of the program are as follows: (1) to provide a suitable environment to integrate existing and new software for a telescience workstation; (2) to provide a suitable environment to develop new software in support of telescience activities; (3) to provide an interoperable environment so that a wide variety of workstations may be used in the telescience program; (4) to provide a supportive infrastructure and a common software base; and (5) to advance, apply, and evaluate the telescience technology base. A prototype telescience computing environment, designed to bring practicing scientists in domains other than computer science into a modern style of doing their computing, was created and deployed. This environment, the Telescience Windowing Environment, Phase 1 (TeleWEn-1), met some, but not all, of the goals stated above. The TeleWEn-1 provided a window-based workstation environment and a set of tools for text editing, document preparation, electronic mail, multimedia mail, raster manipulation, and system management.
Design of a real-time wind turbine simulator using a custom parallel architecture
NASA Technical Reports Server (NTRS)
Hoffman, John A.; Gluck, R.; Sridhar, S.
1995-01-01
The design of a new parallel-processing digital simulator is described. The new simulator has been developed specifically for the analysis of wind energy systems in real time. The new processor has been named the Wind Energy System Time-domain simulator, version 3 (WEST-3). Like previous WEST versions, WEST-3 performs many computations in parallel; the modules in WEST-3 are pure digital processors, however. These digital processors can be programmed individually and operated in concert to achieve real-time simulation of wind turbine systems. Because of this programmability, WEST-3 is very much more flexible and general than its two predecessors. The design features of WEST-3 are described to show how the system produces high-speed solutions of nonlinear time-domain equations. WEST-3 has two very fast Computational Units (CUs) that use minicomputer technology plus special architectural features that make them many times faster than a microcomputer. These CUs are needed to perform the complex computations associated with the wind turbine rotor system in real time. The parallel architecture of the CU allows several tasks to be done in each cycle, including an I/O operation and the combination of a multiply, add, and store. The WEST-3 simulator can be expanded at any time for additional computational power. This is possible because the CUs are interfaced to each other and to other portions of the simulator using special serial buses. These buses can be 'patched' together in essentially any configuration (in a manner very similar to the programming methods used in analog computation) to balance the input/output requirements. CUs can be added in any number to share a given computational load. This flexible bus feature is very different from that of many other parallel processors, which usually have a throughput limit because of rigid bus architecture.
A users' guide to the trace contaminant control simulation computer program
NASA Technical Reports Server (NTRS)
Perry, J. L.
1994-01-01
The Trace Contaminant Control Simulation computer program is a tool for assessing the performance of various trace contaminant control technologies for removing trace chemical contamination from a spacecraft cabin atmosphere. The results obtained from the program can be useful in assessing different technology combinations, system sizing, system location with respect to other life support systems, and the overall life cycle economics of a trace contaminant control system. The user's manual is extracted in its entirety from NASA TM-108409 to provide a stand-alone reference for using any version of the program. The first publication of the manual as part of TM-108409 also included a detailed listing of version 8.0 of the program. As changes to the code were necessary, it became apparent that the user's manual should be separate from the computer code documentation and be general enough to provide guidance in using any version of the program. Provided in the guide are tips for input file preparation, general program execution, and output file manipulation. Information concerning source code listings of the latest version of the computer program may be obtained by contacting the author.
Tracking Women and Minorities as They Attain Degrees in Computing and Related Fields
ERIC Educational Resources Information Center
Sorkin, Sylvia; Gore, Mary Elizabeth; Mento, Barbara; Stanton, Jon
2010-01-01
Two Maryland colleges (one a four-year liberal arts college for women, and one a public community college) have worked to increase the number of graduates, especially women and other under-represented groups, in their computer science, computer information systems, engineering, and mathematics programs over a four-year period. In August 2004, they…
The Nature-Computer Camp. Final Evaluation Report, 1984-1985. E.C.I.A. Chapter 2.
ERIC Educational Resources Information Center
District of Columbia Public Schools, Washington, DC. Div. of Quality Assurance.
This report presents a description and evaluation of the Nature-Computer Camp (NCC), an environmental and computer science program designed for sixth grade students in the District of Columbia public schools. Inputs, processes and outcomes based on a Planning, Monitoring and Implementing (PMI) Evaluation Model are reviewed for each of the four…
Nature-Computer Camp. Final Evaluation Report. E.C.I.A. Chapter 2.
ERIC Educational Resources Information Center
District of Columbia Public Schools, Washington, DC. Div. of Quality Assurance.
This report presents a description and evaluation of the Nature-Computer Camp (NCC), an environmental and computer science program designed for sixth grade students in the District of Columbia public schools. Inputs, processes and outcomes based on a Planning, Monitoring and Implementing (PMI) Evaluation Model are reviewed for each of the four…
ERIC Educational Resources Information Center
Ryoo, Jean J.; Margolis, Jane; Lee, Clifford H.; Sandoval, Cueponcaxochitl D. M.; Goode, Joanna
2013-01-01
Despite the fact that computer science (CS) is the driver of technological innovations across all disciplines and aspects of our lives, including participatory media, high school CS too commonly fails to incorporate the perspectives and concerns of low-income students of color. This article describes a partnership program -- Exploring Computer…
ERIC Educational Resources Information Center
Goosen, Richard F.
2009-01-01
This study provides information for higher education leaders that have or are considering conducting Computer Aided Design (CAD) instruction using student owned notebook computers. Survey data were collected during the first 8 years of a pilot program requiring engineering technology students at a four year public university to acquire a notebook…
Parallel computation using boundary elements in solid mechanics
NASA Technical Reports Server (NTRS)
Chien, L. S.; Sun, C. T.
1990-01-01
The inherent parallelism of the boundary element method is shown. The boundary element is formulated by assuming linear variation of displacements and tractions within a line element. Moreover, the MACSYMA symbolic program is employed to obtain analytical results for the influence coefficients. Three computational components are parallelized in this method to show the speedup and efficiency of the computation. The global coefficient matrix is first formed concurrently. Then, a parallel Gaussian elimination scheme is applied to solve the resulting system of equations. Finally, and more importantly, the domain solutions of a given boundary value problem are calculated simultaneously. Linear speedups and high efficiencies are shown for solving a demonstration problem on the Sequent Symmetry S81 parallel computing system.
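Because each row of the influence-coefficient matrix depends only on the geometry at one collocation point, row blocks can be assembled concurrently, which is the first of the three parallel components above. A schematic sketch follows (the kernel is a stand-in, not a real elastostatic influence function):

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

nodes = np.linspace(0.0, 1.0, 64)   # boundary collocation points (toy geometry)

def row_block(bounds):
    lo, hi = bounds
    # Placeholder kernel: each row depends only on source/field point geometry.
    block = 1.0 / (1e-3 + np.abs(nodes[lo:hi, None] - nodes[None, :]))
    return lo, block

if __name__ == "__main__":
    n = len(nodes)
    ranges = [(i, min(i + 16, n)) for i in range(0, n, 16)]
    A = np.empty((n, n))
    with ProcessPoolExecutor() as pool:
        for lo, block in pool.map(row_block, ranges):
            A[lo:lo + block.shape[0]] = block
    print("assembled", A.shape, "coefficient matrix")
```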
Preliminary demonstration of a robust controller design method
NASA Technical Reports Server (NTRS)
Anderson, L. R.
1980-01-01
Alternative computational procedures for obtaining a feedback control law which yields a control signal based on measurable quantities are evaluated. The three methods evaluated are: (1) the standard linear quadratic regulator design model; (2) minimization of the norm of the feedback matrix K via nonlinear programming, subject to the constraint that the closed-loop eigenvalues lie in a specified domain in the complex plane; and (3) maximization of the angles between the closed-loop eigenvectors in combination with minimization of the norm of K, also via constrained nonlinear programming. The third, or robust, design method was chosen to yield a closed-loop system whose eigenvalues are insensitive to small changes in the A and B matrices. The relationship between orthogonality of closed-loop eigenvectors and the sensitivity of closed-loop eigenvalues is described. Computer programs are described.
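Method (2) can be posed as a penalized optimization: minimize the norm of K while pushing the closed-loop eigenvalues of A - BK into a prescribed half-plane. A small sketch with scipy.optimize follows (the plant matrices and margin are invented, and the report's own algorithms are not reproduced):

```python
import numpy as np
from scipy.optimize import minimize

A = np.array([[0.0, 1.0], [2.0, -1.0]])   # example unstable open-loop plant
B = np.array([[0.0], [1.0]])
margin = 0.5                               # require Re(lambda) <= -margin

def objective(k_flat):
    K = k_flat.reshape(1, 2)
    eig = np.linalg.eigvals(A - B @ K)
    violation = np.maximum(eig.real + margin, 0.0).sum()
    return np.linalg.norm(K) + 1e3 * violation   # penalty formulation

res = minimize(objective, x0=np.array([5.0, 5.0]), method="Nelder-Mead")
K = res.x.reshape(1, 2)
print("K =", K, "closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))
```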
Computer analysis of multicircuit shells of revolution by the field method
NASA Technical Reports Server (NTRS)
Cohen, G. A.
1975-01-01
The field method, presented previously for the solution of even-order linear boundary value problems defined on one-dimensional open branch domains, is extended to boundary value problems defined on one-dimensional domains containing circuits. This method converts the boundary value problem into two successive numerically stable initial value problems, which may be solved by standard forward integration techniques. In addition, a new method for the treatment of singular boundary conditions is presented. This method, which amounts to a partial interchange of the roles of force and displacement variables, is problem independent with respect to both accuracy and speed of execution. This method was implemented in a computer program to calculate the static response of ring stiffened orthotropic multicircuit shells of revolution to asymmetric loads. Solutions are presented for sample problems which illustrate the accuracy and efficiency of the method.
The Globus Galaxies Platform. Delivering Science Gateways as a Service
DOE Office of Scientific and Technical Information (OSTI.GOV)
Madduri, Ravi; Chard, Kyle; Chard, Ryan
The use of public cloud computers to host sophisticated scientific data and software can transform scientific practice by enabling broad access to capabilities previously available only to the few. The primary obstacle to more widespread use of public clouds to host scientific software ('cloud-based science gateways') has thus far been the considerable gap between the specialized needs of science applications and the capabilities provided by cloud infrastructures. We describe here a domain-independent, cloud-based science gateway platform, the Globus Galaxies platform, which overcomes this gap by providing a set of hosted services that directly address the needs of science gateway developers. The design and implementation of this platform leverage our several years of experience with Globus Genomics, a cloud-based science gateway that has served more than 200 genomics researchers across 30 institutions. Building on that foundation, we have also implemented a platform that leverages the popular Galaxy system for application hosting and workflow execution; Globus services for data transfer, user and group management, and authentication; and a cost-aware elastic provisioning model specialized for public cloud resources. We describe here the capabilities and architecture of this platform, present six scientific domains in which we have successfully applied it, report on user experiences, and analyze the economics of our deployments. Published 2015. This article is a U.S. Government work and is in the public domain in the USA.
ERIC Educational Resources Information Center
ERIC Review, 1993
1993-01-01
The "ERIC Review" is published three times a year and announces research results, publications, and new programs relevant to each issue's theme topic. This issue explores computer networking in elementary and secondary schools via two principal articles: "Plugging into the 'Net'" (Michael B. Eisenberg and Donald P. Ely); and…
NASA Astrophysics Data System (ADS)
Schmitz, Oliver; de Jong, Kor; Karssenberg, Derek
2017-04-01
There is an increasing demand to run environmental models at big scales: simulations over large areas at high resolution. The heterogeneity of available computing hardware, such as multi-core CPUs, GPUs, or supercomputers, potentially provides significant computing power to fulfil this demand. However, this requires detailed knowledge of the underlying hardware, parallel algorithm design, and the implementation thereof in an efficient system programming language. Domain scientists such as hydrologists or ecologists often lack this specific software engineering knowledge; their emphasis is (and should be) on exploratory building and analysis of simulation models. As a result, models constructed by domain specialists mostly do not take full advantage of the available hardware. A promising solution is to separate the model building activity from software engineering by offering domain specialists a model building framework with pre-programmed building blocks that they combine to construct a model. The model building framework, consequently, needs built-in capabilities to make full use of the available hardware. Developing such a framework, providing understandable code for domain scientists while being runtime efficient at the same time, poses several challenges for developers. For example, optimisations can be performed on individual operations or on the whole model, and tasks need to be generated for a well-balanced execution without explicitly knowing the complexity of the domain problem provided by the modeller. Ideally, a modelling framework supports the optimal use of available hardware whichever combination of model building blocks scientists use. We present our ongoing work on developing parallel algorithms for spatio-temporal modelling and demonstrate (1) PCRaster, an environmental software framework (http://www.pcraster.eu) providing spatio-temporal model building blocks, and (2) the parallelisation of about 50 of these building blocks using the new Fern library (https://github.com/geoneric/fern/), an independent generic raster processing library. Fern is a highly generic software library, and its algorithms can be configured according to the configuration of a modelling framework. With manageable programming effort (e.g. matching data types between the programming and domain languages) we created a binding between Fern and PCRaster. The resulting PCRaster Python multicore module can be used to execute existing PCRaster models without having to make any changes to the model code. We show initial results on synthetic and geoscientific models indicating significant runtime improvements provided by parallel local and focal operations. We further outline challenges in improving the remaining algorithms, such as flow operations over digital elevation maps, and further potential improvements, such as enhancing disk I/O.
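As a flavor of the parallel local and focal operations mentioned above, here is a NumPy stand-in (not PCRaster's or Fern's actual implementation) that computes a focal mean over row blocks of a raster concurrently:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def focal_mean_rows(raster, lo, hi, radius=1):
    """Moving-window mean for rows lo..hi, reading neighbours from the
    full array, so row blocks are independent tasks."""
    out = np.empty((hi - lo, raster.shape[1]))
    for i in range(lo, hi):
        r0, r1 = max(0, i - radius), min(raster.shape[0], i + radius + 1)
        for j in range(raster.shape[1]):
            c0, c1 = max(0, j - radius), min(raster.shape[1], j + radius + 1)
            out[i - lo, j] = raster[r0:r1, c0:c1].mean()
    return lo, out

raster = np.random.rand(200, 200)
result = np.empty_like(raster)
splits = [(i, min(i + 50, 200)) for i in range(0, 200, 50)]
with ThreadPoolExecutor() as pool:
    for lo, block in pool.map(lambda b: focal_mean_rows(raster, *b), splits):
        result[lo:lo + block.shape[0]] = block
print(result.mean())
```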
ERIC Educational Resources Information Center
Demetriadis, Stavros; Egerter, Tina; Hanisch, Frank; Fischer, Frank
2011-01-01
This study investigates the effectiveness of using peer review in the context of scripted collaboration to foster both domain-specific and domain-general knowledge acquisition in the computer science domain. Using a one-factor design with a script and a control condition, students worked in small groups on a series of computer science problems…
Real science at the petascale.
Saksena, Radhika S; Boghosian, Bruce; Fazendeiro, Luis; Kenway, Owain A; Manos, Steven; Mazzeo, Marco D; Sadiq, S Kashif; Suter, James L; Wright, David; Coveney, Peter V
2009-06-28
We describe computational science research that uses petascale resources to achieve scientific results at unprecedented scales and resolution. The applications span a wide range of domains, from investigation of fundamental problems in turbulence through computational materials science research to biomedical applications at the forefront of HIV/AIDS research and cerebrovascular haemodynamics. This work was mainly performed on the US TeraGrid 'petascale' resource, Ranger, at the Texas Advanced Computing Center, in the first half of 2008, when it was the largest computing system in the world available for open scientific research. We have sought to use this petascale supercomputer optimally across application domains and scales, exploiting the excellent parallel scaling performance found on up to at least 32,768 cores for certain of our codes in the so-called 'capability computing' category, as well as high-throughput intermediate-scale jobs for ensemble simulations in the 32-512 core range. Furthermore, this activity provides evidence that conventional parallel programming with MPI should be successful at the petascale in the short to medium term. We also report on the parallel performance of some of our codes on up to 65,536 cores on the IBM Blue Gene/P system at the Argonne Leadership Computing Facility, which has recently been named the fastest supercomputer in the world for open science.
Gardner, William; Morton, Suzanne; Byron, Sepheen C; Tinoco, Aldo; Canan, Benjamin D; Leonhart, Karen; Kong, Vivian; Scholle, Sarah Hudson
2014-01-01
Objective: To determine whether quality measures based on computer-extracted EHR data can reproduce findings based on data manually extracted by reviewers. Data Sources: We studied 12 measures of care indicated for adolescent well-care visits for 597 patients in three pediatric health systems. Study Design: Observational study. Data Collection/Extraction Methods: Manual reviewers collected quality data from the EHR. Site personnel programmed their EHR systems to extract the same data from structured fields in the EHR according to national health IT standards. Principal Findings: Overall performance measured via computer-extracted data was 21.9 percent, compared with 53.2 percent for manual data. Agreement measures were high for immunizations. Otherwise, agreement between computer extraction and manual review was modest (Kappa = 0.36) because computer-extracted data frequently missed care events (sensitivity = 39.5 percent). Measure validity varied by health care domain and setting. A limitation of our findings is that we studied only three domains and three sites. Conclusions: The accuracy of computer-extracted EHR quality reporting depends on the use of structured data fields, with the highest agreement found for measures and in the setting that had the greatest concentration of structured fields. We need to improve documentation of care, data extraction, and adaptation of EHR systems to practice workflow. PMID:24471935
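The agreement statistics quoted above can be reproduced from a 2x2 confusion matrix of computer-extracted versus manually reviewed care events. The counts in this sketch are invented placeholders, not the study's data:

```python
def kappa_and_sensitivity(tp, fp, fn, tn):
    """Cohen's kappa and sensitivity from a 2x2 confusion matrix, where the
    manual review is treated as the reference standard."""
    n = tp + fp + fn + tn
    p_obs = (tp + tn) / n
    # Agreement expected by chance if the two methods were independent.
    p_exp = ((tp + fp) / n) * ((tp + fn) / n) + ((fn + tn) / n) * ((fp + tn) / n)
    kappa = (p_obs - p_exp) / (1 - p_exp)
    sensitivity = tp / (tp + fn)   # share of manually found events detected
    return kappa, sensitivity

print(kappa_and_sensitivity(tp=120, fp=30, fn=180, tn=270))
```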
Haas, Magali; Stephenson, Diane; Romero, Klaus; Gordon, Mark Forrest; Zach, Neta; Geerts, Hugo
2016-09-01
Many disease-modifying clinical development programs in Alzheimer's disease (AD) have failed to date, and the development of new and advanced preclinical models that generate actionable knowledge is desperately needed. This review reports on computer-based modeling and simulation approaches as a powerful tool in AD research. Statistical data-analysis techniques can identify associations between certain data and phenotypes, such as diagnosis or disease progression. Other approaches integrate domain expertise in a formalized mathematical way to understand how specific components of pathology integrate into complex brain networks. Private-public partnerships focused on data sharing, causal inference and pathway-based analysis, crowdsourcing, and mechanism-based quantitative systems modeling represent successful real-world modeling examples with substantial impact on CNS diseases. Similar to other disease indications, successful real-world examples of advanced simulation can generate actionable support for drug discovery and development in AD, illustrating the value that can be generated for different stakeholders. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
William H. Butler; Ashley Monroe; Sarah McCaffrey
2015-01-01
The Collaborative Forest Landscape Restoration Program (CFLRP), established in 2009, encourages collaborative landscape scale ecosystem restoration efforts on United States Forest Service (USFS) lands. Although the USFS employees have experience engaging in collaborative planning, CFLRP requires collaboration in implementation, a domain where little prior experience...
The Role of Human Intelligence in Computer-Based Intelligent Tutoring Systems.
ERIC Educational Resources Information Center
Epstein, Kenneth; Hillegeist, Eleanor
An Intelligent Tutoring System (ITS) consists of an expert problem-solving program in a subject domain, a tutoring model capable of remediation or primary instruction, and an assessment model that monitors student understanding. The Geometry Proof Tutor (GPT) is an ITS that was developed at Carnegie Mellon University and field-tested in the…
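The three-component ITS architecture described above can be sketched as a skeleton of cooperating classes; the names and update rule below are illustrative and do not come from the GPT system itself.

    # Skeleton of the three-component ITS architecture (expert module,
    # tutoring model, assessment model); names are illustrative only.
    class ExpertModule:
        """Solves problems in the subject domain."""
        def solve(self, problem):
            return f"worked solution for {problem}"

    class AssessmentModel:
        """Tracks what the student appears to understand."""
        def __init__(self):
            self.mastery = {}
        def update(self, skill, correct):
            current = self.mastery.get(skill, 0.5)
            self.mastery[skill] = current + (0.1 if correct else -0.1)

    class TutoringModel:
        """Chooses remediation or new instruction from the assessment."""
        def next_action(self, assessment, skill):
            if assessment.mastery.get(skill, 0.5) < 0.5:
                return "remediate"
            return "advance"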
Development of a Core Competency Model for the Master of Public Health Degree
Calhoun, Judith G.; Ramiah, Kalpana; Weist, Elizabeth McGean; Shortell, Stephen M.
2008-01-01
Core competencies have been used to redefine curricula across the major health professions in recent decades. In 2006, the Association of Schools of Public Health identified core competencies for the master of public health degree in graduate schools and programs of public health. We provide an overview of the model development process and a listing of 12 core domains and 119 competencies that can serve as a resource for faculty and students for enhancing the quality and accountability of graduate public health education and training. The primary vision for the initiative is the graduation of professionals who are more fully prepared for the many challenges and opportunities in public health in the forthcoming decade. PMID:18633093
Clients’ perceptions of the quality of care in Mexico City’s public-sector legal abortion program
Becker, Davida; Díaz-Olavarrieta, Claudia; Juárez, Clara; García, Sandra G.; Sanhueza, Patricio; Harper, Cynthia C.
2014-01-01
Context In 2007 the Mexico City legislature made the groundbreaking decision to legalize first-trimester abortion. Limited research has been conducted to understand clients’ perceptions of the abortion services available in public-sector facilities. Methods We measured clients’ perceptions of quality of care at three public-sector sites in Mexico City in 2009 (n=402). We assessed six domains of quality of care (client-staff interaction, information provision, technical competence, post-abortion contraceptive services, accessibility, and the facility environment) and conducted ordinal logistic regression analysis to identify which domains were important to women for their overall evaluation of care. We measured the association of overall service evaluation with socio-demographic factors and abortion-visit characteristics, in addition to specific quality-of-care domains. Results Clients rated the quality of abortion services highly, with an overall mean rating of 8.8 out of 10. Multivariable analysis showed that the domains important for a high evaluation included the client's perception of the doctor as technically skilled (p<0.05), comfort with the doctor (p<0.001), perception of confidentiality (p<0.01), perception that the receptionist was respectful (p<0.05), and counseling on self-care at home following the abortion and on post-abortion emotions (p<0.05 and p<0.01). Also relevant for a high evaluation were convenient site hours (p<0.01), waiting time (p<0.001), and a clean facility (p<0.05). Nulliparous women rated their care less favorably than parous women (p<0.05). Conclusions Our findings identify the domains of service quality most important to women’s overall evaluations of abortion care in Mexico City. Strategies to improve clients’ service experiences should focus on improving counseling, service accessibility, and waiting time. PMID:22227626
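A minimal sketch of the kind of ordinal logistic regression used in the analysis above, fitted to synthetic ratings, follows; it assumes statsmodels 0.12 or later for OrderedModel, and the predictors and effect sizes are invented.

    # Ordinal logistic regression on a synthetic 0-10 service rating.
    import numpy as np
    import pandas as pd
    from statsmodels.miscmodels.ordinal_model import OrderedModel

    rng = np.random.default_rng(1)
    n = 400
    X = pd.DataFrame({
        "doctor_skilled": rng.integers(0, 2, n),   # hypothetical predictors
        "short_wait":     rng.integers(0, 2, n),
    })
    # Overall rating, loosely driven by the two predictors plus noise.
    score = np.clip(7 + X.doctor_skilled + 2 * X.short_wait
                    + rng.normal(0, 1, n), 0, 10).round().astype(int)
    rating = pd.Series(pd.Categorical(score, ordered=True))

    # OrderedModel estimates thresholds itself, so no constant is added.
    res = OrderedModel(rating, X, distr="logit").fit(method="bfgs",
                                                     disp=False)
    print(res.summary())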
Water-use computer programs for Florida
Geiger, L.H.
1984-01-01
Using U.S. Geological Survey computer programs L149-L153, this report shows how to process water-use data for the functional water-use categories: public supply, rural supply, industrial self-supplied, irrigation, and thermoelectric power generation. The programs are used to selectively retrieve entries and list them in a format suitable for publication. Instructions are given for coding cards to produce tables of water-use data for each of the functional use categories. These cards contain entries that identify a particular water-use data-collection site in Florida. Entries on the cards include location information such as county code, water management district code, hydrologic unit code, and, where applicable, a site name and number. Annual and monthly pumpage is included. These entries are shown with several different headings; for example, surface water or ground water, freshwater or saline pumpages, or consumptive use. All the programs use a similar approach; however, the actual programs differ with each functional water-use category and are discussed separately. Data prepared for these programs can also be processed by the National Water-Use Data System. (USGS)
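A toy version of the retrieve-and-tabulate step the report describes follows: select records for one functional water-use category and print them in a publication-style table. The field names and values are illustrative and are not the USGS card format.

    # Select one functional water-use category and tabulate it.
    records = [
        {"county": "12001", "site": "Well 7",  "category": "public supply",
         "annual_mgd": 3.2},
        {"county": "12003", "site": "Plant A", "category": "irrigation",
         "annual_mgd": 1.1},
        {"county": "12001", "site": "Well 9",  "category": "public supply",
         "annual_mgd": 0.8},
    ]

    category = "public supply"
    print(f"{'County':8} {'Site':10} {'Annual (Mgal/d)':>16}")
    for r in sorted(records, key=lambda r: r["county"]):
        if r["category"] == category:
            print(f"{r['county']:8} {r['site']:10} {r['annual_mgd']:16.1f}")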
Data Discovery with IBM Watson
NASA Astrophysics Data System (ADS)
Fessler, J.
2016-12-01
IBM Watson is a cognitive computing system that uses machine learning, statistical analysis, and natural language processing to find and understand the clues in questions posed to it. Watson was made famous when it bested two champions on TV's Jeopardy! show. Since then, Watson has evolved into a platform of cognitive services that can be trained on very granular fields of study. Watson is being used to support a number of subject domains, such as cancer research, public safety, engineering, and the intelligence community. IBM will be providing a presentation and demonstration of the Watson technology and will discuss its capabilities, including Natural Language Processing, text analytics and enterprise search, as well as cognitive computing with deep Q&A. The team will also be giving examples of how IBM Watson technology is being used to support real-world problems across a number of public-sector agencies.
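As a concrete illustration, the sketch below calls the Watson Natural Language Understanding service through the ibm-watson Python SDK; the API key, service URL, and version date are placeholders, and this is only one of the Watson services the presentation covers.

    # Keyword extraction with Watson NLU via the ibm-watson SDK.
    from ibm_watson import NaturalLanguageUnderstandingV1
    from ibm_watson.natural_language_understanding_v1 import (
        Features, KeywordsOptions)
    from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

    authenticator = IAMAuthenticator("YOUR_API_KEY")       # placeholder
    nlu = NaturalLanguageUnderstandingV1(version="2021-08-01",
                                         authenticator=authenticator)
    nlu.set_service_url("YOUR_SERVICE_URL")                # placeholder

    result = nlu.analyze(
        text="Watson supports cancer research, public safety, "
             "and engineering.",
        features=Features(keywords=KeywordsOptions(limit=5)),
    ).get_result()

    for kw in result["keywords"]:
        print(kw["text"], kw["relevance"])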
36 CFR 1120.52 - Computerized records.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 36 Parks, Forests, and Public Property 3 2014-07-01 2014-07-01 false Computerized records. 1120.52 Section 1120.52 Parks, Forests, and Public Property ARCHITECTURAL AND TRANSPORTATION BARRIERS COMPLIANCE... additional programming of the computer, thus producing information not previously in being, is not required...
36 CFR 1120.52 - Computerized records.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 36 Parks, Forests, and Public Property 3 2012-07-01 2012-07-01 false Computerized records. 1120.52 Section 1120.52 Parks, Forests, and Public Property ARCHITECTURAL AND TRANSPORTATION BARRIERS COMPLIANCE... additional programming of the computer, thus producing information not previously in being, is not required...