NASA Technical Reports Server (NTRS)
Yan, Jerry C.
1987-01-01
In concurrent systems, a major responsibility of the resource management system is to decide how the application program is to be mapped onto the multi-processor. Instead of using abstract program and machine models, a generate-and-test framework known as 'post-game analysis' that is based on data gathered during program execution is proposed. Each iteration consists of (1) (a simulation of) an execution of the program; (2) analysis of the data gathered; and (3) the proposal of a new mapping that would have a smaller execution time. In step (3), heuristics are applied to predict execution-time changes in response to small perturbations of the current mapping. An initial experiment was carried out using simple strategies on 'pipeline-like' applications. The results obtained from four simple strategies demonstrated that for this kind of application, even simple strategies can produce acceptable speed-up with a small number of iterations.
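Purely to illustrate the generate-and-test idea above (not the paper's actual heuristics), the following Python sketch perturbs a task-to-processor mapping and keeps a change only when a simulated execution time improves; the toy cost model and task data are hypothetical.

import random

def simulated_exec_time(mapping, task_cost, n_procs):
    """Toy cost model: makespan = load of the most heavily loaded processor."""
    load = [0.0] * n_procs
    for task, proc in enumerate(mapping):
        load[proc] += task_cost[task]
    return max(load)

def post_game_iterations(task_cost, n_procs, n_iter=50, seed=0):
    rng = random.Random(seed)
    mapping = [rng.randrange(n_procs) for _ in task_cost]   # initial mapping
    best = simulated_exec_time(mapping, task_cost, n_procs)
    for _ in range(n_iter):
        task = rng.randrange(len(task_cost))                # small perturbation:
        old_proc = mapping[task]                            # move one task
        mapping[task] = rng.randrange(n_procs)
        new = simulated_exec_time(mapping, task_cost, n_procs)
        if new < best:
            best = new                                       # keep the improvement
        else:
            mapping[task] = old_proc                         # otherwise undo it
    return mapping, best

print(post_game_iterations([3.0, 1.0, 2.0, 2.0, 4.0, 1.0], n_procs=2))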
Estimating aquifer transmissivity from specific capacity using MATLAB.
McLin, Stephen G
2005-01-01
Historically, specific capacity information has been used to calculate aquifer transmissivity when pumping test data are unavailable. This paper presents a simple computer program written in the MATLAB programming language that estimates transmissivity from specific capacity data while correcting for aquifer partial penetration and well efficiency. The program graphically plots transmissivity as a function of these factors so that the user can visually estimate their relative importance in a particular application. The program is compatible with any computer operating system running MATLAB, including Windows, Macintosh OS, Linux, and Unix. Two simple examples illustrate program usage.
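As a hedged illustration of the general approach (not McLin's MATLAB code, and omitting the partial-penetration and well-efficiency corrections the paper adds), transmissivity is often estimated from specific capacity by iterating the Cooper-Jacob relation, since T appears on both sides; the parameter values below are placeholders.

import math

def transmissivity_from_specific_capacity(Q_over_s, t, r, S, tol=1e-6, max_iter=100):
    """Fixed-point iteration of the Cooper-Jacob equation:
    T = (Q/s) / (4*pi) * ln(2.25*T*t / (r**2 * S)).
    Units must be consistent (e.g. m**2/day, days, m)."""
    T = Q_over_s  # crude starting guess
    for _ in range(max_iter):
        T_new = Q_over_s / (4.0 * math.pi) * math.log(2.25 * T * t / (r**2 * S))
        if abs(T_new - T) < tol:
            return T_new
        T = T_new
    return T

# Placeholder values: Q/s = 120 m**2/day, t = 1 day, r = 0.15 m, S = 1e-4
print(transmissivity_from_specific_capacity(120.0, 1.0, 0.15, 1e-4))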
10 CFR 470.13 - Program solicitation.
Code of Federal Regulations, 2010 CFR
2010-01-01
... simple application form for submitting a proposal for support under the program, together with... directly to interested individuals, entities, and associations thereof, to the maximum extent feasible. ...
NASA Astrophysics Data System (ADS)
Michalik, Peter; Mital, Dusan; Zajac, Jozef; Brezikova, Katarina; Duplak, Jan; Hatala, Michal; Radchenko, Svetlana
2016-10-01
The article deals with the use of intelligent relays and PLC systems in practice, their architecture, and the principles of programming and simulation for the educational process at all types of schools, from secondary schools to universities. The aim of the article is to propose simple application examples that demonstrate a programming methodology on real, simple practical examples and show the use of selected instructions. The practical part describes the process of creating schematics and function blocks, including the methodology for building programs and simulating output reactions to changing inputs for intelligent relays.
Simplification May Not Be So Simple: Gauging State Alignment with the FAFSA
ERIC Educational Resources Information Center
Pingel, Sarah
2017-01-01
Applying for financial aid can be a complicated, time-consuming endeavor for students and their families. Fortunately, many state aid programs have taken strides to align aid applications to the form used for federal aid programs, the Free Application for Federal Student Aid (FAFSA), making state aid more readily accessible. New conversations…
A Simple Huckel Molecular Orbital Plotter
ERIC Educational Resources Information Center
Ramakrishnan, Raghunathan
2013-01-01
A program is described and presented to readily plot the molecular orbitals from a Huckel calculation. The main features of the program and the scope of its applicability are discussed through some example organic molecules. (Contains 2 figures.)
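A minimal sketch of the underlying calculation (not the plotter described in the article): in Hückel theory the pi system is represented by a matrix with alpha on the diagonal and beta on bonded off-diagonals, and the MO energies and coefficients are its eigenpairs. Here 1,3-butadiene is used, with energies in units of beta relative to alpha.

import numpy as np

# Hückel matrix for 1,3-butadiene in units of beta (alpha set to 0):
# atoms 1-2, 2-3, 3-4 are bonded.
H = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

energies, coeffs = np.linalg.eigh(H)   # eigenvalues x with E = alpha + x*beta
# Because beta is negative, the largest x corresponds to the lowest-energy MO.
for i, x in enumerate(energies):
    print(f"MO {i+1}: E = alpha + {x:+.3f} beta, coefficients {np.round(coeffs[:, i], 3)}")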
A Comparative Study of Multi-material Data Structures for Computational Physics Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garimella, Rao Veerabhadra; Robey, Robert W.
The data structures used to represent the multi-material state of a computational physics application can have a drastic impact on the performance of the application. We look at efficient data structures for sparse applications where there may be many materials, but only one or few in most computational cells. We develop simple performance models for use in selecting possible data structures and programming patterns. We verify the analytic models of performance through a small test program of the representative cases.
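To make the trade-off concrete, here is a hedged Python sketch (not the authors' test program) contrasting a dense cell-by-material array with a compressed, CSR-like layout that stores only the materials actually present in each cell.

import numpy as np

ncells, nmats = 6, 4
# Dense layout: ncells x nmats volume fractions, mostly zero for sparse problems.
dense = np.zeros((ncells, nmats))
dense[0, 1] = 1.0
dense[1, 1] = 0.4; dense[1, 2] = 0.6      # the only mixed cell
dense[2, 2] = 1.0
dense[3:, 0] = 1.0

# Compressed layout: per-cell offsets into parallel material-id/fraction arrays.
offsets = [0]
mat_ids, fracs = [], []
for c in range(ncells):
    present = np.nonzero(dense[c])[0]
    mat_ids.extend(present)
    fracs.extend(dense[c, present])
    offsets.append(len(mat_ids))

# Loop over only the materials present in each cell.
for c in range(ncells):
    for k in range(offsets[c], offsets[c + 1]):
        print(f"cell {c}: material {mat_ids[k]} fraction {fracs[k]:.2f}")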
A simple program to measure and analyse tree rings using Excel, R and SigmaScan
Hietz, Peter
2011-01-01
I present new software that links a program for image analysis (SigmaScan), one for spreadsheets (Excel) and one for statistical analysis (R) for applications in tree-ring analysis. The first macro measures ring widths marked by the user on scanned images, stores raw and detrended data in Excel and calculates the distance to the pith and inter-series correlations. A second macro measures darkness along a defined path to identify the latewood–earlywood transition in conifers, and a third shows the potential for automatic detection of boundaries. Written in Visual Basic for Applications, the code makes use of the advantages of existing programs and is consequently very economical and relatively simple to adjust to the requirements of specific projects or to expand by making use of already available code. PMID:26109835
A "Simple Query Interface" Adapter for the Discovery and Exchange of Learning Resources
ERIC Educational Resources Information Center
Massart, David
2006-01-01
Developed as part of CEN/ISSS Workshop on Learning Technology efforts to improve interoperability between learning resource repositories, the Simple Query Interface (SQI) is an Application Program Interface (API) for querying heterogeneous repositories of learning resource metadata. In the context of the ProLearn Network of Excellence, SQI is used…
Prototyping distributed simulation networks
NASA Technical Reports Server (NTRS)
Doubleday, Dennis L.
1990-01-01
Durra is a declarative language designed to support application-level programming. The use of Durra is illustrated to describe a simple distributed application: a simulation of a collection of networked vehicle simulators. It is shown how the language is used to describe the application, its components and structure, and how the runtime executive provides for the execution of the application.
The control data "GIRAFFE" system for interactive graphic finite element analysis
NASA Technical Reports Server (NTRS)
Park, S.; Brandon, D. M., Jr.
1975-01-01
The Graphical Interface for Finite Elements (GIRAFFE) general purpose interactive graphics application package was described. This system may be used as a pre/post processor for structural analysis computer programs. It facilitates the operations of creating, editing, or reviewing all the structural input/output data on a graphics terminal in a time-sharing mode of operation. An application program for a simple three-dimensional plate problem was illustrated.
Pandey, Anil Kumar; Saroha, Kartik; Sharma, Param Dev; Patel, Chetan; Bal, Chandrashekhar; Kumar, Rakesh
2017-01-01
In this study, we have developed a simple image processing application in MATLAB that uses suprathreshold stochastic resonance (SSR) and helps the user to visualize abdominopelvic tumors on exported prediuretic positron emission tomography/computed tomography (PET/CT) images. A brainstorming session was conducted for requirement analysis for the program. It was decided that the program should load the screen-captured PET/CT images and then produce output images in a window with a slider control that enables the user to view the image that best visualizes the tumor, if present. The program was implemented on a personal computer using Microsoft Windows and MATLAB R2013b. The program has an option for the user to select the input image. For the selected image, it displays output images generated using SSR in a separate window with a slider control. The slider control enables the user to view the images and select the one that seems to provide the best visualization of the area(s) of interest. The developed application enables the user to select, process, and view output images in the process of utilizing SSR to detect the presence of abdominopelvic tumors on prediuretic PET/CT images.
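The core of suprathreshold stochastic resonance can be sketched in a few lines of Python (a generic SSR demonstration, not the authors' MATLAB application): many independently noise-corrupted copies of the image are thresholded and averaged, with the noise level playing roughly the role of the slider setting.

import numpy as np

def ssr_enhance(image, threshold, noise_sigma, n_copies=64, seed=0):
    """Suprathreshold stochastic resonance: average of thresholded noisy copies.
    image is a 2-D float array; larger noise_sigma corresponds to moving the slider."""
    rng = np.random.default_rng(seed)
    out = np.zeros_like(image, dtype=float)
    for _ in range(n_copies):
        noisy = image + rng.normal(0.0, noise_sigma, image.shape)
        out += (noisy > threshold).astype(float)
    return out / n_copies

# Synthetic example: a faint "lesion" on a uniform background.
img = np.full((64, 64), 0.2)
img[28:36, 28:36] = 0.35
print(ssr_enhance(img, threshold=0.5, noise_sigma=0.3).max())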
42 CFR 52b.10 - What are the terms and conditions of awards?
Code of Federal Regulations, 2010 CFR
2010-10-01
... program: (a) Title. The applicant must have a fee simple or other estate or interest in the site... contractor. (l) Notice of Federal Interest. The grantee shall record a Notice of Federal Interest in the... certifies that the grantee institution has fee simple title to the site free and clear of all liens...
DOT National Transportation Integrated Search
1974-08-01
DYNALIST, a computer program that extracts complex eigenvalues and eigenvectors for dynamic systems described in terms of matrix equations of motion, has been acquired and made operational at TSC. In this report, simple dynamic systems are used to de...
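As a hedged sketch of the kind of computation described (not DYNALIST itself), a damped system M x'' + C x' + K x = 0 can be recast in first-order state-space form and its complex eigenvalues and eigenvectors extracted with a standard solver; the two-degree-of-freedom matrices below are illustrative only.

import numpy as np

# Illustrative 2-DOF system: M x'' + C x' + K x = 0
M = np.diag([2.0, 1.0])
C = np.array([[0.4, -0.1], [-0.1, 0.2]])
K = np.array([[50.0, -20.0], [-20.0, 30.0]])

# First-order form: d/dt [x, x'] = A [x, x']
n = M.shape[0]
Minv = np.linalg.inv(M)
A = np.block([[np.zeros((n, n)), np.eye(n)],
              [-Minv @ K,        -Minv @ C]])

eigvals, eigvecs = np.linalg.eig(A)     # complex eigenvalues come in conjugate pairs
for lam in eigvals:
    print(f"lambda = {lam:.4f}  (damped frequency {abs(lam.imag):.3f} rad/s)")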
Hybrid Applications Of Artificial Intelligence
NASA Technical Reports Server (NTRS)
Borchardt, Gary C.
1988-01-01
STAR, Simple Tool for Automated Reasoning, is interactive, interpreted programming language for development and operation of artificial-intelligence application systems. Couples symbolic processing with compiled-language functions and data structures. Written in C language and currently available in UNIX version (NPO-16832), and VMS version (NPO-16965).
NASA Technical Reports Server (NTRS)
Blackburn, C. L.; Dovi, A. R.; Kurtze, W. L.; Storaasli, O. O.
1981-01-01
A computer software system for the processing and integration of engineering data and programs, called IPAD (Integrated Programs for Aerospace-Vehicle Design), is described. The ability of the system to relieve the engineer of the mundane task of input data preparation is demonstrated by the application of a prototype system to the design, analysis, and/or machining of three simple structures. Future work to further enhance the system's automated data handling and its ability to handle larger and more varied design problems is also presented.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-18
... notice to solicit comments on the proposed rule change from interested persons. \\1\\ 15 U.S.C. 78s(b)(1...) The Customer \\3\\ Rebate Program in Section B; (ii) Simple Order pricing in Section I entitled Rebates... Exchange proposes to amend the Simple Order Fees for Removing Liquidity in Section I which are applicable...
Discrete Tchebycheff orthonormal polynomials and applications
NASA Technical Reports Server (NTRS)
Lear, W. M.
1980-01-01
Discrete Tchebycheff orthonormal polynomials offer a convenient way to make least squares polynomial fits of uniformly spaced discrete data. Computer programs to do so are simple and fast, and appear to be less affected by computer roundoff error, for the higher order fits, than conventional least squares programs. They are useful for any application of polynomial least squares fits: approximation of mathematical functions, noise analysis of radar data, and real time smoothing of noisy data, to name a few.
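A minimal Python sketch of the idea (not Lear's program): orthonormal polynomials are built on the uniformly spaced sample points by Gram-Schmidt, so each least-squares coefficient is a simple inner product and higher-order fits reuse the lower-order ones.

import numpy as np

def discrete_orthonormal_polys(x, degree):
    """Gram-Schmidt on 1, x, x**2, ... over the discrete points x."""
    basis = []
    for d in range(degree + 1):
        p = x**d
        for q in basis:
            p = p - np.dot(p, q) * q      # remove components along earlier polynomials
        basis.append(p / np.linalg.norm(p))
    return np.array(basis)                # shape (degree+1, len(x))

def poly_least_squares(x, y, degree):
    P = discrete_orthonormal_polys(x, degree)
    coeffs = P @ y                        # projections; no matrix inversion needed
    return coeffs, coeffs @ P             # coefficients and fitted values

x = np.linspace(0.0, 1.0, 21)             # uniformly spaced data
y = np.sin(2 * np.pi * x) + 0.05 * np.random.default_rng(1).normal(size=x.size)
coeffs, fit = poly_least_squares(x, y, degree=5)
print(np.round(coeffs, 3), "residual:", round(float(np.sum((y - fit) ** 2)), 4))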
Web Program for Development of GUIs for Cluster Computers
NASA Technical Reports Server (NTRS)
Czikmantory, Akos; Cwik, Thomas; Klimeck, Gerhard; Hua, Hook; Oyafuso, Fabiano; Vinyard, Edward
2003-01-01
WIGLAF (a Web Interface Generator and Legacy Application Facade) is a computer program that provides a Web-based, distributed, graphical-user-interface (GUI) framework that can be adapted to any of a broad range of application programs, written in any programming language, that are executed remotely on any cluster computer system. WIGLAF enables the rapid development of a GUI for controlling and monitoring a specific application program running on the cluster and for transferring data to and from the application program. The only prerequisite for the execution of WIGLAF is a Web-browser program on a user's personal computer connected with the cluster via the Internet. WIGLAF has a client/server architecture: The server component is executed on the cluster system, where it controls the application program and serves data to the client component. The client component is an applet that runs in the Web browser. WIGLAF utilizes the Extensible Markup Language to hold all data associated with the application software, Java to enable platform-independent execution on the cluster system and the display of a GUI generator through the browser, and the Java Remote Method Invocation software package to provide simple, effective client/server networking.
Evolution of a minimal parallel programming model
Lusk, Ewing; Butler, Ralph; Pieper, Steven C.
2017-04-30
Here, we take a historical approach to our presentation of self-scheduled task parallelism, a programming model with its origins in early irregular and nondeterministic computations encountered in automated theorem proving and logic programming. We show how an extremely simple task model has evolved into a system, asynchronous dynamic load balancing (ADLB), and a scalable implementation capable of supporting sophisticated applications on today’s (and tomorrow’s) largest supercomputers; and we illustrate the use of ADLB with a Green’s function Monte Carlo application, a modern, mature nuclear physics code in production use. Our lesson is that by surrendering a certain amount of generality and thus applicability, a minimal programming model (in terms of its basic concepts and the size of its application programmer interface) can achieve extreme scalability without introducing complexity.
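The flavor of self-scheduled task parallelism can be sketched with mpi4py (a generic manager/worker loop, not ADLB's actual interface): idle workers pull the next task from a manager, so the load balances itself. The work function and the suggested launch command (e.g. mpiexec -n 4 python selfsched.py) are placeholders.

from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()
TASK, STOP = 1, 2

def work(task):                     # placeholder work function
    return sum(i * i for i in range(task))

if rank == 0:                       # manager: hand out tasks on request
    tasks = list(range(1, 101))
    results, stopped = [], 0
    status = MPI.Status()
    while stopped < size - 1:
        result = comm.recv(source=MPI.ANY_SOURCE, tag=MPI.ANY_TAG, status=status)
        if result is not None:
            results.append(result)
        if tasks:
            comm.send(tasks.pop(), dest=status.Get_source(), tag=TASK)
        else:
            comm.send(None, dest=status.Get_source(), tag=STOP)
            stopped += 1
    print("sum of results:", sum(results))
else:                               # worker: request, compute, repeat
    comm.send(None, dest=0)         # the initial request carries no result
    status = MPI.Status()
    while True:
        task = comm.recv(source=0, tag=MPI.ANY_TAG, status=status)
        if status.Get_tag() == STOP:
            break
        comm.send(work(task), dest=0)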
Implementing Multidisciplinary and Multi-Zonal Applications Using MPI
NASA Technical Reports Server (NTRS)
Fineberg, Samuel A.
1995-01-01
Multidisciplinary and multi-zonal applications are an important class of applications in the area of Computational Aerosciences. In these codes, two or more distinct parallel programs or copies of a single program are utilized to model a single problem. To support such applications, it is common to use a programming model where a program is divided into several single program multiple data stream (SPMD) applications, each of which solves the equations for a single physical discipline or grid zone. These SPMD applications are then bound together to form a single multidisciplinary or multi-zonal program in which the constituent parts communicate via point-to-point message passing routines. Unfortunately, simple message passing models, like Intel's NX library, only allow point-to-point and global communication within a single system-defined partition. This makes implementation of these applications quite difficult, if not impossible. In this report it is shown that the new Message Passing Interface (MPI) standard is a viable portable library for implementing the message passing portion of multidisciplinary applications. Further, with the extension of a portable loader, fully portable multidisciplinary application programs can be developed. Finally, the performance of MPI is compared to that of some native message passing libraries. This comparison shows that MPI can be implemented to deliver performance commensurate with native message libraries.
A Budget Tree Improvement Program
Hans Nienstaedt; Hyun Kang
1983-01-01
In an Upper Peninsula Michigan test of simple design, white spruce of a Beachburg, Ontario provenance grew 17.5 percent taller than white spruce from the Ottawa N.F. The paper describes how to convert such tests to low-cost, low-risk, highly flexible improvement programs. The approach is applicable to other species of low priority.
Toward a Natural Speech Understanding System
1989-10-01
... error rates for distinctive words produced in isolation by a single speaker, and their simple programming requirements. Template-matching systems rank ...
Simple sequence repeat marker loci discovery using SSR primer.
Robinson, Andrew J; Love, Christopher G; Batley, Jacqueline; Barker, Gary; Edwards, David
2004-06-12
Simple sequence repeats (SSRs) have become important molecular markers for a broad range of applications, such as genome mapping and characterization, phenotype mapping, marker assisted selection of crop plants and a range of molecular ecology and diversity studies. With the increase in the availability of DNA sequence information, an automated process to identify and design PCR primers for amplification of SSR loci would be a useful tool in plant breeding programs. We report an application that integrates SPUTNIK, an SSR repeat finder, with Primer3, a PCR primer design program, into one pipeline tool, SSR Primer. On submission of multiple FASTA formatted sequences, the script screens each sequence for SSRs using SPUTNIK. The results are parsed to Primer3 for locus-specific primer design. The script makes use of a Web-based interface, enabling remote use. This program has been written in PERL and is freely available for non-commercial users by request from the authors. The Web-based version may be accessed at http://hornbill.cspp.latrobe.edu.au/
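The SSR-detection step can be illustrated with a short regular-expression scan in Python (a generic sketch, not the SPUTNIK/Primer3 pipeline itself): a backreference finds 2-4 bp motifs repeated several times, after which each locus would be handed to primer design.

import re

def find_ssrs(seq, min_repeats=5, motif_len=(2, 4)):
    """Return (start, motif, repeat_count) for simple sequence repeats."""
    pattern = re.compile(r"(([ACGT]{%d,%d}?)\2{%d,})"
                         % (motif_len[0], motif_len[1], min_repeats - 1))
    hits = []
    for m in pattern.finditer(seq.upper()):
        motif = m.group(2)
        hits.append((m.start(), motif, len(m.group(1)) // len(motif)))
    return hits

seq = "GGATCACACACACACACGTTTAGCTAGCTAGCTAGCTAGCAAT"
for start, motif, n in find_ssrs(seq):
    print(f"SSR at {start}: ({motif})x{n}")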
Wrapping up BLAST and other applications for use on Unix clusters.
Hokamp, Karsten; Shields, Denis C; Wolfe, Kenneth H; Caffrey, Daniel R
2003-02-12
We have developed two programs that speed up common bioinformatic applications by spreading them across a UNIX cluster: (1) BLAST.pm, a new module for the 'MOLLUSC' package; (2) WRAPID, a simple tool for parallelizing large numbers of small instances of programs such as BLAST, FASTA and CLUSTALW. The packages were developed in Perl on a 20-node Linux cluster and are provided together with a configuration script and documentation. They can be freely downloaded from http://wolfe.gen.tcd.ie/wrapper.
MPIRUN: A Portable Loader for Multidisciplinary and Multi-Zonal Applications
NASA Technical Reports Server (NTRS)
Fineberg, Samuel A.; Woodrow, Thomas S. (Technical Monitor)
1994-01-01
Multidisciplinary and multi-zonal applications are an important class of applications in the area of Computational Aerosciences. In these codes, two or more distinct parallel programs or copies of a single program are utilized to model a single problem. To support such applications, it is common to use a programming model where a program is divided into several single program multiple data stream (SPMD) applications, each of which solves the equations for a single physical discipline or grid zone. These SPMD applications are then bound together to form a single multidisciplinary or multi-zonal program in which the constituent parts communicate via point-to-point message passing routines. One method for implementing the message passing portion of these codes is with the new Message Passing Interface (MPI) standard. Unfortunately, this standard only specifies the message passing portion of an application, but does not specify any portable mechanisms for loading an application. MPIRUN was developed to provide a portable means for loading MPI programs, and was specifically targeted at multidisciplinary and multi-zonal applications. Programs using MPIRUN for loading and MPI for message passing are then portable between all machines supported by MPIRUN. MPIRUN is currently implemented for the Intel iPSC/860, TMC CM5, IBM SP-1 and SP-2, Intel Paragon, and workstation clusters. Further, MPIRUN is designed to be simple enough to port easily to any system supporting MPI.
Simplified Calculation Of Solar Fluxes In Solar Receivers
NASA Technical Reports Server (NTRS)
Bhandari, Pradeep
1990-01-01
Simplified Calculation of Solar Flux Distribution on Side Wall of Cylindrical Cavity Solar Receivers computer program employs simple solar-flux-calculation algorithm for cylindrical-cavity-type solar receiver. Results compare favorably with those of more complicated programs. Applications include study of solar energy and transfer of heat, and space power/solar-dynamics engineering. Written in FORTRAN 77.
Combinatorial structures to modeling simple games and applications
NASA Astrophysics Data System (ADS)
Molinero, Xavier
2017-09-01
We connect three different topics: combinatorial structures, game theory and chemistry. In particular, we establish the bases to represent some simple games, defined as influence games, and molecules, defined from atoms, by using combinatorial structures. First, we characterize simple games as influence games using influence graphs, which lets us model simple games as combinatorial structures (from the viewpoint of structures or graphs). Second, we formally define molecules as combinations of atoms, which lets us model molecules as combinatorial structures (from the viewpoint of combinations). It remains open to generate such combinatorial structures using specific techniques such as genetic algorithms, (meta-)heuristic algorithms and parallel programming, among others.
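For readers unfamiliar with simple games, the following Python sketch (purely illustrative, not the paper's influence-game construction) encodes a weighted majority game as a monotone win/lose rule and enumerates its minimal winning coalitions.

from itertools import combinations

# Weighted majority game [q; w1, ..., wn]: coalition S wins iff its total weight >= q.
weights = {1: 4, 2: 3, 3: 2, 4: 1}
quota = 6

def wins(coalition):
    return sum(weights[p] for p in coalition) >= quota

def minimal_winning_coalitions():
    players = sorted(weights)
    minimal = []
    for r in range(1, len(players) + 1):
        for S in combinations(players, r):
            # keep S only if it wins and contains no smaller winning coalition
            if wins(S) and not any(set(T) < set(S) for T in minimal):
                minimal.append(S)
    return minimal

print(minimal_winning_coalitions())   # [(1, 2), (1, 3), (2, 3, 4)]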
Contemporary issues in HIM. The application layer--III.
Wear, L L; Pinkert, J R
1993-07-01
We have seen document preparation systems evolve from basic line editors through powerful, sophisticated desktop publishing programs. This component of the application layer is probably one of the most used, and most readily identifiable. Ask grade school children nowadays, and many will tell you that they have written a paper on a computer. Next month will be a "fun" tour through a number of other application programs we find useful. They will range from a simple notebook reminder to a sophisticated photograph processor. Application layer: Software targeted for the end user, focusing on a specific application area, and typically residing in the computer system as distinct components on top of the OS. Desktop publishing: A document preparation program that begins with the text features of a word processor, then adds the ability for a user to incorporate outputs from a variety of graphic programs, spreadsheets, and other applications. Line editor: A document preparation program that manipulates text in a file on the basis of numbered lines. Word processor: A document preparation program that can, among other things, reformat sections of documents, move and replace blocks of text, use multiple character fonts, automatically create a table of contents and index, create complex tables, and combine text and graphics.
A simple modern correctness condition for a space-based high-performance multiprocessor
NASA Technical Reports Server (NTRS)
Probst, David K.; Li, Hon F.
1992-01-01
A number of U.S. national programs, including space-based detection of ballistic missile launches, envisage putting significant computing power into space. Given sufficient progress in low-power VLSI, multichip-module packaging and liquid-cooling technologies, we will see design of high-performance multiprocessors for individual satellites. In very high speed implementations, performance depends critically on tolerating large latencies in interprocessor communication; without latency tolerance, performance is limited by the vastly differing time scales in processor and data-memory modules, including interconnect times. The modern approach to tolerating remote-communication cost in scalable, shared-memory multiprocessors is to use a multithreaded architecture, and alter the semantics of shared memory slightly, at the price of forcing the programmer either to reason about program correctness in a relaxed consistency model or to agree to program in a constrained style. The literature on multiprocessor correctness conditions has become increasingly complex, and sometimes confusing, which may hinder its practical application. We propose a simple modern correctness condition for a high-performance, shared-memory multiprocessor; the correctness condition is based on a simple interface between the multiprocessor architecture and the parallel programming system.
ERIC Educational Resources Information Center
Stanley, Simone; Ymele-Leki, Patrick
2017-01-01
A community outreach project was integrated in a District of Columbia public schools summer internship program for students from underrepresented minorities in STEM. The project introduced these students to fundamental engineering principles by leveraging a smartphone application (App) so readily accessible and attractive to them that it boosted…
A Simple Simulator to Teach Enzyme Kinetics Dynamics. Application in a Problem-Solving Exercise
ERIC Educational Resources Information Center
Torres, Néstor; Santos, Guido
2017-01-01
Enzyme kinetics is an essential part of biochemistry programs, which have been gaining importance in recent years for their applications in biotechnology and biomedicine. The teaching and learning of these issues has been traditionally hampered by difficulties that stem mainly from the dynamic and mathematical nature of the topic and the…
NASA Astrophysics Data System (ADS)
Brandelik, Andreas
2009-07-01
CALCMIN, an open source Visual Basic program, was implemented in EXCEL™. The program was primarily developed to support geoscientists in their routine task of calculating structural formulae of minerals on the basis of chemical analysis mainly obtained by electron microprobe (EMP) techniques. Calculation programs for various minerals are already included in the form of sub-routines. These routines are arranged in separate modules containing a minimum of code. The architecture of CALCMIN allows the user to easily develop new calculation routines or modify existing routines with little knowledge of programming techniques. By means of a simple mouse-click, the program automatically generates a rudimentary framework of code using the object model of the Visual Basic Editor (VBE). Within this framework simple commands and functions, which are provided by the program, can be used, for example, to perform various normalization procedures or to output the results of the computations. For the clarity of the code, element symbols are used as variables initialized by the program automatically. CALCMIN does not set any boundaries in complexity of the code used, resulting in a wide range of possible applications. Thus, matrix and optimization methods can be included, for instance, to determine end member contents for subsequent thermodynamic calculations. Diverse input procedures are provided, such as the automated read-in of output files created by the EMP. Furthermore, a subsequent filter routine enables the user to extract specific analyses in order to use them for a corresponding calculation routine. An event-driven, interactive operating mode was selected for easy application of the program. CALCMIN leads the user from the beginning to the end of the calculation process.
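The core of a structural-formula calculation can be shown in a few lines of Python (a generic sketch using the usual normalize-to-oxygens convention, not CALCMIN's VBA code); the olivine-like analysis below is illustrative.

# Oxide molar masses (g/mol) and cations/oxygens per oxide formula unit.
OXIDES = {"SiO2": (60.08, 1, 2), "MgO": (40.30, 1, 1), "FeO": (71.85, 1, 1)}

def structural_formula(wt_percent, oxygens_per_formula):
    """Cations per formula unit, normalized to a fixed number of oxygens."""
    cat_moles, oxy_moles = {}, 0.0
    for oxide, wt in wt_percent.items():
        mw, n_cat, n_oxy = OXIDES[oxide]
        moles = wt / mw
        cat_moles[oxide] = moles * n_cat
        oxy_moles += moles * n_oxy
    factor = oxygens_per_formula / oxy_moles
    return {ox: round(m * factor, 3) for ox, m in cat_moles.items()}

# Illustrative olivine analysis (wt%), normalized to 4 oxygens:
print(structural_formula({"SiO2": 40.8, "MgO": 49.4, "FeO": 9.0}, oxygens_per_formula=4))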
Open source software in a practical approach for post processing of radiologic images.
Valeri, Gianluca; Mazza, Francesco Antonino; Maggi, Stefania; Aramini, Daniele; La Riccia, Luigi; Mazzoni, Giovanni; Giovagnoni, Andrea
2015-03-01
The purpose of this paper is to evaluate the use of open source software (OSS) to process DICOM images. We selected 23 programs for Windows and 20 programs for Mac from 150 possible OSS programs including DICOM viewers and various tools (converters, DICOM header editors, etc.). The programs selected all meet the basic requirements such as free availability, stand-alone application, presence of a graphical user interface, ease of installation and advanced features beyond simple image display. Capabilities of data import, data export, metadata, 2D viewer, 3D viewer, support platform and usability of each selected program were evaluated on a scale ranging from 1 to 10 points. Twelve programs received a score higher than or equal to eight. Among them, five obtained a score of 9: 3D Slicer, MedINRIA, MITK 3M3, VolView, VR Render; while OsiriX received 10. OsiriX appears to be the only program able to perform all the operations taken into consideration, similar to a workstation equipped with proprietary software, allowing the analysis and interpretation of images in a simple and intuitive way. OsiriX is a DICOM PACS workstation for medical imaging and software for image processing for medical research, functional imaging, 3D imaging, confocal microscopy and molecular imaging. This application is also a good tool for teaching activities because it facilitates the attainment of learning objectives among students and other specialists.
Designing application software in wide area network settings
NASA Technical Reports Server (NTRS)
Makpangou, Mesaac; Birman, Ken
1990-01-01
Progress in methodologies for developing robust local area network software has not been matched by similar results for wide area settings. The design of application software spanning multiple local area environments is examined. For important classes of applications, simple design techniques are presented that yield fault tolerant wide area programs. An implementation of these techniques as a set of tools for use within the ISIS system is described.
Noise studies of communication systems using the SYSTID computer aided analysis program
NASA Technical Reports Server (NTRS)
Tranter, W. H.; Dawson, C. T.
1973-01-01
SYSTID is a simple computer-aided design program for simulating data systems and communication links. The efficiency of the method was assessed by simulating a linear analog communication system to determine its noise performance and by comparing the SYSTID result with the result obtained by theoretical calculation. It is shown that the SYSTID program is readily applicable to the analysis of these types of systems.
A Switching-Mode Power Supply Design Tool to Improve Learning in a Power Electronics Course
ERIC Educational Resources Information Center
Miaja, P. F.; Lamar, D. G.; de Azpeitia, M.; Rodriguez, A.; Rodriguez, M.; Hernando, M. M.
2011-01-01
The static design of ac/dc and dc/dc switching-mode power supplies (SMPS) relies on a simple but repetitive process. Although specific spreadsheets, available in various computer-aided design (CAD) programs, are widely used, they are difficult to use in educational applications. In this paper, a graphic tool programmed in MATLAB is presented,…
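To illustrate the kind of repetitive static design such a tool automates (a hedged Python sketch rather than the authors' MATLAB tool), the standard continuous-conduction-mode buck converter equations give the duty cycle and the inductor and capacitor values from the ripple specifications; the numbers are placeholders.

def buck_static_design(v_in, v_out, i_out, f_sw, ripple_i_frac=0.3, ripple_v=0.01):
    """Ideal continuous-conduction-mode buck converter design equations."""
    duty = v_out / v_in                              # D = Vout/Vin (lossless)
    d_il = ripple_i_frac * i_out                     # peak-to-peak inductor current ripple
    inductance = v_out * (1 - duty) / (d_il * f_sw)  # L = Vout(1-D)/(dIL*fs)
    capacitance = d_il / (8 * f_sw * ripple_v)       # C = dIL/(8*fs*dVout)
    return duty, inductance, capacitance

d, L, C = buck_static_design(v_in=24.0, v_out=5.0, i_out=2.0, f_sw=200e3)
print(f"D = {d:.3f}, L = {L*1e6:.1f} uH, C = {C*1e6:.1f} uF")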
Program CONTRAST--A general program for the analysis of several survival or recovery rate estimates
Hines, J.E.; Sauer, J.R.
1989-01-01
This manual describes the use of program CONTRAST, which implements a generalized procedure for the comparison of several rate estimates. This method can be used to test both simple and composite hypotheses about rate estimates, and we discuss its application to multiple comparisons of survival rate estimates. Several examples of the use of program CONTRAST are presented. Program CONTRAST will run on IBM-compatible computers, and requires estimates of the rates to be tested, along with associated variance and covariance estimates.
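The statistical idea behind such a contrast test can be sketched in Python (a generic implementation, not program CONTRAST itself): a contrast matrix C applied to the rate estimates theta and their variance-covariance matrix Sigma gives the statistic chi2 = (C theta)' (C Sigma C')^-1 (C theta) with rank(C) degrees of freedom. The survival estimates below are made up.

import numpy as np
from scipy.stats import chi2

def contrast_test(theta, cov, C):
    """Chi-square test of H0: C @ theta = 0."""
    d = C @ theta
    stat = float(d @ np.linalg.inv(C @ cov @ C.T) @ d)
    df = np.linalg.matrix_rank(C)
    return stat, df, chi2.sf(stat, df)

# Made-up survival estimates for three groups with independent sampling variances.
theta = np.array([0.62, 0.55, 0.48])
cov = np.diag([0.0016, 0.0020, 0.0025])
C = np.array([[1.0, -1.0, 0.0],       # tests equality of all three rates
              [0.0, 1.0, -1.0]])
print(contrast_test(theta, cov, C))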
Yaniv, Ziv; Lowekamp, Bradley C; Johnson, Hans J; Beare, Richard
2018-06-01
Modern scientific endeavors increasingly require team collaborations to construct and interpret complex computational workflows. This work describes an image-analysis environment that supports the use of computational tools that facilitate reproducible research and support scientists with varying levels of software development skills. The Jupyter notebook web application is the basis of an environment that enables flexible, well-documented, and reproducible workflows via literate programming. Image-analysis software development is made accessible to scientists with varying levels of programming experience via the use of the SimpleITK toolkit, a simplified interface to the Insight Segmentation and Registration Toolkit. Additional features of the development environment include user friendly data sharing using online data repositories and a testing framework that facilitates code maintenance. SimpleITK provides a large number of examples illustrating educational and research-oriented image analysis workflows for free download from GitHub under an Apache 2.0 license: github.com/InsightSoftwareConsortium/SimpleITK-Notebooks .
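As a small taste of the notebook workflows described above (a minimal sketch; the file names are placeholders), SimpleITK reads an image, applies a smoothing filter, and hands the pixel data to NumPy for further analysis.

import SimpleITK as sitk

image = sitk.ReadImage("example_ct.nii.gz")                # placeholder file name
smoothed = sitk.SmoothingRecursiveGaussian(image, 2.0)     # Gaussian smoothing, sigma = 2.0
array = sitk.GetArrayFromImage(smoothed)                   # NumPy array for further analysis
print(image.GetSize(), float(array.mean()))
sitk.WriteImage(smoothed, "example_ct_smoothed.nii.gz")    # placeholder output name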
Technology in rural transportation. Simple solution #9, transportation operations optimization
DOT National Transportation Integrated Search
1997-01-01
This application was identified as a promising rural Intelligent Transportation Systems (ITS) solution under a project sponsored by the Federal Highway Administration (FHWA) and the ENTERPRISE program. This summary describes the solution as well as o...
Technology in rural transportation. Simple solution #7, lane drop driver awareness
DOT National Transportation Integrated Search
1997-01-01
This application was identified as a promising rural Intelligent Transportation Systems (ITS) solution under a project sponsored by the Federal Highway Administration (FHWA) and the ENTERPRISE program. This summary describes the solution as well as o...
Technology in rural transportation. Simple solution #8, mobile weather sensors
DOT National Transportation Integrated Search
1997-01-01
This application was identified as a promising rural Intelligent Transportation Systems (ITS) solution under a project sponsored by the Federal Highway Administration (FHWA) and the ENTERPRISE program. This summary describes the solution as well as o...
Technology in rural transportation. Simple solution #4, low cost vehicle detection
DOT National Transportation Integrated Search
1997-01-01
This application was identified as a promising rural Intelligent Transportation Systems (ITS) solution under a project sponsored by the Federal Highway Administration (FHWA) and the ENTERPRISE program. This summary describes the solution as well as o...
Technology in rural transportation. Simple solution #3, coordinated rural transit services
DOT National Transportation Integrated Search
1997-01-01
This application was identified as a promising rural Intelligent Transportation Systems (ITS) solution under a project sponsored by the Federal Highway Administration (FHWA) and the ENTERPRISE program. This summary describes the solution as well as o...
Technology in rural transportation. Simple solution #14, public service weather radio
DOT National Transportation Integrated Search
1997-01-01
This application was identified as a promising rural Intelligent Transportation Systems (ITS) solution under a project sponsored by the Federal Highway Administration (FHWA) and the ENTERPRISE program. This summary describes the solution as well as o...
Technology in rural transportation. Simple solution #2, coordinate addressing system
DOT National Transportation Integrated Search
1997-01-01
This application was identified as a promising rural Intelligent Transportation Systems (ITS) solution under a project sponsored by the Federal Highway Administration (FHWA) and the ENTERPRISE program. This summary describes the solution as well as o...
SYVA: A program to analyze symmetry of molecules based on vector algebra
NASA Astrophysics Data System (ADS)
Gyevi-Nagy, László; Tasi, Gyula
2017-06-01
Symmetry is a useful concept in physics and chemistry. It can be used to find out some simple properties of a molecule or to simplify complex calculations. In this paper a simple vector algebraic method is described to determine all symmetry elements of an arbitrary molecule. To carry out the symmetry analysis, a program has been written which is also capable of generating the framework group of the molecule, revealing the symmetry properties of normal modes of vibration and symmetrizing the structure. To demonstrate the capabilities of the program, it is compared to other widely used stand-alone symmetry analyzers (SYMMOL, Symmetrizer) and molecular modeling software (NWChem, ORCA, MRCC). SYVA can generate input files for molecular modeling programs, e.g. Gaussian, using precisely symmetrized molecular structures. Possible applications are also demonstrated by integrating SYVA with the GAMESS and MRCC software.
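The vector-algebraic core of such a symmetry check can be sketched in Python (illustrative only, not SYVA itself): apply a candidate operation, here a C2 rotation about the z axis, to the coordinates of a water-like molecule and test whether the structure maps onto itself within a tolerance.

import numpy as np

# Water-like geometry (element, xyz in angstroms), C2 axis along z.
atoms = [("O", np.array([0.000, 0.000, 0.117])),
         ("H", np.array([0.000, 0.757, -0.469])),
         ("H", np.array([0.000, -0.757, -0.469]))]

def is_symmetry_operation(atoms, R, tol=1e-3):
    """True if applying R to every atom maps the structure onto itself."""
    for elem, xyz in atoms:
        transformed = R @ xyz
        if not any(e == elem and np.linalg.norm(transformed - p) < tol for e, p in atoms):
            return False
    return True

C2z = np.diag([-1.0, -1.0, 1.0])       # 180-degree rotation about z
sigma_xz = np.diag([1.0, -1.0, 1.0])   # reflection through the xz plane
print(is_symmetry_operation(atoms, C2z), is_symmetry_operation(atoms, sigma_xz))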
Simple measurement of lenticular lens quality for autostereoscopic displays
NASA Astrophysics Data System (ADS)
Gray, Stuart; Boudreau, Robert A.
2013-03-01
Lenticular lens based autostereoscopic 3D displays are finding many applications in digital signage and consumer electronics devices. A high quality 3D viewing experience requires the lenticular lens be properly aligned with the pixels on the display device so that each eye views the correct image. This work presents a simple and novel method for rapidly assessing the quality of a lenticular lens to be used in autostereoscopic displays. Errors in lenticular alignment across the entire display are easily observed with a simple test pattern where adjacent views are programmed to display different colors.
Discrete sequence prediction and its applications
NASA Technical Reports Server (NTRS)
Laird, Philip
1992-01-01
Learning from experience to predict sequences of discrete symbols is a fundamental problem in machine learning with many applications. We apply sequence prediction using a simple and practical sequence-prediction algorithm, called TDAG. The TDAG algorithm is first tested by comparing its performance with some common data compression algorithms. Then it is adapted to the detailed requirements of dynamic program optimization, with excellent results.
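To make learned sequence prediction concrete (a simple order-k context model, not the TDAG algorithm itself), the Python sketch below counts which symbol follows each recent context and predicts the most frequent continuation.

from collections import defaultdict, Counter

class ContextPredictor:
    """Predict the next symbol from counts of what followed recent contexts."""
    def __init__(self, max_order=3):
        self.max_order = max_order
        self.counts = defaultdict(Counter)   # context tuple -> next-symbol counts
        self.history = []

    def update(self, symbol):
        for k in range(1, self.max_order + 1):
            if len(self.history) >= k:
                self.counts[tuple(self.history[-k:])][symbol] += 1
        self.history.append(symbol)

    def predict(self):
        for k in range(self.max_order, 0, -1):      # longest matching context first
            ctx = tuple(self.history[-k:])
            if ctx in self.counts:
                return self.counts[ctx].most_common(1)[0][0]
        return None

p = ContextPredictor()
for s in "abcabcabcab":
    p.update(s)
print(p.predict())    # after "...ab", the model expects "c"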
Elastic-plastic analysis of a propagating crack under cyclic loading
NASA Technical Reports Server (NTRS)
Newman, J. C., Jr.; Armen, H., Jr.
1974-01-01
Development and application of a two-dimensional finite-element analysis to predict crack-closure and crack-opening stresses during specified histories of cyclic loading. An existing finite-element computer program which accounts for elastic-plastic material behavior under cyclic loading was modified to account for changing boundary conditions - crack growth and intermittent contact of crack surfaces. This program was subsequently used to study the crack-closure behavior under constant-amplitude and simple block-program loading.
DOT National Transportation Integrated Search
1997-01-01
This application was identified as a promising rural Intelligent Transportation Systems (ITS) solution under a project sponsored by the Federal Highway Administration (FHWA) and the ENTERPRISE program. This summary describes the solution as well as o...
Technology in rural transportation. Simple solution #10, wireless pagers to activate warning beacons
DOT National Transportation Integrated Search
1997-01-01
This application was identified as a promising rural Intelligent Transportation Systems (ITS) solution under a project sponsored by the Federal Highway Administration (FHWA) and the ENTERPRISE program. This summary describes the solution as well as o...
Technology in rural transportation. Simple solution #6, traveler information on the internet
DOT National Transportation Integrated Search
1997-01-01
This application was identified as a promising rural Intelligent Transportation Systems (ITS) solution under a project sponsored by the Federal Highway Administration (FHWA) and the ENTERPRISE program. This summary describes the solution as well as o...
DOT National Transportation Integrated Search
1997-01-01
This application was identified as a promising rural Intelligent Transportation Systems (ITS) solution under a project sponsored by the Federal Highway Administration (FHWA) and the ENTERPRISE program. This summary describes the solution as well as o...
DOT National Transportation Integrated Search
1997-01-01
This application was identified as a promising rural Intelligent Transportation Systems (ITS) solution under a project sponsored by the Federal Highway Administration (FHWA) and the ENTERPRISE program. This summary describes the solution as well as o...
Technology in rural transportation. Simple solution #5, traveler information using fax machines
DOT National Transportation Integrated Search
1997-01-01
This application was identified as a promising rural Intelligent Transportation Systems (ITS) solution under a project sponsored by the Federal Highway Administration (FHWA) and the ENTERPRISE program. This summary describes the solution as well as o...
Chips: A Tool for Developing Software Interfaces Interactively.
1987-10-01
... of the application through the objects on the screen. Chips makes this easy by supplying simple and direct access to the source code and data ... Keywords: object-oriented programming, user interface management systems, programming environments.
Concurrent Image Processing Executive (CIPE). Volume 2: Programmer's guide
NASA Technical Reports Server (NTRS)
Williams, Winifred I.
1990-01-01
This manual is intended as a guide for application programmers using the Concurrent Image Processing Executive (CIPE). CIPE is intended to become the support system software for a prototype high performance science analysis workstation. In its current configuration CIPE utilizes a JPL/Caltech Mark 3fp Hypercube with a Sun-4 host. CIPE's design is capable of incorporating other concurrent architectures as well. CIPE provides a programming environment to application programmers to shield them from various user interfaces, file transactions, and architectural complexities. A programmer may choose to write applications to use only the Sun-4 or to use the Sun-4 with the hypercube. A hypercube program will use the hypercube's data processors and optionally the Weitek floating point accelerators. The CIPE programming environment provides a simple set of subroutines to activate user interface functions, specify data distributions, activate hypercube resident applications, and communicate parameters to and from the hypercube.
An Accessible User Interface for Geoscience and Programming
NASA Astrophysics Data System (ADS)
Sevre, E. O.; Lee, S.
2012-12-01
The goal of this research is to develop an interface that will simplify user interaction with software for scientists. The motivating factor of the research is to develop tools that assist scientists with limited motor skills with the efficient generation and use of software tools. Reliance on computers and programming is increasing in the world of geology, and it is increasingly important for geologists and geophysicists to have the computational resources to use advanced software and edit programs for their research. I have developed a prototype of a program to help geophysicists write programs using a simple interface that requires only single mouse clicks to input code. It is my goal to minimize the amount of typing necessary to create simple programs and scripts to increase accessibility for people with disabilities limiting fine motor skills. This interface can be adapted for various programming and scripting languages. Using this interface will simplify development of code for C/C++, Java, and GMT, and can be expanded to support any other text based programming language. The interface is designed around the concept of maximizing the amount of code that can be written using a minimum number of clicks and typing. The screen is split into two sections: a list of click-commands is on the left hand side, and a text area is on the right hand side. When the user clicks on a command on the left hand side, the applicable code is automatically inserted at the insertion point in the text area. Currently in the C/C++ interface, there are commands for common code segments that are often used, such as for loops, comments, print statements, and structured code creation. The primary goal is to provide an interface that will work across many devices for developing code. A simple prototype has been developed for the iPad. Due to the limited number of devices that an iOS application can be used with, the code has been re-written in Java to run on a wider range of devices. Currently, the software works in a prototype mode, and our goal is to develop it further into software that can benefit a wide range of people working in the geosciences, making code development practical and accessible to a wider audience of scientists. An interface like this also reduces the potential for errors by reusing known working code.
Kangaroo – A pattern-matching program for biological sequences
2002-01-01
Background: Biologists are often interested in performing a simple database search to identify proteins or genes that contain a well-defined sequence pattern. Many databases do not provide straightforward or readily available query tools to perform simple searches, such as identifying transcription binding sites, protein motifs, or repetitive DNA sequences. However, in many cases simple pattern-matching searches can reveal a wealth of information. We present in this paper a regular expression pattern-matching tool that was used to identify short repetitive DNA sequences in human coding regions for the purpose of identifying potential mutation sites in mismatch repair deficient cells. Results: Kangaroo is a web-based regular expression pattern-matching program that can search for patterns in DNA, protein, or coding region sequences in ten different organisms. The program is implemented to facilitate a wide range of queries with no restriction on the length or complexity of the query expression. The program is accessible on the web at http://bioinfo.mshri.on.ca/kangaroo/ and the source code is freely distributed at http://sourceforge.net/projects/slritools/. Conclusion: A low-level simple pattern-matching application can prove to be a useful tool in many research settings. For example, Kangaroo was used to identify potential genetic targets in a human colorectal cancer variant that is characterized by a high frequency of mutations in coding regions containing mononucleotide repeats. PMID:12150718
Molecular implementation of simple logic programs.
Ran, Tom; Kaplan, Shai; Shapiro, Ehud
2009-10-01
Autonomous programmable computing devices made of biomolecules could interact with a biological environment and be used in future biological and medical applications. Biomolecular implementations of finite automata and logic gates have already been developed. Here, we report an autonomous programmable molecular system based on the manipulation of DNA strands that is capable of performing simple logical deductions. Using molecular representations of facts such as Man(Socrates) and rules such as Mortal(X) <-- Man(X) (Every Man is Mortal), the system can answer molecular queries such as Mortal(Socrates)? (Is Socrates Mortal?) and Mortal(X)? (Who is Mortal?). This biomolecular computing system compares favourably with previous approaches in terms of expressive power, performance and precision. A compiler translates facts, rules and queries into their molecular representations and subsequently operates a robotic system that assembles the logical deductions and delivers the result. This prototype is the first simple programming language with a molecular-scale implementation.
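The logical layer being implemented molecularly can be illustrated with a tiny forward-chaining interpreter in Python (a sketch of the deduction only, not the DNA-strand implementation): given Man(Socrates) and Mortal(X) <-- Man(X), it derives Mortal(Socrates) and can answer Mortal(X)? queries.

facts = {("Man", "Socrates"), ("Man", "Plato")}
# Rules of the form Head(X) <-- Body(X), stored as (head_predicate, body_predicate).
rules = [("Mortal", "Man")]

def forward_chain(facts, rules):
    """Repeatedly apply single-antecedent rules until no new facts appear."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for head, body in rules:
            for pred, arg in list(derived):
                if pred == body and (head, arg) not in derived:
                    derived.add((head, arg))
                    changed = True
    return derived

kb = forward_chain(facts, rules)
print(("Mortal", "Socrates") in kb)                      # Mortal(Socrates)?  -> True
print([arg for pred, arg in kb if pred == "Mortal"])     # Mortal(X)?  -> who is mortal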
Numerical simulation of NQR/NMR: Applications in quantum computing.
Possa, Denimar; Gaudio, Anderson C; Freitas, Jair C C
2011-04-01
A numerical simulation program able to simulate nuclear quadrupole resonance (NQR) as well as nuclear magnetic resonance (NMR) experiments is presented, written using the Mathematica package and aimed especially at applications in quantum computing. The program makes use of the interaction picture to compute the effect of the relevant nuclear spin interactions, without any assumption about the relative size of each interaction. This makes the program flexible and versatile, being useful in a wide range of experimental situations, going from NQR (at zero or under small applied magnetic field) to high-field NMR experiments. Some conditions specifically required for quantum computing applications are implemented in the program, such as the possibility of use of elliptically polarized radiofrequency and the inclusion of first- and second-order terms in the average Hamiltonian expansion. A number of examples dealing with simple NQR and quadrupole-perturbed NMR experiments are presented, along with the proposal of experiments to create quantum pseudopure states and logic gates using NQR. The program and the various application examples are freely available through the link http://www.profanderson.net/files/nmr_nqr.php. Copyright © 2011 Elsevier Inc. All rights reserved.
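A minimal flavor of such spin simulations in Python (not the Mathematica program itself): a spin-1/2 in a static field evolves under the Zeeman Hamiltonian H = -omega0*Iz, and the expectation value of Ix precesses at the Larmor frequency; the parameter values are illustrative.

import numpy as np
from scipy.linalg import expm

# Spin-1/2 operators (hbar = 1).
Ix = 0.5 * np.array([[0, 1], [1, 0]], dtype=complex)
Iz = 0.5 * np.array([[1, 0], [0, -1]], dtype=complex)

omega0 = 2 * np.pi * 10.0            # Larmor frequency in rad per unit time (illustrative)
H = -omega0 * Iz                     # Zeeman Hamiltonian
rho = Ix.copy()                      # start with magnetization along x

for t in np.linspace(0.0, 0.1, 5):
    U = expm(-1j * H * t)            # propagator exp(-i H t)
    rho_t = U @ rho @ U.conj().T
    mx = np.trace(rho_t @ Ix).real
    print(f"t = {t:.3f}: <Ix> = {mx:+.3f}")   # proportional to cos(omega0 * t)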
LISP as an Environment for Software Design: Powerful and Perspicuous
Blum, Robert L.; Walker, Michael G.
1986-01-01
The LISP language provides a useful set of features for prototyping knowledge-intensive, clinical applications software that is not found in most other programming environments. Medical computer programs that need large medical knowledge bases, such as programs for diagnosis, therapeutic consultation, education, simulation, and peer review, are hard to design, evolve continually, and often require major revisions. They necessitate an efficient and flexible program development environment. The LISP language and programming environments built around it are well suited for program prototyping. The lingua franca of artificial intelligence researchers, LISP facilitates building complex systems because it is simple yet powerful. Because of its simplicity, LISP programs can read, execute, modify and even compose other LISP programs at run time. Hence, it has been easy for system developers to create programming tools that greatly speed the program development process, and that may be easily extended by users. This has resulted in the creation of many useful graphical interfaces, editors, and debuggers, which facilitate the development of knowledge-intensive medical applications.
CLMNANAL: A C++ program for application of the Coleman stability analysis to rotorcraft
NASA Technical Reports Server (NTRS)
Lance, Michael B.
1996-01-01
This program is an adaptation of the theory of Robert P. Coleman and Arnold M. Feingold as presented in NACA Report 1351, 1958. This theory provided a method for the analysis of multiple-bladed rotor systems to determine the system susceptibility to ground resonance. Their treatment also provided a simple means for determining the required product of rotor and chassis damping factors to suppress the resonance. This C++ program is based on a FORTRAN 77 version of a similar code.
Extending cluster Lot Quality Assurance Sampling designs for surveillance programs
Hund, Lauren; Pagano, Marcello
2014-01-01
Lot quality assurance sampling (LQAS) has a long history of applications in industrial quality control. LQAS is frequently used for rapid surveillance in global health settings, with areas classified as poor or acceptable performance based on the binary classification of an indicator. Historically, LQAS surveys have relied on simple random samples from the population; however, implementing two-stage cluster designs for surveillance sampling is often more cost-effective than simple random sampling. By applying survey sampling results to the binary classification procedure, we develop a simple and flexible non-parametric procedure to incorporate clustering effects into the LQAS sample design to appropriately inflate the sample size, accommodating finite numbers of clusters in the population when relevant. We use this framework to then discuss principled selection of survey design parameters in longitudinal surveillance programs. We apply this framework to design surveys to detect rises in malnutrition prevalence in nutrition surveillance programs in Kenya and South Sudan, accounting for clustering within villages. By combining historical information with data from previous surveys, we design surveys to detect spikes in the childhood malnutrition rate. PMID:24633656
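The sample-size inflation step can be sketched with the standard design-effect formula (a generic illustration consistent with the approach described, not the authors' exact procedure); the intra-cluster correlation and cluster size below are placeholders.

import math

def inflate_for_clustering(n_srs, cluster_size, icc):
    """Inflate a simple-random-sample size by the design effect 1 + (m - 1) * ICC."""
    deff = 1.0 + (cluster_size - 1) * icc
    n_cluster_design = math.ceil(n_srs * deff)
    n_clusters = math.ceil(n_cluster_design / cluster_size)
    return deff, n_cluster_design, n_clusters

# Placeholder values: LQAS sample of 59 under SRS, 10 children per village, ICC = 0.1.
print(inflate_for_clustering(n_srs=59, cluster_size=10, icc=0.1))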
Runtime and Architecture Support for Efficient Data Exchange in Multi-Accelerator Applications.
Cabezas, Javier; Gelado, Isaac; Stone, John E; Navarro, Nacho; Kirk, David B; Hwu, Wen-Mei
2015-05-01
Heterogeneous parallel computing applications often process large data sets that require multiple GPUs to jointly meet their needs for physical memory capacity and compute throughput. However, the lack of high-level abstractions in previous heterogeneous parallel programming models forces programmers to resort to multiple code versions, complex data copy steps and synchronization schemes when exchanging data between multiple GPU devices, which results in high software development cost, poor maintainability, and even poor performance. This paper describes the HPE runtime system, and the associated architecture support, which enables a simple, efficient programming interface for exchanging data between multiple GPUs through either interconnects or cross-node network interfaces. The runtime and architecture support presented in this paper can also be used to support other types of accelerators. We show that the simplified programming interface reduces programming complexity. The research presented in this paper started in 2009. It has been implemented and tested extensively in several generations of HPE runtime systems as well as adopted into the NVIDIA GPU hardware and drivers for CUDA 4.0 and beyond since 2011. The availability of real hardware that supports key HPE features gives rise to a rare opportunity for studying the effectiveness of the hardware support by running important benchmarks on real runtime and hardware. Experimental results show that in an exemplar heterogeneous system, peer DMA and double-buffering, pinned buffers, and software techniques can improve the inter-accelerator data communication bandwidth by 2×. They can also improve the execution speed by 1.6× for a 3D finite difference, 2.5× for 1D FFT, and 1.6× for merge sort, all measured on real hardware. The proposed architecture support enables the HPE runtime to transparently deploy these optimizations under simple portable user code, allowing system designers to freely employ devices of different capabilities. We further argue that simple interfaces such as HPE are needed for most applications to benefit from advanced hardware features in practice.
Runtime and Architecture Support for Efficient Data Exchange in Multi-Accelerator Applications
Cabezas, Javier; Gelado, Isaac; Stone, John E.; Navarro, Nacho; Kirk, David B.; Hwu, Wen-mei
2014-01-01
Heterogeneous parallel computing applications often process large data sets that require multiple GPUs to jointly meet their needs for physical memory capacity and compute throughput. However, the lack of high-level abstractions in previous heterogeneous parallel programming models forces programmers to resort to multiple code versions, complex data copy steps, and synchronization schemes when exchanging data between multiple GPU devices, which results in high software development cost, poor maintainability, and even poor performance. This paper describes the HPE runtime system, and the associated architecture support, which enables a simple, efficient programming interface for exchanging data between multiple GPUs through either interconnects or cross-node network interfaces. The runtime and architecture support presented in this paper can also be used to support other types of accelerators. We show that the simplified programming interface reduces programming complexity. The research presented in this paper started in 2009. It has been implemented and tested extensively in several generations of HPE runtime systems as well as adopted into the NVIDIA GPU hardware and drivers for CUDA 4.0 and beyond since 2011. The availability of real hardware that supports key HPE features provides a rare opportunity for studying the effectiveness of the hardware support by running important benchmarks on real runtime and hardware. Experimental results show that in an exemplar heterogeneous system, peer DMA and double-buffering, pinned buffers, and software techniques can improve the inter-accelerator data communication bandwidth by 2×. They can also improve the execution speed by 1.6× for a 3D finite difference, 2.5× for 1D FFT, and 1.6× for merge sort, all measured on real hardware. The proposed architecture support enables the HPE runtime to transparently deploy these optimizations under simple portable user code, allowing system designers to freely employ devices of different capabilities. We further argue that simple interfaces such as HPE are needed for most applications to benefit from advanced hardware features in practice. PMID:26180487
Path-Following Solutions Of Nonlinear Equations
NASA Technical Reports Server (NTRS)
Barger, Raymond L.; Walters, Robert W.
1989-01-01
Report describes some path-following techniques for solution of nonlinear equations and compares with other methods. Use of multipurpose techniques applicable at more than one stage of path-following computation results in system relatively simple to understand, program, and use. Comparison of techniques with method of parametric differentiation (MPD) reveals definite advantages for path-following methods. Emphasis in investigation on multiuse techniques being applied at more than one stage of path-following computation. Incorporation of multipurpose techniques results in concise computer code relatively simple to use.
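The report's multipurpose techniques are not reproduced in the abstract; as a generic illustration of the path-following idea, a natural-parameter continuation can be written in a few lines, with the example system chosen arbitrarily.

    # Generic natural-parameter continuation for F(x, t) = 0, t in [0, 1].
    # This is only an illustrative sketch, not the report's algorithm; the
    # system x**3 + x - t = 0 is an arbitrary example.
    import numpy as np
    from scipy.optimize import fsolve

    def F(x, t):
        return x**3 + x - t

    x = np.array([0.0])                 # known solution at t = 0
    path = [(0.0, x.copy())]
    for t in np.linspace(0.0, 1.0, 21)[1:]:
        x = fsolve(F, x, args=(t,))     # previous solution seeds the next solve
        path.append((t, x.copy()))

    print(path[-1])                     # root of x**3 + x = 1, roughly 0.682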
NASA Astrophysics Data System (ADS)
Mlimandago, S.
This research paper presents several simple and easy new concepts in document management for space projects and programs, which can be applied anywhere in both developing and developed countries. These concepts have been applied in Tanzania, Kenya and Uganda and have been found to produce very good results using simple procedures. The integral project based its documentation management approach from the outset on electronic document sharing and archiving. The main objective of the new concepts was to provide faster and wider availability of the most current space information to all parties, rather than to create a paperless office. Implementation of the new approach required capturing documents in an appropriate and simple electronic format at the source, establishing new procedures for project-wide information sharing, and deploying a new generation of simple, Web-based tools. Key success factors were the early adoption of Internet technologies and simple procedures for improved information flow; these concepts can be applied anywhere in both developed and developing countries.
Lowekamp, Bradley C; Chen, David T; Ibáñez, Luis; Blezek, Daniel
2013-01-01
SimpleITK is a new interface to the Insight Segmentation and Registration Toolkit (ITK) designed to facilitate rapid prototyping, education and scientific activities via high-level programming languages. ITK is a templated C++ library of image processing algorithms and frameworks for biomedical and other applications, and it was designed to be generic, flexible and extensible. Initially, ITK provided a direct wrapping interface to languages such as Python and Tcl through the WrapITK system. Unlike WrapITK, which exposed ITK's complex templated interface, SimpleITK was designed to provide an easy-to-use, simplified interface to ITK's algorithms. It includes procedural methods, hides ITK's demand-driven pipeline, and provides a template-less layer. SimpleITK also provides practical conveniences such as binary distribution packages and overloaded operators. Our user-friendly design goals dictated a departure from the direct interface wrapping approach of WrapITK, toward a new facade class structure that only exposes the required functionality, hiding ITK's extensive template use. Internally, SimpleITK uses a manual description of each filter with code generation and advanced C++ meta-programming to provide the higher-level interface, bringing the capabilities of ITK to a wider audience. SimpleITK is licensed as an open source software library under the Apache License Version 2.0; more information about downloading it can be found at http://www.simpleitk.org.
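For readers unfamiliar with the package, a typical SimpleITK session from Python looks like the following; the file names are placeholders and the filter choices are illustrative.

    # Illustrative SimpleITK usage; file names are placeholders.
    import SimpleITK as sitk

    image = sitk.ReadImage("input_volume.nii.gz")           # any ITK-supported format
    smoothed = sitk.SmoothingRecursiveGaussian(image, 2.0)  # procedural call, no pipeline setup
    mask = sitk.OtsuThreshold(smoothed)                     # automatic threshold; defaults give a binary mask
    sitk.WriteImage(mask, "mask.nii.gz")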
Reference Models for Structural Technology Assessment and Weight Estimation
NASA Technical Reports Server (NTRS)
Cerro, Jeff; Martinovic, Zoran; Eldred, Lloyd
2005-01-01
Previously, the Exploration Concepts Branch of NASA Langley Research Center developed techniques for automating the preliminary-design-level structural analysis of launch vehicle airframes for purposes of enhancing historical regression-based mass estimating relationships. This past work was useful and greatly reduced design time; however, its application area was very narrow in terms of the variety of structural and vehicle general arrangement alternatives it could handle. Implementation of the analysis approach presented herein also incorporates some newly developed computer programs. Loft is a program developed to create analysis meshes and simultaneously define structural element design regions. A simple component-defining ASCII file is read by Loft to begin the design process. HSLoad is a Visual Basic implementation of the HyperSizer Application Programming Interface, which automates the structural element design process. Details of these two programs and their use are explained in this paper. A feature which falls naturally out of the above analysis paradigm is the concept of "reference models". The flexibility of the FEA-based JAVA processing procedures and associated process control classes, coupled with the general utility of Loft and HSLoad, makes it possible to create generic program template files for analysis of components ranging from something as simple as a stiffened flat panel, to curved panels, fuselage and cryogenic tank components, flight control surfaces, wings, through full air and space vehicle general arrangements.
ERIC Educational Resources Information Center
Toumasis, Charalampos
2004-01-01
Emphasis on problem solving and mathematical modeling has gained considerable attention in the last few years. Connecting mathematics to other subjects and to the real world outside the classroom has received increased attention in mathematics programs. This article describes an application of simple differential equations in the field of…
DOT National Transportation Integrated Search
2000-04-01
An innovative and simple approach is presented for estimation of the resilient modulus of subgrade soils utilizing the cone penetration test. Field and laboratory testing programs were carried out at seven sites that comprise three common soil types ...
Next generation simulation tools: the Systems Biology Workbench and BioSPICE integration.
Sauro, Herbert M; Hucka, Michael; Finney, Andrew; Wellock, Cameron; Bolouri, Hamid; Doyle, John; Kitano, Hiroaki
2003-01-01
Researchers in quantitative systems biology make use of a large number of different software packages for modelling, analysis, visualization, and general data manipulation. In this paper, we describe the Systems Biology Workbench (SBW), a software framework that allows heterogeneous application components--written in diverse programming languages and running on different platforms--to communicate and use each other's capabilities via a fast, binary encoded-message system. Our goal was to create a simple, high-performance, open-source software infrastructure which is easy to implement and understand. SBW enables applications (potentially running on separate, distributed computers) to communicate via a simple network protocol. The interfaces to the system are encapsulated in client-side libraries that we provide for different programming languages. We describe in this paper the SBW architecture, a selection of current modules, including Jarnac, JDesigner, and SBWMeta-tool, and the close integration of SBW into BioSPICE, which enables both frameworks to share tools and complement and strengthen each other's capabilities.
Lee, Ying Li; Chien, Tsai Feng; Kuo, Ming Chuan; Chang, Polun
2014-01-01
This study aims to understand the relationship between participating nurses' motivation, achievement and satisfaction before and after they learned to program in Excel Visual Basic for Applications (Excel VBA). We held a workshop to train nurses in developing simple Excel VBA information systems to support their clinical or administrative practices. Before and after the workshop, the participants were evaluated on their knowledge of Excel VBA, and a questionnaire was given to survey their learning motivation and satisfaction. The statistical software packages Winsteps and SPSS were used for data analysis. Results show that the participants were more knowledgeable about VBA as well as more motivated in learning VBA after the workshop. Participants were highly satisfied with the overall arrangement of the workshop and the instructors, but did not have enough confidence to promote the application of Excel VBA themselves. In addition, we were unable to predict the participants' achievement from their demographic characteristics or pre-test motivation level.
Helioviewer.org: Simple Solar and Heliospheric Data Visualization
NASA Astrophysics Data System (ADS)
Hughitt, V. K.; Ireland, J.; Mueller, D.
2011-12-01
Helioviewer.org is a free and open-source web application for exploring solar physics data in a simple and intuitive manner. Over the past several years, Helioviewer.org has enabled thousands of users from across the globe to explore the inner heliosphere, providing access to over ten million images from the SOHO, SDO, and STEREO missions. While Helioviewer.org has seen a surge in use by the public in recent months, it is still ultimately a science tool. The newest version of Helioviewer.org provides access to science-quality data for all available images through the Virtual Solar Observatory (VSO). In addition to providing a powerful platform for browsing heterogeneous sets of solar data, Helioviewer.org also seeks to be as flexible and extensible as possible, providing access to much of its functionality via a simple Application Programming Interface (API). Recently, the Helioviewer.org API was used for two such applications: a Wordpress plugin, and a Python library for solar physics data analysis (SunPy). These applications are discussed and examples of API usage are provided. Finally, Helioviewer.org is undergoing continual development, with new features being added on a regular basis. Recent updates to Helioviewer.org are discussed, along with a preview of things to come.
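As an illustration of the API mentioned above, a single HTTP request is enough to retrieve image metadata; the endpoint and parameter names below follow the public api.helioviewer.org documentation as understood here and may differ from the exact version described in the abstract.

    # Minimal Helioviewer API call from Python; the endpoint name follows the
    # public v2 API documentation, and sourceId 14 is assumed to map to SDO/AIA 335.
    import requests

    resp = requests.get(
        "https://api.helioviewer.org/v2/getClosestImage/",
        params={"date": "2011-12-01T00:00:00Z", "sourceId": 14},
    )
    print(resp.json())  # metadata for the nearest available image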
NASA Technical Reports Server (NTRS)
Borchardt, G. C.
1994-01-01
The Simple Tool for Automated Reasoning program (STAR) is an interactive, interpreted programming language for the development and operation of artificial intelligence (AI) application systems. STAR provides an environment for integrating traditional AI symbolic processing with functions and data structures defined in compiled languages such as C, FORTRAN and PASCAL. This type of integration occurs in a number of AI applications including interpretation of numerical sensor data, construction of intelligent user interfaces to existing compiled software packages, and coupling AI techniques with numerical simulation techniques and control systems software. The STAR language was created as part of an AI project for the evaluation of imaging spectrometer data at NASA's Jet Propulsion Laboratory. Programming in STAR is similar to other symbolic processing languages such as LISP and CLIP. STAR includes seven primitive data types and associated operations for the manipulation of these structures. A semantic network is used to organize data in STAR, with capabilities for inheritance of values and generation of side effects. The AI knowledge base of STAR can be a simple repository of records or it can be a highly interdependent association of implicit and explicit components. The symbolic processing environment of STAR may be extended by linking the interpreter with functions defined in conventional compiled languages. These external routines interact with STAR through function calls in either direction, and through the exchange of references to data structures. The hybrid knowledge base may thus be accessed and processed in general by either side of the application. STAR is initially used to link externally compiled routines and data structures. It is then invoked to interpret the STAR rules and symbolic structures. In a typical interactive session, the user enters an expression to be evaluated, STAR parses the input, evaluates the expression, performs any file input/output required, and displays the results. The STAR interpreter is written in the C language for interactive execution. It has been implemented on a VAX 11/780 computer operating under VMS, and the UNIX version has been implemented on a Sun Microsystems 2/170 workstation. STAR has a memory requirement of approximately 200K of 8 bit bytes, excluding externally compiled functions and application-dependent symbolic definitions. This program was developed in 1985.
ERIC Educational Resources Information Center
Simkin, Mark G.
2008-01-01
Data-validation routines enable computer applications to test data to ensure their accuracy, completeness, and conformance to industry or proprietary standards. This paper presents five programming cases that require students to validate five different types of data: (1) simple user data entries, (2) UPC codes, (3) passwords, (4) ISBN numbers, and…
Calculating Strain Relief in Electronic-Component Leads
NASA Technical Reports Server (NTRS)
Snytsheuvel, H.
1985-01-01
Stress/strain formulas applicable to design of electronic-component leads compiled in report. Such things as factors of safety and whether or not lead is likely to fail in service determined in advance. Set of formulas is simple enough to be solved on programmable hand-held calculator.
Monitoring robot actions for error detection and recovery
NASA Technical Reports Server (NTRS)
Gini, M.; Smith, R.
1987-01-01
Reliability is a serious problem in computer controlled robot systems. Although robots serve successfully in relatively simple applications such as painting and spot welding, their potential in areas such as automated assembly is hampered by programming problems. A program for assembling parts may be logically correct, execute correctly on a simulator, and even execute correctly on a robot most of the time, yet still fail unexpectedly in the face of real world uncertainties. Recovery from such errors is far more complicated than recovery from simple controller errors, since even expected errors can often manifest themselves in unexpected ways. Here, a novel approach is presented for improving robot reliability. Instead of anticipating errors, researchers use knowledge-based programming techniques so that the robot can autonomously exploit knowledge about its task and environment to detect and recover from failures. They describe a preliminary experiment with a system that they designed and constructed.
SSRscanner: a program for reporting distribution and exact location of simple sequence repeats.
Anwar, Tamanna; Khan, Asad U
2006-02-20
Simple sequence repeats (SSRs) have become important molecular markers for a broad range of applications, such as genome mapping and characterization, phenotype mapping, marker-assisted selection of crop plants and a range of molecular ecology and diversity studies. These repeated DNA sequences are found in both prokaryotes and eukaryotes. They are distributed almost at random throughout the genome, ranging from mononucleotide to trinucleotide repeats. They are also found as longer tracts (> 6 repeating units). Most of the computer programs that find SSRs do not report their exact positions. A computer program, SSRscanner, was written to find the distribution, frequency and exact location of each SSR in the genome. SSRscanner is user friendly. It can search for repeats of any length and produce output with their exact position on the chromosome and their frequency of occurrence in the sequence. This program has been written in PERL and is freely available for non-commercial users by request from the authors. Please contact the authors by E-mail: huzzi99@hotmail.com.
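SSRscanner itself is written in PERL; purely for illustration, the core task of reporting each repeat with its exact position can be sketched with a regular expression in Python (the minimum repeat count is an arbitrary choice).

    # Illustrative sketch, not SSRscanner's code: report mono- to trinucleotide
    # repeats with 1-based start positions and repeat counts.
    import re

    def find_ssrs(seq, min_repeats=4):
        pattern = re.compile(r"([ACGT]{1,3}?)\1{%d,}" % (min_repeats - 1))
        for m in pattern.finditer(seq.upper()):
            unit = m.group(1)
            yield unit, m.start() + 1, len(m.group(0)) // len(unit)

    for unit, pos, count in find_ssrs("GGATATATATATCCAAAAAAGG"):
        print(unit, "x", count, "at position", pos)   # AT x 5 at 3, A x 6 at 15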
ShareSync: A Solution for Deterministic Data Sharing over Ethernet
NASA Technical Reports Server (NTRS)
Dunn, Daniel J., II; Koons, William A.; Kennedy, Richard D.; Davis, Philip A.
2007-01-01
As part of upgrading the Contact Dynamics Simulation Laboratory (CDSL) at the NASA Marshall Space Flight Center (MSFC), a simple, cost-effective method was needed to communicate data among the networked simulation machines and I/O controllers used to run the facility. To fill this need and similar applicable situations, a generic protocol was developed, called ShareSync. ShareSync is a lightweight, real-time, publish-subscribe Ethernet protocol for simple and deterministic data sharing across diverse machines and operating systems. ShareSync provides a simple Application Programming Interface (API) for simulation programmers to incorporate into their code. The protocol is compatible with virtually all Ethernet-capable machines, is flexible enough to support a variety of applications, is fast enough to provide soft real-time determinism, and is a low-cost resource for distributed simulation development, deployment, and maintenance. The first design cycle iteration of ShareSync has been completed, and the protocol has undergone several testing procedures, including endurance and benchmarking tests, and approaches the 2001ts data synchronization design goal for the CDSL.
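ShareSync's API is not given in the abstract; the publish-subscribe pattern it describes can nonetheless be illustrated with a generic UDP multicast sketch in Python. The group address, port, and message fields are arbitrary, and this is not the ShareSync interface.

    # Generic publish-subscribe sketch over UDP multicast; NOT the ShareSync API,
    # only an illustration of the data-sharing pattern described in the abstract.
    import json, socket, struct

    GROUP, PORT = "239.1.1.1", 5007   # arbitrary multicast group and port

    def publish(topic, values):
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)
        sock.sendto(json.dumps({"topic": topic, "data": values}).encode(), (GROUP, PORT))

    def subscribe():
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        sock.bind(("", PORT))
        mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
        sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
        while True:
            msg, _ = sock.recvfrom(4096)
            yield json.loads(msg)   # consumer side: iterate in another process

    publish("sim/state", {"t": 0.016, "position": [1.0, 2.0, 3.0]})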
Forest management and economics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buongiorno, J.; Gilless, J.K.
1987-01-01
This volume provides a survey of quantitative methods, guiding the reader through formulation and analysis of models that address forest management problems. The authors use simple mathematics, graphics, and short computer programs to explain each method. Emphasizing applications, they discuss linear, integer, dynamic, and goal programming; simulation; network modeling; and econometrics, as these relate to problems of determining economic harvest schedules in even-aged and uneven-aged forests, the evaluation of forest policies, multiple-objective decision making, and more.
NASA Technical Reports Server (NTRS)
1977-01-01
A method was developed for using the NASA aviation data base and computer programs in conjunction with the GE management analysis and projection service to perform simple and complex economic analyses for planning, forecasting, and evaluating OAST programs. Capabilities of the system are discussed along with procedures for making basic data tabulations, updates, and entries. The system is applied in an agricultural aviation study in order to assess its practical utility in the OAST working environment.
NASA Technical Reports Server (NTRS)
Green, D. M.
1978-01-01
Two software programs are described: one implements a voltage regulation function, and one implements a charger function with peak-power tracking of its input. The software, written in modular fashion, is intended as a vehicle for further experimentation with the P-3 system. A control teleprinter allows an operator to make parameter modifications to the control algorithm during experiments. The programs require 3K of ROM and 2K of RAM each. User manuals for each system are included, as well as a third program for simple I/O control.
Neural network based architectures for aerospace applications
NASA Technical Reports Server (NTRS)
Ricart, Richard
1987-01-01
A brief history of the field of neural networks research is given and some simple concepts are described. In addition, some neural network based avionics research and development programs are reviewed. The need for the United States Air Force and NASA to assume a leadership role in supporting this technology is stressed.
Treatment Effects for Adolescent Struggling Readers: An Application of Moderated Mediation
ERIC Educational Resources Information Center
Roberts, Greg; Fletcher, Jack M.; Stuebing, Karla K.; Barth, Amy E.; Vaughn, Sharon
2013-01-01
This study used multigroup structural equations to evaluate the possibility that a theory-driven, evidence-based, yearlong reading program for sixth-grade struggling readers moderates the interrelationships among elements of the simple model of reading (i.e., listening comprehension, word reading, and reading comprehension; Hoover & Gough,…
Forecasting Pell Program Applications Using Structural Aggregate Models.
ERIC Educational Resources Information Center
Cavin, Edward S.
1995-01-01
Demand for Pell Grant financial aid has become difficult to predict when using the current microsimulation model. This paper proposes an alternative model that uses aggregate data (based on individuals' microlevel decisions and macrodata on family incomes, college costs, and opportunity wages) and avoids some limitations of simple linear models.…
A Performance Support Tool for Cisco Training Program Managers
ERIC Educational Resources Information Center
Benson, Angela D.; Bothra, Jashoda; Sharma, Priya
2004-01-01
Performance support systems can play an important role in corporations by managing and allowing distribution of information more easily. These systems run the gamut from simple paper job aids to sophisticated computer- and web-based software applications that support the entire corporate supply chain. According to Gery (1991), a performance…
Receptor Surface Models in the Classroom: Introducing Molecular Modeling to Students in a 3-D World
ERIC Educational Resources Information Center
Geldenhuys, Werner J.; Hayes, Michael; Van der Schyf, Cornelis J.; Allen, David D.; Malan, Sarel F.
2007-01-01
A simple, novel and generally applicable method to demonstrate structure-activity associations of a group of biologically interesting compounds in relation to receptor binding is described. This method is useful for undergraduates and graduate students in medicinal chemistry and computer modeling programs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Venkata, Manjunath Gorentla; Aderholdt, William F
Pre-exascale systems are expected to have a significant amount of hierarchical and heterogeneous on-node memory, and this trend in system architecture is expected to continue into the exascale era. Along with hierarchical-heterogeneous memory, such a system typically has a high-performance network and a compute accelerator. This system architecture is not only effective for running traditional High Performance Computing (HPC) applications (Big-Compute), but also for running data-intensive HPC applications and Big-Data applications. As a consequence, there is a growing desire to have a single system serve the needs of both Big-Compute and Big-Data applications. Though the system architecture supports the convergence of Big-Compute and Big-Data, the programming models and software layer have yet to evolve to support either hierarchical-heterogeneous memory systems or this convergence. This work provides a programming abstraction to address the problem. The programming abstraction is implemented as a software library and runs on pre-exascale and exascale systems supporting current and emerging system architectures. Using distributed data structures as a central concept, it provides (1) a simple, usable, and portable abstraction for hierarchical-heterogeneous memory and (2) a unified programming abstraction for Big-Compute and Big-Data applications.
Teaching biomedical applications to secondary students.
Openshaw, S; Fleisher, A; Ljunggren, C
1999-01-01
Certain aspects of biomedical engineering applications lend themselves well to experimentation that can be done by high school students. This paper describes two experiments done during a six-week summer internship program in which two high school students used electrodes, circuit boards, and computers to mimic a sophisticated heart monitor and also to control a robotic car. Our experience suggests that simple illustrations of complex instrumentation can be effective in introducing adolescents to the biomedical engineering field.
Lowekamp, Bradley C.; Chen, David T.; Ibáñez, Luis; Blezek, Daniel
2013-01-01
SimpleITK is a new interface to the Insight Segmentation and Registration Toolkit (ITK) designed to facilitate rapid prototyping, education and scientific activities via high-level programming languages. ITK is a templated C++ library of image processing algorithms and frameworks for biomedical and other applications, and it was designed to be generic, flexible and extensible. Initially, ITK provided a direct wrapping interface to languages such as Python and Tcl through the WrapITK system. Unlike WrapITK, which exposed ITK's complex templated interface, SimpleITK was designed to provide an easy-to-use, simplified interface to ITK's algorithms. It includes procedural methods, hides ITK's demand-driven pipeline, and provides a template-less layer. SimpleITK also provides practical conveniences such as binary distribution packages and overloaded operators. Our user-friendly design goals dictated a departure from the direct interface wrapping approach of WrapITK, toward a new facade class structure that only exposes the required functionality, hiding ITK's extensive template use. Internally, SimpleITK uses a manual description of each filter with code generation and advanced C++ meta-programming to provide the higher-level interface, bringing the capabilities of ITK to a wider audience. SimpleITK is licensed as an open source software library under the Apache License Version 2.0; more information about downloading it can be found at http://www.simpleitk.org. PMID:24416015
Supporting geoscience with graphical-user-interface Internet tools for the Macintosh
NASA Astrophysics Data System (ADS)
Robin, Bernard
1995-07-01
This paper describes a suite of Macintosh graphical-user-interface (GUI) software programs that can be used in conjunction with the Internet to support geoscience education. These software programs allow science educators to access and retrieve a large body of resources from an increasing number of network sites, taking advantage of the intuitive, simple-to-use Macintosh operating system. With these tools, educators can easily locate, download, and exchange not only text files but also sound resources, video movie clips, and software application files from their desktop computers. Another major advantage of these software tools is that they are available at no cost and may be distributed freely. The following GUI software tools are described, including examples of how they can be used in an educational setting: ∗ Eudora—an e-mail program ∗ NewsWatcher—a newsreader ∗ TurboGopher—a Gopher program ∗ Fetch—a software application for easy File Transfer Protocol (FTP) ∗ NCSA Mosaic—a worldwide hypertext browsing program. An explosive growth of online archives is currently underway as new electronic sites are being added continuously to the Internet. Many of these resources may be of interest to science educators who learn they can share not only ASCII text files, but also graphic image files, sound resources, QuickTime movie clips, and hypermedia projects with colleagues from locations around the world. These powerful yet simple-to-learn GUI software tools are providing a revolution in how knowledge can be accessed, retrieved, and shared.
Search without Boundaries Using Simple APIs
Tong, Qi
2009-01-01
The U.S. Geological Survey (USGS) Library, where the author serves as the digital services librarian, is increasingly challenged to make it easier for users to find information from many heterogeneous information sources. Information is scattered throughout different software applications (i.e., library catalog, federated search engine, link resolver, and vendor websites), and each specializes in one thing. How could the library integrate the functionalities of one application with another and provide a single point of entry for users to search across? To improve the user experience, the library launched an effort to integrate the federated search engine into the library's intranet website. The result is a simple search box that leverages the federated search engine's built-in application programming interfaces (APIs). In this article, the author describes how this project demonstrated the power of APIs and their potential to be used by other enterprise search portals inside or outside of the library.
[Quality assurance of the renal applications software].
del Real Núñez, R; Contreras Puertas, P I; Moreno Ortega, E; Mena Bares, L M; Maza Muret, F R; Latre Romero, J M
2007-01-01
The need for quality assurance of all technical aspects of nuclear medicine studies is widely recognised. However, little attention has been paid to the quality assurance of the applications software. Our work reported here aims at verifying the analysis software for processing of renal nuclear medicine studies (renograms). The software tools were used to build a synthetic dynamic model of the renal system. The model consists of two phases: perfusion and function. The organs of interest (kidneys, bladder and aortic artery) were simple geometric forms. The uptake of the renal structures was described by mathematical functions. Curves corresponding to normal or pathological conditions were simulated for kidneys, bladder and aortic artery by appropriate selection of parameters. There was no difference between the parameters of the mathematical curves and the quantitative data produced by the renal analysis program. Our test procedure is simple to apply, reliable, reproducible and rapid for verifying the renal applications software.
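The abstract does not list the mathematical functions used; the idea of a synthetic time-activity curve can be illustrated with a toy model in Python, where the functional forms and parameters are invented for illustration only.

    # Toy synthetic time-activity curves for a renogram phantom; functional
    # forms and parameter values are illustrative assumptions, not the authors'.
    import numpy as np

    t = np.arange(0, 20 * 60, 10.0)   # 20-minute study sampled every 10 s

    def kidney_curve(uptake_rate, washout_rate, t_peak):
        rising = 1.0 - np.exp(-uptake_rate * t)                           # perfusion/uptake
        falling = np.exp(-washout_rate * np.clip(t - t_peak, 0, None))    # excretion
        return rising * falling

    normal = kidney_curve(uptake_rate=0.02, washout_rate=0.004, t_peak=240)
    obstructed = kidney_curve(uptake_rate=0.02, washout_rate=0.0, t_peak=240)  # no washout
    print(normal.max(), obstructed[-1])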
The Application of Finite Element Solution Techniques in Structural Analysis on a Microcomputer.
1981-12-01
Several questions had to be answered concerning the microcomputer in relation to a sequentially programmed finite element program. The first was how big ... central site, then the usefulness of the microcomputer is limited. The first series of problems consisted of a simple truss structure, which was expanded
A simple approach to optimal control of invasive species.
Hastings, Alan; Hall, Richard J; Taylor, Caz M
2006-12-01
The problem of invasive species and their control is one of the most pressing applied issues in ecology today. We developed simple approaches based on linear programming for determining the optimal removal strategies of different stage or age classes for control of invasive species that are still in a density-independent phase of growth. We illustrate the application of this method to the specific example of invasive Spartina alterniflora in Willapa Bay, WA. For all such systems, linear programming shows in general that the optimal strategy in any time step is to prioritize removal of a single age or stage class. The optimal strategy adjusts which class is the focus of control through time and can be much more cost effective than prioritizing removal of the same stage class each year.
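The flavour of such a removal linear program can be shown with a made-up two-stage example in Python; the projection matrix, costs, and budget below are illustrative placeholders, not the Spartina alterniflora parameters from the paper.

    # Illustrative one-step removal LP for a two-stage invasive population:
    # minimize next year's total population subject to a removal budget.
    # All numbers are invented for illustration.
    import numpy as np
    from scipy.optimize import linprog

    A = np.array([[0.2, 3.0],     # offspring per seedling / per adult
                  [0.3, 0.9]])    # survival and maturation into adults
    n = np.array([100.0, 40.0])   # current seedlings, adults
    cost = np.array([1.0, 5.0])   # removal cost per individual
    budget = 150.0

    # Next-year total is 1'A(n - r); minimizing it means maximizing 1'A r,
    # so the LP coefficients for the removals r are the column sums of A.
    c = -A.sum(axis=0)
    res = linprog(c, A_ub=[cost], b_ub=[budget], bounds=list(zip([0, 0], n)))
    print(res.x)   # the optimum concentrates the whole budget on one stage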
DAVE: A plug and play model for distributed multimedia application development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mines, R.F.; Friesen, J.A.; Yang, C.L.
1994-07-01
This paper presents a model being used for the development of distributed multimedia applications. The Distributed Audio Video Environment (DAVE) was designed to support the development of a wide range of distributed applications. The implementation of this model is described. DAVE is unique in that it combines a simple "plug and play" programming interface, supports both centralized and fully distributed applications, provides device and media extensibility, promotes object reusability, and supports interoperability and network independence. This model enables application developers to easily develop distributed multimedia applications and create reusable multimedia toolkits. DAVE was designed for developing applications such as video conferencing, media archival, remote process control, and distance learning.
The Application of LOGO! in Control System of a Transmission and Sorting Mechanism
NASA Astrophysics Data System (ADS)
Liu, Jian; Lv, Yuan-Jun
The application of the general logic control module LOGO! to the control system of a transmission and sorting mechanism is presented. First, the structure and operating principle of the mechanism are introduced. Then the pneumatic circuit of the mechanism is drawn in the FluidSIM-P software. Finally, the pneumatic circuit and motors are controlled by LOGO!, which makes the control process simple and clear instead of the complicated control of ordinary relays. LOGO! can achieve the complicated interlock control otherwise composed of intermediate relays and time relays. In the control process, the logic control functions of LOGO! are fully used in the programming so that the system realizes the control of the air cylinders and motors. The mechanism is reliable and adjustable after application.
Developing Mixed Reality Educational Applications: The Virtual Touch Toolkit.
Mateu, Juan; Lasala, María José; Alamán, Xavier
2015-08-31
In this paper, we present Virtual Touch, a toolkit that allows the development of educational activities through a mixed reality environment such that, using various tangible elements, the interconnection of a virtual world with the real world is enabled. The main goal of Virtual Touch is to facilitate the installation, configuration and programming of different types of technologies, abstracting the creator of educational applications from the technical details involving the use of tangible interfaces and virtual worlds. Therefore, it is specially designed to enable teachers to themselves create educational activities for their students in a simple way, taking into account that teachers generally lack advanced knowledge in computer programming and electronics. The toolkit has been used to develop various educational applications that have been tested in two secondary education high schools in Spain.
24 CFR 906.39 - Contents of a homeownership program.
Code of Federal Regulations, 2010 CFR
2010-04-01
... applicable to the particular factual situation: (a) Method of Sale: The PHA should indicate how units will be sold, including a description of the exact method of sale, such as, for example, fee simple conveyance, lease-purchase, or sale of a cooperative share. PHAs may sell units directly to a tenant or eligible...
Practical Application of Aspiration as an Outcome Indicator in Extension Evaluation
ERIC Educational Resources Information Center
Jayaratne, K. S. U.
2010-01-01
Extension educators need simple and accurate evaluation tools for program evaluation. This article explains how to use aspiration as an outcome indicator in Extension evaluation and introduces a practical evaluation tool. Aspiration can be described as the readiness for change. By recording participants' levels of aspiration, we will be able to…
Algorithm Building and Learning Programming Languages Using a New Educational Paradigm
NASA Astrophysics Data System (ADS)
Jain, Anshul K.; Singhal, Manik; Gupta, Manu Sheel
2011-08-01
This research paper presents a new concept of using a single tool to associate syntax of various programming languages, algorithms and basic coding techniques. A simple framework has been programmed in Python that helps students learn skills to develop algorithms, and implement them in various programming languages. The tool provides an innovative and a unified graphical user interface for development of multimedia objects, educational games and applications. It also aids collaborative learning amongst students and teachers through an integrated mechanism based on Remote Procedure Calls. The paper also elucidates an innovative method for code generation to enable students to learn the basics of programming languages using drag-n-drop methods for image objects.
BioWord: A sequence manipulation suite for Microsoft Word
2012-01-01
Background: The ability to manipulate, edit and process DNA and protein sequences has rapidly become a necessary skill for practicing biologists across a wide swath of disciplines. In spite of this, most everyday sequence manipulation tools are distributed across several programs and web servers, sometimes requiring installation and typically involving frequent switching between applications. To address this problem, here we have developed BioWord, a macro-enabled self-installing template for Microsoft Word documents that integrates an extensive suite of DNA and protein sequence manipulation tools. Results: BioWord is distributed as a single macro-enabled template that self-installs with a single click. After installation, BioWord will open as a tab in the Office ribbon. Biologists can then easily manipulate DNA and protein sequences using a familiar interface and minimize the need to switch between applications. Beyond simple sequence manipulation, BioWord integrates functionality ranging from dyad search and consensus logos to motif discovery and pair-wise alignment. Written in Visual Basic for Applications (VBA) as an open source, object-oriented project, BioWord allows users with varying programming experience to expand and customize the program to better meet their own needs. Conclusions: BioWord integrates a powerful set of tools for biological sequence manipulation within a handy, user-friendly tab in a widely used word processing software package. The use of a simple scripting language and an object-oriented scheme facilitates customization by users and provides a very accessible educational platform for introducing students to basic bioinformatics algorithms. PMID:22676326
BioWord: a sequence manipulation suite for Microsoft Word.
Anzaldi, Laura J; Muñoz-Fernández, Daniel; Erill, Ivan
2012-06-07
The ability to manipulate, edit and process DNA and protein sequences has rapidly become a necessary skill for practicing biologists across a wide swath of disciplines. In spite of this, most everyday sequence manipulation tools are distributed across several programs and web servers, sometimes requiring installation and typically involving frequent switching between applications. To address this problem, here we have developed BioWord, a macro-enabled self-installing template for Microsoft Word documents that integrates an extensive suite of DNA and protein sequence manipulation tools. BioWord is distributed as a single macro-enabled template that self-installs with a single click. After installation, BioWord will open as a tab in the Office ribbon. Biologists can then easily manipulate DNA and protein sequences using a familiar interface and minimize the need to switch between applications. Beyond simple sequence manipulation, BioWord integrates functionality ranging from dyad search and consensus logos to motif discovery and pair-wise alignment. Written in Visual Basic for Applications (VBA) as an open source, object-oriented project, BioWord allows users with varying programming experience to expand and customize the program to better meet their own needs. BioWord integrates a powerful set of tools for biological sequence manipulation within a handy, user-friendly tab in a widely used word processing software package. The use of a simple scripting language and an object-oriented scheme facilitates customization by users and provides a very accessible educational platform for introducing students to basic bioinformatics algorithms.
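BioWord itself is written in VBA inside Word; for comparison, the most basic of these manipulations (a reverse complement, say) reduces to a few lines in a general-purpose language.

    # The simplest BioWord-style manipulation sketched in Python for comparison;
    # BioWord implements this (and much more) in VBA within Microsoft Word.
    COMPLEMENT = str.maketrans("ACGTacgt", "TGCAtgca")

    def reverse_complement(seq):
        return seq.translate(COMPLEMENT)[::-1]

    print(reverse_complement("ATGCGT"))   # -> ACGCAT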
Kwolek, J M; Wells, J E; Goodman, D S; Smith, W W
2016-05-01
Simultaneous laser locking of infrared (IR) and ultraviolet lasers to a visible stabilized reference laser is demonstrated via a Fabry-Perot (FP) cavity. LabVIEW is used to analyze the input, and an internal proportional-integral-derivative algorithm converts the FP signal to an analog locking feedback signal. The locking program stabilized both lasers to a long term stability of better than 9 MHz, with a custom-built IR laser undergoing significant improvement in frequency stabilization. The results of this study demonstrate the viability of a simple, computer-controlled, non-temperature-stabilized FP locking scheme for our applications, laser cooling of Ca(+) ions, and its use in other applications with similar modest frequency stabilization requirements.
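The LabVIEW implementation is not shown in the abstract; a minimal discrete proportional-integral-derivative update of the kind used in such locks might look like the following Python sketch, with the gains, setpoint, and sample period all being illustrative.

    # Minimal discrete PID update for a cavity-transmission error signal; gains,
    # setpoint, and sample period are illustrative, not the paper's values.
    class PID:
        def __init__(self, kp, ki, kd, dt):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral = 0.0
            self.prev_error = 0.0

        def update(self, setpoint, measurement):
            error = setpoint - measurement
            self.integral += error * self.dt
            derivative = (error - self.prev_error) / self.dt
            self.prev_error = error
            return self.kp * error + self.ki * self.integral + self.kd * derivative

    pid = PID(kp=0.5, ki=20.0, kd=0.0, dt=1e-3)
    correction = pid.update(setpoint=1.0, measurement=0.92)  # drives the laser actuator (assumed)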
Program For A Pushbutton Display
NASA Technical Reports Server (NTRS)
Busquets, Anthony M.; Luck, William S., Jr.
1989-01-01
Programmable Display Pushbutton (PDP) is pushbutton device available from Micro Switch having programmable 16X35 matrix of light-emitting diodes on pushbutton surface. Any desired legend displays on PDP's, producing user-friendly applications and reducing need for dedicated manual controls. Interacts with operator, calling for correct response before transmitting next message. Serves as both simple manual control and sophisticated programmable link between operator and host system. Programmable Display Pushbutton Legend Editor (PDPE) computer program used to create light-emitting-diode (LED) displays for pushbuttons. Written in FORTRAN.
Initial Kernel Timing Using a Simple PIM Performance Model
NASA Technical Reports Server (NTRS)
Katz, Daniel S.; Block, Gary L.; Springer, Paul L.; Sterling, Thomas; Brockman, Jay B.; Callahan, David
2005-01-01
This presentation will describe some initial results of paper-and-pencil studies of 4 or 5 application kernels applied to a processor-in-memory (PIM) system roughly similar to the Cascade Lightweight Processor (LWP). The application kernels are: * Linked list traversal * Sum of leaf nodes on a tree * Bitonic sort * Vector sum * Gaussian elimination. The intent of this work is to guide and validate work on the Cascade project in the areas of compilers, simulators, and languages. We will first discuss the generic PIM structure. Then, we will explain the concepts needed to program a parallel PIM system (locality, threads, parcels). Next, we will present a simple PIM performance model that will be used in the remainder of the presentation. For each kernel, we will then present a set of codes, including codes for a single PIM node, and codes for multiple PIM nodes that move data to threads and move threads to data. These codes are written at a fairly low level, between assembly and C, but much closer to C than to assembly. For each code, we will present some hand-drafted timing forecasts, based on the simple PIM performance model. Finally, we will conclude by discussing what we have learned from this work, including what programming styles seem to work best, from the point-of-view of both expressiveness and performance.
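The presentation's actual cost model is not reproduced here; a toy analytic model of the kind described, charging separately for local work and remote parcels, could be sketched as follows, with every per-operation cost invented for illustration.

    # Toy analytic cost model for a PIM kernel; the per-operation costs and the
    # locality assumption are invented placeholders, not Cascade LWP parameters.
    COSTS = {"local_op": 1, "local_load": 3, "remote_parcel": 40}

    def linked_list_traversal_cycles(n_nodes, fraction_local):
        local_nodes = int(n_nodes * fraction_local)
        remote_nodes = n_nodes - local_nodes
        local = local_nodes * (COSTS["local_load"] + COSTS["local_op"])
        remote = remote_nodes * (COSTS["remote_parcel"] + COSTS["local_op"])
        return local + remote

    print(linked_list_traversal_cycles(n_nodes=1024, fraction_local=0.75))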
SSRscanner: a program for reporting distribution and exact location of simple sequence repeats
Anwar, Tamanna; Khan, Asad U
2006-01-01
Simple sequence repeats (SSRs) have become important molecular markers for a broad range of applications, such as genome mapping and characterization, phenotype mapping, marker-assisted selection of crop plants and a range of molecular ecology and diversity studies. These repeated DNA sequences are found in both prokaryotes and eukaryotes. They are distributed almost at random throughout the genome, ranging from mononucleotide to trinucleotide repeats. They are also found as longer tracts (> 6 repeating units). Most of the computer programs that find SSRs do not report their exact positions. A computer program, SSRscanner, was written to find the distribution, frequency and exact location of each SSR in the genome. SSRscanner is user friendly. It can search for repeats of any length and produce output with their exact position on the chromosome and their frequency of occurrence in the sequence. Availability: This program has been written in PERL and is freely available for non-commercial users by request from the authors. Please contact the authors by E-mail: huzzi99@hotmail.com PMID:17597863
Hidalgo-Mazzei, Diego; Mateu, Ainoa; Reinares, María; Murru, Andrea; Del Mar Bonnín, Caterina; Varo, Cristina; Valentí, Marc; Undurraga, Juan; Strejilevich, Sergio; Sánchez-Moreno, José; Vieta, Eduard; Colom, Francesc
2016-08-01
During the last fifteen years, the possibility of delivering psychoeducation programs through Internet-based platforms has been explored. Studies evaluating those programs have shown good to acceptable retention rates. In this context, we developed a smartphone application (SIMPLe) collecting information about mood symptoms and offering personalized psychoeducation messages. The main aims of this study were to evaluate the feasibility, acceptability and satisfaction of the smartphone application. The study was conducted from March to August 2015. Participation in the study was proposed to a consecutive sample of adult patients attending an outpatient mental health clinic. Sociodemographic data, clinical and functional assessments alongside smartphone ownership and uses were collected at baseline and at 3 months' follow-up. A 5-item Likert-scale satisfaction questionnaire was also employed. 51 participants were initially enrolled in the study; 36 (74%) remained actively using the application after 3 months. The whole sample interacted with the application a mean of 77 days (SD=26.2). During these days they completed 88% of the daily tests. Over 86% of the participants agreed that the experience of using the application was satisfactory. The diversity of smartphone operating systems led to a moderate, although representative, sample size. Additionally, the subjective data reporting, narrow time frame of use and stability of the patients could have affected the results. The results confirm that this particular intervention is feasible and represents a satisfactory and acceptable instrument for the self-management of bipolar disorder as an add-on to the usual treatment, but future clinical trials must still probe its efficacy. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Cooke, C. H.
1975-01-01
STICAP (Stiff Circuit Analysis Program) is a FORTRAN 4 computer program written for the CDC-6400-6600 computer series and SCOPE 3.0 operating system. It provides the circuit analyst a tool for automatically computing the transient responses and frequency responses of large linear time invariant networks, both stiff and nonstiff (algorithms and numerical integration techniques are described). The circuit description and user's program input language is engineer-oriented, making simple the task of using the program. Engineering theories underlying STICAP are examined. A user's manual is included which explains user interaction with the program and gives results of typical circuit design applications. Also, the program structure from a systems programmer's viewpoint is depicted and flow charts and other software documentation are given.
Applied research of embedded WiFi technology in the motion capture system
NASA Astrophysics Data System (ADS)
Gui, Haixia
2012-04-01
Embedded wireless WiFi technology is one of the current hot spots in wireless network applications. This paper first introduces the definition and characteristics of WiFi. Given the advantages of WiFi, such as requiring no wiring, simple operation and stable transmission, the paper then presents a system design for applying embedded wireless WiFi technology in a motion capture system. It also verifies the effectiveness of the design through the WiFi-based wireless sensor hardware and software program.
Building Automatic Grading Tools for Basic of Programming Lab in an Academic Institution
NASA Astrophysics Data System (ADS)
Harimurti, Rina; Iwan Nurhidayat, Andi; Asmunin
2018-04-01
The skill of computer programming is a core competency that must be mastered by students majoring in computer sciences. The best way to improve this skill is through the practice of writing many programs to solve various problems, from simple to complex. It takes hard work and a long time to check and evaluate the results of student labs one by one, especially if the number of students is large. Based on these constraints, we propose Automatic Grading Tools (AGT), an application that can evaluate and deeply check source code in C and C++. The application architecture consists of students, web-based applications, compilers, and operating systems. Automatic Grading Tools (AGT) is implemented with an MVC architecture using open source software such as the Laravel framework version 5.4, PostgreSQL 9.6, Bootstrap 3.3.7, and the jQuery library. Automatic Grading Tools has also been tested on real problems by submitting source code in C/C++ and then compiling it. The test results show that the AGT application runs well.
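The paper does not give implementation details for the checking step; its compile-run-compare core can be sketched in Python, with the compiler invocation, file names, and time limit assumed for illustration.

    # Sketch of the compile-run-compare core of an automatic grader for C
    # submissions; compiler flags, file names, and the time limit are assumptions.
    import subprocess

    def grade(source_file, stdin_data, expected_output, timeout=5):
        if subprocess.run(["gcc", source_file, "-o", "student_prog"]).returncode != 0:
            return "compile error"
        try:
            run = subprocess.run(["./student_prog"], input=stdin_data, timeout=timeout,
                                 capture_output=True, text=True)
        except subprocess.TimeoutExpired:
            return "time limit exceeded"
        return "accepted" if run.stdout.strip() == expected_output.strip() else "wrong answer"

    print(grade("sum.c", "2 3\n", "5"))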
A uniform approach for programming distributed heterogeneous computing systems
Grasso, Ivan; Pellegrini, Simone; Cosenza, Biagio; Fahringer, Thomas
2014-01-01
Large-scale compute clusters of heterogeneous nodes equipped with multi-core CPUs and GPUs are getting increasingly popular in the scientific community. However, such systems require a combination of different programming paradigms making application development very challenging. In this article we introduce libWater, a library-based extension of the OpenCL programming model that simplifies the development of heterogeneous distributed applications. libWater consists of a simple interface, which is a transparent abstraction of the underlying distributed architecture, offering advanced features such as inter-context and inter-node device synchronization. It provides a runtime system which tracks dependency information enforced by event synchronization to dynamically build a DAG of commands, on which we automatically apply two optimizations: collective communication pattern detection and device-host-device copy removal. We assess libWater’s performance in three compute clusters available from the Vienna Scientific Cluster, the Barcelona Supercomputing Center and the University of Innsbruck, demonstrating improved performance and scaling with different test applications and configurations. PMID:25844015
A uniform approach for programming distributed heterogeneous computing systems.
Grasso, Ivan; Pellegrini, Simone; Cosenza, Biagio; Fahringer, Thomas
2014-12-01
Large-scale compute clusters of heterogeneous nodes equipped with multi-core CPUs and GPUs are getting increasingly popular in the scientific community. However, such systems require a combination of different programming paradigms making application development very challenging. In this article we introduce libWater, a library-based extension of the OpenCL programming model that simplifies the development of heterogeneous distributed applications. libWater consists of a simple interface, which is a transparent abstraction of the underlying distributed architecture, offering advanced features such as inter-context and inter-node device synchronization. It provides a runtime system which tracks dependency information enforced by event synchronization to dynamically build a DAG of commands, on which we automatically apply two optimizations: collective communication pattern detection and device-host-device copy removal. We assess libWater's performance in three compute clusters available from the Vienna Scientific Cluster, the Barcelona Supercomputing Center and the University of Innsbruck, demonstrating improved performance and scaling with different test applications and configurations.
Technology: Presentations in the Cloud with a Twist
ERIC Educational Resources Information Center
Siegle, Del
2011-01-01
Technology tools have come a long way from early word processing applications and opportunities for students to engage in simple programming. Many tools now exist for students to develop and share products in a variety of formats and for a wide range of audiences. PowerPoint is probably the most ubiquitously used tool for student projects. In…
Application developer's tutorial for the CSM testbed architecture
NASA Technical Reports Server (NTRS)
Underwood, Phillip; Felippa, Carlos A.
1988-01-01
This tutorial serves as an illustration of the use of the programmer interface on the CSM Testbed Architecture (NICE). It presents a complete, but simple, introduction to using both the GAL-DBM (Global Access Library-Database Manager) and CLIP (Command Language Interface Program) to write a NICE processor. Familiarity with the CSM Testbed architecture is required.
38 CFR 1.911 - Collection of debts owed by reason of participation in a benefits program.
Code of Federal Regulations, 2010 CFR
2010-07-01
... exact amount of the debt; (2) The specific reasons for the debt, in simple and concise language; (3) The... transferred to Treasury for administrative offset or collection. (5) That interest and administrative costs... applicant of money already collected, in § 1.967; and (5) The assessment of interest and administrative...
ERIC Educational Resources Information Center
Wholeben, Brent E.
This volume is an exposition of a mathematical modeling technique for use in the evaluation and solution of complex educational problems at all levels. It explores in detail the application of simple algebraic techniques to such issues as program reduction, fiscal rollbacks, and computer curriculum planning. Part I ("Introduction to the…
A Web of Resources for Introductory Computer Science.
ERIC Educational Resources Information Center
Rebelsky, Samuel A.
As the field of Computer Science has grown, the syllabus of the introductory Computer Science course has changed significantly. No longer is it a simple introduction to programming or a tutorial on computer concepts and applications. Rather, it has become a survey of the field of Computer Science, touching on a wide variety of topics from digital…
NASA Astrophysics Data System (ADS)
Laban, Shaban; El-Desouky, Aly
2014-05-01
To achieve rapid, simple, and reliable parallel processing of different types of tasks and big data processing on any compute cluster, a lightweight messaging-based framework model for distributed application processing and workflow execution is proposed. The framework is based on Apache ActiveMQ and the Simple (or Streaming) Text Oriented Message Protocol (STOMP). ActiveMQ, a popular and powerful open-source persistence messaging and integration patterns server with scheduler capabilities, acts as the message broker in the framework. STOMP provides an interoperable wire format that allows the framework programs to talk to and interact with each other and with ActiveMQ easily. In order to use the message broker efficiently, a unified message and topic naming pattern is utilized. Only three Python programs and a simple library, used to unify and simplify the implementation of ActiveMQ and the STOMP protocol, are needed to use the framework. A watchdog program is used to monitor, remove, add, start, and stop any machine and/or its different tasks when necessary. For every machine, a single dedicated zookeeper program is used to start the different functions or tasks (stompShell programs) needed for executing the user's required workflow. The stompShell instances execute workflow jobs based on received messages. A well-defined, simple, and flexible message structure, based on JavaScript Object Notation (JSON), is used to build complex workflow systems. JSON is also used for configuration and for communication between machines and programs. The framework is platform independent. Although the framework is built using Python, the actual workflow programs or jobs can be implemented in any programming language. The generic framework can be used in small national data centres for processing seismological and radionuclide data received from the International Data Centre (IDC) of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). It is also possible to extend the use of the framework to monitoring the IDC pipeline. The detailed design, implementation, conclusions and future work of the proposed framework will be presented.
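A minimal sketch of the kind of JSON job message such a framework might publish through STOMP is shown below; the queue name, message fields, and broker address are assumptions of this illustration, not the framework's actual naming pattern. It uses the third-party stomp.py client against a broker with its STOMP connector enabled.

```python
# Illustrative sketch only: queue name, message fields, and broker address are
# hypothetical, not the naming pattern of the framework described above.
# Requires the third-party stomp.py client (pip install stomp.py) and a running
# ActiveMQ broker with its STOMP connector enabled (default port 61613).
import json
import stomp

job_message = {
    "job_id": "example-0001",          # hypothetical field names
    "task": "process_waveforms",
    "inputs": ["station_A.mseed"],
    "notify_topic": "/topic/job.status",
}

conn = stomp.Connection([("localhost", 61613)])
conn.connect("user", "password", wait=True)
# Publish the JSON-encoded job description to a work queue; a stompShell-style
# worker subscribed to this destination would pick it up and run the job.
conn.send(destination="/queue/jobs.process_waveforms", body=json.dumps(job_message))
conn.disconnect()
```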
Developing Mixed Reality Educational Applications: The Virtual Touch Toolkit
Mateu, Juan; Lasala, María José; Alamán, Xavier
2015-01-01
In this paper, we present Virtual Touch, a toolkit that allows the development of educational activities through a mixed reality environment such that, using various tangible elements, the interconnection of a virtual world with the real world is enabled. The main goal of Virtual Touch is to facilitate the installation, configuration and programming of different types of technologies, abstracting the creator of educational applications from the technical details involving the use of tangible interfaces and virtual worlds. Therefore, it is specially designed to enable teachers to themselves create educational activities for their students in a simple way, taking into account that teachers generally lack advanced knowledge in computer programming and electronics. The toolkit has been used to develop various educational applications that have been tested in two secondary education high schools in Spain. PMID:26334275
Capture zones for simple aquifers
McElwee, Carl D.
1991-01-01
Capture zones showing the area influenced by a well within a certain time are useful for both aquifer protection and cleanup. If hydrodynamic dispersion is neglected, a deterministic curve defines the capture zone. Analytical expressions for the capture zones can be derived for simple aquifers. However, the capture zone equations are transcendental and cannot be explicitly solved for the coordinates of the capture zone boundary. Fortunately, an iterative scheme allows the solution to proceed quickly and efficiently even on a modest personal computer. Three forms of the analytical solution must be used in an iterative scheme to cover the entire region of interest, after the extreme values of the x coordinate are determined by an iterative solution. The resulting solution is a discrete one, and usually 100-1000 intervals along the x-axis are necessary for a smooth definition of the capture zone. The presented program is written in FORTRAN and has been used in a variety of computing environments. No graphics capability is included with the program; it is assumed the user has access to a commercial package. The superposition of capture zones for multiple wells is expected to be satisfactory if the spacing is not too close. Because this program deals with simple aquifers, the results rarely will be the final word in a real application.
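The iterative solution of a transcendental capture-zone boundary can be illustrated with a small sketch. The time-dependent equations used by the FORTRAN program are not reproduced in the abstract, so the sketch below uses the well-known steady-state capture-zone relation for a single well in uniform regional flow, x = y / tan(2*pi*B*U*y/Q) (after Javandel and Tsang), purely as a stand-in; the parameter values are illustrative.

```python
# Minimal sketch: bisection on a transcendental capture-zone relation.
# Stand-in equation (steady state, single well in uniform flow), NOT the
# time-dependent equations of the program described above:
#   x = y / tan(2*pi*B*U*y / Q)
# Given x, the boundary ordinate y must be found iteratively.
import math

Q = 500.0     # pumping rate [m^3/d]   (illustrative values)
B = 20.0      # aquifer thickness [m]
U = 0.05      # Darcy velocity of regional flow [m/d]

def x_of_y(y):
    return y / math.tan(2.0 * math.pi * B * U * y / Q)

def boundary_y(x, tol=1e-9):
    """Half-width y of the capture zone at a given x (bisection)."""
    lo, hi = 1e-12, Q / (2.0 * B * U) - 1e-12   # y lies between 0 and Q/(2BU)
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        # x_of_y decreases monotonically from Q/(2*pi*B*U) toward -infinity
        if x_of_y(mid) > x:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# Tabulate the boundary on a modest grid, as the program does along the x-axis.
for x in (-2000.0, -500.0, 0.0, 50.0):
    print(f"x = {x:8.1f} m   y = {boundary_y(x):8.2f} m")
```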
Openlobby: an open game server for lobby and matchmaking
NASA Astrophysics Data System (ADS)
Zamzami, E. M.; Tarigan, J. T.; Jaya, I.; Hardi, S. M.
2018-03-01
Online multiplayer is one of the most essential features in modern games. However, while a multiplayer feature can be implemented with simple network programming, creating a balanced multiplayer session requires additional player management components such as a game lobby and a matchmaking system. Our objective is to develop OpenLobby, a server available for other developers to use in support of their multiplayer applications. The proposed system acts as a lobby and matchmaker in which queueing players are matched to other players according to criteria defined by the developer. The solution provides an application programming interface that developers can use to interact with the server. For testing purposes, we developed a game that uses the server as its multiplayer server.
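A minimal sketch of the matchmaking idea (queueing players paired when their ratings are close enough) is given below; the criterion, field names, and threshold are illustrative assumptions, not OpenLobby's actual interface.

```python
# Minimal illustration of a matchmaking queue: players wait in a lobby queue and
# are paired when a developer-defined criterion (here, a rating difference) is met.
# Field names and the threshold are illustrative, not OpenLobby's actual interface.
from collections import deque

MAX_RATING_GAP = 100            # criterion supplied by the game developer

waiting = deque()               # players currently queueing in the lobby

def enqueue(player):
    """Try to match the new player against everyone already waiting."""
    for other in list(waiting):
        if abs(player["rating"] - other["rating"]) <= MAX_RATING_GAP:
            waiting.remove(other)
            return (other, player)          # a match: start a game session
    waiting.append(player)                  # no match yet: keep waiting
    return None

for p in [{"name": "ayu", "rating": 1510},
          {"name": "bimo", "rating": 1740},
          {"name": "citra", "rating": 1580}]:
    match = enqueue(p)
    print(f"{p['name']} queued ->", match if match else "waiting")
```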
Task 6 -- Advanced turbine systems program conceptual design and product development
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1996-01-10
The Allison Engine Company has completed the Task 6 Conceptual Design and Analysis of Phase 2 of the Advanced Turbine System (ATS) contract. At the heart of Allison's system is an advanced simple cycle gas turbine engine. This engine will incorporate components that ensure the program goals are met. Allison plans to commercialize the ATS demonstrator and market a family of engines incorporating this technology. This family of engines, ranging from 4.9 MW to 12 MW, will be suitable for use in all industrial engine applications, including electric power generation, mechanical drive, and marine propulsion. In the field of electric power generation, the engines will be used for base load, standby, cogeneration, and distributed generation applications.
Ma, J; Otten, M; Kamadjeu, R; Mir, R; Rosencrans, L; McLaughlin, S; Yoon, S
2008-04-01
For more than two decades, Epi Info software has been used to meet the data management, analysis, and mapping needs of public health professionals in more than 181 countries and 13 languages. Until now, most Epi Info systems have been relatively simple, mainly because of a lack of detailed and structured guidance for developing complex systems. We created the structured application framework for Epi Info (SAFE), which is a set of guidelines that allows developers to create both simple and complex information systems using accepted good programming practices. This has resulted in application code blocks that are re-useable and easy to maintain, modify, and enhance. The flexibility of SAFE allows various aggregate and case-based application modules to be rapidly created, combined, and updated to create health information systems or sub-systems enabling continuous, incremental enhancement as national and local capacity increases. SAFE and Epi Info are both cost-free and have low system requirements--characteristics that render this framework and software beneficial for developing countries.
Physicist's simple access to protein structures: the computer program WHAT IF
NASA Astrophysics Data System (ADS)
Altenberg-Greulich, Brigitte; Zech, Stephan G.; Stehlik, Dietmar; Vriend, Gert
2001-06-01
We describe the computer program WHAT IF and its application to two physical examples. For the DNA binding protein, OCT-1 (pou domain) the location of amino acids with a sidechain amino group is shown. Such knowledge is required when staining this molecule with a fluorescence dye, which binds chemically to the amino terminus as well as amino groups in sidechains. The program shows that most sidechain amino groups are protected when DNA is bound to OCT-1, allowing selective staining of the amino terminal NH2 group. A protein stained this way can be used in fluorescence spectroscopic studies on function aspects of OCT-1.
NASA Astrophysics Data System (ADS)
Cheng, Tao; Wu, Youwei; Shen, Xiaoqin; Lai, Wenyong; Huang, Wei
2018-01-01
In this work, a simple methodology was developed to enhance the patterning resolution of inkjet printing, involving process optimization as well as substrate modification and treatment. The line width of the inkjet-printed silver lines was successfully reduced to 1/3 of the original value using this methodology. Large-area flexible circuits with delicate patterns and good morphology were thus fabricated. The resultant flexible circuits showed excellent electrical conductivity as low as 4.5 Ω/□ and strong tolerance to mechanical bending. The simple methodology is also applicable to substrates with various wettability, which suggests a general strategy to enhance the printing quality of inkjet printing for manufacturing high-performance large-area flexible electronics. Project supported by the National Key Basic Research Program of China (Nos. 2014CB648300, 2017YFB0404501), the National Natural Science Foundation of China (Nos. 21422402, 21674050), the Natural Science Foundation of Jiangsu Province (Nos. BK20140060, BK20130037, BK20140865, BM2012010), the Program for Jiangsu Specially-Appointed Professors (No. RK030STP15001), the Program for New Century Excellent Talents in University (No. NCET-13-0872), the NUPT "1311 Project" and Scientific Foundation (Nos. NY213119, NY213169), the Synergetic Innovation Center for Organic Electronics and Information Displays, the Priority Academic Program Development of Jiangsu Higher Education Institutions (PAPD), the Leading Talent of Technological Innovation of National Ten-Thousands Talents Program of China, the Excellent Scientific and Technological Innovative Teams of Jiangsu Higher Education Institutions (No. TJ217038), the Program for Graduate Students Research and Innovation of Jiangsu Province (No. KYZZ16-0253), and the 333 Project of Jiangsu Province (Nos. BRA2017402, BRA2015374).
Online evaluation programs: benefits and limitations.
Burhansstipanov, Linda; Clark, Richard E; Watanabe-Galloway, Shinobu; Petereit, Daniel G; Eschiti, Valerie; Krebs, Linda U; Pingatore, Noel L
2012-04-01
Patient navigation programs are increasing throughout the USA, yet some evaluation measures are too vague to determine what and how navigation functions. Through collaborative efforts an online evaluation program was developed. The goal of this evaluation program is to make data entry accurate, simple, and efficient. This comprehensive program includes major components on staff, mentoring, committees, partnerships, grants/studies, products, dissemination, patient navigation, and reports. Pull down menus, radio buttons, and check boxes are incorporated whenever possible. Although the program has limitations, the benefits of having access to current, up-to-date program data 24/7 are worth overcoming the challenges. Of major benefit is the ability of the staff to tailor summary reports to provide anonymous feedback in a timely manner to community partners and participants. The tailored data are useful for the partners to generate summaries for inclusion in new grant applications.
Development of test methods for textile composites
NASA Technical Reports Server (NTRS)
Masters, John E.; Ifju, Peter G.; Fedro, Mark J.
1993-01-01
NASA's Advanced Composite Technology (ACT) Program was initiated in 1990 with the purpose of developing less costly composite aircraft structures. A number of innovative materials and processes were evaluated as a part of this effort. Chief among them are composite materials reinforced with textile preforms. These new forms of composite materials bring with them potential testing problems. Methods currently in practice were developed over the years for composite materials made from prepreg tape or simple 2-D woven fabrics. A wide variety of 2-D and 3-D braided, woven, stitched, and knit preforms were suggested for application in the ACT program. The applicability of existing test methods to the wide range of emerging materials bears investigation. The overriding concern is that the values measured are accurate representations of the true material response. The ultimate objective of this work is to establish a set of test methods to evaluate the textile composites developed for the ACT Program.
A Selected Library of Transport Coefficients for Combustion and Plasma Physics Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cloutman, L.D.
2000-08-01
COYOTE and similar combustion programs based on the multicomponent Navier-Stokes equations require the mixture viscosity, thermal conductivity, and species transport coefficients as input. This report documents a model of these molecular transport coefficients that is simpler than the general theory, but which provides adequate accuracy for many purposes. This model leads to a computationally convenient, self-contained, and easy-to-use source of such data in a format suitable for use by such programs. We present the data for various neutral species in two forms. The first form is a simple functional fit to the transport coefficients. The second form is the use of tabulated Lennard-Jones parameters in simple theoretical expressions for the gas-phase transport coefficients. The model then is extended to the case of a two-temperature plasma. Lennard-Jones parameters are given for a number of chemical species of interest in combustion research.
Quantitative Modeling of Earth Surface Processes
NASA Astrophysics Data System (ADS)
Pelletier, Jon D.
This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes.
BehavePlus fire modeling system, version 5.0: Design and Features
Faith Ann Heinsch; Patricia L. Andrews
2010-01-01
The BehavePlus fire modeling system is a computer program that is based on mathematical models that describe wildland fire behavior and effects and the fire environment. It is a flexible system that produces tables, graphs, and simple diagrams. It can be used for a host of fire management applications, including projecting the behavior of an ongoing fire, planning...
BehavePlus fire modeling system, version 4.0: User's Guide
Patricia L. Andrews; Collin D. Bevins; Robert C. Seli
2005-01-01
The BehavePlus fire modeling system is a program for personal computers that is a collection of mathematical models that describe fire and the fire environment. It is a flexible system that produces tables, graphs, and simple diagrams. It can be used for a multitude of fire management applications including projecting the behavior of an ongoing fire, planning...
Automated Performance Prediction of Message-Passing Parallel Programs
NASA Technical Reports Server (NTRS)
Block, Robert J.; Sarukkai, Sekhar; Mehra, Pankaj; Woodrow, Thomas S. (Technical Monitor)
1995-01-01
The increasing use of massively parallel supercomputers to solve large-scale scientific problems has generated a need for tools that can predict scalability trends of applications written for these machines. Much work has been done to create simple models that represent important characteristics of parallel programs, such as latency, network contention, and communication volume. But many of these methods still require substantial manual effort to represent an application in the model's format. The NIK toolkit described in this paper is the result of an on-going effort to automate the formation of analytic expressions of program execution time, with a minimum of programmer assistance. In this paper we demonstrate the feasibility of our approach, by extending previous work to detect and model communication patterns automatically, with and without overlapped computations. The predictions derived from these models agree, within reasonable limits, with execution times of programs measured on the Intel iPSC/860 and Paragon. Further, we demonstrate the use of MK in selecting optimal computational grain size and studying various scalability metrics.
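The kind of analytic expression such tools derive can be illustrated with a toy model. The sketch below is not the toolkit's actual model; it is a generic latency-bandwidth-computation expression with made-up parameter values, of the form the abstract describes.

```python
# Toy analytic model of parallel execution time (illustrative only; not the
# toolkit's actual model). Per-run time on p processors is modeled as
#   T(p) = W/(p*F) + m*(alpha + V/beta)
# where W is total work, F per-node compute rate, m messages per node,
# alpha per-message latency, V message volume, beta network bandwidth.
def predicted_time(p, work_flops, flop_rate, msgs_per_node, latency_s,
                   msg_bytes, bandwidth_Bps):
    compute = work_flops / (p * flop_rate)
    communicate = msgs_per_node * (latency_s + msg_bytes / bandwidth_Bps)
    return compute + communicate

# Scalability trend for a hypothetical stencil-style code.
for p in (1, 4, 16, 64, 256):
    t = predicted_time(p, work_flops=1e12, flop_rate=5e9,
                       msgs_per_node=4, latency_s=50e-6,
                       msg_bytes=64_000, bandwidth_Bps=1e9)
    print(f"p = {p:4d}   predicted time = {t:.3f} s")
```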
Factors affecting residency rank-listing: a Maxdiff survey of graduating Canadian medical students.
Wang, Tao; Wong, Benson; Huang, Alexander; Khatri, Prateek; Ng, Carly; Forgie, Melissa; Lanphear, Joel H; O'Neill, Peter J
2011-08-25
In Canada, graduating medical students consider many factors, including geographic, social, and academic, when ranking residency programs through the Canadian Residency Matching Service (CaRMS). The relative significance of these factors is poorly studied in Canada. It is also unknown how students differentiate between their top program choices. This survey study addresses the influence of various factors on applicant decision making. Graduating medical students from all six Ontario medical schools were invited to participate in an online survey available for three weeks prior to the CaRMS match day in 2010. Max-Diff discrete choice scaling, multiple choice, and drop-list style questions were employed. The Max-Diff data was analyzed using a scaled simple count method. Data for how students distinguish between top programs was analyzed as percentages. Comparisons were made between male and female applicants as well as between family medicine and specialist applicants; statistical significance was determined by the Mann-Whitney test. In total, 339 of 819 (41.4%) eligible students responded. The variety of clinical experiences and resident morale were weighed heavily in choosing a residency program; whereas financial incentives and parental leave attitudes had low influence. Major reasons that applicants selected their first choice program over their second choice included the distance to relatives and desirability of the city. Both genders had similar priorities when selecting programs. Family medicine applicants rated the variety of clinical experiences more importantly; whereas specialty applicants emphasized academic factors more. Graduating medical students consider program characteristics such as the variety of clinical experiences and resident morale heavily in terms of overall priority. However, differentiation between their top two choice programs is often dependent on social/geographic factors. The results of this survey will contribute to a better understanding of the CaRMS decision making process for both junior medical students and residency program directors.
The automated Army ROTC Questionnaire (ARQ)
NASA Technical Reports Server (NTRS)
Young, David L. H.
1991-01-01
The Reserve Officer Training Corps Cadet Command (ROTCCC) takes applications for its officer training program from college students and Army enlisted personnel worldwide. Each applicant is required to complete a set of application forms prior to acceptance into the ROTC program. These forms are covered by several regulations that govern the eligibility of potential applicants and guide the applicant through the application process. Eligibility criteria changes as Army regulations are periodically revised. Outdated information results in a loss of applications attributable to frustration and error. ROTCCC asked for an inexpensive and reliable way of automating their application process. After reviewing the process, it was determined that an expert system with good end user interface capabilities could be used to solve a large part of the problem. The system captures the knowledge contained within the regulations, enables the quick distribution and implementation of eligibility criteria changes, and distributes the expertise of the admissions personnel to the education centers and colleges. The expert system uses a modified version of CLIPS that was streamlined to make the most efficient use of its capabilities. A user interface with windowing capabilities provides the applicant with a simple and effective way to input his/her personal data.
Young Children and Turtle Graphics Programming: Generating and Debugging Simple Turtle Programs.
ERIC Educational Resources Information Center
Cuneo, Diane O.
Turtle graphics is a popular vehicle for introducing children to computer programming. Children combine simple graphic commands to get a display screen cursor (called a turtle) to draw designs on the screen. The purpose of this study was to examine young children's abilities to function in a simple computer programming environment. Four- and…
Processing Solutions for Big Data in Astronomy
NASA Astrophysics Data System (ADS)
Fillatre, L.; Lepiller, D.
2016-09-01
This paper gives a simple introduction to processing solutions applied to massive amounts of data. It proposes a general presentation of the Big Data paradigm. The Hadoop framework, which is considered as the pioneering processing solution for Big Data, is described together with YARN, the integrated Hadoop tool for resource allocation. This paper also presents the main tools for the management of both the storage (NoSQL solutions) and computing capacities (MapReduce parallel processing schema) of a cluster of machines. Finally, more recent processing solutions like Spark are discussed. Big Data frameworks are now able to run complex applications while keeping the programming simple and greatly improving the computing speed.
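The MapReduce processing schema mentioned above can be conveyed with a minimal, single-machine sketch; real Hadoop or Spark jobs distribute the same three phases (map, shuffle, reduce) across a cluster.

```python
# Minimal single-process illustration of the MapReduce schema: map, shuffle
# (group by key), reduce. Real frameworks run each phase in parallel on a cluster.
from collections import defaultdict

documents = ["big data needs simple tools", "simple tools scale to big data"]

# Map: emit (key, value) pairs.
mapped = [(word, 1) for doc in documents for word in doc.split()]

# Shuffle: group values by key.
groups = defaultdict(list)
for key, value in mapped:
    groups[key].append(value)

# Reduce: combine the values for each key.
word_counts = {key: sum(values) for key, values in groups.items()}
print(word_counts)
```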
HTSstation: A Web Application and Open-Access Libraries for High-Throughput Sequencing Data Analysis
David, Fabrice P. A.; Delafontaine, Julien; Carat, Solenne; Ross, Frederick J.; Lefebvre, Gregory; Jarosz, Yohan; Sinclair, Lucas; Noordermeer, Daan; Rougemont, Jacques; Leleu, Marion
2014-01-01
The HTSstation analysis portal is a suite of simple web forms coupled to modular analysis pipelines for various applications of High-Throughput Sequencing including ChIP-seq, RNA-seq, 4C-seq and re-sequencing. HTSstation offers biologists the possibility to rapidly investigate their HTS data using an intuitive web application with heuristically pre-defined parameters. A number of open-source software components have been implemented and can be used to build, configure and run HTS analysis pipelines reactively. Besides, our programming framework empowers developers with the possibility to design their own workflows and integrate additional third-party software. The HTSstation web application is accessible at http://htsstation.epfl.ch. PMID:24475057
Moghazy, Amr; Abdelrahman, Amira; Fahim, Ayman
2012-01-01
Preparedness is a necessity for proper handling of emergencies and disaster, particularly in Suez Canal and Sinai regions. To assure best success rates, educative programs should be environmentally based. Burn and fire preventive educative programs were tailored to adapt social and education levels of audience. In addition, common etiologies and applicability of preventive measures, according to local resources and logistics, were considered. Presentations were the main educative tool; they were made as simple as possible to assure best understanding. To assure continuous education, brochures and stickers, containing most popular mistakes and questions, were distributed after the sessions. Audience was classified according to their level of knowledge to health professional group; students groups; high-risk group; and lay people group. For course efficacy evaluation, pre- and posttests were used immediately before and after the sessions. Right answers in both tests were compared for statistical significance. Results showed significant acquisition of proper attitude and knowledge in all educated groups. The highest was among students and the least was in health professionals. Comprehensive simple environmental-based educative programs are ideal for rapid reform and community mobilization in our region. Activities should include direct contact, stickers and flyers, and audiovisual tools if possible.
Status of Wrought FeCrAl-UO 2 Capsules Irradiated in the Advanced Test Reactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Field, Kevin G.; Harp, J.; Core, G.
2017-07-01
Candidate cladding materials for accident tolerant fuel applications require extensive testing and validation prior to commercial deployment within the nuclear power industry. One class of cladding materials, FeCrAl alloys, is currently undergoing such effort. Within these activities is a series of irradiation programs within the Advanced Test Reactor. These programs are developed to aid in commercial maturation and understand the fundamental mechanisms controlling the cladding performance during normal operation of a typical light water reactor. Three different irradiation programs are on-going; one designed as a simple proof-of-principle concept, the other to evaluate the susceptibility of FeCrAl to fuel-cladding chemical interaction, and the last to fully simulate the conditions of a pressurized water reactor experimentally. To date, nondestructive post-irradiation examination has been completed on the rodlet deemed FCA-L3 from the simple proof-of-concept irradiation program. Initial results show possible breach of the rodlet under irradiation but further studies are needed to conclusively determine whether breach has occurred and the underlying reasons for such a possible failure. Further work includes characterizing additional rodlets following irradiation.
Hypercluster parallel processing library user's manual
NASA Technical Reports Server (NTRS)
Quealy, Angela
1990-01-01
This User's Manual describes the Hypercluster Parallel Processing Library, composed of FORTRAN-callable subroutines which enable a FORTRAN programmer to manipulate and transfer information throughout the Hypercluster at NASA Lewis Research Center. Each subroutine and its parameters are described in detail. A simple heat flow application using Laplace's equation is included to demonstrate the use of some of the library's subroutines. The manual can be used initially as an introduction to the parallel features provided by the library. Thereafter it can be used as a reference when programming an application.
JGromacs: a Java package for analyzing protein simulations.
Münz, Márton; Biggin, Philip C
2012-01-23
In this paper, we introduce JGromacs, a Java API (Application Programming Interface) that facilitates the development of cross-platform data analysis applications for Molecular Dynamics (MD) simulations. The API supports parsing and writing file formats applied by GROMACS (GROningen MAchine for Chemical Simulations), one of the most widely used MD simulation packages. JGromacs builds on the strengths of object-oriented programming in Java by providing a multilevel object-oriented representation of simulation data to integrate and interconvert sequence, structure, and dynamics information. The easy-to-learn, easy-to-use, and easy-to-extend framework is intended to simplify and accelerate the implementation and development of complex data analysis algorithms. Furthermore, a basic analysis toolkit is included in the package. The programmer is also provided with simple tools (e.g., XML-based configuration) to create applications with a user interface resembling the command-line interface of GROMACS applications. JGromacs and detailed documentation is freely available from http://sbcb.bioch.ox.ac.uk/jgromacs under a GPLv3 license .
Software Template for Instruction in Mathematics
NASA Technical Reports Server (NTRS)
Shelton, Robert O.; Moebes, Travis A.; Beall, Anna
2005-01-01
Intelligent Math Tutor (IMT) is a software system that serves as a template for creating software for teaching mathematics. IMT can be easily connected to artificial-intelligence software and other analysis software through input and output of files. IMT provides an easy-to-use interface for generating courses that include tests that contain both multiple-choice and fill-in-the-blank questions, and enables tracking of test scores. IMT makes it easy to generate software for Web-based courses or to manufacture compact disks containing executable course software. IMT also can function as a Web-based application program, with features that run quickly on the Web, while retaining the intelligence of a high-level language application program with many graphics. IMT can be used to write application programs in text, graphics, and/or sound, so that the programs can be tailored to the needs of most handicapped persons. The course software generated by IMT follows a "back to basics" approach of teaching mathematics by inducing the student to apply creative mathematical techniques in the process of learning. Students are thereby made to discover mathematical fundamentals and thereby come to understand mathematics more deeply than they could through simple memorization.
TealLock 5.20 security software program for handheld devices.
Tahil, Fatimah A
2004-07-01
The TealLock has a simple graphic interface, and the program is user-friendly with well thought out options to customize security settings. The program is inexpensive and works seamlessly with the Palm OS platform's built-in basic Security application. The developer offers a 30-day free trial version and there is no downside to trying it to see if it meets your needs. It seems to be an effective security software program for psychiatrists who keep confidential and sensitive patient information on their PDAs. In keeping with HIPAA regulations, the TealLock bolsters security for protected health information stored on PDAs or other handheld devices by providing safeguards that address authentication, access control, encryption, and selected aspects of transmission.
A hybrid nonlinear programming method for design optimization
NASA Technical Reports Server (NTRS)
Rajan, S. D.
1986-01-01
Solutions to engineering design problems formulated as nonlinear programming (NLP) problems usually require the use of more than one optimization technique. Moreover, the interaction between the user (analysis/synthesis) program and the NLP system can lead to interface, scaling, or convergence problems. An NLP solution system is presented that seeks to solve these problems by providing a programming system to ease the user-system interface. A simple set of rules is used to select an optimization technique or to switch from one technique to another in an attempt to detect, diagnose, and solve some potential problems. Numerical examples involving finite element based optimal design of space trusses and rotor bearing systems are used to illustrate the applicability of the proposed methodology.
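A rule-based switch between optimization techniques, of the kind the abstract describes, can be sketched generically; the two strategies and the switching rule below are placeholders, not the rules used by the system in the paper.

```python
# Generic sketch of rule-based switching between optimization techniques
# (placeholder strategies and rules; not the NLP system described above).
def steepest_descent_step(x, grad, step=1e-2):
    return [xi - step * gi for xi, gi in zip(x, grad)]

def conservative_step(x, grad, step=1e-3):
    return [xi - step * gi for xi, gi in zip(x, grad)]

def optimize(f, grad_f, x, max_iter=200, tol=1e-8):
    technique = steepest_descent_step
    prev = f(x)
    for _ in range(max_iter):
        x_new = technique(x, grad_f(x))
        value = f(x_new)
        # Simple diagnostic rule: if the objective worsens, switch to the
        # more conservative technique instead of aborting.
        if value > prev:
            technique = conservative_step
            continue
        if abs(prev - value) < tol:
            break
        x, prev = x_new, value
    return x

# Usage on a toy quadratic.
f = lambda x: (x[0] - 3.0) ** 2 + (x[1] + 1.0) ** 2
grad_f = lambda x: [2.0 * (x[0] - 3.0), 2.0 * (x[1] + 1.0)]
print(optimize(f, grad_f, [0.0, 0.0]))
```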
Modular experimental platform for science and applications
NASA Technical Reports Server (NTRS)
Hill, A. S.
1984-01-01
A modularized, standardized spacecraft bus, known as MESA, suitable for a variety of science and applications missions is discussed. The basic bus consists of a simple structural arrangement housing attitude control, telemetry/command, electrical power, propulsion and thermal control subsystems. The general arrangement allows extensive subsystem adaptation to mission needs. Kits provide for the addition of tape recorders, increased power levels and propulsion growth. Both 3-axis and spin stabilized flight proven attitude control subsystems are available. The MESA bus can be launched on Ariane, as a secondary payload for low cost, or on the STS with a PAM-D or other suitable upper stage. Multi-spacecraft launches are possible with either booster. Launch vehicle integration is simple and cost-effective. The low cost of the MESA bus is achieved by the extensive utilization of existing subsystem design concepts and equipment, and efficient program management and test integration techniques.
An improved design method for EPC middleware
NASA Astrophysics Data System (ADS)
Lou, Guohuan; Xu, Ran; Yang, Chunming
2014-04-01
To address the problems and difficulties that small and medium enterprises currently face when using the EPC (Electronic Product Code) ALE (Application Level Events) specification to implement middleware, an improved design method for EPC middleware is presented, based on an analysis of the principles of EPC middleware. The method exploits the capabilities of the MySQL database, using the database to connect reader-writers with the upper application system instead of developing an ALE application program interface, and thereby achieves middleware with general functionality. The structure is simple and easy to implement and maintain. Under this structure, different types of added reader-writers can be configured conveniently and the expandability of the system is improved.
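A database-mediated path from reader-writer to application, as described above, can be sketched in a few lines. The sketch uses Python's built-in sqlite3 module as a stand-in for MySQL so it runs without a server; the table and column names are illustrative.

```python
# Sketch of database-mediated EPC middleware: the reader-writer process inserts
# raw tag reads, the upper application queries them. sqlite3 stands in for MySQL
# so the example is self-contained; table/column names are illustrative.
import sqlite3
import time

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE tag_reads (
                  epc        TEXT,
                  reader_id  TEXT,
                  read_time  REAL)""")

def reader_writer_report(reader_id, epcs):
    """Called on behalf of a reader-writer: store one read cycle."""
    now = time.time()
    db.executemany("INSERT INTO tag_reads VALUES (?, ?, ?)",
                   [(epc, reader_id, now) for epc in epcs])
    db.commit()

def application_report(since):
    """Called by the upper application: distinct tags seen since a timestamp,
    roughly playing the role of an ALE report."""
    rows = db.execute(
        "SELECT epc, COUNT(*) FROM tag_reads WHERE read_time >= ? GROUP BY epc",
        (since,))
    return dict(rows.fetchall())

start = time.time()
reader_writer_report("dock-door-1", ["urn:epc:id:sgtin:0614141.107346.2017",
                                     "urn:epc:id:sgtin:0614141.107346.2018"])
print(application_report(start))
```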
Synthetic Proxy Infrastructure for Task Evaluation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Junghans, Christoph; Pavel, Robert
The Synthetic Proxy Infrastructure for Task Evaluation is a proxy application designed to support application developers in gauging the performance of various task granularities when determining how best to utilize task-based programming models. The infrastructure provides examples of common communication patterns with a synthetic workload, producing performance data that can be used to evaluate programming model and platform overheads for the purpose of choosing a task granularity for task decomposition. It is presented as a reference implementation of a proxy application with run-time configurable input and output task dependencies, ranging from an embarrassingly parallel scenario to patterns with stencil-like dependencies upon their nearest neighbors. Once all of its inputs (if any) are satisfied, each task executes a synthetic workload (a simple DGEMM in this case) of varying size and delivers its outputs (if any) to the next tasks. The intent is for this reference implementation to be re-implemented as a proxy app in different programming models, so as to provide the same infrastructure and to allow application developers to simulate their own communication needs and assist in task decomposition under various models on a given platform.
Time-domain representation of frequency-dependent foundation impedance functions
Safak, E.
2006-01-01
Foundation impedance functions provide a simple means to account for soil-structure interaction (SSI) when studying seismic response of structures. Impedance functions represent the dynamic stiffness of the soil media surrounding the foundation. The fact that impedance functions are frequency dependent makes it difficult to incorporate SSI in standard time-history analysis software. This paper introduces a simple method to convert frequency-dependent impedance functions into time-domain filters. The method is based on the least-squares approximation of impedance functions by ratios of two complex polynomials. Such ratios are equivalent, in the time-domain, to discrete-time recursive filters, which are simple finite-difference equations giving the relationship between foundation forces and displacements. These filters can easily be incorporated into standard time-history analysis programs. Three examples are presented to show the applications of the method.
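The conversion the abstract describes (a least-squares fit of the impedance function by a ratio of polynomials, then use of that ratio as a recursive filter) can be sketched as follows; the impedance samples and filter orders below are synthetic and illustrative, not data from the paper.

```python
# Sketch: fit frequency-dependent impedance samples K(w_j) with a ratio of
# polynomials in z = exp(-i*w*dt), then apply the equivalent recursive filter
#   f[n] = sum_k b_k u[n-k] - sum_{k>=1} a_k f[n-k]
# Synthetic impedance samples are used purely for illustration.
import numpy as np

dt = 0.01                                   # time step [s]
w = np.linspace(0.1, 50.0, 200)             # sample frequencies [rad/s]
K = 1.0e6 * (1.0 + 0.3j * w / 50.0)         # synthetic impedance samples

M, N = 2, 2                                 # numerator / denominator orders
z = np.exp(-1j * w * dt)

# Linearized least squares (Levy-type): sum_k b_k z^k - K * sum_{k>=1} a_k z^k = K
cols = [z ** k for k in range(M + 1)] + [-K * z ** k for k in range(1, N + 1)]
A = np.column_stack(cols)
A_ri = np.vstack([A.real, A.imag])
K_ri = np.concatenate([K.real, K.imag])
coef, *_ = np.linalg.lstsq(A_ri, K_ri, rcond=None)
b, a = coef[:M + 1], coef[M + 1:]

def foundation_force(u):
    """Apply the recursive (discrete-time) filter to a displacement history u."""
    f = np.zeros_like(u)
    for n in range(len(u)):
        acc = sum(b[k] * u[n - k] for k in range(M + 1) if n - k >= 0)
        acc -= sum(a[k - 1] * f[n - k] for k in range(1, N + 1) if n - k >= 0)
        f[n] = acc
    return f

u = np.sin(2.0 * np.pi * 1.0 * np.arange(0.0, 2.0, dt))   # 1 Hz displacement history
print(foundation_force(u)[:5])
```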
Danilovich, Margaret K; Diaz, Laura; Saberbein, Gustavo; Healey, William E; Huber, Gail; Corcos, Daniel M
2017-01-01
We describe a community-engaged approach with Medicaid home and community-based services (HCBS), home care aide (HCA), client, and physical therapist stakeholders to develop a mobile application (app) exercise intervention through focus groups and interviews. Participants desired a short exercise program with modification capabilities, goal setting, and mechanisms to track progress. Concerns regarding participation were training needs and feasibility within usual care services. Technological preferences were for simple, easy-to-use, and engaging content. The app was piloted with HCA-client dyads (n = 5) to refine the intervention and evaluate content. Engaging stakeholders in intervention development provides valuable user-feedback on both desired exercise program contents and mobile technology preferences for HCBS recipients.
[Virtual surgical education: experience with medicine and surgery students].
Bonavina, Luigi; Mozzi, Enrico; Peracchia, Alberto
2003-01-01
The use of virtual reality simulation is currently being proposed within programs of postgraduate surgical education. The simple tasks that make up an operative procedure can be repeatedly performed until satisfactory execution is achieved, and the errors can be corrected by means of objective assessment. The aim of this study was to evaluate the applicability and the results of structured practice with the LapSim laparoscopic simulator used by undergraduate medical students. A significant reduction in operative time and errors was noted in several tasks (navigation, clipping, etc.). Although the transfer of technical skills to the operating room environment remains to be demonstrated, our research shows that this type of teaching is applicable to undergraduate medical students and in future may become a useful tool for selecting individuals for surgical residency programs.
Development of a simple, self-contained flight test data acquisition system
NASA Technical Reports Server (NTRS)
Renz, R. R. L.
1981-01-01
A low cost flight test data acquisition system, applicable to general aviation airplanes, was developed which meets criteria for doing longitudinal and lateral stability analysis. The package consists of (1) a microprocessor controller and data acquisition module; (2) a transducer module; and (3) a power supply module. The system is easy to install and occupies space in the cabin or baggage compartment of the airplane. All transducers are contained in these modules except the total pressure tube, static pressure air temperature transducer, and control position transducers. The NASA-developed MMLE program was placed on a microcomputer on which all data reduction is done. The flight testing program undertaken proved both the flight testing hardware and the data reduction method to be applicable to the current field of general aviation airplanes.
Macromolecular Calculations for the XTAL-System of Crystallographic Programs
1989-06-01
[Report documentation page; sponsoring organization: Office of Naval Research, contract ONR N00014-88-K-0323] ... of prior, difference, and updated maps, in addition to the usual BDF handling, is simple but a fruitful source of confusion. For the usual iterative
Gonçalves, Cristina P; Mohallem, José R
2004-11-15
We report the development of a simple algorithm to modify quantum chemistry codes based on the LCAO procedure, to account for the isotope problem in electronic structure calculations. No extra computations are required compared to standard Born-Oppenheimer calculations. An upgrade of the Gamess package called ISOTOPE is presented, and its applicability is demonstrated in some examples.
BOWS (bioinformatics open web services) to centralize bioinformatics tools in web services.
Velloso, Henrique; Vialle, Ricardo A; Ortega, J Miguel
2015-06-02
Bioinformaticians face a range of difficulties to get locally-installed tools running and producing results; they would greatly benefit from a system that could centralize most of the tools, using an easy interface for input and output. Web services, due to their universal nature and widely known interface, constitute a very good option to achieve this goal. Bioinformatics open web services (BOWS) is a system based on generic web services produced to allow programmatic access to applications running on high-performance computing (HPC) clusters. BOWS intermediates the access to registered tools by providing front-end and back-end web services. Programmers can install applications in HPC clusters in any programming language and use the back-end service to check for new jobs and their parameters, and then to send the results to BOWS. Programs running in simple computers consume the BOWS front-end service to submit new processes and read results. BOWS compiles Java clients, which encapsulate the front-end web service requisitions, and automatically creates a web page that disposes the registered applications and clients. Bioinformatics open web services registered applications can be accessed from virtually any programming language through web services, or using standard java clients. The back-end can run in HPC clusters, allowing bioinformaticians to remotely run high-processing demand applications directly from their machines.
ERIC Educational Resources Information Center
Endres, Frank L.
Symbolic Interactive Matrix Processing Language (SIMPLE) is a conversational matrix-oriented source language suited to a batch or a time-sharing environment. The two modes of operation of SIMPLE are conversational mode and programing mode. This program uses a TAURUS time-sharing system and cathode ray terminals or teletypes. SIMPLE performs all…
NASA Astrophysics Data System (ADS)
De Lucas, Javier
2015-03-01
A simple geometrical model for calculating the effective emissivity in blackbody cylindrical cavities has been developed. The back ray tracing technique and the Monte Carlo method have been employed, making use of a suitable set of coordinates and auxiliary planes. In these planes, the trajectories of individual photons in the successive reflections between the cavity points are followed in detail. The theoretical model is implemented by using simple numerical tools, programmed in Microsoft Visual Basic for Applications and Excel. The algorithm is applied to isothermal and non-isothermal diffuse cylindrical cavities with a lid; however, the basic geometrical structure can be generalized to a cylindro-conical shape and specular reflection. Additionally, the numerical algorithm and the program source code can be used, with minor changes, for determining the distribution of the cavity points where photon absorption takes place. This distribution could be applied to the study of the influence of thermal gradients on the effective emissivity profiles, for example. Validation is performed by analyzing the convergence of the Monte Carlo method as a function of the number of trials and by comparison with published results of different authors.
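The Monte Carlo idea can be conveyed with a deliberately over-simplified sketch: instead of tracing rays through the actual cylinder geometry (as the model above does), each incoming photon is assumed to escape through the aperture with a fixed probability per reflection. The escape probability and emissivity values are illustrative assumptions.

```python
# Deliberately simplified Monte Carlo illustration of effective emissivity.
# Unlike the geometrical model above, no real ray tracing is done: each photon
# entering through the aperture is absorbed with probability eps at every wall
# interaction, and otherwise escapes back out with a fixed probability p_escape.
# By reciprocity, the absorbed fraction estimates the effective emissivity.
import random

def effective_emissivity(eps, p_escape, n_photons=200_000, seed=1):
    rng = random.Random(seed)
    absorbed = 0
    for _ in range(n_photons):
        while True:
            if rng.random() < eps:          # absorbed at the wall
                absorbed += 1
                break
            if rng.random() < p_escape:     # reflected out through the aperture
                break
    return absorbed / n_photons

eps, p_escape = 0.7, 0.05                   # illustrative wall emissivity / escape prob.
mc = effective_emissivity(eps, p_escape)
analytic = eps / (eps + (1.0 - eps) * p_escape)   # closed form for this toy model
print(f"Monte Carlo: {mc:.4f}   toy analytic: {analytic:.4f}")
```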
Moessfit. A free Mössbauer fitting program
NASA Astrophysics Data System (ADS)
Kamusella, Sirko; Klauss, Hans-Henning
2016-12-01
A free data analysis program for Mössbauer spectroscopy was developed to address commonly faced needs such as simultaneous fitting of multiple data sets, the Maximum Entropy Method, and proper error estimation. The program is written in C++ using the Qt application framework and the GNU Scientific Library. Moessfit makes use of multithreading to exploit the multi-core CPU capacity of modern PCs. The whole fit is specified in a text input file, which simplifies the workflow for the user and provides a simple starting point in Mössbauer data analysis for beginners. However, the possibility to define arbitrary parameter dependencies and distributions, as well as relaxation spectra, makes Moessfit interesting for advanced users as well.
COMP Superscalar, an interoperable programming framework
NASA Astrophysics Data System (ADS)
Badia, Rosa M.; Conejero, Javier; Diaz, Carlos; Ejarque, Jorge; Lezzi, Daniele; Lordan, Francesc; Ramon-Cortes, Cristian; Sirvent, Raul
2015-12-01
COMPSs is a programming framework that aims to facilitate the parallelization of existing applications written in Java, C/C++ and Python scripts. For that purpose, it offers a simple programming model based on sequential development in which the user is mainly responsible for (i) identifying the functions to be executed as asynchronous parallel tasks and (ii) annotating them with annotations or standard Python decorators. A runtime system is in charge of exploiting the inherent concurrency of the code, automatically detecting and enforcing the data dependencies between tasks and spawning these tasks to the available resources, which can be nodes in a cluster, clouds or grids. In cloud environments, COMPSs provides scalability and elasticity features allowing the dynamic provision of resources.
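A sequential-looking program annotated for parallel execution, in the style described above, might look like the following sketch; the exact decorator import paths are an assumption of this illustration rather than a quotation of the COMPSs documentation.

```python
# Sketch of COMPSs-style Python (PyCOMPSs) task annotation: the user marks
# functions as tasks and the runtime builds and schedules the dependency graph.
# Import paths shown here are an assumption of this sketch.
from pycompss.api.task import task
from pycompss.api.api import compss_wait_on

@task(returns=1)
def square(x):
    return x * x

@task(returns=1)
def add(a, b):
    return a + b

def main():
    partials = [square(i) for i in range(8)]      # spawned as asynchronous tasks
    total = partials[0]
    for p in partials[1:]:
        total = add(total, p)                      # dependencies detected automatically
    total = compss_wait_on(total)                  # synchronize to retrieve the value
    print("sum of squares:", total)

if __name__ == "__main__":
    main()
```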
NASA Astrophysics Data System (ADS)
Hinze, Ralf
Programmers happily use induction to prove properties of recursive programs. To show properties of corecursive programs they employ coinduction, but perhaps less enthusiastically. Coinduction is often considered a rather low-level proof method, in particular, as it departs quite radically from equational reasoning. Corecursive programs are conveniently defined using recursion equations. Suitably restricted, these equations possess unique solutions. Uniqueness gives rise to a simple and attractive proof technique, which essentially brings equational reasoning to the coworld. We illustrate the approach using two major examples: streams and infinite binary trees. Both coinductive types exhibit a rich structure: they are applicative functors or idioms, and they can be seen as memo-tables or tabulations. We show that definitions and calculations benefit immensely from this additional structure.
Special report writer: A flexible information management system. Documentation and user's manual
NASA Technical Reports Server (NTRS)
Greene, W. A.
1976-01-01
A special report writer (SRR) was developed which performs multiple correlations on files containing several data hierarchies. Output reports are specified in a simple notation, readily learned by persons having limited familiarity with ADP. The SRR system can be adopted by other NASA installations while the basic techniques themselves are compatible with the information management needs of a wide range of organizations. Specifically, the program lends itself to generalization and can be readily adapted for other file management purposes. Extensive details on the characteristics of the SRR program are presented along with a full explanation of the system for those contemplating its application to other data bases. The complete COBOL program and documentation are available.
Transient upset models in computer systems
NASA Technical Reports Server (NTRS)
Mason, G. M.
1983-01-01
Essential factors for the design of transient upset monitors for computers are discussed. The upset is a system level event that is software dependent. It can occur in the program flow, the opcode set, the opcode address domain, the read address domain, and the write address domain. Most upsets are in the program flow. It is shown that simple, external monitors functioning transparently relative to the system operations can be built if a detailed accounting is made of the characteristics of the faults that can happen. Sample applications are provided for different states of the Z-80 and 8085 based system.
GIS data models for coal geology
DOE Office of Scientific and Technical Information (OSTI.GOV)
McColloch, G.H. Jr.; Timberlake, K.J.; Oldham, A.V.
A variety of spatial data models can be applied to different aspects of coal geology. The simple vector data models found in various Computer Aided Drafting (CAD) programs are sometimes used for routine mapping and some simple analyses. However, more sophisticated applications that maintain the topological relationships between cartographic elements enhance analytical potential. Also, vector data models are best for producing various types of high quality, conventional maps. The raster data model is generally considered best for representing data that varies continuously over a geographic area, such as the thickness of a coal bed. Information is lost when contour lines are threaded through raster grids for display, so volumes and tonnages are more accurately determined by working directly with raster data. Raster models are especially well suited to computationally simple surface-to-surface analysis, or overlay functions. Another data model, the triangulated irregular network (TIN), is superior at portraying visible surfaces because many TIN programs support break lines. Break lines locate sharp breaks in slope such as those generated by bodies of water or ridge crests. TINs also "honor" data points so that a surface generated from a set of points will be forced to pass through those points. TINs, or grids generated from TINs, are particularly good at determining the intersections of surfaces such as coal seam outcrops and geologic unit boundaries. No single technique works best for all coal-related applications. The ability to use a variety of data models, and transform from one model to another, is essential for obtaining optimum results in a timely manner.
Simple linear and multivariate regression models.
Rodríguez del Águila, M M; Benítez-Parejo, N
2011-01-01
In biomedical research it is common to find problems in which we wish to relate a response variable to one or more variables capable of describing the behaviour of the former variable by means of mathematical models. Regression techniques are used to this effect, in which an equation is determined relating the two variables. While such equations can have different forms, linear equations are the most widely used form and are easy to interpret. The present article describes simple and multiple linear regression models, how they are calculated, and how their applicability assumptions are checked. Illustrative examples are provided, based on the use of the freely accessible R program. Copyright © 2011 SEICAP. Published by Elsevier Espana. All rights reserved.
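A minimal worked example of the simple linear regression the article describes, using ordinary least squares on made-up data (the article's own examples use the R program; this sketch is an independent illustration):

```python
# Simple linear regression (ordinary least squares) on made-up data, illustrating
# the slope/intercept calculation described above.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])        # predictor
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8, 12.2])        # response

slope = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
intercept = y.mean() - slope * x.mean()

residuals = y - (intercept + slope * x)
r_squared = 1.0 - np.sum(residuals ** 2) / np.sum((y - y.mean()) ** 2)

print(f"y = {intercept:.3f} + {slope:.3f} x   (R^2 = {r_squared:.3f})")
```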
Sakhteman, Amirhossein; Zare, Bijan
2016-01-01
An interactive application, Modelface, was presented for Modeller software based on windows platform. The application is able to run all steps of homology modeling including pdb to fasta generation, running clustal, model building and loop refinement. Other modules of modeler including energy calculation, energy minimization and the ability to make single point mutations in the PDB structures are also implemented inside Modelface. The API is a simple batch based application with no memory occupation and is free of charge for academic use. The application is also able to repair missing atom types in the PDB structures making it suitable for many molecular modeling studies such as docking and molecular dynamic simulation. Some successful instances of modeling studies using Modelface are also reported. PMID:28243276
76 FR 17718 - Simple Alternatives, LLC and The RBB Fund, Inc.; Notice of Application
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-30
... SECURITIES AND EXCHANGE COMMISSION [Investment Company Act Release No. 29616; 812-13801] Simple... approval and would grant relief from certain disclosure requirements. Applicants: Simple Alternatives, LLC (``Simple Alternatives'') and The RBB Fund, Inc. (the ``Company''). Filing Dates: The application was filed...
Asynchronous Message Service Reference Implementation
NASA Technical Reports Server (NTRS)
Burleigh, Scott C.
2011-01-01
This software provides a library of middleware functions with a simple application programming interface, enabling implementation of distributed applications in conformance with the CCSDS AMS (Consultative Committee for Space Data Systems Asynchronous Message Service) specification. The AMS service, and its protocols, implement an architectural concept under which the modules of mission systems may be designed as if they were to operate in isolation, each one producing and consuming mission information without explicit awareness of which other modules are currently operating. Communication relationships among such modules are self-configuring; this tends to minimize complexity in the development and operations of modular data systems. A system built on this model is a society of generally autonomous, inter-operating modules that may fluctuate freely over time in response to changing mission objectives, module functional upgrades, and recovery from individual module failure. The purpose of AMS, then, is to reduce mission cost and risk by providing standard, reusable infrastructure for the exchange of information among data system modules in a manner that is simple to use, highly automated, flexible, robust, scalable, and efficient. The implementation is designed to spawn multiple threads of AMS functionality under the control of an AMS application program. These threads enable all members of an AMS-based, distributed application to discover one another in real time, subscribe to messages on specific topics, and to publish messages on specific topics. The query/reply (client/server) communication model is also supported. Message exchange is optionally subject to encryption (to support confidentiality) and authorization. Fault tolerance measures in the discovery protocol minimize the likelihood of overall application failure due to any single operational error anywhere in the system. The multi-threaded design simplifies processing while enabling application nodes to operate at high speeds; linked lists protected by mutex semaphores and condition variables are used for efficient, inter-thread communication. Applications may use a variety of transport protocols underlying AMS itself, including TCP (Transmission Control Protocol), UDP (User Datagram Protocol), and message queues.
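The publish/subscribe pattern described above can be illustrated with a small in-process sketch using threads and queues; this is an illustration of the architectural concept only, not the AMS API or its discovery protocol.

```python
# In-process illustration of the publish/subscribe concept (not the AMS API):
# modules register interest in subjects and receive messages without knowing
# which other modules produced them.
import queue
import threading

class MessageBus:
    def __init__(self):
        self._subs = {}           # subject -> list of per-subscriber queues
        self._lock = threading.Lock()

    def subscribe(self, subject):
        q = queue.Queue()
        with self._lock:
            self._subs.setdefault(subject, []).append(q)
        return q

    def publish(self, subject, message):
        with self._lock:
            targets = list(self._subs.get(subject, []))
        for q in targets:
            q.put(message)

bus = MessageBus()
telemetry = bus.subscribe("telemetry")

def consumer():
    for _ in range(2):
        print("received:", telemetry.get())

t = threading.Thread(target=consumer)
t.start()
bus.publish("telemetry", {"battery_v": 27.9})
bus.publish("telemetry", {"battery_v": 28.1})
t.join()
```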
Control of H2S emissions using an ozone oxidation process: Preliminary results
NASA Technical Reports Server (NTRS)
Defaveri, D.; Ferrando, B.; Ferraiolo, G.
1986-01-01
The problem of eliminating industrial emission odors does not have a simple solution, and consequently has not been researched extensively. Therefore, an experimental research program regarding oxidation of H2S through ozone was undertaken to verify the applicable limits of the procedure and, in addition, was designed to supply a useful analytical means of rationalizing the design of reactors employed in the sector.
Lithium Niobate Arithmetic Logic Unit
1991-03-01
[Boot51] A.D. Booth, "A Signed Binary Multiplication Technique," Quarterly Journal of Mechanics and Applied Mathematics, Vol. IV, Part 2, 1951. [ChWi79] ...Trans. Computers, Vol. C-26, No. 7, July 1977, pp. 681-687. [Wake81] John F. Wakerly, "Microcomputer Architecture and Programming," John Wiley and... The report surveys different division methods and discusses their applicability to simple bit-serial implementation. Several different designs are then presented and
Balsillie, J.H.; Donoghue, J.F.; Butler, K.M.; Koch, J.L.
2002-01-01
Two-dimensional plotting tools can be of invaluable assistance in analytical scientific pursuits, and have been widely used in the analysis and interpretation of sedimentologic data. We consider, in this work, the use of arithmetic probability paper (APP). Most statistical computer applications do not allow for the generation of APP plots, because of apparent intractable nonlinearity of the percentile (or probability) axis of the plot. We have solved this problem by identifying an equation(s) for determining plotting positions of Gaussian percentiles (or probabilities), so that APP plots can easily be computer generated. An EXCEL example is presented, and a programmed, simple-to-use EXCEL application template is hereby made publicly available, whereby a complete granulometric analysis including data listing, moment measure calculations, and frequency and cumulative APP plots, is automatically produced.
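The key step behind such a template is computing Gaussian percentile plotting positions so that the probability axis becomes linear. A minimal sketch using the standard probit (inverse normal CDF) transform, with hypothetical grain-size data, is shown below; the paper's own equation and Excel implementation are not reproduced here.

```python
# Sketch: linearizing the probability axis of an arithmetic probability (APP) plot.
# The paper gives its own plotting-position equation and an Excel template; this
# uses the standard probit (inverse normal CDF) transform as an equivalent step.
import numpy as np
from scipy.stats import norm
import matplotlib.pyplot as plt

# Hypothetical grain-size data: phi values and cumulative percent.
phi = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
cum_pct = np.array([2.0, 10.0, 30.0, 55.0, 78.0, 93.0, 99.0])

# Map each cumulative percentage to a standard normal quantile, so a Gaussian
# distribution plots as a straight line.
z = norm.ppf(cum_pct / 100.0)

plt.plot(phi, z, "o-")
# Label the probability axis with the original percentages at their probit positions.
ticks = [1, 5, 10, 25, 50, 75, 90, 95, 99]
plt.yticks(norm.ppf(np.array(ticks) / 100.0), [str(t) for t in ticks])
plt.xlabel("grain size (phi)")
plt.ylabel("cumulative percent")
plt.show()
```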
An application of artificial intelligence to automatic telescopes
NASA Technical Reports Server (NTRS)
Swanson, Keith; Drummond, Mark; Bresina, John
1992-01-01
Automatic Photoelectric Telescopes (APT's) allow an astronomer to be removed from the telescope site in both time and space. APT's 'execute' an observation program (a set of observation requests) expressed in an ASCII-based language (ATIS) and collect observation results expressed in this same language. The observation program is currently constructed by a Principal Astronomer from the requests of multiple users; the execution is currently controlled by a simple heuristic dispatch scheduler. Research aimed at improving the use of APT's is being carried out by the Entropy Reduction Engine (ERE) project at NASA Ames. The overall goal of the ERE project is the study and construction of systems that integrate planning, scheduling, and control. This paper discusses the application of some ERE technical results to the improvement of both the scheduling and the operation of APT's.
cp-R, an interface the R programming language for clinical laboratory method comparisons.
Holmes, Daniel T
2015-02-01
Clinical scientists frequently need to compare two different bioanalytical methods as part of assay validation/monitoring. As a matter of necessity, regression methods for quantitative comparison in clinical chemistry, hematology and other clinical laboratory disciplines must allow for error in both the x and y variables. Traditionally, the methods popularized by 1) Deming and 2) Passing and Bablok have been recommended. While commercial tools exist, no simple open source tool is available. The purpose of this work was to develop an entirely open-source, GUI-driven program for bioanalytical method comparisons capable of performing these regression methods and able to produce highly customized graphical output. The GUI is written in Python and PyQt4, with R scripts performing regression and graphical functions. The program can be run from source code or as a pre-compiled binary executable. The software performs three forms of regression and offers weighting where applicable. Confidence bands of the regression are calculated using bootstrapping for the Deming and Passing-Bablok methods. Users can customize regression plots according to the tools available in R and can produce output in jpg, png, tiff, or bmp format at any desired resolution, or in ps and pdf vector formats. Bland-Altman plots and some regression diagnostic plots are also generated. Correctness of regression parameter estimates was confirmed against existing R packages. The program allows for rapid and highly customizable graphical output capable of conforming to the publication requirements of any clinical chemistry journal. Quick method comparisons can also be performed and the results cut and pasted into spreadsheet or word processing applications. We present a simple and intuitive open source tool for quantitative method comparison in a clinical laboratory environment. Copyright © 2014 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
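As a point of reference, the Deming estimator referred to above has a simple closed form. The sketch below shows that textbook estimator on hypothetical paired assay data; it is not cp-R's implementation and omits the bootstrap confidence bands.

```python
# Sketch of Deming regression (errors in both x and y), one of the regression
# methods mentioned above; this is the textbook closed-form estimator, not
# cp-R's actual implementation.
import numpy as np

def deming(x, y, delta=1.0):
    """delta = ratio of y-error variance to x-error variance (1.0 -> orthogonal)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xbar, ybar = x.mean(), y.mean()
    sxx = np.sum((x - xbar) ** 2)
    syy = np.sum((y - ybar) ** 2)
    sxy = np.sum((x - xbar) * (y - ybar))
    slope = (syy - delta * sxx + np.sqrt((syy - delta * sxx) ** 2
             + 4 * delta * sxy ** 2)) / (2 * sxy)
    intercept = ybar - slope * xbar
    return slope, intercept

# Example: two hypothetical assays measuring the same analyte.
x = [1.1, 2.0, 3.2, 4.1, 5.0, 6.2]
y = [1.3, 2.2, 3.0, 4.4, 5.3, 6.1]
print(deming(x, y))
```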
Pattern recognition analysis and classification modeling of selenium-producing areas
Naftz, D.L.
1996-01-01
Established chemometric and geochemical techniques were applied to water quality data from 23 National Irrigation Water Quality Program (NIWQP) study areas in the Western United States. These techniques were applied to the NIWQP data set to identify common geochemical processes responsible for mobilization of selenium and to develop a classification model that uses major-ion concentrations to identify areas that contain elevated selenium concentrations in water that could pose a hazard to waterfowl. Pattern recognition modeling of the simple-salt data computed with the SNORM geochemical program indicates three principal components that explain 95% of the total variance. A three-dimensional plot of PC 1, 2 and 3 scores shows three distinct clusters that correspond to distinct hydrochemical facies denoted as facies 1, 2 and 3. Facies 1 samples are distinguished by water samples without the CaCO3 simple salt and elevated concentrations of NaCl, CaSO4, MgSO4 and Na2SO4 simple salts relative to water samples in facies 2 and 3. Water samples in facies 2 are distinguished from facies 1 by the absence of the MgSO4 simple salt and the presence of the CaCO3 simple salt. Water samples in facies 3 are similar to samples in facies 2, with the absence of both MgSO4 and CaSO4 simple salts. Water samples in facies 1 have the largest selenium concentration (10 µg l-1), compared to a median concentration of 2.0 µg l-1 and less than 1.0 µg l-1 for samples in facies 2 and 3. A classification model using the soft independent modeling by class analogy (SIMCA) algorithm was constructed with data from the NIWQP study areas. The classification model was successful in identifying water samples with a selenium concentration that is hazardous to some species of waterfowl from a test data set composed of 2,060 water samples from throughout Utah and Wyoming. Application of chemometric and geochemical techniques during data synthesis and analysis of multivariate environmental databases from other national-scale environmental programs such as the NIWQP could also provide useful insights for addressing 'real world' environmental problems.
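To illustrate the pattern-recognition step, the sketch below runs a principal component analysis on a small, entirely hypothetical simple-salt composition matrix; the actual NIWQP/SNORM data and the SIMCA classification step are not reproduced.

```python
# Sketch: principal component analysis of a simple-salt composition matrix, echoing
# the pattern-recognition step described above. The five-sample matrix below is
# hypothetical; the actual NIWQP/SNORM data are not reproduced here.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Columns: NaCl, CaSO4, MgSO4, Na2SO4, CaCO3 (simple-salt fractions, hypothetical).
salts = np.array([
    [0.40, 0.25, 0.20, 0.15, 0.00],   # facies-1-like: sulfates present, no CaCO3
    [0.35, 0.30, 0.18, 0.17, 0.00],
    [0.30, 0.20, 0.00, 0.10, 0.40],   # facies-2-like: CaCO3 present, no MgSO4
    [0.28, 0.22, 0.00, 0.08, 0.42],
    [0.25, 0.00, 0.00, 0.05, 0.70],   # facies-3-like: no MgSO4 or CaSO4
])

scores = PCA(n_components=3).fit_transform(StandardScaler().fit_transform(salts))
print(scores)   # PC 1-3 scores; clusters would correspond to hydrochemical facies
```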
Integrated Task and Data Parallel Programming
NASA Technical Reports Server (NTRS)
Grimshaw, A. S.
1998-01-01
This research investigates the combination of task and data parallel language constructs within a single programming language. There are a number of applications that exhibit properties which would be well served by such an integrated language. Examples include global climate models, aircraft design problems, and multidisciplinary design optimization problems. Our approach incorporates data parallel language constructs into an existing, object oriented, task parallel language. The language will support creation and manipulation of parallel classes and objects of both types (task parallel and data parallel). Ultimately, the language will allow data parallel and task parallel classes to be used either as building blocks or managers of parallel objects of either type, thus allowing the development of single and multi-paradigm parallel applications. 1995 Research Accomplishments: In February I presented a paper at Frontiers 1995 describing the design of the data parallel language subset. During the spring I wrote and defended my dissertation proposal. Since that time I have developed a runtime model for the language subset. I have begun implementing the model and hand-coding simple examples which demonstrate the language subset. I have identified an astrophysical fluid flow application which will validate the data parallel language subset. 1996 Research Agenda: Milestones for the coming year include implementing a significant portion of the data parallel language subset over the Legion system. Using simple hand-coded methods, I plan to demonstrate (1) concurrent task and data parallel objects and (2) task parallel objects managing both task and data parallel objects. My next steps will focus on constructing a compiler and implementing the fluid flow application with the language. Concurrently, I will conduct a search for a real-world application exhibiting both task and data parallelism within the same program. Additional 1995 Activities: During the fall I collaborated with Andrew Grimshaw and Adam Ferrari to write a book chapter which will be included in Parallel Processing in C++ edited by Gregory Wilson. I also finished two courses, Compilers and Advanced Compilers, in 1995. These courses complete my class requirements at the University of Virginia. I have only my dissertation research and defense to complete.
Integrated Task And Data Parallel Programming: Language Design
NASA Technical Reports Server (NTRS)
Grimshaw, Andrew S.; West, Emily A.
1998-01-01
This research investigates the combination of task and data parallel language constructs within a single programming language. There are a number of applications that exhibit properties which would be well served by such an integrated language. Examples include global climate models, aircraft design problems, and multidisciplinary design optimization problems. Our approach incorporates data parallel language constructs into an existing, object oriented, task parallel language. The language will support creation and manipulation of parallel classes and objects of both types (task parallel and data parallel). Ultimately, the language will allow data parallel and task parallel classes to be used either as building blocks or managers of parallel objects of either type, thus allowing the development of single and multi-paradigm parallel applications. 1995 Research Accomplishments: In February I presented a paper at Frontiers '95 describing the design of the data parallel language subset. During the spring I wrote and defended my dissertation proposal. Since that time I have developed a runtime model for the language subset. I have begun implementing the model and hand-coding simple examples which demonstrate the language subset. I have identified an astrophysical fluid flow application which will validate the data parallel language subset. 1996 Research Agenda: Milestones for the coming year include implementing a significant portion of the data parallel language subset over the Legion system. Using simple hand-coded methods, I plan to demonstrate (1) concurrent task and data parallel objects and (2) task parallel objects managing both task and data parallel objects. My next steps will focus on constructing a compiler and implementing the fluid flow application with the language. Concurrently, I will conduct a search for a real-world application exhibiting both task and data parallelism within the same program. Additional 1995 Activities: During the fall I collaborated with Andrew Grimshaw and Adam Ferrari to write a book chapter which will be included in Parallel Processing in C++ edited by Gregory Wilson. I also finished two courses, Compilers and Advanced Compilers, in 1995. These courses complete my class requirements at the University of Virginia. I have only my dissertation research and defense to complete.
1997-12-01
Fracture Analysis of the F-5, 15%-Spar Bolt, Dr. Devendra Kumar, SAALC/LD, CUNY-City College, New York, NY. A Simple, Multiversion Concurrency Control...Program, University of Dayton, Dayton, OH. [3] AFGROW, Air Force Crack Propagation Analysis Program, Version 3.82 (1997). A Simple, Multiversion Concurrency Control...Air Force Office of Scientific Research, Bolling Air Force Base, DC, and San Antonio Air Logistics Center, August 1997.
Automating Disk Forensic Processing with SleuthKit, XML and Python
2009-05-01
Automating Disk Forensic Processing with SleuthKit, XML and Python. Simson L. Garfinkel. Abstract: We have developed a program called fiwalk which...files themselves. We show how it is relatively simple to create automated disk forensic applications using a Python module we have written that reads...software that the portable device may contain. Keywords: Computer Forensics; XML; Sleuth Kit; Python. I. INTRODUCTION: In recent years we have found many
The Extensibility of an Interpreted Language Using Plugin Libraries
NASA Astrophysics Data System (ADS)
Herceg, Dorde; Radaković, Davorka
2011-09-01
Dynamic geometry software (DGS) refers to computer programs that allow one to create and manipulate geometrical drawings. Such programs are mostly used in teaching and studying geometry. However, DGS can also be used to develop interactive drawings not directly related to geometry. Examples include teaching materials for numerical mathematics at secondary school and university levels, or interactive mathematical games for elementary school children. Such applications often surpass the intended purposes of the DGS and may require complicated programming on the part of the user. In this paper we present a simple plug-in model which enables easy development and deployment of interactive GUI components for "Geometrijica", a DGS we are developing on Silverlight.
Study of CdTe/CdS solar cell at low power density for low-illumination applications
NASA Astrophysics Data System (ADS)
Devi, Nisha; Aziz, Anver; Datta, Shouvik
2016-05-01
In this paper, we numerically investigate CdTe/CdS PV cell properties using the Solar Cell Capacitance Simulator in 1D (SCAPS-1D) simulation program. A simple structure of CdTe PV cell has been optimized to study the effect of temperature, absorber thickness and work function at very low incident power. The objective of this research paper is to build an efficient and cost-effective solar cell for portable electronic devices such as portable computers and cell phones that work at low incident power, because most such devices operate under diffused and reflected sunlight. In this report, we simulated a simple CdTe PV cell at very low incident power, which gives good efficiency.
Improvements in Spectrum's fit to program data tool.
Mahiane, Severin G; Marsh, Kimberly; Grantham, Kelsey; Crichlow, Shawna; Caceres, Karen; Stover, John
2017-04-01
The Joint United Nations Program on HIV/AIDS-supported Spectrum software package (Glastonbury, Connecticut, USA) is used by most countries worldwide to monitor the HIV epidemic. In Spectrum, HIV incidence trends among adults (aged 15-49 years) are derived by either fitting to seroprevalence surveillance and survey data or generating curves consistent with program and vital registration data, such as historical trends in the number of newly diagnosed infections or people living with HIV and AIDS related deaths. This article describes development and application of the fit to program data (FPD) tool in Joint United Nations Program on HIV/AIDS' 2016 estimates round. In the FPD tool, HIV incidence trends are described as a simple or double logistic function. Function parameters are estimated from historical program data on newly reported HIV cases, people living with HIV or AIDS-related deaths. Inputs can be adjusted for proportions undiagnosed or misclassified deaths. Maximum likelihood estimation or minimum chi-squared distance methods are used to identify the best fitting curve. Asymptotic properties of the estimators from these fits are used to estimate uncertainty. The FPD tool was used to fit incidence for 62 countries in 2016. Maximum likelihood and minimum chi-squared distance methods gave similar results. A double logistic curve adequately described observed trends in all but four countries where a simple logistic curve performed better. Robust HIV-related program and vital registration data are routinely available in many middle-income and high-income countries, whereas HIV seroprevalence surveillance and survey data may be scarce. In these countries, the FPD tool offers a simpler, improved approach to estimating HIV incidence trends.
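As an illustration of the curve-fitting idea, the sketch below fits a simple logistic incidence trend to hypothetical annual case counts with ordinary least squares; the FPD tool's actual parameterization, maximum likelihood and minimum chi-squared machinery, and uncertainty estimation are not reproduced.

```python
# Sketch: fitting a simple logistic trend to annual newly reported case counts,
# in the spirit of the FPD tool described above. The parameterization and the
# least-squares fit are illustrative assumptions, not the tool's actual methods.
import numpy as np
from scipy.optimize import curve_fit

def simple_logistic(t, a, r, t0):
    """Trend rises toward asymptote a with rate r and midpoint year t0."""
    return a / (1.0 + np.exp(-r * (t - t0)))

years = np.arange(2000, 2016)
cases = np.array([120, 180, 260, 390, 520, 700, 830, 950, 1020, 1100,
                  1140, 1170, 1180, 1195, 1200, 1205], float)   # hypothetical data

popt, pcov = curve_fit(simple_logistic, years, cases, p0=[1200.0, 0.5, 2005.0])
print("asymptote, rate, midpoint:", popt)
```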
Application of the "see one, do one, teach one" concept in surgical training.
Kotsis, Sandra V; Chung, Kevin C
2013-05-01
The traditional method of teaching in surgery is known as "see one, do one, teach one." However, many have argued that this method is no longer applicable, mainly because of concerns for patient safety. The purpose of this article is to show that the basis of the traditional teaching method is still valid in surgical training if it is combined with various adult learning principles. The authors reviewed literature regarding the history of the formation of the surgical residency program, adult learning principles, mentoring, and medical simulation. The authors provide examples for how these learning techniques can be incorporated into a surgical resident training program. The surgical residency program created by Dr. William Halsted remained virtually unchanged until recently, with reductions in resident work hours and changes to a competency-based training system. Such changes have reduced the teaching time between attending physicians and residents. Learning principles such as "experience, observation, thinking, and action," as well as deliberate practice, can be used to train residents. Mentoring is also an important aspect in teaching surgical technique. The authors review the different types of simulators (standardized patients, virtual reality applications, and high-fidelity mannequin simulators) and the advantages and disadvantages of using them. The traditional teaching method of "see one, do one, teach one" in surgical residency programs is simple but still applicable. It needs to evolve with current changes in the medical system to adequately train surgical residents and also provide patients with safe, evidence-based care.
Squid - a simple bioinformatics grid.
Carvalho, Paulo C; Glória, Rafael V; de Miranda, Antonio B; Degrave, Wim M
2005-08-03
BLAST is a widely used genetic research tool for analysis of similarity between nucleotide and protein sequences. This paper presents a software application entitled "Squid" that makes use of grid technology. The current version, as an example, is configured for BLAST applications, but adaptation for other computing intensive repetitive tasks can be easily accomplished in the open source version. This enables the allocation of remote resources to perform distributed computing, making large BLAST queries viable without the need of high-end computers. Most distributed computing/grid solutions have complex installation procedures requiring a computer specialist, or have limitations regarding operating systems. Squid is a multi-platform, open-source program designed to "keep things simple" while offering high-end computing power for large scale applications. Squid also has an efficient fault tolerance and crash recovery system against data loss, being able to re-route jobs upon node failure and recover even if the master machine fails. Our results show that a Squid application, working with N nodes and proper network resources, can process BLAST queries almost N times faster than if working with only one computer. Squid offers high-end computing, even for the non-specialist, and is freely available at the project web site. Its open-source and binary Windows distributions contain detailed instructions and a "plug-n-play" installation containing a pre-configured example.
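The core idea of distributing a large BLAST job is to split the query set across nodes. The sketch below shows a round-robin split of a multi-FASTA file into per-node chunks; the file name is hypothetical, and Squid's scheduling, fault tolerance, and crash recovery are not shown.

```python
# Sketch: splitting a multi-FASTA query file into N chunks for distribution to
# worker nodes, the basic idea behind running a large BLAST job on N machines.
# Squid's scheduling, fault tolerance, and crash recovery are not shown here.
def split_fasta(path, n_chunks):
    with open(path) as fh:
        records, current = [], []
        for line in fh:
            if line.startswith(">") and current:
                records.append("".join(current))
                current = []
            current.append(line)
        if current:
            records.append("".join(current))
    chunks = [[] for _ in range(n_chunks)]
    for i, rec in enumerate(records):
        chunks[i % n_chunks].append(rec)       # round-robin assignment
    return ["".join(c) for c in chunks]

# Usage (hypothetical file): one chunk per worker node.
# for i, chunk in enumerate(split_fasta("queries.fasta", 4)):
#     open(f"chunk_{i}.fasta", "w").write(chunk)
```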
Facilitating Co-Design for Extreme-Scale Systems Through Lightweight Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Engelmann, Christian; Lauer, Frank
This work focuses on tools for investigating algorithm performance at extreme scale with millions of concurrent threads and for evaluating the impact of future architecture choices to facilitate the co-design of high-performance computing (HPC) architectures and applications. The approach focuses on lightweight simulation of extreme-scale HPC systems with the needed amount of accuracy. The prototype presented in this paper is able to provide this capability using a parallel discrete event simulation (PDES), such that a Message Passing Interface (MPI) application can be executed at extreme scale, and its performance properties can be evaluated. The results of an initial prototype are encouraging as a simple 'hello world' MPI program could be scaled up to 1,048,576 virtual MPI processes on a four-node cluster, and the performance properties of two MPI programs could be evaluated at up to 16,384 virtual MPI processes on the same system.
Lott, Gus K; Johnson, Bruce R; Bonow, Robert H; Land, Bruce R; Hoy, Ronald R
2009-01-01
We report on the real-time creation of an application for hands-on neurophysiology in an advanced undergraduate teaching laboratory. Enabled by the rapid software development tools included in the Matlab technical computing environment (The Mathworks, Natick, MA), a team, consisting of a neurophysiology educator and a biophysicist trained as an electrical engineer, interfaced to a course of approximately 15 students from engineering and biology backgrounds. The result is the powerful freeware data acquisition and analysis environment, "g-PRIME." The software was developed from week to week in response to curriculum demands, and student feedback. The program evolved from a simple software oscilloscope, enabling RC circuit analysis, to a suite of tools supporting analysis of neuronal excitability and synaptic transmission analysis in invertebrate model systems. The program has subsequently expanded in application to university courses, research, and high school projects in the US and abroad as free courseware.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Staley, Martin
2017-09-20
This high-performance ray tracing library provides very fast rendering; compact code; type flexibility through C++ "generic programming" techniques; and ease of use via an application programming interface (API) that operates independently of any GUI, on-screen display, or other enclosing application. Kip supports constructive solid geometry (CSG) models based on a wide variety of built-in shapes and logical operators, and also allows for user-defined shapes and operators to be provided. Additional features include basic texturing; input/output of models using a simple human-readable file format and with full error checking and detailed diagnostics; and support for shared data parallelism. Kip is written in pure, ANSI standard C++; is entirely platform independent; and is very easy to use. As a C++ "header only" library, it requires no build system, configuration or installation scripts, wizards, non-C++ preprocessing, makefiles, shell scripts, or external libraries.
Predictable Programming on a Precision Timed Architecture
2008-04-18
Application: A Video Game. Figure 6: Structure of the Video Game Example. Inspired by an example game supplied with the Hydra development board [17]...we implemented a simple video game in C targeted to our PRET architecture. Our example centers on rendering graphics and is otherwise fairly simple...background image. Figure 10: A Screen Dump From Our Video Game. Ultimately, each displayed pixel is one of only four colors, but the pixels in
Overcoming the Coupling Dilemma in DNA-Programmable Nanoparticle Assemblies by "Ag+ Soldering".
Wang, Huiqiao; Li, Yulin; Liu, Miao; Gong, Ming; Deng, Zhaoxiang
2015-05-20
Strong coupling between nanoparticles is critical for facilitating charge and energy transfers. Despite the great success of DNA-programmable nanoparticle assemblies, the very weak interparticle coupling represents a key barrier to various applications. Here, an extremely simple, fast, and highly efficient process combining DNA-programming and molecular/ionic bonding is developed to address this challenge, which exhibits a seamless fusion with DNA nanotechnology. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
1976-03-01
pseudo-range and range rate correlations, and GDM software efficiency. Other simplifications include the elimination of all or part of the multipath...signal is available. Then the pdf parameters are trivially available by simple mean, variance and correlation measurements on the quadrature signal...This report investigates the application of CSEL to the LES 8/9 and GPS satellite programs. In addition, a new analysis of the effects of soft and
Semivariogram modeling by weighted least squares
Jian, X.; Olea, R.A.; Yu, Y.-S.
1996-01-01
Permissible semivariogram models are fundamental for geostatistical estimation and simulation of attributes having a continuous spatiotemporal variation. The usual practice is to fit those models manually to experimental semivariograms. Fitting by weighted least squares produces comparable results to fitting manually in less time, systematically, and provides an Akaike information criterion for the proper comparison of alternative models. We illustrate the application of a computer program with examples showing the fitting of simple and nested models. Copyright © 1996 Elsevier Science Ltd.
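A minimal version of the fitting procedure can be sketched as follows: a spherical model fitted to an experimental semivariogram by weighted least squares, with weights built from the pair counts. The lag data are hypothetical and the weighting scheme shown is one common choice rather than necessarily the paper's exact formulation.

```python
# Sketch: weighted least-squares fit of a spherical semivariogram model to an
# experimental semivariogram. Weights here are pair counts divided by the squared
# model value (a Cressie-style choice); lag data are hypothetical.
import numpy as np
from scipy.optimize import least_squares

def spherical(h, nugget, sill, a):
    h = np.asarray(h, float)
    g = np.where(h < a,
                 nugget + sill * (1.5 * h / a - 0.5 * (h / a) ** 3),
                 nugget + sill)
    return np.where(h == 0, 0.0, g)

# Experimental semivariogram: lag distances, gamma(h), and pairs per lag.
lags   = np.array([10, 20, 30, 40, 50, 60, 70, 80], float)
gammas = np.array([0.21, 0.38, 0.52, 0.61, 0.67, 0.70, 0.71, 0.72])
npairs = np.array([40, 120, 180, 210, 230, 220, 200, 150], float)

def residuals(p):
    model = spherical(lags, *p)
    w = np.sqrt(npairs) / np.maximum(model, 1e-12)   # weighted residuals
    return w * (gammas - model)

fit = least_squares(residuals, x0=[0.1, 0.6, 50.0], bounds=([0, 0, 1], np.inf))
print("nugget, partial sill, range:", fit.x)
```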
A simple and sensitive method to measure timing accuracy.
De Clercq, Armand; Crombez, Geert; Buysse, Ann; Roeyers, Herbert
2003-02-01
Timing accuracy in presenting experimental stimuli (visual information on a PC or on a TV) and responding (keyboard presses and mouse signals) is of importance in several experimental paradigms. In this article, a simple system for measuring timing accuracy is described. The system uses two PCs (at least Pentium II, 200 MHz), a photocell, and an amplifier. No additional boards and timing hardware are needed. The first PC, a SlavePC, monitors the keyboard presses or mouse signals from the PC under test and uses a photocell that is placed in front of the screen to detect the appearance of visual stimuli on the display. The software consists of a small program running on the SlavePC. The SlavePC is connected through a serial line with a second PC. This MasterPC controls the SlavePC through an ActiveX control, which is used in a Visual Basic program. The accuracy of our system was investigated by using a similar setup of a SlavePC and a MasterPC to generate pulses and by using a pulse generator card. These tests revealed that our system has a 0.01-msec accuracy. As an illustration, the reaction time accuracy of INQUISIT for a few applications was tested using our system. It was found that in those applications that we investigated, INQUISIT measures reaction times from keyboard presses with millisecond accuracy.
Using NetCloak to develop server-side Web-based experiments without writing CGI programs.
Wolfe, Christopher R; Reyna, Valerie F
2002-05-01
Server-side experiments use the Web server, rather than the participant's browser, to handle tasks such as random assignment, eliminating inconsistencies with JAVA and other client-side applications. Heretofore, experimenters wishing to create server-side experiments have had to write programs to create common gateway interface (CGI) scripts in programming languages such as Perl and C++. NetCloak uses simple, HTML-like commands to create CGIs. We used NetCloak to implement an experiment on probability estimation. Measurements of time on task and participants' IP addresses assisted quality control. Without prior training, in less than 1 month, we were able to use NetCloak to design and create a Web-based experiment and to help graduate students create three Web-based experiments of their own.
Using Alice 2.0 to Design Games for People with Stroke.
Proffitt, Rachel; Kelleher, Caitlin; Baum, M Carolyn; Engsberg, Jack
2012-08-01
Computer and videogames are gaining in popularity as rehabilitation tools. Unfortunately, most systems still require extensive programming/engineering knowledge to create, something that therapists, as novice programmers, do not possess. There is software designed to allow novice programmers to create storyboard and games through simple drag-and-drop formats; however, the applications for therapeutic game development have not been studied. The purpose of this study was to have an occupational therapy (OT) student with no prior computer programming experience learn how to create computer games for persons with stroke using Alice 2.0, a drag-and-drop editor, designed by Carnegie Mellon University (Pittsburgh, PA). The OT student learned how to use Alice 2.0 through a textbook, tutorials, and assistance from computer science students. She kept a journal of her process, detailing her successes and challenges. The OT student created three games for people with stroke using Alice 2.0. She found that although there were many supports in Alice for creating stories, it lacked critical pieces necessary for game design. Her recommendations for a future programming environment for therapists were that it (1) be efficient, (2) include basic game design pieces so therapists do not have to create them, (3) provide technical support, and (4) be simple. With the incorporation of these recommendations, a future programming environment for therapists will be an effective tool for therapeutic game development.
Semilinear programming: applications and implementation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mohan, S.
Semilinear programming is a method of solving optimization problems with linear constraints where the non-negativity restrictions on the variables are dropped and the objective function coefficients can take on different values depending on whether the variable is positive or negative. The simplex method for linear programming is modified in this thesis to solve general semilinear and piecewise linear programs efficiently without having to transform them into equivalent standard linear programs. Several models in widely different areas of optimization such as production smoothing, facility locations, goal programming and L/sub 1/ estimation are presented first to demonstrate the compact formulation that arises when such problems are formulated as semilinear programs. A code SLP is constructed using the semilinear programming techniques. Problems in aggregate planning and L/sub 1/ estimation are solved using SLP and equivalent linear programs using a linear programming simplex code. Comparisons of CPU times and number of iterations indicate SLP to be far superior. The semilinear programming techniques are extended to piecewise linear programming in the implementation of the code PLP. Piecewise linear models in aggregate planning are solved using PLP and equivalent standard linear programs using a simple upper bounded linear programming code SUBLP.
Utilizing Linked Open Data Sources for Automatic Generation of Semantic Metadata
NASA Astrophysics Data System (ADS)
Nummiaho, Antti; Vainikainen, Sari; Melin, Magnus
In this paper we present an application that can be used to automatically generate semantic metadata for tags given as simple keywords. The application that we have implemented in Java programming language creates the semantic metadata by linking the tags to concepts in different semantic knowledge bases (CrunchBase, DBpedia, Freebase, KOKO, Opencyc, Umbel and/or WordNet). The steps that our application takes in doing so include detecting possible languages, finding spelling suggestions and finding meanings from amongst the proper nouns and common nouns separately. Currently, our application supports English, Finnish and Swedish words, but other languages could be included easily if the required lexical tools (spellcheckers, etc.) are available. The created semantic metadata can be of great use in, e.g., finding and combining similar contents, creating recommendations and targeting advertisements.
NASA Astrophysics Data System (ADS)
Korol, Roman; Kilgour, Michael; Segal, Dvira
2018-03-01
We present our in-house quantum transport package, ProbeZT. This program provides linear response coefficients: electrical and electronic thermal conductances, as well as the thermopower of molecular junctions in which electrons interact with the surrounding thermal environment. Calculations are performed based on the Büttiker probe method, which introduces decoherence, energy exchange and dissipation effects phenomenologically using virtual electrode terminals called probes. The program can realize different types of probes, each introducing various environmental effects, including elastic and inelastic scattering of electrons. The molecular system is described by an arbitrary tight-binding Hamiltonian, allowing the study of different geometries beyond simple one-dimensional wires. Applications of the program to study the thermoelectric performance of molecular junctions are illustrated. The program also has a built-in functionality to simulate electron transport in double-stranded DNA molecules based on a tight-binding (ladder) description of the junction.
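For context, the quantities such a package reports derive from Green's-function transmission calculations on a tight-binding Hamiltonian. The sketch below computes the coherent Landauer transmission of a short chain with wide-band leads and a constant imaginary self-energy standing in for probe broadening; a full Büttiker probe treatment, as in ProbeZT, additionally solves for probe potentials so that each probe carries zero net current, which this sketch omits.

```python
# Sketch: Landauer transmission through a tight-binding chain with wide-band leads,
# with a constant imaginary self-energy (-i*gamma_p/2) on every site standing in for
# probe-induced broadening. Solving for zero-net-current probe potentials (the full
# Buttiker probe scheme) is omitted here.
import numpy as np

def transmission(E, nsites=6, eps=0.0, t=-1.0, gamma_L=0.5, gamma_R=0.5, gamma_p=0.05):
    H = (np.diag([eps] * nsites)
         + np.diag([t] * (nsites - 1), 1)
         + np.diag([t] * (nsites - 1), -1))
    Gam_L = np.zeros((nsites, nsites)); Gam_L[0, 0] = gamma_L
    Gam_R = np.zeros((nsites, nsites)); Gam_R[-1, -1] = gamma_R
    # Total broadening: leads plus a probe on every site (wide-band limit).
    Sigma = -0.5j * (Gam_L + Gam_R + gamma_p * np.eye(nsites))
    G = np.linalg.inv(E * np.eye(nsites) - H - Sigma)       # retarded Green's function
    return np.trace(Gam_L @ G @ Gam_R @ G.conj().T).real    # T(E) = Tr[G_L G G_R G^dag]

for E in np.linspace(-2.5, 2.5, 11):
    print(f"E = {E:+.2f}   T(E) = {transmission(E):.4f}")
```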
At the Edge of Translation – Materials to Program Cells for Directed Differentiation
Arany, Praveen R; Mooney, David J
2010-01-01
The rapid advancement in basic biology knowledge, especially in the stem cell field, has created new opportunities to develop biomaterials capable of orchestrating the behavior of transplanted and host cells. Based on our current understanding of cellular differentiation, a conceptual framework for the use of materials to program cells in situ is presented, namely a domino versus a switchboard model, to highlight the use of single versus multiple cues in a controlled manner to modulate biological processes. Further, specific design principles of material systems to present soluble and insoluble cues that are capable of recruiting, programming and deploying host cells for various applications are presented. The evolution of biomaterials from simple inert substances used to fill defects, to the recent development of sophisticated material systems capable of programming cells in situ is providing a platform to translate our understanding of basic biological mechanisms to clinical care. PMID:20860763
CUTE: A Concolic Unit Testing Engine for C
2005-01-01
We also introduce program units of a simple C-like language (cf. [20]). We present how CUTE instruments programs and performs concolic execution. We...works for a simple C-like language shown in Figure 2. START represents the first statement of a program under test. Each statement has an optional...is a variable, c is a constant p ::= v = v | v 6= v | v < v | v ≤ v | v ≥ v | v > v Figure 2: Syntax of a simple C-like language the inputs at the
Experiments in optics for younger students by and for older students
NASA Astrophysics Data System (ADS)
Masi, James V.
1995-10-01
Under the auspices of a joint NSF/DOE grant for science and mathematics, the Electrical Engineering Department of the Engineering School at Western New England College developed a program of instruction in optics and optical applications for local Junior High School students. College level juniors and professors in the electrical engineering department, after the juniors had taken a one semester introductory course in optics and electro-optics, served as instructors in teaching and laboratory instruction in such diverse areas as solar cells/light detection, light sources, simple optics, optical fibers, liquid crystals, and lasers. Concepts such as seismic monitoring, Fourier transforms, power generation, information transfer, and many other applications were explained at level by the college students to the junior high school students with great effectiveness. Students at the lower level caught the enthusiasm of those at the upper level and learned with retention. Seven years into the program, the pros and cons are presented, the now- college bound students and their observations are detailed, and the learning experience for all is assessed, with scenarios for alternate programs suggested.
A Simple Measure of the Dynamics of Segmented Genomes: An Application to Influenza
NASA Astrophysics Data System (ADS)
Aris-Brosou, Stéphane
The severity of influenza epidemics, which can potentially become a pandemic, has been very difficult to predict. However, past efforts were focusing on gene-by-gene approaches, while it is acknowledged that the whole genome dynamics contribute to the severity of an epidemic. Here, putting this rationale into action, I describe a simple measure of the amount of reassortment that affects influenza at a genomic scale during a particular year. The analysis of 530 complete genomes of the H1N1 subtype, sampled over eleven years, shows that the proposed measure explains 58% of the variance in the prevalence of H1 influenza in the US population. The proposed measure, denoted nRF, could therefore improve influenza surveillance programs at a minimal cost.
Study of CdTe/CdS solar cell at low power density for low-illumination applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Devi, Nisha, E-mail: nishatanwer1989@gmail.com; Aziz, Anver, E-mail: aaziz@jmi.ac.in; Datta, Shouvik
In this paper, we numerically investigate CdTe/CdS PV cell properties using a simulation program Solar Cell Capacitance Simulator in 1D (SCAPS-1D). A simple structure of CdTe PV cell has been optimized to study the effect of temperature, absorber thickness and work function at very low incident power. Objective of this research paper is to build an efficient and cost effective solar cell for portable electronic devices such as portable computers and cell phones that work at low incident power because most of such devices work at diffused and reflected sunlight. In this report, we simulated a simple CdTe PV cellmore » at very low incident power, which gives good efficiency.« less
Simulator for multilevel optimization research
NASA Technical Reports Server (NTRS)
Padula, S. L.; Young, K. C.
1986-01-01
A computer program designed to simulate and improve multilevel optimization techniques is described. By using simple analytic functions to represent complex engineering analyses, the simulator can generate and test a large variety of multilevel decomposition strategies in a relatively short time. This type of research is an essential step toward routine optimization of large aerospace systems. The paper discusses the types of optimization problems handled by the simulator and gives input and output listings and plots for a sample problem. It also describes multilevel implementation techniques which have value beyond the present computer program. Thus, this document serves as a user's manual for the simulator and as a guide for building future multilevel optimization applications.
Building an open-source robotic stereotaxic instrument.
Coffey, Kevin R; Barker, David J; Ma, Sisi; West, Mark O
2013-10-29
This protocol includes the designs and software necessary to upgrade an existing stereotaxic instrument to a robotic (CNC) stereotaxic instrument for around $1,000 (excluding a drill), using industry standard stepper motors and CNC controlling software. Each axis has variable speed control and may be operated simultaneously or independently. The robot's flexibility and open coding system (g-code) make it capable of performing custom tasks that are not supported by commercial systems. Its applications include, but are not limited to, drilling holes, sharp edge craniotomies, skull thinning, and lowering electrodes or cannula. In order to expedite the writing of g-coding for simple surgeries, we have developed custom scripts that allow individuals to design a surgery with no knowledge of programming. However, for users to get the most out of the motorized stereotax, it would be beneficial to be knowledgeable in mathematical programming and G-Coding (simple programming for CNC machining). The recommended drill speed is greater than 40,000 rpm. The stepper motor resolution is 1.8°/Step, geared to 0.346°/Step. A standard stereotax has a resolution of 2.88 μm/step. The maximum recommended cutting speed is 500 μm/sec. The maximum recommended jogging speed is 3,500 μm/sec. The maximum recommended drill bit size is HP 2.
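A flavor of the custom scripts mentioned above: the sketch below emits a minimal g-code drilling program from a list of hole coordinates using only standard G0/G1 moves, with the feed rate set to the 500 μm/sec cutting speed recommended in the protocol. Coordinates and depths are hypothetical, and this is not the authors' actual script.

```python
# Sketch: generating a minimal g-code drilling program for a list of craniotomy
# coordinates. Coordinates and depths are hypothetical; only standard G0 (rapid)
# and G1 (feed) moves are used, and the feed equals 500 um/sec = 30 mm/min.
def drill_program(holes, safe_z=2.0, feed_um_per_sec=500):
    lines = ["G21 ; units in mm", "G90 ; absolute positioning"]
    feed_mm_per_min = feed_um_per_sec * 60 / 1000.0
    for x, y, depth in holes:
        lines.append(f"G0 Z{safe_z:.3f}")                      # retract to safe height
        lines.append(f"G0 X{x:.3f} Y{y:.3f}")                  # rapid move over the hole
        lines.append(f"G1 Z{-depth:.3f} F{feed_mm_per_min:.1f}")  # plunge at feed rate
        lines.append(f"G0 Z{safe_z:.3f}")                      # retract
    lines.append("M2 ; end of program")
    return "\n".join(lines)

# Example: three holes (x, y, depth) in mm relative to bregma (hypothetical values).
print(drill_program([(1.5, -2.0, 0.8), (2.5, -2.0, 0.8), (3.0, 0.5, 1.0)]))
```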
Recommendations for Hypersonic Boundary Layer Transition Flight Testing
NASA Technical Reports Server (NTRS)
Berry, Scott A.; Kimmel, Roger; Reshotko, Eli
2011-01-01
Much has been learned about the physics underlying the transition process at supersonic and hypersonic speeds through years of analysis, experiment and computation. Generally, the application of this knowledge has been restricted to simple shapes like plates, cones and spherical bodies. However, flight reentry vehicles are in reality never simple. They typically are highly complex geometries flown at angle of attack so three-dimensional effects are very important, as are roughness effects due to surface features and/or ablation. This paper will review our present understanding of the physics of the transition process and look back at some of the recent flight test programs for their successes and failures. The goal of this paper is to develop rationale for new hypersonic boundary layer transition flight experiments. Motivations will be derived from both an inward look at what we believe constitutes a good flight test program as well as an outward review of the goals and objectives of some recent US based unclassified proposals and programs. As part of our recommendations, this paper will address the need for careful experimental work as per the guidelines enunciated years ago by the U.S. Transition Study Group. Following these guidelines is essential to obtaining reliable, usable data for allowing refinement of transition estimation techniques.
Bouhaddou, Omar; Lincoln, Michael J.; Maulden, Sarah; Murphy, Holli; Warnekar, Pradnya; Nguyen, Viet; Lam, Siew; Brown, Steven H; Frankson, Ferdinand J.; Crandall, Glen; Hughes, Carla; Sigley, Roger; Insley, Marcia; Graham, Gail
2006-01-01
The Veterans Administration (VA) has adopted an ambitious program to standardize its clinical terminology to comply with industry-wide standards. The VA is using commercially available tools and in-house software to create a high-quality reference terminology system. The terminology will be used by current and future applications with no planned disruption to operational systems. The first large customer of the group is the national VA Health Data Repository (HDR). Unique enterprise identifiers are assigned to each standard term, and a rich network of semantic relationships makes the resulting data not only recognizable, but highly computable and reusable in a variety of applications, including decision support and data sharing with partners such as the Department of Defense (DoD). This paper describes the specific methods and approaches that the VA has employed to develop and implement this innovative program in existing information system. The goal is to share with others our experience with key issues that face our industry as we move toward an electronic health record for every individual. PMID:17238306
NASA Astrophysics Data System (ADS)
Tanner, Meghan; Eckel, Ryan; Senevirathne, Indrajith
The versatility, simplicity, and robustness of the Arduino microcontroller architecture have won a huge following, with increasingly serious engineering and physical science applications. The Arduino microcontroller environment, coupled with commercially available sensors, has been used to systematically measure, record, and analyze low currents, low voltages and corresponding dissipated power for assessing secondary physical properties in a diverse array of engineering systems. The setup was assembled via breadboard, wire, and simple soldering, with an Arduino Uno (ATmega328P microcontroller) connected to a PC. The microcontroller was programmed with the Arduino software, while the bootloader was used to upload the code. The commercial Hall effect current sensor module ACS712 and the INA169 current shunt monitor were used to measure corresponding low to ultra-low currents and voltages. Stable measurement data were obtained via the sensors and compared with corresponding oscilloscope measurements to assess reliability and uncertainty. Sensor breakout boards were modified to enhance the sensitivity of the measurements and to expand the applicability. Discussion of these measurements will focus on capabilities, capacities and limitations of the systems, with examples of possible applications. Lock Haven Nanotechnology Program.
WaveJava: Wavelet-based network computing
NASA Astrophysics Data System (ADS)
Ma, Kun; Jiao, Licheng; Shi, Zhuoer
1997-04-01
Wavelet theory is powerful, but its successful application still needs suitable programming tools. Java is a simple, object-oriented, distributed, interpreted, robust, secure, architecture-neutral, portable, high-performance, multi-threaded, dynamic language. This paper addresses the design and development of a cross-platform software environment for experimenting with and applying wavelet theory. WaveJava, a wavelet class library designed with object-oriented programming, is developed to take advantage of wavelet features such as multi-resolution analysis and parallel processing in network computing. A new application architecture is designed for the net-wide distributed client-server environment. The data are transmitted as multi-resolution packets. At the distributed sites around the net, these data packets undergo matching or recognition processing in parallel. The results are fed back to determine the next operation, so more robust results can be reached quickly. WaveJava is easy to use and to extend for special applications. This paper gives a solution for a distributed fingerprint information processing system. It also fits other net-based multimedia information processing, such as a network library, remote teaching, and filmless picture archiving and communications.
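The multi-resolution analysis such a class library provides can be illustrated with the simplest wavelet. The sketch below performs one level of a Haar decomposition; it shows the underlying idea only and is unrelated to WaveJava's API.

```python
# Sketch: one level of a Haar wavelet decomposition, the simplest example of the
# multi-resolution analysis described above (not WaveJava's API, just the idea).
import numpy as np

def haar_step(signal):
    s = np.asarray(signal, float)          # length must be even
    approx = (s[0::2] + s[1::2]) / np.sqrt(2.0)   # low-pass: coarse approximation
    detail = (s[0::2] - s[1::2]) / np.sqrt(2.0)   # high-pass: detail coefficients
    return approx, detail

x = np.array([4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0])
approx, detail = haar_step(x)
print("approximation:", approx)
print("detail:       ", detail)
```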
T-Check in Technologies for Interoperability: Web Services and Security--Single Sign-On
2007-12-01
following tools: • Apache Tomcat 6.0, a Java Servlet container to host the Web services and a simple Web client application [Apache 2007a] • Apache Axis...Eclipse, an open development platform, http://www.eclipse.org/ (2007). [Hunter 2001] Hunter, Jason. Java Servlet Programming, 2nd Edition...Cited SAML implementations: the SAML 1.1 Java Toolkit, Ping Identity's SAML 1.1 implementation [SourceID 2006], and OpenSAML, an open source implementation of SAML 1.1
Current trends for customized biomedical software tools.
Khan, Haseeb Ahmad
2017-01-01
In the past, biomedical scientists were solely dependent on expensive commercial software packages for various applications. However, the advent of user-friendly programming languages and open source platforms has revolutionized the development of simple and efficient customized software tools for solving specific biomedical problems. Many of these tools are designed and developed by biomedical scientists independently or with the support of computer experts and often made freely available for the benefit of scientific community. The current trends for customized biomedical software tools are highlighted in this short review.
NASA Astrophysics Data System (ADS)
Bartoletti, Massimo
Usage automata are an extension of finite state automata, with some additional features (e.g. parameters and guards) that improve their expressivity. Usage automata are expressive enough to model security requirements of real-world applications; at the same time, they are simple enough to be amenable to static analysis, e.g. they can be model-checked against abstractions of program usages. We study here some foundational aspects of usage automata. In particular, we discuss their expressive power, and their effective use in run-time mechanisms for enforcing usage policies.
Survival analysis in telemetry studies: The staggered entry design
Pollock, K.H.; Winterstein, S.R.; Bunck, C.M.; Curtis, P.D.
1989-01-01
A simple description of the Kaplan-Meier procedure is presented with an example using northern bobwhite quail survival data. The Kaplan-Meier procedure was then generalized to allow gradual (or staggered) entry of animals into the study and to allow for animals being lost (or censored) due to radio failure, radio loss, or emigration of the animal from the study area. Additionally, the applicability and generalization of the log rank test, a test to compare two survival distributions, was demonstrated. A computer program was developed and is available from the authors.
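A minimal version of the generalized estimator can be sketched directly: at each observed death time the risk set contains only animals that have already entered and not yet left the study. The data below are hypothetical, and the published program's exact interface is not reproduced.

```python
# Sketch: Kaplan-Meier survival with staggered (delayed) entry and right censoring,
# as described above. Each animal has an entry time, an exit time, and a flag for
# whether the exit was a death (True) or a censoring event such as radio failure.
def kaplan_meier_staggered(records):
    death_times = sorted({exit for entry, exit, died in records if died})
    surv, S = [], 1.0
    for t in death_times:
        at_risk = sum(1 for entry, exit, died in records if entry < t <= exit)
        deaths = sum(1 for entry, exit, died in records if died and exit == t)
        if at_risk > 0:
            S *= 1.0 - deaths / at_risk          # product-limit update
        surv.append((t, S))
    return surv

# Hypothetical radio-tagged bobwhite data: (week entered, week left study, death?).
records = [(0, 5, True), (0, 12, False), (2, 8, True), (3, 12, False),
           (4, 9, True), (6, 12, False), (7, 10, True)]
for t, S in kaplan_meier_staggered(records):
    print(f"week {t}: S(t) = {S:.3f}")
```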
NASA Technical Reports Server (NTRS)
Barger, R. L.; Walters, R. W.
1986-01-01
Some path-following techniques are described and compared with other methods. Use of multipurpose techniques that can be used at more than one stage of the path-following computation results in a system that is relatively simple to understand, program, and use. Comparison of path-following methods with the method of parametric differentiation reveals definite advantages for the path-following methods. The fact that parametric differentiation has found a broader range of applications indicates that path-following methods have been underutilized.
Application of See One, Do One, Teach One Concept in Surgical Training
Kotsis, Sandra V.; Chung, Kevin C.
2016-01-01
Background The traditional method of teaching in Surgery is known as “See One, Do One, Teach One.” However, many have argued that this method is no longer applicable mainly because of concerns for patient safety. The purpose of this paper is to show that the basis of the traditional teaching method is still valid in surgical training if it is combined with various adult learning principles. Methods We reviewed literature regarding the history of the formation of the surgical residency program, adult learning principles, mentoring, and medical simulation. We provide examples for how these learning techniques can be incorporated into a surgical resident training program. Results The surgical residency program created by Dr. William Halsted remained virtually unchanged until recently with reductions in resident work hours and changes to a competency-based training system. Such changes have reduced the teaching time between attending physicians and residents. Learning principles such as “Experience, Observation, Thinking and Action” as well as deliberate practice can be used to train residents. Mentoring is also an important aspect in teaching surgical technique. We review the different types of simulators: standardized patients, virtual reality applications, and high-fidelity mannequin simulators and the advantages and disadvantages of using them. Conclusions The traditional teaching method of “see one, do one, teach one” in surgical residency programs is simple but still applicable. It needs to evolve with current changes in the medical system to adequately train surgical residents and also provide patients with safe, evidence-based care. PMID:23629100
Simple webs of natural environment theme as a result of sharing in science teacher training
NASA Astrophysics Data System (ADS)
Tapilouw, M. C.; Firman, H.; Redjeki, S.; Chandra, D. T.
2018-03-01
Thematic learning is one type of integrated science (Biology, Physics, Chemistry and Earth Science) in science education. This study concerns simple webs of the natural environment theme in science learning, used as one of the training materials in a science teacher training program. Making a simple web is the goal of the first step of the teacher training program. Each group explained its web illustration to the other groups. Twenty junior high school science teachers from one education foundation participated in the science teacher training program. In order to gather the simple webs, a sharing method was used in this first step of the science teacher training. The result of this study is five different simple webs of the natural environment theme. These webs represent science learning in class VII/Semester I, class VII/Semester II, class VIII, class IX/Semester I, and class IX/Semester II, based on the basic competencies in the National Curriculum 2013. Each group discussed its web of the natural environment theme based on the members' learning experience in real classes, in which basic competencies and subject matter are linked with the natural environment theme. In conclusion, the simple webs have the potential to be developed in the next step of the science teacher training program and to be implemented in real classes.
Microcomputer-Assisted Mathematics: From Simple Interest to e.
ERIC Educational Resources Information Center
Kimberling, Clark
1985-01-01
The progression from simple interest to compound interest leads naturally and quickly to the number e, involving mathematical discovery learning through writing programs. Several programs are given, with suggestions for a teaching sequence. (MNS)
Commercial Complexity and Local and Global Involvement in Programs: Effects on Viewer Responses.
ERIC Educational Resources Information Center
Oberman, Heiko; Thorson, Esther
A study investigated the effects of local (momentary) and global (whole program) involvement in program context and the effects of message complexity on the retention of television commercials. Sixteen commercials, categorized as simple video/simple audio through complex video/complex audio were edited into two globally high- and two globally…
Dong, Zhao; Nath, Anjali; Guo, Jing; Bhaumik, Urmi; Chin, May Y; Dong, Sherry; Marshall, Erica; Murphy, Johnna S; Sandel, Megan T; Sommer, Susan J; Ursprung, W W Sanouri; Woods, Elizabeth R; Reid, Margaret; Adamkiewicz, Gary
2018-01-01
To test the applicability of the Environmental Scoring System, a quick and simple approach for quantitatively measuring environmental triggers collected during home visits, and to evaluate its contribution to improving asthma outcomes among various child asthma programs. We pooled and analyzed data from multiple child asthma programs in the Greater Boston Area, Massachusetts, collected in 2011 to 2016, to examine the association of environmental scores (ES) with measures of asthma outcomes and compare the results across programs. Our analysis showed that demographics were important contributors to variability in asthma outcomes and total ES, and largely explained the differences among programs at baseline. Among all programs in general, we found that asthma outcomes were significantly improved and total ES significantly reduced over visits, with the total Asthma Control Test score negatively associated with total ES. Our study demonstrated that the Environmental Scoring System is a useful tool for measuring home asthma triggers and can be applied regardless of program and survey designs, and that demographics of the target population may influence the improvement in asthma outcomes.
MPF: A portable message passing facility for shared memory multiprocessors
NASA Technical Reports Server (NTRS)
Malony, Allen D.; Reed, Daniel A.; Mcguire, Patrick J.
1987-01-01
The design, implementation, and performance evaluation of a message passing facility (MPF) for shared memory multiprocessors are presented. The MPF is based on a message passing model conceptually similar to conversations. Participants (parallel processors) can enter or leave a conversation at any time. The message passing primitives for this model are implemented as a portable library of C function calls. The MPF is currently operational on a Sequent Balance 21000, and several parallel applications were developed and tested. Several simple benchmark programs are presented to establish interprocess communication performance for common patterns of interprocess communication. Finally, performance figures are presented for two parallel applications, linear systems solution, and iterative solution of partial differential equations.
Developing a protocol for creating microfluidic devices with a 3D printer, PDMS, and glass
NASA Astrophysics Data System (ADS)
Collette, Robyn; Novak, Eric; Shirk, Kathryn
2015-03-01
Microfluidics research requires the design and fabrication of devices that have the ability to manipulate small volumes of fluid, typically ranging from microliters to picoliters. These devices are used for a wide range of applications including the assembly of materials and testing of biological samples. Many methods have been previously developed to create microfluidic devices, including traditional nanolithography techniques. However, these traditional techniques are cost-prohibitive for many small-scale laboratories. This research explores a relatively low-cost technique using a 3D printed master, which is used as a template for the fabrication of polydimethylsiloxane (PDMS) microfluidic devices. The masters are designed using computer aided design (CAD) software and can be printed and modified relatively quickly. We have developed a protocol for creating simple microfluidic devices using a 3D printer and PDMS adhered to glass. This relatively simple and lower-cost technique can now be scaled to more complicated device designs and applications. Funding provided by the Undergraduate Research Grant Program at Shippensburg University and the Student/Faculty Research Engagement Grants from the College of Arts and Sciences at Shippensburg University.
BiDiBlast: comparative genomics pipeline for the PC.
de Almeida, João M G C F
2010-06-01
Bi-directional BLAST is a simple approach to detect, annotate, and analyze candidate orthologous or paralogous sequences in a single go. This procedure is usually confined to the realm of customized Perl scripts, usually tuned for UNIX-like environments. Porting those scripts to other operating systems involves refactoring them, and also the installation of the Perl programming environment with the required libraries. To overcome these limitations, a data pipeline was implemented in Java. This application submits two batches of sequences to local versions of the NCBI BLAST tool, manages result lists, and refines both bi-directional and simple hits. GO Slim terms are attached to hits, several statistics are derived, and molecular evolution rates are estimated through PAML. The results are written to a set of delimited text tables intended for further analysis. The provided graphic user interface allows a friendly interaction with this application, which is documented and available to download at http://moodle.fct.unl.pt/course/view.php?id=2079 or https://sourceforge.net/projects/bidiblast/ under the GNU GPL license. Copyright 2010 Beijing Genomics Institute. Published by Elsevier Ltd. All rights reserved.
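As a hedged sketch of the core idea (not the BiDiBlast Java code itself), reciprocal best hits can be extracted from the two BLAST result tables with a few lines of Python; the layout below assumes BLAST's tabular output with the bit score in the last column, and the file names are placeholders:

    import csv

    def best_hits(path):
        """Map each query to its highest-scoring subject.
        Assumes BLAST tabular output: qseqid, sseqid, ..., bitscore."""
        best = {}
        with open(path) as handle:
            for row in csv.reader(handle, delimiter="\t"):
                query, subject, score = row[0], row[1], float(row[-1])
                if query not in best or score > best[query][1]:
                    best[query] = (subject, score)
        return best

    forward = best_hits("genomeA_vs_genomeB.tsv")   # placeholder file names
    reverse = best_hits("genomeB_vs_genomeA.tsv")

    # A pair is a candidate ortholog when each sequence is the other's best hit.
    reciprocal = [(a, b) for a, (b, _) in forward.items()
                  if reverse.get(b, (None,))[0] == a]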
XAFS Data Interchange: A single spectrum XAFS data file format.
Ravel, B; Newville, M
We propose a standard data format for the interchange of XAFS data. The XAFS Data Interchange (XDI) standard is meant to encapsulate a single spectrum of XAFS along with relevant metadata. XDI is a text-based format with a simple syntax which clearly delineates metadata from the data table in a way that is easily interpreted both by a computer and by a human. The metadata header is inspired by the format of an electronic mail header, representing metadata names and values as an associative array. The data table is represented as columns of numbers. This format can be imported as is into most existing XAFS data analysis, spreadsheet, or data visualization programs. Along with a specification and a dictionary of metadata types, we provide an application-programming interface written in C and bindings for programming dynamic languages.
XAFS Data Interchange: A single spectrum XAFS data file format
NASA Astrophysics Data System (ADS)
Ravel, B.; Newville, M.
2016-05-01
We propose a standard data format for the interchange of XAFS data. The XAFS Data Interchange (XDI) standard is meant to encapsulate a single spectrum of XAFS along with relevant metadata. XDI is a text-based format with a simple syntax which clearly delineates metadata from the data table in a way that is easily interpreted both by a computer and by a human. The metadata header is inspired by the format of an electronic mail header, representing metadata names and values as an associative array. The data table is represented as columns of numbers. This format can be imported as is into most existing XAFS data analysis, spreadsheet, or data visualization programs. Along with a specification and a dictionary of metadata types, we provide an application-programming interface written in C and bindings for programming dynamic languages.
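The header/data split described above needs very little code to consume. The sketch below parses a hypothetical XDI-like file in which header lines start with '#' and hold 'Name: value' pairs, followed by whitespace-separated numeric columns; the comment character and example field syntax are assumptions for illustration, not the normative specification:

    def read_xdi_like(path):
        """Split a simple header-plus-columns text file into metadata and data."""
        metadata, rows = {}, []
        with open(path) as handle:
            for line in handle:
                line = line.strip()
                if not line:
                    continue
                if line.startswith("#"):
                    # Header lines are assumed to look like "# Element.symbol: Cu".
                    body = line.lstrip("#").strip()
                    if ":" in body:
                        name, value = body.split(":", 1)
                        metadata[name.strip()] = value.strip()
                else:
                    rows.append([float(x) for x in line.split()])
        return metadata, rows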
Compiled MPI: Cost-Effective Exascale Applications Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bronevetsky, G; Quinlan, D; Lumsdaine, A
2012-04-10
The complexity of petascale and exascale machines makes it increasingly difficult to develop applications that can take advantage of them. Future systems are expected to feature billion-way parallelism, complex heterogeneous compute nodes and poor availability of memory (Peter Kogge, 2008). This new challenge for application development is motivating a significant amount of research and development on new programming models and runtime systems designed to simplify large-scale application development. Unfortunately, DoE has significant multi-decadal investment in a large family of mission-critical scientific applications. Scaling these applications to exascale machines will require a significant investment that will dwarf the costs of hardware procurement. A key reason for the difficulty in transitioning today's applications to exascale hardware is their reliance on explicit programming techniques, such as the Message Passing Interface (MPI) programming model to enable parallelism. MPI provides a portable and high performance message-passing system that enables scalable performance on a wide variety of platforms. However, it also forces developers to lock the details of parallelization together with application logic, making it very difficult to adapt the application to significant changes in the underlying system. Further, MPI's explicit interface makes it difficult to separate the application's synchronization and communication structure, reducing the amount of support that can be provided by compiler and run-time tools. This is in contrast to the recent research on more implicit parallel programming models such as Chapel, OpenMP and OpenCL, which promise to provide significantly more flexibility at the cost of reimplementing significant portions of the application. We are developing CoMPI, a novel compiler-driven approach to enable existing MPI applications to scale to exascale systems with minimal modifications that can be made incrementally over the application's lifetime. It includes: (1) New set of source code annotations, inserted either manually or automatically, that will clarify the application's use of MPI to the compiler infrastructure, enabling greater accuracy where needed; (2) A compiler transformation framework that leverages these annotations to transform the original MPI source code to improve its performance and scalability; (3) Novel MPI runtime implementation techniques that will provide a rich set of functionality extensions to be used by applications that have been transformed by our compiler; and (4) A novel compiler analysis that leverages simple user annotations to automatically extract the application's communication structure and synthesize most complex code annotations.
A beginner's guide to Pickett's SPCAT/SPFIT
NASA Astrophysics Data System (ADS)
Novick, Stewart E.
2016-11-01
Two of the most powerful and versatile high resolution spectroscopic predicting and fitting programs are SPCAT/SPFIT first presented by Herbert Pickett in 1991 and refined, expanded, and updated by Herb until his retirement from the Jet Propulsion Laboratory (JPL) in 2008. With versatility, unfortunately, comes complexity. The purpose of this paper is to present for the beginning spectroscopist (or the seasoned spectroscopist unfamiliar with these programs) a simple introduction to SPCAT/SPFIT. I will not be presenting the most powerful and sophisticated uses of these programs. I leave that for future articles, not necessarily by me. This paper outlines the file structures of the input and output files of the programs and a simple tutorial on how to run the programs. Simple examples are worked out, supported by a website containing the files and notes on more complex uses of the program.
Efficient parallel architecture for highly coupled real-time linear system applications
NASA Technical Reports Server (NTRS)
Carroll, Chester C.; Homaifar, Abdollah; Barua, Soumavo
1988-01-01
A systematic procedure is developed for exploiting the parallel constructs of computation in a highly coupled, linear system application. An overall top-down design approach is adopted. Differential equations governing the application under consideration are partitioned into subtasks on the basis of a data flow analysis. The interconnected task units constitute a task graph which has to be computed in every update interval. Multiprocessing concepts utilizing parallel integration algorithms are then applied for efficient task graph execution. A simple scheduling routine is developed to handle task allocation while in the multiprocessor mode. Results of simulation and scheduling are compared on the basis of standard performance indices. Processor timing diagrams are developed on the basis of program output accruing to an optimal set of processors. Basic architectural attributes for implementing the system are discussed together with suggestions for processing element design. Emphasis is placed on flexible architectures capable of accommodating widely varying application specifics.
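As a hedged illustration of the kind of simple scheduling routine described (not the authors' implementation), a greedy list scheduler assigns each ready task of the task graph to the earliest-available processor:

    import heapq

    def list_schedule(tasks, deps, cost, n_proc):
        """Greedy list scheduling of a task graph.
        tasks: task ids; deps: task -> set of predecessors;
        cost: task -> execution time; n_proc: number of processors."""
        indegree = {t: len(deps.get(t, ())) for t in tasks}
        finish, schedule = {}, []
        ready = [t for t in tasks if indegree[t] == 0]
        procs = [(0.0, p) for p in range(n_proc)]   # (time processor frees up, id)
        heapq.heapify(procs)
        while ready:
            task = ready.pop(0)
            avail, proc = heapq.heappop(procs)
            start = max([avail] + [finish[p] for p in deps.get(task, ())])
            finish[task] = start + cost[task]
            schedule.append((task, proc, start, finish[task]))
            heapq.heappush(procs, (finish[task], proc))
            for succ, preds in deps.items():        # release successors
                if task in preds:
                    indegree[succ] -= 1
                    if indegree[succ] == 0:
                        ready.append(succ)
        return schedule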
Application of nomographs for analysis and prediction of receiver spurious response EMI
NASA Astrophysics Data System (ADS)
Heather, F. W.
1985-07-01
Spurious response EMI for the front end of a superheterodyne receiver follows a simple mathematical formula; however, the application of the formula to predict test frequencies produces more data than can be evaluated. An analysis technique has been developed to graphically depict all receiver spurious responses using a nomograph and to permit selection of optimum test frequencies. The discussion includes the math model used to simulate a superheterodyne receiver, the implementation of the model in the computer program, the approach to test frequency selection, interpretation of the nomographs, analysis and prediction of receiver spurious response EMI from the nomographs, and application of the nomographs. In addition, figures of sample applications are provided. This EMI analysis and prediction technique greatly improves the Electromagnetic Compatibility (EMC) test engineer's ability to visualize the scope of receiver spurious response EMI testing and optimize test frequency selection.
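The simple mathematical formula referred to is the standard mixer-product relation: a spurious response can occur wherever harmonic orders (m, n) satisfy |m·f_LO − n·f_RF| = f_IF. A hedged sketch (not the original program) that enumerates candidate spurious-response frequencies:

    def spurious_responses(f_lo, f_if, max_order=10):
        """RF frequencies satisfying |m*f_lo - n*f_rf| = f_if for low-order
        mixer harmonics; the order limit is an arbitrary choice."""
        hits = set()
        for m in range(1, max_order + 1):
            for n in range(1, max_order + 1):
                for sign in (+1, -1):
                    f_rf = (m * f_lo + sign * f_if) / n
                    if f_rf > 0:
                        hits.add((m, n, round(f_rf, 3)))
        return sorted(hits, key=lambda r: r[2])

    # Example with hypothetical values: 1000 MHz LO, 70 MHz IF, orders up to 3.
    for m, n, f in spurious_responses(1000.0, 70.0, max_order=3):
        print(f"m={m} n={n} f_RF={f} MHz")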
Flexible Architecture for FPGAs in Embedded Systems
NASA Technical Reports Server (NTRS)
Clark, Duane I.; Lim, Chester N.
2012-01-01
Commonly, field-programmable gate arrays (FPGAs) being developed in cPCI embedded systems include the bus interface in the FPGA. This complicates the development because the interface is complicated and requires a lot of development time and FPGA resources. In addition, flight qualification requires a substantial amount of time be devoted to just this interface. Another complication of putting the cPCI interface into the FPGA being developed is that configuration information loaded into the device by the cPCI microprocessor is lost when a new bit file is loaded, requiring cumbersome operations to return the system to an operational state. Finally, SRAM-based FPGAs are typically programmed via specialized cables and software, with programming files being loaded either directly into the FPGA, or into PROM devices. This can be cumbersome when doing FPGA development in an embedded environment, and does not have an easy path to flight. Currently, FPGAs used in space applications are usually programmed via multiple space-qualified PROM devices that are physically large and require extra circuitry (typically including a separate one-time programmable FPGA) to enable them to be used for this application. This technology adds a cPCI interface device with a simple, flexible, high-performance backend interface supporting multiple backend FPGAs. It includes a mechanism for programming the FPGAs directly via the microprocessor in the embedded system, eliminating specialized hardware, software, and PROM devices and their associated circuitry. It has a direct path to flight, and no extra hardware and minimal software are required to support reprogramming in flight. The device added is currently a small FPGA, but an advantage of this technology is that the design of the device does not change, regardless of the application in which it is being used. This means that it needs to be qualified for flight only once, and is suitable for one-time programmable devices or an application specific integrated circuit (ASIC). An application programming interface (API) further reduces the development time needed to use the interface device in a system.
NASA Astrophysics Data System (ADS)
Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusof, Z.; Tehrany, M. S.
2014-10-01
Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, BSM (bivariate statistical modeler), for the BSA technique is proposed. Three popular BSA techniques, namely the frequency ratio, weights-of-evidence, and evidential belief function models, are applied in the newly proposed ArcMAP tool. The tool is programmed in Python and provides a simple graphical user interface, which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
NASA Astrophysics Data System (ADS)
Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusoff, Z. M.; Tehrany, M. S.
2015-03-01
Modelling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modelling. Bivariate statistical analysis (BSA) assists in hazard modelling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, the bivariate statistical modeler (BSM), for the BSA technique is proposed. Three popular BSA techniques, namely the frequency ratio, weight-of-evidence (WoE), and evidential belief function (EBF) models, are applied in the newly proposed ArcMAP tool. The tool is programmed in Python and provides a simple graphical user interface (GUI), which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
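As a hedged sketch of one of the three techniques (the frequency ratio model, written in plain Python rather than as the ArcMAP tool), the ratio compares the share of hazard occurrences falling in each factor class with the share of total study area in that class:

    from collections import Counter

    def frequency_ratio(class_of_pixel, hazard_pixels):
        """Frequency ratio per factor class.
        class_of_pixel: dict pixel_id -> class label for the whole study area.
        hazard_pixels: pixel ids where the hazard (e.g. a landslide) occurred."""
        area = Counter(class_of_pixel.values())
        hazard = Counter(class_of_pixel[p] for p in hazard_pixels)
        total_area, total_hazard = sum(area.values()), sum(hazard.values())
        return {c: (hazard.get(c, 0) / total_hazard) / (area[c] / total_area)
                for c in area}

Values above 1 indicate classes in which the hazard is over-represented relative to their area.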
A Simple and Effective Program to Increase Faculty Knowledge of and Referrals to Counseling Centers
ERIC Educational Resources Information Center
Nolan, Susan A.; Pace, Kristi A.; Iannelli, Richard J.; Palma, Thomas V.; Pakalns, Gail P.
2006-01-01
The authors describe a simple, cost-effective, and empirically supported program to increase faculty referrals of students to counseling centers (CCs). Incoming faculty members at 3 universities received a mailing and personal telephone call from a CC staff member. Faculty assigned to the outreach program had greater knowledge of and rates of…
A new algorithm to reduce noise in microscopy images implemented with a simple program in python.
Papini, Alessio
2012-03-01
All microscopy images contain noise, which increases as the instrument (e.g., a transmission electron microscope or light microscope) approaches its resolution limit. Many methods are available to reduce noise; one of the most commonly used is image averaging. We propose here to use the mode of pixel values instead. Simple Python programs process a given number of images recorded consecutively from the same subject. The programs calculate the mode of the pixel values in a given position (a, b). The result is a new image containing in (a, b) the mode of those values. Therefore, the final pixel value corresponds to one read in at least two of the images at position (a, b). The application of the program to a set of images degraded with salt-and-pepper noise and GIMP hurl noise with 10-90% standard deviation showed that the mode performs better than averaging with three to eight images. The data suggest that the mode would be more efficient (in the sense of a lower number of recorded images to process to reduce noise below a given limit) for a lower number of total noisy pixels and high standard deviation (as with impulse noise and salt-and-pepper noise), while averaging would be more efficient when the number of varying pixels is high and the standard deviation is low, as in many cases of images affected by Gaussian noise. The two methods may be used serially. Copyright © 2011 Wiley Periodicals, Inc.
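A minimal NumPy sketch of the described approach (not the author's original script): stack N registered frames of the same subject and take the per-pixel mode.

    import numpy as np

    def pixel_mode(frames):
        """Per-pixel mode of a stack of equally sized 8-bit grayscale images.
        frames: array-like of shape (N, H, W) with integer values 0-255."""
        stack = np.asarray(frames)
        # For every pixel, pick the most frequently observed value among the N frames.
        return np.apply_along_axis(
            lambda v: np.bincount(v, minlength=256).argmax(), 0, stack
        ).astype(np.uint8)

    # Usage: denoised = pixel_mode([img1, img2, img3])  # three consecutive recordings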
NASA Astrophysics Data System (ADS)
Liao, Haitao; Wu, Wenwang; Fang, Daining
2018-07-01
A coupled approach combining the reduced space Sequential Quadratic Programming (SQP) method with the harmonic balance condensation technique for finding the worst resonance response is developed. The nonlinear equality constraints of the optimization problem are imposed on the condensed harmonic balance equations. Making use of the null space decomposition technique, the original optimization formulation in the full space is mathematically simplified, and solved in the reduced space by means of the reduced SQP method. The transformation matrix that maps the full space to the null space of the constrained optimization problem is constructed via the coordinate basis scheme. The removal of the nonlinear equality constraints is accomplished, resulting in a simple optimization problem subject to bound constraints. Moreover, second order correction technique is introduced to overcome Maratos effect. The combination application of the reduced SQP method and condensation technique permits a large reduction of the computational cost. Finally, the effectiveness and applicability of the proposed methodology is demonstrated by two numerical examples.
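In general terms (independent of the harmonic balance application, and stated here only as the standard reduced-space construction), the null-space decomposition referred to works as follows. For

    \min_{x} f(x) \quad \text{subject to} \quad c(x) = 0,

let A_k be the constraint Jacobian at iterate x_k and Z_k a basis of its null space (A_k Z_k = 0). Writing the step as p = Y_k p_Y + Z_k p_Z, the linearized constraints A_k p = -c(x_k) determine p_Y, and only the reduced quadratic subproblem

    \min_{p_Z} \; (Z_k^{\top} g_k)^{\top} p_Z + \tfrac{1}{2}\, p_Z^{\top} (Z_k^{\top} H_k Z_k)\, p_Z

remains, subject to the bound constraints, where g_k and H_k are the gradient and (approximate) Hessian of the Lagrangian.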
Nonlinear least-squares data fitting in Excel spreadsheets.
Kemmer, Gerdi; Keller, Sandro
2010-02-01
We describe an intuitive and rapid procedure for analyzing experimental data by nonlinear least-squares fitting (NLSF) in the most widely used spreadsheet program. Experimental data in x/y form and data calculated from a regression equation are inputted and plotted in a Microsoft Excel worksheet, and the sum of squared residuals is computed and minimized using the Solver add-in to obtain the set of parameter values that best describes the experimental data. The confidence of best-fit values is then visualized and assessed in a generally applicable and easily comprehensible way. Every user familiar with the most basic functions of Excel will be able to implement this protocol, without previous experience in data fitting or programming and without additional costs for specialist software. The application of this tool is exemplified using the well-known Michaelis-Menten equation characterizing simple enzyme kinetics. Only slight modifications are required to adapt the protocol to virtually any other kind of dataset or regression equation. The entire protocol takes approximately 1 h.
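For readers working outside Excel, the same Michaelis-Menten example can be reproduced in a few lines; the sketch below is an analogous illustration using SciPy's nonlinear least-squares fitter, not the published spreadsheet protocol, and the data values are made up:

    import numpy as np
    from scipy.optimize import curve_fit

    def michaelis_menten(s, vmax, km):
        """Initial rate v as a function of substrate concentration s."""
        return vmax * s / (km + s)

    s = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0])   # hypothetical concentrations
    v = np.array([0.9, 1.6, 2.9, 3.8, 4.4, 4.8])      # hypothetical measured rates

    params, cov = curve_fit(michaelis_menten, s, v, p0=(5.0, 5.0))
    vmax, km = params
    perr = np.sqrt(np.diag(cov))                       # rough 1-sigma uncertainties
    print(f"Vmax = {vmax:.2f} +/- {perr[0]:.2f}, Km = {km:.2f} +/- {perr[1]:.2f}")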
A Tool and Application Programming Interface for Browsing Historical Geostationary Satellite Data
NASA Astrophysics Data System (ADS)
Chee, T.; Nguyen, L.; Minnis, P.; Spangenberg, D.; Ayers, J.
2013-12-01
Providing access to information is a key concern for NASA Langley Research Center. We describe a tool and method that allow end users to easily browse and access information that is otherwise difficult to acquire and manipulate. The tool described has as its core the application-programming interface that is made available to the public. One goal of the tool is to provide a demonstration to end users so that they can use the enhanced imagery as an input into their own workflows. This project builds upon the NASA Langley Cloud and Radiation Group's experience with making real-time and historical satellite imagery accessible and easily searchable. As we see the increasing use of virtual supply chains that provide additional value at each link, there is value in making satellite imagery available through a simple access method, as well as in allowing users to browse and view that imagery as they need it rather than in a manner most convenient for the data provider.
CH5M3D: an HTML5 program for creating 3D molecular structures.
Earley, Clarke W
2013-11-18
While a number of programs and web-based applications are available for the interactive display of 3-dimensional molecular structures, few of these provide the ability to edit these structures. For this reason, we have developed a library written in JavaScript to allow for the simple creation of web-based applications that should run on any browser capable of rendering HTML5 web pages. While our primary interest in developing this application was for educational use, it may also prove useful to researchers who want a light-weight application for viewing and editing small molecular structures. Molecular compounds are drawn on the HTML5 Canvas element, with the JavaScript code making use of standard techniques to allow display of three-dimensional structures on a two-dimensional canvas. Information about the structure (bond lengths, bond angles, and dihedral angles) can be obtained using a mouse or other pointing device. Both atoms and bonds can be added or deleted, and rotation about bonds is allowed. Routines are provided to read structures either from the web server or from the user's computer, and creation of galleries of structures can be accomplished with only a few lines of code. Documentation and examples are provided to demonstrate how users can access all of the molecular information for creation of web pages with more advanced features. A light-weight (≈ 75 kb) JavaScript library has been made available that allows for the simple creation of web pages containing interactive 3-dimensional molecular structures. Although this library is designed to create web pages, a web server is not required. Installation on a web server is straightforward and does not require any server-side modules or special permissions. The ch5m3d.js library has been released under the GNU GPL version 3 open-source license and is available from http://sourceforge.net/projects/ch5m3d/.
CH5M3D: an HTML5 program for creating 3D molecular structures
2013-01-01
Background While a number of programs and web-based applications are available for the interactive display of 3-dimensional molecular structures, few of these provide the ability to edit these structures. For this reason, we have developed a library written in JavaScript to allow for the simple creation of web-based applications that should run on any browser capable of rendering HTML5 web pages. While our primary interest in developing this application was for educational use, it may also prove useful to researchers who want a light-weight application for viewing and editing small molecular structures. Results Molecular compounds are drawn on the HTML5 Canvas element, with the JavaScript code making use of standard techniques to allow display of three-dimensional structures on a two-dimensional canvas. Information about the structure (bond lengths, bond angles, and dihedral angles) can be obtained using a mouse or other pointing device. Both atoms and bonds can be added or deleted, and rotation about bonds is allowed. Routines are provided to read structures either from the web server or from the user’s computer, and creation of galleries of structures can be accomplished with only a few lines of code. Documentation and examples are provided to demonstrate how users can access all of the molecular information for creation of web pages with more advanced features. Conclusions A light-weight (≈ 75 kb) JavaScript library has been made available that allows for the simple creation of web pages containing interactive 3-dimensional molecular structures. Although this library is designed to create web pages, a web server is not required. Installation on a web server is straightforward and does not require any server-side modules or special permissions. The ch5m3d.js library has been released under the GNU GPL version 3 open-source license and is available from http://sourceforge.net/projects/ch5m3d/. PMID:24246004
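The "standard techniques" for displaying a 3-D structure on a 2-D canvas amount to rotating the atomic coordinates and dropping the depth coordinate before drawing. A hedged Python sketch of that projection step (an illustration of the idea, not the ch5m3d.js code):

    import math

    def project(coords, yaw=0.0, pitch=0.0, scale=40.0, cx=200.0, cy=200.0):
        """Rotate 3-D points about the y axis (yaw) and x axis (pitch),
        then drop z to get 2-D canvas coordinates."""
        cos_y, sin_y = math.cos(yaw), math.sin(yaw)
        cos_p, sin_p = math.cos(pitch), math.sin(pitch)
        out = []
        for x, y, z in coords:
            x, z = cos_y * x + sin_y * z, -sin_y * x + cos_y * z   # yaw
            y, z = cos_p * y - sin_p * z, sin_p * y + cos_p * z    # pitch
            out.append((cx + scale * x, cy - scale * y))           # orthographic drop of z
        return out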
Integrated protocol for reliable and fast quantification and documentation of electrophoresis gels.
Rehbein, Peter; Schwalbe, Harald
2015-06-01
Quantitative analysis of electrophoresis gels is an important part in molecular cloning, as well as in protein expression and purification. Parallel quantifications in yield and purity can be most conveniently obtained from densitometric analysis. This communication reports a comprehensive, reliable and simple protocol for gel quantification and documentation, applicable for single samples and with special features for protein expression screens. As major component of the protocol, the fully annotated code of a proprietary open source computer program for semi-automatic densitometric quantification of digitized electrophoresis gels is disclosed. The program ("GelQuant") is implemented for the C-based macro-language of the widespread integrated development environment of IGOR Pro. Copyright © 2014 Elsevier Inc. All rights reserved.
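A hedged sketch of the basic densitometric step underlying such quantification (an illustration, not the GelQuant code): integrate background-subtracted pixel intensity over a rectangular band region of the digitized gel.

    import numpy as np

    def band_intensity(gel, row0, row1, col0, col1):
        """Integrated, background-subtracted intensity of one band.
        gel: 2-D array of the digitized gel (dark bands = low pixel values);
        the band is the rectangle [row0:row1, col0:col1], and the local
        background is estimated from the rectangle's border pixels."""
        region = gel[row0:row1, col0:col1].astype(float)
        border = np.concatenate([region[0, :], region[-1, :],
                                 region[:, 0], region[:, -1]])
        background = np.median(border)
        # Invert so that darker (lower) pixel values contribute positive signal.
        return float(np.clip(background - region, 0, None).sum())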
Building a computer-aided design capability using a standard time share operating system
NASA Technical Reports Server (NTRS)
Sobieszczanski, J.
1975-01-01
The paper describes how an integrated system of engineering computer programs can be built using a standard commercially available operating system. The discussion opens with an outline of the auxiliary functions that an operating system can perform for a team of engineers involved in a large and complex task. An example of a specific integrated system is provided to explain how the standard operating system features can be used to organize the programs into a simple and inexpensive but effective system. Applications to an aircraft structural design study are discussed to illustrate the use of an integrated system as a flexible and efficient engineering tool. The discussion concludes with an engineer's assessment of an operating system's capabilities and desirable improvements.
Multidirectional Scanning Model, MUSCLE, to Vectorize Raster Images with Straight Lines
Karas, Ismail Rakip; Bayram, Bulent; Batuk, Fatmagul; Akay, Abdullah Emin; Baz, Ibrahim
2008-01-01
This paper presents a new model, MUSCLE (Multidirectional Scanning for Line Extraction), for automatic vectorization of raster images with straight lines. The algorithm of the model implements the line thinning and the simple neighborhood methods to perform vectorization. The model allows users to define specified criteria which are crucial for acquiring the vectorization process. In this model, various raster images can be vectorized such as township plans, maps, architectural drawings, and machine plans. The algorithm of the model was developed by implementing an appropriate computer programming and tested on a basic application. Results, verified by using two well known vectorization programs (WinTopo and Scan2CAD), indicated that the model can successfully vectorize the specified raster data quickly and accurately. PMID:27879843
Multidirectional Scanning Model, MUSCLE, to Vectorize Raster Images with Straight Lines.
Karas, Ismail Rakip; Bayram, Bulent; Batuk, Fatmagul; Akay, Abdullah Emin; Baz, Ibrahim
2008-04-15
This paper presents a new model, MUSCLE (Multidirectional Scanning for Line Extraction), for automatic vectorization of raster images with straight lines. The algorithm of the model implements the line thinning and the simple neighborhood methods to perform vectorization. The model allows users to define specified criteria which are crucial for acquiring the vectorization process. In this model, various raster images can be vectorized such as township plans, maps, architectural drawings, and machine plans. The algorithm of the model was developed by implementing an appropriate computer programming and tested on a basic application. Results, verified by using two well known vectorization programs (WinTopo and Scan2CAD), indicated that the model can successfully vectorize the specified raster data quickly and accurately.
Application of the docking program SOL for CSAR benchmark.
Sulimov, Alexey V; Kutov, Danil C; Oferkin, Igor V; Katkova, Ekaterina V; Sulimov, Vladimir B
2013-08-26
This paper is devoted to the results obtained by the docking program SOL and the post-processing program DISCORE at the CSAR benchmark. The SOL and DISCORE programs are described. SOL is an original docking program developed on the basis of a genetic algorithm, the MMFF94 force field, a rigid protein, and a precalculated energy grid that includes desolvation in the frame of a simplified GB model, vdW, and electrostatic interactions, and that takes into account the ligand internal strain energy. An important SOL feature is single- or multi-processor performance for up to hundreds of CPUs. DISCORE improves the binding energy scoring by local energy optimization of the docked ligand pose and a simple linear regression on the basis of available experimental data. The docking program SOL demonstrated a good ability to position ligands correctly in the active sites of the tested proteins in most cases of the CSAR exercises. SOL and DISCORE did not perform as strongly on protein-ligand binding free energy estimation. Nevertheless, for some target proteins, SOL and DISCORE were among the first in predicting inhibition activity. Ways to improve SOL and DISCORE are discussed.
From technical jargon to plain English for application.
Lindsley, O R
1991-01-01
These examples of translating technical jargon into plain English application words, acronyms, letter codes, and simple tests were necessary as we developed Precision Teaching. I hope our experience is useful to others facing the problems of applying technology in practical settings. At the least, our experience should give you an idea of the work and time involved in making your own translations. Above all, be patient. Accurate plain English translations do not come easily. They cannot be made at your desk. A search often takes years to produce one new accurate plain English translation. Rapid publication pressures, journal editorial policies, and investments in materials, books, and computer programs all combine to hamper these translations. It's possible that you will find some of our plain English equivalents useful in your own applied behavior analysis applications. PMID:1752836
A Thermal Management Systems Model for the NASA GTX RBCC Concept
NASA Technical Reports Server (NTRS)
Traci, Richard M.; Farr, John L., Jr.; Laganelli, Tony; Walker, James (Technical Monitor)
2002-01-01
The Vehicle Integrated Thermal Management Analysis Code (VITMAC) was further developed to aid the analysis, design, and optimization of propellant and thermal management concepts for advanced propulsion systems. The computational tool is based on engineering level principles and models. A graphical user interface (GUI) provides a simple and straightforward method to assess and evaluate multiple concepts before undertaking more rigorous analysis of candidate systems. The tool incorporates the Chemical Equilibrium and Applications (CEA) program and the RJPA code to permit heat transfer analysis of both rocket and air breathing propulsion systems. Key parts of the code have been validated with experimental data. The tool was specifically tailored to analyze rocket-based combined-cycle (RBCC) propulsion systems being considered for space transportation applications. This report describes the computational tool and its development and verification for NASA GTX RBCC propulsion system applications.
ERIC Educational Resources Information Center
Gron, Liz U.; Bradley, Shelly B.; McKenzie, Jennifer R.; Shinn, Sara E.; Teague, M. Warfield
2013-01-01
This paper presents the use of simple, outcome-based assessment tools to design and evaluate the first semester of a new introductory laboratory program created to teach green analytical chemistry using environmental samples. This general chemistry laboratory program, like many introductory courses, has a wide array of stakeholders within and…
NASA/ESTO investments in remote sensing technologies (Conference Presentation)
NASA Astrophysics Data System (ADS)
Babu, Sachidananda R.
2017-02-01
For more than 18 years, the NASA Earth Science Technology Office (ESTO) has been investing in remote sensing technologies. During this period, ESTO has invested in more than 900 tasks. These tasks are managed under multiple programs such as the Instrument Incubator Program (IIP), Advanced Component Technology (ACT), Advanced Information Systems Technology (AIST), In-Space Validation of Earth Science Technologies (InVEST), Sustainable Land Imaging - Technology (SLI-T), and others. This covers the whole spectrum of technologies, from components to full-up satellites in space, as well as software. Over the years, many of these technologies have been infused into space missions such as Aquarius, SMAP, CYGNSS, SWOT, TEMPO, and others. ESTO has been actively investing in infrared sensor technologies for space applications; recent investments have been made under the SLI-T and InVEST programs. On these tasks, technology development ranges from simple bolometers to advanced photonic-waveguide-based spectrometers. Some details on these missions and technologies will be presented.
ESTO Investments in Innovative Sensor Technologies for Remote Sensing
NASA Technical Reports Server (NTRS)
Babu, Sachidananda R.
2017-01-01
For more than 18 years, the NASA Earth Science Technology Office (ESTO) has been investing in remote sensing technologies. During this period, ESTO has invested in more than 900 tasks. These tasks are managed under multiple programs such as the Instrument Incubator Program (IIP), Advanced Component Technology (ACT), Advanced Information Systems Technology (AIST), In-Space Validation of Earth Science Technologies (InVEST), Sustainable Land Imaging - Technology (SLI-T), and others. This covers the whole spectrum of technologies, from components to full-up satellites in space, as well as software. Over the years, many of these technologies have been infused into space missions such as Aquarius, SMAP, CYGNSS, SWOT, TEMPO, and others. ESTO has been actively investing in infrared sensor technologies for space applications; recent investments have been made under the SLI-T and InVEST programs. On these tasks, technology development ranges from simple bolometers to advanced photonic-waveguide-based spectrometers. Some details on these missions and technologies will be presented.
NASA Technical Reports Server (NTRS)
Chang, Sin-Chung; Himansu, Ananda; Loh, Ching-Yuen; Wang, Xiao-Yen; Yu, Shang-Tao
2003-01-01
This paper reports on a significant advance in the area of non-reflecting boundary conditions (NRBCs) for unsteady flow computations. As a part of the development of the space-time conservation element and solution element (CE/SE) method, sets of NRBCs for 1D Euler problems are developed without using any characteristics-based techniques. These conditions are much simpler than those commonly reported in the literature, yet so robust that they are applicable to subsonic, transonic and supersonic flows even in the presence of discontinuities. In addition, the straightforward multidimensional extensions of the present 1D NRBCs have been shown numerically to be equally simple and robust. The paper details the theoretical underpinning of these NRBCs, and explains their unique robustness and accuracy in terms of the conservation of space-time fluxes. Some numerical results for an extended Sod's shock-tube problem, illustrating the effectiveness of the present NRBCs are included, together with an associated simple Fortran computer program. As a preliminary to the present development, a review of the basic CE/SE schemes is also included.
Simple interventions to improve healthy eating behaviors in the school cafeteria
2016-01-01
The National School Lunch Program in the United States provides an important opportunity to improve nutrition for the 30 million children who participate every school day. The purpose of this narrative review is to present and evaluate simple, evidence-based strategies to improve healthy eating behaviors at school. Healthy eating behaviors are defined as increased selection/consumption of fruits and/or vegetables, increased selection of nutrient-dense foods, or decreased selection of low-nutrient, energy-dense foods. Data were collected from sales records, 24-hour food recalls, direct observation, and estimation of plate waste. The review is limited to simple, discrete interventions that are easy to implement. Sixteen original, peer-reviewed articles are included. Interventions are divided into 5 categories: modification of choice, behavior modification, marketing strategies, time-efficiency strategies, and fruit slicing. All interventions resulted in improved eating behaviors, but not all interventions are applicable or feasible in all settings. Because these studies were performed prior to the implementation of the new federally mandated school meal standards, it is unknown if these interventions would yield similar results if repeated now. PMID:26874753
Banta, Edward R.; Poeter, Eileen P.; Doherty, John E.; Hill, Mary C.
2006-01-01
The Joint Universal Parameter IdenTification and Evaluation of Reliability Application Programming Interface (JUPITER API) improves the computer programming resources available to those developing applications (computer programs) for model analysis. The JUPITER API consists of eleven Fortran-90 modules that provide for encapsulation of data and operations on that data. Each module contains one or more entities: data, data types, subroutines, functions, and generic interfaces. The modules do not constitute computer programs themselves; instead, they are used to construct computer programs. Such computer programs are called applications of the API. The API provides common modeling operations for use by a variety of computer applications. The models being analyzed are referred to here as process models, and may, for example, represent the physics, chemistry, and(or) biology of a field or laboratory system. Process models commonly are constructed using published models such as MODFLOW (Harbaugh et al., 2000; Harbaugh, 2005), MT3DMS (Zheng and Wang, 1996), HSPF (Bicknell et al., 1997), PRMS (Leavesley and Stannard, 1995), and many others. The process model may be accessed by a JUPITER API application as an external program, or it may be implemented as a subroutine within a JUPITER API application. In either case, execution of the model takes place in a framework designed by the application programmer. This framework can be designed to take advantage of any parallel-processing capabilities possessed by the process model, as well as the parallel-processing capabilities of the JUPITER API. Model analyses for which the JUPITER API could be useful include, for example: comparing model results to observed values to determine how well the model reproduces system processes and characteristics; using sensitivity analysis to determine the information provided by observations about parameters and predictions of interest; determining the additional data needed to improve selected model predictions; using calibration methods to modify parameter values and other aspects of the model; comparing predictions to regulatory limits; quantifying the uncertainty of predictions based on the results of one or many simulations using inferential or Monte Carlo methods; and determining how to manage the system to achieve stated objectives. The capabilities provided by the JUPITER API include, for example, communication with process models, parallel computations, compressed storage of matrices, and flexible input capabilities. The input capabilities use input blocks suitable for lists or arrays of data. The input blocks needed for one application can be included within one data file or distributed among many files. Data exchange between different JUPITER API applications or between applications and other programs is supported by data-exchange files. The JUPITER API has already been used to construct a number of applications. Three simple example applications are presented in this report. More complicated applications include the universal inverse code UCODE_2005 (Poeter et al., 2005), the multi-model analysis MMA (Eileen P. Poeter, Mary C. Hill, E.R. Banta, S.W. Mehl, and Steen Christensen, written commun., 2006), and a code named OPR_PPR (Matthew J. Tonkin, Claire R. Tiedeman, Mary C. Hill, and D. Matthew Ely, written commun., 2006). This report describes a set of underlying organizational concepts and complete specifics about the JUPITER API.
While understanding the organizational concept presented is useful to understanding the modules, other organizational concepts can be used in applications constructed using the JUPITER API.
NASA Technical Reports Server (NTRS)
Carlson, Harry W.
1985-01-01
The purpose here is to show how two linearized theory computer programs in combination may be used for the design of low speed wing flap systems capable of high levels of aerodynamic efficiency. A fundamental premise of the study is that high levels of aerodynamic performance for flap systems can be achieved only if the flow about the wing remains predominantly attached. Based on this premise, a wing design program is used to provide idealized attached flow camber surfaces from which candidate flap systems may be derived, and, in a following step, a wing evaluation program is used to provide estimates of the aerodynamic performance of the candidate systems. Design strategies and techniques that may be employed are illustrated through a series of examples. Applicability of the numerical methods to the analysis of a representative flap system (although not a system designed by the process described here) is demonstrated in a comparison with experimental data.
Currency arbitrage detection using a binary integer programming model
NASA Astrophysics Data System (ADS)
Soon, Wanmei; Ye, Heng-Qing
2011-04-01
In this article, we examine the use of a new binary integer programming (BIP) model to detect arbitrage opportunities in currency exchanges. This model showcases an excellent application of mathematics to the real world. The concepts involved are easily accessible to undergraduate students with basic knowledge in Operations Research. Through this work, students can learn to link several types of basic optimization models, namely linear programming, integer programming and network models, and apply the well-known sensitivity analysis procedure to accommodate realistic changes in the exchange rates. Beginning with a BIP model, we discuss how it can be reduced to an equivalent but considerably simpler model, where an efficient algorithm can be applied to find the arbitrages and incorporate the sensitivity analysis procedure. A simple comparison is then made with a different arbitrage detection model. This exercise helps students learn to apply basic Operations Research concepts to a practical real-life example, and provides insights into the processes involved in Operations Research model formulations.
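For comparison with the BIP formulation (and only as the classical alternative, not the article's model), an arbitrage corresponds to a cycle of exchange rates whose product exceeds 1, which becomes a negative-weight cycle after taking -log of each rate; Bellman-Ford relaxation then detects it:

    import math

    def has_arbitrage(rates):
        """rates: dict (src, dst) -> exchange rate. Returns True if some cycle
        of conversions multiplies to more than 1 (an arbitrage opportunity)."""
        currencies = {c for pair in rates for c in pair}
        dist = {c: 0.0 for c in currencies}             # implicit zero-cost source
        edges = [(u, v, -math.log(r)) for (u, v), r in rates.items()]
        for _ in range(len(currencies) - 1):
            for u, v, w in edges:
                if dist[u] + w < dist[v]:
                    dist[v] = dist[u] + w
        # A further improving relaxation implies a negative cycle, hence arbitrage.
        return any(dist[u] + w < dist[v] - 1e-12 for u, v, w in edges)

    # Hypothetical rates: USD->EUR->GBP->USD multiplies to about 1.003 > 1.
    rates = {("USD", "EUR"): 0.92, ("EUR", "GBP"): 0.86, ("GBP", "USD"): 1.268}
    print(has_arbitrage(rates))    # True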
Software Models Impact Stresses
NASA Technical Reports Server (NTRS)
Hanshaw, Timothy C.; Roy, Dipankar; Toyooka, Mark
1991-01-01
Generalized Impact Stress Software designed to assist engineers in predicting stresses caused by variety of impacts. Program straightforward, simple to implement on personal computers, "user friendly", and handles variety of boundary conditions applied to struck body being analyzed. Applications include mathematical modeling of motions and transient stresses of spacecraft, analysis of slamming of piston, of fast valve shutoffs, and play of rotating bearing assembly. Provides fast and inexpensive analytical tool for analysis of stresses and reduces dependency on expensive impact tests. Written in FORTRAN 77. Requires use of commercial software package PLOT88.
Innovations in dynamic test restraint systems
NASA Technical Reports Server (NTRS)
Fuld, Christopher J.
1990-01-01
Recent launch system development programs have led to a new generation of large scale dynamic tests. The variety of test scenarios share one common requirement: restrain and capture massive high velocity flight hardware with no structural damage. The Space Systems Lab of McDonnell Douglas developed a remarkably simple and cost effective approach to such testing using ripstitch energy absorbers adapted from the sport of technical rockclimbing. The proven system reliability of the capture system concept has led to a wide variety of applications in test system design and in aerospace hardware design.
SMT-Aware Instantaneous Footprint Optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roy, Probir; Liu, Xu; Song, Shuaiwen
Modern architectures employ simultaneous multithreading (SMT) to increase thread-level parallelism. SMT threads share many functional units and the whole memory hierarchy of a physical core. Without a careful code design, SMT threads can easily contend with each other for these shared resources, causing severe performance degradation. Minimizing SMT thread contention for HPC applications running on dedicated platforms is very challenging, because they usually spawn threads within Single Program Multiple Data (SPMD) models. To address this important issue, we introduce a simple scheme for SMT-aware code optimization, which aims to reduce the memory contention across SMT threads.
The Geostationary Operational Satellite R Series SpaceWire Based Data System Architecture
NASA Technical Reports Server (NTRS)
Krimchansky, Alexander; Anderson, William H.; Bearer, Craig
2010-01-01
The GOES-R program selected SpaceWire as the best solution to satisfy the desire for simple and flexible instrument-to-spacecraft command and telemetry communications. Data generated by GOES-R instruments are critical for meteorological forecasting, public safety, space weather, and other key applications. In addition, GOES-R instrument data are provided to ground stations on a 24/7 basis. GOES-R requires that data errors be detected and corrected from origin to final destination. This paper describes the strategy GOES-R developed to satisfy this requirement.
Experience with ActiveX control for simple channel access
DOE Office of Scientific and Technical Information (OSTI.GOV)
Timossi, C.; Nishimura, H.; McDonald, J.
2003-05-15
Accelerator control system applications at Berkeley Lab's Advanced Light Source (ALS) are typically deployed on operator consoles running Microsoft Windows 2000 and utilize EPICS [2] channel access for data access. In an effort to accommodate the wide variety of Windows-based development tools and developers with little experience in network programming, ActiveX controls have been deployed on the operator stations. The use of ActiveX controls in the accelerator control environment has been presented previously [1]. Here we report on some of our experiences with the use and development of these controls.
MR-CDF: Managing multi-resolution scientific data
NASA Technical Reports Server (NTRS)
Salem, Kenneth
1993-01-01
MR-CDF is a system for managing multi-resolution scientific data sets. It is an extension of the popular CDF (Common Data Format) system. MR-CDF provides a simple functional interface to client programs for storage and retrieval of data. Data is stored so that low resolution versions of the data can be provided quickly. Higher resolutions are also available, but not as quickly. By managing data with MR-CDF, an application can be relieved of the low-level details of data management, and can easily trade data resolution for improved access time.
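A hedged conceptual sketch of multi-resolution storage (an illustration of the idea, not the MR-CDF interface): keep a pyramid of progressively coarser copies so that low-resolution requests are served from small arrays quickly.

    import numpy as np

    class ResolutionPyramid:
        """A 2-D array stored with coarser versions built by 2x2 block averaging."""
        def __init__(self, data, levels=3):
            self.levels = [np.asarray(data, dtype=float)]
            for _ in range(levels):
                d = self.levels[-1]
                h, w = (d.shape[0] // 2) * 2, (d.shape[1] // 2) * 2
                coarse = d[:h, :w].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
                self.levels.append(coarse)

        def get(self, level=0):
            """Level 0 is full resolution; higher levels are smaller and faster to serve."""
            return self.levels[min(level, len(self.levels) - 1)]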
WebChem Viewer: a tool for the easy dissemination of chemical and structural data sets
2014-01-01
Background Sharing sets of chemical data (e.g., chemical properties, docking scores, etc.) among collaborators with diverse skill sets is a common task in computer-aided drug design and medicinal chemistry. The ability to associate this data with images of the relevant molecular structures greatly facilitates scientific communication. There is a need for a simple, free, open-source program that can automatically export aggregated reports of entire chemical data sets to files viewable on any computer, regardless of the operating system and without requiring the installation of additional software. Results We here present a program called WebChem Viewer that automatically generates these types of highly portable reports. Furthermore, in designing WebChem Viewer we have also created a useful online web application for remotely generating molecular structures from SMILES strings. We encourage the direct use of this online application as well as its incorporation into other software packages. Conclusions With these features, WebChem Viewer enables interdisciplinary collaborations that require the sharing and visualization of small molecule structures and associated sets of heterogeneous chemical data. The program is released under the FreeBSD license and can be downloaded from http://nbcr.ucsd.edu/WebChemViewer. The associated web application (called “Smiley2png 1.0”) can be accessed through freely available web services provided by the National Biomedical Computation Resource at http://nbcr.ucsd.edu. PMID:24886360
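The structure-from-SMILES step can also be reproduced locally; the sketch below is an analogous illustration using the open-source RDKit toolkit, not the Smiley2png web service itself:

    from rdkit import Chem
    from rdkit.Chem import Draw

    smiles = "CC(=O)Oc1ccccc1C(=O)O"        # aspirin, as an example input
    mol = Chem.MolFromSmiles(smiles)         # returns None for an invalid SMILES string
    if mol is not None:
        Draw.MolToFile(mol, "aspirin.png", size=(300, 300))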
Simple Analytic Expressions for the Magnetic Field of a Circular Current Loop
NASA Technical Reports Server (NTRS)
Simpson, James C.; Lane, John E.; Immer, Christopher D.; Youngquist, Robert C.
2001-01-01
Analytic expressions for the magnetic induction (magnetic flux density, B) of a simple planar circular current loop have been published in Cartesian and cylindrical coordinates [1,2], and are also known implicitly in spherical coordinates [3]. In this paper, we present explicit analytic expressions for B and its spatial derivatives in Cartesian, cylindrical, and spherical coordinates for a filamentary current loop. These results were obtained with extensive use of Mathematica(TM) and are exact throughout all space outside of the conductor. The field expressions reduce to the well-known limiting cases and satisfy ∇ · B = 0 and ∇ × B = 0 outside the conductor. These results are general and applicable to any model using filamentary circular current loops. Solenoids of arbitrary size may be easily modeled by approximating the total magnetic induction as the sum of those for the individual loops. The inclusion of the spatial derivatives expands their utility to magnetohydrodynamics, where the derivatives are required. The equations can be coded into any high-level programming language. It is necessary to numerically evaluate complete elliptic integrals of the first and second kind, but this capability is now available in most programming packages.
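A hedged numerical sketch of the cylindrical-coordinate expressions (the standard elliptic-integral form for a filamentary loop of radius R carrying current I; SciPy's ellipk/ellipe take the parameter m = k²):

    import numpy as np
    from scipy.special import ellipk, ellipe

    MU0 = 4e-7 * np.pi

    def loop_field(R, I, rho, z):
        """B_rho, B_z (tesla) of a filamentary circular loop of radius R (m)
        carrying current I (A), at cylindrical coordinates (rho, z), rho > 0."""
        m = 4 * R * rho / ((R + rho) ** 2 + z ** 2)          # parameter m = k**2
        K, E = ellipk(m), ellipe(m)
        pref = MU0 * I / (2 * np.pi * np.sqrt((R + rho) ** 2 + z ** 2))
        denom = (R - rho) ** 2 + z ** 2
        B_z = pref * (K + (R ** 2 - rho ** 2 - z ** 2) / denom * E)
        B_rho = pref * (z / rho) * (-K + (R ** 2 + rho ** 2 + z ** 2) / denom * E)
        return B_rho, B_z

    # Sanity check against the on-axis textbook limit B_z = mu0*I*R**2/(2*(R**2+z**2)**1.5).
    print(loop_field(1.0, 1.0, 1e-9, 0.5)[1])
    print(MU0 * 1.0 / (2 * (1.0 + 0.25) ** 1.5))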
Kim, Sung-Min
2018-01-01
Cessation of dewatering following underground mine closure typically results in groundwater rebound, because mine voids and surrounding strata undergo flooding up to the levels of the decant points, such as shafts and drifts. SIMPL (Simplified groundwater program In Mine workings using the Pipe equation and Lumped parameter model), a simplified lumped parameter model-based program for predicting groundwater levels in abandoned mines, is presented herein. The program comprises a simulation engine module, 3D visualization module, and graphical user interface, which aids data processing, analysis, and visualization of results. The 3D viewer facilitates effective visualization of the predicted groundwater level rebound phenomenon together with a topographic map, mine drift, goaf, and geological properties from borehole data. SIMPL is applied to data from the Dongwon coal mine and Dalsung copper mine in Korea, with strong similarities in simulated and observed results. By considering mine workings and interpond connections, SIMPL can thus be used to effectively analyze and visualize groundwater rebound. In addition, the predictions by SIMPL can be utilized to prevent the surrounding environment (water and soil) from being polluted by acid mine drainage. PMID:29747480
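As a heavily simplified, hedged illustration of the lumped water-balance idea behind such rebound predictions (conceptual only; it omits the interpond pipe-flow terms that SIMPL models), each mine pond fills from its inflow until it reaches its decant elevation:

    def rebound(ponds, inflow, dt, steps):
        """Toy lumped water balance for mine-water rebound (conceptual sketch).
        ponds: list of dicts with 'level' (m), 'area' (m2), 'decant' (decant elevation, m);
        inflow: recharge per pond (m3/day); dt: time step (days)."""
        history = []
        for _ in range(steps):
            for pond, q_in in zip(ponds, inflow):
                pond["level"] += q_in * dt / pond["area"]             # rising water level
                pond["level"] = min(pond["level"], pond["decant"])    # overflow at decant point
            history.append([p["level"] for p in ponds])
        return history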
Bohnen, Jordan D; George, Brian C; Williams, Reed G; Schuller, Mary C; DaRosa, Debra A; Torbeck, Laura; Mullen, John T; Meyerson, Shari L; Auyang, Edward D; Chipman, Jeffrey G; Choi, Jennifer N; Choti, Michael A; Endean, Eric D; Foley, Eugene F; Mandell, Samuel P; Meier, Andreas H; Smink, Douglas S; Terhune, Kyla P; Wise, Paul E; Soper, Nathaniel J; Zwischenberger, Joseph B; Lillemoe, Keith D; Dunnington, Gary L; Fryer, Jonathan P
Intraoperative performance assessment of residents is of growing interest to trainees, faculty, and accreditors. Current approaches to collecting such assessments are limited by low participation rates and long delays between procedure and evaluation. We deployed an innovative, smartphone-based tool, SIMPL (System for Improving and Measuring Procedural Learning), to make real-time intraoperative performance assessment feasible for every case in which surgical trainees participate, and hypothesized that SIMPL could be feasibly integrated into surgical training programs. Between September 1, 2015 and February 29, 2016, 15 U.S. general surgery residency programs were enrolled in an institutional review board-approved trial. SIMPL was made available after 70% of faculty and residents completed a 1-hour training session. Descriptive and univariate statistics analyzed multiple dimensions of feasibility, including training rates, volume of assessments, response rates/times, and dictation rates. The 20 most active residents and attendings were evaluated in greater detail. A total of 90% of eligible users (1267/1412) completed training. Further, 13/15 programs began using SIMPL. In total, 6024 assessments were completed by 254 categorical general surgery residents (n = 3555 assessments) and 259 attendings (n = 2469 assessments), and 3762 unique operations were assessed. There was significant heterogeneity in participation within and between programs. The mean percentages (ranges) of users who completed ≥1, 5, and 20 assessments were 62% (21%-96%), 34% (5%-75%), and 10% (0%-32%) across all programs, and 96%, 75%, and 32% in the most active program. Overall, the response rate was 70%, the dictation rate was 24%, and the mean response time was 12 hours. Assessments increased from 357 (September 2015) to 1146 (February 2016). The 20 most active residents each received a mean of 46 assessments from 10 attendings for 20 different procedures. SIMPL can be feasibly integrated into surgical training programs to enhance the frequency and timeliness of intraoperative performance assessment. We believe SIMPL could help facilitate a national competency-based surgical training system, although local and systemic challenges still need to be addressed. Copyright © 2016. Published by Elsevier Inc.
NASA Technical Reports Server (NTRS)
Chimiak, Reine; Harris, Bernard; Williams, Phillip
2013-01-01
Basic Common Data Format (CDF) tools (e.g., cdfedit) provide no specific support for creating International Solar-Terrestrial Physics/Space Physics Data Facility (ISTP/SPDF) standard files. While it is possible for someone who is familiar with the ISTP/SPDF metadata guidelines to create compliant files using just the basic tools, the process is error-prone and unreasonable for someone without ISTP/SPDF expertise. The key problem is the lack of a tool with specific support for creating files that comply with the ISTP/SPDF guidelines. There are basic CDF tools such as cdfedit and skeletoncdf for creating CDF files, but these have no specific support for creating ISTP/SPDF-compliant files. The SPDF ISTP CDF skeleton editor is a cross-platform, Java-based GUI editor program that allows someone with only a basic understanding of the ISTP/SPDF guidelines to easily create compliant files. The editor is a simple graphical user interface (GUI) application for creating and editing ISTP/SPDF guideline-compliant skeleton CDF files. The SPDF ISTP CDF skeleton editor consists of the following components: a Swing-based Java GUI program, a JavaHelp-based manual/tutorial, image/icon files, and an HTML Web page for distribution. The editor is available as a traditional Java desktop application as well as a Java Network Launching Protocol (JNLP) application. Once started, it functions like a typical Java GUI file editor application for creating/editing application-unique files.
Ovchinnikov, Victor; Nam, Kwangho; Karplus, Martin
2016-08-25
A method is developed to obtain simultaneously free energy profiles and diffusion constants from restrained molecular simulations in diffusive systems. The method is based on low-order expansions of the free energy and diffusivity as functions of the reaction coordinate. These expansions lead to simple analytical relationships between simulation statistics and model parameters. The method is tested on 1D and 2D model systems; its accuracy is found to be comparable to or better than that of the existing alternatives, which are briefly discussed. An important aspect of the method is that the free energy is constructed by integrating its derivatives, which can be computed without need for overlapping sampling windows. The implementation of the method in any molecular simulation program that supports external umbrella potentials (e.g., CHARMM) requires modification of only a few lines of code. As a demonstration of its applicability to realistic biomolecular systems, the method is applied to model the α-helix ↔ β-sheet transition in a 16-residue peptide in implicit solvent, with the reaction coordinate provided by the string method. Possible modifications of the method are briefly discussed; they include generalization to multidimensional reaction coordinates [in the spirit of the model of Ermak and McCammon (Ermak, D. L.; McCammon, J. A. J. Chem. Phys. 1978, 69, 1352-1360)], a higher-order expansion of the free energy surface, applicability in nonequilibrium systems, and a simple test for Markovianity. In view of the small overhead of the method relative to standard umbrella sampling, we suggest its routine application in the cases where umbrella potential simulations are appropriate.
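As background on the general idea of building a free energy profile by integrating derivative estimates from restrained simulations (a generic umbrella-window sketch with synthetic data, not the authors' specific estimator), the mean restraint force in each harmonic window approximates the local mean force, and the profile follows by quadrature:

import numpy as np

# Illustrative sketch (not the paper's estimator): estimate dF/dx in each harmonic
# umbrella window from the mean restraint force k*(x0 - <x>), then integrate the
# derivatives with the trapezoidal rule to build F(x). Synthetic data stand in for
# simulation output.
np.random.seed(0)
k = 50.0                                   # restraint force constant (arbitrary units)
centers = np.linspace(-2.0, 2.0, 21)       # umbrella window centers along the coordinate

true_dF = lambda x: 4.0 * x**3 - 4.0 * x   # derivative of a double-well F(x) = x^4 - 2x^2

mean_force = []
for x0 in centers:
    # In a real workflow <x> comes from sampling in the window; here we mimic it by
    # shifting the window mean according to the local mean force plus sampling noise.
    x_mean = x0 - true_dF(x0) / k + np.random.normal(0.0, 0.01)
    mean_force.append(k * (x0 - x_mean))   # estimate of dF/dx at this window

F = np.concatenate(([0.0], np.cumsum(
        0.5 * (np.array(mean_force[1:]) + np.array(mean_force[:-1])) * np.diff(centers))))
for x0, f in zip(centers, F):
    print(f"x = {x0:+.2f}   F(x) ~ {f:8.3f}")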
Liaw, Sok Ying; Koh, Yiwen; Dawood, Rabiah; Kowitlawakul, Yanika; Zhou, Wentao; Lau, Siew Tiang
2014-03-01
Preparing nursing students for making the transition to graduate nurse is crucial for entry into practice. Final year student nurses at the National University of Singapore (NUS) are required to undergo a consolidated clinical practice to prepare them for their transition to graduate nurse. To describe the development, implementation and evaluation of a simulation program known as SIMulated Professional Learning Environment (SIMPLE) in preparing the final year student nurses for their clinical practicum in transition to graduate nurse practice. A set of simulation features and best practices were used as a conceptual framework to develop and implement the simulation program. 94 final year student nurses participated in the 15-hour SIMPLE program that incorporated multiple simulation scenarios based on actual ward clinical practices. Pre- and post-tests were conducted to assess the students' preparedness for their clinical practice in transition to graduate nurse practice. The students also completed a satisfaction questionnaire and open questions to evaluate their simulation experiences. The student nurses demonstrated a significant improvement (t=12.06, p<0.01) on post-test score (mean=117.21, SD=15.17) from pre-test score (mean=97.86, SD=15.08) for their perceived preparedness towards their clinical practicum in transition to graduate nurse practice. They were highly satisfied with their simulation learning. Themes emerged from the comments on the most valuable aspects of the SIMPLE program and ways to improve the program. The study provided evidence of the effectiveness of the SIMPLE program in enhancing the students' preparedness for their transition to graduate nurse practice. A key success of the SIMPLE program was the use of a simulation strategy and the involvement of practicing nurses, which closely linked the students with the realities of current nursing practice and prepared them for the role of staff nurses. Copyright © 2013 Elsevier Ltd. All rights reserved.
Colucci, Philip G.; Kostandy, Petro; Shrauner, William R.; Arleo, Elizabeth; Fuortes, Michele; Griffin, Andrew S.; Huang, Yun-Han; Juluru, Krishna; Tsiouris, Apostolos John
2016-01-01
Rationale and Objectives The primary role of radiology in the preclinical setting is the use of imaging to improve students’ understanding of anatomy. Many currently available Web-based anatomy programs include either suboptimal or overwhelming levels of detail for medical students. Our objective was to develop a user-friendly software program that anatomy instructors can completely tailor to match the desired level of detail for their curriculum, meets the unique needs of the first- and the second-year medical students, and is compatible with most Internet browsers and tablets. Materials and Methods RadStax is a Web-based application developed using free, open-source, ubiquitous software. RadStax was first introduced as an interactive resource for independent study and later incorporated into lectures. First- and second-year medical students were surveyed for quantitative feedback regarding their experience. Results RadStax was successfully introduced into our medical school curriculum. It allows the creation of learning modules with labeled multiplanar (MPR) image sets, basic anatomic information, and a self-assessment feature. The program received overwhelmingly positive feedback from students. Of 115 students surveyed, 87.0% found it highly effective as a study tool and 85.2% reported high user satisfaction with the program. Conclusions RadStax is a novel application for instructors wishing to create an atlas of labeled MPR radiologic studies tailored to meet the specific needs of their curriculum. Simple and focused, it provides an interactive experience for students similar to the practice of radiologists. This program is a robust anatomy teaching tool that effectively aids in educating the preclinical medical student. PMID:25964956
Falcone, John L; Middleton, Donald B
2013-01-01
The Accreditation Council for Graduate Medical Education (ACGME) sets residency performance standards for the American Board of Family Medicine Certification Examination. The aims of this study are to describe the compliance of residency programs with ACGME standards and to determine whether residency pass rates depend on program size and location. In this retrospective cohort study, residency performance from 2007 to 2011 was compared with the ACGME performance standards. Simple linear regression was performed to see whether program pass rates were dependent on program size. Regional differences in performance were compared with χ2 tests, using an α level of 0.05. Of 429 total residency programs, there were 205 (47.8%) that violated ACGME performance standards. Linear regression showed that program pass rates were positively correlated with and dependent on program size (P < .001). The median pass rate per state was 86.4% (interquartile range, 82.0-90.8). χ2 tests showed that states in the West performed higher than the other 3 US Census Bureau Regions (all P < .001). Approximately half of the family medicine training programs do not meet the ACGME examination performance standards. Pass rates are associated with residency program size, and regional variation occurs. These findings have the potential to affect ACGME policy and residency program application patterns.
WeBIAS: a web server for publishing bioinformatics applications.
Daniluk, Paweł; Wilczyński, Bartek; Lesyng, Bogdan
2015-11-02
One of the requirements for a successful scientific tool is its availability. Developing a functional web service, however, is usually considered a mundane and ungratifying task, and is quite often neglected. When publishing bioinformatic applications, such an attitude puts an additional burden on the reviewers, who have to cope with poorly designed interfaces in order to assess the quality of the presented methods, and it impairs the actual usefulness of the tools to the scientific community at large. In this note we present WeBIAS, a simple, self-contained solution for making command-line programs accessible through web forms. It comprises a web portal capable of serving several applications and backend schedulers which carry out computations. The server handles user registration and authentication, stores queries and results, and provides a convenient administrator interface. WeBIAS is implemented in Python and available under the GNU Affero General Public License. It has been developed and tested on GNU/Linux-compatible platforms covering the vast majority of operational WWW servers. Since it is written in pure Python, it should also be easy to deploy on all other platforms supporting Python (e.g., Windows, Mac OS X). Documentation and source code, as well as a demonstration site, are available at http://bioinfo.imdik.pan.pl/webias . WeBIAS has been designed specifically with ease of installation and deployment of services in mind. Setting up a simple application requires minimal effort, yet it is possible to create visually appealing, feature-rich interfaces for query submission and presentation of results.
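To make the underlying pattern concrete, a web form that forwards a query to a command-line program and returns its output can be sketched in a few lines; the example below uses only the Python standard library and a stand-in command, and is not the WeBIAS code or its API:

# Minimal sketch of the general pattern WeBIAS automates: expose a command-line
# program through a web form. Standard library only; not the WeBIAS code or API.
# The wrapped command ("tr a-z A-Z", i.e. uppercase the input) is just a stand-in.
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs

FORM = b"""<form method="post">
<textarea name="query"></textarea><br><input type="submit" value="Run"></form>"""

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(FORM)

    def do_POST(self):
        length = int(self.headers["Content-Length"])
        query = parse_qs(self.rfile.read(length).decode())["query"][0]
        # Hand the form input to the command-line tool and capture its output.
        result = subprocess.run(["tr", "a-z", "A-Z"], input=query,
                                capture_output=True, text=True)
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(result.stdout.encode())

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), Handler).serve_forever()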
Test Waveform Applications for JPL STRS Operating Environment
NASA Technical Reports Server (NTRS)
Lux, James P.; Peters, Kenneth J.; Taylor, Gregory H.; Lang, Minh; Stern, Ryan A.; Duncan, Courtney B.
2013-01-01
This software demonstrates use of the JPL Space Telecommunications Radio System (STRS) Operating Environment (OE), tests APIs (application programming interfaces) presented by the JPL STRS OE, and allows for basic testing of the underlying hardware platform. This software uses the JPL STRS Operating Environment ["JPL Space Telecommunications Radio System Operating Environment," (NPO-4776), NASA Tech Briefs, commercial edition, Vol. 37, No. 1 (January 2013), p. 47] to interact with the JPL-SDR Software Defined Radio developed for the CoNNeCT (COmmunications, Navigation, and Networking rEconfigurable Testbed) Project as part of the SCaN Testbed installed on the International Space Station (ISS). These are the first applications that are compliant with the new NASA STRS Architecture Standard. Several example waveform applications are provided to demonstrate use of the JPL STRS OE for the JPL-SDR platform used for the CoNNeCT Project. The waveforms provide a simple digitizer and playback capability for the S-Band RF slice, and a simple digitizer for the GPS slice [CoNNeCT Global Positioning System RF Module, (NPO-47764), NASA Tech Briefs, commercial edition, Vol. 36, No. 3 (March 2012), p. 36]. These waveforms may be used for hardware test, as well as for on-orbit or laboratory checkout. Additional example waveforms implement SpaceWire and timer modules, which can be used for time transfer and demonstration of communication between the two Xilinx FPGAs in the JPL-SDR. The waveforms are also compatible with ground-based use of the JPL STRS OE on radio breadboards and Linux.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spence, R.D.; Godbee, H.W.; Tallent, O.K.
1991-01-01
Despite the demonstrated importance of diffusion control in leaching, other mechanisms have been observed to play a role, and leaching from porous solid bodies is not simple diffusion. Only simple diffusion theory has been developed well enough for extrapolation, as yet. The well developed diffusion theory, used in data analysis by ANSI/ANS-16.1 and the NEWBOX program, can help in trying to extrapolate and predict the performance of solidified waste forms over decades and centuries, but the limitations and increased uncertainty must be understood in so doing. Treating leaching as a semi-infinite medium problem, as done in the Cote model, results in simpler equations, but limits application to early leaching behavior when less than 20% of a given component has been leached. 18 refs., 2 tabs.
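For orientation, the semi-infinite-medium (simple diffusion) treatment referred to above predicts that the cumulative fraction leached grows with the square root of time, CFL = 2 (S/V) sqrt(De t / π), and is trusted only while the fraction stays below roughly 20%; a minimal sketch with assumed parameter values follows:

import math

# Sketch of the semi-infinite-medium diffusion result discussed above: the cumulative
# fraction leached (CFL) of a species grows as the square root of time,
#   CFL = 2 * (S/V) * sqrt(De * t / pi),
# and the approximation is only trusted while CFL < 0.2. De and S/V are assumed values.
De = 1.0e-9          # effective diffusion coefficient, cm^2/s (assumed)
S_over_V = 1.0       # surface-area-to-volume ratio of the waste form, 1/cm (assumed)

for days in (1, 10, 100, 1000, 10000):
    t = days * 86400.0
    cfl = 2.0 * S_over_V * math.sqrt(De * t / math.pi)
    flag = "" if cfl < 0.2 else "  <- beyond the simple-diffusion limit"
    print(f"{days:6d} days: cumulative fraction leached ~ {cfl:.3f}{flag}")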
Optimization strategies for molecular dynamics programs on Cray computers and scalar work stations
NASA Astrophysics Data System (ADS)
Unekis, Michael J.; Rice, Betsy M.
1994-12-01
We present results of timing runs and different optimization strategies for a prototype molecular dynamics program that simulates shock waves in a two-dimensional (2-D) model of a reactive energetic solid. The performance of the program may be improved substantially by simple changes to the Fortran or by employing various vendor-supplied compiler optimizations. The optimum strategy varies among the machines used and will vary depending upon the details of the program. The effect of various compiler options and vendor-supplied subroutine calls is demonstrated. Comparison is made between two scalar workstations (IBM RS/6000 Model 370 and Model 530) and several Cray supercomputers (X-MP/48, Y-MP8/128, and C-90/16256). We find that for a scientific application program dominated by sequential, scalar statements, a relatively inexpensive high-end workstation such as the IBM RS/6000 RISC series will outperform single processor performance of the Cray X-MP/48 and perform competitively with single processor performance of the Y-MP8/128 and C-90/16256.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Juhee; Lee, Sungpyo; Lee, Moo Hyung
Quasi-unipolar non-volatile organic transistor memory (NOTM) can combine the best characteristics of conventional unipolar and ambipolar NOTMs and, as a result, exhibit improved device performance. Unipolar NOTMs typically exhibit a large signal ratio between the programmed and erased current signals but also require a large voltage to program and erase the memory cells. Meanwhile, an ambipolar NOTM can be programmed and erased at lower voltages, but the resulting signal ratio is small. By embedding a discontinuous n-type fullerene layer within a p-type pentacene film, quasi-unipolar NOTMs are fabricated, in which signal storage utilizes both electrons and holes while the electrical signal relies only on hole conduction. These devices exhibit superior memory performance relative to both pristine unipolar pentacene devices and ambipolar fullerene/pentacene bilayer devices. The quasi-unipolar NOTM exhibited a larger signal ratio between the programmed and erased states while also reducing the voltage required to program and erase a memory cell. This simple approach should be readily applicable to various combinations of advanced organic semiconductors that have been recently developed and thereby should make a significant impact on organic memory research.
Shillcutt, Samuel D; LeFevre, Amnesty E; Fischer-Walker, Christa L; Taneja, Sunita; Black, Robert E; Mazumder, Sarmila
2017-01-01
This study evaluates the cost-effectiveness of the DAZT program for scaling up treatment of acute child diarrhea in Gujarat, India, using a net-benefit regression framework. Costs were calculated from societal and caregivers' perspectives, and effectiveness was assessed in terms of coverage of zinc and of both zinc and oral rehydration salts (ORS). Regression models were tested using simple linear regression, with a specified set of covariates, and with a specified set of covariates plus interaction terms; linear regression with endogenous treatment effects was used as the reference case. The DAZT program was cost-effective with over 95% certainty above $5.50 and $7.50 per appropriately treated child in the unadjusted and adjusted models, respectively, with specifications including interaction terms being cost-effective with 85-97% certainty. Findings from this study should be combined with other evidence when considering decisions to scale up programs such as the DAZT program to promote the use of ORS and zinc to treat child diarrhea.
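For readers unfamiliar with net-benefit regression, the usual construction computes an individual net monetary benefit NB_i = λ·E_i − C_i at a chosen willingness-to-pay λ and regresses it on a program indicator; the sketch below illustrates this with synthetic data and is not the study's models or data:

import numpy as np

# Illustrative sketch of net-benefit regression (synthetic data; not the DAZT analysis).
# For each child, net benefit NB_i = lam * E_i - C_i, where E_i is 1 if appropriately
# treated and 0 otherwise, and C_i is the observed cost. Regressing NB on a program
# indicator gives the incremental net benefit; the program is cost-effective at lam if > 0.
rng = np.random.default_rng(1)
n = 1000
program = rng.integers(0, 2, n)                       # 1 = program area, 0 = comparison
treated = rng.random(n) < (0.45 + 0.25 * program)     # coverage higher under the program
cost = 3.0 + 2.0 * program + rng.normal(0.0, 1.0, n)  # cost per child, in dollars

for lam in (2.0, 5.5, 7.5, 10.0):                     # willingness to pay per treated child
    nb = lam * treated - cost
    X = np.column_stack([np.ones(n), program])
    beta = np.linalg.lstsq(X, nb, rcond=None)[0]      # simple linear regression
    print(f"lambda = {lam:5.1f}: incremental net benefit = {beta[1]:+.2f}")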
Fulfilling the promise of holographic optical elements
NASA Astrophysics Data System (ADS)
Moss, Gaylord E.
1990-05-01
Consider the whole class of holographic optical elements which either contain pictorial image information or have the ability to modify wavefronts. Even after many years of development, there are pitifully few marketable applications. The visionary promises that holography would create a revolution in the optics and display industries have not been fulfilled. Time has shown that, while it was relatively simple to dream up ideas for myriad applications, these ideas have generally not moved beyond laboratory demonstrations. Exceptions are a few items such as optical elements for supermarket scanners, head-up displays and laser diode lenses. This paper addresses: 1. The many promises of holographic elements 2. The difficulties of practical implementation 3. A reassessment of research and development priorities To give simple examples of these points, they are discussed mainly as they apply to one type of holographic application: automotive displays. These familiar displays give a clear example of both the promises and difficulties that holographic elements present in the world of high volume, low-cost production. Automotive displays could be considered as a trivial application alongside more interesting fundamental research programs or high-cost, sophisticated military applications. One might even consider "trivial" automotive displays to be a disreputable subject for serious researchers. The case is made that exactly the opposite is true. The resources for large scale development exist only in a healthy commercial market. An example is the Japanese funding of high technology through commercial product development. This has been shown to be effective in the development of other technologies, such as ceramics, semiconductors, solar cells and composite materials. In like manner, if holography is to become an economically important technology, more and more competent researchers must start looking outside the universities and military laboratories for support. They must involve themselves in some of the "trivial" commercial applications.
Carlos, Emanuel; Kiazadeh, Asal; Deuermeier, Jonas; Branquinho, Rita; Martins, Rodrigo; Fortunato, Elvira
2018-08-24
Lately, resistive switching memories (ReRAM) have been attracting a lot of attention due to their potential for fast operation, lower power consumption, and a simple fabrication process, and because they can be scaled to very small dimensions. However, most of these ReRAMs are produced by physical methods, and nowadays the industry demands more simplicity, typically associated with low cost manufacturing. As such, the ReRAMs in this work are developed from a solution-based aluminum oxide (Al2O3) using a simple combustion synthesis process. The device performance is optimized by two-stage deposition of the Al2O3 film. The resistive switching properties of the bilayer devices are reproducible with a yield of 100%. The ReRAM devices show unipolar resistive switching behavior with good endurance and retention time up to 10^5 s at 85 °C. The devices can be programmed in a multi-level cell operation mode by application of different reset voltages. Temperature analysis of various resistance states reveals a filamentary nature based on oxygen vacancies. The optimized film was stacked between ITO and indium zinc oxide, targeting a fully transparent device for applications in transparent system-on-panel technology.
Software Applications to Access Earth Science Data: Building an ECHO Client
NASA Astrophysics Data System (ADS)
Cohen, A.; Cechini, M.; Pilone, D.
2010-12-01
Historically, developing an ECHO (NASA’s Earth Observing System (EOS) ClearingHOuse) client required interaction with its SOAP API. SOAP, as a framework for web service communication, has numerous advantages for Enterprise applications and Java/C# type programming languages. However, as interest has grown in quick development cycles and more intriguing “mashups,” ECHO has seen the SOAP API lose its appeal. In order to address these changing needs, ECHO has introduced two new interfaces facilitating simple access to its metadata holdings. The first interface is built upon the OpenSearch format and ESIP Federated Search framework. The second interface is built upon the Representational State Transfer (REST) architecture. Using the REST and OpenSearch APIs to access ECHO makes development with modern languages much simpler and more feasible. Client developers can leverage the simple interaction with ECHO to focus more of their time on the advanced functionality they are presenting to users. To demonstrate the simplicity of developing with the REST API, participants will be led through a hands-on experience where they will develop an ECHO client that performs the following actions: + Login + Provider discovery + Provider based dataset discovery + Dataset, Temporal, and Spatial constraint based Granule discovery + Online Data Access
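As a flavor of what such a client looks like, the sketch below issues an OpenSearch-style keyword/temporal/spatial query over HTTP and walks the results; the endpoint URL, parameter names, and response structure are hypothetical placeholders rather than the actual ECHO interface:

# Minimal sketch of the kind of client the hands-on session builds: issue an
# OpenSearch-style keyword/temporal/spatial query over HTTP and walk the results.
# The endpoint URL, parameter names, and response fields below are hypothetical
# placeholders, not the actual ECHO REST/OpenSearch interface.
import json
import urllib.parse
import urllib.request

BASE = "https://example.gov/opensearch/granules"        # placeholder endpoint
params = {
    "keyword": "sea surface temperature",
    "startTime": "2010-01-01T00:00:00Z",                 # temporal constraint
    "endTime": "2010-01-31T23:59:59Z",
    "boundingBox": "-180,-90,180,90",                    # spatial constraint (W,S,E,N)
    "pageSize": "10",
}

url = BASE + "?" + urllib.parse.urlencode(params)
with urllib.request.urlopen(url) as response:            # fails for the placeholder host
    results = json.load(response)

for entry in results.get("entries", []):                  # response structure is assumed
    print(entry.get("title"), entry.get("downloadUrl"))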
Applying Jlint to Space Exploration Software
NASA Technical Reports Server (NTRS)
Artho, Cyrille; Havelund, Klaus
2004-01-01
Java is a very successful programming language which is also becoming widespread in embedded systems, where software correctness is critical. Jlint is a simple but highly efficient static analyzer that checks a Java program for several common errors, such as null pointer exceptions and overflow errors. It also includes checks for multi-threading problems, such as deadlocks and data races. The case study described here shows the effectiveness of Jlint in finding such errors; analysis of the false positives in the multi-threading warnings gives an insight into design patterns commonly used in multi-threaded code. The results show that a few analysis techniques are sufficient to avoid almost all false positives. These techniques include investigating all possible callers and a few code idioms. Verifying the correct application of these patterns is still crucial, because their correct usage is not trivial.
Greenhouse effect simulator - An educational application
NASA Astrophysics Data System (ADS)
Machado, Alan Freitas; Viveiros, Bruno Martins; da Silva, Claudio Elias
2016-12-01
Using the program "Modellus", we intend to create a simple simulation to show the impacts that the Greenhouse Effect might have, in a didactic and friendly way, in order to present these notions to high and middle school students. To do so, we created a program that simulates a sweep through the Troposphere and creates two lines in a graph, one showing the behavior of temperature under normal conditions, and the other showing how the temperature behaves in the presence of an excess of Greenhouse gases. The main purpose of the project is to use the model in schools and try to make kids more conscious of their roles in our society, showing them the consequences of the tendency of our acts and stimulating them to be more proactive in changing the future.
A simple tool for stereological assessment of digital images: the STEPanizer.
Tschanz, S A; Burri, P H; Weibel, E R
2011-07-01
STEPanizer is an easy-to-use computer-based software tool for the stereological assessment of digitally captured images from all kinds of microscopical (LM, TEM, LSM) and macroscopical (radiology, tomography) imaging modalities. The program design focuses on providing the user with a defined workflow adapted to most basic stereological tasks. The software is compact, that is, user friendly without being bulky. STEPanizer comprises the creation of test systems, the appropriate display of digital images with superimposed test systems, a scaling facility, a counting module and an export function for the transfer of results to spreadsheet programs. Here we describe the major workflow of the tool, illustrating the application on two examples from transmission electron microscopy and light microscopy, respectively. © 2011 The Authors Journal of Microscopy © 2011 Royal Microscopical Society.
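The counting operation at the heart of such tools is easy to show directly; the sketch below estimates an area (and, by the Delesse principle, volume) fraction by overlaying a regular grid of test points on a binary image, as generic point-counting stereology rather than STEPanizer's own code:

import numpy as np

# Generic point-counting stereology (not STEPanizer's code): overlay a regular grid of
# test points on a binary image of the structure of interest and estimate its area
# (and, by the Delesse principle, volume) fraction as hits / total points.
def point_count_fraction(binary_image, spacing=20):
    rows = np.arange(spacing // 2, binary_image.shape[0], spacing)
    cols = np.arange(spacing // 2, binary_image.shape[1], spacing)
    hits = sum(bool(binary_image[r, c]) for r in rows for c in cols)
    total = len(rows) * len(cols)
    return hits / total, total

# Synthetic test image: a filled disc occupying ~pi/16 ~ 19.6% of the frame.
yy, xx = np.mgrid[0:1000, 0:1000]
disc = (yy - 500) ** 2 + (xx - 500) ** 2 < 250 ** 2

fraction, n_points = point_count_fraction(disc)
print(f"estimated fraction {fraction:.3f} from {n_points} test points (true ~0.196)")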
NASA Astrophysics Data System (ADS)
Iwasawa, Masaki; Tanikawa, Ataru; Hosono, Natsuki; Nitadori, Keigo; Muranushi, Takayuki; Makino, Junichiro
2016-08-01
We present the basic idea, implementation, measured performance, and performance model of FDPS (Framework for Developing Particle Simulators). FDPS is an application-development framework which helps researchers to develop simulation programs using particle methods for large-scale distributed-memory parallel supercomputers. A particle-based simulation program for distributed-memory parallel computers needs to perform domain decomposition, exchange of particles which are not in the domain of each computing node, and gathering of the particle information in other nodes which are necessary for interaction calculation. Also, even if distributed-memory parallel computers are not used, in order to reduce the amount of computation, algorithms such as the Barnes-Hut tree algorithm or the Fast Multipole Method should be used in the case of long-range interactions. For short-range interactions, some methods to limit the calculation to neighbor particles are required. FDPS provides all of these functions which are necessary for efficient parallel execution of particle-based simulations as "templates," which are independent of the actual data structure of particles and the functional form of the particle-particle interaction. By using FDPS, researchers can write their programs with the amount of work necessary to write a simple, sequential and unoptimized program of O(N^2) calculation cost, and yet the program, once compiled with FDPS, will run efficiently on large-scale parallel supercomputers. A simple gravitational N-body program can be written in around 120 lines. We report the actual performance of these programs and the performance model. The weak scaling performance is very good, and almost linear speed-up was obtained for up to the full system of the K computer. The minimum calculation time per timestep is in the range of 30 ms (N = 10^7) to 300 ms (N = 10^9). These are currently limited by the time for the calculation of the domain decomposition and communication necessary for the interaction calculation. We discuss how we can overcome these bottlenecks.
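To illustrate the kind of simple, sequential, unoptimized O(N^2) particle loop that a user starts from (written in Python purely for illustration, whereas FDPS provides these facilities as compiled templates), a direct-summation gravity step looks like this:

import numpy as np

# A direct-summation (O(N^2)) gravitational kick of the sort a user writes before a
# framework like FDPS takes over domain decomposition, particle exchange, and tree
# algorithms. Python is used here only for illustration; eps is a softening length.
def direct_accelerations(pos, mass, eps=1.0e-2):
    n = len(mass)
    acc = np.zeros_like(pos)
    for i in range(n):                        # sequential, unoptimized double loop
        for j in range(n):
            if i == j:
                continue
            dr = pos[j] - pos[i]
            r2 = dr @ dr + eps * eps
            acc[i] += mass[j] * dr / r2**1.5  # G = 1 units
    return acc

rng = np.random.default_rng(2)
pos = rng.normal(size=(100, 3))
vel = np.zeros((100, 3))
mass = np.full(100, 1.0 / 100)

dt = 1.0e-3                                   # one leapfrog (kick-drift-kick) step
acc = direct_accelerations(pos, mass)
vel += 0.5 * dt * acc
pos += dt * vel
vel += 0.5 * dt * direct_accelerations(pos, mass)
print("mean |a| after one step:", np.mean(np.linalg.norm(acc, axis=1)))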
Fan, Daoqing; Shang, Changshuai; Gu, Wenling; Wang, Erkang; Dong, Shaojun
2017-08-09
Glutathione (GSH) plays crucial roles in various biological functions, and alterations in its levels have been linked to a variety of diseases. Herein, we expand, for the first time, the application of the oxidase-like property of the MnO2 nanosheet (MnO2 NS) to fluorescent substrates of peroxidase. Different from previously reported fluorescence quenching phenomena, we found that MnO2 NS could not only largely quench the fluorescence of highly fluorescent Scopoletin (SC) but also surprisingly enhance that of nonfluorescent Amplex Red (AR) via an oxidation reaction. If MnO2 NS is premixed with GSH, it will be reduced to Mn2+ and lose the oxidase-like property, accompanied by a subsequent increase in SC's fluorescence and decrease in AR's. On the basis of the above mechanism, we construct the first MnO2 NS-based ratiometric fluorescent sensor for ultrasensitive and selective detection of GSH. Notably, this ratiometric sensor is programmed by a cascade logic circuit (an INHIBIT gate cascaded with a 1-to-2 decoder), and a linear relationship between the ratiometric fluorescent intensities of the two substrates and the logarithm of the GSH concentration is obtained. The detection limit of GSH is as low as 6.7 nM, which is much lower than that of previous ratiometric fluorescent sensors and the lowest among MnO2 NS-based fluorescent GSH sensors reported so far. Furthermore, this sensor is simple, label-free, and low-cost; it also presents excellent applicability in human serum samples.
Integration of Space Weather Forecasts into Space Protection
NASA Astrophysics Data System (ADS)
Reeves, G.
2012-09-01
How would the US respond to a clandestine attack that disabled one of our satellites? How would we know that it was an attack, not a natural failure? The goal of space weather programs as applied to space protection is simple: provide a rapid and reliable assessment of the probability that a satellite or system failure was caused by the space environment. Achieving that goal is not as simple. However, great strides are being made on a number of fronts. We will report on recent successes in providing rapid, automated anomaly/attack assessment for the penetrating radiation environment in the Earth's radiation belts. We have previously reported on the Dynamic Radiation Environment Assimilation Model (DREAM) that was developed at Los Alamos National Laboratory to assess hazards posed by the natural and nuclear radiation belts. This year we will report on recent developments that are moving this program from the research, test, and evaluation phases to real-time implementation and application. We will discuss the challenges of leveraging space environment data sets for applications that are beyond the scope of mission requirements, the challenges of moving data from where they exist to where they are needed, the challenges of turning data into actionable information, and how those challenges were overcome. We will discuss the state-of-the-art as it exists in 2012, including the new capabilities that have been enabled and the limitations that still exist. We will also discuss how currently untapped data resources could advance the state-of-the-art and the future steps for implementing automatic real-time anomaly forensics.
Information Content in Radio Waves: Student Investigations in Radio Science
NASA Astrophysics Data System (ADS)
Jacobs, K.; Scaduto, T.
2013-12-01
We describe an inquiry-based instructional unit on information content in radio waves, created in the summer of 2013 as part of a MIT Haystack Observatory (Westford, MA) NSF Research Experiences for Teachers (RET) program. This topic is current and highly relevant, addressing science and technical aspects from radio astronomy, geodesy, and atmospheric research areas as well as Next Generation Science Standards (NGSS). Projects and activities range from simple classroom demonstrations and group investigations, to long term research projects incorporating data acquisition from both student-built instrumentation as well as online databases. Each of the core lessons is applied to one of the primary research centers at Haystack through an inquiry project that builds on previously developed units through the MIT Haystack RET program. In radio astronomy, students investigate the application of a simple and inexpensive software defined radio chip (RTL-SDR) for use in systems implementing a small and very small radio telescope (SRT and VSRT). Both of these systems allow students to explore fundamental principles of radio waves and interferometry as applied to radio astronomy. In ionospheric research, students track solar storms from the initial coronal mass ejection (using Solar Dynamics Observatory images) to the resulting variability in total electron density concentrations using data from the community standard Madrigal distributed database system maintained by MIT Haystack. Finally, students get to explore very long-baseline interferometry as it is used in geodetic studies by measuring crustal plate displacements over time. Alignment to NextGen standards is provided for each lesson and activity with emphasis on HS-PS4 'Waves and Their Applications in Technologies for Information Transfer'.
Application-Level Interoperability Across Grids and Clouds
NASA Astrophysics Data System (ADS)
Jha, Shantenu; Luckow, Andre; Merzky, Andre; Erdely, Miklos; Sehgal, Saurabh
Application-level interoperability is defined as the ability of an application to utilize multiple distributed heterogeneous resources. Such interoperability is becoming increasingly important with increasing volumes of data, multiple sources of data as well as resource types. The primary aim of this chapter is to understand different ways in which application-level interoperability can be provided across distributed infrastructure. We achieve this by (i) using the canonical wordcount application, based on an enhanced version of MapReduce that scales out across clusters, clouds, and HPC resources, (ii) establishing how SAGA enables the execution of the wordcount application using MapReduce and other programming models such as Sphere concurrently, and (iii) demonstrating the scale-out of ensemble-based biomolecular simulations across multiple resources. We show user-level control of the relative placement of compute and data and also provide simple performance measures and analysis of SAGA-MapReduce when using multiple, different, heterogeneous infrastructures concurrently for the same problem instance. Finally, we discuss Azure and some of the system-level abstractions that it provides and show how it is used to support ensemble-based biomolecular simulations.
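For readers unfamiliar with the canonical wordcount example, its map and reduce stages amount to the following generic single-machine sketch (not the SAGA-MapReduce implementation, which distributes exactly these stages across heterogeneous resources):

from collections import defaultdict
from itertools import chain

# The canonical wordcount expressed as explicit map and reduce stages (a generic,
# single-machine sketch; a distributed MapReduce runs these same stages in parallel
# across many workers).
def map_phase(document):
    return [(word.lower(), 1) for word in document.split()]

def reduce_phase(pairs):
    counts = defaultdict(int)
    for word, n in pairs:                 # group by key and sum the values
        counts[word] += n
    return dict(counts)

documents = ["the quick brown fox", "the lazy dog", "the fox jumps"]
mapped = chain.from_iterable(map_phase(d) for d in documents)
print(reduce_phase(mapped))               # {'the': 3, 'fox': 2, ...}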
Process control charts in infection prevention: Make it simple to make it happen.
Wiemken, Timothy L; Furmanek, Stephen P; Carrico, Ruth M; Mattingly, William A; Persaud, Annuradha K; Guinn, Brian E; Kelley, Robert R; Ramirez, Julio A
2017-03-01
Quality improvement is central to Infection Prevention and Control (IPC) programs. Challenges may occur when applying quality improvement methodologies like process control charts, often due to the limited exposure of typical infection preventionists (IPs) to these methods. Because of this, our team created an open-source database with a process control chart generator for IPC programs. The objectives of this report are to outline the development of the application and to demonstrate its application using simulated data. We used Research Electronic Data Capture (REDCap Consortium, Vanderbilt University, Nashville, TN), R (R Foundation for Statistical Computing, Vienna, Austria), and R Studio Shiny (R Foundation for Statistical Computing) to create an open source data collection system with automated process control chart generation. We used simulated data to test and visualize both in-control and out-of-control processes for commonly used metrics in IPC programs. The R code for implementing the control charts and Shiny application can be found on our Web site (https://github.com/ul-research-support/spcapp). Screen captures of the workflow and simulated data indicating both common cause and special cause variation are provided. Process control charts can be easily developed based on individual facility needs using freely available software. By providing our work free to all interested parties, we hope that others will be able to harness the power and ease of use of the application for improving the quality of care and patient safety in their facilities. Copyright © 2017 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
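To make the control-chart idea concrete, a generic Shewhart-style u-chart on simulated monthly infection counts can be computed as below; this is an illustration only, not the R/Shiny code distributed by the authors:

import numpy as np

# Generic u-chart calculation on simulated monthly infection counts (illustrative only;
# the authors' own R/Shiny implementation is linked in the abstract). The u-chart plots
# infections per device-day with limits u_bar +/- 3*sqrt(u_bar / n_i).
rng = np.random.default_rng(3)
device_days = rng.integers(800, 1200, size=24)           # denominator for each month
infections = rng.poisson(0.004 * device_days)             # simulated in-control counts
infections[-1] += 8                                        # inject a special-cause signal

u = infections / device_days
u_bar = infections.sum() / device_days.sum()               # center line
for month, (ui, ni) in enumerate(zip(u, device_days), start=1):
    ucl = u_bar + 3 * np.sqrt(u_bar / ni)
    lcl = max(u_bar - 3 * np.sqrt(u_bar / ni), 0.0)
    flag = "  <- special-cause variation" if (ui > ucl or ui < lcl) else ""
    print(f"month {month:2d}: rate {ui:.4f}  limits [{lcl:.4f}, {ucl:.4f}]{flag}")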
A new version of Visual tool for estimating the fractal dimension of images
NASA Astrophysics Data System (ADS)
Grossu, I. V.; Felea, D.; Besliu, C.; Jipa, Al.; Bordeianu, C. C.; Stan, E.; Esanu, T.
2010-04-01
This work presents a new version of a Visual Basic 6.0 application for estimating the fractal dimension of images (Grossu et al., 2009 [1]). The earlier version was limited to bi-dimensional sets of points, stored in bitmap files. The application was extended for working also with comma separated values files and three-dimensional images. New version program summary: Program title: Fractal Analysis v02 Catalogue identifier: AEEG_v2_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEEG_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 9999 No. of bytes in distributed program, including test data, etc.: 4 366 783 Distribution format: tar.gz Programming language: MS Visual Basic 6.0 Computer: PC Operating system: MS Windows 98 or later RAM: 30 M Classification: 14 Catalogue identifier of previous version: AEEG_v1_0 Journal reference of previous version: Comput. Phys. Comm. 180 (2009) 1999 Does the new version supersede the previous version?: Yes Nature of problem: Estimating the fractal dimension of 2D and 3D images. Solution method: Optimized implementation of the box-counting algorithm. Reasons for new version: The previous version was limited to bitmap image files. The new application was extended in order to work with objects stored in comma separated values (csv) files. The main advantages are: easier integration with other applications (csv is a widely used, simple text file format); fewer resources consumed and improved performance (only the information of interest, the "black points", is stored); higher resolution (the point coordinates are loaded into Visual Basic double variables [2]); possibility of storing three-dimensional objects (e.g. the 3D Sierpinski gasket). In this version the optimized box-counting algorithm [1] was extended to the three-dimensional case. Summary of revisions: The application interface was changed from SDI (single document interface) to MDI (multi-document interface). One form was added in order to provide a graphical user interface for the new functionalities (fractal analysis of 2D and 3D images stored in csv files). Additional comments: User friendly graphical interface; easy deployment mechanism. Running time: To a first approximation, the algorithm is linear. References: [1] I.V. Grossu, C. Besliu, M.V. Rusu, Al. Jipa, C.C. Bordeianu, D. Felea, Comput. Phys. Comm. 180 (2009) 1999-2001. [2] F. Balena, Programming Microsoft Visual Basic 6.0, Microsoft Press, US, 1999.
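The box-counting algorithm at the core of the program is straightforward to sketch; the version below (in Python rather than Visual Basic, purely for illustration) counts occupied boxes over a series of box sizes and fits the slope of log N(s) against log(1/s):

import numpy as np

# Generic box-counting estimate of fractal dimension (sketched in Python rather than
# Visual Basic, purely for illustration): count the boxes of side s that contain at
# least one "black" point, then fit log(N(s)) against log(1/s).
def box_counting_dimension(points, sizes=(1/2, 1/4, 1/8, 1/16, 1/32, 1/64)):
    points = np.asarray(points, dtype=float)            # coordinates scaled to [0, 1)^d
    counts = []
    for s in sizes:
        boxes = {tuple(idx) for idx in np.floor(points / s).astype(int)}
        counts.append(len(boxes))
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# Test set: a Sierpinski-like point cloud generated with the chaos game (dimension ~1.585).
rng = np.random.default_rng(4)
vertices = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 0.866]])
p = np.array([0.1, 0.1])
pts = []
for _ in range(20000):
    p = (p + vertices[rng.integers(3)]) / 2.0
    pts.append(p.copy())

print(f"estimated box-counting dimension: {box_counting_dimension(pts):.3f}")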
Inverse kinematic solution for near-simple robots and its application to robot calibration
NASA Technical Reports Server (NTRS)
Hayati, Samad A.; Roston, Gerald P.
1986-01-01
This paper provides an inverse kinematic solution for a class of robot manipulators called near-simple manipulators. The kinematics of these manipulators differ from those of simple robots by small parameter variations. Although most robots are by design simple, in practice, due to manufacturing tolerances, every robot is near-simple. The method in this paper gives an approximate inverse kinematics solution for real-time applications based on the nominal solution for these robots. The validity of the results is tested both by a simulation study and by applying the algorithm to a PUMA robot.
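As a generic illustration of the idea of refining a nominal inverse-kinematics solution when the true kinematics differ by small parameter variations (a first-order Jacobian correction on a planar two-link arm, not the paper's specific algorithm), consider:

import numpy as np

# Generic illustration (not the paper's algorithm): start from the closed-form inverse
# kinematics of the nominal ("simple") arm, then apply a few cheap Jacobian correction
# steps using the perturbed ("near-simple") kinematics, as a manufactured robot would have.
L_NOM = np.array([1.00, 1.00])        # nominal link lengths
L_REAL = np.array([1.01, 0.99])       # actual link lengths (small variations)

def fk(q, L):                         # forward kinematics of the planar two-link arm
    return np.array([L[0]*np.cos(q[0]) + L[1]*np.cos(q[0]+q[1]),
                     L[0]*np.sin(q[0]) + L[1]*np.sin(q[0]+q[1])])

def ik_nominal(target, L=L_NOM):      # closed-form IK for the nominal arm
    x, y = target
    c2 = (x*x + y*y - L[0]**2 - L[1]**2) / (2*L[0]*L[1])
    q2 = np.arccos(np.clip(c2, -1.0, 1.0))
    q1 = np.arctan2(y, x) - np.arctan2(L[1]*np.sin(q2), L[0] + L[1]*np.cos(q2))
    return np.array([q1, q2])

def jacobian(q, L):
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0]+q[1]), np.cos(q[0]+q[1])
    return np.array([[-L[0]*s1 - L[1]*s12, -L[1]*s12],
                     [ L[0]*c1 + L[1]*c12,  L[1]*c12]])

target = np.array([1.2, 0.8])
q = ik_nominal(target)                                 # nominal solution
for _ in range(3):                                     # a few correction steps
    q += np.linalg.solve(jacobian(q, L_REAL), target - fk(q, L_REAL))

print("residual with nominal IK :", np.linalg.norm(target - fk(ik_nominal(target), L_REAL)))
print("residual after correction:", np.linalg.norm(target - fk(q, L_REAL)))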
Using Probabilistic Information in Solving Resource Allocation Problems for a Decentralized Firm
1978-09-01
deterministic equivalent form of HIQ’s problem (5) by an approach similar to the one used in stochastic programming with simple recourse. See Ziemba [38) or, in...1964). 38. Ziemba , W.T., "Stochastic Programs with Simple Recourse," Technical Report 72-15, Stanford University, Department of Operations Research
NASA Astrophysics Data System (ADS)
Terry, N.; Day-Lewis, F. D.; Werkema, D. D.; Lane, J. W., Jr.
2017-12-01
Soil moisture is a critical parameter for agriculture, water supply, and management of landfills. Whereas direct data (as from TDR or soil moisture probes) provide localized point scale information, it is often more desirable to produce 2D and/or 3D estimates of soil moisture from noninvasive measurements. To this end, geophysical methods for indirectly assessing soil moisture have great potential, yet are limited in terms of quantitative interpretation due to uncertainty in petrophysical transformations and inherent limitations in resolution. Simple tools to produce soil moisture estimates from geophysical data are lacking. We present a new standalone program, MoisturEC, for estimating moisture content distributions from electrical conductivity data. The program uses an indicator kriging method within a geostatistical framework to incorporate hard data (as from moisture probes) and soft data (as from electrical resistivity imaging or electromagnetic induction) to produce estimates of moisture content and uncertainty. The program features data visualization and output options as well as a module for calibrating electrical conductivity with moisture content to improve estimates. The user-friendly program is written in R - a widely used, cross-platform, open source programming language that lends itself to further development and customization. We demonstrate use of the program with a numerical experiment as well as a controlled field irrigation experiment. Results produced from the combined geostatistical framework of MoisturEC show improved estimates of moisture content compared to those generated from individual datasets. This application provides a convenient and efficient means for integrating various data types and has broad utility to soil moisture monitoring in landfills, agriculture, and other problems.
A new programming metaphor for image processing procedures
NASA Technical Reports Server (NTRS)
Smirnov, O. M.; Piskunov, N. E.
1992-01-01
Most image processing systems, besides an Application Program Interface (API) which lets users write their own image processing programs, also feature a higher level of programmability. Traditionally, this is a command or macro language, which can be used to build large procedures (scripts) out of simple programs or commands. This approach, a legacy of the teletypewriter, has serious drawbacks. A command language is clumsy when (and if!) it attempts to utilize the capabilities of a multitasking or multiprocessor environment, it is barely adequate for real-time data acquisition and processing, it has a fairly steep learning curve, and the user interface is very inefficient, especially when compared to a graphical user interface (GUI) that systems running under X11 or Windows should otherwise be able to provide. All these difficulties stem from one basic problem: a command language is not a natural metaphor for an image processing procedure. A more natural metaphor, an image processing factory, is described in detail. A factory is a set of programs (applications) that execute separate operations on images, connected by pipes that carry data (images and parameters) between them. The programs function concurrently, processing images as they arrive along pipes, and querying the user for whatever other input they need. From the user's point of view, programming (constructing) factories is a lot like playing with LEGO blocks, much more intuitive than writing scripts. Focus is on some of the difficulties of implementing factory support, most notably the design of an appropriate API. It also shows that factories retain all the functionality of a command language (including loops and conditional branches), while suffering from none of the drawbacks outlined above. Other benefits of factory programming include self-tuning factories and the process of encapsulation, which lets a factory take the shape of a standard application both from the system and the user's point of view, and thus be used as a component of other factories. A bare-bones prototype of factory programming was implemented under the PcIPS image processing system, and a complete version (on a multitasking platform) is under development.
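As a rough analogue of the factory metaphor, independent stages running concurrently and connected by pipes that carry data between them can be sketched with queues; this illustrates the concept only and is not the PcIPS prototype:

from multiprocessing import Process, Queue

# Rough analogue of the "factory" metaphor: independent stages running concurrently,
# connected by pipes (queues) that carry data between them. Conceptual illustration
# only; not the PcIPS prototype described in the text.
SENTINEL = None

def scale(image):                 # "program" 1: brighten an image (a list of pixels)
    return [p * 2 for p in image]

def clip(image):                  # "program" 2: clip pixel values to 8 bits
    return [min(p, 255) for p in image]

def stage(func, inbox, outbox):   # run one program, reading and writing its pipes
    while (item := inbox.get()) is not SENTINEL:
        outbox.put(func(item))
    outbox.put(SENTINEL)

if __name__ == "__main__":
    q_in, q_mid, q_out = Queue(), Queue(), Queue()
    workers = [Process(target=stage, args=(scale, q_in, q_mid)),
               Process(target=stage, args=(clip, q_mid, q_out))]
    for w in workers:
        w.start()
    for image in ([10, 100, 200], [3, 150, 40]):      # feed data into the factory
        q_in.put(image)
    q_in.put(SENTINEL)
    while (result := q_out.get()) is not SENTINEL:
        print(result)
    for w in workers:
        w.join()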
Franc, Jeffrey Michael; Ingrassia, Pier Luigi; Verde, Manuela; Colombo, Davide; Della Corte, Francesco
2015-02-01
Surge capacity, or the ability to manage an extraordinary volume of patients, is fundamental for hospital management of mass-casualty incidents. However, quantification of surge capacity is difficult and no universal standard for its measurement has emerged, nor has a standardized statistical method been advocated. As mass-casualty incidents are rare, simulation may represent a viable alternative to measure surge capacity. Hypothesis/Problem The objective of the current study was to develop a statistical method for the quantification of surge capacity using a combination of computer simulation and simple process-control statistical tools. Length-of-stay (LOS) and patient volume (PV) were used as metrics. The use of this method was then demonstrated on a subsequent computer simulation of an emergency department (ED) response to a mass-casualty incident. In the derivation phase, 357 participants in five countries performed 62 computer simulations of an ED response to a mass-casualty incident. Benchmarks for ED response were derived from these simulations, including LOS and PV metrics for triage, bed assignment, physician assessment, and disposition. In the application phase, 13 students of the European Master in Disaster Medicine (EMDM) program completed the same simulation scenario, and the results were compared to the standards obtained in the derivation phase. Patient-volume metrics included number of patients to be triaged, assigned to rooms, assessed by a physician, and disposed. Length-of-stay metrics included median time to triage, room assignment, physician assessment, and disposition. Simple graphical methods were used to compare the application phase group to the derived benchmarks using process-control statistical tools. The group in the application phase failed to meet the indicated standard for LOS from admission to disposition decision. This study demonstrates how simulation software can be used to derive values for objective benchmarks of ED surge capacity using PV and LOS metrics. These objective metrics can then be applied to other simulation groups using simple graphical process-control tools to provide a numeric measure of surge capacity. Repeated use in simulations of actual EDs may represent a potential means of objectively quantifying disaster management surge capacity. It is hoped that the described statistical method, which is simple and reusable, will be useful for investigators in this field to apply to their own research.
Java 3D Interactive Visualization for Astrophysics
NASA Astrophysics Data System (ADS)
Chae, K.; Edirisinghe, D.; Lingerfelt, E. J.; Guidry, M. W.
2003-05-01
We are developing a series of interactive 3D visualization tools that employ the Java 3D API. We have applied this approach initially to a simple 3-dimensional galaxy collision model (restricted 3-body approximation), with quite satisfactory results. Running either as an applet under Web browser control, or as a Java standalone application, this program permits real-time zooming, panning, and 3-dimensional rotation of the galaxy collision simulation under user mouse and keyboard control. We shall also discuss applications of this technology to 3-dimensional visualization for other problems of astrophysical interest such as neutron star mergers and the time evolution of element/energy production networks in X-ray bursts. *Managed by UT-Battelle, LLC, for the U.S. Department of Energy under contract DE-AC05-00OR22725.
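For context on what a restricted three-body galaxy-collision model computes (massless test stars orbiting a host galaxy, perturbed by a second passing mass), a minimal numerical sketch follows; it is illustrative only and unrelated to the Java 3D code itself:

import numpy as np

# Minimal restricted three-body sketch (illustrative; not the Java 3D program itself):
# massless test stars orbit a host galaxy of mass M1 and are perturbed by an intruder
# of mass M2 on a prescribed path. G = 1 units, 2D for brevity; leapfrog integration.
M1, M2, DT, STEPS = 1.0, 0.5, 0.01, 4000

def acceleration(pos, intruder):
    a = np.zeros_like(pos)
    for center, mass in ((np.zeros(2), M1), (intruder, M2)):
        dr = center - pos
        r = np.linalg.norm(dr, axis=1, keepdims=True) + 1e-3   # softened
        a += mass * dr / r**3
    return a

rng = np.random.default_rng(5)
radii = rng.uniform(0.5, 1.5, 200)
angles = rng.uniform(0.0, 2.0 * np.pi, 200)
stars = np.column_stack([radii * np.cos(angles), radii * np.sin(angles)])
speeds = np.sqrt(M1 / radii)                                   # circular orbits around M1
vel = np.column_stack([-speeds * np.sin(angles), speeds * np.cos(angles)])

intruder = np.array([-8.0, 2.0])                               # intruder starts far away
intruder_vel = np.array([0.6, 0.0])                            # and sweeps past the disk

for _ in range(STEPS):                                         # leapfrog (kick-drift-kick)
    vel += 0.5 * DT * acceleration(stars, intruder)
    stars += DT * vel
    intruder = intruder + DT * intruder_vel
    vel += 0.5 * DT * acceleration(stars, intruder)

print("fraction of stars pulled beyond r = 3:", np.mean(np.linalg.norm(stars, axis=1) > 3.0))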
CommServer: A Communications Manager For Remote Data Sites
NASA Astrophysics Data System (ADS)
Irving, K.; Kane, D. L.
2012-12-01
CommServer is a software system that manages making connections to remote data-gathering stations, providing a simple network interface to client applications. The client requests a connection to a site by name, and the server establishes the connection, providing a bidirectional channel between the client and the target site if successful. CommServer was developed to manage networks of FreeWave serial data radios with multiple data sites, repeaters, and network-accessed base stations, and has been in continuous operational use for several years. Support for Iridium modems using RUDICS will be added soon, and no changes to the application interface are anticipated. CommServer is implemented on Linux using programs written in bash shell, Python, Perl, AWK, under a set of conventions we refer to as ThinObject.
Software for determining the true displacement of faults
NASA Astrophysics Data System (ADS)
Nieto-Fuentes, R.; Nieto-Samaniego, Á. F.; Xu, S.-S.; Alaniz-Álvarez, S. A.
2014-03-01
One of the most important parameters of faults is the true (or net) displacement, which is measured by restoring two originally adjacent points, called “piercing points”, to their original positions. This measurement is not typically applicable because it is rare to observe piercing points in natural outcrops. Much more common is the measurement of the apparent displacement of a marker. Methods to calculate the true displacement of faults using descriptive geometry, trigonometry or vector algebra are common in the literature, and most of them solve a specific situation from a large number of possible combinations of the fault parameters. True displacements are not routinely calculated because doing so is a tedious and tiring task, despite their importance and the relatively simple methodology. We believe that the solution is to develop software capable of performing this work. In a previous publication, our research group proposed a method to calculate the true displacement of faults by solving most combinations of fault parameters using simple trigonometric equations. The purpose of this contribution is to present a computer program for calculating the true displacement of faults. The input data are the dip of the fault; the pitch angles of the markers, slickenlines and observation lines; and the marker separation. To prevent the common difficulties involved in switching between operating systems, the software is developed using the Java programming language. The computer program could be used as a tool in education and will also be useful for the calculation of the true fault displacement in geological and engineering works. The application resolves cases in which the direction of the net slip is known, which is commonly assumed to be parallel to the slickenlines. This assumption is not always valid and must be used with caution, because the slickenlines are formed during a step of the incremental displacement on the fault surface, whereas the net slip is related to the finite slip.
NASA Technical Reports Server (NTRS)
Maskew, Brian
1987-01-01
The VSAERO low order panel method formulation is described for the calculation of subsonic aerodynamic characteristics of general configurations. The method is based on piecewise constant doublet and source singularities. Two forms of the internal Dirichlet boundary condition are discussed and the source distribution is determined by the external Neumann boundary condition. A number of basic test cases are examined. Calculations are compared with higher order solutions for a number of cases. It is demonstrated that for comparable density of control points where the boundary conditions are satisfied, the low order method gives comparable accuracy to the higher order solutions. It is also shown that problems associated with some earlier low order panel methods, e.g., leakage in internal flows and junctions and also poor trailing edge solutions, do not appear for the present method. Further, the application of the Kutta conditions is extremely simple; no extra equation or trailing edge velocity point is required. The method has very low computing costs and this has made it practical for application to nonlinear problems requiring iterative solutions for wake shape and surface boundary layer effects.
SWMPr: An R Package for Retrieving, Organizing, and ...
The System-Wide Monitoring Program (SWMP) was implemented in 1995 by the US National Estuarine Research Reserve System. This program has provided two decades of continuous monitoring data at over 140 fixed stations in 28 estuaries. However, the increasing quantity of data provided by the monitoring network has complicated broad-scale comparisons between systems and, in some cases, prevented simple trend analysis of water quality parameters at individual sites. This article describes the SWMPr package that provides several functions that facilitate data retrieval, organization, and analysis of time series data in the reserve estuaries. Previously unavailable functions for estuaries are also provided to estimate rates of ecosystem metabolism using the open-water method. The SWMPr package has facilitated a cross-reserve comparison of water quality trends and links quantitative information with analysis tools that have use for more generic applications to environmental time series. The manuscript describes a software package that was recently developed to retrieve, organize, and analyze monitoring data from the National Estuarine Research Reserve System. Functions are explained in detail, including recent applications for trend analysis of ecosystem metabolism.
A high-speed linear algebra library with automatic parallelism
NASA Technical Reports Server (NTRS)
Boucher, Michael L.
1994-01-01
Parallel or distributed processing is key to getting the highest performance out of workstations. However, designing and implementing efficient parallel algorithms is difficult and error-prone. It is even more difficult to write code that is both portable to and efficient on many different computers. Finally, it is harder still to satisfy the above requirements and include the reliability and ease of use required of commercial software intended for use in a production environment. As a result, the application of parallel processing technology to commercial software has been extremely limited even though there are numerous computationally demanding programs that would significantly benefit from application of parallel processing. This paper describes DSSLIB, which is a library of subroutines that perform many of the time-consuming computations in engineering and scientific software. DSSLIB combines the high efficiency and speed of parallel computation with a serial programming model that eliminates many undesirable side-effects of typical parallel code. The result is a simple way to incorporate the power of parallel processing into commercial software without compromising maintainability, reliability, or ease of use. This gives significant advantages over less powerful non-parallel entries in the market.
ASM Conference on Prokaryotic Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kaplan, H. B.
2005-07-13
Support was provided by DOE for the 2nd ASM Conference on Prokaryotic Development. The final conference program and abstracts book is attached. The conference presentations are organized around topics that are central to the current research areas in prokaryotic development. The program starts with topics that involve relatively simple model systems and ends with systems that are more complex. The topics are: i) the cell cycle, ii) the cytoskeleton, iii) morphogenesis, iv) developmental transcription, v) signaling, vi) multicellularity, and vii) developmental diversity and symbiosis. The best-studied prokaryotic development model systems will be highlighted at the conference through research presentations by leaders in the field. Many of these systems are also model systems of relevance to the DOE mission, including carbon sequestration (Bradyrhizobium, Synechococcus), energy production (Anabaena, Rhodobacter) and bioremediation (Caulobacter, Mesorhizobium). In addition, many of the highlighted organisms have important practical applications; the actinomycetes and myxobacteria produce antimicrobials that are of commercial interest. It is certain that the cutting-edge science presented at the conference will be applicable to the large group of bacteria relevant to the DOE mission.
Micrometeoroid and Orbital Debris (MMOD) Shield Ballistic Limit Analysis Program
NASA Technical Reports Server (NTRS)
Ryan, Shannon
2013-01-01
This software implements penetration limit equations for common micrometeoroid and orbital debris (MMOD) shield configurations, windows, and thermal protection systems. Allowable MMOD risk is formulated in terms of the probability of penetration (PNP) of the spacecraft pressure hull. For calculating the risk, spacecraft geometry models, mission profiles, debris environment models, and penetration limit equations for installed shielding configurations are required. Risk assessment software such as NASA's BUMPERII is used to calculate mission PNP; however, they are unsuitable for use in shield design and preliminary analysis studies. The software defines a single equation for the design and performance evaluation of common MMOD shielding configurations, windows, and thermal protection systems, along with a description of their validity range and guidelines for their application. Recommendations are based on preliminary reviews of fundamental assumptions, and accuracy in predicting experimental impact test results. The software is programmed in Visual Basic for Applications for installation as a simple add-in for Microsoft Excel. The user is directed to a graphical user interface (GUI) that requires user inputs and provides solutions directly in Microsoft Excel workbooks.
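The risk bookkeeping behind the PNP figure can be illustrated in a few lines. The sketch below assumes the common Poisson formulation, PNP = exp(-N) with N the expected number of penetrating impacts accumulated over shield regions; the flux, area, and exposure values are hypothetical and stand in for what tools such as BUMPERII compute from environment models and ballistic limit equations.

```python
import math

# Minimal sketch of the probability-of-no-penetration (PNP) bookkeeping,
# assuming penetrating impacts follow a Poisson process: PNP = exp(-N),
# with N the expected number of penetrating impacts summed over regions.
# The region data below are hypothetical placeholders.
regions = [
    # (penetrating flux [impacts/m^2/yr], exposed area [m^2], exposure [yr])
    (1.2e-6, 18.0, 2.0),
    (4.0e-7, 35.0, 2.0),
]

expected_penetrations = sum(flux * area * years for flux, area, years in regions)
pnp = math.exp(-expected_penetrations)
print(f"N = {expected_penetrations:.3e}, PNP = {pnp:.6f}")
```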
Excel VBA for Physicists; A Primer
NASA Astrophysics Data System (ADS)
Liengme, Bernard V.
2016-11-01
This book is both an introduction and a demonstration of how Visual Basic for Applications (VBA) can greatly enhance Microsoft Excel® by giving users the ability to create their own functions within a worksheet and to create subroutines to perform repetitive actions. The book is written so readers are encouraged to experiment with VBA programming, with examples using fairly simple physics or uncomplicated mathematics such as root finding and numerical integration. A tested Excel® workbook can be downloaded for each chapter, and there is nothing to buy or install.
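The two numerical tasks named above are easy to sketch; the snippet below uses Python rather than VBA purely for illustration, with a hypothetical test function f(x) = x^2 - 2 that is not taken from the book.

```python
# Illustrative versions of two exercises of the kind the primer assigns in VBA:
# bisection root finding and trapezoid-rule integration. Python is used here
# only for illustration; the function f is a hypothetical example.
def f(x):
    return x * x - 2.0

def bisect(func, a, b, tol=1e-10):
    fa = func(a)
    while b - a > tol:
        m = 0.5 * (a + b)
        if fa * func(m) <= 0:
            b = m
        else:
            a, fa = m, func(m)
    return 0.5 * (a + b)

def trapezoid(func, a, b, n=1000):
    h = (b - a) / n
    s = 0.5 * (func(a) + func(b)) + sum(func(a + i * h) for i in range(1, n))
    return h * s

print(bisect(f, 0.0, 2.0))     # ~1.41421356, i.e. sqrt(2)
print(trapezoid(f, 0.0, 1.0))  # ~ -1.6667, integral of x^2 - 2 on [0, 1]
```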
AFM 4.0: a toolbox for DNA microarray analysis
Breitkreutz, Bobby-Joe; Jorgensen, Paul; Breitkreutz, Ashton; Tyers, Mike
2001-01-01
We have developed a series of programs, collectively packaged as Array File Maker 4.0 (AFM), that manipulate and manage DNA microarray data. AFM 4.0 is simple to use, applicable to any organism or microarray, and operates within the familiar confines of Microsoft Excel. Given a database of expression ratios, AFM 4.0 generates input files for clustering, helps prepare colored figures and Venn diagrams, and can uncover aneuploidy in yeast microarray data. AFM 4.0 should be especially useful to laboratories that do not have access to specialized commercial or in-house software. PMID:11532221
Tang, Qi-Yi; Zhang, Chuan-Xi
2013-04-01
A comprehensive but simple-to-use software package called DPS (Data Processing System) has been developed to execute a range of standard numerical analyses and operations used in experimental design, statistics and data mining. This program runs on standard Windows computers. Many of the functions are specific to entomological and other biological research and are not found in standard statistical software. This paper presents applications of DPS to experimental design, statistical analysis and data mining in entomology. © 2012 The Authors Insect Science © 2012 Institute of Zoology, Chinese Academy of Sciences.
NASA Technical Reports Server (NTRS)
1972-01-01
A long life assurance program for the development of design, process, test, and application guidelines for achieving reliable spacecraft hardware was conducted. The study approach consisted of a review of technical data performed concurrently with a survey of the aerospace industry. The data reviewed included design and operating characteristics, failure histories and solutions, and similar documents. The topics covered by the guidelines are reported. It is concluded that long life hardware is achieved through meticulous attention to many details and no simple set of rules can suffice.
Horno, J; González-Caballero, F; González-Fernández, C F
1990-01-01
Simple techniques of network thermodynamics are used to obtain the numerical solution of the Nernst-Planck and Poisson equation system. A network model for a particular physical situation, namely ionic transport through a thin membrane with simultaneous diffusion, convection and electric current, is proposed. Concentration and electric field profiles across the membrane, as well as diffusion potential, have been simulated using the electric circuit simulation program, SPICE. The method is quite general and extremely efficient, permitting treatments of multi-ion systems whatever the boundary and experimental conditions may be.
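For reference, a standard 1D statement of the equation system the network model discretizes is given below: the Nernst-Planck flux with diffusion, migration, and convection, coupled to Poisson's equation. The notation is conventional, not the authors' own.

```latex
% Standard 1D Nernst-Planck flux (diffusion, migration, convection) coupled
% to Poisson's equation; conventional symbols, not the authors' notation.
J_i = -D_i \left( \frac{\partial c_i}{\partial x}
      + \frac{z_i F}{R T}\, c_i \frac{\partial \phi}{\partial x} \right) + c_i v,
\qquad
\frac{\partial^2 \phi}{\partial x^2} = -\frac{F}{\varepsilon} \sum_i z_i c_i .
```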
Software design and documentation language, revision 1
NASA Technical Reports Server (NTRS)
Kleine, H.
1979-01-01
The Software Design and Documentation Language (SDDL) developed to provide an effective communications medium to support the design and documentation of complex software applications is described. Features of the system include: (1) a processor which can convert design specifications into an intelligible, informative machine-reproducible document; (2) a design and documentation language with forms and syntax that are simple, unrestrictive, and communicative; and (3) methodology for effective use of the language and processor. The SDDL processor is written in the SIMSCRIPT II programming language and is implemented on the UNIVAC 1108, the IBM 360/370, and Control Data machines.
Functional programming interpreter. M. S. thesis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robison, A.D.
1987-03-01
Functional Programming (FP) [BAC87] is an alternative to conventional imperative programming languages. This thesis describes an FP interpreter implementation. Superficially, FP appears to be a simple but very inefficient language. Its simplicity, however, allows it to be interpreted quickly. Much of the inefficiency can be removed by simple interpreter techniques. This thesis describes the Illinois Functional Programming (IFP) interpreter, an interactive functional programming implementation which runs under both MS-DOS and UNIX. The IFP interpreter allows functions to be created, executed, and debugged in an environment very similar to UNIX. IFP's speed is competitive with other interpreted languages such as BASIC.
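FP programs are built from a small set of combining forms; the sketch below imitates a few of them (composition, construction, apply-to-all, insert) in Python to convey the flavor of function-level programming. It is only an analogy and is not the IFP interpreter's actual syntax or semantics.

```python
# A loose imitation of Backus-style FP combining forms (composition,
# construction, apply-to-all, insert/reduce). This conveys the programming
# style only; it is not the IFP interpreter's syntax.
from functools import reduce

compose = lambda f, g: (lambda x: f(g(x)))                 # f . g
construct = lambda *fs: (lambda x: [f(x) for f in fs])     # [f1, ..., fn]
apply_to_all = lambda f: (lambda xs: [f(x) for x in xs])   # "alpha f"
insert = lambda f: (lambda xs: reduce(f, xs))              # "/f"

add = lambda a, b: a + b
square = lambda x: x * x

# "sum of squares" as a function-level definition: (/add) . (alpha square)
sum_of_squares = compose(insert(add), apply_to_all(square))
print(sum_of_squares([1, 2, 3, 4]))    # 30
print(construct(min, max)([5, 2, 9]))  # [2, 9]
```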
Quantum Accelerators for High-performance Computing Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Humble, Travis S.; Britt, Keith A.; Mohiyaddin, Fahd A.
We define some of the programming and system-level challenges facing the application of quantum processing to high-performance computing. Alongside barriers to physical integration, prominent differences in the execution of quantum and conventional programs challenge the intersection of these computational models. Following a brief overview of the state of the art, we discuss recent advances in programming and execution models for hybrid quantum-classical computing. We discuss a novel quantum-accelerator framework that uses specialized kernels to offload select workloads while integrating with existing computing infrastructure. We elaborate on the role of the host operating system in managing these unique accelerator resources, the prospects for deploying quantum modules, and the requirements placed on the language hierarchy connecting these different system components. We draw on recent advances in the modeling and simulation of quantum computing systems with the development of architectures for hybrid high-performance computing systems and the realization of software stacks for controlling quantum devices. Finally, we present simulation results that describe the expected system-level behavior of high-performance computing systems composed from compute nodes with quantum processing units. We describe performance for these hybrid systems in terms of time-to-solution, accuracy, and energy consumption, and we use simple application examples to estimate the performance advantage of quantum acceleration.
NASA Technical Reports Server (NTRS)
Houston, Johnny L.
1990-01-01
Program EAGLE (Eglin Arbitrary Geometry Implicit Euler) is a multiblock grid generation and steady-state flow solver system. This system combines a boundary conforming surface generation, a composite block structure grid generation scheme, and a multiblock implicit Euler flow solver algorithm. The three codes are intended to be used sequentially from the definition of the configuration under study to the flow solution about the configuration. EAGLE was specifically designed to aid in the analysis of both freestream and interference flow field configurations. These configurations can be comprised of single or multiple bodies ranging from simple axisymmetric airframes to complex aircraft shapes with external weapons. Each body can be arbitrarily shaped with or without multiple lifting surfaces. Program EAGLE is written to compile and execute efficiently on any CRAY machine with or without Solid State Disk (SSD) devices. Also, the code uses namelist inputs, which are supported by all CRAY machines using the FORTRAN compiler CFT77. The use of namelist inputs makes it easier for the user to understand the inputs and to operate Program EAGLE. Recently, the code was modified to operate on other computers, especially the Sun Sparc4 workstation. Several two-dimensional grid configurations were completely and successfully developed using EAGLE. Currently, EAGLE is being used for three-dimensional grid applications.
Wicks, Laura C; Cairns, Gemma S; Melnyk, Jacob; Bryce, Scott; Duncan, Rory R; Dalgarno, Paul A
2017-01-01
We developed a simple, cost-effective smartphone microscopy platform for use in educational and public engagement programs. We demonstrated its effectiveness, and its potential for citizen science, through a national imaging initiative, EnLightenment. The cost effectiveness of the instrument allowed the program to deliver over 500 microscopes to more than 100 secondary schools throughout Scotland, targeting thousands of 12-14 year olds. Through careful, quantified selection of a high-power, low-cost objective lens, our smartphone microscope has micron-scale imaging resolution with a working distance of 3 mm. It is therefore capable of imaging single cells and sub-cellular features, and retains usability for young children. The microscopes were designed in kit form and provided an interdisciplinary educational tool. By providing full lesson plans and support material, we developed a framework to explore optical design, microscope performance, engineering challenges in construction, and real-world applications in life sciences, biological imaging, marine biology, art, and technology. A national online imaging competition framed EnLightenment, with over 500 high-quality images of diverse content submitted, spanning multiple disciplines. With examples of cellular and sub-cellular features clearly identifiable in some submissions, we show how the young public can use these instruments for research-level imaging applications, and the potential of the instrument for citizen science programs.
VET Program Completion Rates: An Evaluation of the Current Method. Occasional Paper
ERIC Educational Resources Information Center
National Centre for Vocational Education Research (NCVER), 2016
2016-01-01
This work asks one simple question: "how reliable is the method used by the National Centre for Vocational Education Research (NCVER) to estimate projected rates of VET program completion?" In other words, how well do early projections align with actual completion rates some years later? Completion rates are simple to calculate with a…
Design Features of a Friendly Software Environment for Novice Programmers. Technical Report No. 3.
ERIC Educational Resources Information Center
Eisenstadt, Marc
This paper describes the results of a 6-year period of design, implementation, testing, and iterative redesign of a programming language, user aids, and curriculum materials for use by psychology students learning how to write simple computer programs. The SOLO language, which was the resulting product, is primarily a simple, database…
A Simple Spreadsheet Program for the Calculation of Lattice-Site Distributions
ERIC Educational Resources Information Center
McCaffrey, John G.
2009-01-01
A simple spreadsheet program is presented that can be used by undergraduate students to calculate the lattice-site distributions in solids. A major strength of the method is the natural way in which the correct number of ions or atoms are present, or absent, at specific lattice distances. The expanding-cube method utilized is straightforward to…
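The kind of tally such a spreadsheet produces can be sketched directly: enumerate the sites of a lattice inside an expanding cube around a reference site and bin them by distance. The Python snippet below is a schematic analogue for a simple cubic lattice; it mirrors the bookkeeping idea only and is not the article's spreadsheet implementation.

```python
# Schematic analogue of an expanding-cube tally: count simple-cubic lattice
# sites at each squared distance from a reference site at the origin.
from collections import Counter

def shell_counts(n_max):
    counts = Counter()
    for i in range(-n_max, n_max + 1):
        for j in range(-n_max, n_max + 1):
            for k in range(-n_max, n_max + 1):
                if (i, j, k) == (0, 0, 0):
                    continue
                counts[i * i + j * j + k * k] += 1
    return counts

counts = shell_counts(3)
for d2 in sorted(counts)[:5]:
    print(f"r^2 = {d2}: {counts[d2]} sites")
# First shells of a simple cubic lattice: 6, 12, 8, 6, 24 sites.
```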
DIY soundcard based temperature logging system. Part II: applications
NASA Astrophysics Data System (ADS)
Nunn, John
2016-11-01
This paper demonstrates some simple applications of how temperature logging systems may be used to monitor simple heat experiments, and how the data obtained can be analysed to get some additional insight into the physical processes.
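One typical analysis of a logged cooling run is a fit of Newton's law of cooling to the temperature series. The sketch below is a generic Python example with synthetic data, assuming scipy is available; it is not tied to the soundcard hardware or to the paper's specific experiments.

```python
# Generic example of analysing a logged cooling curve by fitting Newton's law
# of cooling, T(t) = T_env + (T0 - T_env) * exp(-k t). Synthetic data stand in
# for real logger output.
import numpy as np
from scipy.optimize import curve_fit

def cooling(t, T_env, T0, k):
    return T_env + (T0 - T_env) * np.exp(-k * t)

t = np.linspace(0, 600, 61)                                   # seconds
T = cooling(t, 22.0, 80.0, 0.008) + np.random.normal(0, 0.2, t.size)

popt, pcov = curve_fit(cooling, t, T, p0=(20.0, 70.0, 0.01))
T_env_fit, T0_fit, k_fit = popt
print(f"T_env = {T_env_fit:.1f} C, T0 = {T0_fit:.1f} C, k = {k_fit:.4f} 1/s")
```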
NASA Technical Reports Server (NTRS)
Chang, S.-C.; Himansu, A.; Loh, C.-Y.; Wang, X.-Y.; Yu, S.-T.J.
2005-01-01
This paper reports on a significant advance in the area of nonreflecting boundary conditions (NRBCs) for unsteady flow computations. As a part of the development of the space-time conservation element and solution element (CE/SE) method, sets of NRBCs for 1D Euler problems are developed without using any characteristics-based techniques. These conditions are much simpler than those commonly reported in the literature, yet so robust that they are applicable to subsonic, transonic and supersonic flows even in the presence of discontinuities. In addition, the straightforward multidimensional extensions of the present 1D NRBCs have been shown numerically to be equally simple and robust. The paper details the theoretical underpinning of these NRBCs, and explains their unique robustness and accuracy in terms of the conservation of space-time fluxes. Some numerical results for an extended Sod's shock-tube problem, illustrating the effectiveness of the present NRBCs, are included, together with an associated simple Fortran computer program. As a preliminary to the present development, a review of the basic CE/SE schemes is also included.
Hidalgo-Mazzei, Diego; Mateu, Ainoa; Reinares, María; Undurraga, Juan; Bonnín, Caterina del Mar; Sánchez-Moreno, José; Vieta, Eduard; Colom, Francesc
2015-03-20
New technologies have recently been used for monitoring signs and symptoms of mental health illnesses and particularly have been tested to improve the outcomes in bipolar disorders. Web-based psychoeducational programs for bipolar disorders have also been implemented, yet to our knowledge, none of them have integrated both approaches in one single intervention. The aim of this project is to develop and validate a smartphone application to monitor symptoms and signs and empower the self-management of bipolar disorder, offering customized embedded psychoeducation contents, in order to identify early symptoms and prevent relapses and hospitalizations. The project will be carried out in three complementary phases, which will include a feasibility study (first phase), a qualitative study (second phase) and a randomized controlled trial (third phase) comparing the smartphone application (SIMPLe) on top of treatment as usual with treatment as usual alone. During the first phase, feasibility and satisfaction will be assessed with the application usage log data and with an electronic survey. Focus groups will be conducted and technical improvements will be incorporated at the second phase. Finally, at the third phase, survival analysis with multivariate data analysis will be performed and relationships between socio-demographic, clinical variables and assessment scores with relapses in each group will be explored. This project could result in a highly available, user-friendly and low-cost monitoring and psychoeducational intervention that could improve the outcome of people suffering from bipolar disorders in a practical and secure way. ClinicalTrials.gov: NCT02258711 (October 2014).
Simple and flexible SAS and SPSS programs for analyzing lag-sequential categorical data.
O'Connor, B P
1999-11-01
This paper describes simple and flexible programs for analyzing lag-sequential categorical data, using SAS and SPSS. The programs read a stream of codes and produce a variety of lag-sequential statistics, including transitional frequencies, expected transitional frequencies, transitional probabilities, adjusted residuals, z values, Yule's Q values, likelihood ratio tests of stationarity across time and homogeneity across groups or segments, transformed kappas for unidirectional dependence, bidirectional dependence, parallel and nonparallel dominance, and significance levels based on both parametric and randomization tests.
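The core step, tallying lag-1 transitional frequencies and converting them to transitional probabilities, is easy to reproduce. The sketch below is a Python analogue of only that first step, using a hypothetical code stream; the published SAS/SPSS programs go much further (expected frequencies, adjusted residuals, Yule's Q, stationarity/homogeneity and randomization tests).

```python
# Python analogue of the first step of a lag-sequential analysis: lag-1
# transitional frequencies and transitional probabilities for a code stream.
from collections import Counter, defaultdict

codes = list("ABACABBACA")             # hypothetical stream of behaviour codes

freq = defaultdict(Counter)
for given, target in zip(codes, codes[1:]):
    freq[given][target] += 1           # count "given -> target" transitions

for given in sorted(freq):
    total = sum(freq[given].values())
    probs = {t: n / total for t, n in sorted(freq[given].items())}
    print(given, dict(freq[given]), probs)
```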
Faith, Daniel P
2008-12-01
New species conservation strategies, including the EDGE of Existence (EDGE) program, have expanded threatened species assessments by integrating information about species' phylogenetic distinctiveness. Distinctiveness has been measured through simple scores that assign shared credit among species for evolutionary heritage represented by the deeper phylogenetic branches. A species with a high score combined with a high extinction probability receives high priority for conservation efforts. Simple hypothetical scenarios for phylogenetic trees and extinction probabilities demonstrate how such scoring approaches can provide inefficient priorities for conservation. An existing probabilistic framework derived from the phylogenetic diversity measure (PD) properly captures the idea of shared responsibility for the persistence of evolutionary history. It avoids static scores, takes into account the status of close relatives through their extinction probabilities, and allows for the necessary updating of priorities in light of changes in species threat status. A hypothetical phylogenetic tree illustrates how changes in extinction probabilities of one or more species translate into changes in expected PD. The probabilistic PD framework provided a range of strategies that moved beyond expected PD to better consider worst-case PD losses. In another example, risk aversion gave higher priority to a conservation program that provided a smaller, but less risky, gain in expected PD. The EDGE program could continue to promote a list of top species conservation priorities through application of probabilistic PD and simple estimates of current extinction probability. The list might be a dynamic one, with all the priority scores updated as extinction probabilities change. Results of recent studies suggest that estimation of extinction probabilities derived from the red list criteria linked to changes in species range sizes may provide estimated probabilities for many different species. Probabilistic PD provides a framework for single-species assessment that is well-integrated with a broader measurement of impacts on PD owing to climate change and other factors.
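The expected-PD calculation itself is simple: each branch contributes its length multiplied by the probability that at least one descendant species survives. The sketch below shows this on a small hypothetical three-species tree; the branch lengths, extinction probabilities, and names are made up for illustration and are not taken from the paper's scenarios.

```python
# Expected phylogenetic diversity (PD) on a toy tree: each branch contributes
# length * (1 - product of extinction probabilities of the species below it).
# The tree, branch lengths, and extinction probabilities are hypothetical.
from math import prod

# branch -> (length, species subtended by that branch)
branches = {
    "A_tip": (2.0, {"A"}),
    "B_tip": (1.0, {"B"}),
    "C_tip": (3.0, {"C"}),
    "AB":    (1.5, {"A", "B"}),   # internal branch above the (A, B) clade
}
p_ext = {"A": 0.8, "B": 0.3, "C": 0.6}   # current extinction probabilities

def expected_pd(branches, p_ext):
    return sum(length * (1.0 - prod(p_ext[s] for s in taxa))
               for length, taxa in branches.values())

print(f"Expected PD = {expected_pd(branches, p_ext):.3f}")
# Updating any p_ext value and recomputing shows how priorities would shift.
```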
ELM - A SIMPLE TOOL FOR THERMAL-HYDRAULIC ANALYSIS OF SOLID-CORE NUCLEAR ROCKET FUEL ELEMENTS
NASA Technical Reports Server (NTRS)
Walton, J. T.
1994-01-01
ELM is a simple computational tool for modeling the steady-state thermal-hydraulics of propellant flow through fuel element coolant channels in nuclear thermal rockets. Written for the nuclear propulsion project of the Space Exploration Initiative, ELM evaluates the various heat transfer coefficient and friction factor correlations available for turbulent pipe flow with heat addition. In the past, these correlations were found in different reactor analysis codes, but now comparisons are possible within one program. The logic of ELM is based on the one-dimensional conservation of energy in combination with Newton's Law of Cooling to determine the bulk flow temperature and the wall temperature across a control volume. Since the control volume is an incremental length of tube, the corresponding pressure drop is determined by application of the Law of Conservation of Momentum. The size, speed, and accuracy of ELM make it a simple tool for use in fuel element parametric studies. ELM is a machine independent program written in FORTRAN 77. It has been successfully compiled on an IBM PC compatible running MS-DOS using Lahey FORTRAN 77, a DEC VAX series computer running VMS, and a Sun4 series computer running SunOS UNIX. ELM requires 565K of RAM under SunOS 4.1, 360K of RAM under VMS 5.4, and 406K of RAM under MS-DOS. Because this program is machine independent, no executable is provided on the distribution media. The standard distribution medium for ELM is one 5.25 inch 360K MS-DOS format diskette. ELM was developed in 1991. DEC, VAX, and VMS are trademarks of Digital Equipment Corporation. Sun4 and SunOS are trademarks of Sun Microsystems, Inc. IBM PC is a registered trademark of International Business Machines. MS-DOS is a registered trademark of Microsoft Corporation.
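The marching scheme described above can be sketched in a few lines: an energy balance across each control volume gives the bulk temperature rise, Newton's law of cooling gives the wall temperature, and a friction relation gives the pressure drop. The snippet below is a minimal Python sketch; the property values, heat flux, and the constant heat transfer coefficient and friction factor are hypothetical placeholders for ELM's correlation options, and this is not ELM's actual code.

```python
# Minimal sketch of a control-volume march in the spirit of ELM:
# energy balance -> bulk temperature, Newton's law of cooling -> wall
# temperature, Darcy friction loss -> pressure drop. All values below are
# hypothetical placeholders for ELM's correlations and inputs.
import math

n_cells, dz = 50, 0.02             # 1 m channel split into 50 control volumes
D = 0.003                          # channel diameter [m]
mdot, cp = 5e-4, 14300.0           # mass flow [kg/s], propellant cp [J/kg/K]
rho, v = 1.0, 60.0                 # density [kg/m^3], velocity [m/s]
q_flux = 2.0e6                     # wall heat flux [W/m^2]
h, f = 3.0e4, 0.02                 # heat transfer coeff., Darcy friction factor

perimeter, T_bulk, p_drop = math.pi * D, 100.0, 0.0
for _ in range(n_cells):
    T_bulk += q_flux * perimeter * dz / (mdot * cp)   # energy balance
    T_wall = T_bulk + q_flux / h                      # Newton's law of cooling
    p_drop += f * (dz / D) * 0.5 * rho * v * v        # friction pressure loss

print(f"Exit bulk T = {T_bulk:.1f} K, last wall T = {T_wall:.1f} K, "
      f"dP = {p_drop / 1000:.2f} kPa")
```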
The MOLGENIS toolkit: rapid prototyping of biosoftware at the push of a button.
Swertz, Morris A; Dijkstra, Martijn; Adamusiak, Tomasz; van der Velde, Joeri K; Kanterakis, Alexandros; Roos, Erik T; Lops, Joris; Thorisson, Gudmundur A; Arends, Danny; Byelas, George; Muilu, Juha; Brookes, Anthony J; de Brock, Engbert O; Jansen, Ritsert C; Parkinson, Helen
2010-12-21
There is a huge demand on bioinformaticians to provide their biologists with user friendly and scalable software infrastructures to capture, exchange, and exploit the unprecedented amounts of new *omics data. We here present MOLGENIS, a generic, open source, software toolkit to quickly produce the bespoke MOLecular GENetics Information Systems needed. The MOLGENIS toolkit provides bioinformaticians with a simple language to model biological data structures and user interfaces. At the push of a button, MOLGENIS' generator suite automatically translates these models into a feature-rich, ready-to-use web application including database, user interfaces, exchange formats, and scriptable interfaces. Each generator is a template of SQL, JAVA, R, or HTML code that would require much effort to write by hand. This 'model-driven' method ensures reuse of best practices and improves quality because the modeling language and generators are shared between all MOLGENIS applications, so that errors are found quickly and improvements are shared easily by a re-generation. A plug-in mechanism ensures that both the generator suite and generated product can be customized just as much as hand-written software. In recent years we have successfully evaluated the MOLGENIS toolkit for the rapid prototyping of many types of biomedical applications, including next-generation sequencing, GWAS, QTL, proteomics and biobanking. Writing 500 lines of model XML typically replaces 15,000 lines of hand-written programming code, which allows for quick adaptation if the information system is not yet to the biologist's satisfaction. Each application generated with MOLGENIS comes with an optimized database back-end, user interfaces for biologists to manage and exploit their data, programming interfaces for bioinformaticians to script analysis tools in R, Java, SOAP, REST/JSON and RDF, a tab-delimited file format to ease upload and exchange of data, and detailed technical documentation. Existing databases can be quickly enhanced with MOLGENIS generated interfaces using the 'ExtractModel' procedure. The MOLGENIS toolkit provides bioinformaticians with a simple model to quickly generate flexible web platforms for all possible genomic, molecular and phenotypic experiments with a richness of interfaces not provided by other tools. All the software and manuals are available free as LGPLv3 open source at http://www.molgenis.org.
Multiobjective optimization techniques for structural design
NASA Technical Reports Server (NTRS)
Rao, S. S.
1984-01-01
The multiobjective programming techniques are important in the design of complex structural systems whose quality depends generally on a number of different and often conflicting objective functions which cannot be combined into a single design objective. The applicability of multiobjective optimization techniques is studied with reference to simple design problems. Specifically, the parameter optimization of a cantilever beam with a tip mass and a three-degree-of-freedom vibration isolation system and the trajectory optimization of a cantilever beam are considered. The solutions of these multicriteria design problems are attempted by using global criterion, utility function, game theory, goal programming, goal attainment, bounded objective function, and lexicographic methods. It has been observed that the game theory approach required the maximum computational effort, but it yielded better optimum solutions with proper balance of the various objective functions in all the cases.
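The simplest of the listed approaches reduce the multiobjective problem to a scalar one. The sketch below shows two such scalarizations, a weighted (utility-function-style) sum and a global criterion measure, on a hypothetical two-objective toy problem; it is not the paper's cantilever or vibration-isolation formulation.

```python
# Two scalarization ideas mentioned above (weighted/utility sum and a global
# criterion) applied to a hypothetical two-objective toy problem.
import numpy as np
from scipy.optimize import minimize

f1 = lambda x: (x[0] - 1.0) ** 2 + x[1] ** 2        # objective 1
f2 = lambda x: x[0] ** 2 + (x[1] - 2.0) ** 2        # objective 2
f1_star, f2_star = 0.0, 0.0                         # ideal (utopia) values

def weighted_sum(x, w1=0.5, w2=0.5):                # utility-function style
    return w1 * f1(x) + w2 * f2(x)

def global_criterion(x, p=2):                       # global criterion style
    return (f1(x) - f1_star) ** p + (f2(x) - f2_star) ** p

x0 = np.array([0.0, 0.0])
for name, fun in [("weighted sum", weighted_sum),
                  ("global criterion", global_criterion)]:
    res = minimize(fun, x0, method="Nelder-Mead")
    print(f"{name}: x = {np.round(res.x, 3)}, "
          f"f1 = {f1(res.x):.3f}, f2 = {f2(res.x):.3f}")
```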
Deformable mirrors development program at ESO
NASA Astrophysics Data System (ADS)
Stroebele, Stefan; Vernet, Elise; Brinkmann, Martin; Jakob, Gerd; Lilley, Paul; Casali, Mark; Madec, Pierre-Yves; Kasper, Markus
2016-07-01
Over the last decade, adaptive optics has become essential in different fields of research including medicine and industrial applications. With this new need, the market for deformable mirrors has expanded considerably, allowing new technologies and actuation principles to be developed. Several E-ELT instruments have identified the need for post focal deformable mirrors, but with the increasing size of the telescopes the requirements on the deformable mirrors become more demanding. A simple scaling up of existing technologies from a few hundred actuators to thousands of actuators will not be sufficient to satisfy the future needs of ESO. To bridge the gap between available deformable mirrors and the future needs for the E-ELT, ESO started a development program for deformable mirror technologies. The requirements and the path to obtaining the deformable mirrors for post focal adaptive optics systems for the E-ELT are presented.
Interactive spectral analyzer and comparator (ISAAC)
NASA Astrophysics Data System (ADS)
Latković, O.; Cséki, A.; Vince, I.
2003-10-01
We are developing an application for graphical comparison of observed and synthetic spectra (ISAAC). Synthetic spectrum calculation is performed by SPECTRUM, Stellar Spectral Synthesis Program by Richard O. Gray that we use with his kind permission. This program computes line profiles under LTE conditions in the given wavelength interval using a stellar (solar) atmosphere model, a spectral line data list (wavelength, energy levels, oscillator strengths, and damping constants), a file containing data for atoms and molecules, as well as a data file for hydrogen line profiles calculation. ISAAC offers a simple interface for viewing and changing any atomic parameter SPECTRUM uses for line profile calculation, enabling quick comparison of the new synthetic line profile with the observed one. In this way parameters like relative abundances, oscillator strengths and van der Waals damping constants can be improved, achieving a better agreement with the observed spectrum.
Experiences with Cray multi-tasking
NASA Technical Reports Server (NTRS)
Miya, E. N.
1985-01-01
The issues involved in modifying an existing code for multitasking are explored. They include Cray extensions to FORTRAN, an examination of the application code under study, designing workable modifications, specific code modifications to the VAX and Cray versions, and performance and efficiency results. The finished product is a faster, fully synchronous, parallel version of the original program. A production program is partitioned by hand to run on two CPUs. Loop splitting multitasks three key subroutines. Simply dividing subroutine data and control structure down the middle of a subroutine is not safe. Simple division produces results that are inconsistent with uniprocessor runs. The safest way to partition the code is to transfer one block of loops at a time and check the results of each on a test case. Other issues include debugging and performance. Task startup and maintenance (e.g., synchronization) are potentially expensive.
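The loop-splitting idea, and the advice to check each partitioned block against the serial run, can be illustrated generically. The sketch below uses Python multiprocessing in place of Cray FORTRAN tasking extensions; it is only an analogy, with a made-up per-iteration workload.

```python
# Generic illustration of loop splitting across two workers, with the result
# checked against the serial run, in the spirit of the hand-partitioning
# described above. Python multiprocessing stands in for Cray tasking
# extensions purely as an analogy.
from multiprocessing import Pool

def work(i):
    return i * i  # stand-in for one loop iteration's computation

def partial_sum(bounds):
    lo, hi = bounds
    return sum(work(i) for i in range(lo, hi))

if __name__ == "__main__":
    n = 1_000_000
    serial = sum(work(i) for i in range(n))
    with Pool(2) as pool:                       # "two CPUs"
        halves = pool.map(partial_sum, [(0, n // 2), (n // 2, n)])
    parallel = sum(halves)
    assert parallel == serial                   # consistency with the serial run
    print(parallel)
```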
Implementing general quantum measurements on linear optical and solid-state qubits
NASA Astrophysics Data System (ADS)
Ota, Yukihiro; Ashhab, Sahel; Nori, Franco
2013-03-01
We show a systematic construction for implementing general measurements on a single qubit, including both strong (or projection) and weak measurements. We mainly focus on linear optical qubits. The present approach is composed of simple and feasible elements, i.e., beam splitters, wave plates, and polarizing beam splitters. We show how the parameters characterizing the measurement operators are controlled by the linear optical elements. We also propose a method for the implementation of general measurements in solid-state qubits. Furthermore, we show an interesting application of the general measurements, i.e., entanglement amplification. YO is partially supported by the SPDR Program, RIKEN. SA and FN acknowledge ARO, NSF grant No. 0726909, JSPS-RFBR contract No. 12-02-92100, Grant-in-Aid for Scientific Research (S), MEXT Kakenhi on Quantum Cybernetics, and the JSPS via its FIRST program.
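The measurement operators themselves can be written down generically. The numpy sketch below builds a one-parameter pair of Kraus operators that interpolates between a weak and a projective single-qubit measurement and checks completeness; it is a textbook-style parameterization, not the paper's beam-splitter and wave-plate construction.

```python
# Generic one-parameter family of single-qubit measurement operators that
# interpolates between weak (theta near pi/4) and projective (theta near 0)
# measurements. This is a textbook-style parameterization, not the specific
# linear-optics construction of the paper.
import numpy as np

def kraus_pair(theta):
    M0 = np.diag([np.cos(theta), np.sin(theta)])
    M1 = np.diag([np.sin(theta), np.cos(theta)])
    return M0, M1

theta = np.pi / 8
M0, M1 = kraus_pair(theta)

# Completeness: M0^dag M0 + M1^dag M1 = I
assert np.allclose(M0.conj().T @ M0 + M1.conj().T @ M1, np.eye(2))

# Outcome probability and partially collapsed state for |+> = (|0>+|1>)/sqrt(2)
psi = np.array([1.0, 1.0]) / np.sqrt(2)
p0 = np.linalg.norm(M0 @ psi) ** 2
post = (M0 @ psi) / np.linalg.norm(M0 @ psi)
print(f"p(outcome 0) = {p0:.3f}, post-measurement state = {np.round(post, 3)}")
```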
Sardinha, Ana Gabriella de Oliveira; Oyama, Ceres Nunes de Resende; de Mendonça Maroja, Armando; Costa, Ivan F
2016-01-01
The aim of this paper is to provide a general discussion, algorithm, and actual working programs of the deformation method for fast simulation of biological tissue formed by fibers and fluid. In order to demonstrate the benefit of the clinical applications software, we successfully used our computational program to deform a 3D breast image acquired from patients, using a 3D scanner, in a real hospital environment. The method implements a quasi-static solution for elastic global deformations of objects. Each pair of vertices of the surface is connected and defines an elastic fiber. The set of all the elastic fibers defines a mesh of smaller size than the volumetric meshes, allowing for simulation of complex objects with less computational effort. The behavior similar to the stress tensor is obtained by the volume conservation equation that mixes the 3D coordinates. Step by step, we show the computational implementation of this approach. As an example, a 2D rectangle formed by only 4 vertices is solved and, for this simple geometry, all intermediate results are shown. On the other hand, actual implementations of these ideas in the form of working computer routines are provided for general 3D objects, including a clinical application.
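The all-pairs fiber mesh on a surface is easy to set up. The sketch below builds the six fibers of the 4-vertex rectangle example mentioned above and evaluates linear elastic fiber forces for a displaced corner; it omits the volume-conservation equation and the quasi-static solve, so it illustrates only the connectivity and force bookkeeping, with an assumed stiffness value.

```python
# Sketch of the all-pairs fiber mesh for a 4-vertex rectangle: every pair of
# surface vertices is connected by an elastic fiber. Only the connectivity and
# linear fiber forces are shown; the volume-conservation constraint and the
# quasi-static solution of the paper are omitted. Stiffness k is assumed.
import numpy as np
from itertools import combinations

rest = np.array([[0.0, 0.0], [2.0, 0.0], [2.0, 1.0], [0.0, 1.0]])  # rectangle
k = 1.0                                                            # fiber stiffness

fibers = list(combinations(range(len(rest)), 2))     # 6 fibers for 4 vertices
rest_len = {f: np.linalg.norm(rest[f[0]] - rest[f[1]]) for f in fibers}

pos = rest.copy()
pos[2] += [0.3, 0.2]             # displace one corner

forces = np.zeros_like(pos)
for i, j in fibers:
    d = pos[j] - pos[i]
    L = np.linalg.norm(d)
    f = k * (L - rest_len[(i, j)]) * d / L           # Hooke's law along the fiber
    forces[i] += f
    forces[j] -= f

print(np.round(forces, 3))
```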
SpaceWire Driver Software for Special DSPs
NASA Technical Reports Server (NTRS)
Clark, Douglas; Lux, James; Nishimoto, Kouji; Lang, Minh
2003-01-01
A computer program provides a high-level C-language interface to electronics circuitry that controls a SpaceWire interface in a system based on a space-qualified version of the ADSP-21020 digital signal processor (DSP). SpaceWire is a spacecraft-oriented standard for packet-switching data-communication networks that comprise nodes connected through bidirectional digital serial links that utilize low-voltage differential signaling (LVDS). The software is tailored to the SMCS-332 application-specific integrated circuit (ASIC) (also available as the TSS901E), which provides three high-speed (150 Mbps) serial point-to-point links compliant with the proposed Institute of Electrical and Electronics Engineers (IEEE) Standard 1355.2 and equivalent European Space Agency (ESA) Standard ECSS-E-50-12. In the specific application of this software, the SpaceWire ASIC was combined with the DSP processor, memory, and control logic in a Multi-Chip Module DSP (MCM-DSP). The software is a collection of low-level driver routines that provide a simple message-passing application programming interface (API) for software running on the DSP. Routines are provided for interrupt-driven access to the two styles of interface provided by the SMCS: (1) the "word at a time" conventional host interface (HOCI); and (2) a higher performance "dual port memory" style interface (COMI).
STAR (Simple Tool for Automated Reasoning): Tutorial guide and reference manual
NASA Technical Reports Server (NTRS)
Borchardt, G. C.
1985-01-01
STAR is an interactive, interpreted programming language for the development and operation of Artificial Intelligence application systems. The language is intended for use primarily in the development of software application systems which rely on a combination of symbolic processing, central to the vast majority of AI algorithms, with routines and data structures defined in compiled languages such as C, FORTRAN and PASCAL. References to routines and data structures defined in compiled languages are intermixed with symbolic structures in STAR, resulting in a hybrid operating environment in which symbolic and non-symbolic processing and organization of data may interact to a high degree within the execution of particular application systems. The STAR language was developed in the course of a project involving AI techniques in the interpretation of imaging spectrometer data and is derived in part from a previous language called CLIP. The interpreter for STAR is implemented as a program defined in the language C and has been made available for distribution in source code form through NASA's Computer Software Management and Information Center (COSMIC). Contained within this report are the STAR Tutorial Guide, which introduces the language in a step-by-step manner, and the STAR Reference Manual, which provides a detailed summary of the features of STAR.
"Long life" DC brush motor for use on the Mars surveyor program
NASA Technical Reports Server (NTRS)
Braun, David; Noon, Don
1998-01-01
DC brush motors have several qualities which make them very attractive for space flight applications. Their mechanical commutation is simple and lightweight, requiring no external sensing and control in order to function properly. They are extremely efficient in converting electrical energy into mechanical energy. Efficiencies over 80% are not uncommon, resulting in high power throughput to weight ratios. However, the inherent unreliability and short life of sliding electrical contacts, especially in vacuum, have driven previous programs to utilize complex brushless DC or the less efficient stepper motors. The Mars Surveyor Program (MSP'98) and the Shuttle Radar Topography Mission (SRTM) have developed a reliable "long life" brush type DC motor for operation in low temperature, low pressure CO2 and N2, utilizing silver-graphite brushes. The original intent was to utilize this same motor for SRTM's space operation, but the results thus far have been unsatisfactory in vacuum. This paper describes the design, test, and results of this development.
Tolivia, Jorge; Navarro, Ana; del Valle, Eva; Perez, Cristina; Ordoñez, Cristina; Martínez, Eva
2006-02-01
This paper describes a simple method for achieving the differential selection and subsequent quantification of the chromogen signal strength using only one section. Several methods for performing quantitative histochemistry, immunocytochemistry or hybridocytochemistry without specific commercial image analysis systems rely on pixel-counting algorithms, which do not provide information on the amount of chromogen present in the section. Other techniques use complex algorithms to calculate the cumulative signal strength using two consecutive sections. To separate the chromogen signal we used the "Color range" option of the Adobe Photoshop program, which provides a specific file for a particular chromogen selection that can be applied to similar sections. The measurement of the chromogen signal strength of the specific staining is achieved with the Scion Image software program. The method described in this paper can also be applied to simultaneous detection of different signals on the same section or to different parameters (area of particles, number of particles, etc.) when the "Analyze particles" tool of the Scion program is used.
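A scripted equivalent of the two manual steps, selecting a chromogen color range and then measuring the signal on the selected pixels, can be written with numpy. The sketch below is an alternative illustration with a synthetic image and hypothetical RGB thresholds; it is not the authors' Photoshop "Color range" selection or the Scion Image measurement.

```python
# Scripted analogue of the two steps described above: select pixels falling in
# a chromogen colour range, then measure stained area and summed signal
# strength. The RGB thresholds and synthetic image are hypothetical; the
# authors use Photoshop's "Color range" selection plus Scion Image.
import numpy as np

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)  # stand-in section image

lower = np.array([90, 40, 0])        # hypothetical chromogen colour range (RGB)
upper = np.array([200, 120, 80])

mask = np.all((img >= lower) & (img <= upper), axis=-1)       # selected pixels
signal = (255 - img[..., 2])[mask]   # crude intensity proxy on selected pixels

area_fraction = mask.mean()
print(f"stained area fraction = {area_fraction:.3f}, "
      f"integrated signal = {signal.sum()}")
```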
A Symbolic and Graphical Computer Representation of Dynamical Systems
NASA Astrophysics Data System (ADS)
Gould, Laurence I.
2005-04-01
AUTONO is a Macsyma/Maxima program, designed at the University of Hartford, for solving autonomous systems of differential equations as well as for relating Lagrangians and Hamiltonians to their associated dynamical equations. AUTONO can be used in a number of fields to decipher a variety of complex dynamical systems with ease, producing their Lagrangian and Hamiltonian equations in seconds. These equations can then be incorporated into VisSim, a modeling and simulation program, which yields graphical representations of motion in a given system through easily chosen input parameters. The program, along with the VisSim differential-equations graphical package, allows for resolution and easy understanding of complex problems in a relatively short time; thus enabling quicker and more advanced computing of dynamical systems on any number of platforms---from a network of sensors on a space probe, to the behavior of neural networks, to the effects of an electromagnetic field on components in a dynamical system. A flowchart of AUTONO, along with some simple applications and VisSim output, will be shown.
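For reference, the relations the program automates are the Euler-Lagrange and Hamilton equations, stated below in standard form (conventional notation, independent of AUTONO's own syntax).

```latex
% Standard Euler-Lagrange and Hamilton equations that AUTONO automates
% (conventional notation, not the program's own syntax).
\frac{d}{dt}\!\left(\frac{\partial L}{\partial \dot{q}_i}\right)
  - \frac{\partial L}{\partial q_i} = 0,
\qquad
\dot{q}_i = \frac{\partial H}{\partial p_i}, \quad
\dot{p}_i = -\frac{\partial H}{\partial q_i}.
```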
NASA Technical Reports Server (NTRS)
Teverovsky, Alexander A.
2012-01-01
This document has been developed in the course of the NASA Electronic Parts and Packaging (NEPP) program and is not an official endorsement of the insertion of commercial capacitors in space programs or an established set of requirements for their testing. The purpose of this document is to suggest possible ways for selection, screening, and qualification of commercial capacitors for NASA projects and to open discussions in the parts engineering community related to the use of COTS ceramic capacitors. This guideline is applicable to commercial surface mount chip, simple parallel plate design, multi-layer ceramic capacitors (MLCCs) rated to voltages of 100 V and less. Parts with different designs, e.g., low inductance ceramic capacitors (LICA), land grid array (LGA), etc., might need additional testing and tailoring of the requirements described in this document. Although the focus of this document is on commercial MLCCs, many procedures discussed below would also be beneficial for military-grade capacitors.
NASA Technical Reports Server (NTRS)
1981-01-01
The set of computer programs described allows for data definition, data input, and data transfer between the LSI-11 microcomputers and the VAX-11/780 minicomputer. Program VAXCOM allows for a simple method of textual file transfer from the LSI to the VAX. Program LSICOM allows for easy file transfer from the VAX to the LSI. Program TTY changes the LSI-11 operators console to the LSI's printing device. Program DICTIN provides a means for defining a data set for input to either computer. Program DATAIN is a simple to operate data entry program which is capable of building data files on either machine. Program LEDITV is an extremely powerful, easy to use, line oriented text editor. Program COPYSBF is designed to print out textual files on the line printer without character loss from FORTRAN carriage control or wide record transfer.
A modeling paradigm for interdisciplinary water resources modeling: Simple Script Wrappers (SSW)
NASA Astrophysics Data System (ADS)
Steward, David R.; Bulatewicz, Tom; Aistrup, Joseph A.; Andresen, Daniel; Bernard, Eric A.; Kulcsar, Laszlo; Peterson, Jeffrey M.; Staggenborg, Scott A.; Welch, Stephen M.
2014-05-01
Holistic understanding of a water resources system requires tools capable of model integration. This team has developed an adaptation of the OpenMI (Open Modelling Interface) that allows easy interactions across the data passed between models. Capabilities have been developed to allow programs written in common languages such as matlab, python and scilab to share their data with other programs and accept other programs' data. We call this interface the Simple Script Wrapper (SSW). An implementation of SSW is shown that integrates groundwater, economic, and agricultural models in the High Plains region of Kansas. Output from these models illustrates the interdisciplinary discovery facilitated through use of SSW-implemented models. Reference: Bulatewicz, T., A. Allen, J.M. Peterson, S. Staggenborg, S.M. Welch, and D.R. Steward, The Simple Script Wrapper for OpenMI: Enabling interdisciplinary modeling studies, Environmental Modelling & Software, 39, 283-294, 2013. http://dx.doi.org/10.1016/j.envsoft.2012.07.006 http://code.google.com/p/simple-script-wrapper/
NASA Astrophysics Data System (ADS)
Jahedi, Mohammad; Ardeljan, Milan; Beyerlein, Irene J.; Paydar, Mohammad Hossein; Knezevic, Marko
2015-06-01
We use a multi-scale, polycrystal plasticity micromechanics model to study the development of orientation gradients within crystals deforming by slip. At the largest scale, the model is a full-field crystal plasticity finite element model with explicit 3D grain structures created by DREAM.3D, and at the finest scale, at each integration point, slip is governed by a dislocation density based hardening law. For deformed polycrystals, the model predicts intra-granular misorientation distributions that follow well the scaling law seen experimentally by Hughes et al., Acta Mater. 45(1), 105-112 (1997), independent of strain level and deformation mode. We reveal that the application of a simple compression step prior to simple shearing significantly enhances the development of intra-granular misorientations compared to simple shearing alone for the same amount of total strain. We rationalize that the changes in crystallographic orientation and shape evolution when going from simple compression to simple shearing increase the local heterogeneity in slip, leading to the boost in intra-granular misorientation development. In addition, the analysis finds that simple compression introduces additional crystal orientations that are prone to developing intra-granular misorientations, which also help to increase intra-granular misorientations. Many metal working techniques for refining grain sizes involve a preliminary or concurrent application of compression with severe simple shearing. Our finding reveals that a pre-compression deformation step can, in fact, serve as another processing variable for improving the rate of grain refinement during the simple shearing of polycrystalline metals.
Self-powered Imbibing Microfluidic Pump by Liquid Encapsulation: SIMPLE.
Kokalj, Tadej; Park, Younggeun; Vencelj, Matjaž; Jenko, Monika; Lee, Luke P
2014-11-21
Reliable, autonomous, internally self-powered microfluidic pumps are in critical demand for rapid point-of-care (POC) devices, integrated molecular-diagnostic platforms, and drug delivery systems. Here we report on a Self-powered Imbibing Microfluidic Pump by Liquid Encapsulation (SIMPLE), which is disposable, autonomous, easy to use and fabricate, robust, and cost efficient, as a solution for self-powered microfluidic POC devices. The imbibition pump introduces the working liquid, which is sucked into a porous material (paper) upon activation. The suction of the working liquid creates a reduced pressure in the analytical channel and induces the sequential sample flow into the microfluidic circuits. It requires no external power or control and can be simply activated by a fingertip press. The flow rate can be programmed by defining the shape of the utilized porous material: by using three different paper shapes with circular section angles of 20°, 40° and 60°, three different volume flow rates of 0.07 μL s(-1), 0.12 μL s(-1) and 0.17 μL s(-1) are demonstrated at a 200 μm × 600 μm channel cross-section. We established SIMPLE pumping of 17 μL of sample; however, the sample volume can be increased to several hundreds of μL. To demonstrate the design, fabrication, and characterization of SIMPLE, we used a simple, robust and cheap foil-laminating fabrication technique. SIMPLE can be integrated into microfluidic POC devices based on hydrophilic or hydrophobic materials. Since it is also applicable to large-scale manufacturing processes, we anticipate that this technical innovation opens a new chapter of cost-effective, disposable, autonomous POC diagnostic chips.
Analyzing Array Manipulating Programs by Program Transformation
NASA Technical Reports Server (NTRS)
Cornish, J. Robert M.; Gange, Graeme; Navas, Jorge A.; Schachte, Peter; Sondergaard, Harald; Stuckey, Peter J.
2014-01-01
We explore a transformational approach to the problem of verifying simple array-manipulating programs. Traditionally, verification of such programs requires intricate analysis machinery to reason with universally quantified statements about symbolic array segments, such as "every data item stored in the segment A[i] to A[j] is equal to the corresponding item stored in the segment B[i] to B[j]." We define a simple abstract machine which allows for set-valued variables and we show how to translate programs with array operations to array-free code for this machine. For the purpose of program analysis, the translated program remains faithful to the semantics of array manipulation. Based on our implementation in LLVM, we evaluate the approach with respect to its ability to extract useful invariants and the cost in terms of code size.
ERIC Educational Resources Information Center
Abriata, Luciano A.
2011-01-01
A simple algorithm was implemented in a spreadsheet program to simulate the circular dichroism spectra of proteins from their secondary structure content and to fit [alpha]-helix, [beta]-sheet, and random coil contents from experimental far-UV circular dichroism spectra. The physical basis of the method is briefly reviewed within the context of…
ERIC Educational Resources Information Center
Shih, Ching-Hsiang
2011-01-01
This study evaluated whether two people with developmental disabilities would be able to actively perform simple physical activities by controlling their favorite environmental stimulation using Nintendo Wii Balance Boards with a newly developed standing location detection program (SLDP, i.e., a new software program turning a Nintendo Wii Balance…
ERIC Educational Resources Information Center
Shih, Ching-Hsiang
2011-01-01
This study assessed whether two persons with developmental disabilities would be able to actively perform simple occupational activities by controlling their favorite environmental stimulation using battery-free wireless mice with a newly developed object location detection program (OLDP, i.e., a new software program turning a battery-free…
easyHealthApps: e-Health Apps dynamic generation for smartphones & tablets.
Paschou, Mersini; Sakkopoulos, Evangelos; Tsakalidis, Athanasios
2013-06-01
Mobile phones and especially smartphones have been embraced by a rapidly increasing number of people worldwide, and this trend is expected to evolve even more in the years to come. There are numerous smartphone Apps that record critical medical data in an effort to solve a particular health issue each time. We studied such applications and, not surprisingly, we have found that development and design effort is often repeated. Software patterns have been detected to exist, yet re-usability has not been enforced. This leads to lost programming manpower and to an increased probability of repeating bugs in Apps. Moreover, at the moment smartphone e-Health Apps demand time, effort and cost to develop. Unfortunately, even simple data recording Apps are practically impossible for health domain users who are not developers to produce. In this work, we propose, design and implement a simple and integrated solution which gives healthcare professionals and researchers the ability to create their own data-intensive smartphone applications, independent of the desired healthcare domain. The proposed approach applies efficient software techniques that hide development from the users and enable App creation through a simple Web User Interface. The Apps produced are in native format, and it is possible to dynamically receive m-Health business logic and the chosen UI. Evaluation of the proposed solution has shown that the generated Apps are functionally and UI equivalent to human-coded Apps according to a number of comparison parameters. Furthermore, e-Health professionals show particular interest in developing Apps on their own for a particular domain they focus on.
NASA Technical Reports Server (NTRS)
1989-01-01
Space communication has made immense strides since ECHO was launched in 1962. ECHO was a simple passive reflector of signals that demonstrated the concept. Today, satellites incorporating transponders, sophisticated high-gain antennas, and stabilization systems provide voice, video, and data communications to millions of people nationally and worldwide. Applications of emerging technology, typified by NASA's Advanced Communications Technology Satellite (ACTS) to be launched in 1992, will use newer portions of the frequency spectrum (the Ka-band at 30/20 GHz), along with antennas and signal-processing that could open yet new markets and services. Government programs, directly or indirectly, are responsible for many space communications accomplishments. They have been sponsored and funded in part by NASA and the U.S. Department of Defense since the early 1950s. The industry is growing rapidly and is achieving international preeminence under joint private and government sponsorship. Now, however, the U.S. space communications industry - satellite manufacturers and users, launch services providers, and communications services companies - is being forced to adapt to a different environment. International competition is growing, and terrestrial technologies such as fiber optics are claiming markets until recently dominated by satellites. At the same time, advancing technology is opening up opportunities for new applications and new markets in space exploration, for defense, and for commercial applications of several types. Space communications research, development, and applications (RD and A) programs need to adjust to these realities, be better coordinated and more efficient, and be more closely attuned to commercial markets. The programs must take advantage of RD and A results in other agencies - and in other nations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sadayappan, Ponnuswamy
Exascale computing systems will provide a thousand-fold increase in parallelism and a proportional increase in failure rate relative to today's machines. Systems software for exascale machines must provide the infrastructure to support existing applications while simultaneously enabling efficient execution of new programming models that naturally express dynamic, adaptive, irregular computation; coupled simulations; and massive data analysis in a highly unreliable hardware environment with billions of threads of execution. We propose a new approach to the data and work distribution model provided by system software based on the unifying formalism of an abstract file system. The proposed hierarchical data model provides simple, familiar visibility and access to data structures through the file system hierarchy, while providing fault tolerance through selective redundancy. The hierarchical task model features work queues whose form and organization are represented as file system objects. Data and work are both first class entities. By exposing the relationships between data and work to the runtime system, information is available to optimize execution time and provide fault tolerance. The data distribution scheme provides replication (where desirable and possible) for fault tolerance and efficiency, and it is hierarchical to make it possible to take advantage of locality. The user, tools, and applications, including legacy applications, can interface with the data, work queues, and one another through the abstract file model. This runtime environment will provide multiple interfaces to support traditional Message Passing Interface applications, languages developed under DARPA's High Productivity Computing Systems program, as well as other, experimental programming models. We will validate our runtime system with pilot codes on existing platforms and will use simulation to validate for exascale-class platforms. In this final report, we summarize research results from the work done at the Ohio State University towards the larger goals of the project listed above.
Multi-dimensional Rankings, Program Termination, and Complexity Bounds of Flowchart Programs
NASA Astrophysics Data System (ADS)
Alias, Christophe; Darte, Alain; Feautrier, Paul; Gonnord, Laure
Proving the termination of a flowchart program can be done by exhibiting a ranking function, i.e., a function from the program states to a well-founded set, which strictly decreases at each program step. A standard method to automatically generate such a function is to compute invariants for each program point and to search for a ranking in a restricted class of functions that can be handled with linear programming techniques. Previous algorithms based on affine rankings either are applicable only to simple loops (i.e., single-node flowcharts) and rely on enumeration, or are not complete in the sense that they are not guaranteed to find a ranking in the class of functions they consider, if one exists. Our first contribution is to propose an efficient algorithm to compute ranking functions: It can handle flowcharts of arbitrary structure, the class of candidate rankings it explores is larger, and our method, although greedy, is provably complete. Our second contribution is to show how to use the ranking functions we generate to get upper bounds for the computational complexity (number of transitions) of the source program. This estimate is a polynomial, which means that we can handle programs with more than linear complexity. We applied the method on a collection of test cases from the literature. We also show the links and differences with previous techniques based on the insertion of counters.
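A minimal illustration of the multi-dimensional ranking idea: for the nested-countdown loop below, the pair (x, y), ordered lexicographically, stays nonnegative and strictly decreases at every transition, which proves termination; a one-dimensional affine ranking such as x alone would not decrease on every step. The loop and ranking are a textbook-style toy, not one of the paper's benchmarks.

```python
# Toy illustration of a multi-dimensional (lexicographic) ranking function:
# for this nested countdown, the pair (x, y) is nonnegative and strictly
# decreases lexicographically at every transition, proving termination.
# (A one-dimensional ranking such as x alone does not decrease on every step.)
def loop(x, y, reset=5):
    steps = 0
    while x > 0:
        before = (x, y)
        if y > 0:
            y -= 1
        else:
            x -= 1
            y = reset
        steps += 1
        assert (x, y) < before and x >= 0 and y >= 0   # lexicographic decrease
    return steps

print(loop(3, 2))   # terminates; the ranking also bounds the transition count
```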
NASA Astrophysics Data System (ADS)
Milecki, Andrzej; Pelic, Marcin
2016-10-01
This paper presents results of studies applying a new method of modelling piezo bender actuators. A special hysteresis simulation model was developed and is presented. The model is based on a geometrical deformation of the main hysteresis loop. The piezoelectric effect is described and the history of hysteresis modelling is briefly reviewed. Firstly, a simple model of the main loop is proposed. Then, a geometrical description of the non-saturated hysteresis is presented and its modelling method is introduced. The modelling makes use of functions describing the geometrical shape of the two main hysteresis curves, which can be defined theoretically or obtained by measurement. These main curves are stored in memory and transformed geometrically in order to obtain the minor curves. Such a model was prepared in the Matlab-Simulink software, but it can be easily implemented in any programming language and applied in an on-line controller. In comparison to other known simulation methods, the one presented in the paper is easy to understand and uses simple arithmetical equations, allowing one to quickly obtain the inverse model of the hysteresis. The inverse model was further used for compensation of the non-saturated hysteresis of a piezo bender actuator, and those results are also presented in the paper.
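A rough sketch of the geometric idea only, using an assumed tanh stand-in for the stored main ascending branch and a simple linear rescaling rule (not the authors' exact transformation): the stored branch is deformed so that a minor curve starts at the last reversal point and rejoins the main loop.

    import numpy as np

    def f_up(u):
        # Stand-in for the stored (measured) main ascending branch.
        return np.tanh(3.0 * (u - 0.2))

    def minor_ascending(u_grid, u_rev, y_rev, u_end=1.0):
        # Geometric deformation: rescale the main branch so the minor curve
        # passes through the reversal point (u_rev, y_rev) and rejoins the
        # main loop at u_end.
        y_end = f_up(u_end)
        scale = (y_end - y_rev) / (y_end - f_up(u_rev))
        return y_end - scale * (y_end - f_up(u_grid))

    u = np.linspace(-0.3, 1.0, 131)
    minor = minor_ascending(u, u_rev=-0.3, y_rev=-0.5)
    print(minor[0], minor[-1])   # starts at -0.5, rejoins the main branch at u = 1.0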
Design of stabilized platforms for deep space optical communications (DSOC)
NASA Astrophysics Data System (ADS)
Jacka, N.; Walter, R.; Laughlin, D.; McNally, J.
2017-02-01
Numerous Deep Space Optical Communications (DSOC) demonstrations are planned by NASA to provide the basis for future implementation of optical communications links in planetary science missions and eventually manned missions to Mars. There is a need for a simple, robust precision optical stabilization concept for long-range free space optical communications applications suitable for optical apertures and masses larger than the current state of the art. We developed a stabilization concept by exploiting the ultra-low noise and wide bandwidth of ATA-proprietary Magnetohydrodynamic (MHD) angular rate sensors and building on prior practices of flexure-based isolation. We detail a stabilization approach tailored for deep space optical communications, and present an innovative prototype design and test results. Our prototype system provides sub-microradian stabilization for a deep space optical link such as NASA's integrated Radio frequency and Optical Communications (iROC) and NASA's DSOC programs. Initial test results and simulations suggest that >40 dB broadband jitter rejection is possible without placing unrealistic expectations on the control loop bandwidth and flexure isolation frequency. This approach offers a simple, robust method for platform stabilization without requiring a gravity offload apparatus for ground testing or launch locks to survive a typical launch environment. This paper reviews alternative stabilization concepts, their advantages and disadvantages, as well as their applicability to various optical communications applications. We present results from testing that subjected the prototype system to realistic spacecraft base motion and confirmed predicted sub-microradian stabilization performance with a realistic 20-cm aperture.
Simple triple-state polymer actuators with controllable folding characteristics
NASA Astrophysics Data System (ADS)
Chen, Shuyang; Li, Jing; Fang, Lichen; Zhu, Zeyu; Kang, Sung Hoon
2017-03-01
Driven by the interests in self-folding, there have been studies developing artificial self-folding structures at different length scales based on various polymer actuators that can realize dual-state actuation. However, their unidirectional nature limits the applicability of the actuators for a wide range of multi-state self-folding behaviors. In addition, complex fabrication and programming procedures hinder broad applications of existing polymer actuators. Moreover, few of the existing polymer actuators are able to show the self-folding behaviors with the precise control of curvature and force. To address these issues, we report an easy-to-fabricate triple-state actuator with controllable folding behaviors based on bilayer polymer composites with different glass transition temperatures. Initially, the fabricated actuator is in the flat state, and it can sequentially self-fold to angled folding states of opposite directions as it is heated up. Based on an analytical model and measured partial recovery behaviors of polymers, we can accurately control the folding characteristics (curvature and force) for the rational design. To demonstrate an application of our triple-state actuator, we have developed a self-folding transformer robot which self-folds from a two-dimensional sheet into a three-dimensional boat-like configuration and transforms from the boat shape to a car shape with the increase in the temperature applied to the actuator. Our findings offer a simple approach to generate multiple configurations from a single system by harnessing behaviors of polymers with the rational design.
NASA Astrophysics Data System (ADS)
Zhou, Li; Li, Zhenhua; Liu, Zhen; Yin, Meili; Ren, Jinsong; Qu, Xiaogang
2014-01-01
A simple and "green" strategy has been reported for the first time to fabricate upconversion nanoparticles (UCNPs) by utilizing nucleotides as bio-templates. The influence of the functionalities present on the nucleotide on the production of nanoparticles was investigated in detail. Through the effects of nucleotides, the obtained nanoparticles possessed a porous structure. The use of the as-prepared UCNPs for cell imaging, drug delivery and versatile therapy applications was demonstrated. In view of the bright up-conversion luminescence as well as the excellent biocompatibility, and the good colloidal stability of the as-prepared UCNPs, we envision that our synthesis protocol might advance both the fields of UCNPs and biomolecule-based nanotechnology for future studies.
Microseed matrix screening for optimization in protein crystallization: what have we learned?
D'Arcy, Allan; Bergfors, Terese; Cowan-Jacob, Sandra W; Marsh, May
2014-09-01
Protein crystals obtained in initial screens typically require optimization before they are of X-ray diffraction quality. Seeding is one such optimization method. In classical seeding experiments, the seed crystals are put into new, albeit similar, conditions. The past decade has seen the emergence of an alternative seeding strategy: microseed matrix screening (MMS). In this strategy, the seed crystals are transferred into conditions unrelated to the seed source. Examples of MMS applications from in-house projects and the literature include the generation of multiple crystal forms and different space groups, better diffracting crystals and crystallization of previously uncrystallizable targets. MMS can be implemented robotically, making it a viable option for drug-discovery programs. In conclusion, MMS is a simple, time- and cost-efficient optimization method that is applicable to many recalcitrant crystallization problems.
NASA Technical Reports Server (NTRS)
Rothmann, Elizabeth; Dugan, Joanne Bechta; Trivedi, Kishor S.; Mittal, Nitin; Bavuso, Salvatore J.
1994-01-01
The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. The Hybrid Automated Reliability Predictor (HARP) tutorial provides insight into HARP modeling techniques and the interactive textual prompting input language via a step-by-step explanation and demonstration of HARP's fault occurrence/repair model and the fault/error handling models. Example applications are worked in their entirety and the HARP tabular output data are presented for each. Simple models are presented at first with each succeeding example demonstrating greater modeling power and complexity. This document is not intended to present the theoretical and mathematical basis for HARP.
QRAP: A numerical code for projected (Q)uasiparticle (RA)ndom (P)hase approximation
NASA Astrophysics Data System (ADS)
Samana, A. R.; Krmpotić, F.; Bertulani, C. A.
2010-06-01
A computer code for the quasiparticle random phase approximation (QRPA) and the projected quasiparticle random phase approximation (PQRPA) models of nuclear structure is explained in detail. The residual interaction is approximated by a simple δ-force. An important application of the code consists in evaluating nuclear matrix elements involved in neutrino-nucleus reactions. As an example, cross sections for 56Fe and 12C are calculated and the code output is explained. The application to other nuclei and the description of other nuclear and weak decay processes are also discussed.
Program summary
Title of program: QRAP (Quasiparticle RAndom Phase approximation)
Computers: The code has been created on a PC, but also runs on UNIX or LINUX machines
Operating systems: WINDOWS or UNIX
Program language used: Fortran-77
Memory required to execute with typical data: 16 Mbytes of RAM memory and 2 MB of hard disk space
No. of lines in distributed program, including test data, etc.: ~8000
No. of bytes in distributed program, including test data, etc.: ~256 kB
Distribution format: tar.gz
Nature of physical problem: The program calculates neutrino- and antineutrino-nucleus cross sections as a function of the incident neutrino energy, and muon capture rates, using the QRPA or PQRPA as nuclear structure models.
Method of solution: The QRPA, or PQRPA, equations are solved in a self-consistent way for even-even nuclei. The nuclear matrix elements for the neutrino-nucleus interaction are treated as the beta inverse reaction of odd-odd nuclei as a function of the transfer momentum.
Typical running time: ≈5 min on a 3 GHz processor for Data set 1.
Biemba, Godfrey; Chiluba, Boniface; Yeboah-Antwi, Kojo; Silavwe, Vichaels; Lunze, Karsten; Mwale, Rodgers K; Russpatrick, Scott; Hamer, Davidson H
2017-01-01
Introduction: Effective community health management information systems (C-HMIS) are important in low-resource countries that rely heavily on community-based health care providers. Zambia currently lacks a functioning C-HMIS to provide real-time, community-based health information from community health workers (CHWs) to health center staff and higher levels of the health system.
Program Description: We developed a C-HMIS mobile platform for use by CHWs providing integrated community case management (iCCM) services and their supervisors to address challenges of frequent stock-outs and inadequate supportive supervision of iCCM-trained CHWs. The platform used simple feature mobile phones on which were loaded the District Health Information System version 2 (DHIS2) software and Java 2 platform micro edition (J2ME) aggregation and tracker applications. This project was implemented in Chipata and Chadiza districts, which supported previous mHealth programs and had cellular coverage from all 3 major network carriers in Zambia. A total of 40 CHWs and 20 CHW supervisors received mobile phones with data bundles and training in the mobile application, after which they implemented the program over a period of 5.5 months, from February to mid-July 2016. CHWs used the mobile phones to submit data on iCCM cases seen, managed, and referred, as well as iCCM medical and diagnostic supplies received and dispensed. Using their mobile phones, the supervisors tracked CHWs' reported cases with medicine consumption, sent CHWs feedback on their referrals, and received SMS reminders to set up mentorship sessions.
Observations: CHWs were able to use the mobile application to send weekly reports to health center supervisors on disease caseloads and medical commodities consumed, to make drug and supply requisitions, and to send pre-referral notices to health centers. Health center staff used the mobile system to provide feedback to CHWs on the case outcomes of referred patients and to receive automated monthly SMS reminders to invite CHWs to the facility for mentorship. District- and central-level staff were able to access community-level health data in real time using passwords.
Lessons Learned: C-HMIS, using simple feature phones, was feasible and viable for the provision of real-time community-based health information to all levels of the health care system in Zambia, but smartphones, laptops, or desktop computers are needed to perform data analysis and visualization. Ongoing technical support is needed to address the hardware and software challenges CHWs face in their day-to-day interaction with the application on their mobile phones. PMID:28855233
A simple approach to industrial laser safety.
Lewandowski, Michael A; Hinz, Michael W
2005-02-01
Industrial applications of lasers include marking, welding, cutting, and other material processing. Lasers used in these ways have significant power output but are generally designed to limit operator exposure to direct or scattered laser radiation to harmless levels in order to meet the Federal Laser Product Performance Standard (21CFR1040) for Class 1 laser products. Interesting challenges occur when companies integrate high power lasers into manufacturing or process control equipment. A significant part of the integration process is developing engineering and administrative controls to produce an acceptable level of laser safety while balancing production, maintenance, and service requirements. 3M Company uses a large number of high power lasers in numerous manufacturing processes. Whether the laser is purchased as a Class 1 laser product or whether it is purchased as a Class 4 laser and then integrated into a manufacturing application, 3M Company has developed an industrial laser safety program that maintains a high degree of laser safety while facilitating the rapid and economical integration of laser technology into the manufacturing workplace. This laser safety program is based on the requirements and recommendations contained in the American National Standard for Safe Use of Lasers, ANSI Z136.1. The fundamental components of the 3M program include hazard evaluation, engineering, administrative, and procedural controls, protective equipment, signs and labels, training, and re-evaluation upon change. This program is implemented in manufacturing facilities and has resulted in an excellent history of laser safety and an effective and efficient use of laser safety resources.
Kim, Sung-Jin; Wang, Fang; Burns, Mark A; Kurabayashi, Katsuo
2009-06-01
Micromixing is a crucial step for biochemical reactions in microfluidic networks. A critical challenge is that the system containing micromixers needs numerous pumps, chambers, and channels not only for the micromixing but also for the biochemical reactions and detections. Thus, a simple and compatible design of the micromixer element for the system is essential. Here, we propose a simple, yet effective, scheme that enables micromixing and a biochemical reaction in a single microfluidic chamber without using any pumps. We accomplish this process by using natural convection in conjunction with alternating heating of two heaters for efficient micromixing, and by regulating capillarity for sample transport. As a model application, we demonstrate micromixing and subsequent polymerase chain reaction (PCR) for an influenza viral DNA fragment. This process is achieved in a platform of a microfluidic cartridge and a microfabricated heating-instrument with a fast thermal response. Our results will significantly simplify micromixing and a subsequent biochemical reaction that involves reagent heating in microfluidic networks.
Multipurpose silicon photonics signal processor core.
Pérez, Daniel; Gasulla, Ivana; Crudgington, Lee; Thomson, David J; Khokhar, Ali Z; Li, Ke; Cao, Wei; Mashanovich, Goran Z; Capmany, José
2017-09-21
Integrated photonics changes the scaling laws of information and communication systems offering architectural choices that combine photonics with electronics to optimize performance, power, footprint, and cost. Application-specific photonic integrated circuits, where particular circuits/chips are designed to optimally perform particular functionalities, require a considerable number of design and fabrication iterations leading to long development times. A different approach inspired by electronic Field Programmable Gate Arrays is the programmable photonic processor, where a common hardware implemented by a two-dimensional photonic waveguide mesh realizes different functionalities through programming. Here, we report the demonstration of such reconfigurable waveguide mesh in silicon. We demonstrate over 20 different functionalities with a simple seven hexagonal cell structure, which can be applied to different fields including communications, chemical and biomedical sensing, signal processing, multiprocessor networks, and quantum information systems. Our work is an important step toward this paradigm. Integrated optical circuits today are typically designed for a few special functionalities and require complex design and development procedures. Here, the authors demonstrate a reconfigurable but simple silicon waveguide mesh with different functionalities.
Stationary echo canceling in velocity estimation by time-domain cross-correlation.
Jensen, J A
1993-01-01
The application of stationary echo canceling to ultrasonic estimation of blood velocities using time-domain cross-correlation is investigated. Expressions are derived that show the influence from the echo canceler on the signals that enter the cross-correlation estimator. It is demonstrated that the filtration results in a velocity-dependent degradation of the signal-to-noise ratio. An analytic expression is given for the degradation for a realistic pulse. The probability of correct detection at low signal-to-noise ratios is influenced by signal-to-noise ratio, transducer bandwidth, center frequency, number of samples in the range gate, and number of A-lines employed in the estimation. Quantitative results calculated by a simple simulation program are given for the variation in probability from these parameters. An index reflecting the reliability of the estimate at hand can be calculated from the actual cross-correlation estimate by a simple formula and used in rejecting poor estimates or in displaying the reliability of the velocity estimated.
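A small self-contained sketch of the processing chain discussed above (not Jensen's implementation; the pulse shape, sampling rate and velocity are assumed values): two synthetic RF lines containing a strong stationary echo and a moving scatterer are echo-canceled by subtracting consecutive lines, and the residuals are cross-correlated to estimate the inter-line time shift and hence the velocity.

    import numpy as np

    fs = 100e6            # RF sampling frequency [Hz]
    f0 = 5e6              # transducer center frequency [Hz]
    c = 1540.0            # speed of sound [m/s]
    T_prf = 1.0 / 5e3     # pulse repetition period [s]
    v_true = 0.5          # scatterer velocity [m/s]
    shift_s = 2 * v_true * T_prf / c          # round-trip time shift per emission

    t = np.arange(0, 10e-6, 1 / fs)

    def rf_line(extra_delay, stationary_amp=5.0):
        pulse = lambda tau: np.exp(-((t - tau) / 0.3e-6) ** 2) * np.sin(2 * np.pi * f0 * (t - tau))
        # Strong stationary echo at 4 us plus a moving scatterer near 6 us.
        return stationary_amp * pulse(4e-6) + pulse(6e-6 + extra_delay)

    lines = [rf_line(i * shift_s) for i in range(3)]
    # Stationary echo canceling: subtracting consecutive lines removes the fixed echo.
    d1 = lines[1] - lines[0]
    d2 = lines[2] - lines[1]
    xc = np.correlate(d2, d1, mode="full")
    lag = np.argmax(xc) - (len(d1) - 1)       # lag in samples
    v_est = c * (lag / fs) / (2 * T_prf)
    print(f"true {v_true:.2f} m/s, estimated {v_est:.2f} m/s")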
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-23
... on the proposed rule change from interested persons. \\1\\ 15 U.S.C. 78s(b)(1). \\2\\ 17 CFR 240.19b-4. I....'' Specifically, the Exchange proposes to amend the Customer Rebate Program, Select Symbols,\\5\\ Simple and Complex... Category D to the Customer Rebate Program relating to Customer Simple Orders in Select Symbols. The...
ERIC Educational Resources Information Center
Shih, Ching-Hsiang; Chang, Man-Ling; Mohua, Zhang
2012-01-01
This study evaluated whether two people with developmental disabilities would be able to actively perform simple occupational activities to control their preferred environmental stimulation using a Nintendo Wii Remote Controller with a newly developed three-dimensional object orientation detection program (TDOODP, i.e. a new software program,…
Simple Numerical Analysis of Longboard Speedometer Data
ERIC Educational Resources Information Center
Hare, Jonathan
2013-01-01
Simple numerical data analysis is described, using a standard spreadsheet program, to determine distance, velocity (speed) and acceleration from voltage data generated by a skateboard/longboard speedometer (Hare 2012 "Phys. Educ." 47 409-17). This simple analysis is an introduction to data processing including scaling data as well as…
Ranking Forestry Investments With Parametric Linear Programming
Paul A. Murphy
1976-01-01
Parametric linear programming is introduced as a technique for ranking forestry investments under multiple constraints; it combines the advantages of simple ranking and linear programming as capital budgeting tools.
Hadjichristodoulou, Christos; Soteriades, Elpidoforos S; Kolonia, Virginia; Falagas, Matthew E; Pantelopoulos, Efstathios; Panagakos, Georgios; Mouchtouri, Varvara; Kremastinou, Jeni
2005-09-02
The use of geographical information system (GIS) technologies in public health surveillance is gradually gaining momentum around the world and many applications have already been reported in the literature. In this study, GIS technology was used to help county departments of Public Health to implement environmental health surveillance for the Athens 2004 Olympic and Paralympic Games. In order to assess the workload in each Olympic county, 19 registry forms and 17 standardized inspection forms were developed to register and inspect environmental health items requiring inspection (hotels, restaurants, swimming pools, water supply systems, etc.), respectively. Furthermore, related databases were created using Epi Info 2002 and a geographical information system (GIS) was used to implement an integrated Environmental Health inspection program. The project was conducted in Athens by the Olympic Planning Unit (OPU) of the National School of Public Health, in close cooperation with the Ministry of Health and Social Solidarity and the corresponding departments of Public Health in all municipalities that were scheduled to host events during the Athens 2004 Olympic and Paralympic Games. A total of 44,741 premises of environmental health interest were geocoded into GIS databases and several electronic maps were developed. Using such maps in association with specific criteria, we first identified the maximum workload required to execute environmental health inspections in all premises within the eleven Olympic County Departments of Public Health. Six different scenarios were created for each county, based on devised algorithms, in order to design the most effective and realistic inspection program using the available inspectors from each municipality. Furthermore, GIS applications were used to organize the daily inspection program for the Olympic Games, provide coloured displays of the inspection results and link those results with the public health surveillance of specific cases or outbreak investigation. Our computerised program exhibited significant efficiency in facilitating the prudent use of public health resources in implementing environmental health inspections in densely populated urban areas as well as in rural counties. Furthermore, the application of simple algorithms in integrating human and other resources provided tailored and cost-effective applications to different public health agencies.
Reverse Aging of Composite Materials for Aeronautical Applications
NASA Astrophysics Data System (ADS)
lannone, Michele
2008-08-01
Hygro-thermal ageing of polymer matrix composite materials is a major issue for all aeronautical structures. For the carbon-epoxy composites generally used in aeronautical applications, the major effect of ageing is humidity absorption, which induces a plasticization effect, generally decreasing Tg, the elastic moduli and, finally, design allowables. A thermodynamic and kinetic study has been performed, aimed at establishing a program of periodic heating of the composite part capable of reversing the ageing effect by inducing water desorption. The study was based on a simple model derived from Fick's law, coupled with a concept of "relative saturation coefficient" that depends on the different temperatures of the composite part and the environment. The behaviour of some structures exposed to humidity and "reverse aged" by heating has been virtually tested. The conclusions of the study allowed a specific patent application to be filed for aeronautical structures designed on the basis of a "humidity free" concept, which permits the use of higher design allowables and ultimately yields lighter composite structures with a simplified certification process.
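A hedged numerical sketch of a Fickian absorption/desorption model of the kind described above, using an explicit 1-D finite-difference scheme; the thickness, diffusivities, exposure times and the representation of a heating episode (lower surface saturation, higher diffusivity) are illustrative assumptions, not the study's values.

    import numpy as np

    h, n = 4e-3, 41                     # laminate thickness [m], grid points
    x = np.linspace(0, h, n)
    dx = x[1] - x[0]

    def step(c, D, c_surf, dt):
        # One explicit step of Fick's second law with surface concentration
        # forced to the environment's relative saturation level.
        c = c.copy()
        c[1:-1] += D * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])
        c[0] = c[-1] = c_surf
        return c

    c = np.zeros(n)                      # dry laminate
    D_cold, D_hot = 5e-13, 5e-12         # diffusivity [m^2/s], rises with temperature
    dt = 0.2 * dx**2 / D_hot             # within the explicit stability limit
    for _ in range(50000):               # humid exposure at ambient temperature
        c = step(c, D_cold, c_surf=1.0, dt=dt)
    m_wet = c.mean()
    for _ in range(20000):               # periodic "reverse ageing" heating episode
        c = step(c, D_hot, c_surf=0.1, dt=dt)
    print(f"average moisture: {m_wet:.3f} -> {c.mean():.3f} after heating")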
The Buffer Diagnostic Prototype: A fault isolation application using CLIPS
NASA Technical Reports Server (NTRS)
Porter, Ken
1994-01-01
This paper describes problem domain characteristics and development experiences from using CLIPS 6.0 in a proof-of-concept troubleshooting application called the Buffer Diagnostic Prototype. The problem domain is a large digital communications subsystem called the real-time network (RTN), which was designed to upgrade the launch processing system used for shuttle support at KSC. The RTN enables up to 255 computers to share 50,000 data points with millisecond response times. The RTN's extensive built-in test capability, combined with its lack of any automatic fault isolation capability, presents a unique opportunity for a diagnostic expert system application. The Buffer Diagnostic Prototype addresses RTN diagnosis with a multiple strategy approach. A novel technique called 'faulty causality' employs inexact qualitative models to process test results. Experimental knowledge provides a capability to recognize symptom-fault associations. The implementation utilizes rule-based and procedural programming techniques, including a goal-directed control structure and a simple text-based generic user interface that may be reusable for other rapid prototyping applications. Although limited in scope, this project demonstrates a diagnostic approach that may be adapted to troubleshoot a broad range of equipment.
ICL: The Image Composition Language
NASA Technical Reports Server (NTRS)
Foley, James D.; Kim, Won Chul
1986-01-01
The Image Composition Language (ICL) provides a convenient way for programmers of interactive graphics application programs to define how the video look-up table of a raster display system is to be loaded. The ICL allows one or several images stored in the frame buffer to be combined in a variety of ways. The ICL treats these images as variables, and provides arithmetic, relational, and conditional operators to combine the images, scalar variables, and constants in image composition expressions. The objective of ICL is to provide programmers with a simple way to compose images, to relieve the tedium usually associated with loading the video look-up table to obtain desired results.
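An illustrative sketch of the expression idea, with NumPy arrays standing in for frame-buffer images; the expression, variable names and threshold are hypothetical and the snippet does not reproduce ICL syntax.

    import numpy as np

    # Composing two "frame-buffer images" with arithmetic and conditional
    # operators, in the spirit of an expression such as:
    #     display = if (A > threshold) then A else (A + B) / 2
    rng = np.random.default_rng(0)
    A = rng.integers(0, 256, size=(4, 4))       # image variable A
    B = rng.integers(0, 256, size=(4, 4))       # image variable B
    threshold = 128                              # scalar variable

    composed = np.where(A > threshold, A, (A + B) // 2)

    # A single-image expression can instead be evaluated once per pixel value
    # to fill a video look-up table, which is then applied to the stored image.
    lut = np.array([v if v > threshold else v // 2 for v in range(256)], dtype=np.uint8)
    display = lut[A]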
Reusable software parts and the semi-abstract data type
NASA Technical Reports Server (NTRS)
Cohen, Sanford G.
1986-01-01
The development of reusable software parts has been an area of intense discussion within the software community for many years. An approach is described for developing reusable parts for the applications of missile guidance, navigation and control which meet the following criteria: (1) Reusable; (2) Tailorable; (3) Efficient; (4) Simple to use; and (5) Protected against misuse. Validating the feasibility of developing reusable parts which possess these characteristics is the basis of the Common Ada Missile Packages Program (CAMP). Under CAMP, over 200 reusable software parts were developed, including parts for navigation, Kalman filtering, signal processing and autopilots. Six different methods are presented for designing reusable software parts.
NASA Technical Reports Server (NTRS)
Weller, T.
1977-01-01
The applicability and adequacy of several computer techniques in satisfactorily predicting the nonlinear/inelastic response of angle ply laminates were evaluated. The analytical predictions were correlated with the results of a test program on the inelastic response under axial compression of a large variety of graphite-epoxy and boron-epoxy angle ply laminates. These comparison studies indicate that neither of the above-mentioned analyses can satisfactorily predict either the mode of response or the ultimate stress value corresponding to a particular angle ply laminate configuration. Consequently, the simple failure mechanisms assumed in the analytical models were also not verified.
NASA Astrophysics Data System (ADS)
Huang, Da; Freeley, Mark; Palma, Matteo
2017-03-01
We present a facile strategy of general applicability for the assembly of individual nanoscale moieties in array configurations with single-molecule control. Combining the programming ability of DNA as a scaffolding material with a one-step lithographic process, we demonstrate the patterning of single quantum dots (QDs) at predefined locations on silicon and transparent glass surfaces: as proof of concept, clusters of either one, two, or three QDs were assembled in highly uniform arrays with a 60 nm interdot spacing within each cluster. Notably, the platform developed is reusable after a simple cleaning process and can be designed to exhibit different geometrical arrangements.
On the predictions of the 11B solid state NMR parameters
NASA Astrophysics Data System (ADS)
Czernek, Jiří; Brus, Jiří
2016-07-01
A set of boron-containing compounds has been subjected to prediction of the 11B solid-state NMR spectral parameters using DFT-GIPAW methods that properly treat the solid-phase effects. A quantification of the differences between measured and theoretical values is presented, which is directly applicable in structural studies involving 11B nuclei. In particular, a simple scheme has been proposed that is expected to provide an estimate of the 11B chemical shift within ±2.0 ppm of the experimental value. The computer program INFOR, which enables the visualization of the concomitant Euler rotations related to the tensorial transformations, is also presented.
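A hedged illustration of a simple linear scaling scheme of the kind referred to above: computed isotropic shieldings are mapped to experimental 11B shifts by a least-squares line, which is then used to predict new shifts. The numbers are placeholders, not the paper's data set.

    import numpy as np

    sigma_calc = np.array([80.5, 95.2, 101.7, 110.3, 118.9])   # computed shieldings [ppm]
    delta_exp = np.array([18.0, 5.1, -1.2, -9.5, -17.8])       # measured shifts [ppm]

    # Fit delta = slope * sigma + intercept and use it as the prediction scheme.
    slope, intercept = np.polyfit(sigma_calc, delta_exp, 1)
    predict = lambda sigma: slope * sigma + intercept

    residuals = delta_exp - predict(sigma_calc)
    print(f"delta = {slope:.3f}*sigma + {intercept:.1f}, "
          f"max |residual| = {np.abs(residuals).max():.2f} ppm")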
Application of Peterson's stray light model to complex optical instruments
NASA Astrophysics Data System (ADS)
Fray, S.; Goepel, M.; Kroneberger, M.
2016-07-01
Gary L. Peterson (Breault Research Organization) presented a simple analytical model for in-field stray light evaluation of axial optical systems. We exploited this idea for more complex optical instruments of the Meteosat Third Generation (MTG) mission. For the Flexible Combined Imager (FCI) we evaluated the in-field stray light of its three-mirror anastigmat telescope, while for the Infrared Sounder (IRS) we performed an end-to-end analysis including the front telescope, interferometer and back telescope assembly and the cold optics. A comparison to simulations will be presented. The authors acknowledge the support by ESA and Thales Alenia Space through the MTG satellites program.
A wideband, high-resolution spectrum analyzer
NASA Technical Reports Server (NTRS)
Quirk, M. P.; Wilck, H. C.; Garyantes, M. F.; Grimm, M. J.
1988-01-01
A two-million-channel, 40 MHz bandwidth, digital spectrum analyzer under development at the Jet Propulsion Laboratory is described. The analyzer system will serve as a prototype processor for the sky survey portion of NASA's Search for Extraterrestrial Intelligence program and for other applications in the Deep Space Network. The analyzer digitizes an analog input, performs a 2^21-point Discrete Fourier Transform, accumulates the output power, normalizes the output to remove frequency-dependent gain, and automates simple signal detection algorithms. Due to its built-in frequency-domain processing functions and configuration flexibility, the analyzer is a very powerful tool for real-time signal analysis.
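A scaled-down sketch of the processing chain described in this and the following entries (transform, power accumulation, gain normalization, simple threshold detection); the transform length, frame count and gain curve are illustrative and far smaller than the 2^21-point hardware transform.

    import numpy as np

    N, n_frames = 2**12, 16
    rng = np.random.default_rng(1)
    gain = 1.0 + 0.3 * np.sin(np.linspace(0, np.pi, N))     # frequency-dependent gain

    acc = np.zeros(N)
    for _ in range(n_frames):
        # Noisy input with a weak tone at bin 1000.
        x = rng.normal(size=N) + 0.2 * np.sin(2 * np.pi * 1000 * np.arange(N) / N)
        spec = np.abs(np.fft.fft(x)) ** 2                   # power spectrum of one frame
        acc += gain * spec                                  # accumulate (with channel gain)

    normalized = acc / (n_frames * gain)                    # remove frequency-dependent gain
    baseline = np.median(normalized)
    detections = np.flatnonzero(normalized > 5 * baseline)  # simple threshold detector
    print(detections)                                       # expect bins near 1000 and N-1000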
A wide-band high-resolution spectrum analyzer
NASA Technical Reports Server (NTRS)
Quirk, Maureen P.; Garyantes, Michael F.; Wilck, Helmut C.; Grimm, Michael J.
1988-01-01
A two-million-channel, 40 MHz bandwidth, digital spectrum analyzer under development at the Jet Propulsion Laboratory is described. The analyzer system will serve as a prototype processor for the sky survey portion of NASA's Search for Extraterrestrial Intelligence program and for other applications in the Deep Space Network. The analyzer digitizes an analog input, performs a 2^21-point Discrete Fourier Transform, accumulates the output power, normalizes the output to remove frequency-dependent gain, and automates simple detection algorithms. Due to its built-in frequency-domain processing functions and configuration flexibility, the analyzer is a very powerful tool for real-time signal analysis.
Lee, Albert; Gibbs, Susannah E
2013-02-01
Adolescent obesity has become an increasingly urgent issue in low- and middle-income countries. Recent relevant advances include the application of the neurobiology of addiction to food addiction and obesity. The biochemistry of the etiology of obesity indicates the need for multilevel interventions that go beyond simple behavioral approaches. Additional research on the neurobiology of food addiction and adolescent obesity in low- and middle-income countries, as well as program evaluations that examine the biochemical effects of complex interventions, is urgently needed.
A wide-band high-resolution spectrum analyzer.
Quirk, M P; Garyantes, M F; Wilck, H C; Grimm, M J
1988-12-01
This paper describes a two-million-channel, 40-MHz-bandwidth digital spectrum analyzer under development at the Jet Propulsion Laboratory. The analyzer system will serve as a prototype processor for the sky survey portion of NASA's Search for Extraterrestrial Intelligence program and for other applications in the Deep Space Network. The analyzer digitizes an analog input, performs a 2^21-point Discrete Fourier Transform, accumulates the output power, normalizes the output to remove frequency-dependent gain, and automates simple signal detection algorithms. Due to its built-in frequency-domain processing functions and configuration flexibility, the analyzer is a very powerful tool for real-time signal analysis and detection.
A Practical Model for Forecasting New Freshman Enrollment during the Application Period.
ERIC Educational Resources Information Center
Paulsen, Michael B.
1989-01-01
A simple and effective model for forecasting freshman enrollment during the application period is presented step by step. The model requires minimal and readily available information, uses a simple linear regression analysis on a personal computer, and provides updated monthly forecasts. (MSE)
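A toy version of the forecasting idea (the application counts and enrollments below are hypothetical): regress final freshman enrollment on applications received by a given month in past cycles, then apply the fitted line to the current cycle's count.

    import numpy as np

    apps_by_march = np.array([2100, 2350, 2280, 2500, 2620])   # past years, cumulative to March
    final_enrollment = np.array([640, 705, 690, 760, 790])     # matching freshman classes

    slope, intercept = np.polyfit(apps_by_march, final_enrollment, 1)
    current_apps = 2440                                        # this year's count to date
    forecast = slope * current_apps + intercept
    print(f"forecast freshman class: {forecast:.0f}")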
Kim, Hye Hyeon; Seo, Hwa Jeong
2014-07-01
Diabetes is a chronic disease of continuously increasing prevalence. It carries risks of serious complications and therefore warrants long-term management. However, current health management and education programs for diabetes consist mainly of one-way communication, and systematic social support to address diabetics' emotional problems is insufficient. Guided by the stages of individual behavioral change described by the Transtheoretical Model, we designed a non-drug intervention, including exercise, and applied it to a mobile-based application. For effective data sharing between patients and physicians, we adopted an SNS function in our application to offer a social support environment. Continual and comprehensive diabetes care requires rigorous self-management throughout the patient's life, which is possible through a collaborative patient-physician healthcare model. We designed and developed an SNS-based diabetes self-management mobile application that supports social groups organized into three social GYM types. Through simple testing with patients in their 20s and 30s, we were able to validate the usefulness of our application. Mobile-device-based management of chronic disease symptoms and interventions allows health management to be conducted anywhere and at any time, helping to cope with the rising demand for health and medical services caused by population aging and with the surge in national medical service costs. This patient-driven and SNS-based intervention program is expected to contribute to promoting the health management habits of diabetics, who need to constantly receive health guidance.
First Experiences with the New Senhance® Telerobotic System in Visceral Surgery.
Stephan, Dietmar; Sälzer, Heike; Willeke, Frank
2018-02-01
Until recently, robotic-assisted surgery has exclusively been connected to the name DaVinci®. In 2016, a second robotic system, the Senhance®, became available. To introduce the new robotic system into clinical routine, detailed team training and an integration program were useful. Within the first 6 months, 116 cases were performed with this system. The integration program intended to start with simple and well-standardized clinical cases. We chose inguinal hernia repair using the TAPP (transabdominal preperitoneal) technique as the starting procedure. Subsequently, we added upper gastrointestinal surgery and cholecystectomies, and colorectal procedures have since also been included. Initial experience with the Senhance system as the first installation in Germany shows that it is suitable for surgery in general and for visceral surgery in particular. The application is safe due to the unproblematically quick changeover to normal laparoscopy and easy to integrate due to the very short system integration times (docking times). Since it is a laparoscopic-based system, following an integration program will enable experienced laparoscopic surgeons to very quickly manage more complex procedures. Due to lower costs, introducing robotic surgery starting with simple and standardized procedures is more feasible. After the establishment of this second robotic system, future studies will have to specifically look at differences in surgical results and basic conditions of different robotic-assisted systems. This paper documents the decision-making process of a hospital towards the integration of a robotic system and the selection criteria used while also demonstrating the planning and execution process during the introduction of the system into clinical routine.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Originally developed in 1999, an updated version 8.8.0 with bug fixes was released on September 30th, 2017. EnergyPlus™ is a whole building energy simulation program that engineers, architects, and researchers use to model both energy consumption—for heating, cooling, ventilation, lighting and plug and process loads—and water use in buildings. EnergyPlus is a console-based program that reads input and writes output to text files. It ships with a number of utilities including IDF-Editor for creating input files using a simple spreadsheet-like interface, EP-Launch for managing input and output files and performing batch simulations, and EP-Compare for graphically comparing the results of two or more simulations. Several comprehensive graphical interfaces for EnergyPlus are also available. DOE does most of its work with EnergyPlus using the OpenStudio® software development kit and suite of applications. DOE releases major updates to EnergyPlus twice annually.
System analysis in rotorcraft design: The past decade
NASA Technical Reports Server (NTRS)
Galloway, Thomas L.
1988-01-01
Rapid advances in the technology of electronic digital computers and the need for an integrated synthesis approach in developing future rotorcraft programs have led to increased emphasis on system analysis techniques in rotorcraft design. The task in systems analysis is to deal with complex, interdependent, and conflicting requirements in a structured manner so rational and objective decisions can be made. Whether the results are wisdom or rubbish depends upon the validity and, sometimes more importantly, the consistency of the inputs, the correctness of the analysis, and a sensible choice of measures of effectiveness to draw conclusions. In rotorcraft design this means combining design requirements, technology assessment, sensitivity analysis and review techniques currently in use by NASA and Army organizations in developing research programs and vehicle specifications for rotorcraft. These procedures span simple graphical approaches to comprehensive analysis on large mainframe computers. Examples of recent applications to military and civil missions are highlighted.
NASA Technical Reports Server (NTRS)
Pagnutti, Mary; Ryan, Robert E.; Holekamp, Kara; Harrington, Gary; Frisbie, Troy
2006-01-01
A simple, cost-effective hyperspectral sun photometer for radiometric vicarious calibration of remote sensing systems, air quality monitoring, and potentially in-situ planetary climatological studies was developed. The device was constructed solely from off-the-shelf components and was designed to be easily deployable in support of short-term verification and validation data collects. This sun photometer not only provides the same data products as existing multi-band sun photometers, but also requires a simpler setup and less data acquisition time and allows for a more direct calibration approach. Fielding this instrument has also enabled Stennis Space Center (SSC) Applied Sciences Directorate personnel to cross-calibrate existing sun photometers. This innovative research will position SSC personnel to perform air quality assessments in support of the NASA Applied Sciences Program's National Applications program element as well as to develop techniques to evaluate aerosols in a Martian or other planetary atmosphere.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, J.P.; Bangs, A.L.; Butler, P.L.
Hetero Helix is a programming environment which simulates shared memory on a heterogeneous network of distributed-memory computers. The machines in the network may vary with respect to their native operating systems and internal representation of numbers. Hetero Helix presents a simple programming model to developers, and also considers the needs of designers, system integrators, and maintainers. The key software technology underlying Hetero Helix is the use of a "compiler" which analyzes the data structures in shared memory and automatically generates code which translates data representations from the format native to each machine into a common format, and vice versa. The design of Hetero Helix was motivated in particular by the requirements of robotics applications. Hetero Helix has been used successfully in an integration effort involving 27 CPUs in a heterogeneous network and a body of software totaling roughly 100,000 lines of code. 25 refs., 6 figs.
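A hedged sketch of the representation-translation idea only (not the Hetero Helix compiler's generated code): each host converts its native record layout to an agreed common wire format before placing it in the simulated shared memory, and converts back on reads. The record layout and field names are made up.

    import struct

    COMMON_FMT = ">i d 16s"          # common format: big-endian int32, float64, 16-byte name

    def to_common(record):
        # Translate a native record into the machine-independent common format.
        joint_id, angle, name = record
        return struct.pack(COMMON_FMT, joint_id, angle, name.encode())

    def from_common(blob):
        # Translate the common format back into native Python values.
        joint_id, angle, name = struct.unpack(COMMON_FMT, blob)
        return joint_id, angle, name.rstrip(b"\0").decode()

    shared = to_common((3, 1.5708, "elbow"))     # written by one machine
    print(from_common(shared))                    # read back identically on any other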
Interactive display of molecular models using a microcomputer system
NASA Technical Reports Server (NTRS)
Egan, J. T.; Macelroy, R. D.
1980-01-01
A simple, microcomputer-based, interactive graphics display system has been developed for the presentation of perspective views of wire frame molecular models. The display system is based on a TERAK 8510a graphics computer system with a display unit consisting of microprocessor, television display and keyboard subsystems. The operating system includes a screen editor, file manager, PASCAL and BASIC compilers and command options for linking and executing programs. The graphics program, written in UCSD PASCAL, involves the centering of the coordinate system, the transformation of centered model coordinates into homogeneous coordinates, the construction of a viewing transformation matrix to operate on the coordinates, clipping invisible points, perspective transformation and scaling to screen coordinates; commands available include ZOOM, ROTATE, RESET, and CHANGEVIEW. Data file structure was chosen to minimize the amount of disk storage space. Despite the inherent slowness of the system, its low cost and flexibility suggest general applicability.
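A minimal sketch of the display pipeline summarized above: center the model, apply a viewing transformation in homogeneous coordinates, perform the perspective divide, and scale to screen coordinates. The coordinates, rotation angle and screen size are illustrative, and the snippet is not the original Pascal program.

    import numpy as np

    atoms = np.array([[0.0, 0.0, 0.0], [1.1, 0.0, 0.0], [1.6, 0.9, 0.3]])  # model coords
    bonds = [(0, 1), (1, 2)]

    centered = atoms - atoms.mean(axis=0)                     # center the coordinate system
    homog = np.hstack([centered, np.ones((len(atoms), 1))])   # homogeneous coordinates

    angle = np.radians(30)                                    # e.g. a ROTATE about y
    view = np.array([[ np.cos(angle), 0, np.sin(angle), 0],
                     [ 0,             1, 0,             0],
                     [-np.sin(angle), 0, np.cos(angle), 5],   # push model 5 units from the eye
                     [ 0,             0, 0,             1]])
    eye = (view @ homog.T).T

    d = 2.0                                                    # viewing distance (ZOOM factor)
    proj = d * eye[:, :2] / eye[:, 2:3]                        # perspective divide
    screen = (proj * 120 + np.array([160, 120])).astype(int)   # scale to a 320x240 screen
    for i, j in bonds:
        print("draw line", screen[i], "->", screen[j])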
Reconfigurable photonic crystals enabled by pressure-responsive shape-memory polymers
Fang, Yin; Ni, Yongliang; Leo, Sin-Yen; Taylor, Curtis; Basile, Vito; Jiang, Peng
2015-01-01
Smart shape-memory polymers can memorize and recover their permanent shape in response to an external stimulus (for example, heat). They have been extensively exploited for a wide spectrum of applications ranging from biomedical devices to aerospace morphing structures. However, most of the existing shape-memory polymers are thermoresponsive and their performance is hindered by heat-demanding programming and recovery steps. Although pressure, like temperature, is an easily adjustable process variable, pressure-responsive shape-memory polymers are largely unexplored. Here we report a series of shape-memory polymers that enable unusual 'cold' programming and instantaneous shape recovery triggered by applying a contact pressure at ambient conditions. Moreover, the interdisciplinary integration of scientific principles drawn from two disparate fields—the fast-growing photonic crystal and shape-memory polymer technologies—enables fabrication of reconfigurable photonic crystals and simultaneously provides a simple and sensitive optical technique for investigating the intriguing shape-memory effects at the nanoscale. PMID:26074349
DIY Soundcard Based Temperature Logging System. Part II: Applications
ERIC Educational Resources Information Center
Nunn, John
2016-01-01
This paper demonstrates some simple applications of how temperature logging systems may be used to monitor simple heat experiments, and how the data obtained can be analysed to get some additional insight into the physical processes. [For "DIY Soundcard Based Temperature Logging System. Part I: Design," see EJ1114124.]
77 FR 3506 - Copyright Office Fees
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-24
... spring. As part of the process of formulating that fee schedule, the Office seeks the views of interested... recognition of the lower cost in processing such simple applications as well as the need to encourage.... There is some precedent for special treatment for simple basic applications by individual authors. Since...
Wicks, Laura C.; Cairns, Gemma S.; Melnyk, Jacob; Bryce, Scott; Duncan, Rory R.; Dalgarno, Paul A.
2018-01-01
We developed a simple, cost-effective smartphone microscopy platform for use in educational and public engagement programs. We demonstrated its effectiveness, and its potential for citizen science, through a national imaging initiative, EnLightenment. The cost effectiveness of the instrument allowed the program to deliver over 500 microscopes to more than 100 secondary schools throughout Scotland, targeting thousands of 12-14 year olds. Through careful, quantified selection of a high-power, low-cost objective lens, our smartphone microscope has an imaging resolution of microns, with a working distance of 3 mm. It is therefore capable of imaging single cells and sub-cellular features, and retains usability for young children. The microscopes were designed in kit form and provided an interdisciplinary educational tool. By providing full lesson plans and support material, we developed a framework to explore optical design, microscope performance, engineering challenges in construction, and real-world applications in life sciences, biological imaging, marine biology, art, and technology. A national online imaging competition framed EnLightenment, with over 500 high-quality images of diverse content submitted, spanning multiple disciplines. With examples of cellular and sub-cellular features clearly identifiable in some submissions, we show how the young public can use these instruments for research-level imaging applications, and the potential of the instrument for citizen science programs. PMID:29623296
Practical Pocket PC Application w/Biometric Security
NASA Technical Reports Server (NTRS)
Logan, Julian
2004-01-01
I work in the Flight Software Engineering Branch, where we provide design and development of embedded real-time software applications for flight and supporting ground systems to support the NASA Aeronautics and Space Programs. In addition, this branch evaluates, develops and implements new technologies for embedded real-time systems, and maintains a laboratory for applications of embedded technology. The majority of microchips that are used in modern society have been programmed using embedded technology. These small chips can be found in microwaves, calculators, home security systems, cell phones and more. My assignment this summer entails working with an iPAQ HP 5500 Pocket PC. This top-of-the-line hand-held device is one of the first mobile PC's to introduce biometric security capabilities. Biometric security, in this case a fingerprint authentication system, is on the edge of technology as far as securing information. The benefits of fingerprint authentication are enormous. The most significant of them are that it is extremely difficult to reproduce someone else's fingerprint, and it is equally difficult to lose or forget your own fingerprint as opposed to a password or pin number. One of my goals for this summer is to integrate this technology with another Pocket PC application. The second task for the summer is to develop a simple application that provides an Astronaut EVA (Extravehicular Activity) Log Book capability. The Astronaut EVA Log Book is what an astronaut would use to report the status of field missions, crew physical health, successes, future plans, etc. My goal is to develop a user interface into which these data fields can be entered and stored. The applications that I am developing are created using eMbedded Visual C++ 4.0 with the Pocket PC 2003 Software Development Kit provided by Microsoft.
Scaling Irregular Applications through Data Aggregation and Software Multithreading
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morari, Alessandro; Tumeo, Antonino; Chavarría-Miranda, Daniel
Bioinformatics, data analytics, semantic databases, and knowledge discovery are emerging high performance application areas that exploit dynamic, linked data structures such as graphs, unbalanced trees or unstructured grids. These data structures usually are very large, requiring significantly more memory than available on single shared memory systems. Additionally, these data structures are difficult to partition on distributed memory systems. They also present poor spatial and temporal locality, thus generating unpredictable memory and network accesses. The Partitioned Global Address Space (PGAS) programming model seems suitable for these applications, because it allows using a shared memory abstraction across distributed-memory clusters. However, current PGAS languages and libraries are built to target regular remote data accesses and block transfers. Furthermore, they usually rely on the Single Program Multiple Data (SPMD) parallel control model, which is not well suited to the fine grained, dynamic and unbalanced parallelism of irregular applications. In this paper we present GMT (Global Memory and Threading library), a custom runtime library that enables efficient execution of irregular applications on commodity clusters. GMT integrates a PGAS data substrate with simple fork/join parallelism and provides automatic load balancing on a per node basis. It implements multi-level aggregation and lightweight multithreading to maximize memory and network bandwidth with fine-grained data accesses and tolerate long data access latencies. A key innovation in the GMT runtime is its thread specialization (workers, helpers and communication threads) that realizes the overall functionality. We compare our approach with other PGAS models, such as UPC running using GASNet, and hand-optimized MPI code on a set of typical large-scale irregular applications, demonstrating speedups of an order of magnitude.
2013-01-01
Background: Traditional Lot Quality Assurance Sampling (LQAS) designs assume observations are collected using simple random sampling. Alternatively, randomly sampling clusters of observations and then individuals within clusters reduces costs but decreases the precision of the classifications. In this paper, we develop a general framework for designing the cluster(C)-LQAS system and illustrate the method with the design of data quality assessments for the community health worker program in Rwanda.
Results: To determine sample size and decision rules for C-LQAS, we use the beta-binomial distribution to account for inflated risk of errors introduced by sampling clusters at the first stage. We present general theory and code for sample size calculations. The C-LQAS sample sizes provided in this paper constrain misclassification risks below user-specified limits. Multiple C-LQAS systems meet the specified risk requirements, but numerous considerations, including per-cluster versus per-individual sampling costs, help identify optimal systems for distinct applications.
Conclusions: We show the utility of C-LQAS for data quality assessments, but the method generalizes to numerous applications. This paper provides the necessary technical detail and supplemental code to support the design of C-LQAS for specific programs. PMID:24160725
Hedt-Gauthier, Bethany L; Mitsunaga, Tisha; Hund, Lauren; Olives, Casey; Pagano, Marcello
2013-10-26
Traditional Lot Quality Assurance Sampling (LQAS) designs assume observations are collected using simple random sampling. Alternatively, randomly sampling clusters of observations and then individuals within clusters reduces costs but decreases the precision of the classifications. In this paper, we develop a general framework for designing the cluster(C)-LQAS system and illustrate the method with the design of data quality assessments for the community health worker program in Rwanda. To determine sample size and decision rules for C-LQAS, we use the beta-binomial distribution to account for inflated risk of errors introduced by sampling clusters at the first stage. We present general theory and code for sample size calculations. The C-LQAS sample sizes provided in this paper constrain misclassification risks below user-specified limits. Multiple C-LQAS systems meet the specified risk requirements, but numerous considerations, including per-cluster versus per-individual sampling costs, help identify optimal systems for distinct applications. We show the utility of C-LQAS for data quality assessments, but the method generalizes to numerous applications. This paper provides the necessary technical detail and supplemental code to support the design of C-LQAS for specific programs.
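A hedged sketch of the risk calculation idea (not the authors' supplemental code): the number of correct reports among n sampled records is modeled as beta-binomial, with an assumed intra-cluster correlation inflating the variance relative to simple random sampling, and a lot passes when at least d records are correct. The thresholds and correlation below are placeholders.

    from scipy.stats import betabinom

    def risks(n, d, p_high, p_low, rho):
        # Beta-binomial parameters matching mean p and intra-cluster correlation rho.
        def ab(p):
            return p * (1 - rho) / rho, (1 - p) * (1 - rho) / rho
        a_hi, b_hi = ab(p_high)
        a_lo, b_lo = ab(p_low)
        alpha = betabinom.cdf(d - 1, n, a_hi, b_hi)      # risk of failing a truly good area
        beta = betabinom.sf(d - 1, n, a_lo, b_lo)        # risk of passing a truly poor area
        return alpha, beta

    for n, d in [(50, 40), (75, 60), (100, 80)]:
        a, b = risks(n, d, p_high=0.90, p_low=0.70, rho=0.05)
        print(f"n={n:3d}, d={d:3d}: alpha={a:.3f}, beta={b:.3f}")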
NASA Technical Reports Server (NTRS)
Arnold, J. O.
1987-01-01
With the advent of supercomputers and modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials with and without the presence of gases. Computational chemistry has applications in studying catalysis and the properties of polymers, all of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.
Brunner, J; Krummenauer, F; Lehr, H A
2000-04-01
Study end-points in microcirculation research are usually video-taped images rather than numeric computer print-outs. Analysis of these video-taped images for the quantification of microcirculatory parameters usually requires computer-based image analysis systems. Most software programs for image analysis are custom-made, expensive, and limited in their applicability to selected parameters and study end-points. We demonstrate herein that an inexpensive, commercially available computer software (Adobe Photoshop), run on a Macintosh G3 computer with inbuilt graphic capture board provides versatile, easy to use tools for the quantification of digitized video images. Using images obtained by intravital fluorescence microscopy from the pre- and postischemic muscle microcirculation in the skinfold chamber model in hamsters, Photoshop allows simple and rapid quantification (i) of microvessel diameters, (ii) of the functional capillary density and (iii) of postischemic leakage of FITC-labeled high molecular weight dextran from postcapillary venules. We present evidence of the technical accuracy of the software tools and of a high degree of interobserver reliability. Inexpensive commercially available imaging programs (i.e., Adobe Photoshop) provide versatile tools for image analysis with a wide range of potential applications in microcirculation research.
NASA Technical Reports Server (NTRS)
Zak, Michail
2008-01-01
A report discusses an algorithm for a new kind of dynamics based on a quantum-classical hybrid, a quantum-inspired maximizer. The model is represented by a modified Madelung equation in which the quantum potential is replaced by a different, specially chosen 'computational' potential. As a result, the dynamics attains both quantum and classical properties: it preserves superposition and entanglement of random solutions, while allowing one to measure its state variables using classical methods. Such an optimal combination of characteristics is a perfect match for quantum-inspired computing. As an application, an algorithm for finding the global maximum of an arbitrary integrable function is proposed. The idea of the proposed algorithm is very simple: based upon the Quantum-inspired Maximizer (QIM), introduce a positive function to be maximized as the probability density to which the solution is attracted. Larger values of this function then have a higher probability of appearing. Special attention is paid to simulation of integer programming and NP-complete problems. It is demonstrated that the global maximum of an integrable function can be found in polynomial time by using the proposed quantum-classical hybrid. The result is extended to a constrained maximum with applications to integer programming and the Traveling Salesman Problem (TSP).
Water Management Planning: A Case Study at Blue Grass Army Depot
DOE Office of Scientific and Technical Information (OSTI.GOV)
Solana, Amy E.; Mcmordie, Katherine
2006-04-03
Executive Order 13123, Greening the Government Through Efficient Energy Management, mandates an aggressive policy for reducing potable water consumption at federal facilities. Implementation guidance from the U.S. Department of Energy (DOE) set a requirement for each federal agency to “reduce potable water usage by implementing life cycle, cost-effective water efficiency programs that include a water management plan, and not less than four Federal Energy Management Program (FEMP) Best Management Practices (BMPs).” The objective of this plan is to gain full compliance with Executive Order 13123 and associated DOE implementation guidance on behalf of Blue Grass Army Depot (BGAD), Richmond, Kentucky. In accordance with this plan, BGAD must: • Incorporate the plan as a component of the Installation energy conservation plan • Investigate the water savings potential and life-cycle cost effectiveness of the Operations and Maintenance (O&M) and retrofit/replacement options associated with the ten FEMP BMPs • Put into practice all applicable O&M options • Identify retrofit/replacement options appropriate for implementation (based upon calculation of the simple payback periods) • Establish a schedule for implementation of applicable and cost-effective retrofit/replacement options.
The AIS-5000 parallel processor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schmitt, L.A.; Wilson, S.S.
1988-05-01
The AIS-5000 is a commercially available massively parallel processor which has been designed to operate in an industrial environment. It has fine-grained parallelism with up to 1024 processing elements arranged in a single-instruction multiple-data (SIMD) architecture. The processing elements are arranged in a one-dimensional chain that, for computer vision applications, can be as wide as the image itself. This architecture has superior cost/performance characteristics compared with two-dimensional mesh-connected systems. The design of the processing elements and their interconnections as well as the software used to program the system allow a wide variety of algorithms and applications to be implemented. In this paper, the overall architecture of the system is described. Various components of the system are discussed, including details of the processing elements, data I/O pathways and parallel memory organization. A virtual two-dimensional model for programming image-based algorithms for the system is presented. This model is supported by the AIS-5000 hardware and software and allows the system to be treated as a full-image-size, two-dimensional, mesh-connected parallel processor. Performance benchmarks are given for certain simple and complex functions.
Shuttle-Data-Tape XML Translator
NASA Technical Reports Server (NTRS)
Barry, Matthew R.; Osborne, Richard N.
2005-01-01
JSDTImport is a computer program for translating native Shuttle Data Tape (SDT) files from American Standard Code for Information Interchange (ASCII) format into databases in other formats. JSDTImport solves the problem of organizing the SDT content, affording flexibility to enable users to choose how to store the information in a database to better support client and server applications. JSDTImport can be dynamically configured by use of a simple Extensible Markup Language (XML) file. JSDTImport uses this XML file to define how each record and field will be parsed, its layout and definition, and how the resulting database will be structured. JSDTImport also includes a client application programming interface (API) layer that provides abstraction for the data-querying process. The API enables a user to specify the search criteria to apply in gathering all the data relevant to a query. The API can be used to organize the SDT content and translate it into a native XML database. The XML format is structured into efficient sections, enabling excellent query performance by use of the XPath query language. Optionally, the content can be translated into a Structured Query Language (SQL) database for fast, reliable SQL queries on standard database server computers.
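The abstract describes an XML configuration that defines record and field parsing and an XML database queried through XPath, but no schema is given. Purely as a hypothetical illustration of that pattern, with invented element names and sample data rather than the actual SDT or JSDTImport layout, a configuration-driven parse and query might look like this in Python:

```python
# Hypothetical sketch of a config-driven parse plus XPath-style query.
# Element names, field layout, and the sample records are invented.
import xml.etree.ElementTree as ET

CONFIG_XML = """
<config>
  <record name="measurement" delimiter=",">
    <field name="id"    index="0"/>
    <field name="value" index="1"/>
  </record>
</config>
"""

def parse_records(config_xml, lines):
    config = ET.fromstring(config_xml)
    record_cfg = config.find("record")
    delim = record_cfg.get("delimiter")
    fields = [(f.get("name"), int(f.get("index"))) for f in record_cfg.findall("field")]

    db = ET.Element("database")
    for line in lines:
        parts = line.strip().split(delim)
        rec = ET.SubElement(db, record_cfg.get("name"))
        for name, idx in fields:
            ET.SubElement(rec, name).text = parts[idx]
    return db

db = parse_records(CONFIG_XML, ["A001,3.14", "A002,2.72"])
# ElementTree supports a limited XPath subset for queries like this one.
for rec in db.findall(".//measurement[id='A002']"):
    print(rec.find("value").text)
```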
77 FR 56912 - Proposed Collection; Comment Request for Form 2438
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-14
... Program for SIMPLE IRAs; Revenue Procedure 2006-30, Restaurant Tips--Attributed Tip Income Program (ATIP... Reporting Burden Hours: 25,870. (3) Title: Restaurant Tips--Attributed Tip Income Program (ATIP). OMB Number...
Simple interventions to improve healthy eating behaviors in the school cafeteria.
Kessler, Holly S
2016-03-01
The National School Lunch Program in the United States provides an important opportunity to improve nutrition for the 30 million children who participate every school day. The purpose of this narrative review is to present and evaluate simple, evidence-based strategies to improve healthy eating behaviors at school. Healthy eating behaviors are defined as increased selection/consumption of fruits and/or vegetables, increased selection of nutrient-dense foods, or decreased selection of low-nutrient, energy-dense foods. Data were collected from sales records, 24-hour food recalls, direct observation, and estimation of plate waste. The review is limited to simple, discrete interventions that are easy to implement. Sixteen original, peer-reviewed articles are included. Interventions are divided into 5 categories: modification of choice, behavior modification, marketing strategies, time-efficiency strategies, and fruit slicing. All interventions resulted in improved eating behaviors, but not all interventions are applicable or feasible in all settings. Because these studies were performed prior to the implementation of the new federally mandated school meal standards, it is unknown if these interventions would yield similar results if repeated now. © The Author(s) 2016. Published by Oxford University Press on behalf of the International Life Sciences Institute. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Planetary micro-rover operations on Mars using a Bayesian framework for inference and control
NASA Astrophysics Data System (ADS)
Post, Mark A.; Li, Junquan; Quine, Brendan M.
2016-03-01
With the recent progress toward the application of commercially-available hardware to small-scale space missions, it is now becoming feasible for groups of small, efficient robots based on low-power embedded hardware to perform simple tasks on other planets in the place of large-scale, heavy and expensive robots. In this paper, we describe the design and programming of the Beaver micro-rover developed for Northern Light, a Canadian initiative to send a small lander and rover to Mars to study the Martian surface and subsurface. For a small, hardware-limited rover to handle an uncertain and mostly unknown environment without constant management by human operators, we use a Bayesian network of discrete random variables as an abstraction of expert knowledge about the rover and its environment, and inference operations for control. A framework for efficient construction of and inference over a Bayesian network using only the C language and fixed-point mathematics on embedded hardware has been developed for the Beaver to make intelligent decisions with minimal sensor data. We study the performance of the Beaver as it probabilistically maps a simple outdoor environment with sensor models that include uncertainty. Results indicate that the Beaver and other small and simple robotic platforms can make use of a Bayesian network to make intelligent decisions in uncertain planetary environments.
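As a toy illustration of the kind of discrete Bayesian inference described here, in Python rather than fixed-point C, and with hypothetical variables and probabilities rather than the Beaver's actual network:

```python
# Toy discrete Bayesian inference for a rover-style decision.
# Terrain in {safe, rough}; a noisy sensor reads {low, high} vibration.
# All variables and probabilities are invented for illustration only.
p_terrain = {'safe': 0.7, 'rough': 0.3}
p_sensor_given_terrain = {
    ('low',  'safe'): 0.9, ('high', 'safe'): 0.1,
    ('low',  'rough'): 0.2, ('high', 'rough'): 0.8,
}

def posterior_terrain(sensor_reading):
    """Bayes' rule: P(terrain | sensor) over the two terrain states."""
    joint = {t: p_terrain[t] * p_sensor_given_terrain[(sensor_reading, t)]
             for t in p_terrain}
    z = sum(joint.values())
    return {t: v / z for t, v in joint.items()}

post = posterior_terrain('high')
print(post)                                                 # roughly {'safe': 0.23, 'rough': 0.77}
print('slow down' if post['rough'] > 0.5 else 'proceed')    # simple control decision
```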
ERIC Educational Resources Information Center
Physics Education, 1982
1982-01-01
Describes: (1) an apparatus which provides a simple method for measuring Stefan's constant; (2) a simple phase shifting circuit; (3) a radioactive decay computer program (for ZX81); and (4) phase difference between transformer voltages. (Author/JN)
Event Driven Messaging with Role-Based Subscriptions
NASA Technical Reports Server (NTRS)
Bui, Tung; Bui, Bach; Malhotra, Shantanu; Chen, Fannie; Kim, Rachel; Allen, Christopher; Luong, Ivy; Chang, George; Zendejas, Silvino; Sadaqathulla, Syed
2009-01-01
Event Driven Messaging with Role-Based Subscriptions (EDM-RBS) is a framework integrated into the Service Management Database (SMDB) to allow for role-based and subscription-based delivery of synchronous and asynchronous messages over JMS (Java Messaging Service), SMTP (Simple Mail Transfer Protocol), or SMS (Short Messaging Service). This allows for 24/7 operation with users in all parts of the world. The software classifies messages by triggering data type, application source, owner of data triggering event (mission), classification, sub-classification and various other secondary classifying tags. Messages are routed to applications or users based on subscription rules using a combination of the above message attributes. This program provides a framework for identifying connected users and their applications for targeted delivery of messages over JMS to the client applications the user is logged into. EDM-RBS provides the ability to send notifications over e-mail or pager rather than having to rely on a live human to do it. It is implemented as an Oracle application that uses Oracle relational database management system intrinsic functions. It is configurable to use Oracle AQ JMS API or an external JMS provider for messaging. It fully integrates into the event-logging framework of SMDB.
NASA Technical Reports Server (NTRS)
Degroh, H.
1994-01-01
The Metallurgical Programs include three simple programs which calculate solutions to problems common to metallurgical engineers and persons making metal castings. The first program calculates the mass of a binary ideal mixture (alloy) given the weight fractions and densities of the pure components and the total volume. The second program calculates the densities of a binary ideal mixture. The third program converts the atomic percentages of a binary mixture to weight percentages. The programs use simple equations to assist the materials staff with routine calculations. The Metallurgical Programs are written in Microsoft QuickBASIC for interactive execution and have been implemented on an IBM PC-XT/AT operating MS-DOS 2.1 or higher with 256K bytes of memory. All instructions needed by the user appear as prompts as the software is used. Data is input using the keyboard only and output is via the monitor. The Metallurgical Programs were written in 1987.
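The calculations described rest on standard ideal-mixture (rule-of-mixtures) relations, assuming volumes are additive. A short Python sketch of the three computations, with illustrative input values rather than a transcription of the QuickBASIC programs, is:

```python
# Sketch of the three ideal binary-mixture calculations (volumes assumed additive).
def mixture_density(w1, rho1, rho2):
    """Density of an ideal binary mixture from weight fraction w1 and pure densities."""
    w2 = 1.0 - w1
    return 1.0 / (w1 / rho1 + w2 / rho2)

def mixture_mass(w1, rho1, rho2, total_volume):
    """Mass of an ideal binary alloy occupying total_volume."""
    return mixture_density(w1, rho1, rho2) * total_volume

def atomic_to_weight_percent(at1, molar_mass1, molar_mass2):
    """Convert atomic percent of component 1 to weight percent."""
    at2 = 100.0 - at1
    m1, m2 = at1 * molar_mass1, at2 * molar_mass2
    return 100.0 * m1 / (m1 + m2)

# Illustrative numbers only (roughly Al-Cu), not taken from the original programs.
print(mixture_density(0.60, 2.70, 8.96))             # g/cm^3
print(mixture_mass(0.60, 2.70, 8.96, 100.0))         # grams in 100 cm^3
print(atomic_to_weight_percent(50.0, 26.98, 63.55))  # wt% Al in 50/50 at% Al-Cu
```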
Validation of a DICE Simulation Against a Discrete Event Simulation Implemented Entirely in Code.
Möller, Jörgen; Davis, Sarah; Stevenson, Matt; Caro, J Jaime
2017-10-01
Modeling is an essential tool for health technology assessment, and various techniques for conceptualizing and implementing such models have been described. Recently, a new method has been proposed, the discretely integrated condition event (DICE) simulation, which enables frequently employed approaches to be specified using a common, simple structure that can be entirely contained and executed within widely available spreadsheet software. To assess if a DICE simulation provides equivalent results to an existing discrete event simulation, a comparison was undertaken. A model of osteoporosis and its management programmed entirely in Visual Basic for Applications and made public by the National Institute for Health and Care Excellence (NICE) Decision Support Unit was downloaded and used to guide construction of its DICE version in Microsoft Excel®. The DICE model was then run using the same inputs and settings, and the results were compared. The DICE version produced results that are nearly identical to the original ones, with differences that would not affect the decision direction of the incremental cost-effectiveness ratios (<1% discrepancy), despite the stochastic nature of the models. The main limitation of the simple DICE version is its slow execution speed. DICE simulation did not alter the results and, thus, should provide a valid way to design and implement decision-analytic models without requiring specialized software or custom programming. Additional efforts need to be made to speed up execution.
NASA Astrophysics Data System (ADS)
Katsaounis, T. D.
2005-02-01
The scope of this book is to present well known simple and advanced numerical methods for solving partial differential equations (PDEs) and how to implement these methods using the programming environment of the software package Diffpack. A basic background in PDEs and numerical methods is required by the potential reader. Further, a basic knowledge of the finite element method and its implementation in one and two space dimensions is required. The authors claim that no prior knowledge of the package Diffpack is required, which is true, but the reader should be at least familiar with an object oriented programming language like C++ in order to better comprehend the programming environment of Diffpack. Certainly, a prior knowledge or usage of Diffpack would be a great advantage to the reader. The book consists of 15 chapters, each one written by one or more authors. Each chapter is basically divided into two parts: the first part is about mathematical models described by PDEs and numerical methods to solve these models and the second part describes how to implement the numerical methods using the programming environment of Diffpack. Each chapter closes with a list of references on its subject. The first nine chapters cover well known numerical methods for solving the basic types of PDEs. Further, programming techniques on the serial as well as on the parallel implementation of numerical methods are also included in these chapters. The last five chapters are dedicated to applications, modelled by PDEs, in a variety of fields. The first chapter is an introduction to parallel processing. It covers fundamentals of parallel processing in a simple and concrete way and no prior knowledge of the subject is required. Examples of parallel implementation of basic linear algebra operations are presented using the Message Passing Interface (MPI) programming environment. Here, some knowledge of MPI routines is required by the reader. Examples solving in parallel simple PDEs using Diffpack and MPI are also presented. Chapter 2 presents the overlapping domain decomposition method for solving PDEs. It is well known that these methods are suitable for parallel processing. The first part of the chapter covers the mathematical formulation of the method as well as algorithmic and implementational issues. The second part presents a serial and a parallel implementational framework within the programming environment of Diffpack. The chapter closes by showing how to solve two application examples with the overlapping domain decomposition method using Diffpack. Chapter 3 is a tutorial about how to incorporate the multigrid solver in Diffpack. The method is illustrated by examples such as a Poisson solver, a general elliptic problem with various types of boundary conditions and a nonlinear Poisson type problem. In chapter 4 the mixed finite element is introduced. Technical issues concerning the practical implementation of the method are also presented. The main difficulties of the efficient implementation of the method, especially in two and three space dimensions on unstructured grids, are presented and addressed in the framework of Diffpack. The implementational process is illustrated by two examples, namely the system formulation of the Poisson problem and the Stokes problem. Chapter 5 is closely related to chapter 4 and addresses the problem of how to solve efficiently the linear systems arising by the application of the mixed finite element method. The proposed method is block preconditioning. 
Efficient techniques for implementing the method within Diffpack are presented. Optimal block preconditioners are used to solve the system formulation of the Poisson problem, the Stokes problem and the bidomain model for the electrical activity in the heart. The subject of chapter 6 is systems of PDEs. Linear and nonlinear systems are discussed. Fully implicit and operator splitting methods are presented. Special attention is paid to how existing solvers for scalar equations in Diffpack can be used to derive fully implicit solvers for systems. The proposed techniques are illustrated in terms of two applications, namely a system of PDEs modelling pipeflow and a two-phase porous media flow. Stochastic PDEs is the topic of chapter 7. The first part of the chapter is a simple introduction to stochastic PDEs; basic analytical properties are presented for simple models like transport phenomena and viscous drag forces. The second part considers the numerical solution of stochastic PDEs. Two basic techniques are presented, namely Monte Carlo and perturbation methods. The last part explains how to implement and incorporate these solvers into Diffpack. Chapter 8 describes how to operate Diffpack from Python scripts. The main goal here is to provide all the programming and technical details in order to glue the programming environment of Diffpack with visualization packages through Python and in general take advantage of the Python interfaces. Chapter 9 attempts to show how to use numerical experiments to measure the performance of various PDE solvers. The authors gathered a rather impressive list, a total of 14 PDE solvers. Solvers for problems like Poisson, Navier-Stokes, elasticity, two-phase flows and methods such as finite difference, finite element, multigrid, and gradient type methods are presented. The authors provide a series of numerical results combining various solvers with various methods in order to gain insight into their computational performance and efficiency. In Chapter 10 the authors consider a computationally challenging problem, namely the computation of the electrical activity of the human heart. After a brief introduction on the biology of the problem the authors present the mathematical models involved and a numerical method for solving them within the framework of Diffpack. Chapters 11 and 12 are closely related; actually they could have been combined in a single chapter. Chapter 11 introduces several mathematical models used in finance, based on the Black-Scholes equation. Chapter 12 considers several numerical methods like Monte Carlo, lattice methods, finite difference and finite element methods. Implementation of these methods within Diffpack is presented in the last part of the chapter. Chapter 13 presents how the finite element method is used for the modelling and analysis of elastic structures. The authors describe the structural elements of Diffpack which include popular elements such as beams and plates and examples are presented on how to use them to simulate elastic structures. Chapter 14 describes an application problem, namely the extrusion of aluminum. This is a rather complicated process which involves non-Newtonian flow, heat transfer and elasticity. The authors describe the systems of PDEs modelling the underlying process and use a finite element method to obtain a numerical solution. The implementation of the numerical method in Diffpack is presented along with some applications.
The last chapter, chapter 15, focuses on mathematical and numerical models of systems of PDEs governing geological processes in sedimentary basins. The underlying mathematical model is solved using the finite element method within a fully implicit scheme. The authors discuss the implementational issues involved within Diffpack and they present results from several examples. In summary, the book focuses on the computational and implementational issues involved in solving partial differential equations. The potential reader should have a basic knowledge of PDEs and the finite difference and finite element methods. The examples presented are solved within the programming framework of Diffpack and the reader should have prior experience with the particular software in order to take full advantage of the book. Overall the book is well written, the subject of each chapter is well presented and can serve as a reference for graduate students, researchers and engineers who are interested in the numerical solution of partial differential equations modelling various applications.
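As a point of reference for the kind of solver the book builds up in Diffpack, here is a minimal sketch, in Python rather than Diffpack/C++, of the simplest model problem discussed: the 1D Poisson equation -u'' = f on (0, 1) with homogeneous Dirichlet boundary conditions, solved by second-order central differences and checked against a manufactured solution.

```python
# Minimal 1D Poisson solve, -u'' = f on (0,1), u(0) = u(1) = 0, central differences.
import numpy as np

def poisson_1d(f, n):
    h = 1.0 / (n + 1)                      # n interior grid points
    x = np.linspace(h, 1.0 - h, n)
    main = 2.0 * np.ones(n)
    off = -1.0 * np.ones(n - 1)
    A = (np.diag(main) + np.diag(off, 1) + np.diag(off, -1)) / h**2
    u = np.linalg.solve(A, f(x))
    return x, u

# Manufactured solution u(x) = sin(pi x), so f = pi^2 sin(pi x).
x, u = poisson_1d(lambda x: np.pi**2 * np.sin(np.pi * x), n=99)
print(np.max(np.abs(u - np.sin(np.pi * x))))   # max discretization error, O(h^2)
```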
HIPAA Compliant Wireless Sensing Smartwatch Application for the Self-Management of Pediatric Asthma
Hosseini, Anahita; Buonocore, Chris M.; Hashemzadeh, Sepideh; Hojaiji, Hannaneh; Kalantarian, Haik; Sideris, Costas; Bui, Alex A.T.; King, Christine E.; Sarrafzadeh, Majid
2018-01-01
Asthma is the most prevalent chronic disease in pediatric populations and is the leading cause of student absenteeism and hospitalization for those under the age of 15. To address the significant need to manage this disease in children, the authors present a mobile health (mHealth) system that determines the risk of an asthma attack through physiological and environmental wireless sensors and representational state transfer application program interfaces (RESTful APIs). The data is sent from wireless sensors to a smartwatch application (app) via a Health Insurance Portability and Accountability Act (HIPAA) compliant cryptography framework, which then sends data to a cloud for real-time analytics. The asthma risk is then sent to the smartwatch and provided to the user via simple graphics for easy interpretation by children. After testing the safety and feasibility of the system in an adult with moderate asthma prior to testing in children, it was found that the analytics model is able to determine the overall asthma risk (high, medium, or low risk) with an accuracy of 80.10±14.13%. Furthermore, the features most important for assessing the risk of an asthma attack were multifaceted, highlighting the importance of continuously monitoring different wireless sensors and RESTful APIs. Future testing of this asthma attack risk prediction system in pediatric asthma patients may lead to an effective self-management asthma program. PMID:29354688
Assessment of steam-injected gas turbine systems and their potential application
NASA Technical Reports Server (NTRS)
Stochl, R. J.
1982-01-01
Results were arrived at by utilizing and expanding on information presented in the literature. The results were analyzed and compared with those for simple gas turbine and combined cycles for both utility power generation and industrial cogeneration applications. The efficiency and specific power of simple gas turbine cycles can be increased as much as 30 and 50 percent, respectively, by the injection of steam into the combustor. Steam-injected gas turbines appear to be economically competitive with both simple gas turbine and combined cycles for small, clean-fuel-fired utility power generation and industrial cogeneration applications. For large powerplants with integrated coal gasifiers, the economic advantages appear to be marginal.
Teaching Machines and Programmed Instruction.
ERIC Educational Resources Information Center
Kay, Harry; And Others
The various devices used in programed instruction range from the simple linear programed book to branching and skip branching programs, adaptive teaching machines, and even complex computer-based systems. In order to provide a background for the would-be programer, the essential principles of each of these devices are outlined. Different ideas of…
Web Navigation Sequences Automation in Modern Websites
NASA Astrophysics Data System (ADS)
Montoto, Paula; Pan, Alberto; Raposo, Juan; Bellas, Fernando; López, Javier
Most of today’s web sources are designed to be used by humans, but they do not provide suitable interfaces for software programs. That is why a growing interest has arisen in so-called web automation applications, which are widely used for different purposes such as B2B integration, automated testing of web applications, or technology and business watch. Previous proposals assume models for generating and reproducing navigation sequences that are not able to correctly deal with new websites using technologies such as AJAX: on the one hand, existing systems only allow recording simple navigation actions and, on the other hand, they are unable to detect the end of the effects caused by a user action. In this paper, we propose a set of new techniques to record and execute web navigation sequences able to deal with all the complexity existing in AJAX-based web sites. We also present an exhaustive evaluation of the proposed techniques that shows very promising results.
Limited take-up of health coverage tax credits: a challenge to future tax credit design.
Dorn, Stan; Varon, Janet; Pervez, Fouad
2005-10-01
The Trade Act of 2002 created federal tax credits to subsidize health coverage for certain early retirees and workers displaced by international trade. Though small, this program offers the opportunity to learn how to design future tax credits for larger groups of uninsured. During September 2004, the most recent month for which there are data about all forms of Trade Act credits, roughly 22 percent of eligible individuals received credits. The authors find that health insurance tax credits are more likely to reach their target populations if such credits: 1) limit premium costs for the low-income uninsured and do not require full premium payments while applications are pending; 2) provide access to coverage that beneficiaries value, including care for preexisting conditions; 3) are combined with outreach that uses easily understandable, multilingual materials and proactive enrollment efforts; and 4) feature a simple application process involving one form filed with one agency.
Thoth: Software for data visualization & statistics
NASA Astrophysics Data System (ADS)
Laher, R. R.
2016-10-01
Thoth is a standalone software application with a graphical user interface for making it easy to query, display, visualize, and analyze tabular data stored in relational databases and data files. From imported data tables, it can create pie charts, bar charts, scatter plots, and many other kinds of data graphs with simple menus and mouse clicks (no programming required), by leveraging the open-source JFreeChart library. It also computes useful table-column data statistics. A mature tool, having undergone development and testing over several years, it is written in the Java computer language, and hence can be run on any computing platform that has a Java Virtual Machine and graphical-display capability. It can be downloaded and used by anyone free of charge, and has general applicability in science, engineering, medical, business, and other fields. Special tools and features for common tasks in astronomy and astrophysical research are included in the software.
Measurement of community empowerment in three community programs in Rapla (Estonia).
Kasmel, Anu; Andersen, Pernille Tanggaard
2011-03-01
Community empowerment approaches have been proven to be powerful tools for solving local health problems. However, the methods for measuring empowerment in the community remain unclear and open to dispute. This study aims to describe how a context-specific community empowerment measurement tool was developed and changes made to three health promotion programs in Rapla, Estonia. An empowerment expansion model was compiled and applied to three existing programs: Safe Community, Drug/HIV Prevention and Elderly Quality of Life. The consensus workshop method was used to create the measurement tool and collect data on the Organizational Domains of Community Empowerment (ODCE). The study demonstrated considerable increases in the ODCE among the community workgroup, which was initiated by community members and the municipality's decision-makers. The increase was within the workgroup, which had strong political and financial support on a national level but was not the community's priority. The program was initiated and implemented by the local community members, and continuous development still occurred, though at a reduced pace. The use of the empowerment expansion model has proven to be an applicable, relevant, simple and inexpensive tool for the evaluation of community empowerment.
A Rewriting-Based Approach to Trace Analysis
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Rosu, Grigore; Clancy, Daniel (Technical Monitor)
2002-01-01
We present a rewriting-based algorithm for efficiently evaluating future time Linear Temporal Logic (LTL) formulae on finite execution traces online. While the standard models of LTL are infinite traces, finite traces appear naturally when testing and/or monitoring real applications that only run for limited time periods. The presented algorithm is implemented in the Maude executable specification language and essentially consists of a set of equations establishing an executable semantics of LTL using a simple formula transforming approach. The algorithm is further improved to build automata on-the-fly from formulae, using memoization. The result is a very efficient and small Maude program that can be used to monitor program executions. We furthermore present an alternative algorithm for synthesizing provably minimal observer finite state machines (or automata) from LTL formulae, which can be used to analyze execution traces without the need for a rewriting system, and can hence be used by observers written in conventional programming languages. The presented work is part of an ambitious runtime verification and monitoring project at NASA Ames, called PATHEXPLORER, and demonstrates that rewriting can be a tractable and attractive means for experimenting and implementing program monitoring logics.
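The heart of the formula-transforming approach is progression: each observed event rewrites the formula into what must hold of the remainder of the trace. The sketch below illustrates that general technique in Python for a small LTL fragment under one common finite-trace convention; it is not a transcription of the Maude program, and the encoding of the response property at the end is only an example.

```python
# Sketch of LTL progression (formula rewriting) over a finite trace.
# Formulas are tuples: ('ap', p), ('not', f), ('and', f, g), ('or', f, g),
# ('next', f), ('until', f, g). Python booleans stand for true/false.

def land(f, g):                     # conjunction with on-the-fly simplification
    if f is False or g is False: return False
    if f is True: return g
    if g is True: return f
    return ('and', f, g)

def lor(f, g):
    if f is True or g is True: return True
    if f is False: return g
    if g is False: return f
    return ('or', f, g)

def lnot(f):
    return (not f) if isinstance(f, bool) else ('not', f)

def progress(f, state):
    """Rewrite formula f after observing one state (a set of true propositions)."""
    if isinstance(f, bool): return f
    op = f[0]
    if op == 'ap':    return f[1] in state
    if op == 'not':   return lnot(progress(f[1], state))
    if op == 'and':   return land(progress(f[1], state), progress(f[2], state))
    if op == 'or':    return lor(progress(f[1], state), progress(f[2], state))
    if op == 'next':  return f[1]
    if op == 'until': # f1 U f2  ->  P(f2) or (P(f1) and f1 U f2)
        return lor(progress(f[2], state), land(progress(f[1], state), f))
    raise ValueError(op)

def end_eval(f):
    """Evaluate residual obligations once the trace has ended (no further states)."""
    if isinstance(f, bool): return f
    op = f[0]
    if op == 'not':  return not end_eval(f[1])
    if op == 'and':  return end_eval(f[1]) and end_eval(f[2])
    if op == 'or':   return end_eval(f[1]) or end_eval(f[2])
    return False     # 'ap', 'next', 'until': obligations about missing states fail

def monitor(formula, trace):
    for state in trace:
        formula = progress(formula, state)
        if isinstance(formula, bool):   # verdict reached before the trace ends
            return formula
    return end_eval(formula)

# []( request -> <> grant ), written as not (true U (request and not (true U grant))).
always_resp = ('not', ('until', True, land(('ap', 'request'),
                                           ('not', ('until', True, ('ap', 'grant'))))))
print(monitor(always_resp, [{'request'}, set(), {'grant'}]))   # True
print(monitor(always_resp, [{'request'}, set(), set()]))       # False
```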
NASA Technical Reports Server (NTRS)
Funk, Christie J.; Perry, Boyd, III; Silva, Walter A.; Newman, Brett
2014-01-01
A software program and associated methodology to study gust loading on aircraft exists for a class of geometrically simplified flexible configurations. This program consists of a simple aircraft response model with two rigid and three flexible symmetric degrees of freedom and allows for the calculation of various airplane responses due to a discrete one-minus-cosine gust as well as continuous turbulence. Simplifications, assumptions, and opportunities for potential improvements pertaining to the existing software program are first identified; then a revised version of the original software tool is developed with improved methodology to include more complex geometries, additional excitation cases, and additional output data so as to provide a more useful and precise tool for gust load analysis. In order to improve the original software program and enhance its usefulness, a wing control surface and a horizontal tail control surface are added, an extended application of the discrete one-minus-cosine gust input is employed, a supplemental continuous turbulence spectrum is implemented, and a capability to animate the total vehicle deformation response to gust inputs is included. These revisions and enhancements are implemented and an analysis of the results is used to validate the modifications.
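For reference, the discrete one-minus-cosine gust used in such analyses ramps the gust velocity from zero to its design value U_ds over a gradient distance H and back to zero at 2H. A minimal sketch of the profile, with illustrative values for U_ds and H rather than the program's inputs, is:

```python
# One-minus-cosine discrete gust: U(s) = (U_ds / 2) * (1 - cos(pi * s / H)), 0 <= s <= 2H.
import numpy as np

def gust_velocity(s, U_ds, H):
    s = np.asarray(s, dtype=float)
    u = 0.5 * U_ds * (1.0 - np.cos(np.pi * s / H))
    return np.where((s >= 0.0) & (s <= 2.0 * H), u, 0.0)

s = np.linspace(0.0, 300.0, 7)                 # penetration distance (illustrative units)
print(gust_velocity(s, U_ds=50.0, H=100.0))    # peaks at U_ds when s = H
```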
Foundations of the Bandera Abstraction Tools
NASA Technical Reports Server (NTRS)
Hatcliff, John; Dwyer, Matthew B.; Pasareanu, Corina S.; Robby
2003-01-01
Current research is demonstrating that model-checking and other forms of automated finite-state verification can be effective for checking properties of software systems. Due to the exponential costs associated with model-checking, multiple forms of abstraction are often necessary to obtain system models that are tractable for automated checking. The Bandera Tool Set provides multiple forms of automated support for compiling concurrent Java software systems to models that can be supplied to several different model-checking tools. In this paper, we describe the foundations of Bandera's data abstraction mechanism which is used to reduce the cardinality (and the program's state-space) of data domains in software to be model-checked. From a technical standpoint, the form of data abstraction used in Bandera is simple, and it is based on classical presentations of abstract interpretation. We describe the mechanisms that Bandera provides for declaring abstractions, for attaching abstractions to programs, and for generating abstracted programs and properties. The contributions of this work are the design and implementation of various forms of tool support required for effective application of data abstraction to software components written in a programming language like Java which has a rich set of linguistic features.
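As a toy, stand-alone illustration of the classical abstract-interpretation style of data abstraction referred to here (not Bandera's actual abstraction language or tooling), integers can be abstracted to their signs and operations lifted to sets of abstract values:

```python
# Toy sign abstraction in the classical abstract-interpretation style.
NEG, ZERO, POS = 'neg', 'zero', 'pos'

def alpha(n):
    """Abstraction function: concrete int -> abstract sign."""
    return ZERO if n == 0 else (POS if n > 0 else NEG)

# Abstract addition returns the *set* of possible signs, so imprecision is explicit.
ADD = {
    (NEG, NEG): {NEG},            (NEG, ZERO): {NEG},   (NEG, POS): {NEG, ZERO, POS},
    (ZERO, NEG): {NEG},           (ZERO, ZERO): {ZERO}, (ZERO, POS): {POS},
    (POS, NEG): {NEG, ZERO, POS}, (POS, ZERO): {POS},   (POS, POS): {POS},
}

def abstract_add(a, b):
    return ADD[(a, b)]

# Soundness check: alpha(x + y) always lies in abstract_add(alpha(x), alpha(y)).
for x, y in [(3, 4), (-2, 2), (-5, 1), (0, -7)]:
    assert alpha(x + y) in abstract_add(alpha(x), alpha(y))
print(abstract_add(POS, NEG))    # {'neg', 'zero', 'pos'}: precision is lost here
```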
Hawkins, Kenneth R; Cantera, Jason L; Storey, Helen L; Leader, Brandon T; de Los Santos, Tala
2016-12-01
Global efforts to address schistosomiasis and soil-transmitted helminthiases (STH) include deworming programs for school-aged children that are made possible by large-scale drug donations. Decisions on these mass drug administration (MDA) programs currently rely on microscopic examination of clinical specimens to determine the presence of parasite eggs. However, microscopy-based methods are not sensitive to the low-intensity infections that characterize populations that have undergone MDA. Thus, there has been increasing recognition within the schistosomiasis and STH communities of the need for improved diagnostic tools to support late-stage control program decisions, such as when to stop or reduce MDA. Failure to adequately address the need for new diagnostics could jeopardize achievement of the 2020 London Declaration goals. In this report, we assess diagnostic needs and landscape potential solutions and determine appropriate strategies to improve diagnostic testing to support control and elimination programs. Based upon literature reviews and previous input from experts in the schistosomiasis and STH communities, we prioritized two diagnostic use cases for further exploration: to inform MDA-stopping decisions and post-MDA surveillance. To this end, PATH has refined target product profiles (TPPs) for schistosomiasis and STH diagnostics that are applicable to these use cases. We evaluated the limitations of current diagnostic methods with regards to these use cases and identified candidate biomarkers and diagnostics with potential application as new tools. Based on this analysis, there is a need to develop antigen-detecting rapid diagnostic tests (RDTs) with simplified, field-deployable sample preparation for schistosomiasis. Additionally, there is a need for diagnostic tests that are more sensitive than the current methods for STH, which may include either a field-deployable molecular test or a simple, low-cost, rapid antigen-detecting test.
Characterization of Louisiana asphalt mixtures using simple performance tests and MEPDG.
DOT National Transportation Integrated Search
2014-04-01
The National Cooperative Highway Research Program (NCHRP) Project 9-19, Superpave Support and Performance Models Management, recommended three Simple Performance Tests (SPTs) to complement the Superpave volumetric mixture design method. These are...
Computer Technology for Industry
NASA Technical Reports Server (NTRS)
1979-01-01
In this age of the computer, more and more business firms are automating their operations for increased efficiency in a great variety of jobs, from simple accounting to managing inventories, from precise machining to analyzing complex structures. In the interest of national productivity, NASA is providing assistance both to longtime computer users and newcomers to automated operations. Through a special technology utilization service, NASA saves industry time and money by making available already developed computer programs which have secondary utility. A computer program is essentially a set of instructions which tells the computer how to produce desired information or effect by drawing upon its stored input. Developing a new program from scratch can be costly and time-consuming. Very often, however, a program developed for one purpose can readily be adapted to a totally different application. To help industry take advantage of existing computer technology, NASA operates the Computer Software Management and Information Center (COSMIC®), located at the University of Georgia. COSMIC maintains a large library of computer programs developed for NASA, the Department of Defense, the Department of Energy and other technology-generating agencies of the government. The Center gets a continual flow of software packages, screens them for adaptability to private sector usage, stores them and informs potential customers of their availability.
Wibowo, Santoso; Deng, Hepu
2015-06-01
This paper presents a multi-criteria group decision making approach for effectively evaluating the performance of e-waste recycling programs under uncertainty in an organization. Intuitionistic fuzzy numbers are used for adequately representing the subjective and imprecise assessments of the decision makers in evaluating the relative importance of evaluation criteria and the performance of individual e-waste recycling programs with respect to individual criteria in a given situation. An interactive fuzzy multi-criteria decision making algorithm is developed for facilitating consensus building in a group decision making environment to ensure that the interests of individual decision makers have been appropriately considered in evaluating alternative e-waste recycling programs with respect to their corporate sustainability performance. The developed algorithm is then incorporated into a multi-criteria decision support system for making the overall performance evaluation process effective and simple to use. Such a multi-criteria decision making system adequately provides organizations with a proactive mechanism for incorporating the concept of corporate sustainability into their regular planning decisions and business practices. An example is presented for demonstrating the applicability of the proposed approach in evaluating the performance of e-waste recycling programs in organizations. Copyright © 2015 Elsevier Ltd. All rights reserved.
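The abstract does not spell out the aggregation operator used. As a hedged illustration of how intuitionistic fuzzy assessments are commonly combined, the sketch below applies the standard intuitionistic fuzzy weighted averaging (IFWA) operator and a simple score function to invented data; the actual algorithm in the paper may differ.

```python
# Hedged sketch: aggregate intuitionistic fuzzy assessments (mu, nu) with the
# standard IFWA operator, then rank by the score mu - nu. Illustrative data only.
import math

def ifwa(values, weights):
    """values: list of (mu, nu) pairs; weights sum to 1."""
    mu = 1.0 - math.prod((1.0 - m) ** w for (m, _), w in zip(values, weights))
    nu = math.prod(n ** w for (_, n), w in zip(values, weights))
    return mu, nu

def score(ifn):
    mu, nu = ifn
    return mu - nu

weights = [0.5, 0.3, 0.2]                      # criterion weights (hypothetical)
programs = {                                   # (mu, nu) per criterion (hypothetical)
    'program_A': [(0.7, 0.2), (0.6, 0.3), (0.8, 0.1)],
    'program_B': [(0.5, 0.4), (0.7, 0.2), (0.6, 0.3)],
}
ranked = sorted(programs, key=lambda p: score(ifwa(programs[p], weights)), reverse=True)
print(ranked)
```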
NASA Astrophysics Data System (ADS)
Nordin, Noraimi Azlin Mohd; Omar, Mohd; Sharif, S. Sarifah Radiah
2017-04-01
Companies are looking to improve their productivity within their warehouse operations and distribution centres. In a typical warehouse operation, order picking contributes more than half of the operating costs. Order picking is a benchmark in measuring the performance and productivity improvement of any warehouse management. Solving the order picking problem is crucial in reducing the response time and waiting time of a customer in receiving his demands. To reduce the response time, proper routing for picking orders is vital. Moreover, in a production line, it is vital to always make sure the supplies arrive on time. Hence, a sample routing network will be applied to EP Manufacturing Berhad (EPMB) as a case study. Dijkstra's algorithm and a dynamic programming method are applied to find the shortest distance for an order picker in order picking. The results show that the dynamic programming method is a simple yet competent approach for finding, within a short time period, the shortest distance to pick an order in a warehouse.
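As a minimal sketch of the shortest-path step described here, the following is a generic heap-based Dijkstra over a toy picking-aisle graph; the layout and distances are hypothetical, and the paper's dynamic programming formulation is not reproduced.

```python
# Generic Dijkstra shortest-path sketch over a toy picking-aisle graph (hypothetical layout).
import heapq

def dijkstra(graph, source):
    """graph: {node: [(neighbour, distance), ...]}. Returns shortest distances from source."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float('inf')):
            continue                      # stale heap entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float('inf')):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

aisles = {                                # distances in metres (illustrative)
    'depot': [('A1', 5), ('B1', 7)],
    'A1': [('A2', 3), ('B1', 4)],
    'B1': [('A2', 6)],
    'A2': [],
}
print(dijkstra(aisles, 'depot'))          # shortest distance depot -> A2 is 8
```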
NASA Astrophysics Data System (ADS)
Zielinski, Theresa Julia; Brooks, David W.; Crippen, Kent J.; March, Joe L.
2001-06-01
Time management is an important issue for teachers and students. This article discusses teachers' use of time from the perspective of curriculum and instruction. Average high school students spend fewer than 5 hours per week in outside-of-class study; average college students spend about 20 hours. Procrastination, often viewed in a negative light by teachers, usually pays off so well for college students that seniors become better at it than freshmen. Three suggestions for designing instruction are: test early and often; do not waste the best students' time in an effort to improve overall performance; and use engaging activities that motivate students to give of their time. The impact of computers on curricula is a double-edged sword. Time must be devoted to teaching the use of applications, but the programs reduce busywork. Will this turn out to be a simple tradeoff, or will the programs make us much more efficient so that less time is required? Will computer programs ultimately lead to an expanded criterion for expertise, thus demanding even more time to become an expert? These issues are described and suggestions for controlling time during instruction are provided.
The pacific island health care project.
Person, Donald Ames
2014-01-01
US Associated/Affiliated Pacific Islands (USAPI) include three freely associated states (the Marshall Islands, the Federated States of Micronesia, and Palau) and three territories (American Samoa, Guam, and the Commonwealth of the Northern Mariana Islands). The Pacific Island Health Care Project (PIHCP) provides humanitarian medical referral/consultation/care to >500,000 indigenous people of these remote islands. In the mid-1990s, we developed a simple store-and-forward program to link the USAPI with Tripler Army Medical Center. This application allowed image attachment to email consultations. More than 8000 Pacific Islanders have benefited from the program: 3000 prior to telemedicine (1990-1997), and since the introduction of store-and-forward telemedicine (1997-present), the PIHCP has helped an additional 5000. Records post dynamically and are stored in an archival database. The PIHCP is the longest-running telemedicine program in the world delivering humanitarian medical care. It has bridged the Developing World of the remote Pacific Islands with advanced medical and surgical care available at a major US military teaching hospital. (The opinions expressed here are those of the author and not those of the Army, the Department of Defense, or the US Government.)
Nakagaki, Naomi; Hitt, Kerie J.; Price, Curtis V.; Falcone, James A.
2012-01-01
Characterization of natural and anthropogenic features that define the environmental settings of sampling sites for streams and groundwater, including drainage basins and groundwater study areas, is an essential component of water-quality and ecological investigations being conducted as part of the U.S. Geological Survey's National Water-Quality Assessment program. Quantitative characterization of environmental settings, combined with physical, chemical, and biological data collected at sampling sites, contributes to understanding the status of, and influences on, water-quality and ecological conditions. To support studies for the National Water-Quality Assessment program, a geographic information system (GIS) was used to develop a standard set of methods to consistently characterize the sites, drainage basins, and groundwater study areas across the nation. This report describes the three methods used for characterization (simple overlay, area-weighted areal interpolation, and land-cover-weighted areal interpolation) and their appropriate applications to geographic analyses that have different objectives and data constraints. In addition, this document records the GIS thematic datasets that are used for the Program's national design and data analyses.
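A hedged sketch of the second method, area-weighted areal interpolation for an extensive variable such as a population count, is given below; axis-aligned rectangles stand in for real basin and source-zone polygons so the intersection areas can be computed directly, and the counts are invented.

```python
# Sketch of area-weighted areal interpolation for an extensive variable (e.g. a count),
# using axis-aligned rectangles (xmin, ymin, xmax, ymax) in place of real polygons.
def rect_area(r):
    xmin, ymin, xmax, ymax = r
    return max(0.0, xmax - xmin) * max(0.0, ymax - ymin)

def intersection_area(a, b):
    xmin, ymin = max(a[0], b[0]), max(a[1], b[1])
    xmax, ymax = min(a[2], b[2]), min(a[3], b[3])
    return rect_area((xmin, ymin, xmax, ymax))

def areal_interpolation(target, source_zones):
    """source_zones: list of (rectangle, count). Apportion counts by overlapping-area share."""
    total = 0.0
    for zone, count in source_zones:
        share = intersection_area(target, zone) / rect_area(zone)
        total += count * share
    return total

# Hypothetical drainage basin overlapping two census-like source zones.
basin = (2.0, 0.0, 8.0, 4.0)
zones = [((0.0, 0.0, 5.0, 4.0), 1000.0),    # 3/5 of this zone lies in the basin
         ((5.0, 0.0, 10.0, 4.0), 600.0)]    # 3/5 of this zone lies in the basin
print(areal_interpolation(basin, zones))     # 0.6 * 1000 + 0.6 * 600 = 960.0
```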
Darrah, Johanna; Loomis, Joan; Manns, Patricia; Norton, Barbara; May, Laura
2006-11-01
The Department of Physical Therapy, University of Alberta, Edmonton, Alberta, Canada, recently implemented a Master of Physical Therapy (MPT) entry-level degree program. As part of the curriculum design, two models were developed, a Model of Best Practice and the Clinical Decision-Making Model. Both models incorporate four key concepts of the new curriculum: 1) the concept that theory, research, and clinical practice are interdependent and inform each other; 2) the importance of client-centered practice; 3) the terminology and philosophical framework of the World Health Organization's International Classification of Functioning, Disability, and Health; and 4) the importance of evidence-based practice. In this article the general purposes of models for learning are described; the two models developed for the MPT program are described; and examples of their use with curriculum design and teaching are provided. Our experiences with both the development and use of models of practice have been positive. The models have provided both faculty and students with a simple, systematic structured framework to organize teaching and learning in the MPT program.
Simple Simulation Algorithms and Sample Applications
NASA Astrophysics Data System (ADS)
Kröger, Martin
This section offers basic recipes and sample applications which allow the reader to immediately start his/her own simulation project on topics we dealt with in this book. Concerning molecular dynamics and Monte Carlo simulation there are, of course, several useful books already available which describe the ‘art of simulation’ [141, 156, 256] in an exhaustive way. The reason we print some simple codes is that we skipped algorithmic details in the foregoing chapters. Simulations are always performed using dimensionless numbers, and all dimensional quantities can be expressed in terms of reduced units, cf. Sect. 4.3 for conventional Lennard-Jones units. In this chapter, we concentrate on the necessary, and skip anything more sophisticated. Codes have been used in classrooms, they are obviously open for modifications and extensions, and offer not only an executable, but all necessary formulas for doing simulations in the correct (which is often essential) order. The overall spirit is as follows: codes are short, run without changes, demonstrate the main principle in a modular fashion, and are thus in particular open regarding efficiency issues and extensions. Algorithms are presented in the Matlab™ language, which is mostly directly portable to programming languages like Fortran, C, or Mathematica™. For an introduction we refer to [423]. Additional commands needed to visualize the results are given in the figure title for each application. Simulation codes, in a less modular fashion, are also available online at www.complexfluids.ethz.ch. Functions are shared over sections; for that reason, we begin with an alphabetic list of all (nonbuiltin) functions in this chapter.
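In the spirit of these codes, though in Python rather than Matlab, a minimal sketch of the Lennard-Jones pair energy and force in conventional reduced units (epsilon = sigma = 1) is:

```python
# Lennard-Jones pair energy and force magnitude in reduced units (epsilon = sigma = 1).
import numpy as np

def lj_energy(r):
    inv6 = 1.0 / r**6
    return 4.0 * (inv6**2 - inv6)

def lj_force(r):
    """Magnitude of the pair force, -dU/dr (positive = repulsive)."""
    inv6 = 1.0 / r**6
    return 24.0 * (2.0 * inv6**2 - inv6) / r

r = np.array([1.0, 2.0**(1.0 / 6.0), 1.5, 2.5])
print(lj_energy(r))   # energy minimum of -1 at r = 2^(1/6)
print(lj_force(r))    # force vanishes at the minimum
```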
Testing the Structure of Hydrological Models using Genetic Programming
NASA Astrophysics Data System (ADS)
Selle, B.; Muttil, N.
2009-04-01
Genetic Programming is able to systematically explore many alternative model structures of different complexity from available input and response data. We hypothesised that genetic programming can be used to test the structure of hydrological models and to identify dominant processes in hydrological systems. To test this, genetic programming was used to analyse a data set from a lysimeter experiment in southeastern Australia. The lysimeter experiment was conducted to quantify the deep percolation response under surface irrigated pasture to different soil types, water table depths and water ponding times during surface irrigation. Using genetic programming, a simple model of deep percolation was consistently evolved in multiple model runs. This simple and interpretable model confirmed the dominant process contributing to deep percolation represented in a conceptual model that was published earlier. Thus, this study shows that genetic programming can be used to evaluate the structure of hydrological models and to gain insight about the dominant processes in hydrological systems.
MM Algorithms for Geometric and Signomial Programming
Lange, Kenneth; Zhou, Hua
2013-01-01
This paper derives new algorithms for signomial programming, a generalization of geometric programming. The algorithms are based on a generic principle for optimization called the MM algorithm. In this setting, one can apply the geometric-arithmetic mean inequality and a supporting hyperplane inequality to create a surrogate function with parameters separated. Thus, unconstrained signomial programming reduces to a sequence of one-dimensional minimization problems. Simple examples demonstrate that the MM algorithm derived can converge to a boundary point or to one point of a continuum of minimum points. Conditions under which the minimum point is unique or occurs in the interior of parameter space are proved for geometric programming. Convergence to an interior point occurs at a linear rate. Finally, the MM framework easily accommodates equality and inequality constraints of signomial type. For the most important special case, constrained quadratic programming, the MM algorithm involves very simple updates. PMID:24634545
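To make the arithmetic-geometric mean surrogate concrete on a toy problem (not the paper's general signomial algorithm), the sketch below minimizes the posynomial f(x1, x2) = x1*x2 + 1/x1 + 1/x2, whose minimum is 3 at x1 = x2 = 1. The cross term is majorized so that each MM iteration splits into two one-dimensional minimizations, here solved numerically with SciPy (assumed available).

```python
# MM sketch for a toy posynomial f(x1, x2) = x1*x2 + 1/x1 + 1/x2 (minimum 3 at x = (1, 1)).
# The cross term x1*x2 is majorized at the current iterate y via the weighted AM-GM inequality:
#   x1*x2 <= (y1*y2/2) * ((x1/y1)**2 + (x2/y2)**2),  with equality at x = y,
# so each MM step reduces to two independent one-dimensional minimizations.
from scipy.optimize import minimize_scalar

def f(x1, x2):
    return x1 * x2 + 1.0 / x1 + 1.0 / x2

def mm_step(y1, y2):
    g1 = lambda x1: 0.5 * (y2 / y1) * x1**2 + 1.0 / x1     # surrogate piece in x1
    g2 = lambda x2: 0.5 * (y1 / y2) * x2**2 + 1.0 / x2     # surrogate piece in x2
    x1 = minimize_scalar(g1, bounds=(1e-6, 1e6), method='bounded').x
    x2 = minimize_scalar(g2, bounds=(1e-6, 1e6), method='bounded').x
    return x1, x2

x = (2.0, 0.5)
for _ in range(25):
    x = mm_step(*x)
print(x, f(*x))    # approaches (1, 1) and 3; the objective decreases monotonically
```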
Lancioni, Giulio E; Singh, Nirbhay N; O'Reilly, Mark F; Sigafoos, Jeff; D'Amico, Fiora; Pinto, Katia; Chiapparino, Claudia
2017-05-01
These 2 studies assessed a technology-aided program to support mild physical exercise or simple occupational activity in participants with moderate to severe Alzheimer's disease. Study 1 included 11 participants who were to perform a leg-raising response. Study 2 included 10 participants who were to sort objects into different containers. The program ensured that the participants received positive stimulation contingent on the responses and reminders/prompts after periods of nonresponding. Each study was carried out according to a nonconcurrent multiple baseline design across participants. The program was successful in supporting mild physical exercise and activity with objects in the 2 groups of participants, respectively. The participants also showed signs of positive involvement (e.g., smiles and verbalizations) during the sessions. Moreover, staff personnel rated the program and its impact positively. The program may be considered a practical resource for supporting positive engagement in persons with moderate to severe Alzheimer's disease.
CAI-BASIC: A Program to Teach the Programming Language BASIC.
ERIC Educational Resources Information Center
Barry, Thomas Anthony
A computer-assisted instruction (CAI) program was designed which fulfills the objectives of teaching a simple programing language, interpreting student responses, and executing and editing student programs. The CAI-BASIC program is written in FORTRAN IV and executes on IBM-2741 terminals while running under a time-sharing system on an IBM-360-70…
The `Miracle' of Applicability? The Curious Case of the Simple Harmonic Oscillator
NASA Astrophysics Data System (ADS)
Bangu, Sorin; Moir, Robert H. C.
2018-05-01
The paper discusses to what extent the conceptual issues involved in solving the simple harmonic oscillator model fit Wigner's famous point that the applicability of mathematics borders on the miraculous. We argue that although there is ultimately nothing mysterious here, as is to be expected, a careful demonstration that this is so involves unexpected difficulties. Consequently, through the lens of this simple case we derive some insight into what is responsible for the appearance of mystery in more sophisticated examples of the Wigner problem.
Configuration Management of an Optimization Application in a Research Environment
NASA Technical Reports Server (NTRS)
Townsend, James C.; Salas, Andrea O.; Schuler, M. Patricia
1999-01-01
Multidisciplinary design optimization (MDO) research aims to increase interdisciplinary communication and reduce design cycle time by combining system analyses (simulations) with design space search and decision making. The High Performance Computing and Communication Program's current High Speed Civil Transport application, HSCT4.0, at NASA Langley Research Center involves a highly complex analysis process with high-fidelity analyses that are more realistic than previous efforts at the Center. The multidisciplinary processes have been integrated to form a distributed application by using the Java language and Common Object Request Broker Architecture (CORBA) software techniques. HSCT4.0 is a research project in which both the application problem and the implementation strategy have evolved as the MDO and integration issues became better understood. Whereas earlier versions of the application and integrated system were developed with a simple, manual software configuration management (SCM) process, it was evident that this larger project required a more formal SCM procedure. This report briefly describes the HSCT4.0 analysis and its CORBA implementation and then discusses some SCM concepts and their application to this project. In anticipation that SCM will prove beneficial for other large research projects, the report concludes with some lessons learned in overcoming SCM implementation problems for HSCT4.0.
The MOLGENIS toolkit: rapid prototyping of biosoftware at the push of a button
2010-01-01
Background There is a huge demand on bioinformaticians to provide their biologists with user friendly and scalable software infrastructures to capture, exchange, and exploit the unprecedented amounts of new *omics data. We here present MOLGENIS, a generic, open source, software toolkit to quickly produce the bespoke MOLecular GENetics Information Systems needed. Methods The MOLGENIS toolkit provides bioinformaticians with a simple language to model biological data structures and user interfaces. At the push of a button, MOLGENIS’ generator suite automatically translates these models into a feature-rich, ready-to-use web application including database, user interfaces, exchange formats, and scriptable interfaces. Each generator is a template of SQL, JAVA, R, or HTML code that would require much effort to write by hand. This ‘model-driven’ method ensures reuse of best practices and improves quality because the modeling language and generators are shared between all MOLGENIS applications, so that errors are found quickly and improvements are shared easily by a re-generation. A plug-in mechanism ensures that both the generator suite and generated product can be customized just as much as hand-written software. Results In recent years we have successfully evaluated the MOLGENIS toolkit for the rapid prototyping of many types of biomedical applications, including next-generation sequencing, GWAS, QTL, proteomics and biobanking. Writing 500 lines of model XML typically replaces 15,000 lines of hand-written programming code, which allows for quick adaptation if the information system is not yet to the biologist’s satisfaction. Each application generated with MOLGENIS comes with an optimized database back-end, user interfaces for biologists to manage and exploit their data, programming interfaces for bioinformaticians to script analysis tools in R, Java, SOAP, REST/JSON and RDF, a tab-delimited file format to ease upload and exchange of data, and detailed technical documentation. Existing databases can be quickly enhanced with MOLGENIS generated interfaces using the ‘ExtractModel’ procedure. Conclusions The MOLGENIS toolkit provides bioinformaticians with a simple model to quickly generate flexible web platforms for all possible genomic, molecular and phenotypic experiments with a richness of interfaces not provided by other tools. All the software and manuals are available free as LGPLv3 open source at http://www.molgenis.org. PMID:21210979
Physical Applications of a Simple Approximation of Bessel Functions of Integer Order
ERIC Educational Resources Information Center
Barsan, V.; Cojocaru, S.
2007-01-01
Applications of a simple approximation of Bessel functions of integer order, in terms of trigonometric functions, are discussed for several examples from electromagnetism and optics. The method may be applied in the intermediate regime, bridging the "small values regime" and the "asymptotic" one, and covering, in this way, an area of great…
Estimating linear temporal trends from aggregated environmental monitoring data
Erickson, Richard A.; Gray, Brian R.; Eager, Eric A.
2017-01-01
Trend estimates are often used as part of environmental monitoring programs. These trends inform managers (e.g., are desired species increasing or undesired species decreasing?). Data collected from environmental monitoring programs are often aggregated (i.e., averaged), which confounds sampling and process variation. State-space models allow sampling variation and process variation to be separated. We used simulated time-series to compare linear trend estimations from three state-space models, a simple linear regression model, and an auto-regressive model. We also compared the performance of these five models to estimate trends from a long-term monitoring program. We specifically estimated trends for two species of fish and four species of aquatic vegetation from the Upper Mississippi River system. We found that the simple linear regression had the best performance of all the given models because it was best able to recover parameters and had consistent numerical convergence. Conversely, the simple linear regression did the worst job estimating populations in a given year. The state-space models did not estimate trends well, but estimated population sizes best when the models converged. We found that a simple linear regression performed better than more complex autoregression and state-space models when used to analyze aggregated environmental monitoring data.
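As a simple illustration of the aggregation issue described above, the sketch below (illustrative only, not the authors' analysis) simulates a time series with separate process and sampling noise, aggregates it by annual averaging, and fits a linear trend by ordinary least squares.

```python
# Illustrative sketch: aggregation (averaging) folds sampling noise into the
# annual means before a simple linear trend is fitted by least squares.
import numpy as np

rng = np.random.default_rng(42)
years = np.arange(2000, 2020)
true_slope = 0.5
process = 10 + true_slope * (years - years[0]) + rng.normal(0, 1.0, years.size)

# several sites sampled per year; sampling error added on top of the process value
samples = process[:, None] + rng.normal(0, 2.0, (years.size, 8))
annual_mean = samples.mean(axis=1)           # the aggregated data

slope, intercept = np.polyfit(years - years[0], annual_mean, deg=1)
print(f"estimated trend: {slope:.2f} per year (true value {true_slope})")
```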
EnviroLand: A Simple Computer Program for Quantitative Stream Assessment.
ERIC Educational Resources Information Center
Dunnivant, Frank; Danowski, Dan; Timmens-Haroldson, Alice; Newman, Meredith
2002-01-01
Introduces the Enviroland computer program which features lab simulations of theoretical calculations for quantitative analysis and environmental chemistry, and fate and transport models. Uses the program to demonstrate the nature of linear and nonlinear equations. (Author/YDS)
Collen, M F
1994-01-01
This article summarizes the origins of informatics, which is based on the science, engineering, and technology of computer hardware, software, and communications. In just four decades, from the 1950s to the 1990s, computer technology has progressed from slow, first-generation vacuum tubes, through the invention of the transistor and its incorporation into microprocessor chips, and ultimately, to fast, fourth-generation very-large-scale-integrated silicon chips. Programming has undergone a parallel transformation, from cumbersome, first-generation, machine languages to efficient, fourth-generation application-oriented languages. Communication has evolved from simple copper wires to complex fiberoptic cables in computer-linked networks. The digital computer has profound implications for the development and practice of clinical medicine. PMID:7719803
Infinite horizon optimal impulsive control with applications to Internet congestion control
NASA Astrophysics Data System (ADS)
Avrachenkov, Konstantin; Habachi, Oussama; Piunovskiy, Alexey; Zhang, Yi
2015-04-01
We investigate infinite-horizon deterministic optimal control problems with both gradual and impulsive controls, where any finitely many impulses are allowed simultaneously. Both discounted and long-run time-average criteria are considered. We establish very general and at the same time natural conditions, under which the dynamic programming approach results in an optimal feedback policy. The established theoretical results are applied to the Internet congestion control, and by solving analytically and nontrivially the underlying optimal control problems, we obtain a simple threshold-based active queue management scheme, which takes into account the main parameters of the transmission control protocols, and improves the fairness among the connections in a given network.
BioServices: a common Python package to access biological Web Services programmatically.
Cokelaer, Thomas; Pultz, Dennis; Harder, Lea M; Serra-Musach, Jordi; Saez-Rodriguez, Julio
2013-12-15
Web interfaces provide access to numerous biological databases. Many can be accessed in a programmatic way thanks to Web Services. Building applications that combine several of them would benefit from a single framework. BioServices is a comprehensive Python framework that provides programmatic access to major bioinformatics Web Services (e.g. KEGG, UniProt, BioModels, ChEMBLdb). Wrapping additional Web Services based either on Representational State Transfer or Simple Object Access Protocol/Web Services Description Language technologies is eased by the usage of object-oriented programming. BioServices releases and documentation are available at http://pypi.python.org/pypi/bioservices under a GPL-v3 license.
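A hypothetical usage sketch is given below. The service classes (KEGG, UniProt) follow the names used in the BioServices documentation, but the exact method signatures and returned formats are assumptions here and should be verified against the installed release.

```python
# Hypothetical BioServices usage sketch -- method names and return formats
# should be checked against the installed BioServices version.
from bioservices import KEGG, UniProt

kegg = KEGG()
entry = kegg.get("hsa:7535")                 # retrieve a KEGG entry (human ZAP70)
print(entry[:200])

uniprot = UniProt()
record = uniprot.retrieve("P43403", "txt")   # assumed call; format string may differ
print(record[:200])
```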
(abstract) Simple Spreadsheet Thermal Models for Cryogenic Applications
NASA Technical Reports Server (NTRS)
Nash, A. E.
1994-01-01
Self consistent circuit analog thermal models, that can be run in commercial spreadsheet programs on personal computers, have been created to calculate the cooldown and steady state performance of cryogen cooled Dewars. The models include temperature dependent conduction and radiation effects. The outputs of the models provide temperature distribution and Dewar performance information. These models have been used to analyze the Cryogenic Telescope Test Facility (CTTF). The facility will be on line in early 1995 for its first user, the Infrared Telescope Technology Testbed (ITTT), for the Space Infrared Telescope Facility (SIRTF) at JPL. The model algorithm as well as a comparison of the model predictions and actual performance of this facility will be presented.
Simple Spreadsheet Thermal Models for Cryogenic Applications
NASA Technical Reports Server (NTRS)
Nash, Alfred
1995-01-01
Self consistent circuit analog thermal models that can be run in commercial spreadsheet programs on personal computers have been created to calculate the cooldown and steady state performance of cryogen cooled Dewars. The models include temperature dependent conduction and radiation effects. The outputs of the models provide temperature distribution and Dewar performance information. These models have been used to analyze the SIRTF Telescope Test Facility (STTF). The facility has been brought on line for its first user, the Infrared Telescope Technology Testbed (ITTT), for the Space Infrared Telescope Facility (SIRTF) at JPL. The model algorithm as well as a comparison between the models' predictions and actual performance of this facility will be presented.
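The circuit-analog approach described in the two abstracts above can be mimicked outside a spreadsheet as well. The following is a minimal sketch (not the NASA models) of a two-node lumped thermal network with a temperature-dependent conductance and a radiation link, integrated explicitly to estimate a cooldown curve; all property values are placeholders.

```python
# Minimal circuit-analog cooldown sketch: two lumped nodes coupled by a
# temperature-dependent conductance, with one node radiating to a cold
# boundary.  All numbers are illustrative placeholders, not Dewar data.
import numpy as np

SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W/m^2/K^4

def conductance(T):       # toy temperature-dependent strap conductance, W/K
    return 0.05 * (T / 300.0)

C = np.array([500.0, 200.0])      # heat capacities of the two nodes, J/K
T = np.array([300.0, 300.0])      # initial temperatures, K
T_sink = 80.0                     # cryogen boundary temperature, K
area_eps = 0.01                   # emissivity * area of radiating node, m^2

dt, t_end = 1.0, 20_000.0
for _ in range(int(t_end / dt)):
    q_cond = conductance(0.5 * (T[0] + T[1])) * (T[0] - T[1])   # node 0 -> 1
    q_sink = 5.0 * (T[1] - T_sink)                              # strap to cryogen
    q_rad = area_eps * SIGMA * (T[0] ** 4 - T_sink ** 4)        # radiation loss
    T[0] += dt * (-q_cond - q_rad) / C[0]
    T[1] += dt * (q_cond - q_sink) / C[1]

print(f"final temperatures: node0 = {T[0]:.1f} K, node1 = {T[1]:.1f} K")
```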
Coherent light squeezing states within a modified microring system
NASA Astrophysics Data System (ADS)
Ali, J.; Pornsuwancharoen, N.; Youplao, P.; Aziz, M. S.; Amiri, I. S.; Chaiwong, K.; Chiangga, S.; Singh, G.; Yupapin, P.
2018-06-01
We propose a simple method for squeezed-light generation in a modified microring resonator known as the microring conjugate mirror (MCM). When monochromatic light is input into the MCM, the general form of the squeezed coherent states of a quantum harmonic oscillator can be generated by controlling the two additional side rings, which act as phase modulators. Using the graphical Optiwave simulation program, the squeezed coherent states of light within an MCM can be obtained and interpreted in terms of amplitude, phase, quadrature and photon-number squeezing. This method shows potential for the design of microring-related devices ahead of practical applications.
ITER structural design criteria and their extension to advanced reactor blankets*1
NASA Astrophysics Data System (ADS)
Majumdar, S.; Kalinin, G.
2000-12-01
Applications of the recent ITER structural design criteria (ISDC) are illustrated by two components. First, the low-temperature-design rules are applied to copper alloys that are particularly prone to irradiation embrittlement at relatively low fluences at certain temperatures. Allowable stresses are derived and the impact of the embrittlement on allowable surface heat flux of a simple first-wall/limiter design is demonstrated. Next, the high-temperature-design rules of ISDC are applied to evaporation of lithium and vapor extraction (EVOLVE), a blanket design concept currently being investigated under the US Advanced Power Extraction (APEX) program. A single tungsten first-wall tube is considered for thermal and stress analyses by finite-element method.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boman, Erik G.
This LDRD project was a campus exec fellowship to fund (in part) Donald Nguyen’s PhD research at UT-Austin. His work has focused on parallel programming models, and scheduling irregular algorithms on shared-memory systems using the Galois framework. Galois provides a simple but powerful way for users and applications to automatically obtain good parallel performance using certain supported data containers. The naïve user can write serial code, while advanced users can optimize performance by advanced features, such as specifying the scheduling policy. Galois was used to parallelize two sparse matrix reordering schemes: RCM and Sloan. Such reordering is important in high-performance computing to obtain better data locality and thus reduce run times.
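The report concerns the Galois framework itself, but the effect of an RCM reordering on matrix bandwidth is easy to demonstrate with standard tools. The sketch below uses SciPy's serial implementation (not the Galois code) to reorder a random sparse symmetric matrix and report the bandwidth before and after.

```python
# Illustration of RCM reordering (serial SciPy version, not the Galois code):
# the permutation clusters nonzeros near the diagonal, reducing bandwidth.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.csgraph import reverse_cuthill_mckee

def bandwidth(A):
    A = A.tocoo()
    return int(np.abs(A.row - A.col).max()) if A.nnz else 0

A = sp.random(2000, 2000, density=0.002, format="csr", random_state=0)
A = (A + A.T).tocsr()                      # symmetrize the sparsity pattern

perm = reverse_cuthill_mckee(A, symmetric_mode=True)
B = A[perm, :][:, perm]                    # apply the symmetric permutation

print("bandwidth before:", bandwidth(A), " after RCM:", bandwidth(B))
```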
Implementation of customized health information technology in diabetes self management programs.
Alexander, Susan; Frith, Karen H; O'Keefe, Louise; Hennigan, Michael A
2011-01-01
The project was a nurse-led implementation of a software application, designed to combine clinical and demographic records for a diabetes education program, which would result in secure, long-term record storage. Clinical information systems may be prohibitively expensive for small practices and require extensive training for implementation. A review of the literature suggests that the use of simple, practice-based registries offer an economical method of monitoring the outcomes of diabetic patients. The database was designed using a common software application, Microsoft Access. The theory used to guide implementation and staff training was Rogers' Diffusion of Innovations theory (1995). Outcomes after a 3-month period included incorporation of 100% of new clinical and demographic patient records into the database and positive changes in staff attitudes regarding software applications used in diabetes self-management training. These objectives were met while keeping project costs under budgeted amounts. As a function of the clinical nurse specialist (CNS) researcher role, there is a need for CNSs to identify innovative and economical methods of data collection. The success of this nurse-led project reinforces suggestions in the literature for less costly methods of data maintenance in small practice settings. Ongoing utilization and enhancement have resulted in the creation of a robust database that could aid in the research of multiple clinical issues. Clinical nurse specialists can use existing evidence to guide and improve both their own practice and outcomes for patients and organizations. Further research regarding specific factors that predict efficient transition of informatics applications, how these factors vary according to practice settings, and the role of the CNS in implementation of such applications is needed.
Kim, Hye Hyeon
2014-01-01
Objectives Diabetes is a chronic disease of continuously increasing prevalence. It is a disease with risks of serious complications, thus warranting its long-term management. However, current health management and education programs for diabetes mainly consist of one-way communication, and systematic social support backup to solve diabetics' emotional problems is insufficient. Methods According to individual behavioral changes based on the Transtheoretical Model, we designed a non-drug intervention, including exercise, and applied it to a mobile based application. For effective data sharing between patients and physicians, we adopted an SNS function for our application in order to offer a social support environment. Results To induce continual and comprehensive care for diabetes, rigorous self-management is essential during the diabetic's life; this is possible through a collaborative patient-physician healthcare model. We designed and developed an SNS-based diabetes self-management mobile application that supports the use of social groups, which are present in three social GYM types. With simple testing of patients in their 20s and 30s, we were able to validate the usefulness of our application. Conclusions Mobile gadget-based chronic disease symptom management and intervention has the merit that health management can be conducted anywhere and anytime in order to cope with increases in the demand for health and medical services that are occurring due to the aging of the population and to cope with the surge of national medical service costs. This patient-driven and SNS-based intervention program is expected to contribute to promoting the health management habits of diabetics, who need to constantly receive health guidance. PMID:25152836
SIMPSON: A General Simulation Program for Solid-State NMR Spectroscopy
NASA Astrophysics Data System (ADS)
Bak, Mads; Rasmussen, Jimmy T.; Nielsen, Niels Chr.
2000-12-01
A computer program for fast and accurate numerical simulation of solid-state NMR experiments is described. The program is designed to emulate an NMR spectrometer by letting the user specify high-level NMR concepts such as spin systems, nuclear spin interactions, RF irradiation, free precession, phase cycling, coherence-order filtering, and implicit/explicit acquisition. These elements are implemented using the Tcl scripting language to ensure a minimum of programming overhead and direct interpretation without the need for compilation, while maintaining the flexibility of a full-featured programming language. Basically, there are no intrinsic limitations to the number of spins, types of interactions, sample conditions (static or spinning, powders, uniaxially oriented molecules, single crystals, or solutions), and the complexity or number of spectral dimensions for the pulse sequence. The applicability ranges from simple 1D experiments to advanced multiple-pulse and multiple-dimensional experiments, series of simulations, parameter scans, complex data manipulation/visualization, and iterative fitting of simulated to experimental spectra. A major effort has been devoted to optimizing the computation speed using state-of-the-art algorithms for the time-consuming parts of the calculations implemented in the core of the program using the C programming language. Modification and maintenance of the program are facilitated by releasing the program as open source software (General Public License) currently at http://nmr.imsb.au.dk. The general features of the program are demonstrated by numerical simulations of various aspects for REDOR, rotational resonance, DRAMA, DRAWS, HORROR, C7, TEDOR, POST-C7, CW decoupling, TPPM, F-SLG, SLF, SEMA-CP, PISEMA, RFDR, QCPMG-MAS, and MQ-MAS experiments.
SIMPSON: A general simulation program for solid-state NMR spectroscopy
NASA Astrophysics Data System (ADS)
Bak, Mads; Rasmussen, Jimmy T.; Nielsen, Niels Chr.
2011-12-01
A computer program for fast and accurate numerical simulation of solid-state NMR experiments is described. The program is designed to emulate an NMR spectrometer by letting the user specify high-level NMR concepts such as spin systems, nuclear spin interactions, RF irradiation, free precession, phase cycling, coherence-order filtering, and implicit/explicit acquisition. These elements are implemented using the Tcl scripting language to ensure a minimum of programming overhead and direct interpretation without the need for compilation, while maintaining the flexibility of a full-featured programming language. Basically, there are no intrinsic limitations to the number of spins, types of interactions, sample conditions (static or spinning, powders, uniaxially oriented molecules, single crystals, or solutions), and the complexity or number of spectral dimensions for the pulse sequence. The applicability ranges from simple 1D experiments to advanced multiple-pulse and multiple-dimensional experiments, series of simulations, parameter scans, complex data manipulation/visualization, and iterative fitting of simulated to experimental spectra. A major effort has been devoted to optimizing the computation speed using state-of-the-art algorithms for the time-consuming parts of the calculations implemented in the core of the program using the C programming language. Modification and maintenance of the program are facilitated by releasing the program as open source software (General Public License) currently at http://nmr.imsb.au.dk. The general features of the program are demonstrated by numerical simulations of various aspects for REDOR, rotational resonance, DRAMA, DRAWS, HORROR, C7, TEDOR, POST-C7, CW decoupling, TPPM, F-SLG, SLF, SEMA-CP, PISEMA, RFDR, QCPMG-MAS, and MQ-MAS experiments.
Measuring Success: Evaluating Educational Programs
ERIC Educational Resources Information Center
Fisher, Yael
2010-01-01
This paper reveals a new evaluation model, which enables educational program and project managers to evaluate their programs with a simple and easy to understand approach. The "index of success model" is comprised of five parameters that enable to focus on and evaluate both the implementation and results of an educational program. The…
Using Small-Step Refinement for Algorithm Verification in Computer Science Education
ERIC Educational Resources Information Center
Simic, Danijela
2015-01-01
Stepwise program refinement techniques can be used to simplify program verification. Programs are better understood since their main properties are clearly stated, and verification of rather complex algorithms is reduced to proving simple statements connecting successive program specifications. Additionally, it is easy to analyse similar…
Testing the structure of a hydrological model using Genetic Programming
NASA Astrophysics Data System (ADS)
Selle, Benny; Muttil, Nitin
2011-01-01
Genetic Programming is able to systematically explore many alternative model structures of different complexity from available input and response data. We hypothesised that Genetic Programming can be used to test the structure of hydrological models and to identify dominant processes in hydrological systems. To test this, Genetic Programming was used to analyse a data set from a lysimeter experiment in southeastern Australia. The lysimeter experiment was conducted to quantify the deep percolation response under surface irrigated pasture to different soil types, watertable depths and water ponding times during surface irrigation. Using Genetic Programming, a simple model of deep percolation was recurrently evolved in multiple Genetic Programming runs. This simple and interpretable model supported the dominant process contributing to deep percolation represented in a conceptual model that was published earlier. Thus, this study shows that Genetic Programming can be used to evaluate the structure of hydrological models and to gain insight about the dominant processes in hydrological systems.
NASA Astrophysics Data System (ADS)
Lee, Ho Won; Lee, Ki-Heon; Lee, Jae Woo; Kim, Jong-Hoon; Yang, Heesun; Kim, Young Kwan
2015-02-01
In this work, a simple process for fabricating hybrid quantum dot (QD)/organic light-emitting diodes (OLEDs) was proposed to produce white illumination light by combining a QD plate with organic fluorescence. Conventional blue fluorescent OLEDs were first fabricated, and QD plates of various concentrations, whose UV-vis absorption and photoluminescence spectra can be controlled, were then attached under the glass substrate of the completed blue devices. The suggested approach shows that a white device can be fabricated through a very simple process without any deposition of orange or red organic emitters. This work therefore demonstrates a potentially simple process for white-light applications that can also be extended to further research on lighting applications.
NASA Astrophysics Data System (ADS)
Maas, Christian; Schmalzl, Jörg
2013-08-01
Ground Penetrating Radar (GPR) is used for the localization of supply lines, land mines, pipes and many other buried objects. These objects can be recognized in the recorded data as reflection hyperbolas with a typical shape depending on depth and material of the object and the surrounding material. To obtain the parameters, the shape of the hyperbola has to be fitted. In recent years several methods were developed to automate this task during post-processing. In this paper we show another approach for the automated localization of reflection hyperbolas in GPR data by solving a pattern recognition problem in grayscale images. In contrast to other methods, our detection program is also able to immediately mark potential objects in real-time. For this task we use a version of the Viola-Jones learning algorithm, which is part of the open source library "OpenCV". This algorithm was initially developed for face recognition, but can be adapted to any other simple shape. In our program it is used to narrow down the location of reflection hyperbolas to certain areas in the GPR data. In order to extract the exact location and the velocity of the hyperbolas we apply a simple Hough Transform for hyperbolas. Because the Viola-Jones algorithm dramatically reduces the input to the computationally expensive Hough Transform, the detection system can also be implemented on normal field computers, so on-site application is possible. The developed detection system shows promising results and detection rates in unprocessed radargrams. In order to improve the detection results and apply the program to noisy radar images, more data from different GPR systems are needed as input for the learning algorithm.
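As a rough sketch of the candidate-detection stage only (the cascade file name is hypothetical and would have to come from training on labelled radargrams, and the hyperbola-specific Hough step is omitted), an OpenCV cascade classifier can be applied to a grayscale radargram like any other image:

```python
# Sketch of the candidate-detection stage: a cascade classifier, trained
# beforehand on labelled hyperbola patches (the XML file name below is
# hypothetical), marks candidate regions in a grayscale radargram.  The
# hyperbola-specific Hough transform for exact apex/velocity is not shown.
import cv2

radargram = cv2.imread("radargram.png", cv2.IMREAD_GRAYSCALE)  # placeholder file
cascade = cv2.CascadeClassifier("hyperbola_cascade.xml")        # trained cascade

candidates = cascade.detectMultiScale(radargram, scaleFactor=1.1, minNeighbors=3)

marked = cv2.cvtColor(radargram, cv2.COLOR_GRAY2BGR)
for (x, y, w, h) in candidates:
    cv2.rectangle(marked, (x, y), (x + w, y + h), (0, 0, 255), 2)
cv2.imwrite("radargram_candidates.png", marked)
```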
ERIC Educational Resources Information Center
Rockwell, S. Kay; Albrecht, Julie A.; Nugent, Gwen C.; Kunz, Gina M.
2012-01-01
Targeting Outcomes of Programs (TOP) is a seven-step hierarchical programming model in which the program development and performance sides are mirror images of each other. It served as a framework to identify a simple method for targeting photographic events in nonformal education programs, indicating why, when, and how photographs would be useful…
Khanal, Sumesh; Burgon, Joseph; Leonard, Saoirse; Griffiths, Matthew; Eddowes, Lucy A
2015-11-01
A lack of decisive evidence on the impact of telemedicine on financial and clinical outcomes has not prohibited significant investment in developing countries. Understanding characteristics that facilitate effective telemedicine programs is required to allow telemedicine to be used to its full potential. This systematic review aimed to identify organizational, technological, and financial features of successful telemedicine programs providing direct clinical care in developing countries. Databases were searched, and the results were reviewed systematically according to predefined inclusion/exclusion criteria. Information on location(s), measure of success, and organizational, technological, and financial characteristics were extracted. This review was impeded by inadequate program reporting, and so a concise checklist was developed to aid improved reporting, enabling future reviews to identify key characteristics of effective programs. This systematic review identified 46 articles reporting 36 programs that fulfilled the inclusion/exclusion criteria. Programs were distributed globally, including regional, national, and international programs. Technological modalities included synchronous technology, real-time teleconsultations, and asynchronous technology. Program integration with existing systems and twinning of international institutions were identified as factors enabling program success. Other factors included simple and easy-to-use technology, ability to reduce the burden on healthcare professionals, and technology able to maintain functionality in challenging environmental circumstances. Reports describing effectiveness and costs were limited. This systematic review identified key factors associated with telemedicine program success. However, inconsistencies in reporting represent an obstacle to establishment of successful programs in developing countries by limiting the application of previous experiences. Adhering to the guidelines suggested here may allow more quantitative assessments of effectiveness and impact for future programs.
Python-Based Applications for Hydrogeological Modeling
NASA Astrophysics Data System (ADS)
Khambhammettu, P.
2013-12-01
Python is a general-purpose, high-level programming language whose design philosophy emphasizes code readability. Add-on packages supporting fast array computation (numpy), plotting (matplotlib), and scientific/mathematical functions (scipy) have resulted in a powerful ecosystem for scientists interested in exploratory data analysis, high-performance computing and data visualization. Three examples are provided to demonstrate the applicability of the Python environment in hydrogeological applications. Python programs were used to model an aquifer test and estimate aquifer parameters at a Superfund site. The aquifer test conducted at a Groundwater Circulation Well was modeled with the Python/FORTRAN-based TTIM Analytic Element Code. The aquifer parameters were estimated with PEST such that a good match was produced between the simulated and observed drawdowns. Python scripts were written to interface with PEST and visualize the results. A convolution-based approach was used to estimate source concentration histories based on observed concentrations at receptor locations. Unit Response Functions (URFs) that relate the receptor concentrations to a unit release at the source were derived with the ATRANS code. The impact of any releases at the source could then be estimated by convolving the source release history with the URFs. Python scripts were written to compute and visualize receptor concentrations for user-specified source histories. The framework provided a simple and elegant way to test various hypotheses about the site. A Python/FORTRAN-based program TYPECURVEGRID-Py was developed to compute and visualize groundwater elevations and drawdown through time in response to a regional uniform hydraulic gradient and the influence of pumping wells using either the Theis solution for a fully-confined aquifer or the Hantush-Jacob solution for a leaky confined aquifer. The program supports an arbitrary number of wells that can operate according to arbitrary schedules. The Python wrapper invokes the underlying FORTRAN layer to compute transient groundwater elevations and processes this information to create time-series and 2D plots.
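To make the aquifer-test part concrete, here is a minimal sketch of computing Theis drawdown in Python. This is the generic textbook formula, not the site model or the TTIM/PEST workflow described above, and the parameter values are placeholders.

```python
# Theis drawdown for a fully confined aquifer:
#   s(r, t) = Q / (4*pi*T) * W(u),   u = r**2 * S / (4*T*t),
# where W(u) = exp1(u) is the exponential-integral well function.
import numpy as np
from scipy.special import exp1

def theis_drawdown(Q, T, S, r, t):
    """Q pumping rate [m^3/d], T transmissivity [m^2/d], S storativity [-],
    r radial distance [m], t time since pumping began [d]."""
    u = r**2 * S / (4.0 * T * t)
    return Q / (4.0 * np.pi * T) * exp1(u)

# placeholder parameters for illustration only
t = np.logspace(-3, 1, 50)                 # days
s = theis_drawdown(Q=500.0, T=250.0, S=2e-4, r=30.0, t=t)
print(f"drawdown after {t[-1]:.1f} d at r = 30 m: {s[-1]:.2f} m")
```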
Taimoory, S Maryamdokht; Sadraei, S Iraj; Fayoumi, Rose Anne; Nasri, Sarah; Revington, Matthew; Trant, John F
2018-04-20
The reaction between furans and maleimides has increasingly become a method of interest as its reversibility makes it a useful tool for applications ranging from self-healing materials, to self-immolative polymers, to hydrogels for cell culture and for the preparation of bone repair. However, most of these applications have relied on simple monosubstituted furans and simple maleimides and have not extensively evaluated the potential thermal variability inherent in the process that is achievable through simple substrate modification. A small library of cycloadducts suitable for the above applications was prepared, and the temperature dependence of the retro-Diels-Alder processes was determined through in situ 1H NMR analyses complemented by computational calculations. The practical range of the reported systems runs from 40 to >110 °C. The cycloreversion reactions are more complex than would be expected from simple trends based on frontier molecular orbital analyses of the materials.
PYROLASER - PYROLASER OPTICAL PYROMETER OPERATING SYSTEM
NASA Technical Reports Server (NTRS)
Roberts, F. E.
1994-01-01
The PYROLASER package is an operating system for the Pyrometer Instrument Company's Pyrolaser. There are 6 individual programs in the PYROLASER package: two main programs, two lower level subprograms, and two programs which, although independent, function predominantly as macros. The package provides a quick and easy way to setup, control, and program a standard Pyrolaser. Temperature and emissivity measurements may be either collected as if the Pyrolaser were in the manual operations mode, or displayed on real time strip charts and stored in standard spreadsheet format for post-test analysis. A shell is supplied to allow macros, which are test-specific, to be easily added to the system. The Pyrolaser Simple Operation program provides full on-screen remote operation capabilities, thus allowing the user to operate the Pyrolaser from the computer just as it would be operated manually. The Pyrolaser Simple Operation program also allows the use of "quick starts". Quick starts provide an easy way to permit routines to be used as setup macros for specific applications or tests. The specific procedures required for a test may be ordered in a sequence structure and then the sequence structure can be started with a simple button in the cluster structure provided. One quick start macro is provided for continuous Pyrolaser operation. A subprogram, Display Continuous Pyr Data, is used to display and store the resulting data output. Using this macro, the system is set up for continuous operation and the subprogram is called to display the data in real time on strip charts. The data is simultaneously stored in a spreadsheet format. The resulting spreadsheet file can be opened in any one of a number of commercially available spreadsheet programs. The Read Continuous Pyrometer program is provided as a continuously run subprogram for incorporation of the Pyrolaser software into a process control or feedback control scheme in a multi-component system. The program requires the Pyrolaser to be set up using the Pyrometer String Transfer macro. It requires no inputs and provides temperature and emissivity as outputs. The Read Continuous Pyrometer program can be run continuously and the data can be sampled as often or as seldom as updates of temperature and emissivity are required. PYROLASER is written using the Labview software for use on Macintosh series computers running System 6.0.3 or later, Sun Sparc series computers running OpenWindows 3.0 or MIT's X Window System (X11R4 or X11R5), and IBM PC or compatibles running Microsoft Windows 3.1 or later. Labview requires a minimum of 5Mb of RAM on a Macintosh, 24Mb of RAM on a Sun, and 8Mb of RAM on an IBM PC or compatible. The Labview software is a product of National Instruments (Austin,TX; 800-433-3488), and is not included with this program. The standard distribution medium for PYROLASER is a 3.5 inch 800K Macintosh format diskette. It is also available on a 3.5 inch 720K MS-DOS format diskette, a 3.5 inch diskette in UNIX tar format, and a .25 inch streaming magnetic tape cartridge in UNIX tar format. An electronic copy of the documentation in Macintosh WordPerfect version 2.0.4 format is included on the distribution medium. Printed documentation is included in the price of the program. PYROLASER was developed in 1992.
NASA Astrophysics Data System (ADS)
Filatov, Michael; Cremer, Dieter
2005-01-01
A simple modification of the zeroth-order regular approximation (ZORA) in relativistic theory is suggested to suppress its erroneous gauge dependence to a high level of approximation. The method, coined gauge-independent ZORA (ZORA-GI), can be easily installed in any existing nonrelativistic quantum chemical package by programming simple one-electron matrix elements for the quasirelativistic Hamiltonian. Results of benchmark calculations obtained with ZORA-GI at the Hartree-Fock (HF) and second-order Møller-Plesset perturbation theory (MP2) level for dihalogens X2 (X=F,Cl,Br,I,At) are in good agreement with the results of four-component relativistic calculations (HF level) and experimental data (MP2 level). ZORA-GI calculations based on MP2 or coupled-cluster theory with single and double excitations and a perturbative inclusion of triple excitations [CCSD(T)] lead to accurate atomization energies and molecular geometries for the tetroxides of group VIII elements. With ZORA-GI/CCSD(T), an improved estimate for the atomization energy of hassium (Z=108) tetroxide is obtained.
NASA Technical Reports Server (NTRS)
Acton, Charles H., Jr.; Bachman, Nathaniel J.; Semenov, Boris V.; Wright, Edward D.
2010-01-01
The Navigation Ancillary Information Facility (NAIF) at JPL, acting under the direction of NASA's Office of Space Science, has built a data system named SPICE (Spacecraft Planet Instrument C-matrix Events) to assist scientists in planning and interpreting scientific observations. SPICE provides geometric and some other ancillary information needed to recover the full value of science instrument data, including correlation of individual instrument data sets with data from other instruments on the same or other spacecraft. This data system is used to produce space mission observation geometry data sets known as SPICE kernels. It is also used to read SPICE kernels and to compute derived quantities such as positions, orientations, lighting angles, etc. The SPICE toolkit consists of a subroutine/function library, executable programs (both large applications and simple utilities that focus on kernel management), and simple examples of using SPICE toolkit subroutines. This software is very accurate, thoroughly tested, and portable to all computers. It is extremely stable and reusable on all missions. Since the previous version, three significant capabilities have been added: an Interactive Data Language (IDL) interface, a MATLAB interface, and a geometric event finder subsystem.
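A minimal usage sketch with the community Python wrapper SpiceyPy is shown below. The kernel file names are assumptions (the kernels must be obtained separately from the NAIF archive), and the computed quantity is just one example of the derived geometry the toolkit provides.

```python
# Sketch of a typical SPICE computation via the SpiceyPy wrapper: load kernels,
# convert a UTC epoch to ephemeris time, and get a position vector.  The kernel
# file names below are assumptions and must be downloaded from the NAIF site.
import spiceypy as spice

spice.furnsh("naif0012.tls")   # leapseconds kernel (assumed local file)
spice.furnsh("de440.bsp")      # planetary ephemeris kernel (assumed local file)

et = spice.str2et("2010-01-01T00:00:00")
pos, light_time = spice.spkpos("MARS BARYCENTER", et, "J2000", "LT+S", "EARTH")
print("Earth->Mars position (km):", pos, " one-way light time (s):", light_time)

spice.kclear()
```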
Visualising higher order Brillouin zones with applications
NASA Astrophysics Data System (ADS)
Andrew, R. C.; Salagaram, T.; Chetty, N.
2017-05-01
A key concept in material science is the relationship between the Bravais lattice, the reciprocal lattice and the resulting Brillouin zones (BZ). These zones are often complicated shapes that are hard to construct and visualise without the use of sophisticated software, even by professional scientists. We have used a simple sorting algorithm to construct BZ of any order for a chosen Bravais lattice that is easy to implement in any scientific programming language. The resulting zones can then be visualised using freely available plotting software. This method has pedagogical value for upper-level undergraduate students since, along with other computational methods, it can be used to illustrate how constant-energy surfaces combine with these zones to create van Hove singularities in the density of states. In this paper we apply our algorithm along with the empirical pseudopotential method and the 2D equivalent of the tetrahedron method to show how they can be used in a simple software project to investigate this interaction for a 2D crystal. This project not only enhances students’ fundamental understanding of the principles involved but also improves transferable coding skills.
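The sorting idea can be stated compactly: a k-point lies in the n-th Brillouin zone when the origin Γ is its n-th nearest reciprocal-lattice point. The short sketch below is an independent illustration for a 2D square lattice (not the authors' code) that colours a k-grid by zone order on that basis.

```python
# Brillouin-zone order by sorting: a k-point belongs to zone n when the origin
# is its n-th nearest reciprocal-lattice point (2D square lattice shown).
import numpy as np
import matplotlib.pyplot as plt

nmax = 4  # range of reciprocal-lattice points included (units of 2*pi/a)
G = np.array([(i, j) for i in range(-nmax, nmax + 1)
                     for j in range(-nmax, nmax + 1)], dtype=float)

k = np.linspace(-2.5, 2.5, 301)
KX, KY = np.meshgrid(k, k)
pts = np.column_stack([KX.ravel(), KY.ravel()])

# distance from every k-point to every reciprocal-lattice point
d = np.empty((pts.shape[0], len(G)))
for idx, g in enumerate(G):
    d[:, idx] = np.hypot(pts[:, 0] - g[0], pts[:, 1] - g[1])

d_origin = np.hypot(pts[:, 0], pts[:, 1])          # distance to Gamma
zone = 1 + np.sum(d < d_origin[:, None] - 1e-9, axis=1)

plt.imshow(zone.reshape(KX.shape), origin="lower",
           extent=(k[0], k[-1], k[0], k[-1]), cmap="tab10")
plt.colorbar(label="Brillouin-zone order")
plt.xlabel(r"$k_x$ ($2\pi/a$)")
plt.ylabel(r"$k_y$ ($2\pi/a$)")
plt.show()
```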
Highly accurate symplectic element based on two variational principles
NASA Astrophysics Data System (ADS)
Qing, Guanghui; Tian, Jia
2018-02-01
Because of the stability requirements on numerical results, the mathematical theory of classical mixed methods is relatively complex. However, generalized mixed methods are automatically stable, and their building process is simple and straightforward. In this paper, based on the seminal idea of the generalized mixed methods, a simple, stable, and highly accurate 8-node noncompatible symplectic element (NCSE8) was developed by the combination of the modified Hellinger-Reissner mixed variational principle and the minimum energy principle. To ensure the accuracy of in-plane stress results, a simultaneous equation approach was also suggested. Numerical experimentation shows that the accuracy of the stress results of NCSE8 is nearly the same as that of displacement methods, and they are in good agreement with the exact solutions when the mesh is relatively fine. NCSE8 has the advantages of a clear concept, easy implementation in a finite element computer program, higher accuracy, and wide applicability for various linear elasticity compressible and nearly incompressible material problems. It is possible that NCSE8 becomes even more advantageous for fracture problems due to its better accuracy of stresses.
Shih, Ching-Hsiang
2011-01-01
This study evaluated whether two people with developmental disabilities would be able to actively perform simple physical activities by controlling their favorite environmental stimulation using Nintendo Wii Balance Boards with a newly developed standing location detection program (SLDP, i.e., a new software program turning a Nintendo Wii Balance Board into a standing location detector). This study was carried out using an ABAB design. The data showed that both participants significantly increased their simple physical activity (target response) to activate the control system to produce environmental stimulation during the B (intervention) phases. The practical and developmental implications of the findings are discussed. Copyright © 2010 Elsevier Ltd. All rights reserved.
A Simulation Program for Dynamic Infrared (IR) Spectra
ERIC Educational Resources Information Center
Zoerb, Matthew C.; Harris, Charles B.
2013-01-01
A free program for the simulation of dynamic infrared (IR) spectra is presented. The program simulates the spectrum of two exchanging IR peaks based on simple input parameters. Larger systems can be simulated with minor modifications. The program is available as an executable program for PCs or can be run in MATLAB on any operating system. Source…
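The underlying physics of two exchanging peaks is the standard two-site exchange lineshape. The sketch below is a generic Bloch-McConnell-style calculation, not the program described in the record above; all input values (band positions, widths, rate constants, populations) are placeholders.

```python
# Generic two-site exchange lineshape, illustrating the coalescence behaviour
# simulated by dynamic-spectrum programs.  All input values are placeholders.
import numpy as np

def exchange_lineshape(freq, nu_a, nu_b, width, k_ab, k_ba, p_a, p_b):
    """Absorption spectrum I(nu) for two exchanging bands (arbitrary units)."""
    spectrum = np.empty_like(freq)
    for i, nu in enumerate(freq):
        # evolution matrix L for the two transverse magnetizations
        L = np.array([[2j*np.pi*nu_a - width - k_ab, k_ba],
                      [k_ab, 2j*np.pi*nu_b - width - k_ba]])
        m0 = np.array([p_a, p_b], dtype=complex)
        # spectrum ~ Re[ 1^T (i*2*pi*nu*I - L)^(-1) m0 ]
        v = np.linalg.solve(2j*np.pi*nu*np.eye(2) - L, m0)
        spectrum[i] = v.sum().real
    return spectrum

freq = np.linspace(1900.0, 2000.0, 2000)           # frequency axis (placeholder)
spec = exchange_lineshape(freq, nu_a=1930.0, nu_b=1970.0, width=5.0,
                          k_ab=50.0, k_ba=50.0, p_a=0.5, p_b=0.5)
print("peak position:", freq[np.argmax(spec)])
```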
1979-05-01
and social problems, does not lend itself to a single or simple solution. This is why we must all be involved. For this reason we believe that...of admission to decisionmaking. At times the implications of this relatively simple premise are not minor. Many people beginning community...involvement programs have found it extremely difficult to locate technical people able to translate technical reports into simple, everyday English. There
Microscale Organic Laboratory: IV. A Simple and Rapid Procedure for Carrying Out Wittig Reactions.
ERIC Educational Resources Information Center
Pike, R. M.; And Others
1986-01-01
Describes two examples where synthetic salt-base mixtures are used in a microscale organic laboratory program as a simple and quick procedure for carrying out Wittig reactions. Both experimental procedures are outlined and discussed. (TW)
A strategy for automatically generating programs in the lucid programming language
NASA Technical Reports Server (NTRS)
Johnson, Sally C.
1987-01-01
A strategy for automatically generating and verifying simple computer programs is described. The programs are specified by a precondition and a postcondition in predicate calculus. The programs generated are in the Lucid programming language, a high-level, data-flow language known for its attractive mathematical properties and ease of program verification. The Lucid programming language is described, and the automatic program generation strategy is described and applied to several example problems.
Suitability evaluation tool for lands (rice, corn and soybean) as mobile application
NASA Astrophysics Data System (ADS)
Rahim, S. E.; Supli, A. A.; Damiri, N.
2017-09-01
Evaluation of land suitability for specific purposes, e.g. for food crops, is essential as a means to understand the determining factors to be considered in managing land successfully. A framework for evaluating land suitability for agricultural purposes was first introduced by the Food and Agriculture Organization (FAO) in the late 1970s. Using the framework manually is time consuming and unappealing to land users. Therefore, the authors have developed an effective tool by transforming the FAO framework into a smart mobile application. The application is designed using simple language for each factor and a rule-based system (RBS) algorithm. The factors involved are soil type, depth of soil solum, soil fertility, soil pH, drainage, risk of flood, etc. Suitability in this paper is limited to rice, corn and soybean. The application is found to be easy to understand and can automatically determine land suitability. Usability testing was also conducted with 75 respondents, and the results placed usability in the "very good" classification. The program is urgently needed by land managers, farmers, lecturers, students and government officials (planners) to help them more easily manage their land for a better future.
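A toy illustration of a rule-based suitability check is sketched below. The factor names follow the abstract, but every threshold and class boundary is a hypothetical placeholder, not the FAO values or the app's actual rules.

```python
# Toy rule-based land-suitability check.  The thresholds and class boundaries
# below are hypothetical placeholders for illustration only; real values must
# come from the FAO framework and local crop requirements.
def rice_suitability(soil_ph, solum_depth_cm, drainage, flood_risk):
    limitations = 0
    if not (5.5 <= soil_ph <= 7.5):
        limitations += 1
    if solum_depth_cm < 50:
        limitations += 1
    if drainage not in ("poor", "moderate"):      # placeholder rule
        limitations += 1
    if flood_risk == "severe":
        limitations += 2
    # map the number of limitations to an FAO-style class (placeholder mapping)
    return {0: "S1 (highly suitable)", 1: "S2 (moderately suitable)",
            2: "S3 (marginally suitable)"}.get(limitations, "N (not suitable)")

print(rice_suitability(soil_ph=6.2, solum_depth_cm=80,
                       drainage="moderate", flood_risk="low"))
```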
Construction of crystal structure prototype database: methods and applications.
Su, Chuanxun; Lv, Jian; Li, Quan; Wang, Hui; Zhang, Lijun; Wang, Yanchao; Ma, Yanming
2017-04-26
Crystal structure prototype data have become a useful source of information for materials discovery in the fields of crystallography, chemistry, physics, and materials science. This work reports the development of a robust and efficient method for assessing the similarity of structures on the basis of their interatomic distances. Using this method, we proposed a simple and unambiguous definition of crystal structure prototype based on hierarchical clustering theory, and constructed the crystal structure prototype database (CSPD) by filtering the known crystallographic structures in a database. Using a similar method, a structure prototype analysis package (SPAP) program was developed to remove similar structures in CALYPSO prediction results and extract predicted low energy structures for a separate theoretical structure database. A series of statistics describing the distribution of crystal structure prototypes in the CSPD was compiled to provide important insight for structure prediction and high-throughput calculations. Illustrative examples of the application of the proposed database are given, including the generation of initial structures for structure prediction and determination of the prototype structure in databases. These examples demonstrate the CSPD to be a generally applicable and useful tool for materials discovery.
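To illustrate the clustering idea (with a deliberately simplified similarity measure, not the interatomic-distance metric defined in the paper), the sketch below builds fixed-length sorted-distance fingerprints for a few toy structures and groups them with SciPy's hierarchical clustering:

```python
# Simplified illustration of prototype grouping: each structure is reduced to a
# fixed-length fingerprint of sorted interatomic distances, and hierarchical
# clustering groups near-identical fingerprints.  This is only a stand-in for
# the paper's similarity measure.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster

def fingerprint(positions, length=20):
    """Sorted interatomic distances, padded/truncated to a fixed length."""
    d = np.sort(pdist(np.asarray(positions, float)))
    fp = np.zeros(length)
    fp[:min(length, d.size)] = d[:length]
    return fp

# toy 'structures': a cubic cluster, a slightly distorted copy, and a chain
cube = [(i, j, k) for i in (0, 1) for j in (0, 1) for k in (0, 1)]
distorted = [(x + 0.02, y, z) for x, y, z in cube]
chain = [(i, 0.0, 0.0) for i in range(8)]

X = np.array([fingerprint(s) for s in (cube, distorted, chain)])
Z = linkage(X, method="average")
labels = fcluster(Z, t=0.5, criterion="distance")
print("cluster labels:", labels)    # cube and its distortion share a label
```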
NASA Technical Reports Server (NTRS)
Schmidt, H.; Tango, G. J.; Werby, M. F.
1985-01-01
A new matrix method for rapid wave propagation modeling in generalized stratified media, which has recently been applied to numerical simulations in diverse areas of underwater acoustics, solid earth seismology, and nondestructive ultrasonic scattering, is explained and illustrated. A portion of recent efforts jointly undertaken by the NATO SACLANT and NORDA Numerical Modeling groups in developing, implementing, and testing a new fast general-applications wave propagation algorithm, SAFARI, formulated at SACLANT, is summarized. The present general-applications SAFARI program uses a Direct Global Matrix Approach to multilayer Green's function calculation. A rapid and unconditionally stable solution is readily obtained via simple Gaussian elimination on the resulting sparsely banded block system, precisely analogous to that arising in the Finite Element Method. The resulting gains in accuracy and computational speed allow consideration of much larger multilayered air/ocean/Earth/engineering material media models, for many more source-receiver configurations than previously possible. The validity and versatility of the SAFARI-DGM method is demonstrated by reviewing three practical examples of engineering interest, drawn from ocean acoustics, engineering seismology and ultrasonic scattering.
NASA Astrophysics Data System (ADS)
Philippi, T. M.
1981-11-01
The final result of an international assessment of the market for stand-alone photovoltaic systems in cottage industry applications is reported. Nonindustrialized countries without centrally planned economies were considered. Cottage industries were defined as small rural manufacturers, employing less than 50 people, producing consumer and simple products. The data to support this analysis were obtained from secondary and expert sources in the U.S. and in-country field investigations of the Philippines and Mexico. The near-term market for photovoltaics for rural cottage industry applications appears to be limited to demonstration projects and pilot programs, based on an in-depth study of the nature of cottage industry, its role in the rural economy, the electric energy requirements of cottage industry, and a financial analysis of stand-alone photovoltaic systems as compared to their most viable competitor, diesel driven generators. Photovoltaics are shown to be a better long-term option only for very low power requirements. Some of these uses would include clay mixers, grinders, centrifuges, lathes, power saws and lighting of a workshop.
Compiling knowledge-based systems from KEE to Ada
NASA Technical Reports Server (NTRS)
Filman, Robert E.; Bock, Conrad; Feldman, Roy
1990-01-01
The dominant technology for developing AI applications is to work in a multi-mechanism, integrated, knowledge-based system (KBS) development environment. Unfortunately, systems developed in such environments are inappropriate for delivering many applications - most importantly, they carry the baggage of the entire Lisp environment and are not written in conventional languages. One resolution of this problem would be to compile applications from complex environments to conventional languages. Described here are the first efforts to develop a system for compiling KBS developed in KEE to Ada (trademark). This system is called KATYDID, for KEE/Ada Translation Yields Development Into Delivery. KATYDID includes early prototypes of a run-time KEE core (object-structure) library module for Ada, and translation mechanisms for knowledge structures, rules, and Lisp code to Ada. Using these tools, part of a simple expert system was compiled (not quite automatically) to run in a purely Ada environment. This experience has given us various insights on Ada as an artificial intelligence programming language, potential solutions of some of the engineering difficulties encountered in early work, and inspiration on future system development.
Lawhern, Vernon; Hairston, W David; Robbins, Kay
2013-01-01
Recent advances in sensor and recording technology have allowed scientists to acquire very large time-series datasets. Researchers often analyze these datasets in the context of events, which are intervals of time where the properties of the signal change relative to a baseline signal. We have developed DETECT, a MATLAB toolbox for detecting event time intervals in long, multi-channel time series. Our primary goal is to produce a toolbox that is simple for researchers to use, allowing them to quickly train a model on multiple classes of events, assess the accuracy of the model, and determine how closely the results agree with their own manual identification of events without requiring extensive programming knowledge or machine learning experience. As an illustration, we discuss application of the DETECT toolbox for detecting signal artifacts found in continuous multi-channel EEG recordings and show the functionality of the tools found in the toolbox. We also discuss the application of DETECT for identifying irregular heartbeat waveforms found in electrocardiogram (ECG) data as an additional illustration.
Lawhern, Vernon; Hairston, W. David; Robbins, Kay
2013-01-01
Recent advances in sensor and recording technology have allowed scientists to acquire very large time-series datasets. Researchers often analyze these datasets in the context of events, which are intervals of time where the properties of the signal change relative to a baseline signal. We have developed DETECT, a MATLAB toolbox for detecting event time intervals in long, multi-channel time series. Our primary goal is to produce a toolbox that is simple for researchers to use, allowing them to quickly train a model on multiple classes of events, assess the accuracy of the model, and determine how closely the results agree with their own manual identification of events without requiring extensive programming knowledge or machine learning experience. As an illustration, we discuss application of the DETECT toolbox for detecting signal artifacts found in continuous multi-channel EEG recordings and show the functionality of the tools found in the toolbox. We also discuss the application of DETECT for identifying irregular heartbeat waveforms found in electrocardiogram (ECG) data as an additional illustration. PMID:23638169
Construction of crystal structure prototype database: methods and applications
NASA Astrophysics Data System (ADS)
Su, Chuanxun; Lv, Jian; Li, Quan; Wang, Hui; Zhang, Lijun; Wang, Yanchao; Ma, Yanming
2017-04-01
Crystal structure prototype data have become a useful source of information for materials discovery in the fields of crystallography, chemistry, physics, and materials science. This work reports the development of a robust and efficient method for assessing the similarity of structures on the basis of their interatomic distances. Using this method, we proposed a simple and unambiguous definition of crystal structure prototype based on hierarchical clustering theory, and constructed the crystal structure prototype database (CSPD) by filtering the known crystallographic structures in a database. Using a similar method, a structure prototype analysis package (SPAP) program was developed to remove similar structures in CALYPSO prediction results and extract predicted low energy structures for a separate theoretical structure database. A series of statistics describing the distribution of crystal structure prototypes in the CSPD was compiled to provide important insight for structure prediction and high-throughput calculations. Illustrative examples of the application of the proposed database are given, including the generation of initial structures for structure prediction and determination of the prototype structure in databases. These examples demonstrate the CSPD to be a generally applicable and useful tool for materials discovery.
NASA Technical Reports Server (NTRS)
Philippi, T. M.
1981-01-01
The final result of an international assessment of the market for stand-alone photovoltaic systems in cottage industry applications is reported. Nonindustrialized countries without centrally planned economies were considered. Cottage industries were defined as small rural manufacturers, employing less than 50 people, producing consumer and simple products. The data to support this analysis were obtained from secondary and expert sources in the U.S. and in-country field investigations of the Philippines and Mexico. The near-term market for photovoltaics for rural cottage industry applications appears to be limited to demonstration projects and pilot programs, based on an in-depth study of the nature of cottage industry, its role in the rural economy, the electric energy requirements of cottage industry, and a financial analysis of stand-alone photovoltaic systems as compared to their most viable competitor, diesel driven generators. Photovoltaics are shown to be a better long-term option only for very low power requirements. Some of these uses would include clay mixers, grinders, centrifuges, lathes, power saws and lighting of a workshop.
Program Manipulates Plots For Effective Display
NASA Technical Reports Server (NTRS)
Bauer, F.; Downing, J.
1990-01-01
Windowed Observation of Relative Motion (WORM) computer program primarily intended for generation of simple X-Y plots from data created by other programs. Enables user to label, zoom, and change scales of various plots. Three-dimensional contour and line plots provided. Written in PASCAL.
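WORM itself is written in PASCAL; the sketch below only illustrates the same kind of functionality described above (labelled X-Y plots, zooming by changing axis limits, and a three-dimensional line plot) using Python and matplotlib, not a port of the program.

```python
import numpy as np
import matplotlib.pyplot as plt

# Data "created by another program" -- here just a synthetic trajectory
t = np.linspace(0, 10, 500)
x, y, z = np.cos(t), np.sin(t), 0.1 * t

fig = plt.figure(figsize=(8, 4))

# Labelled X-Y plot, "zoomed" by narrowing the axis limits
ax1 = fig.add_subplot(1, 2, 1)
ax1.plot(t, x, label='x(t)')
ax1.set_xlabel('time (s)')
ax1.set_ylabel('relative position')
ax1.set_xlim(2, 4)
ax1.legend()

# Three-dimensional line plot of the relative-motion trajectory
ax2 = fig.add_subplot(1, 2, 2, projection='3d')
ax2.plot(x, y, z)
ax2.set_xlabel('x'); ax2.set_ylabel('y'); ax2.set_zlabel('z')

plt.tight_layout()
plt.show()
```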
Mittag, O; Döbler, A; Pollmann, H; Farin-Glattacker, E; Raspe, H
2014-10-01
Type 2 diabetes (DM II) is the world's most widespread metabolic disease. Numerous investigations have demonstrated that intensive, multimodal interventions can reduce the occurrence of DM-associated comorbidities and mortality. Medical rehabilitation could offer such an alternative, albeit one with an obvious time limit. There is currently no active program in Germany designed to screen for patients' need for rehab. Here, we investigated whether screening DM II patients for rehab need, accompanied by written advice to file an application for rehab treatment, would generate a relevant number of rehab measures; whether inpatient rehab results in improved mid-term prognoses; and which patients demonstrate a particular benefit from such a program. We screened 5,500 employed individuals aged 18-54 years for their need for rehab via an extensive questionnaire based on the "Lübeck Algorithm". The patients were registered in the DMP (disease management program) Diabetes mellitus Type 2 of the AOK Rheinland/Hamburg health insurance division and paid into the DRV (German statutory pension insurance scheme) Rheinland retirement insurance. Patients needing rehab who presented no exclusion criteria (i.e., a rehab intervention far from their place of residence) were randomized to the intervention or control group at a ratio of 3:1. Patients in the intervention group received a letter from the AOK advising them to fill out an application for rehab; a very short, simple application form was included in the mailing. Twelve months after randomization we conducted a query to determine the effects of rehab. Our primary endpoint was a cardiovascular risk score specifically devised for diabetics. Multi-level models were applied to measure changes in cardiovascular risk. 850 patients (rate of return = 16%) returned completed screening forms. After excluding those with faulty diagnoses and/or those who had refused to participate, 829 patients remained; 94% of them presented a need for rehab according to specific criteria (39% with simple and 55% with complex problem profiles). 266 patients stated in the questionnaire that a rehab program was impossible for them for personal reasons. Of the remaining patients, we randomized 299 to the intervention cohort and 102 to the control group. Almost 70% of the intervention group completed an application for rehab, and our follow-up revealed that most of them participated in a rehab intervention. The return rate after one year was 82%. Analysis on the intention-to-treat (ITT) principle revealed no significant effect on cardiovascular risk (p=0.68); however, per-protocol analysis demonstrated a significant effect in the intervention cohort (p=0.025). Males and patients with an uncomplicated problem profile profited from the intervention. We discovered that a proactive procedure leads to the identification of a highly relevant group of insured individuals and is suited to generating a large number of medically justified rehab applications. ITT analysis of the efficacy of inpatient rehabilitation for type 2 diabetes mellitus in terms of the cardiovascular 5-year risk, however, failed to display a statistically significant effect in this study population (insurees of generally lower socioeconomic status having no intention to apply for rehab treatment). Rehab treatment for type 2 diabetes does not seem to be universally effective. This of course does not apply to rehab in general, as patients usually participate in rehab of their own volition.
More research is needed on this issue. © Georg Thieme Verlag KG Stuttgart · New York.
Model Program: Unionville High School, Kennett Square, PA
ERIC Educational Resources Information Center
Berkeihiser, Mike
2008-01-01
After attending a conference session about marketing, the author and his colleagues were inspired to start their own marketing program for the technology education program at Unionville High School in Kennett Square, Pennsylvania. When they started, they had no idea how much that simple marketing program would pay off. Over the past seven years,…
School Voucher Program and Its Enlightenments to the Education Reform in China
ERIC Educational Resources Information Center
Shen, Youlu
2005-01-01
This article briefly reviews the idea of the school voucher program proposed by Milton Friedman and later developed by Peacock, Wiseman, and Jencks. Factors such as privatization in education, the deterioration of public schooling, and school choice have promoted this program. The article then takes a brief look at the ramifications of the voucher program and its value…
Motor Programming in Apraxia of Speech
ERIC Educational Resources Information Center
Maas, Edwin; Robin, Donald A.; Wright, David L.; Ballard, Kirrie J.
2008-01-01
Apraxia of Speech (AOS) is an impairment of motor programming. However, the exact nature of this deficit remains unclear. The present study examined motor programming in AOS in the context of a recent two-stage model [Klapp, S. T. (1995). Motor response programming during simple and choice reaction time: The role of practice. "Journal of…
Women in Technology: The Evolution of a Simple Program That Works.
ERIC Educational Resources Information Center
Crumb, Jean Marie; Fenton, Ray
Three papers present views on women in technology programs and occupations, and on Corning Community College's (CCC's) program to encourage women to enter technological fields in which they have been historically underrepresented. First, Edward F. Herman presents the historical background to the development of CCC's Women in Technology program,…
Action Research: Effective Marketing Strategies for a Blended University Program
ERIC Educational Resources Information Center
Cook, Ruth Gannon; Ley, Kathryn
2008-01-01
This action research study investigated a marketing plan based on collaboration among a program faculty team and other organizational units for a graduate professional program. From its inception through the second year of operation, program enrollment increased due to the marketing plan based on an effective approach grounded in simple marketing…
Goal programming for land use planning.
Enoch F. Bell
1976-01-01
A simple transformation of the linear programming model used in land use planning into a goal programming model allows the multiple goals implied by multiple-use management to be explicitly recognized. This report outlines the procedure for accomplishing the transformation and discusses problems with the use of goal programming. Of particular concern are the expert opinions...
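The transformation can be sketched concretely: each goal becomes an equality constraint with under- and over-achievement deviation variables, and the objective minimizes the (weighted) deviations. The toy land-use numbers below are invented for illustration and are not taken from the report; the formulation is solved with SciPy's linprog.

```python
from scipy.optimize import linprog

# Decision variables: x = [timber_ha, recreation_ha,
#                          under_timber, over_timber, under_rec, over_rec]
# Goals (illustrative): 50 m3/ha of timber, target 30,000 m3
#                       40 visitor-days/ha of recreation, target 20,000 visitor-days
# Hard constraint: at most 1,000 ha of land in total.

c = [0, 0, 1, 0, 1, 0]                 # minimize under-achievement of both goals
A_eq = [[50, 0, 1, -1, 0, 0],          # 50*x1 + under1 - over1 = 30000
        [0, 40, 0, 0, 1, -1]]          # 40*x2 + under2 - over2 = 20000
b_eq = [30000, 20000]
A_ub = [[1, 1, 0, 0, 0, 0]]            # x1 + x2 <= 1000 ha
b_ub = [1000]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * 6, method='highs')
x1, x2, u1, o1, u2, o2 = res.x
print(f"timber: {x1:.0f} ha, recreation: {x2:.0f} ha, "
      f"shortfalls: {u1:.0f} m3, {u2:.0f} visitor-days")
```

Because both goals cannot be met on 1,000 ha, the solver reports how far each goal falls short rather than declaring the problem infeasible, which is exactly what the goal-programming reformulation buys over the plain linear program.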
Draft SEI Program Plans: 1995-1999
1994-08-01
…risk management because we believe that (1) structured techniques, even quite simple ones, can be effective in identifying and quantifying risk; and (2) techniques existed to…
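As one concrete example of a simple structured technique for quantifying risk (an illustration only, not drawn from the SEI plan itself), identified risks can be scored by probability and impact and ranked by their exposure:

```python
# Structured risk quantification: exposure = probability * impact.
# (Illustrative entries and values; not taken from the SEI program plan.)
risks = [
    ("key subcontractor slips schedule", 0.4, 8),   # (description, probability, impact 1-10)
    ("requirements change late",         0.6, 5),
    ("staff turnover on critical task",  0.2, 9),
]

for name, prob, impact in sorted(risks, key=lambda r: r[1] * r[2], reverse=True):
    print(f"exposure {prob * impact:4.1f}  {name}")
```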
Karpievitch, Yuliya V; Almeida, Jonas S
2006-01-01
Background: Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. Results: mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code, and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Conclusion: Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple. Moreover, the web-based infrastructure of mGrid allows it to be easily extended over the Internet. PMID:16539707
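The abstract describes a general pattern, packing user code and run-time variables, shipping them to a web-served back end, and collecting the result, rather than a specific call signature. The sketch below illustrates only that pattern in Python; the endpoint URL and payload format are hypothetical and are not mGrid's actual MATLAB interface or wire protocol.

```python
import pickle
import urllib.request

def run_remote(code, args, url="http://grid.example.org/run"):
    """Ship user code (as source text) plus its run-time variables to a
    (hypothetical) web endpoint and unpack the result.  This shows only the
    general pattern -- pack code and data, POST them, collect the answer."""
    payload = pickle.dumps({"code": code, "args": args})
    req = urllib.request.Request(url, data=payload,
                                 headers={"Content-Type": "application/octet-stream"})
    with urllib.request.urlopen(req) as resp:   # blocks until the remote job returns
        return pickle.loads(resp.read())

# Usage, assuming a server that understands this hypothetical payload:
# squares = run_remote("y = x.^2;", args={"x": list(range(10))})
```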
SPARTA: Simple Program for Automated reference-based bacterial RNA-seq Transcriptome Analysis.
Johnson, Benjamin K; Scholz, Matthew B; Teal, Tracy K; Abramovitch, Robert B
2016-02-04
Many tools exist in the analysis of bacterial RNA sequencing (RNA-seq) transcriptional profiling experiments to identify differentially expressed genes between experimental conditions. Generally, the workflow includes quality control of reads, mapping to a reference, counting transcript abundance, and statistical tests for differentially expressed genes. In spite of the numerous tools developed for each component of an RNA-seq analysis workflow, easy-to-use bacterially oriented workflow applications to combine multiple tools and automate the process are lacking. With many tools to choose from for each step, the task of identifying a specific tool, adapting the input/output options to the specific use-case, and integrating the tools into a coherent analysis pipeline is not a trivial endeavor, particularly for microbiologists with limited bioinformatics experience. To make bacterial RNA-seq data analysis more accessible, we developed a Simple Program for Automated reference-based bacterial RNA-seq Transcriptome Analysis (SPARTA). SPARTA is a reference-based bacterial RNA-seq analysis workflow application for single-end Illumina reads. SPARTA is turnkey software that simplifies the process of analyzing RNA-seq data sets, making bacterial RNA-seq analysis a routine process that can be undertaken on a personal computer or in the classroom. The easy-to-install, complete workflow processes whole transcriptome shotgun sequencing data files by trimming reads and removing adapters, mapping reads to a reference, counting gene features, calculating differential gene expression, and, importantly, checking for potential batch effects within the data set. SPARTA outputs quality analysis reports, gene feature counts and differential gene expression tables and scatterplots. SPARTA provides an easy-to-use bacterial RNA-seq transcriptional profiling workflow to identify differentially expressed genes between experimental conditions. This software will enable microbiologists with limited bioinformatics experience to analyze their data and integrate next generation sequencing (NGS) technologies into the classroom. The SPARTA software and tutorial are available at sparta.readthedocs.org.
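The workflow structure described above (trim, map, count, then test for differential expression) can be illustrated with a short orchestration sketch. The command names below (trim_tool, map_tool, count_tool) are placeholders, not SPARTA's actual components or command lines.

```python
import subprocess
from pathlib import Path

def run(cmd, log):
    """Run one pipeline stage, appending its stdout/stderr to a log file."""
    with open(log, "a") as fh:
        fh.write(f"\n$ {' '.join(cmd)}\n")
        subprocess.run(cmd, stdout=fh, stderr=fh, check=True)

def analyze_sample(reads_fastq, reference, outdir="results"):
    out = Path(outdir); out.mkdir(exist_ok=True)
    log = out / "pipeline.log"
    trimmed = out / "trimmed.fastq"
    aligned = out / "aligned.sam"
    counts  = out / "counts.txt"
    # Placeholder commands -- substitute the actual trimmer/aligner/counter.
    run(["trim_tool", reads_fastq, "-o", str(trimmed)], log)
    run(["map_tool", "-x", reference, "-U", str(trimmed), "-S", str(aligned)], log)
    run(["count_tool", str(aligned), "-o", str(counts)], log)
    return counts   # per-gene counts then feed a differential-expression test

# analyze_sample("sample1.fastq", "ref_index")
```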
Pírez, Macarena; Gonzalez-Sapienza, Gualberto; Sienra, Daniel; Ferrari, Graciela; Last, Michael; Last, Jerold A; Brena, Beatriz M
2013-01-15
In recent years, the international demand for commodities has prompted enormous growth in agriculture in most South American countries. Due to intensive use of fertilizers, cyanobacterial blooms have become a recurrent phenomenon throughout the continent, but their potential health risk remains largely unknown due to the lack of analytical capacity. In this paper we report the main results and conclusions of more than five years of systematic monitoring of cyanobacterial blooms at 20 beaches of Montevideo, Uruguay, on the Rio de la Plata, the fifth largest basin in the world. A locally developed microcystin ELISA was used to establish a sustainable monitoring program that revealed seasonal peaks of extremely high toxicity, more than one thousand times the WHO limit for recreational water. Comparison with cyanobacterial cell counts and chlorophyll-a determination, two commonly used parameters for indirect estimation of toxicity, showed that such indicators can be highly misleading. On the other hand, the accumulated experience led to the definition of a simple criterion for visual classification of blooms that can be used by trained lifeguards and technicians to make rapid on-site decisions on beach management. This simple, low-cost approach is broadly applicable to risk assessment and risk management in developing countries. Copyright © 2012 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Lekkas, Efthymis; Andreadakis, Emmanouil; Nomikou, Paraskevi; Antoniou, Varvara; Kapourani, Eleni; Papaspyropoulos, Konstantinos
2017-04-01
Environmental issues, disasters, and crises show increasing complexity and interconnection at every level and in every aspect, requiring a holistic approach that ranges from simple problem solving to emergency management. Recent challenges include the geographic escalation of events and of the populations they affect, complex or cascading disasters, and the connection of regional conflicts to transboundary social, political, and environmental impacts. One weakness of traditional management is competition or even antagonism between organizations, services, and disciplines, from science to operations. In this context, a postgraduate program addressing these issues was designed in Greece, applying multidisciplinarity, crossdisciplinarity, and interdisciplinarity to teaching staff and tutors, to students, and to the objects and fields of knowledge and research. The program offers a curriculum of courses and disciplines integrating science, the humanities, legislation, institutions, and operations. The geosciences carry an inherent culture of interdisciplinarity and a long tradition of research on the environment and disasters, along with familiarity with the complexity of such issues. For this reason, the program "Environmental, Disaster and Crisis Management Strategies" was organized by the Department of Geology and Geoenvironment of the National and Kapodistrian University of Athens, but it also involves social scientists, emergency operators, medical scientists, and others. The program aims to diffuse the basic principles and tools of all related disciplines, to develop a common ground and a communication language with as few barriers as possible, and to build trust and understanding among all parties involved. The curriculum is designed so that professionals of all disciplines and industries can attend without interrupting their other activities, while pursuing their personal scientific and professional educational goals and interests through their selection of courses and thesis subject. As a result, a large percentage of admitted students come from services and authorities involved in environmental, disaster, and crisis management (for example, the fire service, police, armed forces, ministries, and local administration), in addition to graduate students continuing their studies. An added value of the program has been the development of a critical mass of personnel from these organizations, and of young scientists, with increased connectivity, extending from simple acquaintance to cooperation, trust, and the development of synergies among the services themselves. This is a promising condition for more effective risk and emergency management in a context of ethical and responsible practice. The curriculum comprises live or online lectures, asynchronous education with exercises and essay writing, seminars on tools such as related GIS and SPSS applications, and applied field exercises on both scientific and emergency-management subjects. The program has completed its second year of operation and, following internal and external evaluation, was upgraded to adjust to new fields, ideas, and challenges and to incorporate students' suggestions. More than one hundred students have graduated so far, and another 350 are currently attending. The program, originally available in Greek, will be offered in English starting in September 2017 and is open for applications; it is presented at: http://www.edcm.edu.gr
SCTE: An open-source Perl framework for testing equipment control and data acquisition
NASA Astrophysics Data System (ADS)
Mostaço-Guidolin, Luiz C.; Frigori, Rafael B.; Ruchko, Leonid; Galvão, Ricardo M. O.
2012-07-01
SCTE intends to provide a simple, yet powerful, framework for building data acquisition and equipment control systems for experimental physics and related areas. Via its SCTE::Instrument module, RS-232, USB, and LAN buses are supported, and the intricacies of hardware communication are encapsulated underneath an object-oriented abstraction layer. Written in Perl and using the SCPI protocol, the framework allows enabled instruments to be easily programmed to perform a wide variety of tasks. While this work presents general aspects of the development of data acquisition systems using the SCTE framework, it is illustrated by particular applications designed for the calibration of several in-house-developed devices for power measurement in the tokamak TCABR Alfvén Waves Excitement System. Catalogue identifier: AELZ_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AELZ_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public License Version 3 No. of lines in distributed program, including test data, etc.: 13 811 No. of bytes in distributed program, including test data, etc.: 743 709 Distribution format: tar.gz Programming language: Perl version 5.10.0 or higher. Computer: PC. SCPI-capable digital oscilloscope with RS-232, USB, or LAN communication ports; null modem, USB, or Ethernet cables. Operating system: GNU/Linux (2.6.28-11); should also work on any Unix-based operating system. Classification: 4.14 External routines: Perl modules Device::SerialPort, Term::ANSIColor, Math::GSL, Net::HTTP; Gnuplot 4.0 or higher. Nature of problem: Automation of experiments and data acquisition often requires expensive equipment and in-house development of software applications. Nowadays personal computers and test equipment come with fast and easy-to-use communication ports. Instrument vendors often supply application programs capable of controlling such devices, but these are very restricted in terms of functionality; for instance, they are typically not capable of controlling more than one instrument at a time or of automating repetitive tasks. SCTE provides a way of using auxiliary equipment to automate experimental procedures at low cost, using only a free, open-source operating system and libraries. Solution method: SCTE provides a Perl module that implements RS-232, USB, and LAN communication, allowing the use of SCPI-capable instruments [1] and thereby providing a straightforward way of creating automation and data acquisition applications using personal computers and testing instruments [2]. References: [1] SCPI Consortium, Standard Commands for Programmable Instruments, 1999, http://www.scpiconsortium.org. [2] L.C.B. Mostaço-Guidolin, Determinação da configuração de ondas de Alfvén excitadas no tokamak TCABR, Master's thesis, Universidade de São Paulo (2007), http://www.teses.usp.br/teses/disponiveis/43/43134/tde-23042009-230419/.
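SCTE itself is a Perl framework; the following Python sketch (using the pyserial package) only illustrates the kind of SCPI exchange it automates. The port name is an assumption about the local setup, and apart from the mandatory *IDN? identification query, the commands an instrument accepts depend on its particular SCPI subset.

```python
import serial  # the pyserial package

# Open the instrument's serial port (port name and settings depend on the setup)
scope = serial.Serial("/dev/ttyUSB0", baudrate=9600, timeout=2)

def query(cmd):
    """Send one SCPI command and return the instrument's reply."""
    scope.write((cmd + "\n").encode("ascii"))
    return scope.readline().decode("ascii").strip()

print(query("*IDN?"))   # mandatory SCPI identification query
# A typical automation loop would then step a setting, read a measurement,
# and log the pair -- the measurement command itself is instrument-specific.
```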