Sample records for user-defined functions

  1. A database system to support image algorithm evaluation

    NASA Technical Reports Server (NTRS)

    Lien, Y. E.

    1977-01-01

    The design is given of an interactive image database system IMDB, which allows the user to create, retrieve, store, display, and manipulate images through the facility of a high-level, interactive image query (IQ) language. The query language IQ permits the user to define false color functions, pixel value transformations, overlay functions, zoom functions, and windows. The user manipulates the images through generic functions. The user can direct images to display devices for visual and qualitative analysis. Image histograms and pixel value distributions can also be computed to obtain a quantitative analysis of images.
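
The generic operations this record describes can be sketched in miniature (pure Python, not the IQ language itself): a pixel-value transformation given as a lookup table, and a histogram over a toy grayscale image.

```python
# Illustrative sketch of two IMDB-style generic functions: a pixel-value
# transformation (lookup table) and a histogram, on a tiny 4-level image.

def transform(image, lut):
    """Apply a pixel-value transformation given as a lookup table."""
    return [[lut[p] for p in row] for row in image]

def histogram(image, levels=4):
    """Count occurrences of each pixel value."""
    counts = [0] * levels
    for row in image:
        for p in row:
            counts[p] += 1
    return counts

image = [[0, 1, 2],
         [3, 2, 1]]

# Invert the 4-level pixel range (a simple false-color-style mapping).
inverted = transform(image, lut=[3, 2, 1, 0])
print(inverted)           # [[3, 2, 1], [0, 1, 2]]
print(histogram(image))   # [1, 2, 2, 1]
```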

  2. Charliecloud: Unprivileged containers for user-defined software stacks in HPC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Priedhorsky, Reid; Randles, Timothy C.

    Supercomputing centers are seeing increasing demand for user-defined software stacks (UDSS), instead of or in addition to the stack provided by the center. These UDSS support user needs such as complex dependencies or build requirements, externally required configurations, portability, and consistency. The challenge for centers is to provide these services in a usable manner while minimizing the risks: security, support burden, missing functionality, and performance. We present Charliecloud, which uses the Linux user and mount namespaces to run industry-standard Docker containers with no privileged operations or daemons on center resources. Our simple approach avoids most security risks while maintaining access to the performance and functionality already on offer, doing so in less than 500 lines of code. Charliecloud promises to bring an industry-standard UDSS user workflow to existing, minimally altered HPC resources.
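
The namespace mechanism Charliecloud relies on can be pictured with the standard util-linux `unshare` tool. The sketch below is our illustration, not Charliecloud's code; the image path and command are invented placeholders.

```python
# Hypothetical illustration of the idea behind Charliecloud: compose an
# unprivileged container launch from a new Linux user namespace (mapping
# the caller to root inside it) and a new mount namespace, using the
# util-linux `unshare` utility, with no privileged daemons involved.

def unprivileged_run(rootfs, command):
    """Build the argv for an unprivileged, namespace-based launch."""
    return ["unshare", "--user", "--map-root-user", "--mount",
            "chroot", rootfs] + command

argv = unprivileged_run("/scratch/udss-image", ["/bin/sh", "-c", "make test"])
print(" ".join(argv))
```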

  3. MetaNET--a web-accessible interactive platform for biological metabolic network analysis.

    PubMed

    Narang, Pankaj; Khan, Shawez; Hemrom, Anmol Jaywant; Lynn, Andrew Michael

    2014-01-01

    Metabolic reactions have been extensively studied and compiled over the last century. These have provided a theoretical base for implementing models whose simulations are used to identify drug targets and optimize metabolic throughput at a systemic level. While tools for the perturbation of metabolic networks are available, their applications are limited and restricted, as they require varied dependencies and often a commercial platform for full functionality. We have developed MetaNET, an open-source, user-friendly, platform-independent and web-accessible resource consisting of several pre-defined workflows for metabolic network analysis. MetaNET incorporates a range of functions which can be combined to produce different simulations related to metabolic networks. These include: (i) optimization of an objective function for the wild-type strain, and gene/catalyst/reaction knock-out or knock-down analysis, using flux balance analysis; (ii) flux variability analysis; (iii) chemical species participation; (iv) identification of cycles and extreme paths; and (v) choke point reaction analysis to facilitate identification of potential drug targets. The platform is built using custom scripts along with the open-source Galaxy workflow system and the Systems Biology Research Tool as components. Pre-defined workflows are available for common processes, and an exhaustive list of over 50 functions is provided for user-defined workflows. MetaNET, available at http://metanet.osdd.net, provides a user-friendly, rich interface allowing the analysis of genome-scale metabolic networks under various genetic and environmental conditions. The framework permits the storage of previous results, the ability to repeat analyses and share results with other users over the internet, and the ability to run different tools simultaneously using pre-defined workflows and user-created custom workflows.
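
Item (v), choke point reaction analysis, has a simple graph-theoretic core that can be sketched directly: a reaction is a choke point if it is the sole consumer or the sole producer of some metabolite. The toy network below is invented for illustration, not taken from MetaNET.

```python
# Minimal choke-point analysis: find reactions that are the unique
# consumer or unique producer of any metabolite in a toy network.

def choke_points(reactions):
    """reactions: {name: (consumed_metabolites, produced_metabolites)}"""
    consumers, producers = {}, {}
    for rxn, (consumed, produced) in reactions.items():
        for m in consumed:
            consumers.setdefault(m, set()).add(rxn)
        for m in produced:
            producers.setdefault(m, set()).add(rxn)
    chokes = set()
    for users in list(consumers.values()) + list(producers.values()):
        if len(users) == 1:          # sole consumer or sole producer
            chokes |= users
    return chokes

toy = {
    "R1": ({"glc"}, {"g6p"}),
    "R2": ({"g6p"}, {"f6p"}),
    "R3": ({"g6p"}, {"f6p"}),
    "R4": ({"f6p"}, {"pyr"}),
}
print(sorted(choke_points(toy)))  # R2 and R3 back each other up
```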

  4. GDF v2.0, an enhanced version of GDF

    NASA Astrophysics Data System (ADS)

    Tsoulos, Ioannis G.; Gavrilis, Dimitris; Dermatas, Evangelos

    2007-12-01

    An improved version of the function estimation program GDF is presented. The main enhancements of the new version include: multi-output function estimation, the capability of defining custom functions in the grammar, and selection of the error function. The new version has been evaluated on a series of classification and regression datasets that are widely used for the evaluation of such methods. It is compared to two known neural networks and outperforms them on 5 (out of 10) datasets.

    Program summary
    Title of program: GDF v2.0
    Catalogue identifier: ADXC_v2_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXC_v2_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 98 147
    No. of bytes in distributed program, including test data, etc.: 2 040 684
    Distribution format: tar.gz
    Programming language: GNU C++
    Computer: The program is designed to be portable to all systems running the GNU C++ compiler
    Operating system: Linux, Solaris, FreeBSD
    RAM: 200 000 bytes
    Classification: 4.9
    Does the new version supersede the previous version?: Yes
    Nature of problem: Function estimation tries to discover, from a series of input data, a functional form that best describes them. This can be performed with parametric models whose parameters adapt according to the input data.
    Solution method: Functional forms are created by genetic programming as approximations for the symbolic regression problem.
    Reasons for new version: The GDF package was extended to be more flexible and user-customizable than the old package. The user can extend the package by defining custom error functions, and can extend the grammar of the package by adding new functions to the function repertoire. The new version can also perform function estimation for multi-output functions and can be used for classification problems.
    Summary of revisions: The following features have been added to the package GDF:
    - Multi-output function approximation. The package can now approximate any function f: R^n -> R^m, which also gives it the capability of performing classification, not only regression.
    - User-defined functions can be added to the repertoire of the grammar, extending the regression capabilities of the package. This feature is limited to 3 functions, but this number can easily be increased.
    - Capability of selecting the error function. Apart from the mean square error, the package now offers other error functions, such as the mean absolute error and the maximum square error. User-defined error functions can also be added to the set of error functions.
    - More verbose output. The main program displays more information to the user, as well as the default values of the parameters. The package also lets the user define an output file, where the output of the gdf program for the testing set is stored after the process terminates.
    Additional comments: A technical report describing the revisions, experiments and test runs is packaged with the source code.
    Running time: Depends on the training data.
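
The "selectable error function" revision amounts to passing the error measure as a parameter and letting users register their own. A minimal sketch (the names here are ours, not the GDF package's):

```python
# A registry of error functions, with built-ins and a user-defined
# addition, mirroring the selectable-error-function idea in GDF v2.0.

def mean_square_error(actual, predicted):
    return sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)

def max_square_error(actual, predicted):
    return max((a - p) ** 2 for a, p in zip(actual, predicted))

ERROR_FUNCTIONS = {"mse": mean_square_error, "maxse": max_square_error}

# A user-defined error function slots into the same registry.
def mean_abs_error(actual, predicted):
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

ERROR_FUNCTIONS["mae"] = mean_abs_error

y, yhat = [1.0, 2.0, 4.0], [1.0, 2.5, 3.0]
print(ERROR_FUNCTIONS["mse"](y, yhat))    # (0 + 0.25 + 1.0) / 3
print(ERROR_FUNCTIONS["maxse"](y, yhat))  # 1.0
```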

  5. Using component technology to facilitate external software reuse in ground-based planning systems

    NASA Technical Reports Server (NTRS)

    Chase, A.

    2003-01-01

    APGEN (Activity Plan GENerator - 314), a multi-mission planning tool, must interface with external software to best serve its users. APGEN's original method for incorporating external software, the User-Defined library mechanism, has been very successful in allowing APGEN users access to external software functionality.

  6. SMOG 2: A Versatile Software Package for Generating Structure-Based Models.

    PubMed

    Noel, Jeffrey K; Levi, Mariana; Raghunathan, Mohit; Lammert, Heiko; Hayes, Ryan L; Onuchic, José N; Whitford, Paul C

    2016-03-01

    Molecular dynamics simulations with coarse-grained or simplified Hamiltonians have proven to be an effective means of capturing the functionally important long-time and large-length scale motions of proteins and RNAs. Originally developed in the context of protein folding, structure-based models (SBMs) have since been extended to probe a diverse range of biomolecular processes, spanning from protein and RNA folding to functional transitions in molecular machines. The hallmark feature of a structure-based model is that part, or all, of the potential energy function is defined by a known structure. Within this general class of models, there exist many possible variations in resolution and energetic composition. SMOG 2 is a downloadable software package that reads user-designated structural information and user-defined energy definitions, in order to produce the files necessary to use SBMs with high performance molecular dynamics packages: GROMACS and NAMD. SMOG 2 is bundled with XML-formatted template files that define commonly used SBMs, and it can process template files that are altered according to the needs of each user. This computational infrastructure also allows for experimental or bioinformatics-derived restraints or novel structural features to be included, e.g. novel ligands, prosthetic groups and post-translational/transcriptional modifications. The code and user guide can be downloaded at http://smog-server.org/smog2.
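
The template-driven workflow can be pictured with a miniature XML fragment parsed with Python's standard library. The schema, tag names, and attributes below are invented for illustration; they are not SMOG's real template format.

```python
# Toy model of a "user-edited XML template -> parsed energy definitions"
# workflow. The template structure here is hypothetical, not SMOG 2's.
import xml.etree.ElementTree as ET

TEMPLATE = """
<sbm>
  <contacts function="lj1012" epsilon="1.0"/>
  <dihedrals function="cosine" strength="0.4"/>
</sbm>
"""

root = ET.fromstring(TEMPLATE)
terms = {child.tag: dict(child.attrib) for child in root}
print(terms["contacts"]["function"])  # lj1012
```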

  7. User-defined functions in the Arden Syntax: An extension proposal.

    PubMed

    Karadimas, Harry; Ebrahiminia, Vahid; Lepage, Eric

    2015-12-11

    The Arden Syntax is a knowledge-encoding standard, started in 1989 and now in its 10th revision, maintained by the Health Level Seven (HL7) organization. It has constructs borrowed from several language concepts available at the time: mainly the HELP hospital information system and the Regenstrief Medical Record System (RMRS), but also the Pascal language, functional languages, and the frame data structure used in artificial intelligence. The syntax has a rationale for its constructs, and has restrictions that follow this rationale. The main goal of the standard is to promote knowledge sharing by avoiding the complexity of traditional programs, so that a medical logic module (MLM) written in the Arden Syntax can remain shareable and understandable across institutions. One of the restrictions of the syntax is that users cannot define their own functions and subroutines inside an MLM. An MLM can, however, call another MLM, which then serves as a function. This adds a dependency between MLMs, a known criticism of the Arden Syntax knowledge model. This article explains why we believe the Arden Syntax would benefit from a construct for user-defined functions, and discusses the need for, the benefits of, and the limitations of such a construct. We used the recent grammar of Arden Syntax v2.10, with both the Arden Syntax standard document and the Arden Syntax Rationale article as guidelines. We gradually introduced production rules into the grammar, using the CUP parsing tool to verify that no ambiguities were introduced. A new grammar was produced that supports user-defined functions: 22 production rules were added, and a parser was built using the CUP parsing tool. A few examples are given to illustrate the concepts; all examples were parsed correctly. It is possible to add user-defined functions to the Arden Syntax in a way that remains coherent with the standard. 
We believe that this enhances the readability and the robustness of MLMs. A detailed proposal will be submitted by the end of the year to the HL7 workgroup on Arden Syntax. Copyright © 2015 Elsevier B.V. All rights reserved.

  8. Software For Computer-Aided Design Of Control Systems

    NASA Technical Reports Server (NTRS)

    Wette, Matthew

    1994-01-01

    Computer Aided Engineering System (CAESY) software developed to provide means to evaluate methods for dealing with users' needs in computer-aided design of control systems. Interpreter program for performing engineering calculations. Incorporates features of both Ada and MATLAB. Designed to be flexible and powerful. Includes internally defined functions, procedures and provides for definition of functions and procedures by user. Written in C language.

  9. Designing Class Methods from Dataflow Diagrams

    NASA Astrophysics Data System (ADS)

    Shoval, Peretz; Kabeli-Shani, Judith

    A method for designing the class methods of an information system is described. The method is part of FOOM - Functional and Object-Oriented Methodology. In the analysis phase of FOOM, two models defining the users' requirements are created: a conceptual data model - an initial class diagram; and a functional model - hierarchical OO-DFDs (object-oriented dataflow diagrams). Based on these models, a well-defined process of method design is applied. First, the OO-DFDs are converted into transactions, i.e., system processes that support user tasks. The components and the process logic of each transaction are described in detail using pseudocode. Then, each transaction is decomposed, according to well-defined rules, into class methods of various types: basic methods, application-specific methods, and main transaction (control) methods. Each method is attached to a proper class; messages between methods express the process logic of each transaction. The methods are defined using pseudocode or message charts.

  10. SU-F-T-301: Planar Dose Pass Rate Inflation Due to the MapCHECK Measurement Uncertainty Function

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bailey, D; Spaans, J; Kumaraswamy, L

    Purpose: To quantify the effect of the Measurement Uncertainty function on planar dosimetry pass rates, as analyzed with Sun Nuclear Corporation analytic software (“MapCHECK” or “SNC Patient”). This optional function is toggled on by default upon software installation, and automatically increases the user-defined dose percent difference (%Diff) tolerance for each planar dose comparison. Methods: Dose planes from 109 IMRT fields and 40 VMAT arcs were measured with the MapCHECK 2 diode array, and compared to calculated planes from a commercial treatment planning system. Pass rates were calculated within the SNC analytic software using varying calculation parameters, including Measurement Uncertainty on and off. By varying the %Diff criterion for each dose comparison performed with Measurement Uncertainty turned off, an effective %Diff criterion was defined for each field/arc corresponding to the pass rate achieved with Measurement Uncertainty turned on. Results: For 3%/3 mm analysis, the Measurement Uncertainty function increases the user-defined %Diff by 0.8–1.1% on average, depending on plan type and calculation technique, for an average pass rate increase of 1.0–3.5% (maximum +8.7%). For 2%/2 mm analysis, the Measurement Uncertainty function increases the user-defined %Diff by 0.7–1.2% on average, for an average pass rate increase of 3.5–8.1% (maximum +14.2%). The largest increases in pass rate are generally seen with poorly matched planar dose comparisons; the Measurement Uncertainty effect is markedly smaller as pass rates approach 100%. Conclusion: The Measurement Uncertainty function may substantially inflate planar dose comparison pass rates for typical IMRT and VMAT planes. The types of uncertainties incorporated into the function (and their associated quantitative estimates), as described in the software user’s manual, may not accurately estimate realistic measurement uncertainty for the user’s measurement conditions. 
Pass rates listed in published reports, or otherwise compared with the results of other users or vendors, should clearly indicate whether the Measurement Uncertainty function is used.
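
The inflation mechanism itself is simple arithmetic: widening the tolerance lets more points pass. A simplified numeric illustration (not SNC's actual algorithm; the sample differences are invented):

```python
# Each point passes if its percent dose difference is within tolerance;
# adding an "uncertainty" margin to the tolerance raises the pass rate.

def pass_rate(percent_diffs, tolerance):
    passed = sum(1 for d in percent_diffs if abs(d) <= tolerance)
    return 100.0 * passed / len(percent_diffs)

diffs = [0.5, 1.2, 2.1, 2.8, 3.4, 3.9, 1.0, 0.2, 3.1, 2.4]

print(pass_rate(diffs, 3.0))        # plain 3% criterion: 70.0
print(pass_rate(diffs, 3.0 + 1.0))  # widened by a ~1% margin: 100.0
```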

  11. User's guide to programming fault injection and data acquisition in the SIFT environment

    NASA Technical Reports Server (NTRS)

    Elks, Carl R.; Green, David F.; Palumbo, Daniel L.

    1987-01-01

    Described are the features, command language, and functional design of the SIFT (Software Implemented Fault Tolerance) fault injection and data acquisition interface software. The document is also intended to assist and guide the SIFT user in defining, developing, and executing SIFT fault injection experiments and the subsequent collection and reduction of that fault injection data. It is also intended to be used in conjunction with the SIFT User's Guide (NASA Technical Memorandum 86289) for reference to SIFT system commands, procedures and functions, and overall guidance in SIFT system programming.

  12. AESOP- INTERACTIVE DESIGN OF LINEAR QUADRATIC REGULATORS AND KALMAN FILTERS

    NASA Technical Reports Server (NTRS)

    Lehtinen, B.

    1994-01-01

    AESOP was developed to solve a number of problems associated with the design of controls and state estimators for linear time-invariant systems. The systems considered are modeled in state-variable form by a set of linear differential and algebraic equations with constant coefficients. Two key problems solved by AESOP are the linear quadratic regulator (LQR) design problem and the steady-state Kalman filter design problem. AESOP is designed to be used in an interactive manner. The user can solve design problems and analyze the solutions in a single interactive session. Both numerical and graphical information are available to the user during the session. The AESOP program is structured around a list of predefined functions. Each function performs a single computation associated with control, estimation, or system response determination. AESOP contains over sixty functions and permits the easy inclusion of user defined functions. The user accesses these functions either by inputting a list of desired functions in the order they are to be performed, or by specifying a single function to be performed. The latter case is used when the choice of function and function order depends on the results of previous functions. The available AESOP functions are divided into several general areas including: 1) program control, 2) matrix input and revision, 3) matrix formation, 4) open-loop system analysis, 5) frequency response, 6) transient response, 7) transient function zeros, 8) LQR and Kalman filter design, 9) eigenvalues and eigenvectors, 10) covariances, and 11) user-defined functions. The most important functions are those that design linear quadratic regulators and Kalman filters. The user interacts with AESOP when using these functions by inputting design weighting parameters and by viewing displays of designed system response. Support functions obtain system transient and frequency responses, transfer functions, and covariance matrices. 
AESOP can also provide the user with open-loop system information, including stability, controllability, and observability. The AESOP program is written in FORTRAN IV for interactive execution and has been implemented on an IBM 3033 computer using TSS 370. As currently configured, AESOP has a central memory requirement of approximately 2 megabytes. Memory requirements can be reduced by redimensioning arrays in the AESOP program. Graphical output requires adaptation of the AESOP plot routines to whatever device is available. The AESOP program was developed in 1984.
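
The LQR design step AESOP automates can be shown in miniature for a scalar plant. This is a worked example of the standard theory, not AESOP's code: for xdot = a*x + b*u with cost integral(q*x^2 + r*u^2) dt, the algebraic Riccati equation 2*a*p - (b*p)**2/r + q = 0 is solved in closed form and the optimal feedback is u = -k*x with k = b*p/r.

```python
# Scalar LQR: solve the quadratic Riccati equation for its positive
# root and return the optimal state-feedback gain.
import math

def lqr_scalar(a, b, q, r):
    # Positive root of (b**2/r)*p**2 - 2*a*p - q = 0.
    p = r * (a + math.sqrt(a * a + q * b * b / r)) / (b * b)
    return b * p / r

k = lqr_scalar(a=1.0, b=1.0, q=1.0, r=1.0)
print(k)  # 1 + sqrt(2) ≈ 2.4142; closed-loop pole a - b*k < 0 (stable)
```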

  13. ANL/RBC: A computer code for the analysis of Rankine bottoming cycles, including system cost evaluation and off-design performance

    NASA Technical Reports Server (NTRS)

    Mclennan, G. A.

    1986-01-01

    This report describes, and is a User's Manual for, a computer code (ANL/RBC) which calculates cycle performance for Rankine bottoming cycles extracting heat from a specified source gas stream. The code calculates cycle power and efficiency and the sizes for the heat exchangers, using tabular input of the properties of the cycle working fluid. An option is provided to calculate the costs of system components from user defined input cost functions. These cost functions may be defined in equation form or by numerical tabular data. A variety of functional forms have been included for these functions and they may be combined to create very general cost functions. An optional calculation mode can be used to determine the off-design performance of a system when operated away from the design-point, using the heat exchanger areas calculated for the design-point.

  14. The Profile-Query Relationship.

    ERIC Educational Resources Information Center

    Shepherd, Michael A.; Phillips, W. J.

    1986-01-01

    Defines relationship between user profile and user query in terms of relationship between clusters of documents retrieved by each, and explores the expression of cluster similarity and cluster overlap as linear functions of similarity existing between original pairs of profiles and queries, given the desired retrieval threshold. (23 references)…

  15. Nonlinear Meshfree Analysis Program (NMAP) Version 1.0 (User’s Manual)

    DTIC Science & Technology

    2012-12-01

    divided by the number of time increments used in the analysis. In addition to prescribing total nodal displacements in the neutral file, users are... conditions, the user must define material properties, initial conditions, and a variety of control parameters for the NMAP analysis. These data are provided... a script file. Restart: A restart function is provided in the NMAP code, where the user may restart an analysis using a set of restart files. In

  16. Scenario-Based Assessment of User Needs for Point-of-Care Robots.

    PubMed

    Lee, Hyeong Suk; Kim, Jeongeun

    2018-01-01

    This study aimed to derive specific user requirements and barriers in a real medical environment to define the essential elements and functions of two types of point-of-care (POC) robot: a telepresence robot as a tool for teleconsultation, and a bedside robot to provide emotional care for patients. An analysis of user requirements was conducted; user needs were gathered and identified, and detailed, realistic scenarios were created. The prototype robots were demonstrated in physical environments for envisioning and evaluation. In all, three nurses and three clinicians participated as evaluators to observe the demonstrations and evaluate the robot systems. The evaluators were given a brief explanation of each scene and the robots' functionality. Four major functions of the teleconsultation robot were defined and tested in the demonstration. In addition, four major functions of the bedside robot were evaluated. Among the desired functions for a teleconsultation robot, medical information delivery and communication had high priority. For a bedside robot, patient support, patient monitoring, and healthcare provider support were the desired functions. The evaluators reported that the teleconsultation robot can increase support from and access to specialists and resources. They mentioned that the bedside robot can improve the quality of hospital life. Problems identified in the demonstration were those of space conflict, communication errors, and safety issues. Incorporating this technology into healthcare services will enhance communication and teamwork skills across distances and thereby facilitate teamwork. However, repeated tests will be needed to evaluate and ensure improved performance.

  17. Scenario-Based Assessment of User Needs for Point-of-Care Robots

    PubMed Central

    Lee, Hyeong Suk

    2018-01-01

    Objectives This study aimed to derive specific user requirements and barriers in a real medical environment to define the essential elements and functions of two types of point-of-care (POC) robot: a telepresence robot as a tool for teleconsultation, and a bedside robot to provide emotional care for patients. Methods An analysis of user requirements was conducted; user needs were gathered and identified, and detailed, realistic scenarios were created. The prototype robots were demonstrated in physical environments for envisioning and evaluation. In all, three nurses and three clinicians participated as evaluators to observe the demonstrations and evaluate the robot systems. The evaluators were given a brief explanation of each scene and the robots' functionality. Four major functions of the teleconsultation robot were defined and tested in the demonstration. In addition, four major functions of the bedside robot were evaluated. Results Among the desired functions for a teleconsultation robot, medical information delivery and communication had high priority. For a bedside robot, patient support, patient monitoring, and healthcare provider support were the desired functions. The evaluators reported that the teleconsultation robot can increase support from and access to specialists and resources. They mentioned that the bedside robot can improve the quality of hospital life. Problems identified in the demonstration were those of space conflict, communication errors, and safety issues. Conclusions Incorporating this technology into healthcare services will enhance communication and teamwork skills across distances and thereby facilitate teamwork. However, repeated tests will be needed to evaluate and ensure improved performance. PMID:29503748

  18. Adaptable Constrained Genetic Programming: Extensions and Applications

    NASA Technical Reports Server (NTRS)

    Janikow, Cezary Z.

    2005-01-01

    An evolutionary algorithm applies evolution-based principles to problem solving. To solve a problem, the user defines the space of potential solutions, the representation space. Sample solutions are encoded in a chromosome-like structure. The algorithm maintains a population of such samples, which undergo simulated evolution by means of mutation, crossover, and survival of the fittest principles. Genetic Programming (GP) uses tree-like chromosomes, providing very rich representation suitable for many problems of interest. GP has been successfully applied to a number of practical problems such as learning Boolean functions and designing hardware circuits. To apply GP to a problem, the user needs to define the actual representation space, by defining the atomic functions and terminals labeling the actual trees. The sufficiency principle requires that the label set be sufficient to build the desired solution trees. The closure principle allows the labels to mix in any arity-consistent manner. To satisfy both principles, the user is often forced to provide a large label set, with ad hoc interpretations or penalties to deal with undesired local contexts. This unfortunately enlarges the actual representation space, and thus usually slows down the search. In the past few years, three different methodologies have been proposed to allow the user to alleviate the closure principle by providing means to define, and to process, constraints on mixing the labels in the trees. Last summer we proposed a new methodology to further alleviate the problem by discovering local heuristics for building quality solution trees. A pilot system was implemented last summer and tested throughout the year. This summer we have implemented a new revision, and produced a User's Manual so that the pilot system can be made available to other practitioners and researchers. We have also designed, and partly implemented, a larger system capable of dealing with much more powerful heuristics.
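
The closure principle discussed above can be made concrete with a toy tree representation (our sketch, not the ACGP system): a protected operator keeps every arity-consistent mix of labels evaluable, which is exactly the kind of ad hoc interpretation the abstract says users are often forced to add.

```python
# A GP tree as nested tuples over a user-chosen label set, evaluated
# recursively; protected division enforces the closure principle.
import operator

def pdiv(a, b):
    # Protected division: return 1.0 on a zero denominator so that any
    # arity-consistent combination of labels still evaluates.
    return a / b if b != 0 else 1.0

FUNCTIONS = {"add": operator.add, "mul": operator.mul, "div": pdiv}

def evaluate(tree, x):
    """Evaluate a tree whose internal nodes are function labels and
    whose leaves are the terminal 'x' or numeric constants."""
    if tree == "x":
        return x
    if isinstance(tree, (int, float)):
        return tree
    op, left, right = tree
    return FUNCTIONS[op](evaluate(left, x), evaluate(right, x))

# The tree for x*x + x/2, labeled from the user-chosen set.
tree = ("add", ("mul", "x", "x"), ("div", "x", 2))
print(evaluate(tree, 4.0))  # 18.0
```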

  19. A Computer Program for Testing Grammars On-Line.

    ERIC Educational Resources Information Center

    Gross, Louis N.

    This paper describes a computer system which is intended to aid the linguist in building a transformational grammar. The program operates as a rule tester, performing three services for the user through sets of functions which allow the user to--specify, change, and print base trees (to which transformations would apply); define transformations…

  20. SLiMSearch 2.0: biological context for short linear motifs in proteins

    PubMed Central

    Davey, Norman E.; Haslam, Niall J.; Shields, Denis C.

    2011-01-01

    Short, linear motifs (SLiMs) play a critical role in many biological processes. The SLiMSearch 2.0 (Short, Linear Motif Search) web server allows researchers to identify occurrences of a user-defined SLiM in a proteome, using conservation and protein disorder context statistics to rank occurrences. User-friendly output and visualizations of motif context allow the user to quickly gain insight into the validity of a putatively functional motif occurrence. For each motif occurrence, overlapping UniProt features and annotated SLiMs are displayed. Visualization also includes annotated multiple sequence alignments surrounding each occurrence, showing conservation and protein disorder statistics in addition to known and predicted SLiMs, protein domains and known post-translational modifications. In addition, enrichment of Gene Ontology terms and of protein interaction partners is provided as an indicator of possible motif function. All web server results are available for download. Users can search motifs against the human proteome or a subset thereof defined by UniProt accession numbers or GO terms. The SLiMSearch server is available at: http://bioware.ucd.ie/slimsearch2.html. PMID:21622654
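
SLiMs are commonly written as regular-expression-like patterns, so the core search (though none of SLiMSearch's conservation or disorder statistics) can be sketched with the standard re module. The motif and the toy "proteome" below are invented for illustration.

```python
# Find occurrences of a user-defined SLiM-style pattern in sequences.
import re

motif = re.compile(r"[ST]P")  # a minimal S/T-P phosphosite-like motif
proteome = {"protA": "MKSPQQTLV", "protB": "MAAATPLLSPR"}

hits = {name: [m.start() for m in motif.finditer(seq)]
        for name, seq in proteome.items()}
print(hits)  # {'protA': [2], 'protB': [4, 8]}
```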

  21. Detailed requirements document for the integrated structural analysis system, phase B

    NASA Technical Reports Server (NTRS)

    Rainey, J. A.

    1976-01-01

    The requirements are defined for a software system entitled Integrated Structural Analysis System (ISAS) Phase B, which is being developed to provide the user with a tool for performing a complete and detailed analysis of a complex structural system. This software system will allow automated interfaces with numerous structural analysis batch programs, and user interaction in the creation, selection, and validation of data. The system will include modifications to the 4 functions developed for ISAS and the development of 25 new functions. The new functions are described.

  22. Numerical Function Generators Using LUT Cascades

    DTIC Science & Technology

    2007-06-01

    either algebraically (for example, sin(x)) or as a table of input/output values. The user defines the numerical function by using the syntax of Scilab... defined function in Scilab or specify it directly. Note that, by changing the parser of our system, any format can be used for the design entry. First... Methods for Multiple-Valued Input Address Generators,” Proc. 36th IEEE Int’l Symp. Multiple-Valued Logic (ISMVL ’06), May 2006. [29] Scilab 3.0, INRIA-ENPC
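
The lookup-table idea behind these generators can be modeled in software: tabulate the target function over its domain, then answer queries by nearest-entry lookup. Real LUT cascades use segmented, hardware-oriented encodings; this is only a minimal software analogue.

```python
# Tabulate a function such as sin(x) and approximate it by the nearest
# table entry; with 1024 entries over [0, pi/2] the error stays small.
import math

def build_lut(f, lo, hi, entries):
    """Tabulate f at evenly spaced points over [lo, hi]."""
    step = (hi - lo) / (entries - 1)
    return [f(lo + i * step) for i in range(entries)], lo, step

def lookup(lut, x):
    """Nearest-entry table lookup, clamped to the table range."""
    table, lo, step = lut
    i = round((x - lo) / step)
    return table[min(max(i, 0), len(table) - 1)]

lut = build_lut(math.sin, 0.0, math.pi / 2, entries=1024)
print(abs(lookup(lut, 1.0) - math.sin(1.0)) < 1e-3)  # True
```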

  23. Information Needs of the Ceramic Industry; A Users-Need Study.

    ERIC Educational Resources Information Center

    Janning, Edward A.; And Others

    This report examines the problems in the flow of scientific and technological information in the Ceramic Industry. The research methodology used involved a panel of experts which defined the functions performed by ceramists and their corresponding information needs, listed sources of information available to ceramists, and defined problems and…

  24. User productivity as a function of AutoCAD interface design.

    PubMed

    Mitta, D A; Flores, P L

    1995-12-01

    Increased operator productivity is a desired outcome of user-CAD interaction scenarios. Two objectives of this research were to (1) define a measure of operator productivity and (2) empirically investigate the potential effects of CAD interface design on operator productivity, where productivity is defined as the percentage of a drawing session correctly completed per unit time. Here, AutoCAD provides the CAD environment of interest. Productivity with respect to two AutoCAD interface designs (menu, template) and three task types (draw, dimension, display) was investigated. Analysis of user productivity data revealed significantly higher productivity under the menu interface condition than under the template interface condition. A significant effect of task type was also discovered, where user productivity under display tasks was higher than productivity under the draw and dimension tasks. Implications of these results are presented.

  5. Definition study of land/sea civil user navigational location monitoring systems for NAVSTAR GPS: User requirements and systems concepts

    NASA Technical Reports Server (NTRS)

    Devito, D. M.

    1981-01-01

    A low-cost GPS civil-user mobile terminal, whose purchase cost is roughly an order of magnitude less than estimates for its military counterpart, is considered, with focus on ground station requirements for position monitoring of civil users requiring this capability and on civil-user navigation and location-monitoring requirements. Existing survey literature was examined to ascertain the potential users of a low-cost NAVSTAR receiver and to estimate their number, function, and accuracy requirements. System concepts are defined for low-cost user equipment for in-situ navigation and for the retransmission of low-data-rate positioning data via a geostationary satellite to a central computing facility.

  6. Web-based Toolkit for Dynamic Generation of Data Processors

    NASA Astrophysics Data System (ADS)

    Patel, J.; Dascalu, S.; Harris, F. C.; Benedict, K. K.; Gollberg, G.; Sheneman, L.

    2011-12-01

    All computation-intensive scientific research uses structured datasets, including hydrology and all other types of climate-related research. When it comes to testing their hypotheses, researchers might use the same dataset differently, and modify, transform, or convert it to meet their research needs. Currently, many researchers spend a good amount of time performing data processing and building tools to speed up this process. They might routinely repeat the same process activities for new research projects, spending precious time that otherwise could be dedicated to analyzing and interpreting the data. Numerous tools are available to run tests on prepared datasets and many of them work with datasets in different formats. However, there is still a significant need for applications that can comprehensively handle data transformation and conversion activities and help prepare the various processed datasets required by the researchers. We propose a web-based application (a software toolkit) that dynamically generates data processors capable of performing data conversions, transformations, and customizations based on user-defined mappings and selections. As a first step, the proposed solution allows the users to define various data structures and, in the next step, to select various file formats and data conversions for their datasets of interest. In a simple scenario, the core of the proposed web-based toolkit allows the users to define direct mappings between input and output data structures. The toolkit will also support defining complex mappings involving the use of pre-defined sets of mathematical, statistical, date/time, and text manipulation functions. Furthermore, the users will be allowed to define logical cases for input data filtering and sampling. At the end of the process, the toolkit is designed to generate reusable source code and executable binary files for download and use by the scientists.
The application is also designed to store all data structures and mappings defined by a user (an author), and allow the original author to modify them using standard authoring techniques. The users can change or define new mappings to create new data processors for download and use. In essence, when executed, the generated data processor binary file can take an input data file in a given format and output this data, possibly transformed, in a different file format. If they so desire, the users will be able to modify the source code directly in order to define more complex mappings and transformations that are not currently supported by the toolkit. Initially aimed at supporting research in hydrology, the toolkit's functions and features can be either directly used or easily extended to other areas of climate-related research. The proposed web-based data processing toolkit will be able to generate various custom software processors for data conversion and transformation in a matter of seconds or minutes, saving a significant amount of researchers' time and allowing them to focus on core research issues.
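
    The core idea described above, a generated data processor driven by user-defined field mappings with optional transformation functions, can be sketched as follows (the field names and the unit conversion are illustrative, not part of the toolkit):

```python
# A toy generated "data processor": direct field mappings plus an
# optional transformation function per output field, applied per record.
def make_processor(mapping):
    """mapping: output_field -> (input_field, transform or None)."""
    def process(record):
        out = {}
        for out_field, (in_field, transform) in mapping.items():
            value = record[in_field]
            out[out_field] = transform(value) if transform else value
        return out
    return process

# A user-defined mapping: one direct copy, one converted field.
to_celsius = make_processor({
    "station": ("site_id", None),
    "temp_c": ("temp_f", lambda f: round((f - 32) * 5 / 9, 2)),
})

row = {"site_id": "NV-01", "temp_f": 77.0}
```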

  7. Crack propagation in functionally graded strip under thermal shock

    NASA Astrophysics Data System (ADS)

    Ivanov, I. V.; Sadowski, T.; Pietras, D.

    2013-09-01

    The thermal shock problem in a strip made of a functionally graded composite with an interpenetrating network micro-structure of Al2O3 and Al is analysed numerically. The material considered here could be used in brake disks or cylinder liners; in both applications it is subjected to thermal shock. The description of the position-dependent properties of the considered functionally graded material is based on experimental data. Continuous functions were constructed for the Young's modulus, thermal expansion coefficient, thermal conductivity and thermal diffusivity, and implemented as user-defined material properties in user-defined subroutines of the commercial finite element software ABAQUS™. The distributions inside the strip of the thermal stress and of the residual stress from the manufacturing process are considered. The solution of the transient heat conduction problem for thermal shock is used for crack propagation simulation with the extended finite element method (XFEM). The crack length developed during the thermal shock is the criterion for crack resistance of the different graduation profiles, as a step towards optimization of the composition gradient with respect to thermal shock sensitivity.
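
    As a rough illustration of the position-dependent property functions mentioned above, here is a sketch of a graded Young's modulus across an Al2O3/Al strip; the power-law blend and the constituent values are assumptions for illustration, not the paper's experimentally fitted functions:

```python
# Illustrative constituent moduli in GPa (alumina vs. aluminium).
E_AL2O3, E_AL = 380.0, 70.0

def youngs_modulus(x, n=2.0):
    """Continuous graded property across the strip.

    x: normalized position, 0 (pure metal side) to 1 (pure ceramic side).
    n: exponent of an assumed power-law volume-fraction profile.
    """
    v_ceramic = x ** n  # ceramic volume fraction at position x
    return E_AL + (E_AL2O3 - E_AL) * v_ceramic
```

    A finite element code would evaluate such a function at each integration point, which is essentially what a user-defined material subroutine does.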

  8. Space station needs, attributes and architectural options study. Volume 3: Mission requirements

    NASA Technical Reports Server (NTRS)

    1983-01-01

    User missions that are enabled or enhanced by a manned space station are identified. The mission capability requirements imposed on the space station by these users are delineated. The accommodation facilities, equipment, and functional requirements necessary to achieve these capabilities are identified, and the economic, performance, and social benefits which accrue from the space station are defined.

  9. Dynamic User Interfaces for Service Oriented Architectures in Healthcare.

    PubMed

    Schweitzer, Marco; Hoerbst, Alexander

    2016-01-01

    Electronic Health Records (EHRs) play a crucial role in healthcare today. From a data-centric view, EHRs are very advanced, as they provide and share healthcare data in a cross-institutional and patient-centered way while adhering to high syntactic and semantic interoperability. However, the EHR functionalities available to end users are scarce and hence often limited to basic document query functions. Future EHR use requires letting users define which data they need in a given situation and how these data should be processed. Workflow and semantic modelling approaches, as well as Web services, provide means to fulfil such a goal. This thesis develops concepts for dynamic interfaces between EHR end users and a service-oriented eHealth infrastructure, which allow users to model their flexible EHR needs in a dynamic and formal way. These models are then used to discover, compose and execute the right Semantic Web services.

  10. Significance of User Participation in a Hospital Information System Success: Insights From a Case Study.

    PubMed

    Saleem, Naveed; Steel, Douglas; Gercek, Gokhan; Chandra, Ashish

    User participation in the development of a system is universally prescribed as an effective strategy to ensure the success of the resultant system. However, the existing literature on the merits of user participation provides only equivocal evidence. Various analyses of this literature point out that this equivocal evidence may be due to inconsistent operational measures of the user participation and system success constructs. Planned organizational change and participative decision making, the underlying paradigms of the user participation construct, suggest that the development of some information systems may require blending users' system-related functional expertise with developers' technical expertise to ensure system success. These paradigms also maintain that, in the case of well-defined, structured information systems, user participation should enhance the likelihood of system success through better user understanding of the need for the system and of its content and objectives, user trust, and a sense of system ownership. This research also describes a case study involving the development and implementation of a medical records system for a neonatal intensive care unit in a large hospital in Texas. The case study provides evidence that, in systems that require incorporation of user functional expertise, user participation will enhance the likelihood of system success.

  11. Development of methodologies and procedures for identifying STS users and uses

    NASA Technical Reports Server (NTRS)

    Archer, J. L.; Beauchamp, N. A.; Macmichael, D. C.

    1974-01-01

    A study was conducted to identify new uses and users of the new Space Transportation System (STS) within the domestic government sector. The study develops a series of analytical techniques and well-defined functions structured as an integrated planning process to assure efficient and meaningful use of the STS. The purpose of the study is to provide NASA with the following functions: (1) to realize efficient and economic use of the STS and other NASA capabilities, (2) to identify new users and uses of the STS, (3) to contribute to organized planning activities for both current and future programs, and (4) to aid in analyzing uses of NASA's overall capabilities.

  12. Managing multiple image stacks from confocal laser scanning microscopy

    NASA Astrophysics Data System (ADS)

    Zerbe, Joerg; Goetze, Christian H.; Zuschratter, Werner

    1999-05-01

    A major goal in neuroanatomy is to obtain precise information about the functional organization of neuronal assemblies and their interconnections. Therefore, the analysis of histological sections frequently requires high-resolution images in combination with an overview of the structure. To overcome this conflict we have previously introduced software for the automatic acquisition of multiple image stacks (3D-MISA) in confocal laser scanning microscopy. Here, we describe Windows NT based software for fast and easy navigation through the multiple image stacks (MIS-browser), the visualization of individual channels and layers, and the selection of user-defined subregions. In addition, the MIS-browser provides useful tools for the visualization and evaluation of the data volume, such as brightness and contrast corrections of individual layers and channels. Moreover, it includes a maximum intensity projection, panning and zoom in/out functions within selected channels or focal planes (x/y), and tracking along the z-axis. The import module accepts any TIFF format and reconstructs the original image arrangement after the user has defined the sequence of images in x/y and z and the number of channels. The implemented export module allows storage of user-defined subregions (new single image stacks) for further 3D reconstruction and evaluation.
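
    The import module described above reconstructs the original image arrangement from a flat file sequence once the user declares the x/y and z layout and the number of channels. A sketch of that index arithmetic, under an assumed ordering (channel varying fastest, then z, then tile column, then tile row; the ordering is an assumption, not the browser's documented one):

```python
def tile_position(index, n_channels, n_z, n_x):
    """Map a flat image index to (channel, z, tile_row, tile_col).

    Assumes channel varies fastest, then the z-layer, then the tile
    column within a row of tiles, then the tile row.
    """
    channel = index % n_channels
    index //= n_channels
    z = index % n_z
    index //= n_z
    col = index % n_x
    row = index // n_x
    return channel, z, row, col
```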

  13. General Tool for Evaluating High-Contrast Coronagraphic Telescope Performance Error Budgets

    NASA Technical Reports Server (NTRS)

    Marchen, Luis F.

    2011-01-01

    The Coronagraph Performance Error Budget (CPEB) tool automates many of the key steps required to evaluate the scattered starlight contrast in the dark hole of a space-based coronagraph. The tool uses a Code V prescription of the optical train, and uses MATLAB programs to call ray-trace code that generates linear beam-walk and aberration sensitivity matrices for motions of the optical elements and line-of-sight pointing, with and without controlled fine-steering mirrors (FSMs). The sensitivity matrices are imported by macros into Excel 2007, where the error budget is evaluated. The user specifies the particular optics of interest, and chooses the quality of each optic from a predefined set of PSDs. The spreadsheet creates a nominal set of thermal and jitter motions, and combines that with the sensitivity matrices to generate an error budget for the system. CPEB also contains a combination of form and ActiveX controls with Visual Basic for Applications code to allow for user interaction in which the user can perform trade studies such as changing engineering requirements, and identifying and isolating stringent requirements. It contains summary tables and graphics that can be instantly used for reporting results in view graphs. The entire process to obtain a coronagraphic telescope performance error budget has been automated into three stages: conversion of optical prescription from Zemax or Code V to MACOS (in-house optical modeling and analysis tool), a linear models process, and an error budget tool process. The first process was improved by developing a MATLAB package based on the Class Constructor Method with a number of user-defined functions that allow the user to modify the MACOS optical prescription. The second process was modified by creating a MATLAB package that contains user-defined functions that automate the process. 
The user interfaces with the process by utilizing an initialization file where the user defines the parameters of the linear model computations. Other than this, the process is fully automated. The third process was developed based on the Terrestrial Planet Finder coronagraph Error Budget Tool, but was fully automated by using VBA code, form, and ActiveX controls.
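
    The linear-models step described above combines sensitivity matrices with optical-element motions to produce error-budget terms, which are then combined into a total. A toy sketch of that combination (the matrices, motions, and root-sum-square combination are illustrative numbers and choices, not CPEB data):

```python
def contrast_terms(sensitivity, motions):
    """Propagate element motions through a linear sensitivity matrix:
    each row of `sensitivity` holds per-motion contrast sensitivities."""
    return [sum(s * m for s, m in zip(row, motions)) for row in sensitivity]

def rss(terms):
    """Combine independent error terms by root-sum-square."""
    return sum(t * t for t in terms) ** 0.5
```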

  14. SPLICER - A GENETIC ALGORITHM TOOL FOR SEARCH AND OPTIMIZATION, VERSION 1.0 (MACINTOSH VERSION)

    NASA Technical Reports Server (NTRS)

    Wang, L.

    1994-01-01

    SPLICER is a genetic algorithm tool which can be used to solve search and optimization problems. Genetic algorithms are adaptive search procedures (i.e. problem solving methods) based loosely on the processes of natural selection and Darwinian "survival of the fittest." SPLICER provides the underlying framework and structure for building a genetic algorithm application. These algorithms apply genetically-inspired operators to populations of potential solutions in an iterative fashion, creating new populations while searching for an optimal or near-optimal solution to the problem at hand. SPLICER 1.0 was created using a modular architecture that includes a Genetic Algorithm Kernel, interchangeable Representation Libraries, Fitness Modules and User Interface Libraries, and well-defined interfaces between these components. The architecture supports portability, flexibility, and extensibility. SPLICER comes with all source code and several examples. For instance, a "traveling salesperson" example searches for the minimum distance through a number of cities visiting each city only once. Stand-alone SPLICER applications can be used without any programming knowledge. However, to fully utilize SPLICER within new problem domains, familiarity with C language programming is essential. SPLICER's genetic algorithm (GA) kernel was developed independent of representation (i.e. problem encoding), fitness function or user interface type. The GA kernel comprises all functions necessary for the manipulation of populations. These functions include the creation of populations and population members, the iterative population model, fitness scaling, parent selection and sampling, and the generation of population statistics. In addition, miscellaneous functions are included in the kernel (e.g., random number generators). Different problem-encoding schemes and functions are defined and stored in interchangeable representation libraries. 
This allows the GA kernel to be used with any representation scheme. The SPLICER tool provides representation libraries for binary strings and for permutations. These libraries contain functions for the definition, creation, and decoding of genetic strings, as well as multiple crossover and mutation operators. Furthermore, the SPLICER tool defines the appropriate interfaces to allow users to create new representation libraries. Fitness modules are the only component of the SPLICER system a user will normally need to create or alter to solve a particular problem. Fitness functions are defined and stored in interchangeable fitness modules which must be created using C language. Within a fitness module, a user can create a fitness (or scoring) function, set the initial values for various SPLICER control parameters (e.g., population size), create a function which graphically displays the best solutions as they are found, and provide descriptive information about the problem. The tool comes with several example fitness modules, while the process of developing a fitness module is fully discussed in the accompanying documentation. The user interface is event-driven and provides graphic output in windows. SPLICER is written in Think C for Apple Macintosh computers running System 6.0.3 or later and Sun series workstations running SunOS. The UNIX version is easily ported to other UNIX platforms and requires MIT's X Window System, Version 11 Revision 4 or 5, MIT's Athena Widget Set, and the Xw Widget Set. Example executables and source code are included for each machine version. The standard distribution media for the Macintosh version is a set of three 3.5 inch Macintosh format diskettes. The standard distribution medium for the UNIX version is a .25 inch streaming magnetic tape cartridge in UNIX tar format. For the UNIX version, alternate distribution media and formats are available upon request. SPLICER was developed in 1991.
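
    Fitness modules, as described above, are the one component a SPLICER user normally writes. SPLICER's actual modules are C code against its own interfaces; this Python sketch only illustrates the idea of a fitness function scoring a binary-string population member (the target pattern is invented):

```python
import random

# Hypothetical target pattern the GA should converge toward.
TARGET = [1, 0, 1, 1, 0, 1, 0, 0]

def fitness(member):
    """Score a binary-string member by how many bits match the target."""
    return sum(1 for bit, want in zip(member, TARGET) if bit == want)

def random_member(rng, length=8):
    """Create one random population member, as a GA kernel would."""
    return [rng.randint(0, 1) for _ in range(length)]
```

    The kernel-versus-module split means only `fitness` (and perhaps the control parameters) changes from problem to problem.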

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chao, Tian-Jy; Kim, Younghun

    An end-to-end interoperability and workflows from building architecture design to one or more simulations, in one aspect, may comprise establishing a BIM enablement platform architecture. A data model defines data entities and entity relationships for enabling the interoperability and workflows. A data definition language may be implemented that defines and creates a table schema of a database associated with the data model. Data management services and/or application programming interfaces may be implemented for interacting with the data model. Web services may also be provided for interacting with the data model via the Web. A user interface may be implemented that communicates with users and uses the BIM enablement platform architecture, the data model, the data definition language, data management services and application programming interfaces to provide functions to the users to perform work related to building information management.

  16. Electronic health record "super-users" and "under-users" in ambulatory care practices.

    PubMed

    Rumball-Smith, Juliet; Shekelle, Paul; Damberg, Cheryl L

    2018-01-01

    This study explored variation in the extent of use of electronic health record (EHR)-based health information technology (IT) functionalities across US ambulatory care practices. Use of health IT functionalities in ambulatory care is important for delivering high-quality care, including that provided in coordination with multiple practitioners. We used data from the 2014 Healthcare Information and Management Systems Society Analytics survey. The responses of 30,123 ambulatory practices with an operational EHR were analyzed to examine the extent of use of EHR-based health IT functionalities for each practice. We created a novel framework for classifying ambulatory care practices employing 7 domains of health IT functionality. Drawing from the survey responses, we created a composite "use" variable indicating the extent of health IT functionality use across these domains. "Super-user" practices were defined as having near-full employment of the 7 domains of health IT functionalities and "under-users" as those with minimal or no use of health IT functionalities. We used multivariable logistic regression to investigate how the odds of super-use and under-use varied by practice size, type, urban or rural location, and geographic region. Seventy-three percent of practices were not using EHR technologies to their full capability, and nearly 40% were classified as under-users. Under-user practices were more likely to be of smaller size, situated in the West, and located outside a metropolitan area. To achieve the broader benefits of the EHR and health IT, health systems and policy makers need to identify and address barriers to full use of health IT functionalities.
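
    The study's classification idea, labeling practices by how many of the 7 health-IT functionality domains they employ, can be sketched as follows; the exact thresholds here are illustrative, not the paper's cut-offs:

```python
DOMAINS = 7  # the framework's seven domains of health IT functionality

def classify(domains_used):
    """Label a practice by extent of health IT functionality use."""
    if domains_used >= DOMAINS - 1:   # near-full employment
        return "super-user"
    if domains_used <= 1:             # minimal or no use
        return "under-user"
    return "partial user"
```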

  17. The Westinghouse Series 1000 Mobile Phone: Technology and applications

    NASA Technical Reports Server (NTRS)

    Connelly, Brian

    1993-01-01

    Mobile satellite communications will be popularized by the North American Mobile Satellite (MSAT) system. The success of the overall system is dependent upon the quality of the mobile units. Westinghouse is designing our unit, the Series 1000 Mobile Phone, with the user in mind. The architecture and technology aim at providing optimum performance at a low per unit cost. The features and functions of the Series 1000 Mobile Phone have been defined by potential MSAT users. The latter portion of this paper deals with who those users may be.

  18. The computational structural mechanics testbed procedures manual

    NASA Technical Reports Server (NTRS)

    Stewart, Caroline B. (Compiler)

    1991-01-01

    The purpose of this manual is to document the standard high-level command language procedures of the Computational Structural Mechanics (CSM) Testbed software system. A description of each procedure, including its function, commands, data interface, and use, is presented. This manual is designed to assist users in defining and using command procedures to perform structural analysis in the CSM Testbed, and is intended to be used together with the CSM Testbed User's Manual and the CSM Testbed Data Library Description.

  19. On the use of Bayesian decision theory for issuing natural hazard warnings

    NASA Astrophysics Data System (ADS)

    Economou, T.; Stephenson, D. B.; Rougier, J. C.; Neal, R. A.; Mylne, K. R.

    2016-10-01

    Warnings for natural hazards improve societal resilience and are a good example of decision-making under uncertainty. A warning system is only useful if well defined and thus understood by stakeholders. However, most operational warning systems are heuristic: not formally or transparently defined. Bayesian decision theory provides a framework for issuing warnings under uncertainty but has not been fully exploited. Here, a decision theoretic framework is proposed for hazard warnings. The framework allows any number of warning levels and future states of nature, and a mathematical model for constructing the necessary loss functions for both generic and specific end-users is described. The approach is illustrated using one-day ahead warnings of daily severe precipitation over the UK, and compared to the current decision tool used by the UK Met Office. A probability model is proposed to predict precipitation, given ensemble forecast information, and loss functions are constructed for two generic stakeholders: an end-user and a forecaster. Results show that the Met Office tool issues fewer high-level warnings compared with our system for the generic end-user, suggesting the former may not be suitable for risk averse end-users. In addition, raw ensemble forecasts are shown to be unreliable and result in higher losses from warnings.
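
    The decision-theoretic core described above, issuing the warning level that minimizes expected loss given state probabilities and a stakeholder's loss function, can be sketched with toy numbers (the loss table below is invented for illustration):

```python
def best_warning(p_states, loss):
    """Pick the warning level with minimum expected loss.

    p_states: probability of each future state of nature.
    loss: warning level -> per-state losses for some stakeholder.
    """
    expected = {level: sum(p * l for p, l in zip(p_states, row))
                for level, row in loss.items()}
    return min(expected, key=expected.get)

# Illustrative loss table: [loss if no event, loss if severe event].
loss = {
    "none":   [0.0, 10.0],
    "yellow": [1.0, 4.0],
    "red":    [3.0, 0.5],
}
```

    A risk-averse end-user would be modeled with larger losses for missed events, shifting the decision toward higher warning levels at lower probabilities.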

  20. On the use of Bayesian decision theory for issuing natural hazard warnings.

    PubMed

    Economou, T; Stephenson, D B; Rougier, J C; Neal, R A; Mylne, K R

    2016-10-01

    Warnings for natural hazards improve societal resilience and are a good example of decision-making under uncertainty. A warning system is only useful if well defined and thus understood by stakeholders. However, most operational warning systems are heuristic: not formally or transparently defined. Bayesian decision theory provides a framework for issuing warnings under uncertainty but has not been fully exploited. Here, a decision theoretic framework is proposed for hazard warnings. The framework allows any number of warning levels and future states of nature, and a mathematical model for constructing the necessary loss functions for both generic and specific end-users is described. The approach is illustrated using one-day ahead warnings of daily severe precipitation over the UK, and compared to the current decision tool used by the UK Met Office. A probability model is proposed to predict precipitation, given ensemble forecast information, and loss functions are constructed for two generic stakeholders: an end-user and a forecaster. Results show that the Met Office tool issues fewer high-level warnings compared with our system for the generic end-user, suggesting the former may not be suitable for risk averse end-users. In addition, raw ensemble forecasts are shown to be unreliable and result in higher losses from warnings.

  1. On the use of Bayesian decision theory for issuing natural hazard warnings

    PubMed Central

    Stephenson, D. B.; Rougier, J. C.; Neal, R. A.; Mylne, K. R.

    2016-01-01

    Warnings for natural hazards improve societal resilience and are a good example of decision-making under uncertainty. A warning system is only useful if well defined and thus understood by stakeholders. However, most operational warning systems are heuristic: not formally or transparently defined. Bayesian decision theory provides a framework for issuing warnings under uncertainty but has not been fully exploited. Here, a decision theoretic framework is proposed for hazard warnings. The framework allows any number of warning levels and future states of nature, and a mathematical model for constructing the necessary loss functions for both generic and specific end-users is described. The approach is illustrated using one-day ahead warnings of daily severe precipitation over the UK, and compared to the current decision tool used by the UK Met Office. A probability model is proposed to predict precipitation, given ensemble forecast information, and loss functions are constructed for two generic stakeholders: an end-user and a forecaster. Results show that the Met Office tool issues fewer high-level warnings compared with our system for the generic end-user, suggesting the former may not be suitable for risk averse end-users. In addition, raw ensemble forecasts are shown to be unreliable and result in higher losses from warnings. PMID:27843399

  2. Research Challenges in Managing and Using Service Level Agreements

    NASA Astrophysics Data System (ADS)

    Rana, Omer; Ziegler, Wolfgang

    A Service Level Agreement (SLA) represents an agreement between a service user and a provider in the context of a particular service provision. SLAs contain Quality of Service properties that must be maintained by a provider, as agreed between the provider and a user/client. These are generally defined as a set of Service Level Objectives (SLOs). These properties need to be measurable and must be monitored during the provision of the service that has been agreed in the SLA. The SLA must also contain a set of penalty clauses specifying what happens when service providers fail to deliver the pre-agreed quality. Hence, an SLA may be used by both a user and a provider: from a user perspective, an SLA defines what is required, often in terms of non-functional attributes of service provision; from a provider's perspective, an SLA may be used to support capacity planning, especially if the provider is making its capability available to multiple users. An SLA may also be used by a client and provider to manage their behaviour over time, for instance to optimise their long-running revenue (cost) or QoS attributes (such as execution time). The lifecycle of an SLA is outlined, along with various uses of SLAs to support infrastructure management. A discussion of WS-Agreement, the emerging standard for specifying SLAs, is also provided.
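
    The monitoring side described above, measurable SLOs checked against observed metrics with penalty clauses for violations, can be sketched as follows (the SLO fields and penalty values are illustrative, not taken from WS-Agreement):

```python
# Each SLO is a measurable predicate over monitored metrics, paired
# with the penalty agreed for its violation.
slos = [
    {"name": "latency_ms", "limit": 200, "penalty": 50.0},
    {"name": "uptime_pct", "floor": 99.9, "penalty": 100.0},
]

def total_penalty(metrics):
    """Sum the penalties for every SLO the monitored metrics violate."""
    due = 0.0
    for slo in slos:
        value = metrics[slo["name"]]
        if "limit" in slo and value > slo["limit"]:
            due += slo["penalty"]
        if "floor" in slo and value < slo["floor"]:
            due += slo["penalty"]
    return due
```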

  3. International Space Station Alpha user payload operations concept

    NASA Technical Reports Server (NTRS)

    Schlagheck, Ronald A.; Crysel, William B.; Duncan, Elaine F.; Rider, James W.

    1994-01-01

    International Space Station Alpha (ISSA) will accommodate a variety of user payloads investigating diverse scientific and technology disciplines on behalf of five international partners: Canada, Europe, Japan, Russia, and the United States. A combination of crew, automated systems, and ground operations teams will control payload operations that require complementary on-board and ground systems. This paper presents the current planning for the ISSA U.S. user payload operations concept and the functional architecture supporting the concept. It describes various NASA payload operations facilities, their interfaces, user facility flight support, the payload planning system, the onboard and ground data management system, and payload operations crew and ground personnel training. This paper summarizes the payload operations infrastructure and architecture developed at the Marshall Space Flight Center (MSFC) to prepare and conduct ISSA on-orbit payload operations from the Payload Operations Integration Center (POIC), and from various user operations locations. The authors pay particular attention to user data management, which includes interfaces with both the onboard data management system and the ground data system. Discussion covers the functional disciplines that define and support POIC payload operations: Planning, Operations Control, Data Management, and Training. The paper describes potential interfaces between users and the POIC disciplines, from the U.S. user perspective.

  4. Trajectory Optimization: OTIS 4

    NASA Technical Reports Server (NTRS)

    Riehl, John P.; Sjauw, Waldy K.; Falck, Robert D.; Paris, Stephen W.

    2010-01-01

    The latest release of the Optimal Trajectories by Implicit Simulation (OTIS4) allows users to simulate and optimize aerospace vehicle trajectories. With OTIS4, one can seamlessly generate optimal trajectories and parametric vehicle designs simultaneously. New features also allow OTIS4 to solve non-aerospace continuous-time optimal control problems. The inputs and outputs of OTIS4 have been updated extensively from previous versions. Inputs now make use of object-oriented constructs, including one called a metastring. Metastrings use a greatly improved calculator and common nomenclature to reduce the user's workload. They allow for more flexibility in specifying vehicle physical models, boundary conditions, and path constraints. The OTIS4 calculator supports common mathematical functions, Boolean operations, and conditional statements. This allows users to define their own variables for use as outputs, constraints, or objective functions. The user-defined outputs can directly interface with other programs, such as spreadsheets, plotting packages, and visualization programs. Internally, OTIS4 has more explicit and implicit integration procedures, including high-order collocation methods, the pseudo-spectral method, and several variations of multiple shooting. Users may switch easily between the various methods. Several unique numerical techniques, such as automated variable scaling and implicit integration grid refinement, support the integration methods. OTIS4 is also significantly more user-friendly than previous versions. The installation process is nearly identical on various platforms, including Microsoft Windows, Apple OS X, and Linux operating systems. Cross-platform scripts also help make the execution of OTIS and post-processing of data easier. OTIS4 is supplied free by NASA and is subject to ITAR (International Traffic in Arms Regulations) restrictions. Users must have a Fortran compiler, and a Python interpreter is highly recommended.

  5. An object oriented extension to CLIPS

    NASA Technical Reports Server (NTRS)

    Sobkowicz, Clifford

    1990-01-01

    A software sub-system developed to augment the C Language Integrated Production System (CLIPS) with facilities for object-oriented knowledge representation is presented. Functions are provided to define classes, instantiate objects, access attributes, and assert object-related facts. This extension is implemented via the CLIPS user-function interface and does not require modification of any CLIPS code. It does rely on internal CLIPS functions for memory management and symbol representation.

  6. Multicellular Vascularized Engineered Tissues through User-Programmable Biomaterial Photodegradation.

    PubMed

    Arakawa, Christopher K; Badeau, Barry A; Zheng, Ying; DeForest, Cole A

    2017-10-01

    A photodegradable material-based approach to generate endothelialized 3D vascular networks within cell-laden hydrogel biomaterials is introduced. Exploiting multiphoton lithography, microchannel networks spanning nearly all size scales of native human vasculature are readily generated with unprecedented user-defined 4D control. Intraluminal channel architectures of synthetic vessels are fully customizable, providing new opportunities for next-generation microfluidics and directed cell function. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. End-to-end interoperability and workflows from building architecture design to one or more simulations

    DOEpatents

    Chao, Tian-Jy; Kim, Younghun

    2015-02-10

    An end-to-end interoperability and workflows from building architecture design to one or more simulations, in one aspect, may comprise establishing a BIM enablement platform architecture. A data model defines data entities and entity relationships for enabling the interoperability and workflows. A data definition language may be implemented that defines and creates a table schema of a database associated with the data model. Data management services and/or application programming interfaces may be implemented for interacting with the data model. Web services may also be provided for interacting with the data model via the Web. A user interface may be implemented that communicates with users and uses the BIM enablement platform architecture, the data model, the data definition language, data management services and application programming interfaces to provide functions to the users to perform work related to building information management.

  8. Embedded CLIPS for SDI BM/C3 simulation and analysis

    NASA Technical Reports Server (NTRS)

    Gossage, Brett; Nanney, Van

    1990-01-01

    Nichols Research Corporation is developing the BM/C3 Requirements Analysis Tool (BRAT) for the U.S. Army Strategic Defense Command. BRAT uses embedded CLIPS/Ada to model the decision making processes used by the human commander of a defense system. Embedding CLIPS/Ada in BRAT allows the user to explore the role of the human in Command and Control (C2) and the use of expert systems for automated C2. BRAT models assert facts about the current state of the system, the simulated scenario, and threat information into CLIPS/Ada. A user-defined rule set describes the decision criteria for the commander. We have extended CLIPS/Ada with user-defined functions that allow the firing of a rule to invoke a system action such as weapons release or a change in strategy. The use of embedded CLIPS/Ada will provide a powerful modeling tool for our customer at minimal cost.

  9. Brain connectivity aberrations in anabolic-androgenic steroid users.

    PubMed

    Westlye, Lars T; Kaufmann, Tobias; Alnæs, Dag; Hullstein, Ingunn R; Bjørnebekk, Astrid

    2017-01-01

    Sustained anabolic-androgenic steroid (AAS) use has adverse behavioral consequences, including aggression, violence and impulsivity. Candidate mechanisms include disruptions of brain networks with high concentrations of androgen receptors and critically involved in emotional and cognitive regulation. Here, we tested the effects of AAS on resting-state functional brain connectivity in the largest sample of AAS-users to date. We collected resting-state functional magnetic resonance imaging (fMRI) data from 151 males engaged in heavy resistance strength training. 50 users tested positive for AAS based on the testosterone to epitestosterone (T/E) ratio and doping substances in urine. 16 previous users and 59 controls tested negative. We estimated brain network nodes and their time-series using ICA and dual regression and defined connectivity matrices as the between-node partial correlations. In line with the emotional and behavioral consequences of AAS, current users exhibited reduced functional connectivity between key nodes involved in emotional and cognitive regulation, in particular reduced connectivity between the amygdala and default-mode network (DMN) and between the dorsal attention network (DAN) and a frontal node encompassing the superior and inferior frontal gyri (SFG/IFG) and the anterior cingulate cortex (ACC), with further reductions as a function of dependency, lifetime exposure, and cycle state (on/off).

  10. Dimensional feature weighting utilizing multiple kernel learning for single-channel talker location discrimination using the acoustic transfer function.

    PubMed

    Takashima, Ryoichi; Takiguchi, Tetsuya; Ariki, Yasuo

    2013-02-01

    This paper presents a method for discriminating the location of the sound source (talker) using only a single microphone. In a previous work, the single-channel approach for discriminating the location of the sound source was discussed, where the acoustic transfer function from a user's position is estimated by using a hidden Markov model of clean speech in the cepstral domain. In this paper, each cepstral dimension of the acoustic transfer function is newly weighted, in order to obtain the cepstral dimensions having information that is useful for classifying the user's position. Then, this paper proposes a feature-weighting method for the cepstral parameter using multiple kernel learning, defining the base kernels for each cepstral dimension of the acoustic transfer function. The user's position is trained and classified by support vector machine. The effectiveness of this method has been confirmed by sound source (talker) localization experiments performed in different room environments.

  11. A lightweight, flow-based toolkit for parallel and distributed bioinformatics pipelines

    PubMed Central

    2011-01-01

    Background Bioinformatic analyses typically proceed as chains of data-processing tasks. A pipeline, or 'workflow', is a well-defined protocol, with a specific structure defined by the topology of data-flow interdependencies, and a particular functionality arising from the data transformations applied at each step. In computer science, the dataflow programming (DFP) paradigm defines software systems constructed in this manner, as networks of message-passing components. Thus, bioinformatic workflows can be naturally mapped onto DFP concepts. Results To enable the flexible creation and execution of bioinformatics dataflows, we have written a modular framework for parallel pipelines in Python ('PaPy'). A PaPy workflow is created from re-usable components connected by data-pipes into a directed acyclic graph, which together define nested higher-order map functions. The successive functional transformations of input data are evaluated on flexibly pooled compute resources, either local or remote. Input items are processed in batches of adjustable size, allowing one to tune the trade-off between parallelism and lazy-evaluation (memory consumption). An add-on module ('NuBio') facilitates the creation of bioinformatics workflows by providing domain specific data-containers (e.g., for biomolecular sequences, alignments, structures) and functionality (e.g., to parse/write standard file formats). Conclusions PaPy offers a modular framework for the creation and deployment of parallel and distributed data-processing workflows. Pipelines derive their functionality from user-written, data-coupled components, so PaPy also can be viewed as a lightweight toolkit for extensible, flow-based bioinformatics data-processing. The simplicity and flexibility of distributed PaPy pipelines may help users bridge the gap between traditional desktop/workstation and grid computing.
PaPy is freely distributed as open-source Python code at http://muralab.org/PaPy, and includes extensive documentation and annotated usage examples. PMID:21352538
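
    The two core dataflow ideas in this record, lazily evaluated stage chains and adjustable batching, can be illustrated in a few lines. This sketch is not PaPy's API; it is a generic Python analogue using generators.

```python
from itertools import islice
from functools import reduce

# Illustrative dataflow sketch (not PaPy's API): a pipeline is a chain of
# per-item transformations connected by data-pipes; generators keep evaluation
# lazy, and fixed-size batches bound memory consumption.

def pipe(source, *stages):
    """Connect stages into a lazily evaluated pipeline over a data source."""
    for item in source:
        yield reduce(lambda x, f: f(x), stages, item)

def batched(iterable, size):
    """Yield fixed-size batches, trading memory for parallelism."""
    it = iter(iterable)
    while batch := list(islice(it, size)):
        yield batch

parse = str.strip
upper = str.upper
out = list(pipe(["  atg  ", " gga "], parse, upper))   # ['ATG', 'GGA']
chunks = list(batched(range(7), 3))                    # [[0, 1, 2], [3, 4, 5], [6]]
```

    In a real framework the stages would be user-written components and each batch would be dispatched to a local or remote worker pool rather than evaluated inline.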

  12. A lightweight, flow-based toolkit for parallel and distributed bioinformatics pipelines.

    PubMed

    Cieślik, Marcin; Mura, Cameron

    2011-02-25

    Bioinformatic analyses typically proceed as chains of data-processing tasks. A pipeline, or 'workflow', is a well-defined protocol, with a specific structure defined by the topology of data-flow interdependencies, and a particular functionality arising from the data transformations applied at each step. In computer science, the dataflow programming (DFP) paradigm defines software systems constructed in this manner, as networks of message-passing components. Thus, bioinformatic workflows can be naturally mapped onto DFP concepts. To enable the flexible creation and execution of bioinformatics dataflows, we have written a modular framework for parallel pipelines in Python ('PaPy'). A PaPy workflow is created from re-usable components connected by data-pipes into a directed acyclic graph, which together define nested higher-order map functions. The successive functional transformations of input data are evaluated on flexibly pooled compute resources, either local or remote. Input items are processed in batches of adjustable size, allowing one to tune the trade-off between parallelism and lazy-evaluation (memory consumption). An add-on module ('NuBio') facilitates the creation of bioinformatics workflows by providing domain specific data-containers (e.g., for biomolecular sequences, alignments, structures) and functionality (e.g., to parse/write standard file formats). PaPy offers a modular framework for the creation and deployment of parallel and distributed data-processing workflows. Pipelines derive their functionality from user-written, data-coupled components, so PaPy also can be viewed as a lightweight toolkit for extensible, flow-based bioinformatics data-processing. The simplicity and flexibility of distributed PaPy pipelines may help users bridge the gap between traditional desktop/workstation and grid computing.
PaPy is freely distributed as open-source Python code at http://muralab.org/PaPy, and includes extensive documentation and annotated usage examples.

  13. User engineering: A new look at system engineering

    NASA Technical Reports Server (NTRS)

    Mclaughlin, Larry L.

    1987-01-01

    User Engineering is a new System Engineering perspective responsible for defining and maintaining the user view of the system. Its elements are a process to guide the project and customer, a multidisciplinary team including hard and soft sciences, rapid prototyping tools to build user interfaces quickly and modify them frequently at low cost, and a prototyping center for involving users and designers in an iterative way. The main consideration is reducing the risk that the end user will not or cannot effectively use the system. The process begins with user analysis to produce cognitive and work style models, and task analysis to produce user work functions and scenarios. These become major drivers of the human computer interface design which is presented and reviewed as an interactive prototype by users. Feedback is rapid and productive, and user effectiveness can be measured and observed before the system is built and fielded. Requirements are derived via the prototype and baselined early to serve as an input to the architecture and software design.

  14. Automated search method for AFM and profilers

    NASA Astrophysics Data System (ADS)

    Ray, Michael; Martin, Yves C.

    2001-08-01

    New automation software creates a search model as an initial setup and searches for a user-defined target in atomic force microscopes or stylus profilometers used in semiconductor manufacturing. The need for such automation has become critical in manufacturing lines. The new method starts with a survey map of a small area of a chip obtained from a chip-design database or an image of the area. The user interface requires a user to point to and define a precise location to be measured, and to select a macro function for an application such as line width or contact hole. The search algorithm automatically constructs a range of possible scan sequences within the survey, and provides increased speed and functionality compared to the methods used in instruments to date. Each sequence consists of a starting point relative to the target, a scan direction, and a scan length. The search algorithm stops when the location of a target is found and the criteria for certainty in positioning are met. With today's capability in high-speed processing and signal control, the tool can simultaneously scan and search for a target in a robotic and continuous manner. Examples are given that illustrate the key concepts.

  15. MODOPTIM: A general optimization program for ground-water flow model calibration and ground-water management with MODFLOW

    USGS Publications Warehouse

    Halford, Keith J.

    2006-01-01

    MODOPTIM is a non-linear ground-water model calibration and management tool that simulates flow with MODFLOW-96 as a subroutine. A weighted sum-of-squares objective function defines optimal solutions for calibration and management problems. Water levels, discharges, water quality, subsidence, and pumping-lift costs are the five direct observation types that can be compared in MODOPTIM. Differences between direct observations of the same type can be compared to fit temporal changes and spatial gradients. Water levels in pumping wells, wellbore storage in the observation wells, and rotational translation of observation wells also can be compared. Negative and positive residuals can be weighted unequally so inequality constraints such as maximum chloride concentrations or minimum water levels can be incorporated in the objective function. Optimization parameters are defined with zones and parameter-weight matrices. Parameter change is estimated iteratively with a quasi-Newton algorithm and is constrained to a user-defined maximum parameter change per iteration. Parameters that are less sensitive than a user-defined threshold are not estimated. MODOPTIM facilitates testing more conceptual models by expediting calibration of each conceptual model. Examples of applying MODOPTIM to aquifer-test analysis, ground-water management, and parameter estimation problems are presented.
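
    The unequal weighting of negative and positive residuals described above can be sketched directly. This is an illustrative Python sketch of the objective's form, not MODOPTIM's code; the water-level numbers are hypothetical.

```python
# Sketch of a weighted sum-of-squares objective in which negative and positive
# residuals carry different weights, so inequality targets (e.g. a minimum
# water level) can enter the same objective function.  Values are illustrative.

def weighted_sse(simulated, observed, w_neg=1.0, w_pos=1.0):
    """Asymmetrically weighted sum of squared residuals."""
    total = 0.0
    for sim, obs in zip(simulated, observed):
        r = sim - obs
        w = w_neg if r < 0 else w_pos
        total += w * r * r
    return total

# Penalize only simulated levels that fall below a minimum target of 10.0:
obj = weighted_sse([9.5, 10.2], [10.0, 10.0], w_neg=100.0, w_pos=0.0)
```

    Setting one weight to zero turns the corresponding residual sign into a pure inequality constraint, while nonzero weights on both sides recover ordinary calibration misfit.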

  16. Social relevance: toward understanding the impact of the individual in an information cascade

    NASA Astrophysics Data System (ADS)

    Hall, Robert T.; White, Joshua S.; Fields, Jeremy

    2016-05-01

    Information Cascades (IC) through a social network occur due to the decision of users to disseminate content. We define this decision process as User Diffusion (UD). IC models typically describe an information cascade by treating a user as a node within a social graph, where a node's reception of an idea is represented by some activation state. The probability of activation then becomes a function of a node's connectedness to other activated nodes as well as, potentially, the history of activation attempts. We enrich this Coarse-Grained User Diffusion (CGUD) model by applying actor type logics to the nodes of the graph. The resulting Fine-Grained User Diffusion (FGUD) model utilizes prior research in actor typing to generate a predictive model regarding the future influence a user will have on an Information Cascade. Furthermore, we introduce a measure of Information Resonance that is used to aid in predictions regarding user behavior.
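
    The coarse-grained activation rule, with activation probability a function of a node's activated neighbors, can be sketched as a small simulation. The graph, probability model, and parameters below are illustrative, not the paper's model.

```python
import random

# Minimal coarse-grained user-diffusion (CGUD) sketch: an inactive node
# activates with probability 1 - (1-p)^k given k activated neighbors.
# The actor-typed (FGUD) refinement would replace this uniform rule with
# per-user-type activation logics.

def cascade(graph, seeds, p=0.4, rounds=3, rng=None):
    """Simulate an information cascade from a set of seed nodes."""
    rng = rng or random.Random(0)   # fixed seed for reproducibility
    active = set(seeds)
    for _ in range(rounds):
        for node, neighbors in graph.items():
            if node in active:
                continue
            k = sum(1 for n in neighbors if n in active)
            if k and rng.random() < 1 - (1 - p) ** k:
                active.add(node)
    return active

g = {"a": ["b", "c"], "b": ["a", "c"], "c": ["a", "b", "d"], "d": ["c"]}
reached = cascade(g, seeds={"a"})
```

    The set of activated nodes after a few rounds is the simulated cascade; a fine-grained model would condition each node's rule on its actor type and activation history.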

  17. Soft qualities in healthcare. Method and tools for soft qualities design in hospitals' built environments.

    PubMed

    Capolongo, S; Bellini, E; Nachiero, D; Rebecchi, A; Buffoli, M

    2014-01-01

    The design of hospital environments is determined by functional requirements and technical regulations, as well as numerous protocols, which define the structure and system characteristics that such environments need to achieve. In order to improve people's well-being and the quality of their experience within public hospitals, design elements (soft qualities) are added to those 'necessary' features. The aim of this research has been to test a new design process and to create health care spaces with high environmental quality that are capable of meeting users' emotional and perceptual needs. Such needs were investigated with the help of qualitative research tools, and the design criteria for one of these soft qualities - colour - were subsequently defined on the basis of the findings. The colour scheme design for the new San Paolo Hospital Emergency Department in Milan was used as a case study. Focus groups were fundamental in defining the project's goals and criteria. The issues raised have led to the belief that the proper procedure is not the mere consultation of users in order to define the goals: users should rather be involved in the whole design process and become co-agents of the choices that determine the environment's characteristics, so as to meet the quality requirements identified by the users themselves. The case study has shown the possibility of developing a design methodology made up of three steps (or operational tools) in which user groups are involved in the choices, leading to environments in which compliance with expectations is already implied and verified by means of the process itself. Thus, the method leads to the creation of soft qualities in healthcare.

  18. A FORTRAN program for the analysis of linear continuous and sample-data systems

    NASA Technical Reports Server (NTRS)

    Edwards, J. W.

    1976-01-01

    A FORTRAN digital computer program which performs the general analysis of linearized control systems is described. State variable techniques are used to analyze continuous, discrete, and sampled data systems. Analysis options include the calculation of system eigenvalues, transfer functions, root loci, root contours, frequency responses, power spectra, and transient responses for open- and closed-loop systems. A flexible data input format allows the user to define systems in a variety of representations. Data may be entered by inputing explicit data matrices or matrices constructed in user written subroutines, by specifying transfer function block diagrams, or by using a combination of these methods.
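
    Two of the analysis options named above, system eigenvalues and transfer functions, are easy to illustrate in state-variable form. This is a NumPy sketch of the underlying computations, not the FORTRAN program; the example system is hypothetical.

```python
import numpy as np

# Illustrative state-variable computations: eigenvalues of the state matrix A,
# and the transfer function G(s) = C (sI - A)^-1 B evaluated at a point.
# The second-order example system below is hypothetical.

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])   # x' = A x + B u  (poles at -1 and -2)
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])     # y = C x

poles = np.linalg.eigvals(A)   # system eigenvalues

def G(s):
    """Transfer function from u to y at complex frequency s."""
    n = A.shape[0]
    return (C @ np.linalg.inv(s * np.eye(n) - A) @ B)[0, 0]

dc_gain = G(0.0)               # C (-A)^-1 B = 0.5 for this system
```

    Frequency responses and root loci follow by evaluating G(s) along the imaginary axis or by recomputing eigenvalues as a gain parameter varies.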

  19. The varieties of ecstatic experience: an exploration of the subjective experiences of ecstasy.

    PubMed

    Sumnall, Harry R; Cole, Jon C; Jerome, Lisa

    2006-09-01

    Previous investigations of the subjective effects of MDMA (material sold as ecstasy) have conducted interviews and surveys of various groups of ecstasy users within particular sub-populations. This study examined subjective drug effects reported by different sub-populations of ecstasy users and explored whether the function or purpose served by using ecstasy influenced the nature of the drug experience. Drawing on previous measures of alterations in consciousness, psychedelic drugs and cannabis, and informal interviews with ecstasy users and MDMA researchers, a 130-item survey assessing subjective effects of ecstasy/MDMA was developed. Principal components analysis of responses of ecstasy users revealed six components; perceptual alterations, entactogenic effects, prosocial effects, aesthetic effects, negative effects and sexual effects. The derived scale was used to predict ecstasy use behaviours, and functions and experiences of use. A variety of component scores were related to ecstasy use parameters; in particular, heavier users expected fewer negative, perceptual and aesthetic effects from taking the drug. The reasons given for using ecstasy (use function) also influenced reported drug effects. Abstainers expected greater negative, perceptual, aesthetic and sexual effects than users. These data indicate that the subjective ecstasy experience is influenced by a variety of extra-psychopharmacological factors. Drug intervention strategies may be made more effective by targeting particular user groups defined by reasons given for substance use, as it is likely that their experiences of ecstasy effects will differ. Future research into ecstasy may be improved by recognizing user diversity.

  20. User-Centered Indexing for Adaptive Information Access

    NASA Technical Reports Server (NTRS)

    Chen, James R.; Mathe, Nathalie

    1996-01-01

    We are focusing on information access tasks characterized by large volume of hypermedia connected technical documents, a need for rapid and effective access to familiar information, and long-term interaction with evolving information. The problem for technical users is to build and maintain a personalized task-oriented model of the information to quickly access relevant information. We propose a solution which provides user-centered adaptive information retrieval and navigation. This solution supports users in customizing information access over time. It is complementary to information discovery methods which provide access to new information, since it lets users customize future access to previously found information. It relies on a technique, called Adaptive Relevance Network, which creates and maintains a complex indexing structure to represent personal user's information access maps organized by concepts. This technique is integrated within the Adaptive HyperMan system, which helps NASA Space Shuttle flight controllers organize and access large amounts of information. It allows users to select and mark any part of a document as interesting, and to index that part with user-defined concepts. Users can then do subsequent retrieval of marked portions of documents. This functionality allows users to define and access personal collections of information, which are dynamically computed. The system also supports collaborative review by letting users share group access maps. The adaptive relevance network provides long-term adaptation based both on usage and on explicit user input. The indexing structure is dynamic and evolves over time. Learning and generalization support flexible retrieval of information under similar concepts. The network is geared towards more recent information access, and automatically manages its size in order to maintain rapid access when scaling up to large hypermedia space. We present results of simulated learning experiments.

  1. Tank waste remediation system functions and requirements document

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carpenter, K.E.

    1996-10-03

    This is the Tank Waste Remediation System (TWRS) Functions and Requirements Document derived from the TWRS Technical Baseline. The document consists of several text sections that provide the purpose, scope, background information, and an explanation of how this document assists the application of Systems Engineering to the TWRS. The primary functions identified in the TWRS Functions and Requirements Document are identified in Figure 4.1 (Section 4.0). Currently, this document is part of the overall effort to develop the TWRS Functional Requirements Baseline, and contains the functions and requirements needed to properly define the top three TWRS function levels. TWRS Technical Baseline information (RDD-100 database) included in the appendices of the attached document contain the TWRS functions, requirements, and architecture necessary to define the TWRS Functional Requirements Baseline. Document organization and user directions are provided in the introductory text. This document will continue to be modified during the TWRS life-cycle.

  2. In silico design of context-responsive mammalian promoters with user-defined functionality

    PubMed Central

    Gibson, Suzanne J.; Hatton, Diane

    2017-01-01

    Abstract Comprehensive de novo-design of complex mammalian promoters is restricted by unpredictable combinatorial interactions between constituent transcription factor regulatory elements (TFREs). In this study, we show that modular binding sites that do not function cooperatively can be identified by analyzing host cell transcription factor expression profiles, and subsequently testing cognate TFRE activities in varying homotypic and heterotypic promoter architectures. TFREs that displayed position-insensitive, additive function within a specific expression context could be rationally combined together in silico to create promoters with highly predictable activities. As TFRE order and spacing did not affect the performance of these TFRE-combinations, compositions could be specifically arranged to preclude the formation of undesirable sequence features. This facilitated simple in silico-design of promoters with context-required, user-defined functionalities. To demonstrate this, we de novo-created promoters for biopharmaceutical production in CHO cells that exhibited precisely designed activity dynamics and long-term expression-stability, without causing observable retroactive effects on cellular performance. The design process described can be utilized for applications requiring context-responsive, customizable promoter function, particularly where co-expression of synthetic TFs is not suitable. Although the synthetic promoter structure utilized does not closely resemble native mammalian architectures, our findings also provide additional support for a flexible billboard model of promoter regulation. PMID:28977454
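
    The additive design rule reported here, in which position-insensitive TFREs combine predictably, can be illustrated with a toy calculation. The TFRE names and activity values below are hypothetical, not the paper's measurements.

```python
# Toy sketch of the additive in silico design rule: if TFREs act additively
# and position-insensitively in a given host-cell context, a promoter's
# predicted activity is the sum of per-copy TFRE activities.  Names and
# numbers are hypothetical.

tfre_activity = {"NFkB": 4.0, "CRE": 2.5, "AP1": 1.5}   # activity per copy

def predicted_activity(composition):
    """composition: mapping of TFRE name -> copy number in the promoter."""
    return sum(tfre_activity[t] * n for t, n in composition.items())

design = {"NFkB": 2, "CRE": 1}
activity = predicted_activity(design)   # 2*4.0 + 1*2.5 = 10.5
```

    Because order and spacing do not matter under this rule, compositions can be rearranged freely, for example to avoid unwanted sequence features, without changing the predicted activity.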

  3. GPS test range mission planning

    NASA Astrophysics Data System (ADS)

    Roberts, Iris P.; Hancock, Thomas P.

    The principal features of the Test Range User Mission Planner (TRUMP), a PC-resident tool designed to aid in deploying and utilizing GPS-based test range assets, are reviewed. TRUMP features time history plots of time-space-position information (TSPI); performance based on a dynamic GPS/inertial system simulation; time history plots of TSPI data link connectivity; digital terrain elevation data maps with user-defined cultural features; and two-dimensional coverage plots of ground-based test range assets. Some functions to be added during the next development phase are discussed.

  4. The ATOMFT integrator - Using Taylor series to solve ordinary differential equations

    NASA Technical Reports Server (NTRS)

    Berryman, Kenneth W.; Stanford, Richard H.; Breckheimer, Peter J.

    1988-01-01

    This paper discusses the application of ATOMFT, an integration package based on Taylor series solution with a sophisticated user interface. ATOMFT has the capabilities to allow the implementation of user-defined functions and the solution of stiff and algebraic equations. Detailed examples, including the solutions to several astrodynamics problems, are presented. Comparisons with its predecessor ATOMCC and other modern integrators indicate that ATOMFT is a fast, accurate, and easy-to-use method for solving many differential equation problems.
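
    The Taylor-series idea behind such integrators can be shown on the simplest possible example. This is an illustrative Python sketch, not ATOMFT's generated code: for y' = y the derivative recurrence is trivial, so one step sums the Taylor series about the current point.

```python
import math

# Sketch of a single Taylor-series integration step for y' = y, where each
# Taylor coefficient follows from the previous one (y^(k+1) = y^(k)).
# ATOMFT derives such recurrences automatically for general ODE systems.

def taylor_step(y, h, order=15):
    """Advance y' = y by step h using an order-N Taylor expansion."""
    term, total = y, y
    for k in range(1, order + 1):
        term *= h / k          # next Taylor term: y * h^k / k!
        total += term
    return total

y = taylor_step(1.0, 0.1)      # should match exp(0.1) to high accuracy
```

    High-order expansions like this allow large steps at tight tolerances, which is part of what makes Taylor-series integrators competitive.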

  5. Variational Trajectory Optimization Tool Set: Technical description and user's manual

    NASA Technical Reports Server (NTRS)

    Bless, Robert R.; Queen, Eric M.; Cavanaugh, Michael D.; Wetzel, Todd A.; Moerder, Daniel D.

    1993-01-01

    The algorithms that comprise the Variational Trajectory Optimization Tool Set (VTOTS) package are briefly described. The VTOTS is a software package for solving nonlinear constrained optimal control problems from a wide range of engineering and scientific disciplines. The VTOTS package was specifically designed to minimize the amount of user programming; in fact, for problems that may be expressed in terms of analytical functions, the user needs only to define the problem in terms of symbolic variables. This version of the VTOTS does not support tabular data; thus, problems must be expressed in terms of analytical functions. The VTOTS package consists of two methods for solving nonlinear optimal control problems: a time-domain finite-element algorithm and a multiple shooting algorithm. These two algorithms, under the VTOTS package, may be run independently or jointly. The finite-element algorithm generates approximate solutions, whereas the shooting algorithm provides a more accurate solution to the optimization problem. A user's manual, some examples with results, and a brief description of the individual subroutines are included.

  6. Rule-based graph theory to enable exploration of the space system architecture design space

    NASA Astrophysics Data System (ADS)

    Arney, Dale Curtis

    The primary goal of this research is to improve upon system architecture modeling in order to enable the exploration of design space options. A system architecture is the description of the functional and physical allocation of elements and the relationships, interactions, and interfaces between those elements necessary to satisfy a set of constraints and requirements. The functional allocation defines the functions that each system (element) performs, and the physical allocation defines the systems required to meet those functions. Trading the functionality between systems leads to the architecture-level design space that is available to the system architect. The research presents a methodology that enables the modeling of complex space system architectures using a mathematical framework. To accomplish the goal of improved architecture modeling, the framework meets five goals: technical credibility, adaptability, flexibility, intuitiveness, and exhaustiveness. The framework is technically credible, in that it produces an accurate and complete representation of the system architecture under consideration. The framework is adaptable, in that it provides the ability to create user-specified locations, steady states, and functions. The framework is flexible, in that it allows the user to model system architectures to multiple destinations without changing the underlying framework. The framework is intuitive for user input while still creating a comprehensive mathematical representation that maintains the necessary information to completely model complex system architectures. Finally, the framework is exhaustive, in that it provides the ability to explore the entire system architecture design space. An extensive search of the literature identified graph theory as a valuable mechanism for representing the flow of information or vehicles within a simple mathematical framework. 
Graph theory has been used in developing mathematical models of many transportation and network flow problems in the past, where nodes represent physical locations and edges represent the means by which information or vehicles travel between those locations. In space system architecting, expressing the physical locations (low-Earth orbit, low-lunar orbit, etc.) and steady states (interplanetary trajectory) as nodes and the different means of moving between the nodes (propulsive maneuvers, etc.) as edges formulates a mathematical representation of this design space. The selection of a given system architecture using graph theory entails defining the paths that the systems take through the space system architecture graph. A path through the graph is defined as a list of edges that are traversed, which in turn defines functions performed by the system. A structure to compactly represent this information is a matrix, called the system map, in which the column indices are associated with the systems that exist and row indices are associated with the edges, or functions, to which each system has access. Several contributions have been added to the state of the art in space system architecture analysis. The framework adds the capability to rapidly explore the design space without the need to limit trade options or the need for user interaction during the exploration process. The unique mathematical representation of a system architecture, through the use of the adjacency, incidence, and system map matrices, enables automated design space exploration using stochastic optimization processes. The innovative rule-based graph traversal algorithm ensures functional feasibility of each system architecture that is analyzed, and the automatic generation of the system hierarchy eliminates the need for the user to manually determine the relationships between systems during or before the design space exploration process. 
Finally, the rapid evaluation of system architectures for various mission types enables analysis of the system architecture design space for multiple destinations within an evolutionary exploration program. (Abstract shortened by UMI.).
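The graph representation described above (nodes as orbits or steady states, edges as maneuvers, and a "system map" matrix relating systems to the edges they can traverse) can be sketched in a few lines. This is a minimal illustration of the data structures the abstract names; all node, edge, and system names here are invented for the example, not taken from the dissertation.

```python
# Illustrative sketch: a space-architecture graph and system map.
# Nodes = locations/steady states; edges = functions (maneuvers).
nodes = ["Earth", "LEO", "TLI", "LLO", "LunarSurface"]
edges = [
    ("Earth", "LEO"),         # launch
    ("LEO", "TLI"),           # trans-lunar injection
    ("TLI", "LLO"),           # lunar orbit insertion
    ("LLO", "LunarSurface"),  # descent/landing
]

# Adjacency matrix: entry [i][j] = 1 if an edge connects node i to node j.
idx = {name: i for i, name in enumerate(nodes)}
n = len(nodes)
adjacency = [[0] * n for _ in range(n)]
for src, dst in edges:
    adjacency[idx[src]][idx[dst]] = 1

# System map: rows = edges (functions), columns = systems; a 1 means that
# system can perform that function. Here a launch vehicle covers edge 0 and
# a single lander stage covers the remaining edges.
systems = ["LaunchVehicle", "LanderStage"]
system_map = [
    [1, 0],
    [0, 1],
    [0, 1],
    [0, 1],
]

# An architecture choice is a path: an ordered list of edge indices. A path
# is functionally feasible only if every edge is covered by some system.
path = [0, 1, 2, 3]
feasible = all(any(row) for row in (system_map[e] for e in path))
print(feasible)  # True
```

A rule-based traversal, as in the dissertation, would additionally constrain which edges a given system may take in sequence; this sketch only checks coverage.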

  7. Abnormal cerebellar morphometry in abstinent adolescent marijuana users

    PubMed Central

    Medina, Krista Lisdahl; Nagel, Bonnie J.; Tapert, Susan F.

    2010-01-01

    Background: Functional neuroimaging data from adults have, in general, found frontocerebellar dysfunction associated with acute and chronic marijuana (MJ) use (Loeber & Yurgelun-Todd, 1999). One structural neuroimaging study found reduced cerebellar vermis volume in young adult MJ users with a history of heavy polysubstance use (Aasly et al., 1993). The goal of this study was to characterize cerebellar volume in adolescent chronic MJ users following one month of monitored abstinence. Method: Participants were MJ users (n=16) and controls (n=16) aged 16-18 years. Extensive exclusionary criteria included history of psychiatric or neurologic disorders. Drug use history, neuropsychological data, and structural brain scans were collected after 28 days of monitored abstinence. Trained research staff defined cerebellar volumes (including three cerebellar vermis lobes and both cerebellar hemispheres) on high-resolution T1-weighted magnetic resonance images. Results: Adolescent MJ users demonstrated significantly larger inferior posterior (lobules VIII-X) vermis volume (p<.009) than controls, above and beyond effects of lifetime alcohol and other drug use, gender, and intracranial volume. Larger vermis volumes were associated with poorer executive functioning (p’s<.05). Conclusions: Following one month of abstinence, adolescent MJ users had significantly larger posterior cerebellar vermis volumes than non-using controls. These greater volumes are suggested to be pathological based on linkage to poorer executive functioning. Longitudinal studies are needed to examine typical cerebellar development during adolescence and the influence of marijuana use. PMID:20413277

  8. PubstractHelper: A Web-based Text-Mining Tool for Marking Sentences in Abstracts from PubMed Using Multiple User-Defined Keywords.

    PubMed

    Chen, Chou-Cheng; Ho, Chung-Liang

    2014-01-01

    While a huge amount of information about biological literature can be obtained by searching the PubMed database, reading through all the titles and abstracts resulting from such a search for useful information is inefficient. Text mining makes it possible to increase this efficiency. Some websites use text mining to gather information from the PubMed database; however, they are database-oriented, using pre-defined search keywords while lacking a query interface for user-defined search inputs. We present the PubMed Abstract Reading Helper (PubstractHelper) website which combines text mining and reading assistance for an efficient PubMed search. PubstractHelper can accept a maximum of ten groups of keywords, within each group containing up to ten keywords. The principle behind the text-mining function of PubstractHelper is that keywords contained in the same sentence are likely to be related. PubstractHelper highlights sentences with co-occurring keywords in different colors. The user can download the PMID and the abstracts with color markings to be reviewed later. The PubstractHelper website can help users to identify relevant publications based on the presence of related keywords, which should be a handy tool for their research. http://bio.yungyun.com.tw/ATM/PubstractHelper.aspx and http://holab.med.ncku.edu.tw/ATM/PubstractHelper.aspx.
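The co-occurrence principle behind PubstractHelper (keywords contained in the same sentence are likely to be related) is easy to demonstrate. The sketch below is an assumed, simplified version of that idea, not the site's actual implementation: it flags sentences that contain keywords from at least two user-defined groups.

```python
import re

def flag_sentences(abstract, keyword_groups):
    """Return sentences containing keywords from two or more groups
    (a simplified stand-in for PubstractHelper's highlighting rule)."""
    sentences = re.split(r"(?<=[.!?])\s+", abstract)
    flagged = []
    for sentence in sentences:
        lowered = sentence.lower()
        # Count how many keyword groups have at least one hit here.
        hits = sum(
            any(kw.lower() in lowered for kw in group)
            for group in keyword_groups
        )
        if hits >= 2:
            flagged.append(sentence)
    return flagged

groups = [["p53", "tp53"], ["apoptosis"]]
text = ("TP53 mutations are common in cancer. "
        "Loss of p53 impairs apoptosis in tumour cells.")
print(flag_sentences(text, groups))
# Only the second sentence mentions keywords from both groups.
```

The real tool additionally supports up to ten groups of ten keywords each and color-codes matches for later review.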

  9. Visual gene developer: a fully programmable bioinformatics software for synthetic gene optimization.

    PubMed

    Jung, Sang-Kyu; McDonald, Karen

    2011-08-16

    Direct gene synthesis is becoming more popular owing to decreases in gene synthesis pricing. Compared with using natural genes, gene synthesis provides a good opportunity to optimize gene sequence for specific applications. In order to facilitate gene optimization, we have developed a stand-alone software called Visual Gene Developer. The software not only provides general functions for gene analysis and optimization along with an interactive user-friendly interface, but also includes unique features such as programming capability, dedicated mRNA secondary structure prediction, artificial neural network modeling, network & multi-threaded computing, and user-accessible programming modules. The software allows a user to analyze and optimize a sequence using main menu functions or specialized module windows. Alternatively, gene optimization can be initiated by designing a gene construct and configuring an optimization strategy. A user can choose several predefined or user-defined algorithms to design a complicated strategy. The software provides expandable functionality as platform software supporting module development using popular script languages such as VBScript and JScript in the software programming environment. Visual Gene Developer is useful for both researchers who want to quickly analyze and optimize genes, and those who are interested in developing and testing new algorithms in bioinformatics. The software is available for free download at http://www.visualgenedeveloper.net.

  10. Visual gene developer: a fully programmable bioinformatics software for synthetic gene optimization

    PubMed Central

    2011-01-01

    Background Direct gene synthesis is becoming more popular owing to decreases in gene synthesis pricing. Compared with using natural genes, gene synthesis provides a good opportunity to optimize gene sequence for specific applications. In order to facilitate gene optimization, we have developed a stand-alone software called Visual Gene Developer. Results The software not only provides general functions for gene analysis and optimization along with an interactive user-friendly interface, but also includes unique features such as programming capability, dedicated mRNA secondary structure prediction, artificial neural network modeling, network & multi-threaded computing, and user-accessible programming modules. The software allows a user to analyze and optimize a sequence using main menu functions or specialized module windows. Alternatively, gene optimization can be initiated by designing a gene construct and configuring an optimization strategy. A user can choose several predefined or user-defined algorithms to design a complicated strategy. The software provides expandable functionality as platform software supporting module development using popular script languages such as VBScript and JScript in the software programming environment. Conclusion Visual Gene Developer is useful for both researchers who want to quickly analyze and optimize genes, and those who are interested in developing and testing new algorithms in bioinformatics. The software is available for free download at http://www.visualgenedeveloper.net. PMID:21846353

  11. Mathematics Programming on the Apple II and IBM PC.

    ERIC Educational Resources Information Center

    Myers, Roy E.; Schneider, David I.

    1987-01-01

    Details the features of BASIC used in mathematics programming and provides the information needed to translate between the Apple II and IBM PC computers. Discusses inputting a user-defined function, setting scroll windows, displaying subscripts and exponents, variable names, mathematical characters and special symbols. (TW)

  12. Creating a user friendly GIS tool to define functional process zones

    EPA Science Inventory

    The goal of this research is to develop methods and indicators that are useful for evaluating the condition of aquatic communities, for assessing the restoration of aquatic communities in response to mitigation and best management practices, and for determining the exposure of aq...

  13. MyDas, an Extensible Java DAS Server

    PubMed Central

    Jimenez, Rafael C.; Quinn, Antony F.; Jenkinson, Andrew M.; Mulder, Nicola; Martin, Maria; Hunter, Sarah; Hermjakob, Henning

    2012-01-01

    A large number of diverse, complex, and distributed data resources are currently available in the Bioinformatics domain. The pace of discovery and the diversity of information means that centralised reference databases like UniProt and Ensembl cannot integrate all potentially relevant information sources. From a user perspective however, centralised access to all relevant information concerning a specific query is essential. The Distributed Annotation System (DAS) defines a communication protocol to exchange annotations on genomic and protein sequences; this standardisation enables clients to retrieve data from a myriad of sources, thus offering centralised access to end-users. We introduce MyDas, a web server that facilitates the publishing of biological annotations according to the DAS specification. It deals with the common functionality requirements of making data available, while also providing an extension mechanism in order to implement the specifics of data store interaction. MyDas allows the user to define where the required information is located along with its structure, and is then responsible for the communication protocol details. PMID:23028496

  14. MyDas, an extensible Java DAS server.

    PubMed

    Salazar, Gustavo A; García, Leyla J; Jones, Philip; Jimenez, Rafael C; Quinn, Antony F; Jenkinson, Andrew M; Mulder, Nicola; Martin, Maria; Hunter, Sarah; Hermjakob, Henning

    2012-01-01

    A large number of diverse, complex, and distributed data resources are currently available in the Bioinformatics domain. The pace of discovery and the diversity of information means that centralised reference databases like UniProt and Ensembl cannot integrate all potentially relevant information sources. From a user perspective however, centralised access to all relevant information concerning a specific query is essential. The Distributed Annotation System (DAS) defines a communication protocol to exchange annotations on genomic and protein sequences; this standardisation enables clients to retrieve data from a myriad of sources, thus offering centralised access to end-users. We introduce MyDas, a web server that facilitates the publishing of biological annotations according to the DAS specification. It deals with the common functionality requirements of making data available, while also providing an extension mechanism in order to implement the specifics of data store interaction. MyDas allows the user to define where the required information is located along with its structure, and is then responsible for the communication protocol details.

  15. An analysis of electronic document management in oncology care.

    PubMed

    Poulter, Thomas; Gannon, Brian; Bath, Peter A

    2012-06-01

    In this research in progress, a reference model for the use of electronic patient record (EPR) systems in oncology is described. The model, termed CICERO, comprises technical and functional components, and emphasises usability, clinical safety and user acceptance. One of the functional components of the model-an electronic document and records management (EDRM) system-is monitored in the course of its deployment at a leading oncology centre in the UK. Specifically, the user requirements and design of the EDRM solution are described. The study is interpretative and forms part of a wider research programme to define and validate the CICERO model. Preliminary conclusions confirm the importance of a socio-technical perspective in Onco-EPR system design.

  16. Design and implementation of visual-haptic assistive control system for virtual rehabilitation exercise and teleoperation manipulation.

    PubMed

    Veras, Eduardo J; De Laurentis, Kathryn J; Dubey, Rajiv

    2008-01-01

    This paper describes the design and implementation of a control system that integrates visual and haptic information to give assistive force feedback through a haptic controller (Omni Phantom) to the user. A sensor-based assistive function and velocity scaling program provides force feedback that helps the user complete trajectory following exercises for rehabilitation purposes. The system also incorporates a PUMA robot for teleoperation; a camera and a laser range finder, controlled in real time by a PC, were integrated to help the user define the intended path to the selected target. The real-time force feedback from the remote robot to the haptic controller is made possible by using effective multithreading programming strategies in the control system design and by novel sensor integration. The sensor-based assistant function concept applied to teleoperation as well as shared control enhances the motion range and manipulation capabilities of the users executing rehabilitation exercises such as trajectory following along a sensor-based defined path. The system is modularly designed to allow for integration of different master devices and sensors. Furthermore, because this real-time system is versatile, the haptic component can be used separately from the telerobotic component; in other words, one can use the haptic device for rehabilitation purposes for cases in which assistance is needed to perform tasks (e.g., stroke rehab) and also for teleoperation with force feedback and sensor assistance in either supervisory or automatic modes.

  17. SIG. Signal Processing, Analysis, & Display

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hernandez, J.; Lager, D.; Azevedo, S.

    1992-01-22

    SIG is a general-purpose signal processing, analysis, and display program. Its main purpose is to perform manipulations on time- and frequency-domain signals. However, it has been designed to ultimately accommodate other representations for data such as multiplexed signals and complex matrices. Two user interfaces are provided in SIG: a menu mode for the unfamiliar user and a command mode for more experienced users. In both modes errors are detected as early as possible and are indicated by friendly, meaningful messages. An on-line HELP package is also included. A variety of operations can be performed on time- and frequency-domain signals including operations on the samples of a signal, operations on the entire signal, and operations on two or more signals. Signal processing operations that can be performed are digital filtering (median, Bessel, Butterworth, and Chebychev), ensemble average, resample, auto and cross spectral density, transfer function and impulse response, trend removal, convolution, Fourier transform and inverse window functions (Hamming, Kaiser-Bessel), simulation (ramp, sine, pulsetrain, random), and read/write signals. User definable signal processing algorithms are also featured. SIG has many options including multiple commands per line, command files with arguments, commenting lines, defining commands, and automatic execution for each item in a `repeat` sequence. Graphical operations on signals and spectra include: x-y plots of time signals; real, imaginary, magnitude, and phase plots of spectra; scaling of spectra for continuous or discrete domain; cursor zoom; families of curves; and multiple viewports.
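One of the operations the SIG abstract lists, median filtering, is simple enough to sketch with the standard library. This is the textbook sliding-window algorithm, not SIG's actual code; the window size and example signal are illustrative.

```python
from statistics import median

def median_filter(signal, window=3):
    """Replace each sample with the median of its surrounding window.
    At the edges the window is truncated to the available samples."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        out.append(median(signal[lo:hi]))
    return out

# A single-sample spike is suppressed while the flat trend is preserved,
# which is why median filtering is a standard despiking tool.
print(median_filter([1, 1, 9, 1, 1]))  # [1, 1, 1, 1, 1]
```

A linear filter (e.g. a moving average) would smear the spike into its neighbours instead of removing it, which is the usual motivation for choosing a median filter for impulsive noise.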

  18. SIG. Signal Processing, Analysis, & Display

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hernandez, J.; Lager, D.; Azevedo, S.

    1992-01-22

    SIG is a general-purpose signal processing, analysis, and display program. Its main purpose is to perform manipulations on time- and frequency-domain signals. However, it has been designed to ultimately accommodate other representations for data such as multiplexed signals and complex matrices. Two user interfaces are provided in SIG - a menu mode for the unfamiliar user and a command mode for more experienced users. In both modes errors are detected as early as possible and are indicated by friendly, meaningful messages. An on-line HELP package is also included. A variety of operations can be performed on time- and frequency-domain signals including operations on the samples of a signal, operations on the entire signal, and operations on two or more signals. Signal processing operations that can be performed are digital filtering (median, Bessel, Butterworth, and Chebychev), ensemble average, resample, auto and cross spectral density, transfer function and impulse response, trend removal, convolution, Fourier transform and inverse window functions (Hamming, Kaiser-Bessel), simulation (ramp, sine, pulsetrain, random), and read/write signals. User definable signal processing algorithms are also featured. SIG has many options including multiple commands per line, command files with arguments, commenting lines, defining commands, and automatic execution for each item in a repeat sequence. Graphical operations on signals and spectra include: x-y plots of time signals; real, imaginary, magnitude, and phase plots of spectra; scaling of spectra for continuous or discrete domain; cursor zoom; families of curves; and multiple viewports.

  19. Signal Processing, Analysis, & Display

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lager, Darrell; Azevado, Stephen

    1986-06-01

    SIG is a general-purpose signal processing, analysis, and display program. Its main purpose is to perform manipulations on time- and frequency-domain signals. However, it has been designed to ultimately accommodate other representations for data such as multiplexed signals and complex matrices. Two user interfaces are provided in SIG - a menu mode for the unfamiliar user and a command mode for more experienced users. In both modes errors are detected as early as possible and are indicated by friendly, meaningful messages. An on-line HELP package is also included. A variety of operations can be performed on time- and frequency-domain signals including operations on the samples of a signal, operations on the entire signal, and operations on two or more signals. Signal processing operations that can be performed are digital filtering (median, Bessel, Butterworth, and Chebychev), ensemble average, resample, auto and cross spectral density, transfer function and impulse response, trend removal, convolution, Fourier transform and inverse window functions (Hamming, Kaiser-Bessel), simulation (ramp, sine, pulsetrain, random), and read/write signals. User definable signal processing algorithms are also featured. SIG has many options including multiple commands per line, command files with arguments, commenting lines, defining commands, and automatic execution for each item in a repeat sequence. Graphical operations on signals and spectra include: x-y plots of time signals; real, imaginary, magnitude, and phase plots of spectra; scaling of spectra for continuous or discrete domain; cursor zoom; families of curves; and multiple viewports.

  1. Parametric Cognitive Modeling of Information and Computer Technology Usage by People with Aging- and Disability-Derived Functional Impairments

    PubMed Central

    García-Betances, Rebeca I.; Cabrera-Umpiérrez, María Fernanda; Ottaviano, Manuel; Pastorino, Matteo; Arredondo, María T.

    2016-01-01

    Despite the rapid evolution of Information and Computer Technology (ICT), and the growing recognition of the importance of the concept of universal design in all domains of daily living, mainstream ICT-based product designers and developers still work without any truly structured tools, guidance or support to effectively adapt their products and services to users’ real needs. This paper presents the approach used to define and evaluate parametric cognitive models that describe interaction and usage of ICT by people with aging- and disability-derived functional impairments. A multisensorial training platform was used to train, based on real user measurements in real conditions, the virtual parameterized user models that act as subjects of the test-bed during all stages of simulated disabilities-friendly ICT-based products design. An analytical study was carried out to identify the relevant cognitive functions involved, together with their corresponding parameters as related to aging- and disability-derived functional impairments. Evaluation of the final cognitive virtual user models in a real application has confirmed that the use of these models produces concrete, valuable benefits to the design and testing process of accessible ICT-based applications and services. Parameterization of cognitive virtual user models allows incorporating cognitive and perceptual aspects during the design process. PMID:26907296

  2. A study of diverse clinical decision support rule authoring environments and requirements for integration

    PubMed Central

    2012-01-01

    Background Efficient rule authoring tools are critical to allow clinical Knowledge Engineers (KEs), Software Engineers (SEs), and Subject Matter Experts (SMEs) to convert medical knowledge into machine executable clinical decision support rules. The goal of this analysis was to identify the critical success factors and challenges of a fully functioning Rule Authoring Environment (RAE) in order to define requirements for a scalable, comprehensive tool to manage enterprise level rules. Methods The authors evaluated RAEs in active use across Partners Healthcare, including enterprise wide, ambulatory only, and system specific tools, with a focus on rule editors for reminder and medication rules. We conducted meetings with users of these RAEs to discuss their general experience and perceived advantages and limitations of these tools. Results While the overall rule authoring process is similar across the 10 separate RAEs, the system capabilities and architecture vary widely. Most current RAEs limit the ability of the clinical decision support (CDS) interventions to be standardized, sharable, interoperable, and extensible. No existing system meets all requirements defined by knowledge management users. Conclusions A successful, scalable, integrated rule authoring environment will need to support a number of key requirements and functions in the areas of knowledge representation, metadata, terminology, authoring collaboration, user interface, integration with electronic health record (EHR) systems, testing, and reporting. PMID:23145874

  3. Optimization of Residual Stresses in MMC's through Process Parameter Control and the use of Heterogeneous Compensating/Compliant Interfacial Layers. OPTCOMP2 User's Guide

    NASA Technical Reports Server (NTRS)

    Pindera, Marek-Jerzy; Salzar, Robert S.

    1996-01-01

    A user's guide for the computer program OPTCOMP2 is presented in this report. This program provides a capability to optimize the fabrication or service-induced residual stresses in unidirectional metal matrix composites subjected to combined thermomechanical axisymmetric loading by altering the processing history, as well as through the microstructural design of interfacial fiber coatings. The user specifies the initial architecture of the composite and the load history, with the constituent materials being elastic, plastic, viscoplastic, or as defined by the 'user-defined' constitutive model, in addition to the objective function and constraints, through a user-friendly data input interface. The optimization procedure is based on an efficient solution methodology for the inelastic response of a fiber/interface layer(s)/matrix concentric cylinder model where the interface layers can be either homogeneous or heterogeneous. The response of heterogeneous layers is modeled using Aboudi's three-dimensional method of cells micromechanics model. The commercial optimization package DOT is used for the nonlinear optimization problem. The solution methodology for the arbitrarily layered cylinder is based on the local-global stiffness matrix formulation and Mendelson's iterative technique of successive elastic solutions developed for elastoplastic boundary-value problems. The optimization algorithm employed in DOT is based on the method of feasible directions.

  4. SIG: a general-purpose signal processing program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lager, D.; Azevedo, S.

    1986-02-01

    SIG is a general-purpose signal processing, analysis, and display program. Its main purpose is to perform manipulations on time- and frequency-domain signals. It also accommodates other representations for data such as transfer function polynomials. Signal processing operations include digital filtering, auto/cross spectral density, transfer function/impulse response, convolution, Fourier transform, and inverse Fourier transform. Graphical operations provide display of signals and spectra, including plotting, cursor zoom, families of curves, and multiple viewport plots. SIG provides two user interfaces with a menu mode for occasional users and a command mode for more experienced users. Capability exists for multiple commands per line, command files with arguments, commenting lines, defining commands, automatic execution for each item in a repeat sequence, etc. SIG is presently available for VAX(VMS), VAX (BERKELEY 4.2 UNIX), SUN (BERKELEY 4.2 UNIX), DEC-20 (TOPS-20), LSI-11/23 (TSX), and DEC PRO 350 (TSX). 4 refs., 2 figs.

  5. Biopathways representation and simulation on hybrid functional petri net.

    PubMed

    Matsuno, Hiroshi; Tanaka, Yukiko; Aoshima, Hitoshi; Doi, Atsushi; Matsui, Mika; Miyano, Satoru

    2011-01-01

    The following two matters should be resolved in order for biosimulation tools to be accepted by users in biology/medicine: (1) remove issues which are irrelevant to biological importance, and (2) allow users to represent biopathways intuitively and understand/manage easily the details of representation and simulation mechanism. From these criteria, we firstly define a novel notion of Petri net called Hybrid Functional Petri Net (HFPN). Then, we introduce a software tool, Genomic Object Net, for representing and simulating biopathways, which we have developed by employing the architecture of HFPN. In order to show the usefulness of Genomic Object Net for representing and simulating biopathways, we show two HFPN representations of gene regulation mechanisms of Drosophila melanogaster (fruit fly) circadian rhythm and apoptosis induced by Fas ligand. The simulation results of these biopathways are also correlated with biological observations. The software is available to academic users from http://www.GenomicObject.Net/.
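The discrete half of a Hybrid Functional Petri Net follows ordinary Petri-net firing semantics, which can be shown in a few lines. This sketch is heavily simplified (discrete places only, unit arc weights) and its biological names are invented for illustration; HFPN additionally supports continuous places and transitions whose firing rates are functions of place values.

```python
# Minimal discrete Petri net: places hold token counts; a transition fires
# by consuming one token from each input place and producing one token in
# each output place.
places = {"mRNA": 1, "ribosome": 1, "protein": 0}
transitions = {
    # Translation consumes mRNA + ribosome and returns both, plus a protein.
    "translate": {
        "in": ["mRNA", "ribosome"],
        "out": ["protein", "mRNA", "ribosome"],
    },
}

def enabled(name):
    """A transition is enabled when every input place holds a token."""
    return all(places[p] >= 1 for p in transitions[name]["in"])

def fire(name):
    assert enabled(name), f"{name} is not enabled"
    for p in transitions[name]["in"]:
        places[p] -= 1
    for p in transitions[name]["out"]:
        places[p] += 1

fire("translate")
print(places)  # {'mRNA': 1, 'ribosome': 1, 'protein': 1}
```

In an HFPN, a continuous transition would instead change place values by a rate each time step, which is what lets the formalism model concentration dynamics such as circadian oscillations.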

  6. Evaluation of expert systems - An approach and case study. [of determining software functional requirements for command management of satellites

    NASA Technical Reports Server (NTRS)

    Liebowitz, J.

    1985-01-01

    Techniques that were applied in defining an expert system prototype for first-cut evaluations of the software functional requirements of NASA satellite command management activities are described. The prototype was developed using the Knowledge Engineering System. Criteria were selected for evaluating the satellite software before defining the expert system prototype. Application of the prototype system is illustrated in terms of the evaluation procedures used with the COBE satellite to be launched in 1988. The limited number of options which can be considered by the program mandates that biases in the system output must be well understood by the users.

  7. 77 FR 9679 - Agency Information Collection Activities: Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-17

    .... This tool provides descriptive information about intense users of services, defined as all individuals... surveys are completed by and collected from a sample of service recipients, not every recipient. A time...-functioning, types of exposure, and event reactions. CCP Service Provider Feedback. These surveys are...

  8. circlncRNAnet: an integrated web-based resource for mapping functional networks of long or circular forms of noncoding RNAs.

    PubMed

    Wu, Shao-Min; Liu, Hsuan; Huang, Po-Jung; Chang, Ian Yi-Feng; Lee, Chi-Ching; Yang, Chia-Yu; Tsai, Wen-Sy; Tan, Bertrand Chin-Ming

    2018-01-01

    Despite their lack of protein-coding potential, long noncoding RNAs (lncRNAs) and circular RNAs (circRNAs) have emerged as key determinants in gene regulation, acting to fine-tune transcriptional and signaling output. These noncoding RNA transcripts are known to affect expression of messenger RNAs (mRNAs) via epigenetic and post-transcriptional regulation. Given their widespread target spectrum, as well as extensive modes of action, a complete understanding of their biological relevance will depend on integrative analyses of systems data at various levels. While a handful of publicly available databases have been reported, existing tools do not fully capture, from a network perspective, the functional implications of lncRNAs or circRNAs of interest. Through an integrated and streamlined design, circlncRNAnet aims to broaden the understanding of ncRNA candidates by testing in silico several hypotheses of ncRNA-based functions, on the basis of large-scale RNA-seq data. This web server is implemented with several features that represent advances in the bioinformatics of ncRNAs: (1) a flexible framework that accepts and processes user-defined next-generation sequencing-based expression data; (2) multiple analytic modules that assign and productively assess the regulatory networks of user-selected ncRNAs by cross-referencing extensively curated databases; (3) an all-purpose, information-rich workflow design that is tailored to all types of ncRNAs. Outputs on expression profiles, co-expression networks and pathways, and molecular interactomes are dynamically and interactively displayed according to user-defined criteria. In short, users may apply circlncRNAnet to obtain, in real time, multiple lines of functionally relevant information on circRNAs/lncRNAs of interest. In summary, circlncRNAnet provides a "one-stop" resource for in-depth analyses of ncRNA biology. circlncRNAnet is freely available at http://app.cgu.edu.tw/circlnc/. © The Authors 2017. Published by Oxford University Press.

  9. TOAD Editor

    NASA Technical Reports Server (NTRS)

    Bingle, Bradford D.; Shea, Anne L.; Hofler, Alicia S.

    1993-01-01

    Transferable Output ASCII Data (TOAD) computer program (LAR-13755) implements format designed to facilitate transfer of data across communication networks and dissimilar host computer systems. Any data file conforming to TOAD format standard called TOAD file. TOAD Editor is interactive software tool for manipulating contents of TOAD files. Commonly used to extract filtered subsets of data for visualization of results of computation. Also offers such user-oriented features as on-line help, clear English error messages, startup file, macroinstructions defined by user, command history, user variables, UNDO feature, and full complement of mathematical, statistical, and conversion functions. Companion program, TOAD Gateway (LAR-14484), converts data files from variety of other file formats to that of TOAD. TOAD Editor written in FORTRAN 77.

  10. Version 1.00 programmer's tools used in constructing the INEL RML/analytical radiochemistry sample tracking database and its user interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Femec, D.A.

    This report describes two code-generating tools used to speed design and implementation of relational databases and user interfaces: CREATE-SCHEMA and BUILD-SCREEN. CREATE-SCHEMA produces the SQL commands that actually create and define the database. BUILD-SCREEN takes templates for data entry screens and generates the screen management system routine calls to display the desired screen. Both tools also generate the related FORTRAN declaration statements and precompiled SQL calls. Included with this report is the source code for a number of FORTRAN routines and functions used by the user interface. This code is broadly applicable to a number of different databases.
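    The idea behind a tool like CREATE-SCHEMA can be sketched as a small code generator that turns a table description into the SQL commands that create and define the database. The description format, table, and column names below are illustrative only, not the actual INEL tool's input (which also emitted FORTRAN declarations and precompiled SQL calls):

```python
# Sketch of a CREATE-SCHEMA-style generator: emit the SQL "CREATE"
# commands from a simple in-memory table description.
# Table and column names are hypothetical examples.

def create_schema(table, columns):
    """Emit a CREATE TABLE statement from (name, sql_type) pairs."""
    cols = ",\n  ".join(f"{name} {sqltype}" for name, sqltype in columns)
    return f"CREATE TABLE {table} (\n  {cols}\n);"

sql = create_schema("sample", [("sample_id", "INTEGER PRIMARY KEY"),
                               ("analyte", "VARCHAR(32)"),
                               ("activity_bq", "REAL")])
print(sql)
```

    Generating the DDL from one description keeps the database definition and the client-side declarations in sync, which is the stated point of the original tools.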

  11. Image Segmentation for Improvised Explosive Devices

    DTIC Science & Technology

    2012-12-01

    …us to generate color models for IEDs without user input that labels parts of the IED. … All graph cut algorithms we analyze define the undirected network G(V, E) as a set of nodes V, edges E, and capacities C: E → R. … In the algorithms we study, this objective function is the sum of the two functions U and V, where the function U is a region property which evaluates the…

  12. Modeling of a 3DTV service in the software-defined networking architecture

    NASA Astrophysics Data System (ADS)

    Wilczewski, Grzegorz

    2014-11-01

    In this article, a newly developed concept for modeling a multimedia service offering stereoscopic motion imagery is presented. The proposed model is based on the Software-Defined Networking (SDN) architecture. The definition of a 3D television service spanning the SDN concept is identified, exposing the basic characteristics of a 3DTV service in a modern networking organization layout. Furthermore, exemplary functionalities of the proposed 3DTV model are depicted. It is indicated that modeling a 3DTV service in the Software-Defined Networking architecture leads to numerous improvements, especially in the flexibility of a service supporting heterogeneous end-user devices.

  13. Enabling complex queries to drug information sources through functional composition.

    PubMed

    Peters, Lee; Mortensen, Jonathan; Nguyen, Thang; Bodenreider, Olivier

    2013-01-01

    Our objective was to enable an end user to create complex queries to drug information sources through functional composition, by creating sequences of functions from the application programming interfaces (APIs) of drug terminologies. The development of a functional composition model seeks to link functions from two distinct APIs. An ontology was developed using Protégé to model the functions of the RxNorm and NDF-RT APIs by describing the semantics of their input and output. A set of rules was developed to define the interoperability conditions for functional composition. The operational definition of interoperability between function pairs is established by executing the rules on the ontology. We illustrate that the functional composition model supports common use cases, including checking interactions for RxNorm drugs and deploying allergy lists defined in reference to drug properties in NDF-RT. This model supports the RxMix application (http://mor.nlm.nih.gov/RxMix/), an application we developed for enabling complex queries to the RxNorm and NDF-RT APIs.
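    The core interoperability rule can be sketched very simply: two API functions are composable when the semantic type of one's output matches the other's input. The function names and type labels below are illustrative, not the actual RxNorm/NDF-RT API signatures:

```python
# Sketch of the functional-composition rule: f can feed g when
# f's output type equals g's input type. The registry entries are
# hypothetical stand-ins for API function descriptions.

FUNCTIONS = {
    "findRxcuiByName":   {"in": "drug_name", "out": "rxcui"},
    "getInteractions":   {"in": "rxcui",     "out": "interaction_list"},
    "getDrugProperties": {"in": "ndfrt_id",  "out": "property_list"},
}

def composable(f, g):
    """True when f's output can serve as g's input."""
    return FUNCTIONS[f]["out"] == FUNCTIONS[g]["in"]

print(composable("findRxcuiByName", "getInteractions"))    # True
print(composable("findRxcuiByName", "getDrugProperties"))  # False
```

    In the real model the matching is done over an ontology rather than flat string labels, which lets subtype relations also license a composition.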

  14. GeneTools--application for functional annotation and statistical hypothesis testing.

    PubMed

    Beisvag, Vidar; Jünge, Frode K R; Bergum, Hallgeir; Jølsum, Lars; Lydersen, Stian; Günther, Clara-Cecilie; Ramampiaro, Heri; Langaas, Mette; Sandvik, Arne K; Laegreid, Astrid

    2006-10-24

    Modern biology has shifted from "one gene" approaches to methods for genomic-scale analysis like microarray technology, which allow simultaneous measurement of thousands of genes. This has created a need for tools facilitating interpretation of biological data in "batch" mode. However, such tools often leave the investigator with large volumes of apparently unorganized information. To meet this interpretation challenge, gene-set (or cluster) testing has become a popular analytical tool. Many gene-set testing methods and software packages are now available, most of which use a variety of statistical tests to assess the genes in a set for biological information. However, the field is still evolving, and there is a great need for "integrated" solutions. GeneTools is a web service providing access to a database that brings together information from a broad range of resources. The annotation data are updated weekly, guaranteeing that users get the most recently available data. Data submitted by the user are stored in the database, where they can easily be updated, shared between users and exported in various formats. GeneTools provides three different tools: i) the NMC Annotation Tool, which offers annotations from several databases like UniGene, Entrez Gene, SwissProt and GeneOntology, in both single- and batch-search mode; ii) the GO Annotator Tool, where users can add new gene ontology (GO) annotations to genes of interest; these user-defined GO annotations can be used in further analysis or exported for public distribution; iii) eGOn, a tool for visualization and statistical hypothesis testing of GO category representation. As the first GO tool to do so, eGOn supports hypothesis testing for three different situations (master-target situation, mutually exclusive target-target situation and intersecting target-target situation). An important additional function is an evidence-code filter that allows users to select the GO annotations for the analysis.
    GeneTools is the first "all in one" annotation tool, providing users with rapid extraction of highly relevant gene annotation data for, e.g., thousands of genes or clones at once. It allows a user to define and archive new GO annotations, and it supports hypothesis testing related to GO category representations. GeneTools is freely available through www.genetools.no.
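    The kind of hypothesis test behind GO-category tools of this sort is often a one-sided hypergeometric (Fisher-type) test: is a category over-represented in the target gene list relative to the master list? This sketch is hedged — eGOn's actual statistics may differ, and the counts below are invented:

```python
from math import comb

# One-sided hypergeometric test for GO-category over-representation:
# probability of seeing k or more category genes in a target list of n,
# drawn from N genes of which K belong to the category.
# All numbers in the example are made up for illustration.

def hypergeom_pvalue(N, K, n, k):
    """P(X >= k) for X ~ Hypergeometric(N, K, n)."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / comb(N, n)

# 1000 genes total, 50 annotated to the category;
# a target list of 20 genes contains 5 of them.
p = hypergeom_pvalue(1000, 50, 20, 5)
print(f"p = {p:.4g}")
```

    The expected count here is 20 × 50/1000 = 1 gene, so observing 5 gives a small p-value, flagging the category as enriched.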

  15. Large Scale Landslide Database System Established for the Reservoirs in Southern Taiwan

    NASA Astrophysics Data System (ADS)

    Tsai, Tsai-Tsung; Tsai, Kuang-Jung; Shieh, Chjeng-Lun

    2017-04-01

    Typhoon Morakot, which seriously attacked southern Taiwan, awakened public awareness of large-scale landslide disasters. Large-scale landslide disasters produce large quantities of sediment, with negative effects on the operating functions of reservoirs. In order to reduce the risk of these disasters within the study area, the establishment of a database for hazard mitigation and disaster prevention is necessary. Real-time data and numerous archives of engineering data, environmental information, photos, and video will not only help people make appropriate decisions, but are also of great value when further processed. This study defined basic data formats and standards for the various types of data collected about these reservoirs and then provided a management platform based on these formats and standards. Meanwhile, to ensure practicality and convenience, the large-scale landslide disaster database system was built with both publishing and receiving capabilities, so that users can work with it on different types of devices. IT technology progresses extremely quickly, and even the most modern system may become outdated at any time; in order to provide long-term service, the system therefore reserves the possibility of user-defined data formats/standards and a user-defined system structure. The system established by this study was based on the HTML5 standard language and uses responsive web design technology, so that users can easily handle and develop this large-scale landslide disaster database system.

  16. STS pilot user development program

    NASA Technical Reports Server (NTRS)

    Mcdowell, J. R.

    1977-01-01

    Full exploitation of the STS capabilities will depend not only on the extensive use of the STS for known space applications and research, but also on new, innovative ideas of use originating with both current and new users. In recognition of this, NASA has been engaged in a User Development Program for the STS. The program began with four small studies. Each study addressed a separate sector of potential new users to identify techniques and methodologies for user development. The collective results established that a user development function was not only feasible, but necessary for NASA to realize the full potential of the STS. This final report begins with a description of the overall pilot program plan, which involved five specific tasks defined in the contract Statement of Work. Each task is then discussed separately, except that two subjects, the development of principal investigators and of space processing users, are treated on their own for improved continuity of thought. These discussions are followed by a summary of the primary results and conclusions of the Pilot User Development Program. Specific recommendations of the study are given.

  17. DSN Array Simulator

    NASA Technical Reports Server (NTRS)

    Tikidjian, Raffi; Mackey, Ryan

    2008-01-01

    The DSN Array Simulator (wherein 'DSN' signifies NASA's Deep Space Network) is an updated version of software previously denoted the DSN Receive Array Technology Assessment Simulation. This software (see figure) is used for computational modeling of a proposed DSN facility comprising user-defined arrays of antennas and transmitting and receiving equipment for microwave communication with spacecraft on interplanetary missions. The simulation includes variations in the spacecraft tracked and changes in communication demand for up to several decades of future operation. Such modeling is performed to estimate facility performance, evaluate requirements that govern facility design, and evaluate proposed improvements in hardware and/or software. The updated version of this software affords enhanced capability for characterizing facility performance against user-defined mission sets. The software includes a Monte Carlo simulation component that enables rapid generation of key mission-set metrics (e.g., numbers of links, data rates, and data volumes), and statistical distributions thereof as functions of time. The updated version also offers expanded capability for mixed-asset network modeling--for example, for running scenarios that involve user-definable mixtures of antennas having different diameters (in contradistinction to a fixed number of antennas having the same fixed diameter). The improved version also affords greater simulation fidelity, sufficient for validation by comparison with actual DSN operations and analytically predictable performance metrics.
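    The Monte Carlo idea can be sketched in a few lines: draw random per-pass data volumes for a mission set and summarize the distribution of a metric such as total daily volume. The mission names, pass counts, and rates below are invented, not actual DSN parameters:

```python
import random
import statistics

# Toy Monte Carlo over a user-defined mission set: each mission has a
# number of passes per day and a mean downlink volume per pass (Gb),
# drawn here from an exponential distribution. All values are invented.

def simulate_daily_volume(missions, trials=5000, seed=42):
    """Return (mean, stdev) of total daily downlink volume in Gb."""
    rng = random.Random(seed)
    totals = []
    for _ in range(trials):
        total = sum(rng.expovariate(1.0 / mean_gb)
                    for passes, mean_gb in missions.values()
                    for _ in range(passes))
        totals.append(total)
    return statistics.mean(totals), statistics.stdev(totals)

missions = {"mission_a": (2, 10.0),   # (passes/day, mean Gb/pass)
            "mission_b": (1, 40.0)}
mean, sd = simulate_daily_volume(missions)
print(f"daily volume: {mean:.1f} +/- {sd:.1f} Gb")
```

    The same loop structure extends naturally to links per day or data rates, and to time-varying mission sets by making the per-day parameters functions of the date.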

  18. Where can pixel counting area estimates meet user-defined accuracy requirements?

    NASA Astrophysics Data System (ADS)

    Waldner, François; Defourny, Pierre

    2017-08-01

    Pixel counting is probably the most popular way to estimate class areas from satellite-derived maps. It involves determining the number of pixels allocated to a specific thematic class and multiplying it by the pixel area. In the presence of asymmetric classification errors, the pixel counting estimator is biased. The overarching objective of this article is to define the applicability conditions of pixel counting so that the estimates are below a user-defined accuracy target. By reasoning in terms of landscape fragmentation and spatial resolution, the proposed framework decouples the resolution bias and the classifier bias from the overall classification bias. The consequence is that prior to any classification, part of the tolerated bias is already committed due to the choice of the spatial resolution of the imagery. How much classification bias is affordable depends on the joint interaction of spatial resolution and fragmentation. The method was implemented over South Africa for cropland mapping, demonstrating its operational applicability. Particular attention was paid to modeling a realistic sensor's spatial response by explicitly accounting for the effect of its point spread function. The diagnostic capabilities offered by this framework have multiple potential domains of application such as guiding users in their choice of imagery and providing guidelines for space agencies to elaborate the design specifications of future instruments.
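    The bias mechanism the article analyzes can be made concrete with a two-line calculation: with an omission rate (true-class pixels missed) and a commission rate (other pixels wrongly labeled), the mapped area differs from the true area unless the two error flows cancel. The numbers below are illustrative only:

```python
# Why pixel counting is biased under asymmetric classification errors:
# mapped area = kept true-class pixels + wrongly included other pixels.
# The scene size and error rates here are invented for illustration.

def pixel_count_bias(true_area, total_area, omission, commission):
    """Bias of the pixel-counting area estimate (same units as area)."""
    mapped = true_area * (1 - omission) + (total_area - true_area) * commission
    return mapped - true_area

# 200,000 cropland pixels in a 1,000,000-pixel scene:
# 10% omission removes 20,000, 5% commission adds 40,000.
print(pixel_count_bias(200_000, 1_000_000, omission=0.10, commission=0.05))
```

    The estimate here is inflated by 20,000 pixels; coarsening the spatial resolution in a fragmented landscape tends to change both rates, which is the resolution-bias component the framework separates out.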

  19. CIRCAL-2 - General-purpose on-line circuit design.

    NASA Technical Reports Server (NTRS)

    Dertouzos, M. L.; Jessel, G. P.; Stinger, J. R.

    1972-01-01

    CIRCAL-2 is a second-generation general-purpose on-line circuit-design program with the following main features: (1) multiple-analysis capability; (2) uniform and general data structures for handling text editing, network representations, and output results, regardless of analysis; (3) special techniques and structures for minimizing and controlling user-program interaction; (4) use of functionals for the description of hysteresis and heat effects; and (5) ability to define optimization procedures that 'replace' the user. The paper discusses the organization of CIRCAL-2, the aforementioned main features, and their consequences, such as a set of network elements and models general enough for most analyses and a set of functions tailored to circuit-design requirements. The presentation is descriptive, concentrating on conceptual rather than on program implementation details.

  20. Networked differential GPS system

    NASA Technical Reports Server (NTRS)

    Sheynblat, Leonid (Inventor); Kalafus, Rudolph M. (Inventor); Loomis, Peter V. W. (Inventor); Mueller, K. Tysen (Inventor)

    1994-01-01

    An embodiment of the present invention relates to a worldwide network of differential GPS reference stations (NDGPS) that continually track the entire GPS satellite constellation and provide interpolations of reference station corrections tailored for particular user locations between the reference stations. Each reference station takes real-time ionospheric measurements with codeless cross-correlating dual-frequency carrier GPS receivers and computes real-time orbit ephemerides independently. An absolute pseudorange correction (PRC) is defined for each satellite as a function of a particular user's location. A map of the function is constructed, with iso-PRC contours. The network measures the PRCs at a few points, so-called reference stations, and constructs an iso-PRC map for each satellite. Corrections are interpolated for each user's site on a subscription basis. The data bandwidths are kept to a minimum by transmitting information that cannot be obtained directly by the user and by updating information by classes and according to how quickly each class of data goes stale given the realities of the GPS system. Sub-decimeter-level kinematic accuracy over a given area is accomplished by establishing a mini-fiducial network.
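    The interpolation step can be illustrated with a deliberately simplified sketch: weight nearby stations' PRCs by inverse distance on a flat plane. The patent builds per-satellite iso-PRC contour maps rather than this naive scheme, and the coordinates and correction values below are invented:

```python
import math

# Hedged sketch of interpolating a pseudorange correction (PRC) for a
# user location from surrounding reference stations, using simple
# inverse-distance weighting on a flat plane. Not the patented method;
# station positions and PRC values are invented.

def interpolate_prc(user_xy, stations):
    """stations: list of ((x, y), prc_m). Returns the weighted PRC."""
    num = den = 0.0
    for (x, y), prc in stations:
        d = math.hypot(user_xy[0] - x, user_xy[1] - y)
        if d == 0.0:
            return prc  # user sits exactly on a reference station
        w = 1.0 / d
        num += w * prc
        den += w
    return num / den

stations = [((0, 0), 2.0), ((100, 0), 4.0)]
print(interpolate_prc((50, 0), stations))  # midpoint of the two stations
```

    A user midway between stations with 2 m and 4 m corrections gets 3 m; the subscription service in the patent effectively delivers such a location-tailored value per satellite.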

  1. Repository-based software engineering program: Concept document

    NASA Technical Reports Server (NTRS)

    1992-01-01

    This document provides the context for Repository-Based Software Engineering's (RBSE's) evolving functional and operational product requirements, and it is the parent document for development of detailed technical and management plans. When furnished, requirements documents will serve as the governing RBSE product specification. The RBSE Program Management Plan will define resources, schedules, and technical and organizational approaches to fulfilling the goals and objectives of this concept. The purpose of this document is to provide a concise overview of RBSE, describe the rationale for the RBSE Program, and define a clear, common vision for RBSE team members and customers. The document also provides the foundation for developing RBSE user and system requirements and a corresponding Program Management Plan. The concept is used to express the program mission to RBSE users and managers and to provide an exhibit for community review.

  2. [Lateral chromatic aberrations correction for AOTF imaging spectrometer based on doublet prism].

    PubMed

    Zhao, Hui-Jie; Zhou, Peng-Wei; Zhang, Ying; Li, Chong-Chong

    2013-10-01

    A user-defined surface-function method was proposed to model the acousto-optic interaction of the AOTF based on the wave-vector match principle. Assessment experiments show that this model can achieve accurate ray tracing of the AOTF diffracted beam. In addition, the AOTF imaging spectrometer presents large residual lateral color when a traditional chromatic-aberration correction method is adopted. In order to reduce lateral chromatic aberrations, a method based on a doublet prism is proposed. The optical material and angle of the prism are optimized automatically using global optimization with the help of the user-defined AOTF surface. Simulation results show that the proposed method provides the AOTF imaging spectrometer with great convenience, reducing the lateral chromatic aberration to less than 0.0003 degrees, an improvement of one order of magnitude, with spectral image shift effectively corrected.

  3. Equilibrator: Modeling Chemical Equilibria with Excel

    ERIC Educational Resources Information Center

    Vander Griend, Douglas A.

    2011-01-01

    Equilibrator is a Microsoft Excel program for learning about chemical equilibria through modeling, similar in function to EQS4WIN, which is no longer supported and does not work well with newer Windows operating systems. Similar to EQS4WIN, Equilibrator allows the user to define a system with temperature, initial moles, and then either total…

  4. Geolocating thermal binoculars based on a software defined camera core incorporating HOT MCT grown by MOVPE

    NASA Astrophysics Data System (ADS)

    Pillans, Luke; Harmer, Jack; Edwards, Tim; Richardson, Lee

    2016-05-01

    Geolocation is the process of calculating a target position based on bearing and range relative to the known location of the observer. A high performance thermal imager with integrated geolocation functions is a powerful long range targeting device. Firefly is a software defined camera core incorporating a system-on-a-chip processor running the AndroidTM operating system. The processor has a range of industry standard serial interfaces which were used to interface to peripheral devices including a laser rangefinder and a digital magnetic compass. The core has built in Global Positioning System (GPS) which provides the third variable required for geolocation. The graphical capability of Firefly allowed flexibility in the design of the man-machine interface (MMI), so the finished system can give access to extensive functionality without appearing cumbersome or over-complicated to the user. This paper covers both the hardware and software design of the system, including how the camera core influenced the selection of peripheral hardware, and the MMI design process which incorporated user feedback at various stages.
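    The geolocation calculation itself can be sketched with a flat-earth approximation valid at short ranges: project the rangefinder distance along the compass bearing from the observer's GPS fix. This is not the Firefly product's implementation, which may use an exact geodesic formula; the coordinates below are arbitrary:

```python
import math

# Minimal geolocation sketch: target position from observer latitude/
# longitude, compass bearing, and laser-rangefinder range, using a
# small-displacement (flat-earth) approximation. Illustrative only.

EARTH_RADIUS_M = 6_371_000.0

def geolocate(lat_deg, lon_deg, bearing_deg, range_m):
    """Return (lat, lon) of the target in degrees."""
    b = math.radians(bearing_deg)
    dlat = range_m * math.cos(b) / EARTH_RADIUS_M
    dlon = range_m * math.sin(b) / (EARTH_RADIUS_M * math.cos(math.radians(lat_deg)))
    return lat_deg + math.degrees(dlat), lon_deg + math.degrees(dlon)

# Target 1 km due east of an observer at (45 N, 7 E).
lat, lon = geolocate(45.0, 7.0, 90.0, 1000.0)
print(f"{lat:.5f}, {lon:.5f}")
```

    The cos(latitude) factor accounts for meridian convergence; at multi-kilometer ranges or high latitudes a geodesic solution (e.g. Vincenty-type) would be preferred.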

  5. Application of new type of distributed multimedia databases to networked electronic museum

    NASA Astrophysics Data System (ADS)

    Kuroda, Kazuhide; Komatsu, Naohisa; Komiya, Kazumi; Ikeda, Hiroaki

    1999-01-01

    Recently, various kinds of multimedia application systems have actively been developed, building on advanced high-speed communication networks, computer processing technologies, and digital content-handling technologies. Against this background, this paper proposes a new distributed multimedia database system which can effectively perform cooperative retrieval among distributed databases. The proposed system introduces a new concept of a 'retrieval manager', which functions as an intelligent controller so that the user can recognize a set of distributed databases as one logical database. The logical database dynamically generates and executes a preferred combination of retrieval parameters on the basis of both directory data and the system environment. Moreover, a concept of 'domain' is defined in the system as a managing unit of retrieval, and retrieval can effectively be performed by cooperative processing among multiple domains. A communication language and protocols are also defined in the system and are used in every communication action. A language interpreter in each machine translates the communication language into that machine's internal language; using the language interpreter, internal processing modules, such as the DBMS and user-interface modules, can freely be selected. A concept of a 'content set' is also introduced: a content set is defined as a package of mutually related contents, which the system handles as one object. The user terminal can effectively control the display of retrieved contents by referring to data indicating the relations among the contents in the content set. In order to verify the function of the proposed system, a networked electronic museum was experimentally built.
    The results of this experiment indicate that the proposed system can effectively retrieve the target contents under the control of a number of distributed domains. The results also indicate that the system works effectively even as it becomes large.

  6. WiseEye: Next Generation Expandable and Programmable Camera Trap Platform for Wildlife Research.

    PubMed

    Nazir, Sajid; Newey, Scott; Irvine, R Justin; Verdicchio, Fabio; Davidson, Paul; Fairhurst, Gorry; Wal, René van der

    2017-01-01

    The widespread availability of relatively cheap, reliable and easy-to-use digital camera traps has led to their extensive use for wildlife research, monitoring and public outreach. Users of these units are, however, often frustrated by the limited options for controlling camera functions, the generation of large numbers of images, and the lack of flexibility to suit different research environments and questions. We describe the development of a user-customisable open source camera trap platform named 'WiseEye', designed to provide flexible camera trap technology for wildlife researchers. The novel platform is based on a Raspberry Pi single-board computer and compatible peripherals that allow the user to control its functions and performance. We introduce the concept of confirmatory sensing, in which the Passive Infrared triggering is confirmed through other modalities (i.e. radar, pixel change) to reduce the occurrence of false-positive images. This concept, together with user-definable metadata, aided identification of spurious images and greatly reduced post-collection processing time. When tested against a commercial camera trap, WiseEye was found to reduce the incidence of false-positive images and false negatives across a range of test conditions. WiseEye represents a step-change in camera trap functionality, greatly increasing the value of this technology for wildlife research and conservation management.
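    The confirmatory-sensing logic reduces to a simple gate: keep a PIR trigger only if a second modality agrees. The sensor readings and thresholds below are illustrative, not WiseEye's actual parameters:

```python
# Sketch of confirmatory sensing: a Passive Infrared (PIR) event is
# accepted only when radar or frame pixel change also indicates motion.
# Threshold values are hypothetical examples.

def confirmed_trigger(pir, radar_doppler, pixel_change_frac,
                      radar_threshold=0.2, pixel_threshold=0.05):
    """True only when the PIR event is confirmed by another modality."""
    if not pir:
        return False
    return radar_doppler > radar_threshold or pixel_change_frac > pixel_threshold

print(confirmed_trigger(True, 0.5, 0.0))   # radar confirms -> True
print(confirmed_trigger(True, 0.0, 0.01))  # nothing confirms -> False
```

    Requiring agreement between independent modalities is what suppresses false positives from heat shimmer or moving vegetation, at the cost of possibly missing events only one sensor sees.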

  7. WiseEye: Next Generation Expandable and Programmable Camera Trap Platform for Wildlife Research

    PubMed Central

    Nazir, Sajid; Newey, Scott; Irvine, R. Justin; Verdicchio, Fabio; Davidson, Paul; Fairhurst, Gorry; van der Wal, René

    2017-01-01

    The widespread availability of relatively cheap, reliable and easy-to-use digital camera traps has led to their extensive use for wildlife research, monitoring and public outreach. Users of these units are, however, often frustrated by the limited options for controlling camera functions, the generation of large numbers of images, and the lack of flexibility to suit different research environments and questions. We describe the development of a user-customisable open source camera trap platform named ‘WiseEye’, designed to provide flexible camera trap technology for wildlife researchers. The novel platform is based on a Raspberry Pi single-board computer and compatible peripherals that allow the user to control its functions and performance. We introduce the concept of confirmatory sensing, in which the Passive Infrared triggering is confirmed through other modalities (i.e. radar, pixel change) to reduce the occurrence of false-positive images. This concept, together with user-definable metadata, aided identification of spurious images and greatly reduced post-collection processing time. When tested against a commercial camera trap, WiseEye was found to reduce the incidence of false-positive images and false negatives across a range of test conditions. WiseEye represents a step-change in camera trap functionality, greatly increasing the value of this technology for wildlife research and conservation management. PMID:28076444

  8. ADS's Dexter Data Extraction Applet

    NASA Astrophysics Data System (ADS)

    Demleitner, M.; Accomazzi, A.; Eichhorn, G.; Grant, C. S.; Kurtz, M. J.; Murray, S. S.

    The NASA Astrophysics Data System (ADS) now holds 1.3 million scanned pages, containing numerous plots and figures for which the original data sets are lost or inaccessible. The availability of scans of the figures can significantly ease the regeneration of the data sets. For this purpose, the ADS has developed Dexter, a Java applet that supports the user in this process. Dexter's basic functionality is to let the user manually digitize a plot by marking points and defining the coordinate transformation from the logical to the physical coordinate system. Advanced features include automatic identification of axes, tracing lines and finding points matching a template. This contribution both describes the operation of Dexter from a user's point of view and discusses some of the architectural issues we faced during implementation.
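    Dexter's core step, defining the transformation from physical (pixel) to logical (data) coordinates, amounts to a per-axis linear map fixed by two user-marked reference points of known value (a log axis would need a log step first). This mirrors the described functionality, not Dexter's Java source:

```python
# Sketch of axis calibration for plot digitization: the user marks two
# pixel positions of known data value on an axis; every other pixel
# then maps through a linear transform. Values are illustrative.

def make_axis(pixel_a, value_a, pixel_b, value_b):
    """Return a function mapping a pixel coordinate to a data value."""
    scale = (value_b - value_a) / (pixel_b - pixel_a)
    return lambda px: value_a + (px - pixel_a) * scale

x_axis = make_axis(100, 0.0, 500, 10.0)  # pixels 100..500 span data 0..10
print(x_axis(300))  # -> 5.0
```

    With one such function per axis, every point the user clicks on the plot converts directly to a data pair, which is how marked points become a regenerated data set.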

  9. The EOSDIS Products Usability for Disaster Response

    NASA Technical Reports Server (NTRS)

    Kafle, Durga N.; Wanchoo, Lalit; Won, Young-In; Michael, Karen

    2016-01-01

    The focus of this study is to categorize both NRT and standard data products based on applicability to the SDR-defined disaster types. This will identify which datasets from current NASA satellite mission instruments are best suited for disaster response. Distribution metrics for the products used in studying various selected disasters over the last 5 years, including volume, number of files, number of users, user domains, and user country, will be analyzed. This data-usage analysis will give data center staff information that can help them develop the functionality and allocate the resources needed for enhanced access and timely availability of the data products that are critical for time-sensitive analyses.

  10. 14 CFR 1215.108 - Defining user service requirements.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 5 2011-01-01 2010-01-01 true Defining user service requirements. 1215.108 Section 1215.108 Aeronautics and Space NATIONAL AERONAUTICS AND SPACE ADMINISTRATION TRACKING AND DATA..., spacecraft design, operations planning, and other significant mission parameters. When these user evaluations...

  11. 14 CFR 1215.108 - Defining user service requirements.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 5 2012-01-01 2012-01-01 false Defining user service requirements. 1215.108 Section 1215.108 Aeronautics and Space NATIONAL AERONAUTICS AND SPACE ADMINISTRATION TRACKING AND... services, spacecraft design, operations planning, and other significant mission parameters. When these user...

  12. Human motion retrieval from hand-drawn sketch.

    PubMed

    Chao, Min-Wen; Lin, Chao-Hung; Assa, Jackie; Lee, Tong-Yee

    2012-05-01

    The rapid growth of motion capture data increases the importance of motion retrieval. The majority of the existing motion retrieval approaches are based on a labor-intensive step in which the user browses and selects a desired query motion clip from the large motion clip database. In this work, a novel sketching interface for defining the query is presented. This simple approach allows users to define the required motion by sketching several motion strokes over a drawn character, which requires less effort and extends the users' expressiveness. To support the real-time interface, a specialized encoding of the motions and the hand-drawn query is required. Here, we introduce a novel hierarchical encoding scheme based on a set of orthonormal spherical harmonic (SH) basis functions, which provides a compact representation and avoids the CPU-intensive stage of temporal alignment used by previous solutions. Experimental results show that the proposed approach retrieves motions well and is capable of retrieving logically and numerically similar motions, outperforming previous approaches. A user study shows that the proposed system can be a useful tool for inputting motion queries once users are familiar with it. Finally, an application generating a 3D animation from a hand-drawn comic strip is demonstrated.

  13. Dexter: Data Extractor for scanned graphs

    NASA Astrophysics Data System (ADS)

    Demleitner, Markus

    2011-12-01

    The NASA Astrophysics Data System (ADS) now holds 1.3 million scanned pages, containing numerous plots and figures for which the original data sets are lost or inaccessible. The availability of scans of the figures can significantly ease the regeneration of the data sets. For this purpose, the ADS has developed Dexter, a Java applet that supports the user in this process. Dexter's basic functionality is to let the user manually digitize a plot by marking points and defining the coordinate transformation from the logical to the physical coordinate system. Advanced features include automatic identification of axes, tracing lines and finding points matching a template.

  14. GEsture: an online hand-drawing tool for gene expression pattern search.

    PubMed

    Wang, Chunyan; Xu, Yiqing; Wang, Xuelin; Zhang, Li; Wei, Suyun; Ye, Qiaolin; Zhu, Youxiang; Yin, Hengfu; Nainwal, Manoj; Tanon-Reyes, Luis; Cheng, Feng; Yin, Tongming; Ye, Ning

    2018-01-01

    Gene expression profiling data provide useful information for the investigation of biological function and process. However, identifying a specific expression pattern in extensive time-series gene expression data is not an easy task. Clustering, a popular method, is often used to group genes with similar expression; however, genes with a 'desirable' or 'user-defined' pattern cannot be efficiently detected by clustering methods. To address these limitations, we developed an online tool called GEsture. Users can draw or graph a curve with a mouse instead of inputting abstract parameters of clustering methods. Taking a gene expression curve as input, GEsture finds genes in time-series datasets showing similar, opposite, and time-delayed expression patterns. We present three examples that illustrate the capacity of GEsture for gene hunting while following users' requirements. GEsture also provides visualization tools (such as expression pattern figures, heat maps and correlation networks) to display the search results. The outputs may provide useful information for researchers to understand the targets, function and biological processes of the involved genes.
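    One natural way to score "similar" and "opposite" patterns against a drawn curve is Pearson correlation over the time points: strongly positive r means similar, strongly negative r means opposite. GEsture's actual scoring may differ, and the series below are invented:

```python
import statistics

# Sketch of curve-pattern matching: score each gene's time series by
# Pearson correlation against the user-drawn query curve.
# Gene names and expression values are invented for illustration.

def pearson(a, b):
    """Pearson correlation coefficient of two equal-length series."""
    ma, mb = statistics.fmean(a), statistics.fmean(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = (sum((x - ma) ** 2 for x in a) *
           sum((y - mb) ** 2 for y in b)) ** 0.5
    return num / den

drawn = [0, 1, 2, 3, 4]                        # user sketches a rising curve
genes = {"geneA": [0.1, 0.9, 2.1, 2.9, 4.2],   # rising  -> similar
         "geneB": [4.0, 3.1, 2.0, 1.1, 0.2]}   # falling -> opposite
for name, series in genes.items():
    print(name, round(pearson(drawn, series), 3))
```

    Time-delayed patterns can be caught with the same score by shifting one series before correlating and keeping the best lag.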

  15. Application-Defined Decentralized Access Control

    PubMed Central

    Xu, Yuanzhong; Dunn, Alan M.; Hofmann, Owen S.; Lee, Michael Z.; Mehdi, Syed Akbar; Witchel, Emmett

    2014-01-01

    DCAC is a practical OS-level access control system that supports application-defined principals. It allows normal users to perform administrative operations within their privilege, enabling isolation and privilege separation for applications. It does not require centralized policy specification or management, giving applications freedom to manage their principals while the policies are still enforced by the OS. DCAC uses hierarchically-named attributes as a generic framework for user-defined policies, such as groups defined by normal users. For both local and networked file systems, its execution-time overhead is between 0% and 9% on file system microbenchmarks, and under 1% on applications. This paper presents the design and implementation of DCAC, as well as several real-world use cases, including sandboxing applications, enforcing server applications’ security policies, supporting NFS, and authenticating user-defined sub-principals in SSH, all with minimal code changes. PMID:25426493
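
    The hierarchically-named attributes can be illustrated with a simple ancestor-implies-descendant rule: a process holding an attribute satisfies any ACL entry beneath it in the name hierarchy. This is a minimal sketch, not DCAC's actual semantics, and the attribute names are hypothetical.

```python
def implies(held, required):
    """Ancestor-implies-descendant rule for hierarchically-named attributes:
    holding '.u.alice' also satisfies anything under '.u.alice.'.
    (A simplification of DCAC's semantics, for illustration only.)"""
    return required == held or required.startswith(held + ".")

def allowed(process_attrs, acl):
    """Grant access if any attribute the process holds implies any ACL entry."""
    return any(implies(h, r) for h in process_attrs for r in acl)

# hypothetical attribute names in the style of user-defined principals
can_read = allowed([".u.alice"], [".u.alice.photos"])  # alice's own subtree
denied   = allowed([".u.bob"], [".u.alice.photos"])    # unrelated principal
```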

  16. The Basic Organizing/Optimizing Training Scheduler (BOOTS): User's Guide. Technical Report 151.

    ERIC Educational Resources Information Center

    Church, Richard L.; Keeler, F. Laurence

    This report provides the step-by-step instructions required for using the Navy's Basic Organizing/Optimizing Training Scheduler (BOOTS) system. BOOTS is a computerized tool designed to aid in the creation of master training schedules for each Navy recruit training command. The system is defined in terms of three major functions: (1) data file…

  17. A predictive model to allocate frequent service users of community-based mental health services to different packages of care.

    PubMed

    Grigoletti, Laura; Amaddeo, Francesco; Grassi, Aldrigo; Boldrini, Massimo; Chiappelli, Marco; Percudani, Mauro; Catapano, Francesco; Fiorillo, Andrea; Perris, Francesco; Bacigalupi, Maurizio; Albanese, Paolo; Simonetti, Simona; De Agostini, Paola; Tansella, Michele

    2010-01-01

    To develop predictive models that allocate patients to frequent- and low-service-user groups within Italian Community-based Mental Health Services (CMHSs), to allocate frequent users to different packages of care, and to identify the costs of these packages. Socio-demographic and clinical data and GAF scores at baseline were collected for 1250 users attending five CMHSs. All psychiatric contacts made by these patients during six months were recorded. A logistic regression identified variables predictive of frequent service use. A multinomial logistic regression identified variables able to predict the most appropriate package of care. A cost function was used to estimate costs. Frequent service users comprised 49% of the sample and accounted for nearly 90% of all contacts. The model correctly classified 80% of users into the frequent- and low-user groups. Three packages of care were identified: Basic Community Treatment (4,133 Euro per six months), Intensive Community Treatment (6,180 Euro) and Rehabilitative Community Treatment (11,984 Euro), for 83%, 6% and 11% of frequent service users respectively. The model was found to be accurate for 85% of users. It is possible to develop predictive models that identify frequent service users and assign them to pre-defined packages of care, and to use these models to inform the funding of psychiatric care.
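
    The allocation step rests on standard logistic regression: a fitted model turns a patient's predictors into a probability of being a frequent service user. The sketch below shows only that scoring step; the coefficients and predictors are invented for illustration and are not taken from the paper.

```python
import math

def frequent_user_probability(coeffs, intercept, features):
    """Logistic model: p = 1 / (1 + exp(-(b0 + sum(bi * xi)))).
    Coefficients and features here are hypothetical, not from the study."""
    z = intercept + sum(b * x for b, x in zip(coeffs, features))
    return 1.0 / (1.0 + math.exp(-z))

# hypothetical predictors: [prior contacts, GAF score, age in decades]
p = frequent_user_probability([0.8, -0.05, 0.1], -1.0, [3.0, 40.0, 6.5])
is_frequent = p >= 0.5   # threshold used to split frequent vs. low users
```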

  18. Applying a Participatory Design Approach to Define Objectives and Properties of a “Data Profiling” Tool for Electronic Health Data

    PubMed Central

    Estiri, Hossein; Lovins, Terri; Afzalan, Nader; Stephens, Kari A.

    2016-01-01

    We applied a participatory design approach to define the objectives, characteristics, and features of a “data profiling” tool for primary care Electronic Health Data (EHD). Through three participatory design workshops, we collected input from potential tool users who had experience working with EHD. We present 15 recommended features and characteristics for the data profiling tool. From these recommendations we derived three overarching objectives and five properties for the tool. A data profiling tool, in Biomedical Informatics, is a visual, clear, usable, interactive, and smart tool that is designed to inform clinical and biomedical researchers of data utility and let them explore the data, while conveniently orienting the users to the tool’s functionalities. We suggest that developing scalable data profiling tools will provide new capacities to disseminate knowledge about clinical data that will foster translational research and accelerate new discoveries. PMID:27570651

  19. Rapid Copper Metallization of Textile Materials: a Controlled Two-Step Route to Achieve User-Defined Patterns under Ambient Conditions.

    PubMed

    Zhang, Shuang-Yuan; Guan, Guijian; Jiang, Shan; Guo, Hongchen; Xia, Jing; Regulacio, Michelle D; Wu, Mingda; Shah, Kwok Wei; Dong, Zhili; Zhang, Jie; Han, Ming-Yong

    2015-09-30

    Throughout history earth-abundant copper has been incorporated into textiles and it still caters to various needs in modern society. In this paper, we present a two-step copper metallization strategy to realize sequentially nondiffusive copper(II) patterning and rapid copper deposition on various textile materials, including cotton, polyester, nylon, and their mixtures. A new, cost-effective formulation is designed to minimize the copper pattern migration on textiles and to achieve user-defined copper patterns. The metallized copper is found to be very adhesive and stable against washing and oxidation. Furthermore, the copper-metallized textile exhibits excellent electrical conductivity that is ~3 times better than that of stainless steel and also inhibits the growth of bacteria effectively. This new copper metallization approach holds great promise as a commercially viable method to metallize an insulating textile, opening up research avenues for wearable electronics and functional garments.

  20. On the Supply Chain Management Supported by E-Commerce Service Platform for Agreement based Circulation of Fruits and Vegetables

    NASA Astrophysics Data System (ADS)

    Bao, Liwei; Huang, Yuchi; Ma, Zengjun; Zhang, Jie; Lv, Qingchu

    Based on an analysis of the supply chain process for agricultural products, the IT application requirements of the market entities participating in agreement-based circulation of fruits and vegetables are discussed. A strategy of supply chain management based on an E-commerce service platform for fruits and vegetables is proposed in this paper. The architecture and functional composition of the service platform have been designed and implemented. The platform is built from a set of application service modules. Users can choose some of the application service modules and configure them according to the business process. The modules chosen and configured by the user are integrated as an application service package and applied as the management information system of the business process. With the E-commerce service platform, supply chain management for agreement-based circulation of fruits and vegetables can be implemented.

  1. ProphTools: general prioritization tools for heterogeneous biological networks.

    PubMed

    Navarro, Carmen; Martínez, Victor; Blanco, Armando; Cano, Carlos

    2017-12-01

    Networks have been proven effective representations for the analysis of biological data. As such, there exist multiple methods to extract knowledge from biological networks. However, these approaches usually limit their scope to a single biological entity type of interest, or they lack the flexibility to analyze user-defined data. We developed ProphTools, a flexible open-source command-line tool that performs prioritization on a heterogeneous network. ProphTools prioritization combines a flow-propagation algorithm similar to a random walk with restarts and a weighted propagation method. A flexible model for the representation of a heterogeneous network allows the user to define a prioritization problem involving an arbitrary number of entity types and their interconnections. Furthermore, ProphTools provides functionality to perform cross-validation tests, allowing users to select the best network configuration for a given problem. The ProphTools core prioritization methodology has already been proven effective in gene-disease prioritization and drug repositioning. Here we make ProphTools available to the scientific community as flexible, open-source software and perform a new proof-of-concept case study on long noncoding RNA (lncRNA)-to-disease prioritization. ProphTools is robust prioritization software that provides the flexibility not present in other state-of-the-art network analysis approaches, enabling researchers to perform prioritization tasks on any user-defined heterogeneous network. Furthermore, the application to lncRNA-disease prioritization shows that ProphTools can reach the performance levels of ad hoc prioritization tools without losing its generality. © The Authors 2017. Published by Oxford University Press.
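
    The core of ProphTools' prioritization is described as a flow-propagation algorithm similar to a random walk with restarts. A generic random-walk-with-restarts iteration on a toy network can be sketched as follows; this is the textbook method, not ProphTools' implementation, and the 3-node chain is invented for illustration.

```python
def rwr(transition, seeds, restart=0.3, iters=200):
    """Random walk with restarts on a column-stochastic transition matrix.
    transition[i][j] = probability of stepping from node j to node i.
    Nodes in `seeds` receive the restart mass; scores rank all nodes."""
    n = len(transition)
    p0 = [1.0 / len(seeds) if i in seeds else 0.0 for i in range(n)]
    p = p0[:]
    for _ in range(iters):
        walked = [sum(transition[i][j] * p[j] for j in range(n))
                  for i in range(n)]
        p = [(1 - restart) * walked[i] + restart * p0[i] for i in range(n)]
    return p

# toy 3-node chain 0 - 1 - 2 (columns sum to 1)
T = [[0.0, 0.5, 0.0],
     [1.0, 0.0, 1.0],
     [0.0, 0.5, 0.0]]
scores = rwr(T, seeds={0})
ranked = sorted(range(3), key=lambda i: -scores[i])
```

Seeding one entity type (e.g. a query lncRNA) and ranking another (e.g. diseases) by their converged scores is the general shape of such prioritization.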

  2. A solution to the surface intersection problem. [Boolean functions in geometric modeling

    NASA Technical Reports Server (NTRS)

    Timer, H. G.

    1977-01-01

    An application-independent geometric model within a data base framework should support the use of Boolean operators which allow the user to construct a complex model by appropriately combining a series of simple models. The use of these operators leads to the concept of implicitly and explicitly defined surfaces. With an explicitly defined model, the surface area may be computed by simply summing the surface areas of the bounding surfaces. For an implicitly defined model, the surface area computation must deal with active and inactive regions. Because the surface intersection problem involves four unknowns and its solution is a space curve, the parametric coordinates of each surface must be determined as a function of the arc length. Various subproblems involved in the general intersection problem are discussed, and the mathematical basis for their solution is presented along with a program written in FORTRAN IV for implementation on the IBM 370 TSO system.

  3. Systematic Sensor Selection Strategy (S4) User Guide

    NASA Technical Reports Server (NTRS)

    Sowers, T. Shane

    2012-01-01

    This paper describes a User Guide for the Systematic Sensor Selection Strategy (S4). S4 was developed to optimally select a sensor suite from a larger pool of candidate sensors based on their performance in a diagnostic system. For aerospace systems, selecting the proper sensors is important for ensuring adequate measurement coverage to satisfy operational, maintenance, performance, and system diagnostic criteria. S4 optimizes the selection of sensors based on the system fault diagnostic approach while taking conflicting objectives such as cost, weight and reliability into consideration. S4 can be described as a general architecture structured to accommodate application-specific components and requirements. It performs combinational optimization with a user defined merit or cost function to identify optimum or near-optimum sensor suite solutions. The S4 User Guide describes the sensor selection procedure and presents an example problem using an open source turbofan engine simulation to demonstrate its application.
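
    The combinatorial optimization S4 performs can be illustrated with a much simpler stand-in: a greedy search that adds, at each step, the sensor with the best merit gain per unit cost. The fault-coverage merit function and the sensor names below are invented for illustration; S4's actual search strategy and merit definitions are application-specific.

```python
def select_sensors(candidates, budget, merit):
    """Greedy stand-in for S4's combinatorial search.
    candidates: {sensor name: cost}; merit(suite) -> diagnostic merit."""
    chosen, spent = set(), 0.0
    while True:
        best, best_gain = None, 0.0
        for name, cost in candidates.items():
            if name in chosen or spent + cost > budget:
                continue
            gain = (merit(chosen | {name}) - merit(chosen)) / cost
            if gain > best_gain:
                best, best_gain = name, gain
        if best is None:          # no affordable sensor improves the suite
            return chosen
        chosen.add(best)
        spent += candidates[best]

# hypothetical fault signatures: which faults each sensor can detect
coverage = {"EGT": {1, 2}, "N1": {2, 3}, "Wf": {4}, "P3": {1, 2, 3}}

def merit(suite):
    """Toy merit: fraction of the 4 faults covered by the suite."""
    covered = set().union(*(coverage[s] for s in suite)) if suite else set()
    return len(covered) / 4.0

suite = select_sensors({"EGT": 2.0, "N1": 1.0, "Wf": 1.0, "P3": 3.0},
                       budget=4.0, merit=merit)
```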

  4. Improved personalized recommendation based on a similarity network

    NASA Astrophysics Data System (ADS)

    Wang, Ximeng; Liu, Yun; Xiong, Fei

    2016-08-01

    A recommender system helps individual users find preferred items rapidly and has attracted extensive attention in recent years. Many successful recommendation algorithms are designed on bipartite networks, such as network-based inference or heat conduction. However, most of these algorithms define the resource-allocation method as a uniform (average) allocation. That is not reasonable, because uniform allocation reflects neither users' choice preferences nor the influence between users, and it leads to non-personalized recommendation results. We propose a personalized recommendation approach that combines a similarity function with the bipartite network to generate a similarity network that improves the resource-allocation process. Our model introduces user influence into the recommender system, on the premise that user influence can make the resource-allocation process more reasonable. We use four different metrics to evaluate our algorithms on three benchmark data sets. Experimental results show that the improved recommendation on a similarity network obtains better accuracy and diversity than some competing approaches.
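
    The baseline being improved here, network-based inference on a user-item bipartite network, spreads resource from a target user's items to the users who share them and back to items. The uniform-allocation version (the one the paper argues against, before similarity weighting is added) can be sketched as:

```python
def probs_scores(likes, target):
    """Network-based inference (ProbS) with uniform resource allocation.
    likes: {user: set of items}. Resource starts on the target's items,
    splits evenly items -> users, then evenly users -> items."""
    users = list(likes)
    items = sorted({i for s in likes.values() for i in s})
    deg_item = {i: sum(i in likes[u] for u in users) for i in items}
    # step 1: each of the target's items splits its resource among its users
    f_user = {u: sum(1.0 / deg_item[i] for i in likes[u] & likes[target])
              for u in users}
    # step 2: each user splits the received resource among their items
    score = {i: 0.0 for i in items}
    for u in users:
        for i in likes[u]:
            score[i] += f_user[u] / len(likes[u])
    return score

likes = {"alice": {"a", "b"}, "bob": {"b", "c"}, "carol": {"c"}}
score = probs_scores(likes, "alice")
recommend = max((i for i in score if i not in likes["alice"]), key=score.get)
```

The paper's modification replaces the even splits with similarity- and influence-weighted ones; the two-step spreading structure stays the same.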

  5. Case studies on optimization problems in MATLAB and COMSOL multiphysics by means of the livelink

    NASA Astrophysics Data System (ADS)

    Ozana, Stepan; Pies, Martin; Docekal, Tomas

    2016-06-01

    LiveLink for COMSOL is a tool that integrates COMSOL Multiphysics with MATLAB to extend one's modeling with scripting programming in the MATLAB environment. It allows the user to utilize the full power of MATLAB and its toolboxes in preprocessing, model manipulation, and postprocessing. First, the head script launches COMSOL with MATLAB, defines the initial values of all parameters, refers to the objective function J, and creates and runs the defined optimization task. Once the task is launched, the COMSOL model is called in the iteration loop (from the MATLAB environment via the API interface), changing the defined optimization parameters so that the objective function is minimized, using the fmincon function to find a local or global minimum of a constrained linear or nonlinear multivariable function. Once the minimum is found, it returns an exit flag, terminates the optimization, and returns the optimized values of the parameters. The cooperation with MATLAB via LiveLink enhances a powerful computational environment with complex multiphysics simulations. The paper introduces the use of LiveLink for COMSOL in selected case studies in the fields of technical cybernetics and bioengineering.
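
    The loop structure described above, an optimizer repeatedly calling an expensive external model as its objective function, can be sketched with a toy analytic objective standing in for the COMSOL solve and a simple derivative-free 1-D minimizer standing in for fmincon. Everything here is illustrative; LiveLink itself is driven from MATLAB.

```python
def run_model(param):
    """Stand-in for one COMSOL model evaluation made through the API:
    returns the objective J for one candidate parameter value.
    (A cheap analytic function replaces the multiphysics solve.)"""
    return (param - 2.0) ** 2 + 1.0

def golden_section(f, lo, hi, tol=1e-6):
    """Minimal derivative-free minimizer: each iteration re-evaluates the
    (expensive) model, just as the head script re-runs COMSOL inside the
    optimization loop until the minimum is found."""
    g = (5 ** 0.5 - 1) / 2
    a, b = lo, hi
    c, d = b - g * (b - a), a + g * (b - a)
    fc, fd = f(c), f(d)
    while b - a > tol:
        if fc < fd:
            b, d, fd = d, c, fc
            c = b - g * (b - a)
            fc = f(c)
        else:
            a, c, fc = c, d, fd
            d = a + g * (b - a)
            fd = f(d)
    return (a + b) / 2

best = golden_section(run_model, 0.0, 5.0)   # optimized parameter value
```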

  6. A Web-Based Information System for Field Data Management

    NASA Astrophysics Data System (ADS)

    Weng, Y. H.; Sun, F. S.

    2014-12-01

    A web-based field data management system has been designed and developed to allow field geologists to store, organize, manage, and share field data online. System requirements were first analyzed and clearly defined: what data are to be stored, who the potential users are, and what system functions are needed in order to deliver the right data in the right way to the right user. A 3-tiered architecture was adopted to create this secure, scalable system, with a web browser at the front end, a database at the back end, and a functional logic server in the middle. Specifically, HTML, CSS, and JavaScript implement the user interface in the front-end tier, an Apache web server runs PHP scripts in the middle tier, and a MySQL server provides the back-end database. The system accepts various types of field information, including image, audio, video, numeric, and text. It allows users to select data and populate them on either Google Earth or Google Maps for the examination of spatial relations. It also makes the sharing of field data easy by converting them into XML format, which is both human-readable and machine-readable, and thus ready for reuse.
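
    The XML conversion step mentioned above can be sketched with a standard library serializer: one field record becomes an element tree that is both human- and machine-readable. The element names are hypothetical, not the system's actual schema.

```python
import xml.etree.ElementTree as ET

def record_to_xml(record):
    """Serialize one field observation (a flat dict) to an XML string.
    Element names are illustrative, not the system's real schema."""
    obs = ET.Element("observation")
    for key, value in record.items():
        ET.SubElement(obs, key).text = str(value)
    return ET.tostring(obs, encoding="unicode")

xml_text = record_to_xml({"site": "ST-01", "lat": 41.5, "lon": -81.7,
                          "lithology": "sandstone"})
```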

  7. HF Propagation sensitivity study and system performance analysis with the Air Force Coverage Analysis Program (AFCAP)

    NASA Astrophysics Data System (ADS)

    Caton, R. G.; Colman, J. J.; Parris, R. T.; Nickish, L.; Bullock, G.

    2017-12-01

    The Air Force Research Laboratory, in collaboration with NorthWest Research Associates, is developing advanced software capabilities for high-fidelity simulations of high frequency (HF) sky wave propagation and performance analysis of HF systems. Based on the HiCIRF (High-frequency Channel Impulse Response Function) platform [Nickisch et al., doi:10.1029/2011RS004928], the new Air Force Coverage Analysis Program (AFCAP) provides the modular capabilities necessary for a comprehensive sensitivity study of the large number of variables which define simulations of HF propagation modes. In this paper, we report on an initial exercise of AFCAP to analyze the sensitivities of the tool to various environmental and geophysical parameters. Through examination of the channel scattering function and amplitude-range-Doppler output on two-way propagation paths with injected target signals, we compare simulated returns over a range of geophysical conditions as well as varying definitions for environmental noise, meteor clutter, and sea state models for Bragg backscatter. We also investigate the impacts of including clutter effects due to field-aligned backscatter from small-scale ionization structures at varied levels of severity as defined by the climatological WideBand Model (WBMOD). In the absence of additional user-provided information, AFCAP relies on the International Reference Ionosphere (IRI) model to define the ionospheric state for use in 2D ray tracing algorithms. Because the AFCAP architecture includes the option for insertion of a user-defined gridded ionospheric representation, we compare output from the tool using the IRI and ionospheric definitions from assimilative models such as GPSII (GPS Ionospheric Inversion).

  8. Use of health services by Brazilian older adults with and without functional limitation

    PubMed Central

    Silva, Alexandre Moreira de Melo; Mambrini, Juliana Vaz de Melo; Peixoto, Sérgio Viana; Malta, Deborah Carvalho; Lima-Costa, Maria Fernanda

    2017-01-01

    ABSTRACT OBJECTIVE To analyze the use of health services and the quality of medical care received by Brazilian older adults with and without functional limitation. METHODS The main analyses were based on a national sample representing 23,815 participants of the National Survey on Health (PNS) aged 60 years or older. Functional limitation was defined by difficulty in performing at least one out of ten basic or instrumental activities of daily living. Potential confounding variables included predisposing and enabling factors of the use of health services. RESULTS The prevalence of functional limitation was 30.1% (95%CI 29.2–31.4). The number of doctor visits and hospitalizations in the past 12 months showed statistically significant associations with functional limitation, both for users of the public system (OR = 2.48 [95%CI 2.13–2.88] for three or more doctor visits and OR = 2.58 [95%CI 2.15–3.09] for one or more hospitalizations) and of the private system (OR = 2.56 [95%CI 1.50–4.36] and OR = 2.22 [95%CI 1.64–3.00], respectively). The propensity to use basic health units was higher among users of the private system with functional limitations (OR = 2.01 [95%CI 1.12–3.59]). Only two out of seven indicators of the quality of medical care received were associated with functional limitation, in the perception of users of the public and private systems. Public system users with functional limitations evaluated the freedom to choose a doctor and the waiting time for an appointment more negatively than users of the same system without these limitations (OR = 0.81 [95%CI 0.67–0.99] and OR = 0.76 [95%CI 0.62–0.93], respectively). CONCLUSIONS Older adults with functional limitations use more health services than those without such limitations. The magnitude of the association between functional limitation and the number of doctor visits and hospitalizations was similar in the public and private health systems. PMID:28591357

  9. IAIMS Architecture

    PubMed Central

    Hripcsak, George

    1997-01-01

    Abstract An information system architecture defines the components of a system and the interfaces among the components. A good architecture is essential for creating an Integrated Advanced Information Management System (IAIMS) that works as an integrated whole yet is flexible enough to accommodate many users and roles, multiple applications, changing vendors, evolving user needs, and advancing technology. Modularity and layering promote flexibility by reducing the complexity of a system and by restricting the ways in which components may interact. Enterprise-wide mediation promotes integration by providing message routing, support for standards, dictionary-based code translation, a centralized conceptual data schema, business rule implementation, and consistent access to databases. Several IAIMS sites have adopted a client-server architecture, and some have adopted a three-tiered approach, separating user interface functions, application logic, and repositories. PMID:9067884

  10. Scripting Module for the Satellite Orbit Analysis Program (SOAP)

    NASA Technical Reports Server (NTRS)

    Carnright, Robert; Paget, Jim; Coggi, John; Stodden, David

    2008-01-01

    This add-on module to the SOAP software can perform changes to simulation objects based on the occurrence of specific conditions, allowing the software to encompass the simulation response of scheduled or physical events. Users can manipulate objects in the simulation environment under programmatic control. Inputs to the scripting module are Actions, Conditions, and the Script. Actions are arbitrary modifications to constructs such as Platform Objects (i.e., satellites), Sensor Objects (representing instruments or communication links), or Analysis Objects (user-defined logical or numeric variables). Examples of actions include changes to a satellite orbit (a Δv maneuver), changing a sensor-pointing direction, and the manipulation of a numerical expression. Conditions represent the circumstances under which Actions are performed and can be couched in If-Then-Else logic, such as performing a Δv at specific times or adding to the spacecraft power only when it is being illuminated by the Sun. The SOAP Script represents the entire set of conditions being considered over a specific time interval. The output of the scripting module is a series of events: changes to objects at specific times. As the SOAP simulation clock runs forward, the scheduled events are performed. If the user sets the clock back in time, the events within that interval are automatically undone. The module offers an interface for defining scripts in which the user does not have to remember the vocabulary of various keywords. Actions can be captured by employing the same user interface that is used to define the objects themselves. Conditions can be set to invoke Actions by selecting them from pull-down lists. Users define the script by selecting from the pool of defined conditions. Many space systems have to react to arbitrary events that can occur from scheduling or from the environment. For example, an instrument may cease to draw power when the area that it is tasked to observe is not in view. The contingency of the planetary body blocking the line of sight is a condition upon which the power being drawn is set to zero. It remains at zero until the observation objective is again in view. Computing the total power drawn by the instrument over a period of days or weeks can now take such factors into consideration. What makes the architecture especially powerful is that the scripting module can look ahead and behind in simulation time, and this temporal versatility can be leveraged in displays such as x-y plots. For example, a plot of a satellite's altitude as a function of time can take changes to the orbit into account.
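
    The condition/action/event pattern, including the automatic undo when the clock is set back, can be sketched as follows. This is a minimal illustration of the mechanism the abstract describes, not SOAP code; the object names and the rule format are assumptions.

```python
class Script:
    """Toy condition/action scheduler: timed rules fire events as the clock
    runs forward; setting the clock back undoes events in the rewound span."""
    def __init__(self, state):
        self.state = dict(state)
        self.clock = 0.0
        self.events = []   # performed: (time, name, old_value, new_value)
        self.rules = []    # pending:   (trigger_time, name, new_value)

    def add_rule(self, time, name, value):
        self.rules.append((time, name, value))

    def set_clock(self, t):
        if t >= self.clock:                 # forward: perform due events
            for time, name, value in sorted(self.rules):
                if self.clock < time <= t:
                    self.events.append((time, name, self.state[name], value))
                    self.state[name] = value
        else:                               # backward: undo rewound events
            while self.events and self.events[-1][0] > t:
                time, name, old, _ = self.events.pop()
                self.state[name] = old
        self.clock = t

sim = Script({"power_w": 25.0})
sim.add_rule(10.0, "power_w", 0.0)   # target leaves view: draw no power
sim.set_clock(20.0)
blocked = sim.state["power_w"]       # after the event fires
sim.set_clock(5.0)                   # rewind: event is automatically undone
restored = sim.state["power_w"]
```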

  11. Development of functional requirements for electronic health communication: preliminary results from the ELIN project.

    PubMed

    Christensen, Tom; Grimsmo, Anders

    2005-01-01

    User participation is important for developing a functional requirements specification for electronic communication. General practitioners and practising specialists, however, often work in small practices without the resources to develop and present their requirements. It was necessary to find a method that could engage practising doctors in order to promote their needs related to electronic communication. Qualitative research methods were used, starting with a process to develop and study documents and to collect data from meetings of the project groups. Triangulation was used, in that the participants were organised into a panel of experts, a user group, a supplier group and an editorial committee. The panel of experts created a list of functional requirements for electronic communication in health care, consisting of 197 requirements, in addition to 67 requirements selected from an existing Norwegian standard for electronic patient records (EPRs). The most important requirements were: elimination of paper copies sent in parallel with electronic messages; optimal workflow; a common electronic 'envelope' with directory services for units and end-users; and defined requirements for content, with the possibility of decision support. The results indicate that we have found a method of developing functional requirements which provides valid results both for practising doctors and for suppliers of EPR systems.

  12. LANES 1 Users' Guide

    NASA Technical Reports Server (NTRS)

    Jordan, J.

    1985-01-01

    This document is intended for users of the Local Area Network Extensible Simulator, version I. This simulator models the performance of a Fiber Optic network under a variety of loading conditions and network characteristics. The options available to the user for defining the network conditions are described in this document. Computer hardware and software requirements are also defined.

  13. GenCade Version 1 Quick-Start Guide: How to Start a Successful GenCade Project

    DTIC Science & Technology

    2015-03-01

    Properly defining the inlets is a crucial part of a GenCade project and can be difficult. A user should become familiar with the Inlet Reservoir Model (IRM) … GenCade Report 2 provides additional documentation on IRM variable names and functions. 3.9.4 Export data: The data may easily be exported to a text …

  14. Space Station overall management approach for operations

    NASA Technical Reports Server (NTRS)

    Paules, G.

    1986-01-01

    An Operations Management Concept developed by NASA for its Space Station Program is discussed. The operational goals, themes, and design principles established during program development are summarized. The major operations functions are described, including: space systems operations, user support operations, prelaunch/postlanding operations, logistics support operations, market research, and cost/financial management. Strategic, tactical, and execution levels of operational decision-making are defined.

  15. Developing and Validation a Usability Evaluation Tools for Distance Education Websites: Persian Version

    ERIC Educational Resources Information Center

    Hafezi, Soheila; Farahi, Ahmad; Mehri, Soheil Najafi; Mahmoodi, Hosein

    2010-01-01

    The web is playing a central role in distance education. The word "usability" is usually synonymous with functionality of the system for the user. Also, usability of a website is defined as something that can be used by a specific group of people to carry out specific objectives in an effective way, with efficiency and satisfaction.…

  16. Expressions Module for the Satellite Orbit Analysis Program

    NASA Technical Reports Server (NTRS)

    Edmonds, Karina

    2008-01-01

    The Expressions Module is a software module that has been incorporated into the Satellite Orbit Analysis Program (SOAP). The module includes an expressions-parser submodule built on top of an analytical system, enabling the user to define logical and numerical variables and constants. The variables can capture output from SOAP orbital-prediction and geometric-engine computations. The module can combine variables and constants with built-in logical operators (such as Boolean AND, OR, and NOT), relational operators (such as >, <, or =), and mathematical operators (such as addition, subtraction, multiplication, division, modulus, exponentiation, differentiation, and integration). Parentheses can be used to specify precedence of operations. The module contains a library of mathematical functions and operations, including logarithms, trigonometric functions, Bessel functions, minimum/maximum operations, and floating-point-to-integer conversions. The module supports combinations of time, distance, and angular units and has a dimensional-analysis component that checks for correct usage of units. A parser based on the Flex language and the Bison program looks for and indicates errors in syntax. SOAP expressions can be built using other expressions as arguments, thus enabling the user to build analytical trees. A graphical user interface facilitates use.
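
    The parsing layer can be illustrated with a tiny recursive-descent evaluator for numeric expressions with variables, precedence, and parentheses. The real module is built with Flex and Bison and supports far more (logical and relational operators, units, differentiation and integration); this sketch shows only the core pattern.

```python
import re

TOKEN = re.compile(r"\s*(\d+\.?\d*|[A-Za-z_]\w*|[-+*/()])")

def evaluate(text, variables=None):
    """Evaluate '+ - * /' expressions with parentheses and named variables
    via recursive descent (a toy stand-in for a Flex/Bison-built parser)."""
    tokens = TOKEN.findall(text) + ["$"]   # "$" marks end of input
    pos = [0]

    def peek():
        return tokens[pos[0]]

    def take():
        pos[0] += 1
        return tokens[pos[0] - 1]

    def expr():            # expr := term (('+'|'-') term)*
        value = term()
        while peek() in "+-":
            value = value + term() if take() == "+" else value - term()
        return value

    def term():            # term := factor (('*'|'/') factor)*
        value = factor()
        while peek() in "*/":
            value = value * factor() if take() == "*" else value / factor()
        return value

    def factor():          # factor := number | name | '-' factor | '(' expr ')'
        tok = take()
        if tok == "(":
            value = expr()
            take()         # consume the matching ')'
            return value
        if tok == "-":
            return -factor()
        if tok[0].isdigit():
            return float(tok)
        return (variables or {})[tok]   # variable lookup

    return expr()

result = evaluate("2 * (alt + 3) - 1", {"alt": 4.0})
```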

  17. Scalable Computing of the Mesh Size Effect on Modeling Damage Mechanics in Woven Armor Composites

    DTIC Science & Technology

    2008-12-01

    … manner of a user-defined material subroutine to provide overall stress increments to the parallel LS-DYNA3D, a Lagrangian explicit code used in … finite element code, as a user-defined material subroutine. The ability of this subroutine to model the effect of the progressions of a select number … is added as a user-defined material subroutine to parallel LS-DYNA3D. The computations of the global mesh are handled by LS-DYNA3D and are spread

  18. User-defined Material Model for Thermo-mechanical Progressive Failure Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.

    2008-01-01

    Previously a user-defined material model for orthotropic bimodulus materials was developed for linear and nonlinear stress analysis of composite structures using either shell or solid finite elements within a nonlinear finite element analysis tool. Extensions of this user-defined material model to thermo-mechanical progressive failure analysis are described, and the required input data are documented. The extensions include providing for temperature-dependent material properties, archival of the elastic strains, and a thermal strain calculation for materials exhibiting a stress-free temperature.
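
    Two of the listed extensions, temperature-dependent material properties and a thermal-strain term referenced to a stress-free temperature, can be sketched directly. The table lookup and the strain formula below are generic illustrations; the numbers are invented, not taken from the report.

```python
def interpolate_property(table, temperature):
    """Linear interpolation of a temperature-dependent property table,
    the usual way such data reach a user-defined material routine."""
    points = sorted(table)
    if temperature <= points[0]:
        return table[points[0]]
    if temperature >= points[-1]:
        return table[points[-1]]
    for lo, hi in zip(points, points[1:]):
        if lo <= temperature <= hi:
            frac = (temperature - lo) / (hi - lo)
            return table[lo] + frac * (table[hi] - table[lo])

def thermal_strain(alpha, temperature, stress_free_temperature):
    """Thermal strain relative to the stress-free temperature."""
    return alpha * (temperature - stress_free_temperature)

# hypothetical modulus (Pa) vs. temperature (K) for one ply direction
E1 = {293.0: 150e9, 400.0: 140e9, 500.0: 120e9}
modulus = interpolate_property(E1, 450.0)
strain = thermal_strain(2.0e-6, 450.0, 293.0)
```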

  19. Drawing road networks with focus regions.

    PubMed

    Haunert, Jan-Henrik; Sering, Leon

    2011-12-01

    Mobile users of maps typically need detailed information about their surroundings plus some context information about remote places. To keep parts of the map from becoming too dense, cartographers have designed mapping functions that enlarge a user-defined focus region; such functions are sometimes called fish-eye projections. The extra map space occupied by the enlarged focus region is compensated by distorting other parts of the map. We argue that, in a map showing a network of roads relevant to the user, distortion should preferably take place in those areas where the network is sparse. Therefore, we do not apply a predefined mapping function. Instead, we consider the road network as a graph whose edges are the road segments. We compute a new spatial mapping with a graph-based optimization approach, minimizing the sum of squared distortions at edges. Our optimization method is based on a convex quadratic program (CQP); CQPs can be solved in polynomial time. Important requirements on the output map are expressed as linear inequalities. In particular, we show how to forbid edge crossings. We have implemented our method in a prototype tool. For instances of different sizes, our method generated output maps that were far less distorted than those generated with a predefined fish-eye projection. Future work is needed to automate the selection of roads relevant to the user. Furthermore, we aim at fast heuristics for application in real-time systems. © 2011 IEEE.
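
    The least-squares idea, choosing new vertex positions so that each road segment's length stays close to a target (enlarged inside the focus region), can be sketched in one dimension with plain gradient descent. The paper solves a constrained convex quadratic program in the plane; this toy version drops the constraints and one dimension, and the target lengths are invented.

```python
def fit_positions(targets, iters=2000, lr=0.2):
    """Minimize sum_i ((x[i+1] - x[i]) - targets[i])**2 over vertex
    positions x on a 1-D road: a toy analog of the paper's least-squares
    distortion objective, solved by gradient descent instead of a CQP."""
    x = [float(i) for i in range(len(targets) + 1)]   # initial layout
    for _ in range(iters):
        grad = [0.0] * len(x)
        for i, t in enumerate(targets):
            err = (x[i + 1] - x[i]) - t
            grad[i] -= 2 * err
            grad[i + 1] += 2 * err
        x = [xi - lr * gi for xi, gi in zip(x, grad)]
        x = [xi - x[0] for xi in x]   # pin the first vertex at 0
    return x

# three unit segments; the middle one lies in the focus region and
# should appear twice as long
positions = fit_positions([1.0, 2.0, 1.0])
lengths = [b - a for a, b in zip(positions, positions[1:])]
```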

  20. Establishing a group of endpoints to support collective operations without specifying unique identifiers for any endpoints

    DOEpatents

    Archer, Charles J.; Blocksome, Michael A.; Ratterman, Joseph D.; Smith, Brian E.; Xue, Hanhong

    2016-02-02

    A parallel computer executes a number of tasks; each task includes a number of endpoints, and the endpoints are configured to support collective operations. In such a parallel computer, establishing a group of endpoints includes: receiving a user specification of a set of endpoints included in a global collection of endpoints, where the user specification defines the set in accordance with a predefined virtual representation of the endpoints, the predefined virtual representation is a data structure setting forth an organization of tasks and endpoints included in the global collection of endpoints, and the user specification defines the set of endpoints without a user specification of any particular endpoint; and defining a group of endpoints in dependence upon the predefined virtual representation of the endpoints and the user specification.

  1. Establishing a group of endpoints in a parallel computer

    DOEpatents

    Archer, Charles J.; Blocksome, Michael A.; Ratterman, Joseph D.; Smith, Brian E.; Xue, Hanhong

    2016-02-02

    A parallel computer executes a number of tasks; each task includes a number of endpoints, and the endpoints are configured to support collective operations. In such a parallel computer, establishing a group of endpoints includes: receiving a user specification of a set of endpoints included in a global collection of endpoints, where the user specification defines the set in accordance with a predefined virtual representation of the endpoints, the predefined virtual representation is a data structure setting forth an organization of tasks and endpoints included in the global collection of endpoints, and the user specification defines the set of endpoints without a user specification of any particular endpoint; and defining a group of endpoints in dependence upon the predefined virtual representation of the endpoints and the user specification.
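The idea of selecting endpoints positionally rather than by unique identifier can be sketched as follows (all names are illustrative; the patent does not publish code). Endpoints are organized in a virtual tasks-by-endpoints grid, and the user selects a group by position in that grid:

```python
# Hedged sketch: the "predefined virtual representation" is modelled as a
# tasks-by-endpoints grid, and the user specification is a predicate over
# grid positions -- no endpoint is ever named by a unique identifier.

def define_group(num_tasks, endpoints_per_task, spec):
    """spec: predicate over (task_rank, endpoint_index)."""
    grid = [(t, e) for t in range(num_tasks)
                   for e in range(endpoints_per_task)]
    return [pos for pos in grid if spec(*pos)]

# "The first endpoint of every task" -- a positional specification.
group = define_group(4, 3, lambda task, ep: ep == 0)
```

A collective operation could then be issued over `group` without the user ever having enumerated concrete endpoint identifiers.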

  2. Nuclear data made easily accessible through the Notre Dame Nuclear Database

    NASA Astrophysics Data System (ADS)

    Khouw, Timothy; Lee, Kevin; Fasano, Patrick; Mumpower, Matthew; Aprahamian, Ani

    2014-09-01

    In 1994, the NNDC revolutionized nuclear research by providing a colorful, clickable, searchable database over the internet. Over the last twenty years, web technology has evolved dramatically. Our project, the Notre Dame Nuclear Database, aims to provide a more comprehensive and broadly searchable interactive body of data. The database can be searched through an array of filters, including metadata such as the facility where a measurement was made, the author(s), or the date of publication for the datum of interest. The user interface takes full advantage of HTML, a web markup language; CSS (Cascading Style Sheets), which defines the aesthetics of the website; and JavaScript, a language that can process complex data. A command-line interface is also supported that interacts with the database directly on a user's local machine, providing single-command access to data. This is possible through a standardized API (application programming interface) that relies upon well-defined filtering variables to produce customized search results. We offer an innovative chart of nuclides utilizing scalable vector graphics (SVG) to deliver an unsurpassed level of interactivity supported on all computers and mobile devices. We will present a functional demo of our database at the conference.

  3. WebMapping at school

    NASA Astrophysics Data System (ADS)

    de Lange, Norbert

    2010-11-01

    The paper discusses the position of GIS in Geography as a school subject, especially at German schools. It points out that students only need simple GIS functions in order to explore digital atlases or web-based data viewers. Furthermore, it is widely accepted that learning achievements improve if students engage in independent, exploratory work with information they have produced themselves. These two arguments have led to the development of the WebMapping tool "kartografix_school". It allows users to generate maps with new and individually defined content on the internet. For that purpose the tool contains generalized outlines of all countries of the world as well as of the German states. As these boundaries are given, users can assign new attribute data to the geo-objects. These data are transferred to a graphic presentation. It is possible to define the classification and colours for each class. Users can change and update all information (data as well as the number of classes, definition of classes, and colours) at any time. Moreover, "kartografix_school" offers the possibility to produce maps composed of two layers. All data are stored on a server located at the University of Osnabrück. "kartografix_school" is integrated in an e-learning environment.

  4. Teaching and Learning Activity Sequencing System using Distributed Genetic Algorithms

    NASA Astrophysics Data System (ADS)

    Matsui, Tatsunori; Ishikawa, Tomotake; Okamoto, Toshio

    The purpose of this study is the development of a system supporting teachers' design of lesson plans, especially lesson plans for the new subject "Information Study". We developed a system that generates teaching and learning activity sequences by interlinking lesson activities corresponding to various conditions according to the user's input. Because the user's input comprises multiple pieces of information, contradictions may arise that the system must resolve. This multiobjective optimization problem is solved by Distributed Genetic Algorithms, in which several fitness functions are defined with reference models of the lesson, thinking, and teaching style. Results of various experiments verified the effectiveness and validity of the proposed methods and reference models; on the other hand, some future work on the reference models and evaluation functions was also identified.
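A minimal sketch of the genetic-algorithm idea (our toy, not the authors' distributed system or their reference models): candidate lesson plans are activity sequences, scored by a fitness function, and a simple GA with swap mutation searches the sequence space.

```python
import random

# Illustrative only: one toy fitness function rewards adjacent activity
# pairs that are already in a preferred pedagogical order.  The paper
# combines several fitness functions derived from reference models.

ACTIVITIES = ["intro", "lecture", "exercise", "discussion", "summary"]
PREFERRED_ORDER = {a: i for i, a in enumerate(ACTIVITIES)}

def fitness(seq):
    return sum(PREFERRED_ORDER[a] < PREFERRED_ORDER[b]
               for a, b in zip(seq, seq[1:]))

def evolve(generations=200, pop_size=20, seed=0):
    rng = random.Random(seed)
    pop = [rng.sample(ACTIVITIES, len(ACTIVITIES)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]            # truncation selection
        children = []
        for parent in survivors:
            child = parent[:]
            i, j = rng.sample(range(len(child)), 2)
            child[i], child[j] = child[j], child[i]  # swap mutation
            children.append(child)
        pop = survivors + children                 # elitist replacement
    return max(pop, key=fitness)

best = evolve()
```

The distributed variant in the paper would run several such populations in parallel and exchange individuals between them.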

  5. Lowering the Barrier to Cross-Disciplinary Scientific Data Access via a Brokering Service Built Around a Unified Data Model

    NASA Astrophysics Data System (ADS)

    Lindholm, D. M.; Wilson, A.

    2012-12-01

    The steps many scientific data users go through to use data (after discovering it) can be rather tedious, even when dealing with datasets within their own discipline. Accessing data across domains often seems intractable. We present here, LaTiS, an Open Source brokering solution that bridges the gap between the source data and the user's code by defining a unified data model plus a plugin framework for "adapters" to read data from their native source, "filters" to perform server side data processing, and "writers" to output any number of desired formats or streaming protocols. A great deal of work is being done in the informatics community to promote multi-disciplinary science with a focus on search and discovery based on metadata - information about the data. The goal of LaTiS is to go that last step to provide a uniform interface to read the dataset into computer programs and other applications once it has been identified. The LaTiS solution for integrating a wide variety of data models is to return to mathematical fundamentals. The LaTiS data model emphasizes functional relationships between variables. For example, a time series of temperature measurements can be thought of as a function that maps a time to a temperature. With just three constructs: "Scalar" for a single variable, "Tuple" for a collection of variables, and "Function" to represent a set of independent and dependent variables, the LaTiS data model can represent most scientific datasets at a low level that enables uniform data access. Higher level abstractions can be built on top of the basic model to add more meaningful semantics for specific user communities. LaTiS defines its data model in terms of the Unified Modeling Language (UML). It also defines a very thin Java Interface that can be implemented by numerous existing data interfaces (e.g. NetCDF-Java) such that client code can access any dataset via the Java API, independent of the underlying data access mechanism. 
LaTiS also provides a reference implementation of the data model and server framework (with a RESTful service interface) in the Scala programming language. Scala can be thought of as the next generation of Java. It runs on the Java Virtual Machine and can directly use Java code. Scala improves upon Java's object-oriented capabilities and adds support for functional programming paradigms which are particularly well suited for scientific data analysis. The Scala implementation of LaTiS can be thought of as a Domain Specific Language (DSL) which presents an API that better matches the semantics of the problems scientific data users are trying to solve. Instead of working with bytes, ints, or arrays, the data user can directly work with data as "time series" or "spectra". LaTiS provides many layers of abstraction with which users can interact to support a wide variety of data access and analysis needs.
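The three constructs of the data model can be sketched in a few lines (a Python rendering of the abstract's description, not the Scala reference implementation; class behaviour is our assumption):

```python
# Sketch of the three LaTiS constructs: Scalar (a single variable),
# Tuple (a collection of variables), and Function (a mapping from
# independent to dependent variables).

class Scalar:
    def __init__(self, name, value=None):
        self.name, self.value = name, value

class Tuple:
    def __init__(self, *variables):
        self.variables = variables

class Function:
    """A set of (domain -> range) samples, e.g. time -> temperature."""
    def __init__(self, domain, codomain, samples):
        self.domain, self.codomain = domain, codomain
        self.samples = dict(samples)
    def __call__(self, x):
        return self.samples[x]

# A time series of temperatures as a Function from time to temperature.
temps = Function(Scalar("time"), Scalar("temperature"),
                 [(0, 271.5), (60, 272.0), (120, 272.4)])
```

Higher-level abstractions ("time series", "spectrum") would be built on top of these three constructs, as the abstract describes.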

  6. A new approach to global control of redundant manipulators

    NASA Technical Reports Server (NTRS)

    Seraji, Homayoun

    1989-01-01

    A new and simple approach to configuration control of redundant manipulators is presented. In this approach, the redundancy is utilized to control the manipulator configuration directly in task space, where the task will be performed. A number of kinematic functions are defined to reflect the desirable configuration that will be achieved for a given end-effector position. The user-defined kinematic functions and the end-effector Cartesian coordinates are combined to form a set of task-related configuration variables as generalized coordinates for the manipulator. An adaptive scheme is then utilized to globally control the configuration variables so as to achieve tracking of some desired reference trajectories. This accomplishes the basic task of desired end-effector motion, while utilizing the redundancy to achieve any additional task through the desired time variation of the kinematic functions. The control law is simple and computationally very fast, and does not require the complex manipulator dynamic model.
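The augmented task-space idea can be illustrated with a toy planar arm (our example; Seraji's kinematic functions and adaptive control law are not reproduced). A planar 3-link arm has one redundant degree of freedom for a 2D end-effector task, so adding one user-defined kinematic function yields a fully determined set of configuration variables:

```python
import math

# Toy illustration: 2 end-effector coordinates plus 1 user-defined
# kinematic function (here simply the shoulder angle) give 3 task-related
# configuration variables for a 3-joint planar arm.

L1 = L2 = L3 = 1.0   # link lengths (assumed)

def end_effector(q):
    x = L1*math.cos(q[0]) + L2*math.cos(q[0]+q[1]) + L3*math.cos(q[0]+q[1]+q[2])
    y = L1*math.sin(q[0]) + L2*math.sin(q[0]+q[1]) + L3*math.sin(q[0]+q[1]+q[2])
    return x, y

def kinematic_function(q):
    return q[0]          # user-defined choice: control the shoulder angle

def configuration_variables(q):
    x, y = end_effector(q)
    return (x, y, kinematic_function(q))

stretched = configuration_variables((0.0, 0.0, 0.0))
```

Tracking desired trajectories of all three variables fixes the arm posture, which is how the redundancy is put to use for an additional task.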

  7. Remote Data Exploration with the Interactive Data Language (IDL)

    NASA Technical Reports Server (NTRS)

    Galloy, Michael

    2013-01-01

    A difficulty for many NASA researchers is that the data to analyze are often located remotely from the scientist and are too large to transfer for local analysis. Researchers have developed the Data Access Protocol (DAP) for accessing remote data. Presently one can use DAP from within IDL, but the existing IDL-DAP interface is both limited and cumbersome. A more powerful and user-friendly interface to DAP for IDL has been developed. Users are able to browse remote data sets graphically, select partial data to retrieve, import that data and make customized plots, and run an interactive IDL command-line session simultaneously with the remote visualization. All of these IDL-DAP tools are easily and seamlessly usable by any IDL user. IDL and DAP are both widely used in science, but were not easily used together. The IDL DAP bindings were incomplete and had numerous bugs that prevented their serious use. For example, the existing bindings did not read DAP Grid data, which is the organization of nearly all NASA datasets currently served via DAP. This project uniquely provides a fully featured, user-friendly interface to DAP from IDL, both from the command line and from a GUI application. The DAP Explorer GUI application makes browsing a dataset more user-friendly, while also providing the capability to run user-defined functions on specified data. Methods for running remote functions on the DAP server were investigated, and a technique for accomplishing this task was decided upon.

  8. User modeling techniques for enhanced usability of OPSMODEL operations simulation software

    NASA Technical Reports Server (NTRS)

    Davis, William T.

    1991-01-01

    The PC-based OPSMODEL operations software for modeling and simulation of space station crew activities supports engineering and cost analyses and operations planning. Using top-down modeling, the level of detail required in the database can be limited to that commensurate with the results required of any particular analysis. To perform a simulation, a resource environment consisting of locations, crew definition, equipment, and consumables is first defined. Activities to be simulated are then defined as operations and scheduled as desired. These operations are defined within a 1000-level priority structure. The simulation on OPSMODEL, then, consists of user-defined, user-scheduled operations executing within an environment of user-defined resource and priority constraints. Techniques for prioritizing operations to realistically model a representative daily scenario of on-orbit space station crew activities are discussed. The large number of priority levels allows priorities to be assigned commensurate with the detail necessary for a given simulation. Several techniques for realistic modeling of day-to-day work carryover are also addressed.
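The priority-driven scheduling with day-to-day carryover might be sketched as follows (a toy under our own assumptions, not OPSMODEL itself): each simulated day the highest-priority operations that fit the available crew time run, and the remainder carries over to the next day.

```python
# Toy rendering of priority scheduling with work carryover.  Operations
# carry a priority (higher wins, within a 1..1000 structure) and a
# duration in crew hours.

def simulate_days(operations, crew_hours_per_day, days):
    """operations: list of (name, priority, hours)."""
    backlog = sorted(operations, key=lambda op: -op[1])
    schedule = []
    for _ in range(days):
        remaining, today, carryover = crew_hours_per_day, [], []
        for op in backlog:
            if op[2] <= remaining:
                today.append(op[0])
                remaining -= op[2]
            else:
                carryover.append(op)   # day-to-day work carryover
        schedule.append(today)
        backlog = carryover
    return schedule

ops = [("eva-prep", 900, 5), ("exercise", 800, 2), ("maintenance", 400, 4)]
plan = simulate_days(ops, crew_hours_per_day=8, days=2)
```

Here "maintenance" does not fit into day one after the higher-priority operations run, so it carries over and executes on day two.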

  9. Mass-storage management for distributed image/video archives

    NASA Astrophysics Data System (ADS)

    Franchi, Santina; Guarda, Roberto; Prampolini, Franco

    1993-04-01

    The realization of an image/video database requires a specific design for both database structures and mass-storage management. These issues were addressed in the project for the digital image/video database system designed at the IBM SEMEA Scientific & Technical Solution Center. Proper database structures have been defined to catalog image/video coding techniques with their related parameters and descriptions of image/video contents. User workstations and servers are distributed along a local area network. Image/video files are not managed directly by the DBMS server. Because of their large size, they are stored outside the database on network devices. The database contains the pointers to the image/video files and the description of the storage devices. The system can use different kinds of storage media, organized in a hierarchical structure. Three levels of functions are available to manage the storage resources. The functions of the lower level provide media management: they allow the user to catalog devices and to modify device status and device network location. The medium level manages image/video files on a physical basis, handling file migration between high-capacity media and low-access-time media. The functions of the upper level work on image/video files on a logical basis, as they archive, move, and copy image/video data selected by user-defined queries. These functions are used to support the implementation of a storage-management strategy. The database information about the characteristics of both storage devices and coding techniques is used by the third-level functions to fit delivery/visualization requirements and to reduce archiving costs.
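The three function levels can be sketched as follows (the function names and signatures are invented for illustration; the paper's actual interfaces are not published here):

```python
# Sketch of the three-level storage management described above:
# lower level = media management, medium level = physical file migration,
# upper level = logical archiving driven by user-defined queries.

devices = {}   # lower level: the device catalog
files = {}     # where each image/video file currently lives

def catalog_device(name, capacity_gb, access="slow"):
    devices[name] = {"capacity_gb": capacity_gb, "access": access}

def migrate(filename, target_device):          # medium level (physical)
    assert target_device in devices
    files[filename] = target_device

def archive(query, target_device):             # upper level (logical)
    for name in [f for f in files if query(f)]:
        migrate(name, target_device)

catalog_device("raid", 100, access="fast")
catalog_device("tape", 10000, access="slow")
files["scan-001.img"] = "raid"
files["clip-002.vid"] = "raid"
archive(lambda f: f.endswith(".img"), "tape")  # user-defined query
```

The upper-level `archive` call never names devices or files explicitly; it selects files by a query and delegates the physical move to the medium level, mirroring the layering in the abstract.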

  10. The Ultimate Big Data Enterprise Initiative: Defining Functional Capabilities for an International Information System (IIS) for Orbital Space Data (OSD)

    NASA Astrophysics Data System (ADS)

    Raygan, R.

    Global collaboration in support of an International Information System (IIS) for Orbital Space Data (OSD) literally requires a global enterprise. As with many information technology enterprise initiatives attempting to corral the desires of business with the budgets and limitations of technology, Space Situational Awareness (SSA) includes many of the same challenges: 1) an adaptive/intuitive dashboard that facilitates user-experience design for a variety of users; 2) asset management of hundreds of thousands of objects moving at thousands of miles per hour, hundreds of miles up in space; 3) normalization and integration of diverse data in various languages, possibly hidden or protected from easy access; 4) expectations of near-real-time information availability coupled with predictive analysis to affect decisions before critical points of no return, such as Space Object Conjunction Assessment (CA); 5) data ownership, management, taxonomy, and accuracy; 6) integrated metrics and easily modified algorithms for "what if" analysis. This paper proposes an approach to defining the functional capabilities for an IIS for OSD. These functional capabilities not only address previously identified gaps in current systems but also incorporate lessons learned from other big-data, enterprise, and agile information technology initiatives that correlate to the space domain. Viewing the IIS as the "data service provider" allows adoption of existing information technology processes, which strengthen governance and assure service consumers of certain levels of service dependability and accuracy.

  11. SutraGUI, a graphical-user interface for SUTRA, a model for ground-water flow with solute or energy transport

    USGS Publications Warehouse

    Winston, Richard B.; Voss, Clifford I.

    2004-01-01

    This report describes SutraGUI, a flexible graphical user interface (GUI) that supports two-dimensional (2D) and three-dimensional (3D) simulation with the U.S. Geological Survey (USGS) SUTRA ground-water-flow and transport model (Voss and Provost, 2002). SutraGUI allows the user to create SUTRA ground-water models graphically. SutraGUI provides all of the graphical functionality required for setting up and running SUTRA simulations that range from basic to sophisticated, but it is also possible for advanced users to apply programmable features within Argus ONE to meet the unique demands of particular ground-water modeling projects. SutraGUI is a public-domain computer program designed to run with the proprietary Argus ONE package, which provides 2D Geographic Information System (GIS) and meshing support. For 3D simulation, GIS and meshing support is provided by programming contained within SutraGUI. When preparing a 3D SUTRA model, the model and all of its features are viewed within Argus ONE in 2D projection. For 2D models, SutraGUI is only slightly changed in functionality from the previous 2D-only version (Voss and others, 1997), and it provides visualization of simulation results. In 3D, only model preparation is supported by SutraGUI, and 3D simulation results may be viewed in SutraPlot (Souza, 1999) or Model Viewer (Hsieh and Winston, 2002). A comprehensive online Help system is included in SutraGUI. For 3D SUTRA models, the 3D model domain is conceptualized as bounded on the top and bottom by 2D surfaces. The 3D domain may also contain internal surfaces extending across the model that divide the domain into tabular units, which can represent hydrogeologic strata or other features intended by the user. These surfaces can be non-planar and non-horizontal. The 3D mesh is defined by one or more 2D meshes at different elevations that coincide with these surfaces. If the nodes in the 3D mesh are vertically aligned, only a single 2D mesh is needed.
For nonaligned meshes, two or more 2D meshes of similar connectivity are used. Between each set of 2D meshes (and model surfaces), the vertical space in the 3D mesh is evenly divided into a user-specified number of layers of finite elements. Boundary conditions may be specified for 3D models in SutraGUI using a variety of geometric shapes that may be located freely within the 3D model domain. These shapes include points, lines, sheets, and solids. These are represented by 2D contours (within the vertically-projected Argus ONE view) with user-defined elevations. In addition, boundary conditions may be specified for 3D models as points, lines, and areas that are located exactly within the surfaces that define the model top and the bottoms of the tabular units. Aquifer properties may be specified separately for each tabular unit. If the aquifer properties vary vertically within a unit, SutraGUI provides the Sutra_Z function that can be used to specify such variation.

  12. SCAMP and the ASP

    NASA Astrophysics Data System (ADS)

    Idehara, H.; Carbon, D. F.

    2004-12-01

    We present two new, publicly available tools to support the examination and interpretation of spectra. SCAMP is a specialized graphical user interface for MATLAB. It allows researchers to rapidly intercompare sets of observational, theoretical, and/or laboratory spectra. Users have extensive control over the colors and placement of individual spectra, and over spectrum normalization from one spectral region to another. Spectra can be interactively assigned to user-defined groups and the groupings recalled at a later time. The user can measure/record positions and intensities of spectral features, interactively spline-fit spectra, and normalize spectra by fitted splines. User-defined wavelengths can be automatically highlighted in SCAMP plots. The user can save/print annotated graphical output suitable for a scientific notebook depicting the work at any point. The ASP is a WWW portal that provides interactive access to two spectrum data sets: a library of synthetic stellar spectra and a library of laboratory PAH spectra. The synthetic stellar spectra in the ASP are appropriate to the giant branch with an assortment of compositions. Each spectrum spans the full range from 2 to 600 microns at a variety of resolutions. The ASP is designed to allow users to quickly identify individual features at any resolution that arise from any of the included isotopic species. The user may also retrieve the depth of formation of individual features at any resolution. PAH spectra accessible through the ASP are drawn from the extensive library of spectra measured by the NASA Ames Astrochemistry Laboratory. The user may interactively choose any subset of PAHs in the data set, combine them with user-defined weights and temperatures, and view/download the resultant spectrum at any user-defined resolution. This work was funded by the NASA Advanced Supercomputing Division, NASA Ames Research Center.

  13. PyRosetta: a script-based interface for implementing molecular modeling algorithms using Rosetta.

    PubMed

    Chaudhury, Sidhartha; Lyskov, Sergey; Gray, Jeffrey J

    2010-03-01

    PyRosetta is a stand-alone Python-based implementation of the Rosetta molecular modeling package that allows users to write custom structure prediction and design algorithms using the major Rosetta sampling and scoring functions. PyRosetta contains Python bindings to libraries that define Rosetta functions including those for accessing and manipulating protein structure, calculating energies and running Monte Carlo-based simulations. PyRosetta can be used in two ways: (i) interactively, using iPython and (ii) script-based, using Python scripting. Interactive mode contains a number of help features and is ideal for beginners while script-mode is best suited for algorithm development. PyRosetta has similar computational performance to Rosetta, can be easily scaled up for cluster applications and has been implemented for algorithms demonstrating protein docking, protein folding, loop modeling and design. PyRosetta is a stand-alone package available at http://www.pyrosetta.org under the Rosetta license which is free for academic and non-profit users. A tutorial, user's manual and sample scripts demonstrating usage are also available on the web site.

  14. PyRosetta: a script-based interface for implementing molecular modeling algorithms using Rosetta

    PubMed Central

    Chaudhury, Sidhartha; Lyskov, Sergey; Gray, Jeffrey J.

    2010-01-01

    Summary: PyRosetta is a stand-alone Python-based implementation of the Rosetta molecular modeling package that allows users to write custom structure prediction and design algorithms using the major Rosetta sampling and scoring functions. PyRosetta contains Python bindings to libraries that define Rosetta functions including those for accessing and manipulating protein structure, calculating energies and running Monte Carlo-based simulations. PyRosetta can be used in two ways: (i) interactively, using iPython and (ii) script-based, using Python scripting. Interactive mode contains a number of help features and is ideal for beginners while script-mode is best suited for algorithm development. PyRosetta has similar computational performance to Rosetta, can be easily scaled up for cluster applications and has been implemented for algorithms demonstrating protein docking, protein folding, loop modeling and design. Availability: PyRosetta is a stand-alone package available at http://www.pyrosetta.org under the Rosetta license which is free for academic and non-profit users. A tutorial, user's manual and sample scripts demonstrating usage are also available on the web site. Contact: pyrosetta@graylab.jhu.edu PMID:20061306

  15. Gstat: a program for geostatistical modelling, prediction and simulation

    NASA Astrophysics Data System (ADS)

    Pebesma, Edzer J.; Wesseling, Cees G.

    1998-01-01

    Gstat is a computer program for variogram modelling and geostatistical prediction and simulation. It provides a generic implementation of the multivariable linear model with trends modelled as a linear function of coordinate polynomials or of user-defined base functions, and independent or dependent, geostatistically modelled, residuals. Simulation in gstat comprises conditional or unconditional (multi-) Gaussian sequential simulation of point values or block averages, or (multi-) indicator sequential simulation. Besides many of the popular options found in other geostatistical software packages, gstat offers the unique combination of (i) an interactive user interface for modelling variograms and generalized covariances (residual variograms), which uses the device-independent plotting program gnuplot for graphical display, (ii) support for several ASCII and binary data and map file formats for input and output, (iii) a concise, intuitive and flexible command language, (iv) user customization of program defaults, (v) no built-in limits, and (vi) free, portable ANSI-C source code. This paper describes the class of problems gstat can solve, and addresses aspects of efficiency and implementation, managing geostatistical projects, and relevant technical details.
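The trend part of the model, a linear function of coordinate polynomials, can be illustrated with an ordinary-least-squares fit; the geostatistically modelled residual is omitted in this toy (which is ours, not gstat's code):

```python
# Fit the first-order coordinate-polynomial trend z = b0 + b1*x + b2*y
# by ordinary least squares, via the normal equations.

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c]
                              for c in range(r + 1, n))) / M[r][r]
    return x

def fit_trend(points):
    """points: (x, y, z) observations; returns (b0, b1, b2)."""
    X = [(1.0, x, y) for x, y, _ in points]
    z = [p[2] for p in points]
    XtX = [[sum(a[i] * a[j] for a in X) for j in range(3)] for i in range(3)]
    Xtz = [sum(a[i] * zi for a, zi in zip(X, z)) for i in range(3)]
    return solve(XtX, Xtz)

# Data generated exactly from z = 1 + 2x + 3y, so OLS recovers it.
pts = [(0, 0, 1), (1, 0, 3), (0, 1, 4), (1, 1, 6), (2, 1, 8)]
b0, b1, b2 = fit_trend(pts)
```

In gstat's full model, the residual z minus this trend would then be modelled with a variogram and kriged or simulated.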

  16. Modularizing Spatial Ontologies for Assisted Living Systems

    NASA Astrophysics Data System (ADS)

    Hois, Joana

    Assisted living systems are intended to support daily-life activities in users' homes by automating and monitoring the behavior of the environment while interacting with the user in a non-intrusive way. The knowledge base of such systems therefore has to define thematically different aspects of the environment, most of them related to space: basic spatial floor-plan information, pieces of technical equipment in the environment together with their functions and spatial ranges, activities users can perform, entities that occur in the environment, etc. In this paper, we present thematically different ontologies, each of which describes environmental aspects from a particular perspective. The resulting modular structure allows the selection of application-specific ontologies as necessary. This hides information and reduces complexity in terms of the represented spatial knowledge and the practicability of reasoning. We motivate and present the different spatial ontologies applied to an ambient assisted living application.

  17. Satellite services system analysis study. Volume 1, part 2: Executive summary

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The early mission model was developed through a survey of the potential user market. Service functions were defined, and a group of design reference missions was selected to represent needs for each of the service functions. Servicing concepts were developed through mission analysis and STS timeline constraint analysis. The hardware needed for accomplishing the service functions was identified, with emphasis placed on applying equipment in the current NASA inventory and equipment in advanced stages of planning. A more comprehensive service model was developed based on the NASA and DoD mission models segregated by mission class. The number of service events of each class was estimated based on average revisit and service assumptions. Service kits were defined as collections of equipment applicable to performing one or more service functions. Preliminary design was carried out on a selected set of hardware needed for early service missions. The organization and costing of the satellite service systems were addressed.

  18. Swarm formation control utilizing elliptical surfaces and limiting functions.

    PubMed

    Barnes, Laura E; Fields, Mary Anne; Valavanis, Kimon P

    2009-12-01

    In this paper, we present a strategy for organizing swarms of unmanned vehicles into a formation by utilizing artificial potential fields that were generated from normal and sigmoid functions. These functions construct the surface on which swarm members travel, controlling the overall swarm geometry and the individual member spacing. Nonlinear limiting functions are defined to provide tighter swarm control by modifying and adjusting a set of control variables that force the swarm to behave according to set constraints, formation, and member spacing. The artificial potential functions and limiting functions are combined to control swarm formation, orientation, and swarm movement as a whole. Parameters are chosen based on desired formation and user-defined constraints. This approach is computationally efficient and scales well to different swarm sizes, to heterogeneous systems, and to both centralized and decentralized swarm models. Simulation results are presented for a swarm of 10 and 40 robots that follow circle, ellipse, and wedge formations. Experimental results are included to demonstrate the applicability of the approach on a swarm of four custom-built unmanned ground vehicles (UGVs).
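The two ingredients named in the abstract, a normal (Gaussian) potential shaping the surface and a sigmoid limiting function bounding the control output, can be sketched for a circular formation (our simplification of the elliptical case, not the authors' controller):

```python
import math

# Toy illustration: a Gaussian "normal function" shapes a potential well
# around the desired formation radius R, and a sigmoid limiting function
# bounds the resulting speed command to (-vmax, vmax).

def sigmoid_limit(v, vmax=1.0, gain=2.0):
    """Squash a raw speed command into (-vmax, vmax)."""
    return vmax * (2.0 / (1.0 + math.exp(-gain * v)) - 1.0)

def radial_speed(r, R=2.0, depth=1.0, width=0.5):
    """Speed command from the gradient of a Gaussian well at radius R."""
    # potential U(r) = -depth * exp(-((r - R) / width)^2)
    dU = depth * 2.0 * (r - R) / width**2 * math.exp(-((r - R) / width) ** 2)
    return sigmoid_limit(-dU)   # descend the gradient, speed-limited

outside = radial_speed(2.6)   # robot outside the ring -> commanded inward
inside = radial_speed(1.4)    # robot inside the ring -> commanded outward
```

Swarm members anywhere in the plane are driven toward the formation radius, while the sigmoid keeps the commanded speed within actuator limits regardless of how steep the potential gradient is.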

  19. AMICAL: An aid for architectural synthesis and exploration of control circuits

    NASA Astrophysics Data System (ADS)

    Park, Inhag

    AMICAL is an architectural synthesis system for control-flow dominated circuits. It accepts a behavioral finite state machine specification, on which scheduling and register allocation are performed, and produces an abstract architecture specification that may feed existing silicon compilers acting at the logic and register-transfer levels. AMICAL consists of five main functions allowing automatic, interactive, and manual synthesis, as well as combinations of these methods: a synthesizer, a graphics editor, a verifier, an evaluator, and a documentor. Automatic synthesis is achieved by algorithms that allocate both functional units, stored in an expandable user-defined library, and connections. AMICAL also allows the designer to interrupt the synthesis process at any stage and make interactive modifications via a specially designed graphics editor. The user's modifications are verified and evaluated to ensure that no design rules are broken and that any imposed constraints are still met. A documentor provides the designer with status and feedback reports from the synthesis process.

  20. Altered resting-state connectivity in adolescent cannabis users.

    PubMed

    Orr, Catherine; Morioka, Rowen; Behan, Brendan; Datwani, Sameer; Doucet, Marika; Ivanovic, Jelena; Kelly, Clare; Weierstall, Karen; Watts, Richard; Smyth, Bobby; Garavan, Hugh

    2013-11-01

    Cannabis is the most commonly used illicit drug in adolescence. Heavy use is associated with deficits on a broad range of cognitive functions, and heavy use during adolescence may impact the development of gray and white matter. The aim of this study was to examine differences in intrinsic brain activity and connectivity associated with cannabis dependence in adolescence using whole-brain voxelwise approaches. Adolescents admitted to a drug-treatment facility for cannabis dependence (n = 17) and age-matched controls (n = 18) were compared on a measure of oscillations in the low-frequency blood oxygen level-dependent signal at rest (the fractional amplitude of low-frequency fluctuations, fALFF; 0.01-0.1 Hz) and on interhemispheric resting-state functional connectivity (RSFC) using voxel-mirrored homotopic connectivity. The cannabis-dependent population showed increased fALFF activity compared to the control group in right hemisphere regions including the superior parietal gyrus, superior frontal gyrus, inferior frontal gyrus, inferior semilunar lobe of the cerebellum, and the inferior temporal gyrus. Post-hoc analyses revealed stronger intra-hemispheric functional connectivity between these functionally defined regions of interest (ROIs) in the cannabis-dependent population than in the controls. Reduced interhemispheric connectivity was observed in the cannabis users compared to controls in the pyramis of the cerebellum and the superior frontal gyrus. Controls showed reduced interhemispheric connectivity compared to users in the supramarginal gyrus. The reduced interhemispheric RSFC in adolescent cannabis users complements previous reports of white matter deficits associated with cannabis use. The evidence of elevated connectivity within the right hemisphere may reflect a compensatory mechanism. Combined, the results suggest that altered intrinsic connectivity may be characteristic of adolescent cannabis dependence.

  1. Rational-spline approximation with automatic tension adjustment

    NASA Technical Reports Server (NTRS)

    Schiess, J. R.; Kerr, P. A.

    1984-01-01

    An algorithm for weighted least-squares approximation with rational splines is presented. A rational spline is a cubic function containing a distinct tension parameter for each interval defined by two consecutive knots. For zero tension, the rational spline is identical to a cubic spline; for very large tension, the rational spline is a linear function. The approximation algorithm incorporates an algorithm which automatically adjusts the tension on each interval to fulfill a user-specified criterion. Finally, an example is presented comparing results of the rational spline with those of the cubic spline.
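    The tension behavior described above can be illustrated with a small sketch. The parameterization below is one common rational-spline form (after Späth); the exact form used in the NASA report may differ, and the coefficient names are illustrative:

    ```python
    def rational_spline(u, A, B, C, D, p):
        """Evaluate one rational-spline segment at u in [0, 1].

        A, B are the endpoint values; C, D shape the interior; p >= 0 is
        the tension.  At p = 0 the correction terms are cubic (ordinary
        cubic-spline behavior); as p grows they flatten toward linear, so
        the segment tends to a straight line.  This is one common
        parameterization, not necessarily the report's exact one.
        """
        return (A * (1 - u) + B * u
                + C * ((1 - u) ** 3 / (1 + p * u) - (1 - u))
                + D * (u ** 3 / (1 + p * (1 - u)) - u))
    ```

    Regardless of the tension value, the segment interpolates its endpoint values A and B, which is what lets the algorithm adjust p per interval without disturbing continuity at the knots.
    
    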

  2. Query by forms: User-oriented relational database retrieving system and its application in analysis of experiment data

    NASA Astrophysics Data System (ADS)

    Skotniczny, Zbigniew

    1989-12-01

    The Query by Forms (QbF) system is a user-oriented interactive tool for querying large relational databases with minimal query-definition cost. The system was developed under the assumption that the user's time and effort in defining the needed queries are the most severe bottleneck. The system may be applied to any Rdb/VMS database system and is recommended for the specific information systems of any project where end-user queries cannot be foreseen. The tool is dedicated to specialists in an application domain who have to analyze data maintained in a database from any needed point of view and who do not need to know commercial database languages. The paper presents the system developed as a compromise between its functionality and usability. User-system communication via a menu-driven, tree-like structure of screen forms, which produces a query definition and execution, is discussed in detail. Output of query results (printed reports and graphics) is also discussed. Finally, the paper shows one application of QbF within the HERA project.

  3. Practical multipeptide synthesis: dedicated software for the definition of multiple, overlapping peptides covering polypeptide sequences.

    PubMed

    Heegaard, P M; Holm, A; Hagerup, M

    1993-01-01

    A personal computer program for the conversion of linear amino acid sequences to multiple, small, overlapping peptide sequences has been developed. Peptide lengths and "jumps" (the distance between two consecutive overlapping peptides) are defined by the user. To facilitate the use of the program for parallel solid-phase chemical peptide syntheses for the synchronous production of multiple peptides, amino acids at each acylation step are laid out by the program in a convenient standard multi-well setup. Also, the total number of equivalents, as well as the derived amount in milligrams (depending on user-defined equivalent weights and molar surplus), of each amino acid are given. The program facilitates the implementation of multipeptide synthesis, e.g., for the elucidation of polypeptide structure-function relationships, and greatly reduces the risk of introducing mistakes at the planning step. It is written in Pascal and runs on any DOS-based personal computer. No special graphic display is needed.
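    The core sequence-slicing step the program performs can be sketched in a few lines. The function and parameter names below are illustrative (the original is a Pascal DOS program, not this Python):

    ```python
    def overlapping_peptides(sequence, length, jump):
        """Slice a linear amino acid sequence into overlapping peptides.

        `length` is the peptide length and `jump` the offset between the
        start positions of two consecutive peptides.  Only full-length
        peptides are returned.  Names are illustrative, not the original
        program's.
        """
        return [sequence[i:i + length]
                for i in range(0, len(sequence) - length + 1, jump)]
    ```

    For example, a 9-residue sequence with length 5 and jump 2 yields three peptides, each overlapping its neighbor by 3 residues.
    
    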

  4. Development Requirements for Spacesuit Elbow Joint

    NASA Technical Reports Server (NTRS)

    Peters, Benjamin

    2017-01-01

    Functional requirements for the spacesuit elbow joint: 1) The system is a conformal, single-axis spacesuit pressurized joint that encloses the elbow joint of the suited user and uses a defined interface to connect to the suit systems on either side of the joint. 2) The system shall be designed to bear the loads incurred from the internal pressure of the system, as well as the expected loads induced by the user, while enabling the user to move the joint through the required range of motion. The joint torque of the system experienced by the user shall remain at or below the required specification for the entire range of motion. 3) The design shall be constructed, at a minimum, as a two-layer system. The internal, air-tight layer shall be referred to as the bladder, and the layer on the unpressurized side of the bladder shall be referred to as the restraint. The design of the system may include additional features or layers, such as axial webbing, to meet the overall requirements of the design.

  5. Constraint programming based biomarker optimization.

    PubMed

    Zhou, Manli; Luo, Youxi; Sun, Guoquan; Mai, Guoqin; Zhou, Fengfeng

    2015-01-01

    Efficient and intuitive characterization of biological big data is becoming a major challenge for modern bio-OMIC based scientists. Interactive visualization and exploration of big data is proven to be one of the successful solutions. Most of the existing feature selection algorithms do not allow interactive inputs from users in the optimization process of feature selection. This study investigates this question by fixing a few user-input features in the final selected feature subset and formulating these user-input features as constraints in a programming model. The proposed algorithm, fsCoP (feature selection based on constrained programming), performs similarly to or much better than the existing feature selection algorithms, even with the constraints from both the literature and the existing algorithms. An fsCoP biomarker may be intriguing for further wet lab validation, since it satisfies both the classification optimization function and the biomedical knowledge. fsCoP may also be used for the interactive exploration of bio-OMIC big data by interactively adding user-defined constraints for modeling.
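    The idea of pinning user-input features while optimizing the rest can be illustrated with a greedy sketch. Note this is not fsCoP's actual constraint-programming model; the scoring function and all names are placeholders:

    ```python
    def constrained_forward_selection(features, score, fixed, k):
        """Greedy forward selection that always keeps user-fixed features.

        `score(subset)` is any user-supplied evaluation function (e.g.
        cross-validated classification accuracy); `fixed` lists features
        the user pins into the final subset, mirroring the constraint idea
        in fsCoP.  fsCoP itself solves a constraint-programming model, not
        this greedy sketch.
        """
        selected = list(fixed)
        while len(selected) < k:
            candidates = [f for f in features if f not in selected]
            best = max(candidates, key=lambda f: score(selected + [f]))
            selected.append(best)
        return selected
    ```

    The fixed features survive regardless of their individual contribution to the score, which is exactly the behavior ordinary feature selection does not offer.
    
    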

  6. Enabling User Preferences Through Data Exchange

    DOT National Transportation Integrated Search

    1997-08-01

    This paper describes a process, via user-air traffic management (ATM) data exchange, for enabling user preferences in an ATM-based system. User preferences may be defined in terms of a four-dimensional (4D) user-preferred trajectory, or a seri...

  7. Development of a mental health smartphone app: perspectives of mental health service users.

    PubMed

    Goodwin, John; Cummins, John; Behan, Laura; O'Brien, Sinead M

    2016-10-01

    Current mental health policy emphasises the importance of service user involvement in the delivery of care. Information Technology can have an effect on quality and efficiency of care. The aim of this study is to gain the viewpoint of service users from a local mental health service in developing a mental health app. A qualitative descriptive approach was used. Eight volunteers aged 18-49 years were interviewed with the aid of a semi-structured questionnaire. Interviewees defined a good app by its ease of use. Common themes included availability of contact information, identifying triggers, the ability to rate mood/anxiety levels on a scale, guided relaxation techniques, and the option to personalise the app. The researchers will aim to produce an app that is easily accessible, highly personalisable and will include functions highlighted as important (e.g., contact information). This research will assist in the development of an easy-to-use app that could increase access to services, and allow service users to take an active role in their care. In previous studies, apps were developed without the involvement of service users. This study recognises the important role of service users in this area.

  8. Everyday life for users of electric wheelchairs - a qualitative interview study.

    PubMed

    Blach Rossen, Camilla; Sørensen, Bodil; Würtz Jochumsen, Bente; Wind, Gitte

    2012-09-01

    The aim of this paper is to explore how users of electric wheelchairs experience their everyday life and how their electric wheelchairs influence their daily occupation. Occupation is defined as a personalized dynamic interaction between person, task and environment, and implies the value and meaning attached. Nine semi-structured interviews were conducted with experienced electric wheelchair users. ValMo was used as the theoretical framework for both interviewing and the analysis. The transcribed interviews were analysed using thematic analysis. Findings revealed key elements in electric wheelchair users' experience of how the use of a wheelchair influences everyday life and occupation. Four central themes emerged from the participants' experiences 1) The functionality of the wheelchair, 2) The wheelchair as an extension of the body, 3) The wheelchair and social life, and 4) The wheelchair and identity issues. The themes were interrelated and show how all levels of occupation were influenced both in a positive and negative way, and how it affected identity. It is essential that professionals working with electric wheelchair users are aware of how all levels of occupation and identity are influenced by using a wheelchair. This will assist professionals in supporting the users living an autonomous and meaningful life.

  9. Design document for the MOODS Data Management System (MDMS), version 1.0

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The MOODS Data Management System (MDMS) provides access to the Master Oceanographic Observation Data Set (MOODS) which is maintained by the Naval Oceanographic Office (NAVOCEANO). The MDMS incorporates database technology in providing seamless access to parameter (temperature, salinity, soundspeed) vs. depth observational profile data. The MDMS is an interactive software application with a graphical user interface (GUI) that supports user control of MDMS functional capabilities. The purpose of this document is to define and describe the structural framework and logical design of the software components/units which are integrated into the major computer software configuration item (CSCI) identified as MDMS, Version 1.0. The preliminary design is based on functional specifications and requirements identified in the governing Statement of Work prepared by the Naval Oceanographic Office (NAVOCEANO) and distributed as a request for proposal by the National Aeronautics and Space Administration (NASA).

  10. Design document for the Surface Currents Data Base (SCDB) Management System (SCDBMS), version 1.0

    NASA Technical Reports Server (NTRS)

    Krisnnamagaru, Ramesh; Cesario, Cheryl; Foster, M. S.; Das, Vishnumohan

    1994-01-01

    The Surface Currents Database Management System (SCDBMS) provides access to the Surface Currents Data Base (SCDB) which is maintained by the Naval Oceanographic Office (NAVOCEANO). The SCDBMS incorporates database technology in providing seamless access to surface current data. The SCDBMS is an interactive software application with a graphical user interface (GUI) that supports user control of SCDBMS functional capabilities. The purpose of this document is to define and describe the structural framework and logical design of the software components/units which are integrated into the major computer software configuration item (CSCI) identified as the SCDBMS, Version 1.0. The preliminary design is based on functional specifications and requirements identified in the governing Statement of Work prepared by the Naval Oceanographic Office (NAVOCEANO) and distributed as a request for proposal by the National Aeronautics and Space Administration (NASA).

  11. Path tracking control of an omni-directional walker considering pressures from a user.

    PubMed

    Tan, Renpeng; Wang, Shuoyu; Jiang, Yinlai; Ishida, Kenji; Fujie, Masakatsu G

    2013-01-01

    An omni-directional walker (ODW) is being developed to support people with walking disabilities in walking rehabilitation. The training paths, which the user follows during rehabilitation, are defined by physical therapists and stored in the ODW. To obtain a good training effect, the defined training paths need to be followed accurately. However, the ODW deviates from the training path in real rehabilitation, which is caused by variation of the whole system's parameters due to the force from the user. In this paper, the characteristics of the pressures from a user are measured, based on which an adaptive controller is proposed to deal with this problem; it is validated in an experiment in which a pseudo-handicapped person follows the ODW. The experimental results show that the proposed method can control the ODW to accurately follow the defined path with or without a user.

  12. Use of complementary and alternative medicine among patients: classification criteria determine level of use.

    PubMed

    Kristoffersen, Agnete Egilsdatter; Fønnebø, Vinjar; Norheim, Arne Johan

    2008-10-01

    Self-reported use of complementary and alternative medicine (CAM) among patients varies widely between studies, possibly because the definition of a CAM user is not comparable. This makes it difficult to compare studies. The aim of this study is to present a six-level model for classifying patients' reported exposure to CAM. Prayer, physical exercise, special diets, over-the-counter products/CAM techniques, and personal visits to a CAM practitioner are successively removed from the model in a reductive fashion. By applying the model to responses given by Norwegian patients with cancer, we found that 72% use CAM if the user was defined to include all types of CAM. This proportion was reduced successively to only 11% in the same patient group when a CAM user was defined as one visiting a CAM practitioner four or more times. When considering a sample of 10 recently published studies of CAM use among patients with breast cancer, we found 98% use when the CAM user was defined to include all sorts of CAM. This proportion was reduced successively to only 20% when a CAM user was defined as one visiting a CAM practitioner. We recommend that future surveys of CAM use report at more than one level and clarify which intensity level of CAM use the report is based on.
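    The reductive classification idea can be made concrete with a small sketch; the category names and their ordering below are illustrative, not the paper's exact six-level model:

    ```python
    # Reductive classification: each successive level drops one CAM
    # category from the definition of "CAM user".  Category names and
    # ordering are illustrative, not the paper's exact model.
    LEVELS = [
        "prayer",
        "physical exercise",
        "special diets",
        "OTC products/techniques",
        "practitioner visit",
    ]

    def strictest_level(reported):
        """Return the strictest level index at which the patient still
        counts as a CAM user (0 = broadest definition only; -1 = not a
        CAM user under any definition)."""
        strict = -1
        for i in range(len(LEVELS)):
            if set(LEVELS[i:]) & set(reported):
                strict = i
        return strict
    ```

    A patient who only prays counts as a user solely under the broadest definition, while one who visits a practitioner counts at every level, which is why reported prevalence falls as the definition narrows.
    
    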

  13. SEQ-REVIEW: A tool for reviewing and checking spacecraft sequences

    NASA Astrophysics Data System (ADS)

    Maldague, Pierre F.; El-Boushi, Mekki; Starbird, Thomas J.; Zawacki, Steven J.

    1994-11-01

    A key component of JPL's strategy to make space missions faster, better and cheaper is the Advanced Multi-Mission Operations System (AMMOS), a ground software intensive system currently in use and in further development. AMMOS intends to eliminate the cost of re-engineering a ground system for each new JPL mission. This paper discusses SEQ-REVIEW, a component of AMMOS that was designed to facilitate and automate the task of reviewing and checking spacecraft sequences. SEQ-REVIEW is a smart browser for inspecting files created by other sequence generation tools in the AMMOS system. It can parse sequence-related files according to a computer-readable version of a 'Software Interface Specification' (SIS), which is a standard document for defining file formats. It lets users display one or several linked files and check simple constraints using a Basic-like 'Little Language'. SEQ-REVIEW represents the first application of the Quality Function Deployment (QFD) method to sequence software development at JPL. The paper will show how the requirements for SEQ-REVIEW were defined and converted into a design based on object-oriented principles. The process starts with interviews of potential users, a small but diverse group that spans multiple disciplines and 'cultures'. It continues with the development of QFD matrices that relate product functions and characteristics to user-demanded qualities. These matrices are then turned into a formal Software Requirements Document (SRD). The process concludes with the design phase, in which the CRC (Class, Responsibility, Collaboration) approach was used to convert requirements into a blueprint for the final product.

  14. SEQ-REVIEW: A tool for reviewing and checking spacecraft sequences

    NASA Technical Reports Server (NTRS)

    Maldague, Pierre F.; El-Boushi, Mekki; Starbird, Thomas J.; Zawacki, Steven J.

    1994-01-01

    A key component of JPL's strategy to make space missions faster, better and cheaper is the Advanced Multi-Mission Operations System (AMMOS), a ground software intensive system currently in use and in further development. AMMOS intends to eliminate the cost of re-engineering a ground system for each new JPL mission. This paper discusses SEQ-REVIEW, a component of AMMOS that was designed to facilitate and automate the task of reviewing and checking spacecraft sequences. SEQ-REVIEW is a smart browser for inspecting files created by other sequence generation tools in the AMMOS system. It can parse sequence-related files according to a computer-readable version of a 'Software Interface Specification' (SIS), which is a standard document for defining file formats. It lets users display one or several linked files and check simple constraints using a Basic-like 'Little Language'. SEQ-REVIEW represents the first application of the Quality Function Deployment (QFD) method to sequence software development at JPL. The paper will show how the requirements for SEQ-REVIEW were defined and converted into a design based on object-oriented principles. The process starts with interviews of potential users, a small but diverse group that spans multiple disciplines and 'cultures'. It continues with the development of QFD matrices that relate product functions and characteristics to user-demanded qualities. These matrices are then turned into a formal Software Requirements Document (SRD). The process concludes with the design phase, in which the CRC (Class, Responsibility, Collaboration) approach was used to convert requirements into a blueprint for the final product.

  15. Real-Time fMRI Pattern Decoding and Neurofeedback Using FRIEND: An FSL-Integrated BCI Toolbox

    PubMed Central

    Sato, João R.; Basilio, Rodrigo; Paiva, Fernando F.; Garrido, Griselda J.; Bramati, Ivanei E.; Bado, Patricia; Tovar-Moll, Fernanda; Zahn, Roland; Moll, Jorge

    2013-01-01

    The demonstration that humans can learn to modulate their own brain activity based on feedback of neurophysiological signals opened up exciting opportunities for fundamental and applied neuroscience. Although EEG-based neurofeedback has been long employed both in experimental and clinical investigation, functional MRI (fMRI)-based neurofeedback emerged as a promising method, given its superior spatial resolution and ability to gauge deep cortical and subcortical brain regions. In combination with improved computational approaches, such as pattern recognition analysis (e.g., Support Vector Machines, SVM), fMRI neurofeedback and brain decoding represent key innovations in the field of neuromodulation and functional plasticity. Expansion in this field and its applications critically depend on the existence of freely available, integrated and user-friendly tools for the neuroimaging research community. Here, we introduce FRIEND, a graphic-oriented user-friendly interface package for fMRI neurofeedback and real-time multivoxel pattern decoding. The package integrates routines for image preprocessing in real-time, ROI-based feedback (single-ROI BOLD level and functional connectivity) and brain decoding-based feedback using SVM. FRIEND delivers an intuitive graphic interface with flexible processing pipelines involving optimized procedures embedding widely validated packages, such as FSL and libSVM. In addition, a user-defined visual neurofeedback module allows users to easily design and run fMRI neurofeedback experiments using ROI-based or multivariate classification approaches. FRIEND is open-source and free for non-commercial use. Processing tutorials and extensive documentation are available. PMID:24312569

  16. Presentation of computer code SPIRALI for incompressible, turbulent, plane and spiral grooved cylindrical and face seals

    NASA Technical Reports Server (NTRS)

    Walowit, Jed A.

    1994-01-01

    A viewgraph presentation is made showing the capabilities of the computer code SPIRALI. Overall capabilities of SPIRALI include: computes rotor dynamic coefficients, flow, and power loss for cylindrical and face seals; treats turbulent, laminar, Couette, and Poiseuille dominated flows; fluid inertia effects are included; rotor dynamic coefficients in three (face) or four (cylindrical) degrees of freedom; includes effects of spiral grooves; user-definable transverse film geometry including circular steps and grooves; independent user-definable friction factor models for rotor and stator; and user-definable loss coefficients for sudden expansions and contractions.

  17. A NEO population generation and observation simulation software tool

    NASA Astrophysics Data System (ADS)

    Müller, Sven; Gelhaus, Johannes; Hahn, Gerhard; Franco, Raffaella

    One of the main targets of ESA's Space Situational Awareness (SSA) program is to build a wide knowledge base about objects that can potentially harm Earth (Near-Earth Objects, NEOs). An important part of this effort is to create the Small Bodies Data Centre (SBDC), which is going to aggregate measurement data from a fully-integrated NEO observation sensor network. Until this network is developed, artificial NEO measurement data is needed in order to validate SBDC algorithms. Moreover, to establish a functioning NEO observation sensor network, it has to be determined where to place sensors, what technical requirements have to be met in order to be able to detect NEOs and which observation strategies work best. Because of this, sensor simulation software was needed. This paper presents a software tool which allows users to create and analyse NEO populations and to simulate and analyse population observations. It is a console program written in Fortran and comes with a Graphical User Interface (GUI) written in Java and C. The tool is divided into the components ``Population Generator'' and ``Observation Simulator''. The Population Generator component is responsible for generating and analysing a NEO population. Users can choose between creating fictitious (random) and synthetic populations. The latter are based on one of two models describing the orbital and size distribution of observed NEOs: the existing so-called ``Bottke Model'' (Bottke et al. 2000, 2002) and the new ``Granvik Model'' (Granvik et al. 2014, in preparation), which has been developed in parallel to the tool. Generated populations can be analysed by defining 2D, 3D and scatter plots using various NEO attributes. As a result, the tool creates the appropriate files for the plotting tool ``gnuplot''. The tool's Observation Simulator component yields the Observation Simulation and Observation Analysis functions.
Users can define sensor systems using ground- or space-based locations as well as optical or radar sensors and simulate observation campaigns. The tool outputs field-of-view crossings and actual detections of the selected NEO population objects. Using the Observation Analysis users are able to process and plot the results of the Observation Simulation. In order to enable end-users to handle the tool in a user-intuitive and comfortable way, a GUI has been created based on the modular Eclipse Rich Client Platform (RCP) technology. Through the GUI users can easily enter input data for the tool, execute it and view its output data in a clear way. Additionally, the GUI runs gnuplot to create plot pictures and presents them to the user. Furthermore, users can create projects to organise executions of the tool.

  18. CheD: chemical database compilation tool, Internet server, and client for SQL servers.

    PubMed

    Trepalin, S V; Yarkov, A V

    2001-01-01

    An efficient program, which runs on a personal computer, for the storage, retrieval, and processing of chemical information is presented. The program can work as a stand-alone application or in conjunction with a specifically written Web server application or with some standard SQL servers, e.g., Oracle, Interbase, and MS SQL. New types of data fields are introduced, e.g., arrays for spectral information storage, HTML and database links, and user-defined functions. CheD has an open architecture; thus, custom data types, controls, and services may be added. A WWW server application for chemical data retrieval features an easy and user-friendly installation on Windows NT or 95 platforms.

  19. Defining health-related quality of life for young wheelchair users: A qualitative health economics study

    PubMed Central

    2017-01-01

    Background Wheelchairs for children with impaired mobility provide health, developmental and psychosocial benefits, however there is limited understanding of how mobility aids affect the health-related quality of life of children with impaired mobility. Preference-based health-related quality of life outcome measures are used to calculate quality-adjusted life years; an important concept in health economics. The aim of this research was to understand how young wheelchair users and their parents define health-related quality of life in relation to mobility impairment and wheelchair use. Methods The sampling frame was children with impaired mobility (≤18 years) who use a wheelchair and their parents. Data were collected through semi-structured face-to-face interviews conducted in participants’ homes. Qualitative framework analysis was used to analyse the interview transcripts. An a priori thematic coding framework was developed. Emerging codes were grouped into categories, and refined into analytical themes. The data were used to build an understanding of how children with impaired mobility define health-related quality of life in relation to mobility impairment, and to assess the applicability of two standard measures of health-related quality of life. Results Eleven children with impaired mobility and 24 parents were interviewed across 27 interviews. Participants defined mobility-related quality of life through three distinct but interrelated concepts: 1) participation and positive experiences; 2) self-worth and feeling fulfilled; 3) health and functioning. A good degree of consensus was found between child and parent responses, although there was some evidence to suggest a shift in perception of mobility-related quality of life with child age. Conclusions Young wheelchair users define health-related quality of life in a distinct way as a result of their mobility impairment and adaptation use. 
Generic, preference-based measures of health-related quality of life lack sensitivity in this population. Development of a mobility-related quality of life outcome measure for children is recommended. PMID:28617820

  20. Optimization techniques applied to spectrum management for communications satellites

    NASA Astrophysics Data System (ADS)

    Ottey, H. R.; Sullivan, T. M.; Zusman, F. S.

    This paper describes user requirements, algorithms and software design features for the application of optimization techniques to the management of the geostationary orbit/spectrum resource. Relevant problems include parameter sensitivity analyses, frequency and orbit position assignment coordination, and orbit position allotment planning. It is shown how integer and nonlinear programming as well as heuristic search techniques can be used to solve these problems. Formalized mathematical objective functions that define the problems are presented. Constraint functions that impart the necessary solution bounds are described. A versatile program structure is outlined, which would allow problems to be solved in stages while varying the problem space, solution resolution, objective function and constraints.
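    A toy version of the orbit-position assignment problem the paper formalizes can be written as an exhaustive search; real instances would use the integer-programming and heuristic techniques described above, and the cost function here is an illustrative assumption, not the paper's objective function:

    ```python
    from itertools import permutations

    def assign_slots(satellites, slots, interference):
        """Toy orbit-position assignment: pick one slot per satellite to
        minimize total interference(satellite, slot).  A brute-force
        stand-in for the integer-programming formulations in the paper;
        all names and the cost model are illustrative.
        """
        best, best_cost = None, float("inf")
        for perm in permutations(slots, len(satellites)):
            cost = sum(interference(s, p) for s, p in zip(satellites, perm))
            if cost < best_cost:
                best, best_cost = dict(zip(satellites, perm)), cost
        return best, best_cost
    ```

    Exhaustive search is only viable for a handful of satellites; the factorial growth in candidate assignments is precisely why the paper turns to integer and nonlinear programming and heuristic search.
    
    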

  1. LEM-CF Premixed Tool Kit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-01-19

    The purpose of LEM-CF Premixed Tool Kit is to process premixed flame simulation data from the LEM-CF solver (https://fileshare.craft-tech.com/clusters/view/lem-cf) into a large-eddy simulation (LES) subgrid model database. These databases may be used with a user-defined-function (UDF) that is included in the Tool Kit. The subgrid model UDF may be used with the ANSYS FLUENT flow solver or other commercial flow solvers.

  2. Prowess - A Software Model for the Ooty Wide Field Array

    NASA Astrophysics Data System (ADS)

    Marthi, Visweshwar Ram

    2017-03-01

    One of the scientific objectives of the Ooty Wide Field Array (OWFA) is to observe the redshifted H i emission from z ˜ 3.35. Although predictions spell out optimistic outcomes in reasonable integration times, these studies were based purely on analytical assumptions, without accounting for limiting systematics. A software model for OWFA has been developed with a view to understanding the instrument-induced systematics, by describing a complete software model for the instrument. This model has been implemented through a suite of programs, together called Prowess, which has been conceived with the dual role of an emulator as well as observatory data analysis software. The programming philosophy followed in building Prowess enables a general user to define their own set of functions and add new functionality. This paper describes a co-ordinate system suitable for OWFA in which the baselines are defined. The foregrounds are simulated from their angular power spectra. The visibilities are then computed from the foregrounds. These visibilities are then used for further processing, such as calibration and power spectrum estimation. The package allows for rich visualization features in multiple output formats in an interactive fashion, giving the user an intuitive feel for the data. Prowess has been extensively used for numerical predictions of the foregrounds for the OWFA H i experiment.
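    The sky-to-visibility step is, generically, a Fourier sum over the sky model. The 1-D sketch below shows the textbook relation V(u) = Σ_k S_k exp(−2πi u l_k); it is not Prowess code, and all names are illustrative:

    ```python
    import cmath

    def visibility(baseline_u, sources):
        """Visibility of a set of point sources for a baseline of
        `baseline_u` wavelengths, in one direction cosine:
        V(u) = sum_k S_k * exp(-2*pi*i * u * l_k).
        `sources` is a list of (l, flux) pairs.  A generic 1-D sketch of
        the sky-to-visibility step, not the Prowess implementation.
        """
        return sum(flux * cmath.exp(-2j * cmath.pi * baseline_u * l)
                   for l, flux in sources)
    ```

    A single source at the phase center (l = 0) yields a constant, purely real visibility equal to its flux on every baseline, a standard sanity check for this kind of emulator.
    
    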

  3. An Interpreted Language and System for the Visualization of Unstructured Meshes

    NASA Technical Reports Server (NTRS)

    Moran, Patrick J.; Gerald-Yamasaki, Michael (Technical Monitor)

    1998-01-01

    We present an interpreted language and system supporting the visualization of unstructured meshes and the manipulation of shapes defined in terms of mesh subsets. The language features primitives inspired by geometric modeling, mathematical morphology and algebraic topology. The adaptation of the topology ideas to an interpreted environment, along with support for programming constructs such, as user function definition, provide a flexible system for analyzing a mesh and for calculating with shapes defined in terms of the mesh. We present results demonstrating some of the capabilities of the language, based on an implementation called the Shape Calculator, for tetrahedral meshes in R^3.
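    The algebraic-topology flavor of such shape operations can be illustrated with a mod-2 boundary operator on a tetrahedral-mesh subset. This is a minimal sketch in the spirit of the Shape Calculator, not its actual implementation (the original is an interpreted language; this is Python with illustrative names):

    ```python
    from collections import Counter

    def boundary_faces(tets):
        """Boundary of a tetrahedral-mesh subset: the triangular faces
        belonging to exactly one tetrahedron in the subset.  Faces shared
        by two tetrahedra cancel, as in the mod-2 boundary operator of
        algebraic topology.  Tets are 4-tuples of vertex indices.
        """
        faces = Counter()
        for a, b, c, d in tets:
            for face in ((a, b, c), (a, b, d), (a, c, d), (b, c, d)):
                faces[tuple(sorted(face))] += 1
        return {f for f, n in faces.items() if n == 1}
    ```

    Two tetrahedra glued along a common face have six boundary faces; the shared face appears twice and so drops out, which is the cancellation that makes shape arithmetic on mesh subsets work.
    
    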

  4. Review of Soil Models and Their Implementation in Multibody System Algorithms

    DTIC Science & Technology

    2012-02-01

    models for use with ABAQUS. The constitutive models of the user-defined materials can be programmed in the user subroutine UMAT. Many user-defined ... mechanical characteristics of mildly or moderately expansive unsaturated soils. As originally proposed by Alonso, utilizing a critical state framework ... review of some of these programs is presented. ABAQUS is a popular FE analysis program that contains a wide variety of material models and

  5. Adding and Removing Web Area Users, and Changing User Roles

    EPA Pesticide Factsheets

    Webmasters can add users to a web area, and assign or change roles, which define the actions a user is able to take in the web area. Non-webmasters must use a request form to add users and change roles.

  6. Flexible Method for Inter-object Communication in C++

    NASA Technical Reports Server (NTRS)

    Curlett, Brian P.; Gould, Jack J.

    1994-01-01

    A method has been developed for organizing and sharing large amounts of information between objects in C++ code. This method uses a set of object classes to define variables and group them into tables. The variable tables presented here provide a convenient way of defining and cataloging data, as well as a user-friendly input/output system, a standardized set of access functions, mechanisms for ensuring data integrity, methods for interprocessor data transfer, and an interpretive language for programming relationships between parameters. The object-oriented nature of these variable tables enables the use of multiple data types, each with unique attributes and behavior. Because each variable provides its own access methods, redundant table lookup functions can be bypassed, thus decreasing access times while maintaining data integrity. In addition, a method for automatic reference counting was developed to manage memory safely.
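    The variable-table pattern translates naturally to a sketch. The original is C++, so the Python below is only an illustration of the idea (typed variables with checked access functions, grouped into a table); all names are illustrative:

    ```python
    class Variable:
        """A named, typed variable whose setter enforces data integrity,
        sketching the C++ variable-table idea in Python."""
        def __init__(self, name, vtype):
            self.name, self.vtype, self._value = name, vtype, None

        def set(self, value):
            # Access method guards integrity: reject values of the wrong type.
            if not isinstance(value, self.vtype):
                raise TypeError(f"{self.name} expects {self.vtype.__name__}")
            self._value = value

        def get(self):
            return self._value

    class VariableTable:
        """Groups variables and provides shared, checked access by name."""
        def __init__(self):
            self._vars = {}

        def define(self, name, vtype, value=None):
            var = Variable(name, vtype)
            if value is not None:
                var.set(value)
            self._vars[name] = var

        def __getitem__(self, name):
            return self._vars[name]
    ```

    Because each variable carries its own access methods, callers hold a direct reference after one table lookup, mirroring how the C++ design bypasses redundant lookups while keeping integrity checks.
    
    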

  7. Investigating the International Classification of Functioning, Disability, and Health (ICF) Framework to Capture User Needs in the Concept Stage of Rehabilitation Technology Development.

    PubMed

    Sivan, Manoj; Gallagher, Justin; Holt, Ray; Weightman, Andy; Levesley, Martin; Bhakta, Bipin

    2014-01-01

    This study evaluates whether the International Classification of Functioning, Disability, and Health (ICF) framework provides a useful basis to ensure that key user needs are identified in the development of a home-based arm rehabilitation system for stroke patients. Using a qualitative approach, nine people with residual arm weakness after stroke and six healthcare professionals with expertise in stroke rehabilitation were enrolled in the user-centered design process. They were asked, through semi-structured interviews, to define the needs and specification for a potential home-based rehabilitation device to facilitate self-managed arm exercise. The topic list for the interviews was derived by brainstorming ideas within the clinical and engineering multidisciplinary research team based on previous experience and existing literature in user-centered design. Meaningful concepts were extracted from the questions and responses of these interviews. The concepts obtained were matched to the categories within the ICF comprehensive core set for stroke using ICF linking rules. Most of the concepts extracted from the interviews matched existing ICF Core Set categories. Personal factors like gender, age, interest, compliance, motivation, choice, and convenience that might determine device usability are yet to be categorized within the ICF comprehensive core set. The results suggest that the categories of the comprehensive ICF Core Set for stroke provide a useful basis for structuring interviews to identify most user needs. However, some personal factors (related to end users and healthcare professionals) need to be considered in addition to the ICF categories.

  8. Distributed analysis functional testing using GangaRobot in the ATLAS experiment

    NASA Astrophysics Data System (ADS)

    Legger, Federica; ATLAS Collaboration

    2011-12-01

    Automated distributed analysis tests are necessary to ensure smooth operations of the ATLAS grid resources. The HammerCloud framework allows for easy definition, submission and monitoring of grid test applications. Both functional and stress test applications can be defined in HammerCloud. Stress tests are large-scale tests meant to verify the behaviour of sites under heavy load. Functional tests are light user applications running at each site with high frequency, to ensure that the site functionalities are available at all times. Success or failure rates of these test jobs are individually monitored. Test definitions and results are stored in a database and made available to users and site administrators through a web interface. In this work we present the recent developments of the GangaRobot framework. GangaRobot monitors the outcome of functional tests, creates a blacklist of sites failing the tests, and exports the results to the ATLAS Site Status Board (SSB) and to the Service Availability Monitor (SAM), providing on the one hand a fast way to identify systematic or temporary site failures, and on the other hand allowing for an effective distribution of the work load on the available resources.

  9. Potential markets for a satellite-based mobile communications system

    NASA Technical Reports Server (NTRS)

    Jamieson, W. M.; Peet, C. S.; Bengston, R. J.

    1976-01-01

    The objective of the study was to define the market needs for improved land mobile communications systems. Within the context of this objective, the following goals were set: (1) characterize the present mobile communications industry; (2) determine the market for an improved system for mobile communications; and (3) define the system requirements as seen from the potential customer's viewpoint. The scope of the study was defined by the following parameters: (1) markets were confined to the U.S. and Canada; (2) range of operation generally exceeded 20 miles, but this was not restrictive; (3) the classes of potential users considered included all private sector users and non-military public sector users; (4) the time span examined was 1975 to 1985; and (5) highly localized users (e.g., taxicabs and local paging) were generally excluded.

  10. Toward a Behavioral Approach to Privacy for Online Social Networks

    NASA Astrophysics Data System (ADS)

    Banks, Lerone D.; Wu, S. Felix

    We examine the correlation between user interactions and self-reported information revelation preferences for users of the popular Online Social Network (OSN), Facebook. Our primary goal is to explore the use of indicators of tie strength to inform localized, per-user privacy preferences for users and their ties within OSNs. We examine the limitations of such an approach and discuss future plans to incorporate this approach into the development of an automated system for helping users define privacy policy. As part of future work, we discuss how to define and expand policy to the entire social network. We also present additional collected data similar to other studies, such as perceived tie strength and information revelation preferences for OSN users.

  11. Using NetMaster to manage IBM networks

    NASA Technical Reports Server (NTRS)

    Ginsburg, Guss

    1991-01-01

    After defining a network and conveying its importance in supporting activities at JSC, the need for network management, given the size and complexity of the IBM SNA network at JSC, is demonstrated. Network management consists of being aware of component status and the ability to control resources to meet the availability and service needs of users. The concerns of the user are addressed, as well as those of the staff responsible for managing the network. It is explained how NetMaster (a network management system for managing SNA networks) is used to enhance reliability and maximize service to SNA network users through automated procedures. The following areas are discussed: customization, problem and configuration management, and system measurement applications of NetMaster. Also, several examples are given that demonstrate NetMaster's ability to manage and control the network, integrate various product functions, and provide useful management information.

  12. A process for prototyping onboard payload displays for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Moore, Loretta A.

    1992-01-01

    Significant advances have been made in the area of Human-Computer Interface design. However, there is no well-defined process for going from user interface requirements to user interface design. Developing and designing a clear and consistent user interface for medium to large scale systems is a very challenging and complex task. The task becomes increasingly difficult when there is little guidance on how the development process should flow from one stage to the next. Without a specific sequence of development steps, each design becomes difficult to repeat, to evaluate, to improve, and to articulate to others. This research contributes a process which identifies the phases of development, and the products produced as a result of each phase, for a rapid prototyping process to be used to develop requirements for the onboard payload displays for Space Station Freedom. The functional components of a dynamic prototyping environment in which this process can be carried out are also discussed. Some of the central questions which are answered here include: How does one go from specifications to an actual prototype? How is a prototype evaluated? How is usability defined and thus measured? How do we use the information from evaluation in redesign of an interface? And are there techniques that allow for convergence on a design?

  13. SCORE user's manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, S.A.

    SABRE is a set of tools to facilitate the development of portable scientific software and to visualize scientific data. As with most constructs, SABRE has a foundation. In this case that foundation is SCORE. SCORE (SABRE CORE) has two main functions. The first, and perhaps most important, is to smooth over the differences between different C implementations and define the parameters which drive most of the conditional compilations in the rest of SABRE. Secondly, it contains several groups of functionality that are used extensively throughout SABRE. Although C is highly standardized now, that has not always been the case. Roughly speaking, C compilers fall into three categories: ANSI standard; derivatives of the Portable C Compiler (Kernighan and Ritchie); and the rest. SABRE has been successfully ported to many ANSI and PCC systems. It has never been successfully ported to a system in the last category, mainly because the "standard" C library supplied with such implementations is so far from true ANSI or PCC standard that SABRE would have to include its own version of the standard C library in order to work at all. Even with standardized compilers, life is not simple. The ANSI standard leaves several crucial points ambiguous as "implementation defined." Under these conditions one can find significant differences in going from one ANSI standard compiler to another. SCORE's job is to include the requisite standard headers and ensure that certain key standard library functions exist and function correctly (there are bugs in the standard library functions supplied with some compilers) so that, to applications which include the SCORE header(s) and load with SCORE, all C implementations look the same.
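    The conditional-compilation idea the abstract describes, one header that papers over implementation differences so the rest of the code base compiles unchanged, can be sketched as follows. This is a generic illustration of the pattern, not SCORE's actual headers; the `PROTO` macro and `safe_copy` wrapper are invented for the example.

    ```cpp
    // Sketch of the SCORE idea: one header hides ANSI vs. pre-ANSI (PCC)
    // differences so application code compiles unchanged on both.
    // Macro and function names here are hypothetical, not from SCORE.
    #include <cassert>
    #include <cstring>

    #if defined(__STDC__) || defined(__cplusplus)
      /* ANSI/ISO route: prototypes and standard headers are trustworthy. */
      #define PROTO(args) args
    #else
      /* Pre-ANSI (PCC) route: strip prototypes, supply missing pieces. */
      #define PROTO(args) ()
    #endif

    /* A wrapper over a library routine that some implementations get wrong:
       SCORE-style code would substitute its own version only where the
       native one is known to misbehave, keeping one call site for users. */
    static void* safe_copy PROTO((void* dst, const void* src, unsigned n));

    static void* safe_copy(void* dst, const void* src, unsigned n) {
        return std::memmove(dst, src, n);  // memmove handles overlap
    }

    int main() {
        char buf[] = "abcdef";
        safe_copy(buf + 1, buf, 3);        // overlapping regions handled
        assert(std::strcmp(buf, "aabcef") == 0);
        return 0;
    }
    ```

    Applications call `safe_copy` everywhere; which implementation they get is decided once, in the compatibility layer, exactly the "all C implementations look the same" goal stated above.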

  14. Oscillatory serotonin function in depression.

    PubMed

    Salomon, Ronald M; Cowan, Ronald L

    2013-11-01

    Oscillations in brain activities with periods of minutes to hours may be critical for normal mood behaviors. Ultradian (faster than circadian) rhythms of mood behaviors and associated central nervous system activities are altered in depression. Recent data suggest that ultradian rhythms in serotonin (5HT) function also change in depression. In two separate studies, 5HT metabolites in cerebrospinal fluid (CSF) were measured every 10 min for 24 h before and after chronic antidepressant treatment. Antidepressant treatments were associated with enhanced ultradian amplitudes of CSF metabolite levels. Another study used resting-state functional magnetic resonance imaging (fMRI) to measure amplitudes of dorsal raphé activation cycles following sham or active dietary depletions of the 5HT precursor (tryptophan). During depletion, amplitudes of dorsal raphé activation cycles increased with rapid 6 s periods (about 0.18 Hz) while functional connectivity weakened between dorsal raphé and thalamus at slower periods of 20 s (0.05 Hz). A third approach studied MDMA (ecstasy, 3,4-methylenedioxy-N-methylamphetamine) users because of their chronically diminished 5HT function compared with non-MDMA polysubstance users (Karageorgiou et al., 2009). Compared with a non-MDMA using cohort, MDMA users showed diminished fMRI intra-regional coherence in motor regions along with altered functional connectivity, again suggesting effects of altered 5HT oscillatory function. These data support a hypothesis that qualities of ultradian oscillations in 5HT function may critically influence moods and behaviors. Dysfunctional 5HT rhythms in depression may be a common endpoint and biomarker for depression, linking dysfunction of slow brain network oscillators to 5HT mechanisms affected by commonly available treatments. 5HT oscillatory dysfunction may define illness subtypes and predict responses to serotonergic agents. Further studies of 5HT oscillations in depression are indicated. 
Copyright © 2013 Wiley Periodicals, Inc.

  15. OOM - OBJECT ORIENTATION MANIPULATOR, VERSION 6.1

    NASA Technical Reports Server (NTRS)

    Goza, S. P.

    1994-01-01

    The Object Orientation Manipulator (OOM) is an application program for creating, rendering, and recording three-dimensional computer-generated still and animated images. This is done using geometrically defined 3D models, cameras, and light sources, referred to collectively as animation elements. OOM does not provide the tools necessary to construct 3D models; instead, it imports binary format model files generated by the Solid Surface Modeler (SSM). Model files stored in other formats must be converted to the SSM binary format before they can be used in OOM. SSM is available as MSC-21914 or as part of the SSM/OOM bundle, COS-10047. Among OOM's features are collision detection (with visual and audio feedback), the capability to define and manipulate hierarchical relationships between animation elements, stereographic display, and ray-traced rendering. OOM uses Euler angle transformations for calculating the results of translation and rotation operations. OOM provides an interactive environment for the manipulation and animation of models, cameras, and light sources. Models are the basic entity upon which OOM operates and are therefore considered the primary animation elements. Cameras and light sources are considered secondary animation elements. A camera, in OOM, is simply a location within the three-space environment from which the contents of the environment are observed. OOM supports the creation and full animation of cameras. Light sources can be defined, positioned and linked to models, but they cannot be animated independently. OOM can simultaneously accommodate as many animation elements as the host computer's memory permits. Once the required animation elements are present, the user may position them, orient them, and define any initial relationships between them. 
    Once the initial relationships are defined, the user can display individual still views for rendering and output, or define motion for the animation elements by using the Interp Animation Editor. The program provides the capability to save still images, animated sequences of frames, and the information that describes the initialization process for an OOM session. OOM provides the same rendering and output options for both still and animated images. OOM is equipped with a robust model manipulation environment featuring a full screen viewing window, a menu-oriented user interface, and an interpolative Animation Editor. It provides three display modes (solid, wireframe, and simple) that allow the user to trade off visual authenticity for update speed. In the solid mode, each model is drawn based on the shading characteristics assigned to it when it was built. All of the shading characteristics supported by SSM are recognized and properly rendered in this mode. If increasing model complexity impedes the operation of OOM in this mode, then wireframe and simple modes are available. These provide substantially faster screen updates than solid mode. The creation and placement of cameras and light sources is under complete control of the user. One light source is provided in the default element set. It is modeled as a direct light source providing a type of lighting analogous to that provided by the Sun. OOM can accommodate as many light sources as the memory of the host computer permits. Animation is created in OOM using a technique called key frame interpolation. First, various program functions are used to load models, load or create light sources and cameras, and specify initial positions for each element. When these steps are completed, the Interp function is used to create an animation sequence for each element to be animated. An animation sequence consists of a user-defined number of frames (screen images) with some subset of those being defined as key frames.
    The motion of the element between key frames is interpolated automatically by the software. Key frames thus act as transition points in the motion of an element. This saves the user from having to individually define element data at each frame of a sequence. Animation frames and still images can be output to videotape recorders, film recorders, color printers, and disk files. OOM is written in C for implementation on SGI IRIS 4D series workstations running the IRIX operating system. A minimum of 8Mb of RAM is recommended for this program. The standard distribution medium for OOM is a .25 inch streaming magnetic tape cartridge in UNIX tar format. OOM is also offered as a bundle with a related program, SSM (Solid Surface Modeler). Please see the abstract for SSM/OOM (COS-10047) for information about the bundled package. OOM was released in 1993.
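    The key frame interpolation described above can be sketched in a few lines: element data exists only at key frames, and in-between frames are filled in automatically. The record does not say which interpolant OOM uses, so linear blending (and the single coordinate tracked here) is an assumption for illustration.

    ```cpp
    #include <cassert>
    #include <cmath>
    #include <vector>

    // Minimal key frame interpolation sketch: data is given only at key
    // frames; frames between them are derived by linear blending. OOM's
    // actual interpolant is not specified in the record.
    struct KeyFrame {
        int frame;   // frame index within the animation sequence
        double x;    // element position at that key frame
    };

    double interpolate(const std::vector<KeyFrame>& keys, int frame) {
        // Clamp outside the keyed range.
        if (frame <= keys.front().frame) return keys.front().x;
        if (frame >= keys.back().frame)  return keys.back().x;
        // Find the surrounding pair of key frames and blend linearly.
        for (std::size_t i = 1; i < keys.size(); ++i) {
            if (frame <= keys[i].frame) {
                const KeyFrame& a = keys[i - 1];
                const KeyFrame& b = keys[i];
                double t = double(frame - a.frame) / double(b.frame - a.frame);
                return a.x + t * (b.x - a.x);
            }
        }
        return keys.back().x;
    }

    int main() {
        // Key frames at 0, 10, 20; every frame in between is derived.
        std::vector<KeyFrame> keys = {{0, 0.0}, {10, 5.0}, {20, -5.0}};
        assert(std::fabs(interpolate(keys, 5)  - 2.5) < 1e-9);  // halfway 0->10
        assert(std::fabs(interpolate(keys, 15) - 0.0) < 1e-9);  // halfway 10->20
        assert(interpolate(keys, 25) == -5.0);                  // clamped
        return 0;
    }
    ```

    This is why key frames act as "transition points": the animator specifies a handful of frames and the software supplies the rest.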

  16. AITRAC: Augmented Interactive Transient Radiation Analysis by Computer. User's information manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1977-10-01

    AITRAC is a program designed for on-line, interactive, DC, and transient analysis of electronic circuits. The program solves linear and nonlinear simultaneous equations which characterize the mathematical models used to predict circuit response. The program features capacity for 100 external nodes and 200 branches; a conversational, free-format input language; built-in junction, FET, MOS, and switch models; a sparse matrix algorithm with extended-precision H matrix and T vector calculations, for fast and accurate execution; linear transconductances: beta, GM, MU, ZM; accurate and fast radiation effects analysis; a special interface for user-defined equations; selective control of multiple outputs; graphical outputs in wide and narrow formats; and on-line parameter modification capability. The user describes the problem by entering the circuit topology and part parameters. The program then automatically generates and solves the circuit equations, providing the user with printed or plotted output. The circuit topology and/or part values may then be changed by the user, and a new analysis requested. Circuit descriptions may be saved on disk files for storage and later use. The program contains built-in standard models for resistors, voltage and current sources, capacitors, inductors including mutual couplings, switches, junction diodes and transistors, FETs, and MOS devices. Nonstandard models may be constructed from standard models or by using the special equations interface. Time functions may be described by straight-line segments or by sine, damped sine, and exponential functions. 42 figures, 1 table. (RWR)

  17. AUTO_DERIV: Tool for automatic differentiation of a Fortran code

    NASA Astrophysics Data System (ADS)

    Stamatiadis, S.; Farantos, S. C.

    2010-10-01

    AUTO_DERIV is a module comprising a set of FORTRAN 95 procedures which can be used to calculate the first and second partial derivatives (mixed or not) of any continuous function with many independent variables. The mathematical function should be expressed as one or more FORTRAN 77/90/95 procedures. A new type of variables is defined, and the overloading mechanism of functions and operators provided by the FORTRAN 95 language is extensively used to define the differentiation rules. Proper (standard-complying) handling of floating-point exceptions is provided by using the IEEE_EXCEPTIONS intrinsic module (Technical Report 15580, incorporated in FORTRAN 2003). New version program summary. Program title: AUTO_DERIV Catalogue identifier: ADLS_v2_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADLS_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 2963 No. of bytes in distributed program, including test data, etc.: 10 314 Distribution format: tar.gz Programming language: Fortran 95 + (optionally) TR-15580 (Floating-point exception handling) Computer: all platforms with a Fortran 95 compiler Operating system: Linux, Windows, MacOS Classification: 4.12, 6.2 Catalogue identifier of previous version: ADLS_v1_0 Journal reference of previous version: Comput. Phys. Comm. 127 (2000) 343 Does the new version supersede the previous version?: Yes Nature of problem: The need to calculate accurate derivatives of a multivariate function frequently arises in computational physics and chemistry. The most versatile approach to evaluate them by a computer, automatically and to machine precision, is via user-defined types and operator overloading.
    AUTO_DERIV is a Fortran 95 implementation of them, designed to evaluate the first and second derivatives of a function of many variables. Solution method: The mathematical rules for differentiation of sums, products, quotients, and elementary functions, in conjunction with the chain rule for compound functions, are applied. The function should be expressed as one or more Fortran 77/90/95 procedures. A new type of variables is defined, and the overloading mechanism of functions and operators provided by the Fortran 95 language is extensively used to implement the differentiation rules. Reasons for new version: The new version supports Fortran 95, properly handles floating-point exceptions, and is faster due to internal reorganization. All discovered bugs are fixed. Summary of revisions: The code was rewritten extensively to benefit from features introduced in Fortran 95. Additionally, there was a major internal reorganization of the code, resulting in faster execution. The user interface described in the original paper was not changed. The values that the user must or should specify before compilation (essentially, the number of independent variables) were moved into the ad_types module. There were many minor bug fixes. One important bug was found and fixed; the code did not correctly handle the overloading of ** in a**λ when a = 0. The case of division by zero and the discontinuity of the function at the requested point are indicated by standard IEEE exceptions (IEEE_DIVIDE_BY_ZERO and IEEE_INVALID respectively). If the compiler does not support IEEE exceptions, a module with the appropriate name is provided, imitating the behavior of the 'standard' module in the sense that it raises the corresponding exceptions. It is up to the compiler (through certain flags probably) to detect them. Restrictions: None imposed by the program. There are certain limitations that may appear mostly due to the specific implementation chosen in the user code.
They can always be overcome by recoding parts of the routines developed by the user or by modifying AUTO_DERIV according to specific instructions given in [1]. The common restrictions of available memory and the capabilities of the compiler are the same as the original version. Additional comments: The program has been tested using the following compilers: Intel ifort, GNU gfortran, NAGWare f95, g95. Running time: The typical running time for the program depends on the compiler and the complexity of the differentiated function. A rough estimate is that AUTO_DERIV is ten times slower than the evaluation of the analytical ('by hand') function value and derivatives (if they are available). References:S. Stamatiadis, R. Prosmiti, S.C. Farantos, AUTO_DERIV: tool for automatic differentiation of a Fortran code, Comput. Phys. Comm. 127 (2000) 343.
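    The technique the record names, user-defined types plus operator overloading, can be illustrated compactly. AUTO_DERIV itself is Fortran 95 and also tracks second derivatives; the C++ sketch below tracks only the value and first derivative (a "dual number"), so it is a simplified analogue, not a port.

    ```cpp
    #include <cassert>
    #include <cmath>

    // Dual-number sketch of operator-overloading automatic differentiation:
    // each value carries its derivative, and overloaded operators apply the
    // product and chain rules. Simplified analogue of AUTO_DERIV's approach.
    struct Dual {
        double val;   // function value
        double der;   // first derivative w.r.t. the chosen variable
    };

    Dual operator+(Dual a, Dual b) { return {a.val + b.val, a.der + b.der}; }
    Dual operator*(Dual a, Dual b) {
        // Product rule: (fg)' = f'g + fg'
        return {a.val * b.val, a.der * b.val + a.val * b.der};
    }
    Dual sin(Dual a) {
        // Chain rule: (sin f)' = cos(f) * f'
        return {std::sin(a.val), std::cos(a.val) * a.der};
    }

    int main() {
        Dual x{3.0, 1.0};        // independent variable: dx/dx = 1
        Dual y = x * x + x;      // y = x^2 + x, so y' = 2x + 1
        assert(y.val == 12.0);
        assert(y.der == 7.0);
        Dual z = sin(x * x);     // z' = cos(x^2) * 2x
        assert(std::fabs(z.der - std::cos(9.0) * 6.0) < 1e-12);
        return 0;
    }
    ```

    The user writes the function in ordinary arithmetic notation; swapping the numeric type for the overloaded one yields derivatives to machine precision, which is exactly why the approach is called the most versatile in the record.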

  18. A new methodology for estimating nuclear casualties as a function of time.

    PubMed

    Zirkle, Robert A; Walsh, Terri J; Disraelly, Deena S; Curling, Carl A

    2011-09-01

    The Human Response Injury Profile (HRIP) nuclear methodology provides an estimate of casualties occurring as a consequence of nuclear attacks against military targets for planning purposes. The approach develops user-defined, time-based casualty and fatality estimates based on progressions of underlying symptoms and their severity changes over time. This paper provides a description of the HRIP nuclear methodology and its development, including inputs, human response and the casualty estimation process.

  19. C++ Programming Language

    NASA Technical Reports Server (NTRS)

    Shaykhian, Gholam Ali

    2007-01-01

    C++ Programming Language: The C++ seminar covers the fundamentals of the C++ programming language. The C++ fundamentals are grouped into three parts, where each part includes both concept and programming examples aimed at hands-on practice. The first part covers the functional aspect of the C++ programming language, with emphasis on function parameters and efficient memory utilization. The second part covers the essential framework of the C++ programming language: the object-oriented aspects. Information necessary to evaluate various features of object-oriented programming, including encapsulation, polymorphism, and inheritance, will be discussed. The last part of the seminar covers template and generic programming. Examples include both user-defined and standard templates.
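    The seminar's last topic, a user-defined template working alongside a standard library template, can be shown in a few lines. This is generic course-style code written for illustration, not material from the seminar itself.

    ```cpp
    #include <cassert>
    #include <string>
    #include <vector>

    // A user-defined function template used with a standard template
    // (std::vector): the compiler instantiates largest<T> once per type.
    template <typename T>
    T largest(const std::vector<T>& items) {
        T best = items.front();
        for (const T& v : items)
            if (best < v) best = v;   // only requires operator< on T
        return best;
    }

    int main() {
        std::vector<int> ints = {3, 7, 2};
        std::vector<std::string> words = {"ada", "c++", "fortran"};
        assert(largest(ints) == 7);           // instantiated as largest<int>
        assert(largest(words) == "fortran");  // instantiated as largest<std::string>
        return 0;
    }
    ```

    One definition serves every type that supports `operator<`, which is the core idea of generic programming the seminar's third part describes.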

  20. VeriClick: an efficient tool for table format verification

    NASA Astrophysics Data System (ADS)

    Nagy, George; Tamhankar, Mangesh

    2012-01-01

    The essential layout attributes of a visual table can be defined by the location of four critical grid cells. Although these critical cells can often be located by automated analysis, some means of human interaction is necessary for correcting residual errors. VeriClick is a macro-enabled spreadsheet interface that provides ground-truthing, confirmation, correction, and verification functions for CSV tables. All user actions are logged. Experimental results of seven subjects on one hundred tables suggest that VeriClick can provide a ten- to twenty-fold speedup over performing the same functions with standard spreadsheet editing commands.

  1. Experiment Pointing Subsystems (EPS) requirements for Spacelab missions

    NASA Technical Reports Server (NTRS)

    Nein, M. E.; Nicaise, P. D.

    1975-01-01

    The goal of the experiment pointing subsystems (EPS) is to accommodate a broad spectrum of instrument types by providing a number of stability and control functions that greatly exceed the capability of the shuttle. These functions include target acquisition, target tracking through wide gimbal ranges, stabilization, simultaneous pointing to one or more targets, instrument rastering, and on-orbit calibration. The experiments will vary widely in size, weight, geometry, and instrument types, and many have not been completely defined. This great diversity of requirements reflects the long term plans of the user community and establishes challenging performance requirements for the EPS.

  2. Chemical Computer Man: Chemical Agent Response Simulation (CARS). Technical report, January 1983-September 1985

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, E.G.; Mioduszewski, R.J.

    The Chemical Computer Man: Chemical Agent Response Simulation (CARS) is a computer model and simulation program for estimating the dynamic changes in human physiological dysfunction resulting from exposures to chemical-threat nerve agents. The newly developed CARS methodology simulates agent exposure effects on the following five indices of human physiological function: mental, vision, cardio-respiratory, visceral, and limbs. Mathematical models and the application of basic pharmacokinetic principles were incorporated into the simulation so that for each chemical exposure, the relationship between exposure dosage, absorbed dosage (agent blood plasma concentration), and level of physiological response is computed as a function of time. CARS, as a simulation tool, is designed for users with little or no computer-related experience. The model combines maximum flexibility with a comprehensive, user-friendly, interactive menu-driven system. Users define an exposure problem and obtain immediate results displayed in tabular, graphical, and image formats. CARS has broad scientific and engineering applications, not only in technology for the soldier in the area of Chemical Defense, but also in minimizing animal testing in biomedical and toxicological research and the development of a modeling system for human exposure to hazardous-waste chemicals.
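    Computing plasma concentration as a function of time from an exposure dose, as the record describes, is commonly done with a one-compartment model with first-order absorption. CARS's actual equations are not given in the record, so the formula and every parameter value below are illustrative assumptions, not the CARS model.

    ```cpp
    #include <cassert>
    #include <cmath>

    // Generic one-compartment, first-order-absorption pharmacokinetic model
    // (an assumption for illustration; not the CARS equations):
    //   C(t) = (D * ka) / (V * (ka - ke)) * (exp(-ke*t) - exp(-ka*t))
    // D = dose, V = volume of distribution, ka/ke = absorption/elimination rates.
    double plasmaConc(double D, double V, double ka, double ke, double t) {
        return D * ka / (V * (ka - ke)) * (std::exp(-ke * t) - std::exp(-ka * t));
    }

    int main() {
        // Made-up parameters purely to exercise the curve's shape.
        double D = 100.0, V = 40.0, ka = 1.0, ke = 0.1;
        assert(plasmaConc(D, V, ka, ke, 0.0) == 0.0);  // nothing absorbed yet
        double early   = plasmaConc(D, V, ka, ke, 1.0);
        double nearPeak = plasmaConc(D, V, ka, ke, 2.5);
        double late    = plasmaConc(D, V, ka, ke, 30.0);
        assert(early > 0.0);
        assert(nearPeak > early);   // concentration rises toward the peak...
        assert(late < nearPeak);    // ...then declines as elimination dominates
        return 0;
    }
    ```

    A time-stepped curve like this is the kind of intermediate quantity from which a simulation such as CARS could map absorbed dosage to a level of physiological response at each instant.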

  3. Defining Information Needs of Computer Users: A Human Communication Problem.

    ERIC Educational Resources Information Center

    Kimbrough, Kenneth L.

    This exploratory investigation of the process of defining the information needs of computer users and the impact of that process on information retrieval focuses on communication problems. Six sites were visited that used computers to process data or to provide information, including the California Department of Transportation, the California…

  4. The Primary Care Electronic Library (PCEL) five years on: open source evaluation of usage.

    PubMed

    Robinson, Judas; de Lusignan, Simon; Kostkova, Patty

    2005-01-01

    The Primary Care Electronic Library (PCEL) is a collection of indexed and abstracted internet resources. PCEL contains a directory of quality-assured internet material with associated search facilities. PCEL has been indexed using metadata and established taxonomies. Site development requires an understanding of usage; this paper reports the use of open source tools to evaluate usage. This evaluation was conducted during a six-month period of development of PCEL. The aim was to use open source tools to evaluate changes in usage of an electronic library. We defined the data needed for analysis, including: page requests, visits, unique visitors, page requests per visit, geographical location of users, NHS users, chronological information about users, and resources used. During the evaluation period, page requests increased from 3500 to 10,000; visits from 1250 to 2300; and unique visitors from 750 to 1500. Up to 83% of users came from the UK; 15% were NHS users. The page requests of NHS users are slowly increasing, but not as fast as requests by other users in the UK. PCEL is primarily used Monday to Friday, 9 a.m. to 5 p.m. Monday is the busiest day, with use lessening through the week. NHS users had a different list of top ten resources accessed than non-NHS users, with only four resources appearing in both. Open source tools provide useful data which can be used to evaluate online resources. Improving the functionality of PCEL has been associated with increased use.

  5. Validating the usability of an interactive Earth Observation based web service for landslide investigation

    NASA Astrophysics Data System (ADS)

    Albrecht, Florian; Weinke, Elisabeth; Eisank, Clemens; Vecchiotti, Filippo; Hölbling, Daniel; Friedl, Barbara; Kociu, Arben

    2017-04-01

    Regional authorities and infrastructure maintainers in almost all mountainous regions of the Earth need detailed and up-to-date landslide inventories for hazard and risk management. Landslide inventories usually are compiled through ground surveys and manual image interpretation following landslide triggering events. We developed a web service that uses Earth Observation (EO) data to support the mapping and monitoring tasks for improving the collection of landslide information. The planned validation of the EO-based web service does not only cover the analysis of the achievable landslide information quality but also the usability and user friendliness of the user interface. The underlying validation criteria are based on the user requirements and the defined tasks and aims in the work description of the FFG project Land@Slide (EO-based landslide mapping: from methodological developments to automated web-based information delivery). The service will be validated in collaboration with stakeholders, decision makers and experts. Users are requested to test the web service functionality and give feedback with a web-based questionnaire by following the subsequently described workflow. The users will operate the web-service via the responsive user interface and can extract landslide information from EO data. They compare it to reference data for quality assessment, for monitoring changes and for assessing landslide-affected infrastructure. An overview page lets the user explore a list of example projects with resulting landslide maps and mapping workflow descriptions. The example projects include mapped landslides in several test areas in Austria and Northern Italy. Landslides were extracted from high resolution (HR) and very high resolution (VHR) satellite imagery, such as Landsat, Sentinel-2, SPOT-5, WorldView-2/3 or Pléiades. The user can create his/her own project by selecting available satellite imagery or by uploading new data. 
Subsequently, a new landslide extraction workflow can be initiated through the functionality that the web service provides: (1) a segmentation of the image into spectrally homogeneous objects, (2) a classification of the objects into landslide and non-landslide areas and (3) an editing tool for the manual refinement of extracted landslide boundaries. In addition, the user interface of the web service provides tools that enable the user (4) to perform a monitoring that identifies changes between landslide maps of different points in time, (5) to perform a validation of the landslide maps by comparing them to reference data, and (6) to perform an assessment of affected infrastructure by comparing the landslide maps to respective infrastructure data. After exploring the web service functionality, the users are asked to fill in the online validation protocol in form of a questionnaire in order to provide their feedback. Concerning usability, we evaluate how intuitive the web service functionality can be operated, how well the integrated help information guides the users, and what kind of background information, e.g. remote sensing concepts and theory, is necessary for a practitioner to fully exploit the value of EO data. The feedback will be used for improving the user interface and for the implementation of additional functionality.

  6. Defining Requirements and Related Methods for Designing Sensorized Garments.

    PubMed

    Andreoni, Giuseppe; Standoli, Carlo Emilio; Perego, Paolo

    2016-05-26

    Designing smart garments has strong interdisciplinary implications, specifically related to user and technical requirements, but also because of the very different applications they have: medicine, sport and fitness, lifestyle monitoring, workplace and job conditions analysis, etc. This paper aims to discuss some user, textile, and technical issues to be faced in sensorized clothes development. In relation to the user, the main requirements are anthropometric, gender-related, and aesthetical. In terms of these requirements, the user's age, the target application, and fashion trends cannot be ignored, because they determine the compliance with the wearable system. Regarding textile requirements, functional factors-also influencing user comfort-are elasticity and washability, while more technical properties are the stability of the chemical agents' effects for preserving the sensors' efficacy and reliability, and assuring the proper duration of the product for the complete life cycle. From the technical side, the physiological issues are the most important: skin conductance, tolerance, irritation, and the effect of sweat and perspiration are key factors for reliable sensing. Other technical features such as battery size and duration, and the form factor of the sensor collector, should be considered, as they affect aesthetical requirements, which have proven to be crucial, as well as comfort and wearability.

  7. A new user-assisted segmentation and tracking technique for an object-based video editing system

    NASA Astrophysics Data System (ADS)

    Yu, Hong Y.; Hong, Sung-Hoon; Lee, Mike M.; Choi, Jae-Gark

    2004-03-01

    This paper presents a semi-automatic segmentation method which can be used to generate video object planes (VOPs) for object-based coding schemes and multimedia authoring environments. Semi-automatic segmentation can be considered a user-assisted segmentation technique. A user initially marks objects of interest around the object boundaries, and the selected objects are then continuously separated from the unselected areas through time evolution in the image sequence. The proposed segmentation method consists of two processing steps: partially manual intra-frame segmentation and fully automatic inter-frame segmentation. The intra-frame segmentation incorporates user assistance to define the meaningful, complete visual object of interest to be segmented and determines its precise boundary. The inter-frame segmentation involves boundary and region tracking to obtain temporal coherence of the moving object based on the object boundary information of the previous frame. The proposed method shows stable, efficient results suitable for many digital video applications, such as multimedia content authoring, content-based coding and indexing. Based on these results, we have developed an object-based video editing system with several convenient editing functions.

  8. Investigating Users' Requirements

    PubMed Central

    Walker, Deborah S.; Lee, Wen-Yu; Skov, Neil M.; Berger, Carl F.; Athley, Brian D.

    2002-01-01

    Objective: User data and information about anatomy education were used to guide development of a learning environment that is efficient and effective. The research question focused on how to design instructional software suitable for the educational goals of different groups of users of the Visible Human data set. The ultimate goal of the study was to provide options for students and teachers to use different anatomy learning modules corresponding to key topics, for course work and professional training. Design: The research used both qualitative and quantitative methods. It was driven by the belief that good instructional design must address learning context information and pedagogic content information. The data collection emphasized measurement of users' perspectives, experience, and demands in anatomy learning. Measurement: Users' requirements elicited from 12 focus groups were combined and rated by 11 researchers. Collective data were sorted and analyzed by use of multidimensional scaling and cluster analysis. Results: A set of functions and features in high demand across all groups of users was suggested by the results. However, several subgroups of users shared distinct demands. The design of the learning modules will encompass both unified core components and user-specific applications. The design templates will allow sufficient flexibility for dynamic insertion of different learning applications for different users. Conclusion: This study describes how users' requirements, associated with users' learning experiences, were systematically collected and analyzed and then transformed into guidelines informing the iterative design of multiple learning modules. Information about learning challenges and processes was gathered to define essential anatomy teaching strategies. A prototype instrument to design and polish the Visible Human user interface system is currently being developed using ideas and feedback from users. PMID:12087112

  9. Finite Element Analysis of Adaptive-Stiffening and Shape-Control SMA Hybrid Composites

    NASA Technical Reports Server (NTRS)

    Gao, Xiujie; Burton, Deborah; Turner, Travis L.; Brinson, Catherine

    2005-01-01

    Shape memory alloy hybrid composites with adaptive-stiffening or morphing functions are simulated using finite element analysis. The composite structure is a laminated fiber-polymer composite beam with embedded SMA ribbons at various positions with respect to the neutral axis of the beam. Adaptive stiffening or morphing is activated via selective resistance heating of the SMA ribbons or uniform thermal loads on the beam. The thermomechanical behavior of these composites was simulated in ABAQUS using user-defined SMA elements. The examples demonstrate the usefulness of the methods for the design and simulation of SMA hybrid composites. Keywords: shape memory alloys, Nitinol, ABAQUS, finite element analysis, post-buckling control, shape control, deflection control, adaptive stiffening, morphing, constitutive modeling, user element

  10. MINIS: Multipurpose Interactive NASA Information System

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The Multipurpose Interactive NASA Information System (MINIS) was developed in response to the need for a data management system capable of operating on several different minicomputer systems. The desired system had to be capable of performing the functions of a LANDSAT photo descriptive data retrieval system while remaining general enough to accept other user-definable databases. The system also had to be capable of performing database updates and providing user-formatted output reports. The resultant MINI System provides all of these capabilities and several other features that complement the data management system. The MINI System is currently implemented on two minicomputer systems and is in the process of being installed on another. MINIS is operational on four different databases.

  11. Assessing Service Delivery Systems for Assistive Technology in Brazil using HEART Study quality indicators.

    PubMed

    Maximo, Tulio; Clift, Laurence

    2015-01-01

    Recently in Brazil, there have been investments and improvements in the service delivery system for assistive technology provision. However, there is little documentation of this process, or evidence that users are being involved appropriately. The aim was to understand how assistive technology service provision currently functions in Belo Horizonte, Brazil, in order to provide context-specific interventions and recommendations to improve services. The study used a qualitative research design, including visits to key institutions and semi-structured interviews with key stakeholders. Interview questions served two purposes: 1) exploratory, aiming to understand present service functioning; 2) evaluative, aiming to assess staff difficulties in applying existing best practices. Assistive technology services in Belo Horizonte fall under the 'medical model' definition of service delivery developed by AAATE. It was also found that staff lack training and knowledge support to assess user requirements and to involve users during the decision process. Additionally, there is no follow-up stage after the device is delivered. The study clearly defines how service provision functions in Belo Horizonte and the difficulties staff face, providing information for further studies.

  12. Prediction of anthropometric accommodation in aircraft cockpits

    NASA Astrophysics Data System (ADS)

    Zehner, Gregory Franklin

    Designing aircraft cockpits to accommodate the wide range of body sizes existing in the U.S. population has always been a difficult problem for Crewstation Engineers. The approach taken in the design of military aircraft has been to restrict the range of body sizes allowed into flight training, and then to develop standards and specifications to ensure that the majority of the pilots are accommodated. Accommodation in this instance is defined as the ability to: (1) Adequately see, reach, and actuate controls; (2) Have external visual fields so that the pilot can see to land, clear for other aircraft, and perform a wide variety of missions (ground support/attack or air-to-air combat); and (3) Finally, if problems arise, the pilot has to be able to escape safely. Each of these areas is directly affected by the body size of the pilot. Unfortunately, accommodation problems persist and may get worse. Currently, the USAF is considering relaxing body size entrance requirements so that smaller and larger people could become pilots. This will make existing accommodation problems much worse. This dissertation describes a methodology for correcting this problem and demonstrates the method by predicting pilot fit and performance in the USAF T-38A aircraft based on anthropometric data. The methods described can be applied to a variety of design applications where fitting the human operator into a system is a major concern. A systematic approach is described which includes: defining the user population, setting functional requirements that operators must be able to perform, testing the ability of the user population to perform the functional requirements, and developing predictive equations for selecting future users of the system. Also described is a process for the development of new anthropometric design criteria and cockpit design methods that assure body size accommodation is improved in the future.

  13. R-CMap-An open-source software for concept mapping.

    PubMed

    Bar, Haim; Mentch, Lucas

    2017-02-01

    Planning and evaluating projects often involves input from many stakeholders. Fusing and organizing many different ideas, opinions, and interpretations into a coherent and acceptable plan or project evaluation is challenging. This is especially true when seeking contributions from a large number of participants, particularly when not all can participate in group discussions or when some prefer to contribute their perspectives anonymously. One of the major breakthroughs in the area of evaluation and program planning has been the use of graphical tools to represent the brainstorming process. This provides a quantitative framework for organizing ideas and general concepts into simple-to-interpret graphs. We developed a new, open-source concept mapping software called R-CMap, which is implemented in R. This software provides a graphical user interface to guide users through the analytical process of concept mapping. The R-CMap software allows users to generate a variety of plots, including cluster maps, point rating and cluster rating maps, as well as pattern matching and go-zone plots. Additionally, R-CMap is capable of generating detailed reports that contain useful statistical summaries of the data. The plots and reports can be embedded in Microsoft Office tools such as Word and PowerPoint, where users may manually adjust various plot and table features to achieve the best visual results in their presentations and official reports. The graphical user interface of R-CMap allows users to define cluster names, change the number of clusters, select rating variables for relevant plots, and importantly, select subsets of respondents by demographic criteria. The latter is particularly useful to project managers in order to identify different patterns of preferences by subpopulations. R-CMap is user-friendly, and does not require any programming experience.
However, proficient R users can add to its functionality by directly accessing built-in functions in R and sharing new features with the concept mapping community. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. A Literature Review: Website Design and User Engagement.

    PubMed

    Garett, Renee; Chiu, Jason; Zhang, Ly; Young, Sean D

    2016-07-01

    Proper design has become a critical element needed to engage website and mobile application users. However, little research has been conducted to define the specific elements used in effective website and mobile application design. We attempt to review and consolidate research on effective design and to define a short list of elements frequently used in research. The design elements mentioned most frequently in the reviewed literature were navigation, graphical representation, organization, content utility, purpose, simplicity, and readability. We discuss how previous studies define and evaluate these seven elements. This review and the resulting short list of design elements may be used to help designers and researchers to operationalize best practices for facilitating and predicting user engagement.

  15. A Literature Review: Website Design and User Engagement

    PubMed Central

    Garett, Renee; Chiu, Jason; Zhang, Ly; Young, Sean D.

    2015-01-01

    Proper design has become a critical element needed to engage website and mobile application users. However, little research has been conducted to define the specific elements used in effective website and mobile application design. We attempt to review and consolidate research on effective design and to define a short list of elements frequently used in research. The design elements mentioned most frequently in the reviewed literature were navigation, graphical representation, organization, content utility, purpose, simplicity, and readability. We discuss how previous studies define and evaluate these seven elements. This review and the resulting short list of design elements may be used to help designers and researchers to operationalize best practices for facilitating and predicting user engagement. PMID:27499833

  16. Optimization of Residual Stresses in MMC's Using Compensating/Compliant Interfacial Layers. Part 2: OPTCOMP User's Guide

    NASA Technical Reports Server (NTRS)

    Pindera, Marek-Jerzy; Salzar, Robert S.; Williams, Todd O.

    1994-01-01

    A user's guide for the computer program OPTCOMP is presented in this report. This program provides a capability to optimize the fabrication or service-induced residual stresses in uni-directional metal matrix composites subjected to combined thermo-mechanical axisymmetric loading using compensating or compliant layers at the fiber/matrix interface. The user specifies the architecture and the initial material parameters of the interfacial region, which can be either elastic or elastoplastic, and defines the design variables, together with the objective function, the associated constraints and the loading history through a user-friendly data input interface. The optimization procedure is based on an efficient solution methodology for the elastoplastic response of an arbitrarily layered multiple concentric cylinder model that is coupled to the commercial optimization package DOT. The solution methodology for the arbitrarily layered cylinder is based on the local-global stiffness matrix formulation and Mendelson's iterative technique of successive elastic solutions developed for elastoplastic boundary-value problems. The optimization algorithm employed in DOT is based on the method of feasible directions.

  17. Japanese plan for SSF utilization

    NASA Technical Reports Server (NTRS)

    Mizuno, Toshio

    1992-01-01

    The Japanese Experiment Module (JEM) program has made significant progress. The JEM preliminary design review was completed in July 1992; construction of JEM operation facilities has begun; and the micro-G airplane, drop shaft, and micro-G experiment rocket are all operational. The national policy for JEM utilization was also established. The Space Experiment Laboratory (SEL) opened in June '92 and will function as a user support center. Eight JEM multiuser facilities are in phase B, and scientific requirements are being defined for 17 candidate multiuser facilities. The National Joint Research Program is about to start. Precursor missions and early Space Station utilization activities are being defined. This paper summarizes the program in outline and graphic form.

  18. Building an Electronic Handover Tool for Physicians Using a Collaborative Approach between Clinicians and the Development Team.

    PubMed

    Guilbeault, Peggy; Momtahan, Kathryn; Hudson, Jordan

    2015-01-01

    In an effort by The Ottawa Hospital (TOH) to become one of the top 10% performers in patient safety and quality of care, the hospital embarked on improving the communication process during handover between physicians by building an electronic handover tool. It is expected that this tool will decrease information loss during handover. The Information Systems (IS) department engaged a workgroup of physicians to become involved in defining requirements to build an electronic handover tool that suited their clinical handover needs. This group became ultimately responsible for defining the graphical user interface (GUI) and all functionality related to the tool. Prior to the pilot, the Information Systems team will run a usability testing session to ensure the application is user friendly and has met the goals and objectives of the workgroup. As a result, The Ottawa Hospital has developed a fully integrated electronic handover tool built on the Clinical Mobile Application (CMA) which allows clinicians to enter patient problems, notes and tasks available to all physicians to facilitate the handover process.

  19. Spectral resampling based on user-defined inter-band correlation filter: C3 and C4 grass species classification

    NASA Astrophysics Data System (ADS)

    Adjorlolo, Clement; Mutanga, Onisimo; Cho, Moses A.; Ismail, Riyad

    2013-04-01

    In this paper, a user-defined inter-band correlation filter function was used to resample hyperspectral data and thereby mitigate the problem of multicollinearity in classification analysis. The proposed resampling technique convolves the spectral dependence information between a chosen band-centre and its shorter and longer wavelength neighbours. Weighting threshold of inter-band correlation (WTC, Pearson's r) was calculated, whereby r = 1 at the band-centre. Various WTC (r = 0.99, r = 0.95 and r = 0.90) were assessed, and bands with coefficients beyond a chosen threshold were assigned r = 0. The resultant data were used in the random forest analysis to classify in situ C3 and C4 grass canopy reflectance. The respective WTC datasets yielded improved classification accuracies (kappa = 0.82, 0.79 and 0.76) with less correlated wavebands when compared to resampled Hyperion bands (kappa = 0.76). Overall, the results obtained from this study suggested that resampling of hyperspectral data should account for the spectral dependence information to improve overall classification accuracy as well as reducing the problem of multicollinearity.
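    The band-weighting idea described above can be sketched in a few lines. The following is an illustrative reconstruction, not the authors' code: the function name, the toy data, and the choice of a simple weighted average for the convolution step are all assumptions.

```python
import numpy as np

def correlation_filter_resample(spectra, center_idx, r_threshold=0.9):
    """Resample hyperspectral bands around a chosen band-centre.

    Bands whose Pearson correlation with the band-centre falls below
    r_threshold receive zero weight (the paper's WTC cut-off); the
    remaining neighbours are combined into one resampled band by a
    correlation-weighted average. Hypothetical sketch only.
    """
    n_samples, n_bands = spectra.shape
    center = spectra[:, center_idx]
    weights = np.zeros(n_bands)
    for b in range(n_bands):
        r = np.corrcoef(center, spectra[:, b])[0, 1]
        weights[b] = r if r >= r_threshold else 0.0  # r = 1 at the centre
    weights /= weights.sum()
    return spectra @ weights  # one resampled value per sample

# toy data: 5 samples x 4 bands, band 1 as the band-centre; band 3 is
# noisy and should tend to be filtered out by the correlation threshold
rng = np.random.default_rng(0)
base = rng.normal(size=5)
spectra = np.column_stack([base + rng.normal(scale=s, size=5)
                           for s in (0.05, 0.0, 0.05, 2.0)])
resampled = correlation_filter_resample(spectra, center_idx=1)
```

    The same loop, applied once per chosen band-centre across the spectrum, yields the reduced, less collinear band set fed to the random forest classifier.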

  20. Using McStas for modelling complex optics, using simple building bricks

    NASA Astrophysics Data System (ADS)

    Willendrup, Peter K.; Udby, Linda; Knudsen, Erik; Farhi, Emmanuel; Lefmann, Kim

    2011-04-01

    The McStas neutron ray-tracing simulation package is a versatile tool for producing accurate neutron simulations, extensively used for design and optimization of instruments, virtual experiments, data analysis and user training. In McStas, component organization and simulation flow are intrinsically linear: the neutron interacts with the beamline components in sequential order, one by one. Historically, a beamline component with several parts had to be implemented with a complete, internal description of all these parts, e.g. a guide component including all four mirror plates and the logic required to allow scattering between the mirrors. For quite a while, users have requested the ability to allow “components inside components”, or meta-components, combining the functionality of several simple components to achieve more complex behaviour, e.g. four single mirror plates together defining a guide. We show here that it is now possible to define meta-components in McStas, and present a set of detailed, validated examples, including a guide with an embedded, wedged, polarizing mirror system of the Helmholtz-Zentrum Berlin type.

  1. Evolution simulation of lightning discharge based on a magnetohydrodynamics method

    NASA Astrophysics Data System (ADS)

    Fusheng, WANG; Xiangteng, MA; Han, CHEN; Yao, ZHANG

    2018-07-01

    In order to address the load problem of aircraft lightning strikes, lightning channel evolution is simulated for the key physical parameters of aircraft lightning current component C. A numerical model of the discharge channel is established, based on magnetohydrodynamics (MHD), and solved with the FLUENT software. With the aid of user-defined functions and a user-defined scalar, the Lorentz force, Joule heating and material parameters of an air thermal plasma are added. A three-dimensional lightning arc channel is simulated and the arc evolution in space is obtained. The results show that the temperature distribution of the lightning channel is symmetrical and that the hottest region occurs at the center of the channel. The distributions of potential and current density are obtained, showing that the difference in electric potential or energy between two points tends to make the arc channel develop downwards. The arc channel expands at the anode surface due to stagnation of the thermal plasma, and it impinges on the copper plate when it comes into contact with the anode plate.

  2. An Efficient Implementation of Fixed Failure-Rate Ratio Test for GNSS Ambiguity Resolution.

    PubMed

    Hou, Yanqing; Verhagen, Sandra; Wu, Jie

    2016-06-23

    Ambiguity Resolution (AR) plays a vital role in precise GNSS positioning. Correctly-fixed integer ambiguities can significantly improve the positioning solution, while incorrectly-fixed integer ambiguities can bring large positioning errors and, therefore, should be avoided. The ratio test is an extensively used test to validate the fixed integer ambiguities. To choose proper critical values of the ratio test, the Fixed Failure-rate Ratio Test (FFRT) has been proposed, which generates critical values according to user-defined tolerable failure rates. This contribution provides easy-to-implement fitting functions to calculate the critical values. With a massive Monte Carlo simulation, the functions for many different tolerable failure rates are provided, which enriches the choices of critical values for users. Moreover, the fitting functions for the fix rate are also provided, which for the first time allows users to evaluate the conditional success rate, i.e., the success rate once the integer candidates are accepted by FFRT. The superiority of FFRT over the traditional ratio test regarding controlling the failure rate and preventing unnecessary false alarms is shown by a simulation and a real data experiment. In the real data experiment with a baseline of 182.7 km, FFRT achieved much higher fix rates (up to 30% higher) and the same level of positioning accuracy from fixed solutions as compared to the traditional critical value.
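    The acceptance step that FFRT parametrizes can be sketched as follows. This is an illustrative reduction, assuming the classical form of the ratio test (second-best over best squared-norm residual); the paper's contribution, the fitting functions that map a tolerable failure rate to a critical value, is represented here by a plain parameter.

```python
def ratio_test(residual_best, residual_second, critical_value):
    """Classical ratio test for validating fixed GNSS ambiguities.

    Accepts the best integer candidate when the squared-norm residual
    of the second-best candidate is sufficiently larger than that of
    the best. In FFRT, critical_value would come from the fitted
    functions for a user-defined tolerable failure rate; here it is
    just an input. Illustrative sketch only.
    """
    ratio = residual_second / residual_best
    return ratio >= critical_value, ratio

accepted, ratio = ratio_test(residual_best=0.5, residual_second=2.0,
                             critical_value=2.0)
# ratio = 4.0, so the fixed solution is accepted
```

    A fixed critical value (the traditional test) keeps this code unchanged; FFRT only changes where `critical_value` comes from, which is why it drops into existing AR pipelines easily.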

  3. T-RMSD: a web server for automated fine-grained protein structural classification.

    PubMed

    Magis, Cedrik; Di Tommaso, Paolo; Notredame, Cedric

    2013-07-01

    This article introduces the T-RMSD web server (tree-based on root-mean-square deviation), a service allowing the online computation of structure-based protein classification. It has been developed to address the relation between structural and functional similarity in proteins, and it allows a fine-grained structural clustering of a given protein family or group of structurally related proteins using distance RMSD (dRMSD) variations. These distances are computed between all pairs of equivalent residues, as defined by the ungapped columns within a given multiple sequence alignment. Using these generated distance matrices (one per equivalent position), T-RMSD produces a structural tree with support values for each cluster node, reminiscent of bootstrap values. These values, associated with the tree topology, allow a quantitative estimate of structural distances between proteins or group of proteins defined by the tree topology. The clusters thus defined have been shown to be structurally and functionally informative. The T-RMSD web server is a free website open to all users and available at http://tcoffee.crg.cat/apps/tcoffee/do:trmsd.
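    The dRMSD measure underlying the server compares intramolecular distances rather than superposed coordinates. A minimal sketch of that idea, assuming coordinates for residues taken from ungapped alignment columns (this is not the server's actual code):

```python
import numpy as np

def drmsd(coords_a, coords_b):
    """Distance RMSD between two sets of equivalent residue coordinates.

    Compares all pairwise intramolecular distances, so no structural
    superposition is needed; a rigidly moved copy of a structure gives
    a dRMSD of zero. Inputs are (n_residues, 3) arrays.
    """
    da = np.linalg.norm(coords_a[:, None] - coords_a[None, :], axis=-1)
    db = np.linalg.norm(coords_b[:, None] - coords_b[None, :], axis=-1)
    iu = np.triu_indices(len(coords_a), k=1)  # unique residue pairs
    return float(np.sqrt(np.mean((da[iu] - db[iu]) ** 2)))

# a translated copy has identical internal distances
xyz = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0], [1.5, 1.5, 0.0]])
d = drmsd(xyz, xyz + 5.0)  # -> 0.0
```

    T-RMSD builds one such distance comparison per equivalent position across all structure pairs, and the resulting matrices drive the tree construction.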

  4. T-RMSD: a web server for automated fine-grained protein structural classification

    PubMed Central

    Magis, Cedrik; Di Tommaso, Paolo; Notredame, Cedric

    2013-01-01

    This article introduces the T-RMSD web server (tree-based on root-mean-square deviation), a service allowing the online computation of structure-based protein classification. It has been developed to address the relation between structural and functional similarity in proteins, and it allows a fine-grained structural clustering of a given protein family or group of structurally related proteins using distance RMSD (dRMSD) variations. These distances are computed between all pairs of equivalent residues, as defined by the ungapped columns within a given multiple sequence alignment. Using these generated distance matrices (one per equivalent position), T-RMSD produces a structural tree with support values for each cluster node, reminiscent of bootstrap values. These values, associated with the tree topology, allow a quantitative estimate of structural distances between proteins or group of proteins defined by the tree topology. The clusters thus defined have been shown to be structurally and functionally informative. The T-RMSD web server is a free website open to all users and available at http://tcoffee.crg.cat/apps/tcoffee/do:trmsd. PMID:23716642

  5. Some design constraints required for the assembly of software components: The incorporation of atomic abstract types into generically structured abstract types

    NASA Technical Reports Server (NTRS)

    Johnson, Charles S.

    1986-01-01

    It is nearly axiomatic that to take the greatest advantage of the useful features available in a development system, and to avoid the negative interactions of those features, requires the exercise of a design methodology which constrains their use. A major design support feature of the Ada language is abstraction: for data, functions, processes, resources, and system elements in general. Atomic abstract types can be created in packages defining those private types and all of the overloaded operators, functions, and hidden data required for their use in an application. Generically structured abstract types can be created in generic packages defining those structured private types, as buildups from the user-defined data types which are input as parameters. A study is made of the design constraints required for software incorporating either atomic or generically structured abstract types, if the integration of software components based on them is to be subsequently performed. The impact of these techniques on the reusability of software and the creation of project-specific software support environments is also discussed.

  6. The Effects of Selected Modelling Parameters on the Computed Optical Frequency Signatures of Naval Platforms

    DTIC Science & Technology

    2009-04-01

    Contrast signature plots are presented for the simple wireframe model under three conditions: with user-defined thermal boundary conditions and an exhaust plume; with user-defined thermal boundary conditions but no exhaust plume; and with no user-defined thermal boundary conditions or exhaust plume.

  7. ISOT_Calc: A versatile tool for parameter estimation in sorption isotherms

    NASA Astrophysics Data System (ADS)

    Beltrán, José L.; Pignatello, Joseph J.; Teixidó, Marc

    2016-09-01

    Geochemists and soil chemists commonly use parametrized sorption data to assess the transport and impact of pollutants in the environment. However, this evaluation is often hampered by a lack of detailed sorption data analysis, which leads to inaccurate transport modeling. To this end, we present a novel software tool to precisely analyze and interpret sorption isotherm data. Our tool, coded in Visual Basic for Applications (VBA), operates embedded within the Microsoft Excel™ environment. It consists of a user-defined function named ISOT_Calc, followed by a supplementary optimization Excel macro (Ref_GN_LM). The ISOT_Calc function estimates the solute equilibrium concentration in the aqueous and solid phases (Ce and q, respectively). Hence, it offers a very flexible way to optimize the sorption isotherm parameters, as the optimization can be carried out over the residuals of q, of Ce, or of both simultaneously (i.e., orthogonal distance regression). The function includes the most common sorption isotherm models as predefined equations, as well as the possibility to easily introduce custom-defined ones. The Ref_GN_LM macro allows parameter optimization using a Levenberg-Marquardt modified Gauss-Newton iterative procedure. In order to evaluate the performance of the presented tool, both the function and the optimization macro were applied to different sorption data examples described in the literature. Results showed that the optimization of the isotherm parameters was successfully achieved in all cases, indicating the robustness and reliability of the developed tool. Thus, the presented software tool, available to researchers and students for free, has proven to be a user-friendly and interesting alternative to conventional fitting tools used in sorption data analysis.
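    The parameter-estimation task the tool automates can be illustrated with one of its predefined models. The sketch below fits the Freundlich isotherm, q = Kf·Ce^(1/n), by log-linearization and ordinary least squares; this is a deliberately simpler scheme than the tool's Levenberg-Marquardt iteration on q/Ce residuals, and the synthetic data are invented for the example.

```python
import numpy as np

# Freundlich isotherm: q = Kf * Ce**(1/n).
# Taking logs gives a linear model, log q = log Kf + (1/n) * log Ce,
# so Kf and n can be estimated by ordinary least squares.

ce = np.linspace(0.1, 10.0, 20)
q_obs = 2.0 * ce ** (1.0 / 1.5)      # noiseless synthetic data: Kf=2, n=1.5

A = np.column_stack([np.ones_like(ce), np.log(ce)])
coef, *_ = np.linalg.lstsq(A, np.log(q_obs), rcond=None)
kf = np.exp(coef[0])                 # recovered Kf
n = 1.0 / coef[1]                    # recovered n
```

    Nonlinear least squares (as in Ref_GN_LM) becomes necessary with noisy data or models that cannot be linearized, but the log-linear fit above often supplies good starting values for such an iteration.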

  8. Partitioning of Function in a Distributed Graphics System.

    DTIC Science & Technology

    1985-03-01

    Interface specification (VDI) is yet another graphics standardization effort of ANSI committee X31133 [7]. As shown in figure 2-2, the Virtual Device... VDI specification could be realized in a real device, or at least a "black box" which the user treats as a hardware device. The device drivers would...be written by the manufacturer of the graphics device, instead of the author of the graphics system. Since the VDI specification is precisely defined

  9. A Planning Tool for Estimating Waste Generated by a Radiological Incident and Subsequent Decontamination Efforts - 13569

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boe, Timothy; Lemieux, Paul; Schultheisz, Daniel

    2013-07-01

    Management of debris and waste from a wide-area radiological incident would probably constitute a significant percentage of the total remediation cost and effort. The U.S. Environmental Protection Agency's (EPA's) Waste Estimation Support Tool (WEST) is a unique planning tool for estimating the potential volume and radioactivity levels of waste generated by a radiological incident and subsequent decontamination efforts. The WEST was developed to support planners and decision makers by generating a first-order estimate of the quantity and characteristics of waste resulting from a radiological incident. The tool then allows the user to evaluate the impact of various decontamination/demolition strategies on the waste types and volumes generated. WEST consists of a suite of standalone applications and Esri® ArcGIS® scripts for rapidly estimating waste inventories and levels of radioactivity generated from a radiological contamination incident as a function of user-defined decontamination and demolition approaches. WEST accepts Geographic Information System (GIS) shapefiles defining contaminated areas and the extent of contamination. Building stock information, including square footage, building counts, and building composition estimates, is then generated using the Federal Emergency Management Agency's (FEMA's) Hazus®-MH software. WEST then identifies outdoor surfaces by applying pattern recognition to overhead aerial imagery. The results from the GIS calculations are then fed into a Microsoft Excel® 2007 spreadsheet with a custom graphical user interface where the user can examine the impact of various decontamination/demolition scenarios on the quantity, characteristics, and residual radioactivity of the resulting waste streams. (authors)
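
    A first-order estimate of this kind reduces to combining building stock with user-defined strategy fractions. The sketch below is purely illustrative (not WEST itself); every category name and coefficient is a made-up assumption.

```python
# Illustrative sketch only (not the WEST code): first-order waste-volume
# estimate for user-defined decontamination/demolition fractions.
# All numbers and category names here are hypothetical.
building_stock = {                  # m^2 of floor area per category
    "residential": 500_000.0,
    "commercial":  200_000.0,
}
strategy = {                        # fraction of each category demolished
    "residential": 0.10,
    "commercial":  0.25,
}
DEBRIS_PER_M2 = 0.8                 # m^3 of debris per m^2 demolished (assumed)

waste_m3 = sum(area * strategy[cat] * DEBRIS_PER_M2
               for cat, area in building_stock.items())
print(f"Estimated demolition waste: {waste_m3:,.0f} m^3")
```

    Varying the `strategy` fractions is the analogue of comparing decontamination/demolition scenarios in the tool's spreadsheet interface.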

  10. mGrid: A load-balanced distributed computing environment for the remote execution of the user-defined Matlab code

    PubMed Central

    Karpievitch, Yuliya V; Almeida, Jonas S

    2006-01-01

    Background Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. Results mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Conclusion Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple. 
Moreover, the web-based infrastructure of mGrid allows for it to be easily extensible over the Internet. PMID:16539707

  11. mGrid: a load-balanced distributed computing environment for the remote execution of the user-defined Matlab code.

    PubMed

    Karpievitch, Yuliya V; Almeida, Jonas S

    2006-03-15

    Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple. 
Moreover, the web-based infrastructure of mGrid allows for it to be easily extensible over the Internet.
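
    The core idea in both mGrid records — pack the user's code together with its run-time variables, ship the bundle to a worker, and return the result — can be sketched generically. This is a conceptual Python analogue, not the actual Matlab/PHP implementation; the function names are invented for illustration.

```python
# Conceptual sketch of mGrid-style code distribution (not the actual
# mGrid toolbox, which is Matlab/PHP): run-time variables and user code
# are packed, shipped to a worker, executed, and the result returned.
import pickle

def pack(func_source, variables):
    """Bundle user-defined code and its run-time variables."""
    return pickle.dumps({"source": func_source, "vars": variables})

def remote_execute(payload):
    """What a worker endpoint would do with a received bundle."""
    job = pickle.loads(payload)
    scope = dict(job["vars"])
    exec(job["source"], scope)          # define the user's function remotely
    return scope["main"](**job["vars"])

code = "def main(x, y):\n    return x * y\n"
result = remote_execute(pack(code, {"x": 6, "y": 7}))
print(result)  # 42
```

    In a real deployment the payload would travel over HTTP to the worker, and executing received code would of course require trust between client and server.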

  12. MetaPathways v2.5: quantitative functional, taxonomic and usability improvements.

    PubMed

    Konwar, Kishori M; Hanson, Niels W; Bhatia, Maya P; Kim, Dongjae; Wu, Shang-Ju; Hahn, Aria S; Morgan-Lang, Connor; Cheung, Hiu Kan; Hallam, Steven J

    2015-10-15

    Next-generation sequencing is producing vast amounts of sequence information from natural and engineered ecosystems. Although this data deluge has an enormous potential to transform our lives, knowledge creation and translation need software applications that scale with increasing data processing and analysis requirements. Here, we present improvements to MetaPathways, an annotation and analysis pipeline for environmental sequence information that expedites this transformation. We specifically address pathway prediction hazards through integration of a weighted taxonomic distance and enable quantitative comparison of assembled annotations through a normalized read-mapping measure. Additionally, we improve LAST homology searches through BLAST-equivalent E-values and output formats that are natively compatible with prevailing software applications. Finally, an updated graphical user interface allows for keyword annotation query and projection onto user-defined functional gene hierarchies, including the Carbohydrate-Active Enzyme database. MetaPathways v2.5 is available on GitHub: http://github.com/hallamlab/metapathways2. shallam@mail.ubc.ca Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.

  13. Information, intelligence, and interface: the pillars of a successful medical information system.

    PubMed

    Hadzikadic, M; Harrington, A L; Bohren, B F

    1995-01-01

    This paper addresses three key issues facing developers of clinical and/or research medical information systems. 1. INFORMATION. The basic function of every database is to store information about the phenomenon under investigation. There are many ways to organize information in a computer; however, only a few will prove optimal for any real life situation. Computer Science theory has developed several approaches to database structure, with relational theory leading in popularity among end users [8]. Strict conformance to the rules of relational database design rewards the user with consistent data and flexible access to that data. A properly defined database structure minimizes redundancy, i.e., multiple storage of the same information. Redundancy introduces problems when updating a database, since the repeated value has to be updated in all locations--missing even a single value corrupts the whole database, and incorrect reports are produced [8]. To avoid such problems, relational theory offers a formal mechanism for determining the number and content of data files. These files not only preserve the conceptual schema of the application domain, but allow a virtually unlimited number of reports to be efficiently generated. 2. INTELLIGENCE. Flexible access enables the user to harvest additional value from collected data. This value is usually gained via reports defined at the time of database design. Although these reports are indispensable, with proper tools more information can be extracted from the database. For example, machine learning, a sub-discipline of artificial intelligence, has been successfully used to extract knowledge from databases of varying size by uncovering correlations among fields and records [1-6, 9]. This knowledge, represented in the form of decision trees, production rules, and probabilistic networks, clearly adds a flavor of intelligence to the data collection and manipulation system. 3. INTERFACE. 
    Despite the obvious importance of collecting data and extracting knowledge, current systems often impede these processes. Problems stem from a lack of user-friendliness and functionality. To overcome these problems, several features of a successful human-computer interface have been identified [7], including the following "golden" rules of dialog design [7]: consistency, use of shortcuts for frequent users, informative feedback, organized sequence of actions, simple error handling, easy reversal of actions, user-oriented focus of control, and reduced short-term memory load. To this list of rules, we added visual representation of both data and query results, since our experience has demonstrated that users react much more positively to visual rather than textual information. In our design of the Orthopaedic Trauma Registry--under development at the Carolinas Medical Center--we have made every effort to follow the above rules. The results were rewarding--the end users not only want to use the product but also want to participate in its development.
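
    The redundancy problem from point 1 above is easy to demonstrate concretely. The toy records below are hypothetical, not from the registry described in the paper: a flat table repeats a fact in every row, so an update must touch every copy, while a normalized design stores the fact exactly once.

```python
# Toy illustration of the redundancy point above: a flat table repeats
# the physician's phone number, so an update must touch every row;
# the normalized design stores it once. (Hypothetical data.)

# Un-normalized: redundant physician data in every visit record
flat = [
    {"visit": 1, "patient": "A", "physician": "Dr. Lee", "phone": "555-0100"},
    {"visit": 2, "patient": "B", "physician": "Dr. Lee", "phone": "555-0100"},
]

# Normalized: physician facts live in exactly one place
physicians = {"Dr. Lee": {"phone": "555-0199"}}   # one update fixes all visits
visits = [
    {"visit": 1, "patient": "A", "physician": "Dr. Lee"},
    {"visit": 2, "patient": "B", "physician": "Dr. Lee"},
]

def phone_for(visit):
    return physicians[visit["physician"]]["phone"]

assert all(phone_for(v) == "555-0199" for v in visits)
```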

  14. proGenomes: a resource for consistent functional and taxonomic annotations of prokaryotic genomes.

    PubMed

    Mende, Daniel R; Letunic, Ivica; Huerta-Cepas, Jaime; Li, Simone S; Forslund, Kristoffer; Sunagawa, Shinichi; Bork, Peer

    2017-01-04

    The availability of microbial genomes has opened many new avenues of research within microbiology. This has been driven primarily by comparative genomics approaches, which rely on accurate and consistent characterization of genomic sequences. It is nevertheless difficult to obtain consistent taxonomic and integrated functional annotations for defined prokaryotic clades. Thus, we developed proGenomes, a resource that provides user-friendly access to currently 25 038 high-quality genomes whose sequences and consistent annotations can be retrieved individually or by taxonomic clade. These genomes are assigned to 5306 consistent and accurate taxonomic species clusters based on previously established methodology. proGenomes also contains functional information for almost 80 million protein-coding genes, including a comprehensive set of general annotations and more focused annotations for carbohydrate-active enzymes and antibiotic resistance genes. Additionally, broad habitat information is provided for many genomes. All genomes and associated information can be downloaded by user-selected clade or multiple habitat-specific sets of representative genomes. We expect that the availability of high-quality genomes with comprehensive functional annotations will promote advances in clinical microbial genomics, functional evolution and other subfields of microbiology. proGenomes is available at http://progenomes.embl.de. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  15. In-Space Radiator Shape Optimization using Genetic Algorithms

    NASA Technical Reports Server (NTRS)

    Hull, Patrick V.; Kittredge, Ken; Tinker, Michael; SanSoucie, Michael

    2006-01-01

    Future space exploration missions will require the development of more advanced in-space radiators. These radiators should be highly efficient and lightweight, deployable heat rejection systems. Typical radiators for in-space heat mitigation commonly comprise a substantial portion of the total vehicle mass. A small mass savings of even 5-10% can greatly improve vehicle performance. The objective of this paper is to present the development of detailed tools for the analysis and design of in-space radiators using evolutionary computation techniques. The optimality criterion is defined as a two-dimensional radiator with a shape demonstrating the smallest mass for the greatest overall heat transfer; the end result is thus a set of highly functional radiator designs. This cross-disciplinary work combines topology optimization and thermal analysis design by means of a genetic algorithm. The proposed design tool consists of the following steps: design parameterization based on the exterior boundary of the radiator; objective function definition (mass minimization and heat loss maximization); objective function evaluation via finite element analysis (thermal radiation analysis); and optimization based on evolutionary algorithms. The radiator design problem is defined as follows: the input force is a driving temperature and the output reaction is heat loss. Appropriate modeling of the space environment is added to capture its effect on the radiator. The design parameters chosen for this radiator shape optimization problem fall into two classes, variable height along the width of the radiator and a spline curve defining the material boundary of the radiator. The implementation of multiple design parameter schemes allows the user to have more confidence in the radiator optimization tool upon demonstration of convergence between the two design parameter schemes. 
This tool easily allows the user to manipulate the driving temperature regions thus permitting detailed design of in-space radiators for unique situations. Preliminary results indicate an optimized shape following that of the temperature distribution regions in the "cooler" portions of the radiator. The results closely follow the expected radiator shape.
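
    The genetic-algorithm loop described above (parameterize shape, score mass against heat loss, select, mutate) can be sketched with toy physics. This is not the paper's FEA-based tool: the height-profile parameterization is kept, but the thermal model is replaced by an assumed diminishing-returns heat term.

```python
# Minimal genetic-algorithm sketch of the shape-optimization idea above:
# each individual is a set of radiator segment heights; fitness rewards
# radiated heat and penalizes mass. Toy physics, not the paper's FEA model.
import random

random.seed(1)
N_SEG, POP, GENS = 8, 30, 60

def fitness(heights):
    mass = sum(heights)                          # mass ~ total material
    heat = sum(h**0.5 for h in heights)          # diminishing heat return
    return heat - 0.3 * mass                     # weighted objective

def mutate(ind):
    return [min(2.0, max(0.1, h + random.gauss(0, 0.1))) for h in ind]

pop = [[random.uniform(0.1, 2.0) for _ in range(N_SEG)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    elite = pop[: POP // 2]                      # selection (elitism)
    pop = elite + [mutate(random.choice(elite)) for _ in elite]  # variation

best = max(pop, key=fitness)
print(f"best fitness: {fitness(best):.3f}")
```

    Swapping the toy `fitness` for a call into a finite element solver recovers the structure of the paper's design tool.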

  16. Automatic programming of simulation models

    NASA Technical Reports Server (NTRS)

    Schroer, Bernard J.; Tseng, Fan T.; Zhang, Shou X.; Dwan, Wen S.

    1990-01-01

    The concepts of software engineering were used to improve the simulation modeling environment. Emphasis was placed on the application of an element of rapid prototyping, or automatic programming, to assist the modeler in defining the problem specification. Then, once the problem specification has been defined, an automatic code generator is used to write the simulation code. The following two domains were selected for evaluating the concepts of software engineering for discrete event simulation: a manufacturing domain and a spacecraft countdown network sequence. The specific tasks were to: (1) define the software requirements for a graphical user interface to the Automatic Manufacturing Programming System (AMPS) system; (2) develop a graphical user interface for AMPS; and (3) compare the AMPS graphical interface with the AMPS interactive user interface.

  17. Access to the NCAR Research Data Archive via the Globus Data Transfer Service

    NASA Astrophysics Data System (ADS)

    Cram, T.; Schuster, D.; Ji, Z.; Worley, S. J.

    2014-12-01

    The NCAR Research Data Archive (RDA; http://rda.ucar.edu) contains a large and diverse collection of meteorological and oceanographic observations, operational and reanalysis outputs, and remote sensing datasets to support atmospheric and geoscience research. The RDA contains more than 600 dataset collections which support the varying needs of a diverse user community. The number of RDA users is increasing annually, and the most popular method used to access the RDA data holdings is through web based protocols, such as wget and cURL based scripts. In the year 2013, 10,000 unique users downloaded more than 820 terabytes of data from the RDA, and customized data products were prepared for more than 29,000 user-driven requests. In order to further support this increase in web download usage, the RDA is implementing the Globus data transfer service (www.globus.org) to provide a GridFTP data transfer option for the user community. The Globus service is broadly scalable, has an easy to install client, is sustainably supported, and provides a robust, efficient, and reliable data transfer option for RDA users. This paper highlights the main functionality and usefulness of the Globus data transfer service for accessing the RDA holdings. The Globus data transfer service, developed and supported by the Computation Institute at The University of Chicago and Argonne National Laboratory, uses GridFTP as a fast, secure, and reliable method for transferring data between two endpoints. A Globus user account is required to use this service, and data transfer endpoints are defined on the Globus web interface. In the RDA use cases, the access endpoint is created on the RDA data server at NCAR. The data user defines the receiving endpoint for the data transfer, which can be the main file system at a host institution, a personal work station, or laptop. 
Once initiated, the data transfer runs as an unattended background process by Globus, and Globus ensures that the transfer is accurately fulfilled. Users can monitor the data transfer progress on the Globus web interface and optionally receive an email notification once it is complete. Globus also provides a command-line interface to support scripted transfers, which can be useful when embedded in data processing workflows.

  18. Chronic Azithromycin Use in Cystic Fibrosis and Risk of Treatment-Emergent Respiratory Pathogens.

    PubMed

    Cogen, Jonathan D; Onchiri, Frankline; Emerson, Julia; Gibson, Ronald L; Hoffman, Lucas R; Nichols, David P; Rosenfeld, Margaret

    2018-02-23

    Azithromycin has been shown to improve lung function and reduce the number of pulmonary exacerbations in cystic fibrosis patients. Concerns remain, however, regarding the potential emergence of treatment-related respiratory pathogens. To determine if chronic azithromycin use (defined as thrice weekly administration) is associated with increased rates of detection of eight specific respiratory pathogens. We performed a new-user, propensity-score matched retrospective cohort study utilizing data from the Cystic Fibrosis Foundation Patient Registry. Incident azithromycin users were propensity-score matched 1:1 with contemporaneous non-users. Kaplan-Meier curves and Cox proportional hazards regression were used to evaluate the association between chronic azithromycin use and incident respiratory pathogen detection. Analyses were performed separately for each pathogen, limited to patients among whom that pathogen had not been isolated in the two years prior to cohort entry. After propensity score matching, mean age of the cohorts was ~12 years. Chronic azithromycin users had a significantly lower risk of detection of new methicillin-resistant Staphylococcus aureus, non-tuberculous mycobacteria, and Burkholderia cepacia complex compared to non-users. The risk of acquiring the remaining five pathogens was not significantly different between users and non-users. Using an innovative new-user, propensity-score matched study design to minimize indication and selection biases, we found in a predominantly pediatric cohort that chronic azithromycin users had a lower risk of acquiring several cystic fibrosis-related respiratory pathogens. These results may ease concerns that chronic azithromycin exposure increases the risk of acquiring new respiratory pathogens among pediatric cystic fibrosis patients.
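
    The 1:1 propensity-score matching step described in this abstract can be illustrated with a greedy nearest-neighbor sketch. The scores below are invented for illustration, not derived from the Cystic Fibrosis Foundation Patient Registry, and real studies typically also enforce a caliper on the score distance.

```python
# Schematic of the 1:1 propensity-score matching step described above
# (greedy nearest-neighbor on the score; scores here are made up).
users     = {"u1": 0.31, "u2": 0.62, "u3": 0.45}   # id -> propensity score
non_users = {"n1": 0.30, "n2": 0.60, "n3": 0.90, "n4": 0.44}

matches = {}
available = dict(non_users)
for uid, score in sorted(users.items(), key=lambda kv: kv[1]):
    best = min(available, key=lambda nid: abs(available[nid] - score))
    matches[uid] = best
    del available[best]                             # match without replacement

print(matches)
```

    Each azithromycin user is paired with the contemporaneous non-user whose score is closest, and that non-user is removed from the pool so no control is reused.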

  19. Determinants of quality of shared sanitation facilities in informal settlements: case study of Kisumu, Kenya.

    PubMed

    Simiyu, Sheillah; Swilling, Mark; Cairncross, Sandy; Rheingans, Richard

    2017-01-11

    Shared facilities are not recognised as improved sanitation because of maintenance challenges, as they can easily become avenues for the spread of disease. There is thus a need to evaluate the quality of shared facilities, especially in informal settlements, where they are commonly used. A shared facility can be equated to a common good whose management depends on the users. If users do not work collectively towards keeping the facility clean, its quality is likely to depreciate for lack of maintenance. This study examined the quality of shared sanitation facilities and used the common pool resource (CPR) management principles to examine the determinants of shared sanitation quality in the informal settlements of Kisumu, Kenya. Using a multiple case study design, the study employed both quantitative and qualitative methods. In both phases, users of shared sanitation facilities were interviewed, while shared sanitation facilities were inspected. Shared sanitation quality was expressed as a score, which was the dependent variable in a regression analysis. Interviews during the qualitative stage were aimed at understanding the management practices of shared sanitation users. Qualitative data were analysed thematically following the CPR principles. Shared facilities, most of which were dirty, were shared by an average of eight households, and their quality decreased as the number of households sharing increased. The effect of numbers on quality is explained by behaviour reflected in the CPR principles, as it was easier to define the boundaries of shared facilities when there were fewer users, who cooperated towards improving their shared sanitation facility. Other factors, such as defined management systems, cooperation, collective decision making, and social norms, also influenced the behaviour of users towards keeping shared facilities clean and functional. 
Apart from hardware factors, quality of shared sanitation is largely due to group behaviour of users. The CPR principles form a crucial lens through which the dynamics of shared sanitation facilities in informal settlements can be understood. Development and policy efforts should incorporate group behaviour as they determine the quality of shared sanitation facilities.

  20. End user and implementer experiences of mHealth technologies for noncommunicable chronic disease management in young adults: a qualitative systematic review protocol.

    PubMed

    Slater, Helen; Briggs, Andrew; Stinson, Jennifer; Campbell, Jared M

    2017-08-01

    The objective of this review is to systematically identify, review and synthesize relevant qualitative research on end user and implementer experiences of mobile health (mHealth) technologies developed for noncommunicable chronic disease management in young adults. "End users" are defined as young people aged 15-24 years, and "implementers" are defined as health service providers, clinicians, policy makers and administrators.The two key questions we wish to systematically explore from identified relevant qualitative studies or studies with qualitative components are.

  1. A prospective examination of online social network dynamics and smoking cessation

    PubMed Central

    Zhao, Kang; Papandonatos, George D.; Erar, Bahar; Wang, Xi; Amato, Michael S.; Cha, Sarah; Cohn, Amy M.; Pearson, Jennifer L.

    2017-01-01

    Introduction Use of online social networks for smoking cessation has been associated with abstinence. Little is known about the mechanisms through which the formation of social ties in an online network may influence smoking behavior. Using dynamic social network analysis, we investigated how temporal changes of an individual’s number of social network ties are prospectively related to abstinence in an online social network for cessation. In a network where quitting is normative and is the focus of communications among members, we predicted that an increasing number of ties would be positively associated with abstinence. Method Participants were N = 2,657 adult smokers recruited to a randomized cessation treatment trial following enrollment on BecomeAnEX.org, a longstanding Internet cessation program with a large and mature online social network. At 3-months post-randomization, 30-day point prevalence abstinence was assessed and website engagement metrics were extracted. The social network was constructed with clickstream data to capture the flow of information among members. Two network centrality metrics were calculated at weekly intervals over 3 months: 1) in-degree, defined as the number of members whose posts a participant read; and 2) out-degree-aware, defined as the number of members who read a participant’s post and commented, which was subsequently viewed by the participant. Three groups of users were identified based on social network engagement patterns: non-users (N = 1,362), passive users (N = 812), and active users (N = 483). Logistic regression modeled 3-month abstinence by group as a function of baseline variables, website utilization, and network centrality metrics. Results Abstinence rates varied by group (non-users = 7.7%, passive users = 10.7%, active users = 20.7%). 
Significant baseline predictors of abstinence were age, nicotine dependence, confidence to quit, and smoking temptations in social situations among passive users (ps < .05); age and confidence to quit among active users. Among centrality metrics, positive associations with abstinence were observed for in-degree increases from Week 2 to Week 12 among passive and active users, and for out-degree-aware increases from Week 2 to Week 12 among active users (ps < .05). Conclusions This study is the first to demonstrate that increased tie formation among members of an online social network for smoking cessation is prospectively associated with abstinence. It also highlights the value of using individuals’ activities in online social networks to predict their offline health behaviors. PMID:28832621
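
    The in-degree metric defined in this abstract is a simple count over the clickstream-derived graph. The sketch below uses a toy edge list, not the trial's data; an edge (a, b) means member a read a post authored by member b.

```python
# Sketch of the in-degree metric defined above: edge (a, b) means member
# `a` read a post by member `b`, so a participant's in-degree counts the
# distinct members whose posts they read. (Toy clickstream, not trial data.)
reads = [("alice", "bob"), ("alice", "carol"),
         ("bob", "carol"), ("alice", "bob")]      # repeated read counted once

def in_degree(member, edges):
    return len({author for reader, author in edges if reader == member})

print(in_degree("alice", reads))  # alice read posts by 2 distinct members
```

    The out-degree-aware metric would be computed the same way over a second edge list restricted to comment-and-view-back interactions.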

  2. Interactive Machine Learning at Scale with CHISSL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arendt, Dustin L.; Grace, Emily A.; Volkova, Svitlana

    We demonstrate CHISSL, a scalable client-server system for real-time interactive machine learning. Our system is capable of incorporating user feedback incrementally and immediately without a structured or pre-defined prediction task. Computation is partitioned between a lightweight web client and a heavyweight server. The server relies on representation learning and agglomerative clustering to learn a dendrogram, a hierarchical approximation of a representation space. The client uses only this dendrogram to incorporate user feedback into the model via transduction. Distances and predictions for each unlabeled instance are updated incrementally and deterministically, with O(n) space and time complexity. Our algorithm is implemented in a functional prototype, designed to be easy to use by non-experts. The prototype organizes the large amounts of data into recommendations. This allows the user to interact with actual instances by dragging and dropping to provide feedback in an intuitive manner. We applied CHISSL to several domains including cyber, social media, and geo-temporal analysis.
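
    The server/client split described above — cluster once into a dendrogram, then propagate a user's label through it — can be approximated in a few lines. This is a rough sketch of the idea with synthetic data, not the CHISSL implementation or its O(n) incremental update.

```python
# Rough sketch of the CHISSL idea above: the server clusters unlabeled
# instances into a dendrogram; the client propagates a user's label to
# an instance's cluster neighbors (transduction). Data are synthetic.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (10, 2)),      # cluster around (0, 0)
               rng.normal(5, 0.3, (10, 2))])     # cluster around (5, 5)

Z = linkage(X, method="ward")                    # server-side dendrogram
groups = fcluster(Z, t=2, criterion="maxclust")  # cut into two clusters

# User labels instance 0 as "benign"; transduce to its cluster mates.
predicted = ["benign" if g == groups[0] else "unlabeled" for g in groups]
print(predicted.count("benign"))
```

    In CHISSL the client works only on the dendrogram, so this label propagation needs no access to the raw feature vectors.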

  3. EnergyPlus Graphical User Interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2011-01-04

    LBNL, Infosys Technologies and Digital Alchemy are developing a free, comprehensive graphical user interface (GUI) that will enable EnergyPlus to be used more easily and effectively by building designers and other professionals, facilitating its widespread adoption. User requirements have been defined through a series of practitioner workshops. A new schematic editor for HVAC systems will be combined with different building envelope geometry generation tools and IFC-based BIM import and export. LBNL and Digital Alchemy have generated a detailed functional requirements specification, which is being implemented in software by Infosys, LBNL and Digital Alchemy. LBNL and practitioner subcontractors will develop a comprehensive set of templates and libraries and will perform extensive testing of the GUI before it is released in Q3 2011. It is planned to use an Open Platform approach, in which a comprehensive set of well-documented Application Programming Interfaces (APIs) would be provided to facilitate both the development of third-party contributions to the official, standard GUI and the development of derivative works.

  4. User Modeling in Adaptive Hypermedia Educational Systems

    ERIC Educational Resources Information Center

    Martins, Antonio Constantino; Faria, Luiz; Vaz de Carvalho, Carlos; Carrapatoso, Eurico

    2008-01-01

    This document is a survey of the research area of User Modeling (UM) for the specific field of Adaptive Learning. The aims of this document are: to define what a User Model is; to present existing and well-known User Models; to analyze existing standards related to UM; and to compare existing systems. In the scientific area of User Modeling…

  5. LMSS communication network design

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The architecture of the telecommunication network as the first step in the design of the LMSS system is described. A set of functional requirements including the total number of users to be served by the LMSS are hypothesized. The design parameters are then defined at length and are systematically selected such that the resultant system is capable of serving the hypothesized number of users. The design of the backhaul link is presented. The number of multiple backhaul beams required for communication to the base stations is determined. A conceptual procedure for call-routing and locating a mobile subscriber within the LMSS network is presented. The various steps in placing a call are explained, and the relationship between the two sets of UHF and S-band multiple beams is developed. A summary of the design parameters is presented.

  6. Analysis of counting data: Development of the SATLAS Python package

    NASA Astrophysics Data System (ADS)

    Gins, W.; de Groote, R. P.; Bissell, M. L.; Granados Buitrago, C.; Ferrer, R.; Lynch, K. M.; Neyens, G.; Sels, S.

    2018-01-01

    For the analysis of low-statistics counting experiments, a traditional nonlinear least-squares minimization routine may not always provide correct parameter and uncertainty estimates due to the assumptions inherent in the algorithms. In response to this, a user-friendly Python package (SATLAS) was written to provide an easy interface between the data and a variety of minimization algorithms suited for analyzing low- as well as high-statistics data. The advantage of this package is that it allows the user to define their own model function and then compare different minimization routines to determine the optimal parameter values and their respective (correlated) errors. Experimental validation of the different approaches in the package is done through analysis of hyperfine structure data of 203Fr gathered by the CRIS experiment at ISOLDE, CERN.
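    The central idea described above — handing a user-defined model function to interchangeable minimization routines and comparing the results — can be sketched generically. The sketch below uses plain scipy rather than the SATLAS API; the Gaussian-on-background model and the simulated counts are invented for illustration:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(42)

    # User-defined model: a Gaussian peak on a flat background (illustrative only).
    def model(x, amp, center, width, bkg):
        return bkg + amp * np.exp(-0.5 * ((x - center) / width) ** 2)

    # Simulated low-statistics counting data drawn from the model.
    x = np.linspace(-10, 10, 81)
    true = model(x, amp=20.0, center=1.5, width=2.0, bkg=1.0)
    counts = rng.poisson(true)

    # Two cost functions wrapped around the same user-defined model:
    def chi2(p):              # least squares (implicitly assumes Gaussian errors)
        return np.sum((counts - model(x, *p)) ** 2)

    def neg_poisson_llh(p):   # Poisson likelihood (appropriate for counting data)
        mu = np.clip(model(x, *p), 1e-12, None)
        return np.sum(mu - counts * np.log(mu))

    p0 = [10.0, 0.0, 1.0, 0.5]
    fit_chi2 = minimize(chi2, p0, method="Nelder-Mead")
    fit_mle = minimize(neg_poisson_llh, p0, method="Nelder-Mead")
    ```

    For sparse counts the Poisson-likelihood fit is generally the statistically sound choice; comparing its parameter estimates against the least-squares fit mirrors the kind of routine comparison the abstract describes.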

  7. BnmrOffice: A Free Software for β-nmr Data Analysis

    NASA Astrophysics Data System (ADS)

    Saadaoui, Hassan

    A data-analysis framework with a graphical user interface (GUI) has been developed to analyze β-nmr spectra in an automated and intuitive way. This program, named BnmrOffice, is written in C++ and employs the Qt libraries and tools for designing the GUI and CERN's Minuit optimization routines for minimization. The program runs under multiple platforms and is available for free under the terms of the GNU GPL. The GUI is structured in tabs to search, plot and analyze data, along with other functionalities. The user can tweak the minimization options and fit multiple data files (or runs) using single or global fitting routines with pre-defined or new models. Currently, BnmrOffice reads TRIUMF's MUD data and ASCII files, and can be extended to other formats.

  8. A simple tool for stereological assessment of digital images: the STEPanizer.

    PubMed

    Tschanz, S A; Burri, P H; Weibel, E R

    2011-07-01

    STEPanizer is an easy-to-use computer-based software tool for the stereological assessment of digitally captured images from all kinds of microscopical (LM, TEM, LSM) and macroscopical (radiology, tomography) imaging modalities. The program design focuses on providing the user with a defined workflow adapted to most basic stereological tasks. The software is compact, that is, user-friendly without being bulky. STEPanizer comprises the creation of test systems, the appropriate display of digital images with superimposed test systems, a scaling facility, a counting module and an export function for the transfer of results to spreadsheet programs. Here we describe the major workflow of the tool, illustrating its application with two examples from transmission electron microscopy and light microscopy, respectively. © 2011 The Authors. Journal of Microscopy © 2011 Royal Microscopical Society.

  9. CDFISH: an individual-based, spatially-explicit, landscape genetics simulator for aquatic species in complex riverscapes

    USGS Publications Warehouse

    Erin L. Landguth,; Muhlfeld, Clint C.; Luikart, Gordon

    2012-01-01

    We introduce Cost Distance FISHeries (CDFISH), a simulator of population genetics and connectivity in complex riverscapes for a wide range of environmental scenarios of aquatic organisms. The spatially-explicit program implements individual-based genetic modeling with Mendelian inheritance and k-allele mutation on a riverscape with resistance to movement. The program simulates individuals in subpopulations through time employing user-defined functions of individual migration, reproduction, mortality, and dispersal through straying on a continuous resistance surface.
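    The pattern described above — a simulator that advances individuals through time by calling user-defined demographic functions for migration and mortality — can be sketched in miniature. This is not CDFISH code; the toy population, patches, and rates below are invented for illustration:

    ```python
    import random

    # One generation step of a toy individual-based model. The demographic
    # rules (migration, mortality) are supplied by the caller as plain functions.
    def step(population, migrate, survives, rng):
        next_gen = []
        for ind in population:
            if not survives(ind, rng):
                continue                               # user-defined mortality
            ind = dict(ind, patch=migrate(ind, rng))   # user-defined migration
            next_gen.append(ind)
        return next_gen

    # Example user-defined rules: 10% chance to stray downstream, 95% survival.
    def migrate(ind, rng):
        return ind["patch"] + 1 if rng.random() < 0.1 else ind["patch"]

    def survives(ind, rng):
        return rng.random() < 0.95

    rng = random.Random(0)
    pop = [{"patch": 0} for _ in range(1000)]
    for _ in range(10):
        pop = step(pop, migrate, survives, rng)
    ```

    Because the simulator only ever calls `migrate` and `survives`, the same engine supports arbitrary user-defined demographic behavior, which is the design idea the abstract highlights.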

  10. U.S. Geological Survey National Computer Technology Meeting: Program and Abstracts, Norfolk, Virginia, May 17-22, 1992

    DTIC Science & Technology

    1992-05-01

    formats, and character formats that can easily integrate graphics and text into one document. FrameMaker is one of few ERP software programs that has...easier and faster using ERP software. The DIS-II ERP software program is FrameMaker by Frame Technology, Incorporated. FrameMaker uses the X window...functions, calculus, relations, and other complicated math applications. FrameMaker permits the user to define formats for master pages, reference pages

  11. ISSYS: An integrated synergistic Synthesis System

    NASA Technical Reports Server (NTRS)

    Dovi, A. R.

    1980-01-01

    Integrated Synergistic Synthesis System (ISSYS), an integrated system of computer codes in which the sequence of program execution and data flow is controlled by the user, is discussed. The commands available to exert such control, the major ISSYS functions and rules, and the computer codes currently available in the system are described. Computational sequences frequently used in aircraft structural analysis and synthesis are defined. External computer codes utilized by the ISSYS system are documented. A bibliography on the programs is included.

  12. Layered approach to workstation design for medical image viewing

    NASA Astrophysics Data System (ADS)

    Haynor, David R.; Zick, Gregory L.; Heritage, Marcus B.; Kim, Yongmin

    1992-07-01

    Software engineering principles suggest that complex software systems are best constructed from independent, self-contained modules, thereby maximizing the portability, maintainability and modifiability of the produced code. This principle is important in the design of medical imaging workstations, where further developments in technology (CPU, memory, interface devices, displays, network connections) are required for clinically acceptable workstations, and it is desirable to provide different hardware platforms with the same "look and feel" for the user. In addition, the set of desired functions is relatively well understood, but the optimal user interface for delivering these functions on a clinically acceptable workstation still differs by department, specialty, or individual preference. At the University of Washington, we are developing a viewing station based on the IBM RISC/6000 computer and on new technologies that are just becoming commercially available, including advanced voice recognition systems and an ultra-high-speed network. We are developing a set of specifications and a conceptual design for the workstation, and will be producing a prototype. This paper presents our current concepts concerning the architecture and software system design of the future prototype. Our conceptual design specifies requirements for a Database Application Programming Interface (DBAPI) and for a User API (UAPI). The DBAPI consists of a set of subroutine calls that define the admissible transactions between the workstation and an image archive. The UAPI describes the requests a user interface program can make of the workstation. It incorporates basic display and image processing functions, yet is specifically designed to allow extensions to the basic set at the application level. We discuss the fundamental elements of the two APIs and illustrate their application to workstation design.

  13. Regular use of nonsteroidal anti-inflammatory drugs and cognitive function in aging women.

    PubMed

    Kang, Jae Hee; Grodstein, Francine

    2003-05-27

    To examine the relationship between nonsteroidal anti-inflammatory drug (NSAID) use and cognitive decline in young-old women, the authors prospectively studied 16,128 Nurses' Health Study participants, aged 70 to 81 years at baseline, who provided information on NSAID use and potential confounders in biennial questionnaires from 1976 through 1998. From 1995 through 2001, six tests of cognitive function, including the Telephone Interview for Cognitive Status (TICS), were administered by telephone. Second interviews were begun 2 years later and completed on 13,255 women to date. The authors used multiple logistic regression to estimate relative risks (RR) of low baseline scores (defined as the bottom 10%) and substantial decline (worst 10%). Compared with never-users, the RR was 0.75 (95% CI 0.59, 0.96) for a low baseline TICS score with current aspirin use of 15+ years' duration, and 0.79 (95% CI 0.62, 1.02) for current NSAID use (primarily ibuprofen) lasting 8+ years. Results for aspirin users were weaker on other tests, but long-term ibuprofen users had an RR of 0.75 (95% CI 0.56, 1.00) for a low baseline global score (combination of all six tests). The RR for substantial global cognitive decline was 0.93 (95% CI 0.68, 1.26) with long-term aspirin use, and 0.77 (95% CI 0.57, 1.05) with long-term ibuprofen use. In these young-old women, current, long-term NSAID users, especially of nonaspirin agents, showed reduced odds of low cognitive function and possibly lower rates of substantial cognitive decline over 2 years. Continued follow-up will help determine whether these associations differ at older ages.

  14. Accidental Discovery of Information on the User-Defined Social Web: A Mixed-Method Study

    ERIC Educational Resources Information Center

    Lu, Chi-Jung

    2012-01-01

    Frequently interacting with other people or working in an information-rich environment can foster the "accidental discovery of information" (ADI) (Erdelez, 2000; McCay-Peet & Toms, 2010). With the increasing adoption of social web technologies, online user-participation communities and user-generated content have provided users the…

  15. Improving Requirements Generation Thoroughness in User-Centered Workshops: The Role of Prompting and Shared User Stories

    ERIC Educational Resources Information Center

    Read, Aaron

    2013-01-01

    The rise of stakeholder centered software development has led to organizations engaging users early in the development process to help define system requirements. To facilitate user involvement in the requirements elicitation process, companies can use Group Support Systems (GSS) to conduct requirements elicitation workshops. The effectiveness of…

  16. shinyGISPA: A web application for characterizing phenotype by gene sets using multiple omics data combinations.

    PubMed

    Dwivedi, Bhakti; Kowalski, Jeanne

    2018-01-01

    While many methods exist for integrating multi-omics data or defining gene sets, no single tool defines gene sets based on the merging of multiple omics data sets. We present shinyGISPA, an open-source application with a user-friendly web-based interface to define genes according to their similarity in several molecular changes that are driving a disease phenotype. This tool was developed to facilitate the usability of a previously published method, Gene Integrated Set Profile Analysis (GISPA), among researchers with limited computer-programming skills. The GISPA method allows the identification of multiple gene sets that may play a role in the characterization, clinical application, or functional relevance of a disease phenotype. The tool provides an automated workflow that is highly scalable and adaptable to applications that go beyond genomic data merging analysis. It is available at http://shinygispa.winship.emory.edu/shinyGISPA/.

  17. shinyGISPA: A web application for characterizing phenotype by gene sets using multiple omics data combinations

    PubMed Central

    Dwivedi, Bhakti

    2018-01-01

    While many methods exist for integrating multi-omics data or defining gene sets, no single tool defines gene sets based on the merging of multiple omics data sets. We present shinyGISPA, an open-source application with a user-friendly web-based interface to define genes according to their similarity in several molecular changes that are driving a disease phenotype. This tool was developed to facilitate the usability of a previously published method, Gene Integrated Set Profile Analysis (GISPA), among researchers with limited computer-programming skills. The GISPA method allows the identification of multiple gene sets that may play a role in the characterization, clinical application, or functional relevance of a disease phenotype. The tool provides an automated workflow that is highly scalable and adaptable to applications that go beyond genomic data merging analysis. It is available at http://shinygispa.winship.emory.edu/shinyGISPA/. PMID:29415010

  18. Development of a User-Defined Stressor in the Improved Performance Research Integration Tool (IMPRINT) for Conducting Tasks in a Moving Vehicle

    DTIC Science & Technology

    2007-03-01

    Figure 2. User-defined stressor interface. Figure 3. Stressor levels in IMPRINT. Figure 4. Accuracy stressor definition.

  19. Homopolymer tail-mediated ligation PCR: a streamlined and highly efficient method for DNA cloning and library construction.

    PubMed

    Lazinski, David W; Camilli, Andrew

    2013-01-01

    The amplification of DNA fragments, cloned between user-defined 5' and 3' end sequences, is a prerequisite step in the use of many current applications including massively parallel sequencing (MPS). Here we describe an improved method, called homopolymer tail-mediated ligation PCR (HTML-PCR), that requires very little starting template and minimal hands-on effort, is cost-effective, and is suited for use in high-throughput and robotic methodologies. HTML-PCR starts with the addition of homopolymer tails of controlled lengths to the 3' termini of a double-stranded genomic template. The homopolymer tails enable the annealing-assisted ligation of a hybrid oligonucleotide to the template's recessed 5' ends. The hybrid oligonucleotide has a user-defined sequence at its 5' end. This primer, together with a second primer composed of a longer region complementary to the homopolymer tail and fused to a second 5' user-defined sequence, is used in a PCR reaction to generate the final product. The user-defined sequences can be varied to enable compatibility with a wide variety of downstream applications. We demonstrate our new method by constructing MPS libraries starting from nanogram and sub-nanogram quantities of Vibrio cholerae and Streptococcus pneumoniae genomic DNA.

  20. LoopX: A Graphical User Interface-Based Database for Comprehensive Analysis and Comparative Evaluation of Loops from Protein Structures.

    PubMed

    Kadumuri, Rajashekar Varma; Vadrevu, Ramakrishna

    2017-10-01

    Due to their crucial role in function, folding, and stability, protein loops are being targeted for grafting/designing to create novel or alter existing functionality and to improve stability and foldability. With a view to facilitating thorough analysis and effective search options for extracting and comparing loops for sequence and structural compatibility, we developed LoopX, a comprehensively compiled library of sequence and conformational features of ∼700,000 loops from protein structures. The database, equipped with a graphical user interface, is empowered with diverse query tools and search algorithms, with various rendering options to visualize the sequence- and structural-level information along with hydrogen-bonding patterns and backbone φ, ψ dihedral angles of both the target and candidate loops. Two new features, (i) conservation of the polar/nonpolar environment and (ii) conservation of sequence and conformation of specific residues within the loops, have also been incorporated in the search and retrieval of compatible loops for a chosen target loop. Thus, the LoopX server not only serves as a database and visualization tool for sequence and structural analysis of protein loops but also aids in extracting and comparing candidate loops for a given target loop based on user-defined search options.

  1. A review method for UML requirements analysis model employing system-side prototyping.

    PubMed

    Ogata, Shinpei; Matsuura, Saeko

    2013-12-01

    User interface prototyping is an effective method for users to validate the requirements defined by analysts at an early stage of software development. However, a user interface prototype offers weak support for analysts verifying the consistency of specifications about internal aspects of a system, such as business logic. Such inconsistency causes substantial rework, because it often makes it impossible for developers to realize the system based on the specifications. Functional prototyping is an effective way for analysts to verify such consistency, but it is costly and requires more detailed specifications. In this paper, we propose a review method by which analysts can efficiently verify the consistency among several different kinds of UML diagrams by employing system-side prototyping without a detailed model. The system-side prototype system does not implement any business logic, but visualizes the results of the integration among the UML diagrams as Web pages. The usefulness of our proposal was evaluated by applying it to the development of a Library Management System (LMS) for a laboratory, conducted by a group. As a result, our proposal was useful for discovering serious inconsistencies caused by misunderstandings among the members of the group.

  2. AGORA : Organellar genome annotation from the amino acid and nucleotide references.

    PubMed

    Jung, Jaehee; Kim, Jong Im; Jeong, Young-Sik; Yi, Gangman

    2018-03-29

    Next-generation sequencing (NGS) technologies have led to the accumulation of high-throughput sequence data from various organisms. To apply gene annotation of organellar genomes across diverse organisms, better-optimized tools for functional gene annotation are required; almost all existing gene annotation tools focus on the chloroplast genomes of land plants or the mitochondrial genomes of animals. We have developed AGORA, a web application for the fast, user-friendly, and improved annotation of organellar genomes. AGORA annotates genes based on a BLAST-based homology search and clustering with selected reference sequences from the NCBI database or user-defined uploaded data. AGORA can annotate the functional genes in almost all mitochondrial and plastid genomes of eukaryotes, including genomes with an exon-intron structure within a gene or an inverted repeat region. It provides the start and end positions of each gene, BLAST results compared with the reference sequence, and a visualization of the gene map by OGDRAW. Users can freely use the software at https://bigdata.dongguk.edu/gene_project/AGORA/. The main module of the tool is implemented in Python and PHP, and the web page is built with HTML and CSS to support all browsers. Contact: gangman@dongguk.edu.
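    The homology-based annotation step can be illustrated with a deliberately simplified stand-in for a BLAST search: label each query with the reference gene that shares the most k-mers. This is not AGORA's implementation, and the reference sequences below are invented:

    ```python
    # Toy sketch of homology-based annotation (a crude stand-in for BLAST):
    # each query sequence is labeled with the reference gene that shares
    # the largest number of k-mers with it.
    def kmers(seq, k=8):
        return {seq[i:i + k] for i in range(len(seq) - k + 1)}

    def annotate(query, references, k=8):
        q = kmers(query, k)
        best = max(references, key=lambda name: len(q & kmers(references[name], k)))
        score = len(q & kmers(references[best], k))
        return best if score > 0 else None   # no shared k-mers: no annotation

    # Hypothetical mitochondrial gene references (invented sequences).
    refs = {
        "cox1": "ATGTTCGCAGATCGTTGGCTGTTCTCAACC",
        "nad1": "ATGCCTAAGGTTCTGAACGGATTAGCCTTA",
    }
    # A query that is a fragment of the (made-up) cox1 reference:
    assert annotate("GCAGATCGTTGGCTGTTC", refs) == "cox1"
    ```

    A real homology search adds alignment scoring, statistics, and clustering on top of this idea, but the input/output contract — query in, best-matching reference gene out — is the same.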

  3. Software For Graphical Representation Of A Network

    NASA Technical Reports Server (NTRS)

    Mcallister, R. William; Mclellan, James P.

    1993-01-01

    System Visualization Tool (SVT) computer program developed to provide systems engineers with means of graphically representing networks. Generates diagrams illustrating structures and states of networks defined by users. Provides systems engineers with powerful tool simplifying analysis of requirements and testing and maintenance of complex software-controlled systems. Employs visual models supporting analysis of chronological sequences of requirements, simulation data, and related software functions. Applied to pneumatic, hydraulic, and propellant-distribution networks. Used to define and view arbitrary configurations of such major hardware components of system as propellant tanks, valves, propellant lines, and engines. Also graphically displays status of each component. Advantage of SVT: utilizes visual cues to represent configuration of each component within network. Written in Turbo Pascal(R), version 5.0.

  4. Simulating Damage Due to a Lightning Strike Event: Effects of Temperature Dependent Properties on Interlaminar Damage

    NASA Technical Reports Server (NTRS)

    Ghezeljeh, Paria Naghipour; Pineda, Evan Jorge

    2014-01-01

    A multidirectional, carbon fiber-epoxy composite panel is subjected to a simulated lightning strike within a finite element method framework, and the effect of material properties on the failure (delamination) response is investigated through a detailed numerical study. The numerical model of the composite panel consists of individual homogenized plies with user-defined cohesive interface elements between them. Lightning strikes are simulated as an assumed combination of excessive heat and high pressure loadings. It is observed that the initiation and propagation of lightning-induced delamination depend strongly on the temperature dependence of the interfacial fracture toughness. This dependence must be defined properly in order to achieve reliable predictions of lightning-induced delamination in the composite panel.

  5. Manned Orbital Transfer Vehicle (MOTV). Volume 2: Mission handbook

    NASA Technical Reports Server (NTRS)

    Boyland, R. E.; Sherman, S. W.; Morfin, H. W.

    1979-01-01

    The use of the manned orbit transfer vehicle (MOTV) for support of future space missions is defined. Some 20 generic missions are defined, each representative of the types of missions expected to be flown in the future. These include the service and update of communications satellites, emergency repair of surveillance satellites, and passenger transport providing a six-man crew rotation/resupply service to a deep space command post. The propulsive and functional capabilities required of the MOTV to support a particular mission are described, along with data to enable the user to determine the number of STS flights needed to support the mission, mission-peculiar equipment requirements, parametrics on mission phasing and requirements, ground and flight support requirements, recovery considerations, and an IVA/EVA trade analysis.

  6. MASTRE trajectory code update to automate flight trajectory design, performance predictions, and vehicle sizing for support of shuttle and shuttle derived vehicles: Programmers manual

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The information required by a programmer using the Minimum Hamiltonian AScent Trajectory Evaluation (MASTRE) Program is provided, enabling the programmer either to modify the program or to convert it to computers other than the VAX. Documentation for each subroutine or function, based on definitions of the variables and a source listing, is included. Questions concerning the equations, techniques, or input requirements should be answered by either the Engineering or the User's Manual. Three appendices provide a listing of the Root-Sum-Square (RSS) program, a listing of subroutine names and definitions used in the MASTRE User Friendly Interface (UFI) Program, and a listing of the subroutine names and definitions used in the Mass Properties Program. The RSS Program aids in the performance of dispersion analyses: it reads a file generated by the MASTRE Program, calculates dispersion parameters, and generates output tables and output plot files. The UFI Program provides a screen-based user interface to aid the user in providing input to the model. The Mass Properties Program defines the mass properties data for the MASTRE Program through the use of user interface software.

  7. Sequential addition of short DNA oligos in DNA-polymerase-based synthesis reactions

    DOEpatents

    Gardner, Shea N [San Leandro, CA; Mariella, Jr., Raymond P.; Christian, Allen T [Tracy, CA; Young, Jennifer A [Berkeley, CA; Clague, David S [Livermore, CA

    2011-01-18

    A method of fabricating a DNA molecule of user-defined sequence. The method comprises the steps of preselecting a multiplicity of DNA sequence segments that will comprise the DNA molecule of user-defined sequence, separating the DNA sequence segments temporally, and combining the multiplicity of DNA sequence segments with at least one polymerase enzyme wherein the multiplicity of DNA sequence segments join to produce the DNA molecule of user-defined sequence. Sequence segments may be of length n, where n is an even or odd integer. In one embodiment the length of desired hybridizing overlap is specified by the user and the sequences and the protocol for combining them are guided by computational (bioinformatics) predictions. In one embodiment sequence segments are combined from multiple reading frames to span the same region of a sequence, so that multiple desired hybridizations may occur with different overlap lengths. In one embodiment starting sequence fragments are of different lengths, n, n+1, n+2, etc.
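    The role of the user-specified hybridizing overlap of length n can be illustrated in silico. This toy sketch only mimics the bookkeeping of overlap-guided joining, not the patented polymerase chemistry; the target sequence and its segments are invented:

    ```python
    # Toy in-silico illustration of overlap-guided assembly: each segment
    # begins with the last n bases of its predecessor (the hybridizing
    # overlap), and joining means verifying that overlap and appending
    # only the non-overlapping tail of the new segment.
    def assemble(segments, n):
        seq = segments[0]
        for seg in segments[1:]:
            if seq[-n:] != seg[:n]:
                raise ValueError("segments do not hybridize (overlap mismatch)")
            seq += seg[n:]
        return seq

    # A user-defined target split into segments sharing a 4-base overlap
    # (hypothetical sequence, chosen only so the overlaps line up).
    target = "ATGCGTACGTTAGCCGATTACA"
    n = 4
    segments = ["ATGCGTACGT", "ACGTTAGCCGAT", "CGATTACA"]
    assert assemble(segments, n) == target
    ```

    In the patented method the joining is performed enzymatically and the overlap lengths and reading frames are chosen by computational prediction, but the invariant sketched here — consecutive segments must share the user-specified overlap — is what makes the ordered, temporal addition of segments reconstruct the user-defined sequence.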

  8. AOIPS data base management systems support for GARP data sets

    NASA Technical Reports Server (NTRS)

    Gary, J. P.

    1977-01-01

    A data base management system developed to provide flexible access to data sets produced by GARP during its data systems tests is described. The content and coverage of the data base are defined, and a computer-aided, interactive information storage and retrieval system, implemented to facilitate access to user-specified data subsets, is described. The computer programs developed to provide this capability were implemented on the highly interactive, minicomputer-based AOIPS and are referred to as the data retrieval system (DRS). Implemented as a user-interactive but menu-guided system, the DRS permits users to inventory the data tape library and create duplicate or subset data sets based on a user-selected window defined by time and latitude/longitude boundaries. The DRS also permits users to select, display, or produce formatted hard copy of individual data items contained within the data records.

  9. A Preliminary Data Model for Orbital Flight Dynamics in Shuttle Mission Control

    NASA Technical Reports Server (NTRS)

    O'Neill, John; Shalin, Valerie L.

    2000-01-01

    The Orbital Flight Dynamics group in Shuttle Mission Control is investigating new user interfaces in a project called RIOTS [RIOTS 2000]. Traditionally, the individual functions of hardware and software guide the design of displays, which results in an aggregated, if not integrated, interface. The human work system has then been designed and trained to navigate, operate and integrate the processors and displays. The aim of RIOTS is to reduce the cognitive demands of the flight controllers by redesigning the user interface to support the work of the flight controller. This document supports the RIOTS project by defining a preliminary data model for Orbital Flight Dynamics. Section 2 defines an information-centric perspective, which aims to reduce the cognitive workload of the flight controllers by reducing the need for manual integration of information across processors and displays. Section 3 describes the Orbital Flight Dynamics domain. Section 4 defines the preliminary data model for Orbital Flight Dynamics. Section 5 examines the implications of mapping the data model to Orbital Flight Dynamics' current information systems. Two recurring patterns are identified in the Orbital Flight Dynamics work: the iteration/rework cycle and the decision-making/information integration/mirroring role relationship. Section 6 identifies new requirements on Orbital Flight Dynamics work and makes recommendations based on changing the information environment, changing the implementation of the data model, and changing the two recurring patterns.

  10. MatLab Script and Functional Programming

    NASA Technical Reports Server (NTRS)

    Shaykhian, Gholam Ali

    2007-01-01

    MatLab is one of the most widely used very-high-level programming languages for scientific and engineering computations. It is very user-friendly and requires practically no formal programming knowledge. Presented here are MatLab programming aspects, not just MatLab commands, for scientists and engineers who have no formal programming training and little time to spare for learning programming to solve their real-world problems. Specifically provided are programs for visualization. The MatLab seminar covers the functional and script programming aspects of the MatLab language. Specific expectations are: a) recognize MatLab commands, scripts and functions; b) create and run a MatLab function; c) read, recognize, and describe MatLab syntax; d) recognize decisions, loops and matrix operators; e) evaluate scope among multiple files, and multiple functions within a file; f) declare, define and use scalar variables, vectors and matrices.

  11. Increasing Usability in Ocean Observing Systems

    NASA Astrophysics Data System (ADS)

    Chase, A. C.; Gomes, K.; O'Reilly, T.

    2005-12-01

    As observatory systems move to more advanced techniques for instrument configuration and data management, standardized frameworks are being developed to benefit from economies of scale. ACE (A Configuror and Editor) is a tool developed for SIAM (Software Infrastructure and Application for MOOS), a framework for the seamless integration of self-describing plug-and-work instruments into the Monterey Ocean Observing System. As a comprehensive solution, the SIAM infrastructure requires a number of processes to be run to configure an instrument for use within its framework. As solutions move from the lab to the field, the steps needed to implement them must be made bulletproof so that they may be used in the field with confidence. Loosely defined command-line interfaces don't always provide enough user feedback, and business logic can be difficult to maintain over a series of scripts. ACE guides the user through a number of complicated steps, removing the reliance on command-line utilities and reducing the difficulty of completing the necessary steps, while also preventing operator error and enforcing system constraints. Utilizing the cross-platform nature of the Java programming language, ACE provides a complete solution for deploying an instrument within the SIAM infrastructure without depending on special software being installed on the user's computer. Requirements such as the installation of a Unix emulator for users running Windows machines, and the installation of, and ability to use, a CVS client, have all been removed by providing the equivalent functionality from within ACE.
    In order to achieve a "one stop shop" for configuring instruments, ACE had to handle a wide variety of functionality, including compiling Java code, interacting with a CVS server and maintaining client-side CVS information, editing XML, interacting with a server-side database, and negotiating serial port communications through Java. This paper addresses the relative tradeoffs of including all the aforementioned functionality in a single tool, its effects on user adoption of the framework (SIAM) it provides access to, and some of the functionality generally pertinent to data management (XML editing, source code management and compilation, etc.).

  12. A Distributed Trajectory-Oriented Approach to Managing Traffic Complexity

    NASA Technical Reports Server (NTRS)

    Idris, Husni; Wing, David J.; Vivona, Robert; Garcia-Chico, Jose-Luis

    2007-01-01

    In order to handle the expected increase in air traffic volume, the next generation air transportation system is moving towards a distributed control architecture, in which ground-based service providers such as controllers and traffic managers and air-based users such as pilots share responsibility for aircraft trajectory generation and management. While its architecture becomes more distributed, the goal of the Air Traffic Management (ATM) system remains to achieve objectives such as maintaining safety and efficiency. It is, therefore, critical to design appropriate control elements to ensure that aircraft and ground-based actions result in achieving these objectives without unduly restricting user-preferred trajectories. This paper presents a trajectory-oriented approach containing two such elements. One is a trajectory flexibility preservation function, by which aircraft plan their trajectories to preserve flexibility to accommodate unforeseen events. The other is a trajectory constraint minimization function, by which ground-based agents, in collaboration with air-based agents, impose just-enough restrictions on trajectories to achieve ATM objectives such as separation assurance and flow management. The underlying hypothesis is that preserving the trajectory flexibility of each individual aircraft naturally achieves the aggregate objective of avoiding excessive traffic complexity, and that trajectory flexibility is increased by minimizing constraints without jeopardizing the intended ATM objectives. The paper presents conceptually how the two functions operate in a distributed control architecture that includes self-separation. The paper illustrates the concept through hypothetical scenarios involving conflict resolution and flow management, and presents a functional analysis of the interaction and information flow between the functions.
It also presents an analytical framework for defining metrics and developing methods to preserve trajectory flexibility and minimize its constraints. In this framework flexibility is defined in terms of robustness and adaptability to disturbances and the impact of constraints is illustrated through analysis of a trajectory solution space with limited degrees of freedom and in simple constraint situations involving meeting multiple times of arrival and resolving a conflict.

  13. Integrating Actionable User-defined Faceted Rules into the Hybrid Science Data System for Advanced Rapid Imaging & Analysis

    NASA Astrophysics Data System (ADS)

    Manipon, G. J. M.; Hua, H.; Owen, S. E.; Sacco, G. F.; Agram, P. S.; Moore, A. W.; Yun, S. H.; Fielding, E. J.; Lundgren, P.; Rosen, P. A.; Webb, F.; Liu, Z.; Smith, A. T.; Wilson, B. D.; Simons, M.; Poland, M. P.; Cervelli, P. F.

    2014-12-01

    The Hybrid Science Data System (HySDS) scalably powers the ingestion, metadata extraction, cataloging, high-volume data processing, and publication of the geodetic data products for the Advanced Rapid Imaging & Analysis for Monitoring Hazard (ARIA-MH) project at JPL. HySDS uses a heterogeneous set of worker nodes from private & public clouds as well as virtual & bare-metal machines to perform every aspect of the traditional science data system. For our science data users, the forefront of HySDS is the facet search interface, FacetView, which allows them to browse, filter, and access the published products. Users are able to explore the collection of product metadata information and apply multiple filters to constrain the result set down to their particular interests. It allows them to download these faceted products for further analysis and generation of derived products. However, we have also employed a novel approach to faceting where it is also used to apply constraints for custom monitoring of products, system resources, and triggers for automated data processing. The power of the facet search interface is well documented across various domains and its usefulness is rooted in the current state of existence of metadata. However, user needs usually extend beyond what is currently present in the data system. A user interested in synthetic aperture radar (SAR) data over Kilauea will download them from FacetView but would also want email notification of future incoming scenes. The user may even want that data pushed to a remote workstation for automated processing. Better still, these future products could trigger HySDS to run the user's analysis on its array of worker nodes, on behalf of the user, and ingest the resulting derived products. 
We will present our findings in integrating an ancillary, user-defined, system-driven processing capability into HySDS that allows users to define faceted rules from facet constraints and to trigger actions when newly arriving SAR data products match those constraints. We will discuss use cases where users have defined rules for the automated generation of InSAR-derived products: interferograms for California and Kilauea, time-series analyses, and damage proxy maps. These findings are relevant for science data system development of the proposed NASA-ISRO SAR mission.
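The rule mechanism described in this entry can be pictured as a match between facet constraints and product metadata. The following is a minimal, hypothetical sketch of that idea; the facet names, metadata fields, and action labels are illustrative and do not reflect the actual HySDS schema.

```python
# Hypothetical sketch of a faceted rule: a set of facet constraints plus an
# action to trigger when a newly ingested product's metadata matches them.
# Facet names and actions are illustrative, not the actual HySDS schema.

def matches(rule_facets, product_metadata):
    """True if every facet constraint is satisfied by the product metadata."""
    return all(product_metadata.get(k) == v for k, v in rule_facets.items())

def evaluate_rules(rules, product_metadata):
    """Return the actions of every rule whose facet constraints match."""
    return [r["action"] for r in rules if matches(r["facets"], product_metadata)]

rules = [
    {"facets": {"instrument": "SAR", "region": "Kilauea"}, "action": "notify_email"},
    {"facets": {"instrument": "SAR", "region": "California"}, "action": "run_interferogram"},
]

new_product = {"instrument": "SAR", "region": "Kilauea", "mode": "IW"}
print(evaluate_rules(rules, new_product))  # → ['notify_email']
```

A real system would persist such rules alongside the facet index and evaluate them at ingestion time, which is what lets the same facet vocabulary serve both interactive search and automated triggering.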

  14. Responsible and controlled use: Older cannabis users and harm reduction.

    PubMed

    Lau, Nicholas; Sales, Paloma; Averill, Sheigla; Murphy, Fiona; Sato, Sye-Ok; Murphy, Sheigla

    2015-08-01

    Cannabis use is becoming more accepted in mainstream society. In this paper, we use Zinberg's classic theoretical framework of drug, set, and setting to elucidate how older adult cannabis users managed health, social and legal risks in a context of normalized cannabis use. We present selected findings from our qualitative study of Baby Boomer (born 1946-1964) cannabis users in the San Francisco Bay Area. Data collection consisted of a recorded, in-depth life history interview followed by a questionnaire and health survey. Qualitative interviews were analyzed to discover the factors of cannabis harm reduction from the users' perspectives. Interviewees made harm reduction choices based on preferred cannabis derivatives and routes of administration, as well as why, when, where, and with whom to use. Most interviewees minimized cannabis-related harms so they could maintain social functioning in their everyday lives. Responsible and controlled use was described as moderation of quantity and frequency of cannabis used, using in appropriate settings, and respect for non-users. Users contributed to the normalization of cannabis use through normification. Participants followed rituals or cultural practices, characterized by sanctions that helped define "normal" or "acceptable" cannabis use. Users contributed to cannabis normalization through their harm reduction methods. These cultural practices may prove to be more effective than formal legal prohibitions in reducing cannabis-related harms. Findings also suggest that users with access to a regulated market (medical cannabis dispensaries) were better equipped to practice harm reduction. More research is needed on both cannabis culture and alternative routes of administration as harm reduction methods. Copyright © 2015 Elsevier B.V. All rights reserved.

  15. User Needs and Advances in Space Wireless Sensing and Communications

    NASA Technical Reports Server (NTRS)

    Kegege, Obadiah

    2017-01-01

    Decades of space exploration and technology trends for future missions show the need for new approaches in space/planetary sensor networks, observatories, internetworking, and communications/data delivery to Earth. The user needs discussed in this talk include interviews with several scientists and reviews of mission concepts for the next generation of sensors, observatories, and planetary surface missions. These observatories and sensors are envisioned to operate in extreme environments, with advanced autonomy, where communication to Earth is sometimes intermittent and delayed. These sensor nodes require software-defined networking capabilities in order to learn and adapt to the environment, collect science data, internetwork, and communicate. Some use cases also require the intelligence to manage network functions (either as a host), mobility, security, and the interface of data to the physical radio/optical layer. For instance, on a planetary surface, autonomous sensor nodes would create their own ad hoc network, with some nodes handling communication between the wireless sensor networks and orbiting relay satellites. A section of this talk will cover the advances in space communication and internetworking to support future space missions. NASA's Space Communications and Navigation (SCaN) program continues to evolve with the development of optical communication, a new vision of the integrated network architecture with more capabilities, and the adoption of CCSDS space internetworking protocols. Advances in wireless communications hardware and electronics have enabled software-defined networking protocols (DVB-S2, VCM, ACM, DTN, ad hoc, etc.) for improved wireless communication and network management. 
Developing technologies to fulfil these user needs for wireless communications and adoption of standardized communication/internetworking protocols will be a huge benefit to future planetary missions, space observatories, and manned missions to other planets.

  16. Transterm: a database to aid the analysis of regulatory sequences in mRNAs

    PubMed Central

    Jacobs, Grant H.; Chen, Augustine; Stevens, Stewart G.; Stockwell, Peter A.; Black, Michael A.; Tate, Warren P.; Brown, Chris M.

    2009-01-01

    Messenger RNAs, in addition to coding for proteins, may contain regulatory elements that affect how the protein is translated. These include protein and microRNA-binding sites. Transterm (http://mRNA.otago.ac.nz/Transterm.html) is a database of regions and elements that affect translation, with two major, unique components. The first is integrated results of analysis of general features that affect translation (initiation, elongation, termination) for species or strains in Genbank, processed through a standard pipeline. The second is curated descriptions of experimentally determined regulatory elements that function as translational control elements in mRNAs. Transterm focuses on protein binding sites, particularly those in 3′-untranslated regions (3′-UTRs). For this release the interface has been extensively updated based on user feedback. The data is now accessible by strain rather than species; for example, 10 Escherichia coli strains (genomes) are analysed separately. In addition to providing a repository of data, the database also provides tools for users to query their own mRNA sequences. Users can search sequences for Transterm-defined or user-defined regulatory elements, including protein or miRNA targets. Transterm also provides a central core of links to related resources for complementary analyses. PMID:18984623
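A user-supplied element search of the kind this entry describes can be sketched as a simple pattern scan over an mRNA sequence. The sequence and the code below are illustrative and do not come from Transterm, although AUUUA is the widely cited core motif of AU-rich elements.

```python
import re

# Minimal sketch of scanning an mRNA sequence for a user-defined regulatory
# element, in the spirit of Transterm's user-supplied element search.

def find_elements(sequence, pattern):
    """Return (start, matched_text) for each non-overlapping occurrence of a
    regex pattern in an RNA sequence (uppercase, U rather than T)."""
    return [(m.start(), m.group()) for m in re.finditer(pattern, sequence)]

utr = "GGGAUUUAUUUAUUUACCCGAUUUAGG"  # illustrative 3'-UTR fragment
hits = find_elements(utr, r"AUUUA")  # AU-rich element core motif
print(hits)  # → [(3, 'AUUUA'), (11, 'AUUUA'), (20, 'AUUUA')]
```

Note that `re.finditer` reports non-overlapping matches only; a production motif scanner would typically also handle overlapping hits and IUPAC ambiguity codes.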

  17. Services, systems, and policies affecting mobility device users' community mobility: A scoping review: Services, systèmes et politiques influençant la mobilité dans la communauté des utilisateurs d'aides à la mobilité : examen de la portée.

    PubMed

    Jónasdóttir, Sigrún Kristín; Polgar, Jan Miller

    2018-04-01

    Opportunities to travel from one place to another in the community, or community mobility, are especially important for mobility device users' ability to participate fully in society. However, contextual challenges to such mobility exist. This study summarizes the literature on existing community mobility barriers and facilitators of mobility device users created by services, systems, and policies as defined by the International Classification of Functioning, Disability, and Health (ICF). Arksey and O'Malley's approach for scoping studies was used for the review. The extraction chart was organized following the ICF, and frequency counts were used to report the data. The findings suggest that certain factors, such as transportation, open-space planning, and architecture and construction, influence community mobility opportunities. However, little attention has been paid to services, systems, and policies in the research literature, limiting the knowledge on the subject. Further research is needed to examine the relationship between specific services, systems, and policies and mobility device users' mobility within their communities.

  18. al3c: high-performance software for parameter inference using Approximate Bayesian Computation.

    PubMed

    Stram, Alexander H; Marjoram, Paul; Chen, Gary K

    2015-11-01

    The development of Approximate Bayesian Computation (ABC) algorithms for parameter inference which are both computationally efficient and scalable in parallel computing environments is an important area of research. Monte Carlo rejection sampling, a fundamental component of ABC algorithms, is trivial to distribute over multiple processors but is inherently inefficient. While development of algorithms such as ABC Sequential Monte Carlo (ABC-SMC) help address the inherent inefficiencies of rejection sampling, such approaches are not as easily scaled on multiple processors. As a result, current Bayesian inference software offerings that use ABC-SMC lack the ability to scale in parallel computing environments. We present al3c, a C++ framework for implementing ABC-SMC in parallel. By requiring only that users define essential functions such as the simulation model and prior distribution function, al3c abstracts the user from both the complexities of parallel programming and the details of the ABC-SMC algorithm. By using the al3c framework, the user is able to scale the ABC-SMC algorithm in parallel computing environments for his or her specific application, with minimal programming overhead. al3c is offered as a static binary for Linux and OS-X computing environments. The user completes an XML configuration file and C++ plug-in template for the specific application, which are used by al3c to obtain the desired results. Users can download the static binaries, source code, reference documentation and examples (including those in this article) by visiting https://github.com/ahstram/al3c. astram@usc.edu Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
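The division of labour al3c describes, where the user supplies only essential functions such as the prior and the simulation model while the framework runs the generic sampling loop, can be sketched in a few lines. The sketch below is plain ABC rejection sampling in Python, not al3c's C++ ABC-SMC and not its API; the coin-flip model and all names are illustrative.

```python
import random

# Sketch of the user/framework split an ABC framework imposes: the user
# writes prior, simulate, and distance; the framework owns the loop.

def abc_rejection(prior, simulate, distance, observed, epsilon, n_accept, rng):
    """Draw parameters from the prior and keep those whose simulated data
    fall within epsilon of the observed summary statistic."""
    accepted = []
    while len(accepted) < n_accept:
        theta = prior(rng)
        if distance(simulate(theta, rng), observed) <= epsilon:
            accepted.append(theta)
    return accepted

# User-defined pieces: infer a coin's success probability from the
# observed number of heads in 100 flips.
rng = random.Random(0)
prior = lambda r: r.uniform(0.0, 1.0)
simulate = lambda p, r: sum(r.random() < p for _ in range(100))
distance = lambda sim, obs: abs(sim - obs)

posterior = abc_rejection(prior, simulate, distance, observed=70,
                          epsilon=3, n_accept=200, rng=rng)
mean = sum(posterior) / len(posterior)  # posterior mean, near 0.7
```

Rejection sampling like this is trivially parallel (each draw is independent), which is exactly the property the abstract contrasts with the harder-to-distribute ABC-SMC refinement.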

  19. Design strategies and functionality of the Visual Interface for Virtual Interaction Development (VIVID) tool

    NASA Technical Reports Server (NTRS)

    Nguyen, Lac; Kenney, Patrick J.

    1993-01-01

    Development of interactive virtual environments (VE) has typically consisted of three primary activities: model (object) development, model relationship tree development, and environment behavior definition and coding. The model and relationship tree development activities are accomplished with a variety of well-established graphic library (GL) based programs, most utilizing graphical user interfaces (GUI) with point-and-click interactions. Because of this GUI format, little programming expertise on the part of the developer is necessary to create the 3D graphical models or to establish interrelationships between the models. However, the third VE development activity, environment behavior definition and coding, has generally required the greatest amount of time and programmer expertise. Behaviors, characteristics, and interactions between objects and the user within a VE must be defined via command-line C coding prior to rendering the environment scenes. In an effort to simplify this environment behavior definition phase for non-programmers, and to provide easy access to model and tree tools, a graphical interface and development tool has been created. The principal thrust of this research is to effect rapid development and prototyping of virtual environments. This presentation will discuss the 'Visual Interface for Virtual Interaction Development' (VIVID) tool: an X-Windows based system employing drop-down menus for user selection of program access, models and trees, behavior editing, and code generation. Examples of these selections will be highlighted in this presentation, as will the currently available program interfaces. The functionality of this tool allows non-programming users access to all facets of VE development while providing experienced programmers with a collection of pre-coded behaviors. In conjunction with its existing interfaces and predefined suite of behaviors, future development plans for VIVID will be described. 
These include incorporation of dual user virtual environment enhancements, tool expansion, and additional behaviors.

  20. Socially Relevant Knowledge Based Telemedicine

    DTIC Science & Technology

    2010-10-01

    potential to change behavior and/or attitude at different situations and different circumstances. Fogg mentions that there are many reasons that...finding appropriate way to persuade users to perform various activities. Fogg [8] defines persuasive technologies as "interactive computing systems...persuasive technology tools (defined by Fogg), which we are using in our system is explained below: Tunneling It is a process in which users are

  1. Nanoscale Transport Optimization

    DTIC Science & Technology

    2008-12-04

    could be argued that the advantage of using ABAQUS for this modeling construct has more to do with its ability to impose a user-defined subroutine that...finite element analysis. This is accomplished by employing a user defined subroutine for fluid properties at the interface within the finite element...package ABAQUS. Model Components: As noted above the governing equation for the material system is given as …

  2. CMS Configuration Editor: GUI based application for user analysis job

    NASA Astrophysics Data System (ADS)

    de Cosa, A.

    2011-12-01

    We present the user interface and the software architecture of the Configuration Editor for the CMS experiment. The analysis workflow is organized in a modular way, integrated within the CMS framework, which organizes user analysis code in a flexible way. The Python scripting language is adopted to define the job configuration that drives the analysis workflow. It can be a challenging task for users, especially newcomers, to develop analysis jobs that manage the configuration of the many required modules. For this reason a graphical tool has been conceived to edit and inspect configuration files. A set of common analysis tools defined in the CMS Physics Analysis Toolkit (PAT) can be steered and configured using the Config Editor. A user-defined analysis workflow can be produced starting from a standard configuration file, applying and configuring PAT tools according to the specific user requirements. CMS users can adopt this tool, the Config Editor, to create their analyses while visualizing in real time the effects of their actions. They can visualize the structure of their configuration, look at the modules included in the workflow, inspect the dependencies existing among the modules, and check the data flow. They can see the values to which parameters are set and change them according to what is required by their analysis task. Integrating the common tools into the GUI required adopting an object-oriented structure in the Python definition of the PAT tools and defining a layer of abstraction from which all PAT tools inherit.
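The abstraction layer this entry ends on, a common base class from which all tools inherit so a GUI can discover, configure, and apply them uniformly, can be sketched as follows. The class and parameter names are invented for illustration and are not the actual PAT/CMSSW API.

```python
# Hedged sketch of an object-oriented tool layer: every analysis tool
# inherits from a common base, so a GUI can inspect a tool's parameters,
# override them, and apply the tool to a job configuration uniformly.
# All names here are hypothetical, not the real PAT/CMSSW interfaces.

class ConfigTool:
    """Base class: a named tool with inspectable, overridable parameters."""
    parameters = {}  # subclasses declare defaults here

    def __init__(self, **overrides):
        self.values = {**self.parameters, **overrides}

    def apply(self, config):
        """Append this tool's configured module to the job configuration."""
        config.setdefault("modules", []).append(
            {"tool": type(self).__name__, "params": self.values})
        return config

class JetCleaner(ConfigTool):
    parameters = {"min_pt": 20.0, "max_eta": 2.4}

job = {}
JetCleaner(min_pt=30.0).apply(job)
print(job["modules"][0]["params"])  # → {'min_pt': 30.0, 'max_eta': 2.4}
```

Because the base class exposes `parameters` as data, a GUI can render an editing form for any tool without knowing it in advance, which is the point of the abstraction layer the abstract describes.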

  3. A set-theoretic model reference adaptive control architecture for disturbance rejection and uncertainty suppression with strict performance guarantees

    NASA Astrophysics Data System (ADS)

    Arabi, Ehsan; Gruenwald, Benjamin C.; Yucelen, Tansel; Nguyen, Nhan T.

    2018-05-01

    Research in adaptive control algorithms for safety-critical applications is primarily motivated by the fact that these algorithms have the capability to suppress the effects of adverse conditions resulting from exogenous disturbances, imperfect dynamical system modelling, degraded modes of operation, and changes in system dynamics. Although government and industry agree on the potential of these algorithms in providing safety and reducing vehicle development costs, a major issue is the inability to achieve a-priori, user-defined performance guarantees with adaptive control algorithms. In this paper, a new model reference adaptive control architecture for uncertain dynamical systems is presented to address disturbance rejection and uncertainty suppression. The proposed framework is predicated on a set-theoretic adaptive controller construction using generalised restricted potential functions. The key feature of this framework allows the system error bound between the state of an uncertain dynamical system and the state of a reference model, which captures a desired closed-loop system performance, to remain less than an a-priori, user-defined worst-case performance bound, and hence it has the capability to enforce strict performance guarantees. Examples are provided to demonstrate the efficacy of the proposed set-theoretic model reference adaptive control architecture.

  4. 5G: rethink mobile communications for 2020+.

    PubMed

    Chih-Lin, I; Han, Shuangfeng; Xu, Zhikun; Sun, Qi; Pan, Zhengang

    2016-03-06

    The 5G network is anticipated to meet the challenging requirements of mobile traffic in the 2020s, which are characterized by super high data rate, low latency, high mobility, high energy efficiency and high traffic density. This paper provides an overview of China Mobile's 5G vision and potential solutions. Three key characteristics of 5G are analysed, i.e. super fast, soft and green. The main 5G R&D themes are further elaborated, which include five fundamental rethinkings of the traditional design methodologies. The 5G network design considerations are also discussed, with cloud radio access network, ultra-dense network, software defined network and network function virtualization examined as key potential solutions towards a green and soft 5G network. The paradigm shift to user-centric network operation from the traditional cell-centric operation is also investigated, where the decoupled downlink and uplink, control and data, and adaptive multiple connections provide sufficient means to achieve a user-centric 5G network with 'no more cells'. The software-defined air interface is investigated under a uniform framework and can adapt its parameters to satisfy the varied requirements of different 5G scenarios. © 2016 The Author(s).

  5. Structural Tailoring of Advanced Turboprops (STAT). Theoretical manual

    NASA Technical Reports Server (NTRS)

    Brown, K. W.

    1992-01-01

    This manual describes the theories in the Structural Tailoring of Advanced Turboprops (STAT) computer program, which was developed to perform numerical optimizations on highly swept propfan blades. The optimization procedure seeks to minimize an objective function, defined as either direct operating cost or aeroelastic differences between a blade and its scaled model, by tuning internal and external geometry variables that must satisfy realistic blade design constraints. The STAT analyses include an aerodynamic efficiency evaluation, a finite element stress and vibration analysis, an acoustic analysis, a flutter analysis, and a once-per-revolution (1-p) forced response life prediction capability. The STAT constraints include blade stresses, blade resonances, flutter, tip displacements, and a 1-P forced response life fraction. The STAT variables include all blade internal and external geometry parameters needed to define a composite material blade. The STAT objective function is dependent upon a blade baseline definition which the user supplies to describe a current blade design for cost optimization or for the tailoring of an aeroelastic scale model.

  6. Structural Tailoring of Advanced Turboprops (STAT). Theoretical manual

    NASA Astrophysics Data System (ADS)

    Brown, K. W.

    1992-10-01

    This manual describes the theories in the Structural Tailoring of Advanced Turboprops (STAT) computer program, which was developed to perform numerical optimizations on highly swept propfan blades. The optimization procedure seeks to minimize an objective function, defined as either direct operating cost or aeroelastic differences between a blade and its scaled model, by tuning internal and external geometry variables that must satisfy realistic blade design constraints. The STAT analyses include an aerodynamic efficiency evaluation, a finite element stress and vibration analysis, an acoustic analysis, a flutter analysis, and a once-per-revolution (1-p) forced response life prediction capability. The STAT constraints include blade stresses, blade resonances, flutter, tip displacements, and a 1-P forced response life fraction. The STAT variables include all blade internal and external geometry parameters needed to define a composite material blade. The STAT objective function is dependent upon a blade baseline definition which the user supplies to describe a current blade design for cost optimization or for the tailoring of an aeroelastic scale model.

  7. Separating Form from Function; the StarView Experience

    NASA Astrophysics Data System (ADS)

    Pollizzi, J.

    The advent of various display building tools has brought the use of advanced windowing techniques to even casual software developers. This has quickened the development cycle from conception to implementation for many applications. Such approaches are extremely attractive and cost effective when a common application is to be deployed to a number of users. There is, however, a side effect when using such approaches. The inherent problem is that while the tools allow for a clean separation of user interaction from specific application code, they still closely tie the presentation mechanism to the functional use. StarView, the user interface to the Hubble Data Archives, was developed to expressly distinguish presentation aspects from the functional capabilities needed in the interface. StarView's functional capabilities (creating queries, identifying database fields, data display, archive requests, help) represent that part of the application that is common to all users. These capabilities are independent of the database to be interrogated or the actual displays or interactions to be used. The presentation aspects (forms, menus, buttons, help text, navigation, even key-bindings) are all defined through symbolic notations. By imposing this distinction as a fundamental design goal, StarView is able to support rapid independent evolution of its forms, and yet maintain a highly stable core system that implements the common functions needed by all users. The need for this flexibility has been borne out by our experiences. The majority of recent StarView iterations, made by Space Telescope Science Institute (STScI) scientists, have been on presentation format. There has been little modification of StarView's existing capabilities, but nearly 100 person-hours have gone into tuning the displays for the initial public release. Further, history tells us that regardless of this effort, there will still be a call for changes or requests for personalized formats. 
This paper investigates this distinction between presentation and function and it explores how the StarView system supports this separation. As background, the typical use of one such tool, ICS's BuilderXcessory, will be briefly described. A sample application using the tool will be compared to a StarView example accomplishing a similar task. The effect of a subsequent presentation change to the task will then be examined in both cases. Finally, there will be a brief discussion of design tradeoffs that best leverage the use of interface tools and yet maintain a flexible approach in connecting the presentation aspects of the application to its functional capabilities.

  8. Exploring NASA OMI Level 2 Data With Visualization

    NASA Technical Reports Server (NTRS)

    Wei, Jennifer; Yang, Wenli; Johnson, James; Zhao, Peisheng; Gerasimov, Irina; Pham, Long; Vicente, Gilberto

    2014-01-01

    Satellite data products are important for a wide variety of applications that can bring far-reaching benefits to the science community and the broader society. These benefits can best be achieved if the satellite data are well utilized and interpreted, such as model inputs from satellite, or extreme events (such as volcano eruptions, dust storms, etc.). Unfortunately, this is not always the case, despite the abundance and relative maturity of numerous satellite data products provided by NASA and other organizations. Such obstacles may be avoided by allowing users to visualize satellite data as "images", with accurate pixel-level (Level-2) information, including pixel coverage area delineation and science team recommended quality screening for individual geophysical parameters. We present a prototype service from the Goddard Earth Sciences Data and Information Services Center (GES DISC) supporting Aura OMI Level-2 Data with GIS-like capabilities. Functionality includes selecting data sources (e.g., multiple parameters under the same scene, like NO2 and SO2, or the same parameter with different aggregation methods, like NO2 in OMNO2G and OMNO2D products), user-defined area-of-interest and temporal extents, zooming, panning, overlaying, sliding, and data subsetting, reformatting, and reprojection. The system will allow any user-defined portal interface (front-end) to connect to our backend server with OGC standard-compliant Web Mapping Service (WMS) and Web Coverage Service (WCS) calls. This back-end service should greatly enhance its expandability to integrate additional outside data/map sources.
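A front-end portal would talk to such a back-end with standard OGC calls. Below is a sketch of building a WMS 1.1.1 GetMap request for a user-defined area and time; the endpoint URL and layer name are placeholders, while the query parameters are the ones defined by the WMS standard.

```python
from urllib.parse import urlencode

# Build a standard OGC WMS 1.1.1 GetMap request. The endpoint and layer
# name below are illustrative placeholders, not GES DISC's actual service.

def wms_getmap_url(endpoint, layer, bbox, width, height, time=None):
    """Compose a GetMap URL for a user-defined area-of-interest and time."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",
        "SRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),  # minx,miny,maxx,maxy
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    if time:
        params["TIME"] = time  # ISO 8601 temporal extent
    return endpoint + "?" + urlencode(params)

url = wms_getmap_url("https://example.gov/wms", "OMI_NO2",
                     bbox=(-125, 25, -65, 50), width=800, height=400,
                     time="2014-06-01")
```

Because the parameters are standard, any WMS-capable client (a web portal, a desktop GIS) can issue the same request, which is the interoperability point the abstract makes.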

  9. Exploring NASA OMI Level 2 Data With Visualization

    NASA Technical Reports Server (NTRS)

    Wei, Jennifer C.; Yang, Wenli; Johnson, James; Zhao, Peisheng; Gerasimov, Irina; Pham, Long; Vincente, Gilbert

    2014-01-01

    Satellite data products are important for a wide variety of applications that can bring far-reaching benefits to the science community and the broader society. These benefits can best be achieved if the satellite data are well utilized and interpreted, such as model inputs from satellite, or extreme events (such as volcano eruptions, dust storms, etc.).Unfortunately, this is not always the case, despite the abundance and relative maturity of numerous satellite data products provided by NASA and other organizations. Such obstacles may be avoided by allowing users to visualize satellite data as images, with accurate pixel-level (Level-2) information, including pixel coverage area delineation and science team recommended quality screening for individual geophysical parameters. We present a prototype service from the Goddard Earth Sciences Data and Information Services Center (GES DISC) supporting Aura OMI Level-2 Data with GIS-like capabilities. Functionality includes selecting data sources (e.g., multiple parameters under the same scene, like NO2 and SO2, or the same parameter with different aggregation methods, like NO2 in OMNO2G and OMNO2D products), user-defined area-of-interest and temporal extents, zooming, panning, overlaying, sliding, and data subsetting, reformatting, and reprojection. The system will allow any user-defined portal interface (front-end) to connect to our backend server with OGC standard-compliant Web Mapping Service (WMS) and Web Coverage Service (WCS) calls. This back-end service should greatly enhance its expandability to integrate additional outside data-map sources.

  10. Visual PEF Reader - VIPER

    NASA Technical Reports Server (NTRS)

    Luo, Victor; Khanampornpan, Teerapat; Boehmer, Rudy A.; Kim, Rachel Y.

    2011-01-01

    This software graphically displays all pertinent information from a Predicted Events File (PEF) using the Java Swing framework, which allows for multi-platform support. The PEF is hard to weed through when looking for specific information, and the Mars Reconnaissance Orbiter (MRO) Mission Planning & Sequencing Team (MPST) desired a different way to visualize the data. This tool provides the team with a visual way of reviewing and error-checking the sequence product. The front end of the tool contains much of the aesthetically appealing material for viewing. The time stamp is displayed in the top left corner, and highlighted details are displayed in the bottom left corner. The time bar stretches along the top of the window, and the rest of the space is allotted for blocks and step functions. A preferences window is used to control the layout of the sections, along with the ability to choose the color and size of the blocks. Double-clicking on a block will show the information contained within it. Zooming in to a certain level will graphically display that information as an overlay on the block itself. Other functions include using hotkeys to navigate, an option to jump to a specific time, enabling a vertical line, and double-clicking to zoom in/out. The back end involves a configuration file that allows a more experienced user to pre-define the structure of a block, a single event, or a step function. The individual will have to determine what information is important within each block and what actually defines the beginning and end of a block. This gives the user much more flexibility in terms of what the tool is searching for. In addition to this configurability, all the settings in the preferences window are saved in the configuration file as well.
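The configuration-driven block definition described above can be illustrated with a small, hypothetical example: patterns that mark where a block begins and ends in an events file. The event names and file format here are invented and are not the actual PEF syntax.

```python
import re

# Hypothetical illustration of user-configured block definitions: each
# entry names a block and gives regex patterns for its start and end
# events. The event names and line format are invented, not real PEF.

block_defs = {
    "COMM_PASS": {"start": r"BEGIN_COMM", "end": r"END_COMM"},
}

def find_blocks(lines, defs):
    """Pair start/end events into (name, start_line, end_line) blocks."""
    blocks, open_at = [], {}
    for i, line in enumerate(lines):
        for name, d in defs.items():
            if re.search(d["start"], line):
                open_at[name] = i
            elif re.search(d["end"], line) and name in open_at:
                blocks.append((name, open_at.pop(name), i))
    return blocks

pef = ["2011-001T00:00 BEGIN_COMM",
       "2011-001T00:05 DATA",
       "2011-001T00:30 END_COMM"]
print(find_blocks(pef, block_defs))  # → [('COMM_PASS', 0, 2)]
```

The design point is that the matching logic stays generic while the user-edited configuration decides what counts as a block, mirroring the flexibility the abstract attributes to the tool.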

  11. National Ignition Facility Laser System Performance

    DOE PAGES

    Spaeth, Mary L.; Manes, Kenneth R.; Bowers, M.; ...

    2017-03-23

The National Ignition Facility (NIF) laser is the culmination of more than 40 years of work at Lawrence Livermore National Laboratory dedicated to the delivery of laser systems capable of driving experiments for the study of high-energy-density physics. Although NIF was designed to support a number of missions, it was clear from the beginning that its biggest challenge was to meet the requirements for pursuit of inertial confinement fusion. Meeting the Project Completion Criteria for NIF in 2009 and for the National Ignition Campaign (NIC) in 2012 included meeting the NIF Functional Requirements and Primary Criteria that were established for the project in 1994. Finally, during NIC and as NIF transitioned to a user facility, its goals were expanded to include requirements defined by the broader user community as well as by laser system designers and operators.

  12. An overview of 5G network slicing architecture

    NASA Astrophysics Data System (ADS)

    Chen, Qiang; Wang, Xiaolei; Lv, Yingying

    2018-05-01

With the development of mobile communication technology, the traditional single-network model has been unable to meet the needs of users, and the demand for differentiated services is increasing. To solve this problem, the fifth generation of mobile communication technology (5G) came into being. As one of the key technologies of 5G, network slicing is a core technique of network virtualization and software-defined networking, enabling slices to flexibly provide one or more network services according to users' needs [1]. Each slice can independently tailor its network functions to the requirements of the business scenario and the traffic model, and manage the layout of the corresponding network resources, so as to improve the flexibility of network services and the utilization of resources, and to enhance the robustness and reliability of the whole network [2].

  13. System approach to distributed sensor management

    NASA Astrophysics Data System (ADS)

    Mayott, Gregory; Miller, Gordon; Harrell, John; Hepp, Jared; Self, Mid

    2010-04-01

Since 2003, the US Army's RDECOM CERDEC Night Vision and Electronic Sensors Directorate (NVESD) has been developing a distributed Sensor Management System (SMS) that utilizes a framework demonstrating application-layer, net-centric sensor management. The core principles of the design support distributed and dynamic discovery of sensing devices and processes through a multi-layered implementation. This results in a sensor management layer that acts as a system with defined interfaces for which the characteristics, parameters, and behaviors can be described. Within the framework, the definition of a protocol is required to establish the rules for how distributed sensors should operate. The protocol defines the behaviors, capabilities, and message structures needed to operate within the functional design boundaries. The protocol definition addresses the requirements for a device (sensor or process) to dynamically join or leave a sensor network, dynamically describe device control and data capabilities, and allow dynamic addressing of publish and subscribe functionality. The message structure is a multi-tiered definition that identifies standard, extended, and payload representations, specifically designed to accommodate the need for standard representations of common functions while supporting feature-based functions that are typically vendor specific. The dynamic qualities of the protocol give a user GUI application the flexibility to map widget-level controls to each device based on capabilities reported in real time. The SMS approach is designed to accommodate scalability and flexibility within a defined architecture. The distributed sensor management framework and its application to a tactical sensor network are described in this paper.
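The multi-tiered message structure described above (standard, extended, and vendor-specific payload tiers) might be sketched as follows. All type names and fields are hypothetical illustrations, not the actual SMS protocol definition.

```python
from dataclasses import dataclass, field

@dataclass
class StandardHeader:
    # Standard tier: common fields every conforming device understands.
    device_id: str
    message_type: str  # e.g. "JOIN", "CAPABILITIES", "DATA"

@dataclass
class SensorMessage:
    # Multi-tiered message: standard tier, optional extended tier,
    # and an opaque vendor-specific payload tier.
    standard: StandardHeader
    extended: dict = field(default_factory=dict)
    payload: bytes = b""

def describe_capabilities(device_id, controls):
    """A device advertises its controls on joining the network, so a
    GUI can map widget-level controls to it dynamically."""
    return SensorMessage(
        standard=StandardHeader(device_id=device_id,
                                message_type="CAPABILITIES"),
        extended={"controls": controls},
    )

msg = describe_capabilities("cam-01", ["pan", "tilt", "zoom"])
```

Separating the tiers this way lets a GUI consume the standard and extended tiers generically while passing the payload through to vendor code untouched.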

  14. The process of installing REDCap, a web based database supporting biomedical research: the first year.

    PubMed

    Klipin, M; Mare, I; Hazelhurst, S; Kramer, B

    2014-01-01

Clinical and research data are essential for patient care, research and healthcare system planning. REDCap™ is a web-based tool for research data curatorship developed at Vanderbilt University in Nashville, USA. The Faculty of Health Sciences at the University of the Witwatersrand, Johannesburg, South Africa identified the need for a cost-effective data management instrument. REDCap was installed as per the user agreement with Vanderbilt University in August 2012. In order to assist other institutions that may lack in-house Information Technology capacity, this paper describes the installation and support of REDCap and incorporates an analysis of user uptake over the first year of use. We reviewed the staffing requirements, costs of installation, process of installation and necessary infrastructure and end-user requests following the introduction of REDCap at Wits. The University Legal Office and Human Research Ethics Committee were consulted regarding the REDCap end-user agreement. Bi-monthly user meetings resulted in a training workshop in August 2013. We compared our REDCap software user numbers and records before and after the first training workshop. Human resources were recruited from existing staff. Installation costs were limited to servers and security certificates. The total cost to provide a functional REDCap platform was less than $9000. Eighty-one (81) users were registered in the first year. After the first training workshop, user numbers increased by 59 in one month, and the total number of active users rose to 140 by the end of August 2013. Custom software applications for REDCap were created by collaboration between clinicians and software developers. REDCap was installed and maintained at limited cost. A small number of people with defined skills can support multiple REDCap users in two to four hours a week. End-user training increased the number of users, the number of projects created and the number of projects moved to production.

  15. The Process of Installing REDCap, a Web Based Database Supporting Biomedical Research

    PubMed Central

    Mare, I.; Hazelhurst, S.; Kramer, B.

    2014-01-01

    Summary Background Clinical and research data are essential for patient care, research and healthcare system planning. REDCap™ is a web-based tool for research data curatorship developed at Vanderbilt University in Nashville, USA. The Faculty of Health Sciences at the University of the Witwatersrand, Johannesburg, South Africa identified the need for a cost-effective data management instrument. REDCap was installed as per the user agreement with Vanderbilt University in August 2012. Objectives In order to assist other institutions that may lack in-house Information Technology capacity, this paper describes the installation and support of REDCap and incorporates an analysis of user uptake over the first year of use. Methods We reviewed the staffing requirements, costs of installation, process of installation and necessary infrastructure and end-user requests following the introduction of REDCap at Wits. The University Legal Office and Human Research Ethics Committee were consulted regarding the REDCap end-user agreement. Bi-monthly user meetings resulted in a training workshop in August 2013. We compared our REDCap software user numbers and records before and after the first training workshop. Results Human resources were recruited from existing staff. Installation costs were limited to servers and security certificates. The total cost to provide a functional REDCap platform was less than $9000. Eighty-one (81) users were registered in the first year. After the first training workshop, user numbers increased by 59 in one month, and the total number of active users rose to 140 by the end of August 2013. Custom software applications for REDCap were created by collaboration between clinicians and software developers. Conclusion REDCap was installed and maintained at limited cost. A small number of people with defined skills can support multiple REDCap users in two to four hours a week. End-user training increased the number of users, the number of projects created and the number of projects moved to production. PMID:25589907

  16. Conceptual Modeling via Logic Programming

    DTIC Science & Technology

    1990-01-01

    Define User Interface and Query Language ... 4. Define Procedures for Specifying Output 5. Select Logic Programming Language 6. Develop ... baselines change model, sessions and baselines. It was changed considerably with the advent of the window ... 6. Develop Methodology for C3I Users. ... Model Development: Implications for Communications of a Cognitive Analysis of ... for Conceptual Modeling Via Logic Programming. Marina del Rey, Calif.:

  17. Application of MCT Failure Criterion using EFM

    DTIC Science & Technology

    2010-03-26

    because HELIUS:MCT™ does not facilitate this. Attempts have been made to use the ABAQUS native thermal expansion model in combination with Helius-MCT... ABAQUS using a user-defined element subroutine, EFM. Comparisons have been made between the analysis results using the EFM-MCT code and the HELIUS:MCT™ code... using the Element-Failure Method (EFM) in ABAQUS. The EFM-MCT has been implemented in ABAQUS using a user-defined element subroutine, EFM. Comparisons

  18. Homopolymer tail-mediated ligation PCR: a streamlined and highly efficient method for DNA cloning and library construction

    PubMed Central

    Lazinski, David W.; Camilli, Andrew

    2013-01-01

    The amplification of DNA fragments, cloned between user-defined 5′ and 3′ end sequences, is a prerequisite step in the use of many current applications including massively parallel sequencing (MPS). Here we describe an improved method, called homopolymer tail-mediated ligation PCR (HTML-PCR), that requires very little starting template, minimal hands-on effort, is cost-effective, and is suited for use in high-throughput and robotic methodologies. HTML-PCR starts with the addition of homopolymer tails of controlled lengths to the 3′ termini of a double-stranded genomic template. The homopolymer tails enable the annealing-assisted ligation of a hybrid oligonucleotide to the template's recessed 5′ ends. The hybrid oligonucleotide has a user-defined sequence at its 5′ end. This primer, together with a second primer composed of a longer region complementary to the homopolymer tail and fused to a second 5′ user-defined sequence, are used in a PCR reaction to generate the final product. The user-defined sequences can be varied to enable compatibility with a wide variety of downstream applications. We demonstrate our new method by constructing MPS libraries starting from nanogram and sub-nanogram quantities of Vibrio cholerae and Streptococcus pneumoniae genomic DNA. PMID:23311318

  19. Human-centered sensor-based Bayesian control: Increased energy efficiency and user satisfaction in commercial lighting

    NASA Astrophysics Data System (ADS)

    Granderson, Jessica Ann

    2007-12-01

    The need for sustainable, efficient energy systems is the motivation that drove this research, which targeted the design of an intelligent commercial lighting system. Lighting in commercial buildings consumes approximately 13% of all the electricity generated in the US. Advanced lighting controls1 intended for use in commercial office spaces have proven to save up to 45% in electricity consumption. However, they currently comprise only a fraction of the market share, resulting in a missed opportunity to conserve energy. The research goals driving this dissertation relate directly to barriers hindering widespread adoption: increase user satisfaction, and provide increased energy savings through more sophisticated control. To satisfy these goals, an influence diagram was developed to perform daylighting actuation. This algorithm was designed to balance the potentially conflicting lighting preferences of building occupants with the efficiency desires of building facilities management. A supervisory control policy was designed to implement load shedding under a demand response tariff. Such tariffs offer incentives, through price reductions, for customers to reduce their consumption during periods of peak demand. In developing the value function, occupant user testing was conducted, which determined that computer and paper tasks require different illuminance levels, and that user preferences are sufficiently consistent to attain statistical significance. Approximately ten facilities managers were also interviewed and surveyed to isolate their lighting preferences with respect to measures of lighting quality and energy savings. Results from simulation as well as physical implementation and user testing indicate that the intelligent controller can increase occupant satisfaction, efficiency, cost savings, and management satisfaction relative to existing commercial daylighting systems. Several important contributions were realized by satisfying the research goals.
A general model of a daylighted environment was designed, and a practical means of user preference identification was defined. Further, a set of general procedures was identified for the design of human-centered, sensor-based, decision-analytic systems, and for the identification of the allowable uncertainty in nodes of interest. To confirm generality, a vehicle health monitoring problem was defined and solved using these two procedures. 1 'Daylighting' systems use sensors to determine room occupancy and available sunlight, and automatically dim the lights in response.

  20. Traffic Generator (TrafficGen) Version 1.4.2: Users Guide

    DTIC Science & Technology

    2016-06-01

    events, the user has to enter them manually. We will research and implement a way to better define and organize the multicast addresses so they can be... the network with Transmission Control Protocol and User Datagram Protocol Internet Protocol traffic. Each node generating network traffic in an... 3. TrafficGen Graphical User Interface (GUI) 3.1 Anatomy of the User Interface 3.2 Scenario Configuration and MGEN Files 4. Working with

  1. Mining approximate temporal functional dependencies with pure temporal grouping in clinical databases.

    PubMed

    Combi, Carlo; Mantovani, Matteo; Sabaini, Alberto; Sala, Pietro; Amaddeo, Francesco; Moretti, Ugo; Pozzi, Giuseppe

    2015-07-01

    Functional dependencies (FDs) typically represent associations over facts stored by a database, such as "patients with the same symptom get the same therapy." In more recent years, some extensions have been introduced to represent both temporal constraints (temporal functional dependencies - TFDs), as in "for any given month, patients with the same symptom must have the same therapy, but their therapy may change from one month to the next one," and approximate properties (approximate functional dependencies - AFDs), as in "patients with the same symptom generally have the same therapy." An AFD holds for most of the facts stored by the database, enabling some data to deviate from the defined property: the percentage of data which may violate the given property is user-defined. According to this scenario, in this paper we introduce approximate temporal functional dependencies (ATFDs) and use them to mine clinical data. Specifically, we considered the need for deriving new knowledge from psychiatric and pharmacovigilance data. ATFDs may be defined and measured either on temporal granules (e.g., grouping data by day, week, month, or year) or on sliding windows (e.g., a fixed-length time interval which moves over the time axis): in this regard, we propose and discuss some specific and efficient data mining techniques for ATFDs. We also developed two running prototypes and showed the feasibility of our proposal by mining two real-world clinical data sets. The clinical interest of the dependencies derived from the psychiatry and pharmacovigilance domains confirms the soundness and the usefulness of the proposed techniques. Copyright © 2014 Elsevier Ltd. All rights reserved.
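The idea of an ATFD measured on temporal granules can be illustrated with a small sketch: group records by granule and left-hand-side value, count the tuples that deviate from the majority right-hand-side value, and compare the violation ratio against the user-defined threshold. The record fields and helper names below are illustrative only, not the paper's algorithms.

```python
from collections import defaultdict

def atfd_violation_ratio(records, granule, lhs, rhs):
    """Fraction of records violating the FD lhs -> rhs within each
    temporal granule. If the ratio is below a user-defined threshold,
    the approximate temporal FD is considered to hold."""
    groups = defaultdict(lambda: defaultdict(list))
    for r in records:
        groups[granule(r)][r[lhs]].append(r[rhs])
    violations = 0
    for granule_groups in groups.values():
        for values in granule_groups.values():
            # Keep the majority rhs value; the rest are deviations.
            majority = max(values.count(v) for v in set(values))
            violations += len(values) - majority
    return violations / len(records)

records = [
    {"month": "2014-01", "symptom": "S1", "therapy": "T1"},
    {"month": "2014-01", "symptom": "S1", "therapy": "T1"},
    {"month": "2014-01", "symptom": "S1", "therapy": "T2"},  # deviating tuple
    {"month": "2014-02", "symptom": "S1", "therapy": "T2"},  # may change next month
]
ratio = atfd_violation_ratio(records, lambda r: r["month"], "symptom", "therapy")
```

With a user-defined threshold of, say, 0.3, the dependency "for any given month, same symptom implies same therapy" would be accepted here despite the single deviating tuple.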

  2. Issues central to a useful image understanding environment

    NASA Astrophysics Data System (ADS)

    Beveridge, J. Ross; Draper, Bruce A.; Hanson, Allen R.; Riseman, Edward M.

    1992-04-01

    A recent DARPA initiative has sparked interest in software environments for computer vision. The goal is a single environment to support both basic research and technology transfer. This paper lays out six fundamental attributes such a system must possess: (1) support for both C and Lisp, (2) extensibility, (3) data sharing, (4) data query facilities tailored to vision, (5) graphics, and (6) code sharing. The first three attributes fundamentally constrain the system design. Support for both C and Lisp demands some form of database or data-store for passing data between languages. Extensibility demands that system support facilities, such as spatial retrieval of data, be readily extended to new user-defined datatypes. Finally, data sharing demands that data saved by one user, including data of a user-defined type, must be readable by another user.

  3. Argo: an integrative, interactive, text mining-based workbench supporting curation

    PubMed Central

    Rak, Rafal; Rowley, Andrew; Black, William; Ananiadou, Sophia

    2012-01-01

    Curation of biomedical literature is often supported by the automatic analysis of textual content that generally involves a sequence of individual processing components. Text mining (TM) has been used to enhance the process of manual biocuration, but has been focused on specific databases and tasks rather than an environment integrating TM tools into the curation pipeline, catering for a variety of tasks, types of information and applications. Processing components usually come from different sources and often lack interoperability. The well-established Unstructured Information Management Architecture is a framework that addresses interoperability by defining common data structures and interfaces. However, most of the efforts are targeted towards software developers and are not suitable for curators, or are otherwise inconvenient to use on a higher level of abstraction. To overcome these issues we introduce Argo, an interoperable, integrative, interactive and collaborative system for text analysis with a convenient graphical user interface to ease the development of processing workflows and boost productivity in labour-intensive manual curation. Robust, scalable text analytics follow a modular approach, adopting component modules for distinct levels of text analysis. The user interface is available entirely through a web browser, which saves the user from going through often complicated and platform-dependent installation procedures. Argo comes with a predefined set of processing components commonly used in text analysis, while giving the users the ability to deposit their own components. The system accommodates various areas and levels of user expertise, from TM and computational linguistics to ontology-based curation. One of the key functionalities of Argo is its ability to seamlessly incorporate user-interactive components, such as manual annotation editors, into otherwise completely automatic pipelines.
As a use case, we demonstrate the functionality of an in-built manual annotation editor that is well suited for in-text corpus annotation tasks. Database URL: http://www.nactem.ac.uk/Argo PMID:22434844

  4. Fast simulation tool for ultraviolet radiation at the earth's surface

    NASA Astrophysics Data System (ADS)

    Engelsen, Ola; Kylling, Arve

    2005-04-01

    FastRT is a fast, yet accurate, UV simulation tool that computes downward surface UV doses, UV indices, and irradiances in the spectral range 290 to 400 nm with a resolution as small as 0.05 nm. It computes a full UV spectrum within a few milliseconds on a standard PC, and enables the user to convolve the spectrum with user-defined and built-in spectral response functions including the International Commission on Illumination (CIE) erythemal response function used for UV index calculations. The program accounts for the main radiative input parameters, i.e., instrumental characteristics, solar zenith angle, ozone column, aerosol loading, clouds, surface albedo, and surface altitude. FastRT is based on look-up tables of carefully selected entries of atmospheric transmittances and spherical albedos, and exploits the smoothness of these quantities with respect to atmospheric, surface, geometrical, and spectral parameters. An interactive site, http://nadir.nilu.no/~olaeng/fastrt/fastrt.html, enables the public to run the FastRT program with most input options. This page also contains updated information about FastRT and links to freely downloadable source codes and binaries.
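The convolution with the CIE erythemal response mentioned above can be sketched as follows, using the published CIE (McKinlay-Diffey) erythemal action spectrum. The trapezoidal integration and the sample spectrum are illustrative simplifications, not FastRT's actual look-up-table implementation.

```python
def erythemal_weight(wl):
    """CIE (McKinlay-Diffey) erythemal action spectrum; wl in nm."""
    if wl <= 298.0:
        return 1.0
    if wl <= 328.0:
        return 10.0 ** (0.094 * (298.0 - wl))
    return 10.0 ** (0.015 * (140.0 - wl))

def uv_index(wavelengths, irradiances):
    """Weight the spectral irradiance (W m^-2 nm^-1) with the erythemal
    response, integrate over wavelength (trapezoid rule), and scale by
    40 m^2/W to obtain the UV index."""
    weighted = [e * erythemal_weight(wl)
                for wl, e in zip(wavelengths, irradiances)]
    integral = sum(
        0.5 * (weighted[i] + weighted[i + 1])
        * (wavelengths[i + 1] - wavelengths[i])
        for i in range(len(wavelengths) - 1)
    )
    return 40.0 * integral

# Toy three-point spectrum; a real spectrum would span 290-400 nm
# at much finer resolution.
uvi = uv_index([300.0, 310.0, 320.0], [0.01, 0.05, 0.1])
```

The same integration loop works for any user-defined response function in place of `erythemal_weight`, which is essentially the flexibility the abstract describes.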

  5. Experiences with the Twitter Health Surveillance (THS) System

    PubMed Central

    Rodríguez-Martínez, Manuel

    2018-01-01

    Social media has become an important platform to gauge public opinion on topics related to our daily lives. In practice, processing these posts requires big data analytics tools, since the volume of data and the speed of production overwhelm single-server solutions. Building an application to capture and analyze posts from social media can be a challenge simply because it requires combining a set of complex software tools that are often tricky to configure, tune, and maintain. In many instances, the application ends up being an assorted collection of Java/Scala programs or Python scripts that developers cobble together to generate the data products they need. In this paper, we present the Twitter Health Surveillance (THS) application framework. THS is designed as a platform to allow end-users to monitor a stream of tweets, and process the stream with a combination of built-in functionality and their own user-defined functions. We discuss the architecture of THS, and describe its implementation atop the Apache Hadoop Ecosystem. We also present several lessons learned while developing our current prototype. PMID:29607412
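The combination of built-in functionality and user-defined functions described above might look roughly like this in miniature; the function names and tweet fields are invented for illustration and are not the THS API (which runs atop the Hadoop ecosystem rather than in-process Python).

```python
# Built-in steps the platform always applies (here: normalize text).
BUILTINS = [lambda t: {**t, "text": t["text"].lower()}]
UDFS = []

def register_udf(fn):
    """Hypothetical hook letting end-users plug their own function
    into the stream-processing pipeline."""
    UDFS.append(fn)
    return fn

@register_udf
def flag_flu_mentions(tweet):
    # A user-defined surveillance rule: tag tweets mentioning "flu".
    tweet["flu"] = "flu" in tweet["text"]
    return tweet

def process_stream(tweets):
    """Run every tweet through built-ins, then user-defined functions."""
    for tweet in tweets:
        for step in BUILTINS + UDFS:
            tweet = step(tweet)
        yield tweet

results = list(process_stream([{"text": "Got the FLU today"},
                               {"text": "sunny day"}]))
```

In the real framework the same pattern would be distributed across cluster workers, but the contract is identical: built-ins run first, then each registered user-defined function sees every tweet.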

  6. Experiences with the Twitter Health Surveillance (THS) System.

    PubMed

    Rodríguez-Martínez, Manuel

    2017-06-01

    Social media has become an important platform to gauge public opinion on topics related to our daily lives. In practice, processing these posts requires big data analytics tools, since the volume of data and the speed of production overwhelm single-server solutions. Building an application to capture and analyze posts from social media can be a challenge simply because it requires combining a set of complex software tools that are often tricky to configure, tune, and maintain. In many instances, the application ends up being an assorted collection of Java/Scala programs or Python scripts that developers cobble together to generate the data products they need. In this paper, we present the Twitter Health Surveillance (THS) application framework. THS is designed as a platform to allow end-users to monitor a stream of tweets, and process the stream with a combination of built-in functionality and their own user-defined functions. We discuss the architecture of THS, and describe its implementation atop the Apache Hadoop Ecosystem. We also present several lessons learned while developing our current prototype.

  7. Impulsivity in men with prescription of benzodiazepines and methadone in prison.

    PubMed

    Moreno-Ramos, Luis; Fernández-Serrano, María José; Pérez-García, Miguel; Verdejo-García, Antonio

    2016-06-14

    Benzodiazepine and methadone use has been associated with various neuropsychological impairments. However, to the best of our knowledge, no studies have been carried out on the effect of these substances (either separately or combined) on impulsive personality, including studies in prisoners. The aim of this study is to examine the impulsive personality of a sample of 134 male prisoners using the Sensitivity to Punishment and Sensitivity to Reward Questionnaire (Torrubia, Avila, Molto, & Caseras, 2001) and the UPPS-P Scale (Cyders et al., 2007). The sample included methadone users, combined methadone and benzodiazepine users, abstinent polydrug users and non-dependent drug users. The results showed that drug users have greater sensitivity to reward, positive urgency, negative urgency and sensation seeking than non-dependent users. Methadone users showed more sensitivity to punishment and lack of perseverance with respect to other users. No differences were found between methadone+benzodiazepine users and other groups. The secondary aim is to examine which impulsive personality dimensions are related to the two motivational systems proposed by Gray (BIS-BAS) using exploratory factor analysis. Results showed two different components. One component was defined by the subscales sensitivity to reward, positive urgency, negative urgency and sensation seeking. The second component was defined by the subscales sensitivity to punishment, lack of perseverance and lack of premeditation.

  8. Managing End User Computing in the Federal Government.

    ERIC Educational Resources Information Center

    General Services Administration, Washington, DC.

    This report presents an initial approach developed by the General Services Administration for the management of end user computing in federal government agencies. Defined as technology used directly by individuals in need of information products, end user computing represents a new field encompassing such technologies as word processing, personal…

  9. User's Manual for the Object User Interface (OUI): An Environmental Resource Modeling Framework

    USGS Publications Warehouse

    Markstrom, Steven L.; Koczot, Kathryn M.

    2008-01-01

    The Object User Interface is a computer application that provides a framework for coupling environmental-resource models and for managing associated temporal and spatial data. The Object User Interface is designed to be easily extensible to incorporate models and data interfaces defined by the user. Additionally, the Object User Interface is highly configurable through the use of a user-modifiable, text-based control file that is written in the eXtensible Markup Language. The Object User Interface user's manual provides (1) installation instructions, (2) an overview of the graphical user interface, (3) a description of the software tools, (4) a project example, and (5) specifications for user configuration and extension.
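A user-modifiable XML control file of the kind described above might be read like this; the element and attribute names are invented for illustration and do not reflect the actual OUI schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical control file wiring a model and its data interfaces
# into the framework. In OUI this would live in a user-editable file.
control_file = """
<oui>
  <model name="prms" command="run_prms.sh">
    <input  name="precip" type="timeseries"/>
    <output name="runoff" type="timeseries"/>
  </model>
</oui>
"""

root = ET.fromstring(control_file)
# Map each model to the names of its declared data interfaces.
models = {m.get("name"): [io.get("name") for io in m]
          for m in root.findall("model")}
```

Because the coupling lives in a text file rather than in compiled code, a user can add a model or data interface by editing XML, which is the extensibility the manual describes.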

  10. Spacelab user interaction

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The results of the third and final phase of a study undertaken to define means of optimizing the Spacelab experiment data system by interactively manipulating the flow of data are presented. A number of payload-applicable interactive techniques and an integrated interaction system for each of two possible payloads are described. These interaction systems have been functionally defined and are accompanied by block diagrams, hardware specifications, software sizing and speed requirements, operational procedures and cost/benefits analysis data for both onboard and ground-based system elements. It is shown that the accrued benefits are attributable to a reduction in data processing costs, obtained largely through a considerable reduction in the quantity of data that might otherwise be generated without interaction. An additional anticipated benefit is the increased scientific value obtained by the quicker return of all useful data.

  11. [Development of opened instrument for generating and measuring physiological signal].

    PubMed

    Chen, Longcong; Hu, Guohu; Gao, Bin

    2004-12-01

    An open instrument with a liquid crystal display (LCD) for generating and measuring physiological signals is introduced in this paper. Based on a single-chip microcomputer, the instrument uses an LCD screen to display signal waveforms and information, and provides man-machine interaction through a keyboard. The instrument can produce not only commonly used predefined signals, by drawing on stored data and the relevant algorithms, but also user-defined signals; it is therefore open with respect to signal generation. In addition, the instrument is highly extensible thanks to its modular, computer-like design. It offers functions such as displaying, measuring and saving physiological signals, and features low power consumption, small volume, low cost and portability. Hence the instrument is convenient for laboratory teaching, clinical examination and the maintenance of medical instruments.

  12. MAPA: an interactive accelerator design code with GUI

    NASA Astrophysics Data System (ADS)

    Bruhwiler, David L.; Cary, John R.; Shasharina, Svetlana G.

    1999-06-01

    The MAPA code is an interactive accelerator modeling and design tool with an X/Motif GUI. MAPA has been developed in C++ and makes full use of object-oriented features. We present an overview of its features and describe how users can independently extend the capabilities of the entire application, including the GUI. For example, a user can define a new model for a focusing or accelerating element. If the appropriate form is followed, and the new element is "registered" with a single line in the specified file, then the GUI will fully support this user-defined element type after it has been compiled and then linked to the existing application. In particular, the GUI will bring up windows for modifying any relevant parameters of the new element type. At present, one can use the GUI for phase space tracking, finding fixed points and generating line plots for the Twiss parameters, the dispersion and the accelerator geometry. The user can define new types of simulations which the GUI will automatically support by providing a menu option to execute the simulation and subsequently rendering line plots of the resulting data.
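The registration pattern described above, where a single registration step makes a user-defined element type visible to the whole application including the GUI, can be sketched as follows. MAPA itself is C++, so this Python sketch with invented names only illustrates the idea, not MAPA's actual interfaces.

```python
# Central registry consulted by the rest of the application.
ELEMENT_REGISTRY = {}

def register_element(name):
    """Registering a class under a name is the single step that makes
    it available application-wide (tracking, plotting, GUI editing)."""
    def wrap(cls):
        ELEMENT_REGISTRY[name] = cls
        return cls
    return wrap

@register_element("quadrupole")
class Quadrupole:
    # Parameters a GUI window would expose for editing.
    parameters = {"length": 0.5, "gradient": 12.0}

def editable_parameters(name):
    """What a GUI would query to build a parameter-editing window
    for any registered element type, user-defined or built-in."""
    return ELEMENT_REGISTRY[name].parameters
```

The key design point is that the GUI never enumerates element types itself; it asks the registry, so a newly registered user-defined element is supported with no GUI changes.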

  13. 9th Annual CMMI Technology Conference and User Group-Tuesday

    DTIC Science & Technology

    2009-11-19

    evaluating and quantifying risk likelihood and severity. Step 4 Project defines thresholds for each risk category. Step 5 Project defines bounds on the...defines consistent criteria for evaluating and quantifying risk likelihood and severity in the Risk Management Plan. Step 4 Project defines

  14. Assured Human-Autonomy Interaction through Machine Self-Confidence

    NASA Astrophysics Data System (ADS)

    Aitken, Matthew

    Autonomous systems employ many layers of approximations in order to operate in increasingly uncertain and unstructured environments. The complexity of these systems makes it hard for a user to understand the system's capabilities, especially if the user is not an expert. However, if autonomous systems are to be used efficiently, their users must trust them appropriately. The purpose of this work is to implement and assess an 'assurance' that an autonomous system can provide to the user to elicit appropriate trust. Specifically, the autonomous system's perception of its own capabilities is reported to the user as the self-confidence assurance. The self-confidence assurance should allow the user to more quickly and accurately assess the autonomous system's capabilities, generating appropriate trust in the autonomous system. First, this research defines self-confidence and discusses what the self-confidence assurance is attempting to communicate to the user. Then it provides a framework for computing the autonomous system's self-confidence as a function of self-confidence factors which correspond to individual elements in the autonomous system's process. In order to explore this idea, self-confidence is implemented on an autonomous system that uses a mixed observability Markov decision process model to solve a pursuit-evasion problem on a road network. Particular focus is given to the implementation of a factor assessing the goodness of the autonomy's expected performance. This work highlights some of the issues and considerations in the design of appropriate metrics for the self-confidence factors, and provides the basis for future research on computing self-confidence in autonomous systems.
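Overall self-confidence computed as a function of per-component factor scores might be sketched minimally as follows. The factor names and the conservative min() aggregation are assumptions for illustration, not the dissertation's actual formulation.

```python
def self_confidence(factors):
    """factors: mapping of factor name -> score in [0, 1], one per
    element of the autonomy's process. A chain is only as trustworthy
    as its weakest link, so report the minimum score as the overall
    self-confidence reported to the user."""
    if not factors:
        return 0.0
    return min(factors.values())

# Hypothetical factor scores for the pursuit-evasion autonomy.
report = self_confidence({
    "solver_quality": 0.9,        # how well the MOMDP solver converged
    "model_validity": 0.7,        # does the model match the environment
    "expected_performance": 0.6,  # goodness of the expected outcome
})
```

A weighted product or learned aggregation would be equally plausible; the point is only that the user sees one assurance value traceable back to individual factors.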

  15. Responsible and controlled use: Older cannabis users and harm reduction

    PubMed Central

    Lau, Nicholas; Sales, Paloma; Averill, Sheigla; Murphy, Fiona; Sato, Sye-Ok; Murphy, Sheigla

    2015-01-01

    Background Cannabis use is becoming more accepted in mainstream society. In this paper, we use Zinberg’s classic theoretical framework of drug, set, and setting to elucidate how older adult cannabis users managed health, social and legal risks in a context of normalized cannabis use. Methods We present selected findings from our qualitative study of Baby Boomer (born 1946–1964) cannabis users in the San Francisco Bay Area. Data collection consisted of a recorded, in-depth life history interview followed by a questionnaire and health survey. Qualitative interviews were analyzed to discover the factors of cannabis harm reduction from the users’ perspectives. Results Interviewees made harm reduction choices based on preferred cannabis derivatives and routes of administration, as well as why, when, where, and with whom to use. Most interviewees minimized cannabis-related harms so they could maintain social functioning in their everyday lives. Responsible and controlled use was described as moderation of quantity and frequency of cannabis used, using in appropriate settings, and respect for non-users. Users contributed to the normalization of cannabis use through normification. Conclusion Participants followed rituals or cultural practices, characterized by sanctions that helped define “normal” or “acceptable” cannabis use. Users contributed to cannabis normalization through their harm reduction methods. These cultural practices may prove to be more effective than formal legal prohibitions in reducing cannabis-related harms. Findings also suggest that users with access to a regulated market (medical cannabis dispensaries) were better equipped to practice harm reduction. More research is needed on both cannabis culture and alternative routes of administration as harm reduction methods. PMID:25911027

  16. A Q-GERT Model for Determining the Maintenance Crew Size for the SAC command Post Upgrade

    DTIC Science & Technology

    1983-12-01

    time that an equipment fails. DAY3 A real variable corresponding to the day that an LRU is removed from the equipment. DAY4 A real variable...variable corresponding to the time that an LRU is repaired. TIM5 A real variable corresponding to the time that an equipment returns to service. TNOW...The current time. UF(IFN) User function IFN. UN(I) A sample from the uniform distribution defined by parameter set I. YIlN1 A real variable

  17. Proceedings of the Brain Mapping Machine Design Workshop Held in College Station, TX on 10-16 August, 1985. Volume 3. Background Papers Submitted by Participants.

    DTIC Science & Technology

    1985-08-01

    interactively. First, with the "tissue highlight" function, the user must define the range of intensity values (in Hounsfield units) corresponding to the...Cosponsored by the United States Army Medical Research and Development Command, Scripps Clinic and Research Foundation, Texas A&M University, University of...

  18. Telescience testbedding for life science missions on the Space Station

    NASA Technical Reports Server (NTRS)

    Rasmussen, D.; Mian, A.; Bosley, J.

    1988-01-01

    'Telescience', defined as the ability of distributed system users to perform remote operations associated with NASA Space Station life science operations, has been explored by a developmental testbed project allowing rapid prototyping to evaluate the functional requirements of telescience implementation in three areas: (1) research planning and design, (2) remote operation of facilities, and (3) remote access to data bases for analysis. Attention is given to the role of expert systems in telescience, its use in realistic simulation of Space Shuttle payload remote monitoring, and remote interaction with life science data bases.

  19. DSpace and customized controlled vocabularies

    NASA Astrophysics Data System (ADS)

    Skourlas, C.; Tsolakidis, A.; Kakoulidis, P.; Giannakopoulos, G.

    2015-02-01

    The open-source DSpace platform is a repository application used to provide access to digital resources. DSpace is installed and used by more than 1000 organizations worldwide. A predefined taxonomy of keywords, called a Controlled Vocabulary, can be used for describing and accessing the information items stored in the repository. In this paper, we describe how users can create and customize their own vocabularies. Various heterogeneous items, such as research papers, videos, articles, and educational material in the repository, can be indexed in order to provide advanced search functionality using new controlled vocabularies.
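
DSpace controlled vocabularies are typically defined as hierarchical XML taxonomies; a minimal custom vocabulary might look like the following sketch (the node ids and labels are invented for illustration):

```xml
<node id="energy" label="Energy Research">
  <isComposedBy>
    <node id="energy-solar" label="Solar Energy"/>
    <node id="energy-wind" label="Wind Energy">
      <isComposedBy>
        <node id="energy-wind-offshore" label="Offshore Wind"/>
      </isComposedBy>
    </node>
  </isComposedBy>
</node>
```

Each `node` becomes a selectable keyword in the submission and search interfaces, with nesting expressing broader/narrower term relationships.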

  20. QUEST Hanford Site Computer Users - What do they do?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    WITHERSPOON, T.T.

    2000-03-02

    The Fluor Hanford Chief Information Office requested that a computer-user survey be conducted to determine users' dependence on the computer and its importance to their ability to accomplish their work. Daily use trends and future needs of Hanford Site personal computer (PC) users were also to be defined. A primary objective was to use the data to determine how budgets should be focused toward providing those services that are truly needed by the users.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Richen; Guo, Hanqi; Yuan, Xiaoru

    Most existing approaches to visualizing vector field ensembles reveal the uncertainty of individual variables, for example, statistics or variability. However, user-defined derived features such as vortices or air masses are also quite significant, since they make more sense to domain scientists. In this paper, we present a new framework to extract user-defined derived features from different simulation runs. Specifically, we use a detail-to-overview searching scheme to help extract vortices with a user-defined shape. We further compute geometric information, including the size and geo-spatial location of the extracted vortices. We also design linked views to compare them between different runs. Finally, temporal information such as the occurrence time of the feature is estimated and compared. Results show that our method is capable of extracting the features across different runs and comparing them spatially and temporally.

  2. Contextual Compression of Large-Scale Wind Turbine Array Simulations: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gruchalla, Kenny M; Brunhart-Lupo, Nicholas J; Potter, Kristin C

    Data sizes are becoming a critical issue, particularly for HPC applications. We have developed a user-driven lossy wavelet-based storage model to facilitate the analysis and visualization of large-scale wind turbine array simulations. The model stores data as heterogeneous blocks of wavelet coefficients, providing high-fidelity access to user-defined data regions believed to be the most salient, while providing lower-fidelity access to less salient regions on a block-by-block basis. In practice, by retaining the wavelet coefficients as a function of feature saliency, we have seen data reductions in excess of 94 percent, while retaining lossless information in the turbine-wake regions most critical to analysis and providing enough (low-fidelity) contextual information in the upper atmosphere to track incoming coherent turbulent structures. Our contextual wavelet compression approach has allowed us to deliver interactive visual analysis while giving the user control over where data loss, and thus reduction in accuracy, occurs in the analysis. We argue this reduced but contextualized representation is a valid approach and encourages contextual data management.
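
The block-by-block idea can be illustrated with a minimal 1-D Haar sketch (illustrative only; the actual model applies multi-dimensional wavelets to simulation blocks, and the truncation rule here is an assumption):

```python
import math

# Illustrative 1-D sketch; the real model uses multi-dimensional wavelets
# over simulation blocks, and this truncation rule is an assumption.

def haar(signal):
    """One level of the orthonormal Haar transform (even-length input)."""
    s = 1.0 / math.sqrt(2.0)
    approx = [(a + b) * s for a, b in zip(signal[0::2], signal[1::2])]
    detail = [(a - b) * s for a, b in zip(signal[0::2], signal[1::2])]
    return approx, detail

def compress_block(block, salient, keep_ratio=0.25):
    """Keep all detail coefficients in salient blocks, few elsewhere."""
    approx, detail = haar(block)
    if not salient:
        # zero all but the largest `keep_ratio` fraction of detail coefficients
        k = max(1, int(len(detail) * keep_ratio))
        thresh = sorted(map(abs, detail), reverse=True)[k - 1]
        detail = [d if abs(d) >= thresh else 0.0 for d in detail]
    return approx, detail
```

Salient blocks (e.g., turbine wakes) round-trip losslessly, while non-salient blocks keep only their coarse shape, which is the trade-off the abstract describes.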

  3. Contextual Compression of Large-Scale Wind Turbine Array Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gruchalla, Kenny M; Brunhart-Lupo, Nicholas J; Potter, Kristin C

    Data sizes are becoming a critical issue, particularly for HPC applications. We have developed a user-driven lossy wavelet-based storage model to facilitate the analysis and visualization of large-scale wind turbine array simulations. The model stores data as heterogeneous blocks of wavelet coefficients, providing high-fidelity access to user-defined data regions believed to be the most salient, while providing lower-fidelity access to less salient regions on a block-by-block basis. In practice, by retaining the wavelet coefficients as a function of feature saliency, we have seen data reductions in excess of 94 percent, while retaining lossless information in the turbine-wake regions most critical to analysis and providing enough (low-fidelity) contextual information in the upper atmosphere to track incoming coherent turbulent structures. Our contextual wavelet compression approach has allowed us to deliver interactive visual analysis while giving the user control over where data loss, and thus reduction in accuracy, occurs in the analysis. We argue this reduced but contextualized representation is a valid approach and encourages contextual data management.

  4. Metamesh, Version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Staley, Martin

    Metamesh is a general-purpose C++ library for creating "mesh" data structures from smaller parts. That is, rather than providing a traditional "mesh format," as many libraries do, or a GUI for building meshes, Metamesh provides tools by which the mesh structures themselves can be built. Consider that a mesh in up to three dimensions can contain nodes (0d entities), edges (1d), faces (2d), and cells (3d). Edges are typically defined from two nodes. Faces can be defined from nodes or edges; and cells from nodes, edges, or faces. Someone might also wish to allow for general faces or cells, or for only a specific variant - say, triangular faces and tetrahedral cells. Moreover, a mesh can have the same or a lesser dimension than that of its enclosing space. In 3d, say, one could have a full 3d mesh, a 2d "sheet" mesh without cells, a 1d "string" mesh with neither faces nor cells, or even a 1d "point cloud." And, aside from the mesh structure itself, additional data might be wanted: velocities at nodes, say, or fluxes across faces, or an average density in each cell. Metamesh supports all of this, through C++ generics and template metaprogramming techniques. Users fit Metamesh constructs together to define a mesh layout, and Metamesh then automatically provides the newly constructed mesh with functionality. Metamesh also provides facilities for spinning, extruding, visualizing, and performing I/O of whatever meshes a user builds.
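
As a rough illustration of the "mesh from smaller parts" layout (Metamesh itself uses C++ template metaprogramming; this Python mock only mirrors the compositional structure, with invented names):

```python
from dataclasses import dataclass, field

# Toy mock of the compositional idea: edges are built from node indices,
# faces from edge indices. Entity names here are invented for illustration.

@dataclass
class Node:
    x: float
    y: float
    z: float = 0.0

@dataclass
class Edge:
    nodes: tuple   # pair of indices into Mesh.nodes

@dataclass
class Face:
    edges: tuple   # indices into Mesh.edges

@dataclass
class Mesh:
    nodes: list = field(default_factory=list)
    edges: list = field(default_factory=list)
    faces: list = field(default_factory=list)

# A 2d "sheet" mesh embedded in 3d space: a single triangle, no cells
m = Mesh()
m.nodes = [Node(0.0, 0.0), Node(1.0, 0.0), Node(0.0, 1.0)]
m.edges = [Edge((0, 1)), Edge((1, 2)), Edge((2, 0))]
m.faces = [Face((0, 1, 2))]
```

Omitting the cell list yields the "sheet" variant the abstract mentions; a "string" mesh would additionally drop faces.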

  5. Mileage-based user fees : defining a path toward implementation phase 2, an assessment of technology issues : final report

    DOT National Transportation Integrated Search

    2009-10-01

    This report reviews technology options for a mileage-based user fee system in the state of Texas. The report was compiled based on input from a diverse range of sources, including a literature review of existing mileage-based user fee technical w...

  6. A medical digital library to support scenario and user-tailored information retrieval.

    PubMed

    Chu, W W; Johnson, D B; Kangarloo, H

    2000-06-01

    Current large-scale information sources are designed to support general queries and lack the ability to support scenario-specific information navigation, gathering, and presentation. As a result, users are often unable to obtain desired specific information within a well-defined subject area. Today's information systems do not provide efficient content navigation, incremental appropriate matching, or content correlation. We are developing the following innovative technologies to remedy these problems: 1) scenario-based proxies, enabling the gathering and filtering of information customized for users within a pre-defined domain; 2) context-sensitive navigation and matching, providing approximate matching and similarity links when an exact match to a user's request is unavailable; 3) content correlation of documents, creating semantic links between documents and information sources; and 4) user models for customizing retrieved information and result presentation. A digital medical library is currently being constructed using these technologies to provide customized information for the user. The technologies are general in nature and can provide custom and scenario-specific information in many other domains (e.g., crisis management).

  7. UAH/NASA Workshop on Space Science Platform

    NASA Technical Reports Server (NTRS)

    Wu, S. T. (Editor); Morgan, S. (Editor)

    1978-01-01

    The scientific user requirements for a space science platform were defined. The potential user benefits, technological implications and cost of space platforms were examined. Cost effectiveness of the platforms' capabilities were also examined.

  8. PCOGR: phylogenetic COG ranking as an online tool to judge the specificity of COGs with respect to freely definable groups of organisms.

    PubMed

    Meereis, Florian; Kaufmann, Michael

    2004-10-15

    The rapidly increasing number of completely sequenced genomes led to the establishment of the COG database which, based on sequence homologies, assigns similar proteins from different organisms to clusters of orthologous groups (COGs). Several bioinformatic studies have made use of this database to determine (hyper)thermophile-specific proteins by searching for COGs containing (almost) exclusively proteins from (hyper)thermophilic genomes. However, public software to perform individually definable group-specific searches is not available. The tool described here exactly fills this gap. The software is accessible at http://www.uni-wh.de/pcogr and is linked to the COG database. The user can freely define two groups of organisms by selecting, for each of the (current) 66 organisms, whether it belongs to groupA, to the reference groupB, or is to be ignored by the algorithm. Then, for all COGs, a specificity index is calculated with respect to groupA, i.e., high-scoring COGs contain proteins from most of the groupA organisms, while proteins from most organisms assigned to groupB are absent. In addition to ranking all COGs according to the user-defined specificity criteria, a graphical visualization shows the distribution of all COGs by displaying their abundance as a function of their specificity indexes. This software allows detecting COGs specific to a predefined group of organisms. All COGs are ranked in the order of their specificity, and a graphical visualization allows recognizing (i) the presence and abundance of such COGs and (ii) the phylogenetic relationship between groupA and groupB organisms. The software also allows detecting putative protein-protein interactions, novel enzymes involved in only partially known biochemical pathways, and alternate enzymes originated by convergent evolution.
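
A specificity index of this kind might be sketched as below; the exact published formula may differ, so treat this presence-fraction difference as an assumption:

```python
# Hedged sketch: the published specificity index may differ. Here it is the
# fraction of groupA organisms containing the COG minus the fraction of
# groupB organisms containing it (+1 means "in all of A, none of B").

def specificity_index(cog_organisms, group_a, group_b):
    present = set(cog_organisms)
    frac_a = len(present & set(group_a)) / len(group_a)
    frac_b = len(present & set(group_b)) / len(group_b)
    return frac_a - frac_b

# Hypothetical organisms: a COG found in all of groupA and one groupB organism
group_a = ["Pyrococcus", "Thermococcus", "Aquifex"]
group_b = ["Ecoli", "Bsubtilis", "Scerevisiae", "Hsapiens"]
cog = ["Pyrococcus", "Thermococcus", "Aquifex", "Ecoli"]
print(specificity_index(cog, group_a, group_b))  # -> 0.75
```

Ranking all COGs by this index and plotting the count per index value reproduces the abundance-vs-specificity view the abstract describes.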

  9. Neuroprosthetic technology for individuals with spinal cord injury

    PubMed Central

    Collinger, Jennifer L.; Foldes, Stephen; Bruns, Tim M.; Wodlinger, Brian; Gaunt, Robert; Weber, Douglas J.

    2013-01-01

    Context Spinal cord injury (SCI) results in a loss of function and sensation below the level of the lesion. Neuroprosthetic technology has been developed to help restore motor and autonomic functions as well as to provide sensory feedback. Findings This paper provides an overview of neuroprosthetic technology that aims to address the priorities for functional restoration as defined by individuals with SCI. We describe neuroprostheses that are in various stages of preclinical development, clinical testing, and commercialization including functional electrical stimulators, epidural and intraspinal microstimulation, bladder neuroprosthesis, and cortical stimulation for restoring sensation. We also discuss neural recording technologies that may provide command or feedback signals for neuroprosthetic devices. Conclusion/clinical relevance Neuroprostheses have begun to address the priorities of individuals with SCI, although there remains room for improvement. In addition to continued technological improvements, closing the loop between the technology and the user may help provide intuitive device control with high levels of performance. PMID:23820142

  10. IoT-Based User-Driven Service Modeling Environment for a Smart Space Management System

    PubMed Central

    Choi, Hoan-Suk; Rhee, Woo-Seop

    2014-01-01

    The existing Internet environment has been extended to the Internet of Things (IoT) as an emerging new paradigm. The IoT connects various physical entities. These entities have communication capability and deploy the observed information to various service areas such as building management, energy-saving systems, surveillance services, and smart homes. These services are designed and developed by professional service providers. Moreover, users' needs have become more complicated and personalized with the spread of user-participation services such as social media and blogging. Therefore, some active users want to create their own services to satisfy their needs, but the existing IoT service-creation environment is difficult for the non-technical user because it requires a programming capability to create a service. To solve this problem, we propose the IoT-based user-driven service modeling environment to provide an easy way to create IoT services. Also, the proposed environment deploys the defined service to another user. Through the personalization and customization of the defined service, the value and dissemination of the service is increased. This environment also provides the ontology-based context-information processing that produces and describes the context information for the IoT-based user-driven service. PMID:25420153

  11. IoT-based user-driven service modeling environment for a smart space management system.

    PubMed

    Choi, Hoan-Suk; Rhee, Woo-Seop

    2014-11-20

    The existing Internet environment has been extended to the Internet of Things (IoT) as an emerging new paradigm. The IoT connects various physical entities. These entities have communication capability and deploy the observed information to various service areas such as building management, energy-saving systems, surveillance services, and smart homes. These services are designed and developed by professional service providers. Moreover, users' needs have become more complicated and personalized with the spread of user-participation services such as social media and blogging. Therefore, some active users want to create their own services to satisfy their needs, but the existing IoT service-creation environment is difficult for the non-technical user because it requires a programming capability to create a service. To solve this problem, we propose the IoT-based user-driven service modeling environment to provide an easy way to create IoT services. Also, the proposed environment deploys the defined service to another user. Through the personalization and customization of the defined service, the value and dissemination of the service is increased. This environment also provides the ontology-based context-information processing that produces and describes the context information for the IoT-based user-driven service.

  12. Marijuana Use Associations with Pulmonary Symptoms and Function in Tobacco Smokers Enrolled in the Subpopulations and Intermediate Outcome Measures in COPD Study (SPIROMICS).

    PubMed

    Morris, Madeline A; Jacobson, Sean R; Kinney, Gregory L; Tashkin, Donald P; Woodruff, Prescott G; Hoffman, Eric A; Kanner, Richard E; Cooper, Christopher B; Drummond, M Brad; Barr, R Graham; Oelsner, Elizabeth C; Make, Barry J; Han, MeiLan K; Hansel, Nadia N; O'Neal, Wanda K; Bowler, Russell P

    2018-01-24

    Background: Marijuana is often smoked via a filterless cigarette and has a chemical makeup similar to that of smoked tobacco. There are few publications describing usage patterns and respiratory risks in older adults or in those with chronic obstructive pulmonary disease (COPD). Methods: A cross-sectional analysis of current and former tobacco smokers from the Subpopulations and Intermediate Outcome Measures in COPD Study (SPIROMICS) assessed associations between marijuana use and pulmonary outcomes. Marijuana use was defined as never, former (use over 30 days ago), or current (use within 30 days). Respiratory health was assessed using quantitative high-resolution computed tomography (HRCT) scans, pulmonary function tests, and questionnaire responses about respiratory symptoms. Results: Of the 2304 total participants, 1130 (49%) never, 982 (43%) former, and 192 (8%) current marijuana users were included. Neither current nor former marijuana use was associated with increased odds of wheeze (odds ratio [OR] 0.87; OR 0.97), cough (OR 1.22; OR 0.93), or chronic bronchitis (OR 0.87; OR 1.00) when compared to never users. Current and former marijuana users had lower quantitative emphysema (P = 0.004, P = 0.03), higher percent predicted forced expiratory volume in 1 second (FEV1%) (P < 0.001, P < 0.001), and higher percent predicted forced vital capacity (FVC%) (P < 0.001, P < 0.001). Current marijuana users exhibited higher total tissue volume (P = 0.003), while former users had higher air trapping (P < 0.001) when compared to never marijuana users. Conclusions: Marijuana use was found to have little to no association with poor pulmonary health in older current and former tobacco smokers after adjusting for covariates. Higher FEV1 and FVC were observed among current marijuana users. However, more joint-years of use were associated with more chronic bronchitis symptoms (e.g., wheeze), and this study cannot determine whether long-term heavy marijuana smoking in the absence of tobacco smoking is associated with lung symptoms, airflow obstruction, or emphysema, particularly in those who have never smoked tobacco cigarettes.

  13. How do drug users define their progress in harm reduction programs? Qualitative research to develop user-generated outcomes

    PubMed Central

    Ruefli, Terry; Rogers, Susan J

    2004-01-01

    Background Harm reduction is a relatively new and controversial model for treating drug users, with little formal research on its operation and effectiveness. In order to advance the study of harm reduction programs and our understanding of how drug users define their progress, qualitative research was conducted to develop outcomes of harm reduction programming that are culturally relevant, incremental (i.e., capable of measuring change), and hierarchical (i.e., capable of showing how clients improve over time). Methods The study used nominal group technique (NGT) to develop the outcomes (phase 1) and focus group interviews to help validate the findings (phase 2). Study participants were recruited from a large harm-reduction program in New York City and involved approximately 120 clients in 10 groups in phase 1 and 120 clients in 10 focus groups in phase 2. Results Outcomes for 10 life areas important to drug users were developed, each including 10 to 15 incremental measures. The outcomes included ways of 1) making money; 2) getting something good to eat; 3) being housed/homeless; 4) relating to families; 5) getting needed programs/benefits/services; 6) handling health problems; 7) handling negative emotions; 8) handling legal problems; 9) improving oneself; and 10) handling drug-use problems. Findings also provided insights into drug users' lives and values, as well as a window into understanding how this population envisions a better quality of life. Results challenged traditional ways of measuring drug users based solely on quantity used and frequency of use. They suggest that more appropriate measures are based on the extent to which drug users organize their lives around drug use and how much drug use is integrated into their lives and negatively impacts other aspects of their lives.
Conclusions Harm reduction and other programs serving active drug users and other marginalized people should not rely on institutionalized, provider-defined solutions to problems in living faced by their clients. PMID:15333130

  14. [Automation in surgery: a systematical approach].

    PubMed

    Strauss, G; Meixensberger, J; Dietz, A; Manzey, D

    2007-04-01

    Surgical assistance systems permit a shift from purely manual activity to assisted activity by the surgeon (automation). Automation denotes a system that partly or fully takes over functions previously carried out, in whole or in part, by the user. Organizing surgical assistance systems by application (planning, simulation, intraoperative navigation and visualization) or by technical configuration (manipulator, robot) is not suitable for describing the interaction between the user (surgeon) and the system. The present work aims to provide a classification of the degree of automation of surgical interventions and to describe it with examples. The proposed classification builds on prior work from the human-factors sciences. A surgical intervention counts as automated when the system takes over a task that was previously assigned solely to the surgeon. For both reference objects (human and machine), the role may be passive or active. Interventions can further be classified according to which functions are taken over by the human and/or the surgical assistance system under a given division of functions. Three functional areas were differentiated: "information acquisition and analysis", "decision making and action planning", and "execution of the surgical action". This yields a classification of pre- and intraoperative surgical assistance systems into six categories, representing different degrees of automation. The classification scheme is described and illustrated with surgical examples.

  15. Modeling and Simulation With Operational Databases to Enable Dynamic Situation Assessment & Prediction

    DTIC Science & Technology

    2010-11-01

    subsections discuss the design of the simulations. 3.12.1 Lanchester5D Simulation A Lanchester simulation was developed to conduct performance...benchmarks using the WarpIV Kernel and HyperWarpSpeed. The Lanchester simulation contains a user-definable number of grid cells in which blue and red...forces engage in battle using Lanchester equations. Having a user-definable number of grid cells enables the simulation to be stressed with high entity

  16. Single crystal to polycrystal neutron transmission simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dessieux, Luc Lucius; Stoica, Alexandru Dan; Bingham, Philip R.

    A collection of routines for calculating the total cross section that determines the attenuation of neutrons by crystalline solids is presented. The total cross section is calculated semi-empirically as a function of crystal structure, neutron energy, temperature, and crystal orientation. The semi-empirical formula includes the contribution of parasitic Bragg scattering to the total cross section, using both the crystal's mosaic spread value and its orientation with respect to the neutron beam direction as parameters. These routines allow users to enter a distribution of crystal orientations for calculation of total cross sections of user-defined powder or pseudo-powder distributions, which enables simulation of non-uniformities such as texture and strain. Finally, spectra from neutron transmission simulations in the thermal energy range (2 meV-100 meV) are presented for single-crystal and polycrystal samples and compared to measurements.

  17. Squish: Near-Optimal Compression for Archival of Relational Datasets

    PubMed Central

    Gao, Yihan; Parameswaran, Aditya

    2017-01-01

    Relational datasets are being generated at an alarmingly rapid rate across organizations and industries. Compressing these datasets could significantly reduce storage and archival costs. Traditional compression algorithms, e.g., gzip, are suboptimal for compressing relational datasets since they ignore the table structure and relationships between attributes. We study compression algorithms that leverage the relational structure to compress datasets to a much greater extent. We develop Squish, a system that uses a combination of Bayesian Networks and Arithmetic Coding to capture multiple kinds of dependencies among attributes and achieve near-entropy compression rate. Squish also supports user-defined attributes: users can instantiate new data types by simply implementing five functions for a new class interface. We prove the asymptotic optimality of our compression algorithm and conduct experiments to show the effectiveness of our system: Squish achieves a reduction of over 50% in storage size relative to systems developed in prior work on a variety of real datasets. PMID:28180028
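
The five-function class interface is not spelled out in the abstract, so the method names below are hypothetical; the sketch only illustrates how a user-defined data type could plug its values into an arithmetic-coding pipeline:

```python
from abc import ABC, abstractmethod

# Hypothetical interface: the abstract does not name Squish's five functions,
# so these method names are illustrative assumptions.

class AttributeType(ABC):
    @abstractmethod
    def parse(self, text): ...        # raw cell text -> typed value
    @abstractmethod
    def serialize(self, value): ...   # typed value -> raw cell text
    @abstractmethod
    def n_outcomes(self): ...         # alphabet size for the coder
    @abstractmethod
    def to_symbol(self, value): ...   # typed value -> integer symbol
    @abstractmethod
    def from_symbol(self, symbol): ...  # integer symbol -> typed value

class BooleanType(AttributeType):
    def parse(self, text): return text.strip().lower() == "true"
    def serialize(self, value): return "true" if value else "false"
    def n_outcomes(self): return 2
    def to_symbol(self, value): return int(value)
    def from_symbol(self, symbol): return bool(symbol)

bt = BooleanType()
print(bt.serialize(bt.from_symbol(bt.to_symbol(bt.parse("True")))))  # -> true
```

Mapping each value to a small integer alphabet is what lets a Bayesian-network model assign per-symbol probabilities for near-entropy arithmetic coding.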

  18. The NASTRAN User's Manual Level 16.0 and Supplement

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The user's manual is restricted to those items related to the use of NASTRAN that are independent of the computing system being used. The features of NASTRAN described include: (1) procedures for defining and loading a structural model and a functional reference for every card that is used for structural modeling; (2) the NASTRAN data deck, including the details for each of the data cards; (3) the NASTRAN control cards that are associated with the use of the program; (4) rigid format procedures, along with specific instructions for the use of each rigid format; (5) procedures for using the NASTRAN plotting capability; (6) procedures governing the creation of DMAP programs; and (7) the NASTRAN diagnostic messages. The NASTRAN dictionary of mnemonics, acronyms, phrases, and other commonly used NASTRAN terms is included, along with a limited number of sample problems.

  19. Equilibrium expert: an add-in to Microsoft Excel for multiple binding equilibrium simulations and parameter estimations.

    PubMed

    Raguin, Olivier; Gruaz-Guyon, Anne; Barbet, Jacques

    2002-11-01

    An add-in to Microsoft Excel was developed to simulate multiple binding equilibria. A partition function, readily written even when the equilibrium is complex, describes the experimental system. It involves the concentrations of the different free molecular species and of the different complexes present in the experiment. As a result, the software is not restricted to a series of predefined experimental setups but can handle a large variety of problems involving up to nine independent molecular species. Binding parameters are estimated by nonlinear least-squares fitting of experimental measurements supplied by the user. The fitting process allows user-defined weighting of the experimental data. The flexibility of the software and the way it may be used to describe common experimental situations and to deal with usual problems such as tracer reactivity or nonspecific binding is demonstrated by a few examples. The software is available free of charge upon request.
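
The weighted-fitting idea can be sketched in a stdlib-only form (the add-in fits general multi-equilibrium partition functions in Excel; this single-site grid search is an illustrative simplification):

```python
# Illustrative simplification: weighted least-squares estimate of a single
# dissociation constant Kd from saturation-binding data, via grid search.
# The real add-in handles general multi-equilibrium partition functions.

def bound_fraction(ligand, kd):
    """Single-site binding isotherm: fraction of receptor occupied."""
    return ligand / (ligand + kd)

def fit_kd(ligand_concs, observed, weights, kd_grid):
    """Return the grid Kd minimizing the weighted sum of squared residuals."""
    def wssr(kd):
        return sum(w * (obs - bound_fraction(L, kd)) ** 2
                   for L, obs, w in zip(ligand_concs, observed, weights))
    return min(kd_grid, key=wssr)

L_vals = [0.5, 1.0, 2.0, 4.0, 8.0]
synthetic = [bound_fraction(L, 2.0) for L in L_vals]  # noise-free data, Kd = 2
grid = [k / 10 for k in range(1, 101)]                # candidate Kd: 0.1 .. 10.0
print(fit_kd(L_vals, synthetic, [1.0] * 5, grid))     # -> 2.0
```

The `weights` argument mirrors the add-in's user-defined weighting: down-weighting noisy points (e.g., low-signal measurements) changes which Kd minimizes the residual sum.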

  20. Differentially regulated gene expression associated with hepatitis C virus clearance.

    PubMed

    Grimes, Carolyn Z; Hwang, Lu-Yu; Wei, Peng; Shah, Dimpy P; Volcik, Kelly A; Brown, Eric L

    2013-03-01

    Human chronic hepatitis C virus (HCV) infections pose a significant public health threat, necessitating the development of novel treatments and vaccines. HCV infections range from spontaneous resolution to end-stage liver disease. Approximately 10-30% of HCV infections undergo spontaneous resolution independent of treatment by yet-to-be-defined mechanisms. These individuals test positive for anti-HCV antibodies in the absence of detectable viral serum RNA. To identify genes associated with HCV clearance, this study compared gene expression profiles between current drug users chronically infected with HCV and drug users who cleared their HCV infection. This analysis identified 91 differentially regulated (up- or downregulated by twofold or more) genes potentially associated with HCV clearance. The majority of genes identified were associated with immune function, with the remaining genes categorized either as cancer related or 'other'. Identification of factors and pathways that may influence virus clearance will be essential to the development of novel treatment strategies.

  1. Release of the gPhoton Database of GALEX Photon Events

    NASA Astrophysics Data System (ADS)

    Fleming, Scott W.; Million, Chase; Shiao, Bernie; Tucker, Michael; Loyd, R. O. Parke

    2016-01-01

    The GALEX spacecraft surveyed much of the sky in two ultraviolet bands between 2003 and 2013 with non-integrating microchannel plate detectors. The Mikulski Archive for Space Telescopes (MAST) has made more than one trillion photon events observed by the spacecraft available, stored as a 130 TB database, along with an open-source, Python-based software package to query this database and create calibrated lightcurves or images from these data at user-defined spatial and temporal scales. In particular, MAST users can now conduct photometry at the intra-visit level (timescales of seconds and minutes). The software, along with the fully populated database, was officially released in Aug. 2015, and improvements to both software functionality and data calibration are ongoing. We summarize the current calibration status of the gPhoton software, along with examples of early science enabled by gPhoton that include stellar flares, AGN, white dwarfs, exoplanet hosts, novae, and nearby galaxies.
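    The core operation, turning time-tagged photon events into a lightcurve at a user-defined time step, can be sketched as follows (an illustration of the idea only, not the gPhoton API; all names are invented):

```python
# Illustrative sketch: binning time-tagged photon events into a lightcurve
# at a user-defined time step, the basic operation gPhoton performs on the
# photon-event database.
from collections import Counter

def make_lightcurve(times, step):
    """Return (bin_start, count_rate) pairs for photon arrival `times`."""
    if not times:
        return []
    t0 = min(times)
    counts = Counter(int((t - t0) // step) for t in times)
    nbins = max(counts) + 1
    return [(t0 + i * step, counts.get(i, 0) / step) for i in range(nbins)]

# Ten fake photon timestamps (seconds); 5-second bins.
events = [0.2, 1.1, 2.7, 3.9, 4.4, 6.0, 7.5, 9.9, 11.2, 14.8]
lc = make_lightcurve(events, step=5.0)
print(lc)  # three bins of 5, 3 and 2 photons -> rates 1.0, 0.6, 0.4 counts/s
```

    Shrinking `step` to seconds is what enables the intra-visit photometry mentioned above.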

  2. METAGUI 3: A graphical user interface for choosing the collective variables in molecular dynamics simulations

    NASA Astrophysics Data System (ADS)

    Giorgino, Toni; Laio, Alessandro; Rodriguez, Alex

    2017-08-01

    Molecular dynamics (MD) simulations allow the exploration of the phase space of biopolymers through the integration of equations of motion of their constituent atoms. The analysis of MD trajectories often relies on the choice of collective variables (CVs) along which the dynamics of the system is projected. We developed a graphical user interface (GUI) for facilitating the interactive choice of the appropriate CVs. The GUI allows: interactively defining new CVs; partitioning the configurations into microstates characterized by similar values of the CVs; calculating the free energies of the microstates for both unbiased and biased (metadynamics) simulations; clustering the microstates in kinetic basins; visualizing the free energy landscape as a function of a subset of the CVs used for the analysis. A simple mouse click allows one to quickly inspect structures corresponding to specific points in the landscape.
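    The free-energy step for unbiased simulations can be sketched in a few lines (assumed notation and invented names, not METAGUI code): microstates are defined by binning a CV, and the free energy is estimated from the occupation probability as F(s) = -kT ln p(s).

```python
# Minimal sketch: free-energy profile along one collective variable from an
# unbiased trajectory, F(s) = -kT * ln p(s), with microstates defined by
# binning the CV samples.
import math
from collections import Counter

KT = 2.494  # kT in kJ/mol at 300 K

def free_energy_profile(cv_values, bin_width):
    """Bin CV samples; return {bin_center: free energy}, minimum shifted to 0."""
    counts = Counter(int(v // bin_width) for v in cv_values)
    total = sum(counts.values())
    f = {(i + 0.5) * bin_width: -KT * math.log(n / total)
         for i, n in sorted(counts.items())}
    fmin = min(f.values())
    return {s: round(e - fmin, 3) for s, e in f.items()}

# Fake trajectory: CV sampled mostly near 1.0 (a basin), occasionally near 2.0.
samples = [1.0] * 90 + [2.0] * 10
profile = free_energy_profile(samples, bin_width=0.5)
print(profile)
```

    Biased (metadynamics) runs require reweighting before the logarithm is taken, which this sketch omits.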

  3. NASA/IPAC Infrared Archive's General Image Cutouts Service

    NASA Astrophysics Data System (ADS)

    Alexov, A.; Good, J. C.

    2006-07-01

    The NASA/IPAC Infrared Archive (IRSA) "Cutouts" Service (http://irsa.ipac.caltech.edu/applications/Cutouts) is a general tool for creating small "cutout" FITS images and JPEGs from collections of data archived at IRSA. This service is a companion to IRSA's Atlas tool (http://irsa.ipac.caltech.edu/applications/Atlas/), which currently serves over 25 different data collections of various sizes and complexity and returns entire images for a user-defined region of the sky. The Cutouts Service sits on top of Atlas and extends the Atlas functionality by generating subimages at locations and sizes requested by the user from images already identified by Atlas. These results can be downloaded individually, in batch mode (using the program wget), or as a tar file. Cutouts re-uses IRSA's software architecture along with the publicly available Montage mosaicking tools. The advantages and disadvantages of this approach to generic cutout serving will be discussed.

  4. ChromA: signal-based retention time alignment for chromatography-mass spectrometry data.

    PubMed

    Hoffmann, Nils; Stoye, Jens

    2009-08-15

    We describe ChromA, a web-based alignment tool for chromatography-mass spectrometry data from the metabolomics and proteomics domains. Users can supply their data in open and standardized file formats for retention time alignment using dynamic time warping with different configurable local distance and similarity functions. Additionally, user-defined anchors can be used to constrain and speed up the alignment. A neighborhood around each anchor can be added to increase the flexibility of the constrained alignment. ChromA offers different visualizations of the alignment for easier qualitative interpretation and comparison of the data. For the multiple alignment of more than two data files, the center-star approximation is applied to select a reference among input files to align to. ChromA is available at http://bibiserv.techfak.uni-bielefeld.de/chroma. Executables and source code under the L-GPL v3 license are provided for download at the same location.
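    The alignment core, dynamic time warping with a configurable local distance function, can be sketched as follows (a self-contained illustration of the algorithm, not ChromA's actual code):

```python
# Dynamic time warping between two signals with a configurable local
# distance function, as used for retention-time alignment.

def dtw(a, b, dist=lambda x, y: abs(x - y)):
    """Return the DTW distance between sequences `a` and `b`."""
    INF = float("inf")
    n, m = len(a), len(b)
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = dist(a[i - 1], b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # step in a only
                                 cost[i][j - 1],      # step in b only
                                 cost[i - 1][j - 1])  # step in both
    return cost[n][m]

# The same peak pattern shifted in retention time aligns perfectly...
s1 = [0, 0, 1, 5, 1, 0, 0]
s2 = [0, 1, 5, 1, 0, 0, 0]
print(dtw(s1, s2))  # 0.0: the warping absorbs the shift
# ...while a genuinely different signal does not.
print(dtw(s1, [0, 0, 0, 0, 0, 0, 0]))  # 7.0 (total peak intensity)
```

    Anchors of the kind ChromA supports act as constraints that force the warping path through chosen (i, j) cells, shrinking the search space.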

  5. Single crystal to polycrystal neutron transmission simulation

    DOE PAGES

    Dessieux, Luc Lucius; Stoica, Alexandru Dan; Bingham, Philip R.

    2018-02-02

    A collection of routines for calculation of the total cross section that determines the attenuation of neutrons by crystalline solids is presented. The total cross section is calculated semi-empirically as a function of crystal structure, neutron energy, temperature, and crystal orientation. The semi-empirical formula includes the contribution of parasitic Bragg scattering to the total cross section using both the crystal’s mosaic spread value and its orientation with respect to the neutron beam direction as parameters. These routines allow users to enter a distribution of crystal orientations for calculation of total cross sections of user defined powder or pseudo powder distributions, which enables simulation of non-uniformities such as texture and strain. In conclusion, the spectra for neutron transmission simulations in the neutron thermal energy range (2 meV–100 meV) are presented for single crystal and polycrystal samples and compared to measurements.

  6. Substance Users’ Perspectives on Helpful and Unhelpful Confrontation: Implications for Recovery

    PubMed Central

    Polcin, Douglas; Mulia, Nina; Jones, Laura

    2011-01-01

    Substance users commonly face confrontations about their use from family, friends, peers, and professionals. Yet confrontation is controversial and not well understood. To better understand the effects of confrontation we conducted qualitative interviews with 38 substance users (82% male and 79% white) about their experiences of being confronted. Confrontation was defined as warnings about potential harm related to substance use. Results from coded transcripts indicated that helpful confrontations were those that were perceived as legitimate, offered hope and practical support, and were delivered by persons who were trusted and respected. Unhelpful confrontations were those that were perceived as hypocritical, overtly hostile, or occurring within embattled relationships. Experiences of directive, persistent confrontation varied. Limitations of the study include a small and relatively high functioning sample. We conclude that contextual factors are important in determining how confrontation is experienced. Larger studies with more diverse samples are warranted. PMID:22880542

  7. Cognitive function and mood in MDMA/THC users, THC users and non-drug using controls.

    PubMed

    Lamers, C T J; Bechara, A; Rizzo, M; Ramaekers, J G

    2006-03-01

    Repeated ecstasy (MDMA) use is reported to impair cognition and cause increased feelings of depression and anxiety. Yet, many relevant studies have failed to control for use of drugs other than MDMA, especially marijuana (THC). To address these confounding effects we compared behavioural performance of 11 MDMA/THC users, 15 THC users and 15 non-drug users matched for age and intellect. We tested the hypothesis that reported feelings of depression and anxiety and cognitive impairment (memory, executive function and decision making) are more severe in MDMA/THC users than in THC users. MDMA/THC users reported more intense feelings of depression and anxiety than THC users and non-drug users. Memory function was impaired in both groups of drug users. MDMA/THC users showed slower psychomotor speed and less mental flexibility than non-drug users. THC users exhibited less mental flexibility and performed worse on the decision making task compared to non-drug users but these functions were similar to those in MDMA/THC users. It was concluded that MDMA use is associated with increased feelings of depression and anxiety compared to THC users and non-drug users. THC users were impaired in some cognitive abilities to the same degree as MDMA/THC users, suggesting that some cognitive impairment attributed to MDMA is more likely due to concurrent THC use.

  8. Decision support system development at the Upper Midwest Environmental Sciences Center

    USGS Publications Warehouse

    Fox, Timothy J.; Nelson, J. C.; Rohweder, Jason J.

    2014-01-01

    A Decision Support System (DSS) can be defined in many ways. The working definition used by the U.S. Geological Survey Upper Midwest Environmental Sciences Center (UMESC) is, “A spatially based computer application or data that assists a researcher or manager in making decisions.” This is quite a broad definition—and it needs to be, because the possibilities for types of DSSs are limited only by the user group and the developer’s imagination. There is no one DSS; the types of DSSs are as diverse as the problems they help solve. This diversity requires that DSSs be built in a variety of ways, using the most appropriate methods and tools for the individual application. The skills of potential DSS users vary widely as well, further necessitating multiple approaches to DSS development. Some small, highly trained user groups may want a powerful modeling tool with extensive functionality at the expense of ease of use. Other user groups less familiar with geographic information system (GIS) and spatial data may want an easy-to-use application for a nontechnical audience. UMESC has been developing DSSs for almost 20 years. Our DSS developers offer our partners a wide variety of technical skills and development options, ranging from the most simple Web page or small application to complex modeling application development.

  9. PeakWorks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-11-30

    The PeakWorks software is designed to assist in the quantitative analysis of atom probe tomography (APT) generated mass spectra. Specifically, through an interactive user interface, mass peaks can be identified automatically (defined by a threshold) and/or identified manually. The software then provides a means to assign specific elemental isotopes (including more than one) to each peak. The software also provides a means for the user to choose background subtraction of each peak based on background fitting functions, the choice of which is left to the user's discretion. Peak ranging (the mass range over which peaks are integrated) is also automated, allowing the user to choose a quantitative range (e.g., full-width half-maximum). The software then integrates all identified peaks, providing a background-subtracted composition, which also includes the deconvolution of peaks (i.e. those peaks that happen to have overlapping isotopic masses). The software is also able to output a 'range file' that can be used in other software packages, such as within IVAS. A range file lists the peak identities, the mass range of each identified peak, and a color code for the peak. The software is also able to generate 'dummy' peak ranges within an outputted range file that can be used within IVAS to provide a means for background subtracted proximity histogram analysis.
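    The background-subtracted integration step can be illustrated with a toy spectrum (hypothetical numbers and function names, not PeakWorks code; a simple linear background through the range shoulders stands in for the user-selected fitting function):

```python
# Hypothetical illustration of the quantification step: integrate a mass
# peak over a user-chosen range after subtracting a linear background
# estimated from the channels at either edge of the range.

def integrate_peak(masses, counts, lo, hi):
    """Background-subtracted counts in the range [lo, hi] of the spectrum."""
    inside = [i for i, m in enumerate(masses) if lo <= m <= hi]
    # Linear background through the first and last channel of the range.
    i0, i1 = inside[0], inside[-1]
    def background(m):
        t = (m - masses[i0]) / (masses[i1] - masses[i0])
        return counts[i0] + t * (counts[i1] - counts[i0])
    return sum(counts[i] - background(masses[i]) for i in inside)

# Toy spectrum: flat background of 10 counts with a peak of 100 extra counts.
masses = [26.8, 26.9, 27.0, 27.1, 27.2]
counts = [10, 35, 60, 35, 10]
print(integrate_peak(masses, counts, 26.8, 27.2))  # 100 extra counts
```

    Deconvolution of overlapping isotopic masses, which PeakWorks also performs, would partition this integrated total among the assigned isotopes.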

  10. Visual Sample Plan Version 7.0 User's Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matzke, Brett D.; Newburn, Lisa LN; Hathaway, John E.

    2014-03-01

    User's guide for VSP 7.0 This user's guide describes Visual Sample Plan (VSP) Version 7.0 and provides instructions for using the software. VSP selects the appropriate number and location of environmental samples to ensure that the results of statistical tests performed to provide input to risk decisions have the required confidence and performance. VSP Version 7.0 provides sample-size equations or algorithms needed by specific statistical tests appropriate for specific environmental sampling objectives. It also provides data quality assessment and statistical analysis functions to support evaluation of the data and determine whether the data support decisions regarding sites suspected of contamination. The easy-to-use program is highly visual and graphic. VSP runs on personal computers with Microsoft Windows operating systems (XP, Vista, Windows 7, and Windows 8). Designed primarily for project managers and users without expertise in statistics, VSP is applicable to two- and three-dimensional populations to be sampled (e.g., rooms and buildings, surface soil, a defined layer of subsurface soil, water bodies, and other similar applications) for studies of environmental quality. VSP is also applicable for designing sampling plans for assessing chem/rad/bio threat and hazard identification within rooms and buildings, and for designing geophysical surveys for unexploded ordnance (UXO) identification.
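    As one concrete example of the kind of sample-size equation such a tool implements (a sketch using a standard one-sided, one-sample formula; VSP's own equations differ by test and objective, and the names here are invented):

```python
# A standard one-sided sample-size formula of the kind VSP implements:
# the number of samples needed so a one-sided test of the mean has
# false-positive rate alpha and false-negative rate beta for an effect
# of size delta, given standard deviation sigma.
from math import ceil
from statistics import NormalDist

def sample_size(alpha, beta, sigma, delta):
    z = NormalDist().inv_cdf  # standard normal quantile function
    n = ((z(1 - alpha) + z(1 - beta)) * sigma / delta) ** 2
    return ceil(n)

# 5% false positives, 10% false negatives, sd 3 units, detect a 1.5-unit shift.
print(sample_size(alpha=0.05, beta=0.10, sigma=3.0, delta=1.5))
```

    Tightening any of the error rates, or halving the detectable shift, drives the required number of samples up sharply, which is exactly the trade-off VSP visualizes for the user.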

  11. Defining Requirements and Related Methods for Designing Sensorized Garments

    PubMed Central

    Andreoni, Giuseppe; Standoli, Carlo Emilio; Perego, Paolo

    2016-01-01

    Designing smart garments has strong interdisciplinary implications, specifically related to user and technical requirements, but also because of the very different applications they have: medicine, sport and fitness, lifestyle monitoring, workplace and job conditions analysis, etc. This paper aims to discuss some user, textile, and technical issues to be faced in sensorized clothes development. In relation to the user, the main requirements are anthropometric, gender-related, and aesthetical. In terms of these requirements, the user’s age, the target application, and fashion trends cannot be ignored, because they determine the compliance with the wearable system. Regarding textile requirements, functional factors—also influencing user comfort—are elasticity and washability, while more technical properties are the stability of the chemical agents’ effects for preserving the sensors’ efficacy and reliability, and assuring the proper duration of the product for the complete life cycle. From the technical side, the physiological issues are the most important: skin conductance, tolerance, irritation, and the effect of sweat and perspiration are key factors for reliable sensing. Other technical features such as battery size and duration, and the form factor of the sensor collector, should be considered, as they affect aesthetical requirements, which have proven to be crucial, as well as comfort and wearability. PMID:27240361

  12. AIBench: a rapid application development framework for translational research in biomedicine.

    PubMed

    Glez-Peña, D; Reboiro-Jato, M; Maia, P; Rocha, M; Díaz, F; Fdez-Riverola, F

    2010-05-01

    Applied research in both biomedical discovery and translational medicine today often requires the rapid development of fully featured applications containing both advanced and specific functionalities, for real use in practice. In this context, new tools are needed that allow for efficient generation, deployment and reutilization of such biomedical applications as well as their associated functionalities. Against this background, this paper presents AIBench, an open-source Java desktop application framework for scientific software development with the goal of providing support to both fundamental and applied research in the domain of translational biomedicine. AIBench incorporates a powerful plug-in engine, a flexible scripting platform and takes advantage of Java annotations, reflection and various design principles in order to make it easy to use, lightweight and non-intrusive. By following a basic input-processing-output life cycle, it is possible to fully develop multiplatform applications using only three types of concepts: operations, data-types and views. The framework automatically provides functionalities that are present in a typical scientific application including user parameter definition, logging facilities, multi-threading execution, experiment repeatability and user interface workflow management, among others. The proposed framework architecture defines a reusable component model which also allows assembling new applications by the reuse of libraries from past projects or third-party software. Copyright (c) 2009 Elsevier Ireland Ltd. All rights reserved.

  13. GenderMedDB: an interactive database of sex and gender-specific medical literature.

    PubMed

    Oertelt-Prigione, Sabine; Gohlke, Björn-Oliver; Dunkel, Mathias; Preissner, Robert; Regitz-Zagrosek, Vera

    2014-01-01

    Searches for sex and gender-specific publications are complicated by the absence of a specific algorithm within search engines and by the lack of adequate archives to collect the retrieved results. We previously addressed this issue by initiating the first systematic archive of medical literature containing sex and/or gender-specific analyses. This initial collection has now been greatly enlarged and re-organized as a free user-friendly database with multiple functions: GenderMedDB (http://gendermeddb.charite.de). GenderMedDB retrieves the included publications from the PubMed database. Manuscripts containing sex and/or gender-specific analysis are continuously screened and the relevant findings organized systematically into disciplines and diseases. Publications are furthermore classified by research type, subject and participant numbers. More than 11,000 abstracts are currently included in the database, after screening more than 40,000 publications. The main functions of the database include searches by publication data or content analysis based on pre-defined classifications. In addition, registrants are enabled to upload relevant publications, access descriptive publication statistics and interact in an open user forum. Overall, GenderMedDB offers the advantages of a discipline-specific search engine as well as the functions of a participative tool for the gender medicine community.

  14. Integration of a satellite ground support system based on analysis of the satellite ground support domain

    NASA Technical Reports Server (NTRS)

    Pendley, R. D.; Scheidker, E. J.; Levitt, D. S.; Myers, C. R.; Werking, R. D.

    1994-01-01

    This analysis defines a complete set of ground support functions based on those practiced in real space flight operations during the on-orbit phase of a mission. These functions are mapped against ground support functions currently in use by NASA and DOD. Software components to provide these functions can be hosted on RISC-based work stations and integrated to provide a modular, integrated ground support system. Such modular systems can be configured to provide as much ground support functionality as desired. This approach to ground systems has been widely proposed and prototyped both by government institutions and commercial vendors. The combined set of ground support functions we describe can be used as a standard to evaluate candidate ground systems. This approach has also been used to develop a prototype of a modular, loosely-integrated ground support system, which is discussed briefly. A crucial benefit to a potential user is that all the components are flight-qualified, thus giving high confidence in their accuracy and reliability.

  15. Integration of a satellite ground support system based on analysis of the satellite ground support domain

    NASA Astrophysics Data System (ADS)

    Pendley, R. D.; Scheidker, E. J.; Levitt, D. S.; Myers, C. R.; Werking, R. D.

    1994-11-01

    This analysis defines a complete set of ground support functions based on those practiced in real space flight operations during the on-orbit phase of a mission. These functions are mapped against ground support functions currently in use by NASA and DOD. Software components to provide these functions can be hosted on RISC-based work stations and integrated to provide a modular, integrated ground support system. Such modular systems can be configured to provide as much ground support functionality as desired. This approach to ground systems has been widely proposed and prototyped both by government institutions and commercial vendors. The combined set of ground support functions we describe can be used as a standard to evaluate candidate ground systems. This approach has also been used to develop a prototype of a modular, loosely-integrated ground support system, which is discussed briefly. A crucial benefit to a potential user is that all the components are flight-qualified, thus giving high confidence in their accuracy and reliability.

  16. 14 CFR § 1215.108 - Defining user service requirements.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... planning, and other significant mission parameters. It is recommended that potential users contact the NIMO... Flight Center, M/S 450.1, 8800 Greenbelt Road Greenbelt, MD 20771. [77 FR 6953, Feb. 10, 2012] ...

  17. Design and validation of a neuroprosthesis for the treatment of upper limb tremor.

    PubMed

    Gallego, J A; Rocon, E; Belda-Lois, J M; Koutsou, A D; Mena, S; Castillo, A; Pons, J L

    2013-01-01

    Pathological tremor is the most prevalent movement disorder. In spite of the existence of various treatments for it, tremor poses a functional problem to a large proportion of patients. This paper presents the design and implementation of a novel neuroprosthesis for tremor management. The paper starts by reviewing a series of design criteria that were established after analyzing users' needs and the expected functionality of the system. Then, it summarizes the design of the neuroprosthesis, which was built to meet the criteria defined previously. Experimental results with a representative group of 12 patients show that the neuroprosthesis provided significant (p < 0.001) and systematic tremor attenuation (on average 52.33 ± 25.48%), and encourage its functional evaluation as a potential new treatment for tremor in a large cohort of patients.

  18. Ontology based log content extraction engine for a posteriori security control.

    PubMed

    Azkia, Hanieh; Cuppens-Boulahia, Nora; Cuppens, Frédéric; Coatrieux, Gouenou

    2012-01-01

    In a posteriori access control, users are accountable for actions they performed and must provide evidence, when required by some legal authorities for instance, to prove that these actions were legitimate. Generally, log files contain the needed data to achieve this goal. This logged data can be recorded in several formats; we consider here IHE-ATNA (Integrating the healthcare enterprise-Audit Trail and Node Authentication) as log format. The difficulty lies in extracting useful information regardless of the log format. A posteriori access control frameworks often include a log filtering engine that provides this extraction function. In this paper we define and enforce this function by building an IHE-ATNA based ontology model, which we query using SPARQL, and show how the a posteriori security controls are made effective and easier based on this function.

  19. Development of a user-friendly system for image processing of electron microscopy by integrating a web browser and PIONE with Eos.

    PubMed

    Tsukamoto, Takafumi; Yasunaga, Takuo

    2014-11-01

    Eos (Extensible object-oriented system) is a powerful application suite for image processing of electron micrographs. Eos ordinarily offers only character user interfaces (CUI) under operating systems (OS) such as OS X or Linux, which is not user-friendly: users of Eos need to be expert at image processing of electron micrographs and to have some knowledge of computer science as well. However, not everyone who needs Eos is comfortable with a CUI. We therefore extended Eos into an OS-independent web system with a graphical user interface (GUI) by integrating a web browser. The advantage of using a web browser is not only that it gives Eos a GUI, but also that it lets Eos work in a distributed computational environment. Using Ajax (Asynchronous JavaScript and XML) technology, we implemented a more comfortable user interface in the web browser. Eos has more than 400 commands related to image processing for electron microscopy, and the usage of each command differs from the others. Since the beginning of development, Eos has managed its user interfaces through an interface-definition file, "OptionControlFile", written in CSV (comma-separated value) format: each command has an "OptionControlFile" that records the information needed to generate its interface and describe its usage. The GUI system we developed, called "Zephyr" (Zone for Easy Processing of HYpermedia Resources), also reads "OptionControlFile" and produces a web user interface automatically, because this mechanism is mature and convenient. The basic client-side functions have been implemented properly and support auto-generation of web forms with execution, image preview, and file upload to a web server, so the system can execute Eos commands, each with its own unique options, and carry out image analysis.
    Two problems remained: the image file format for visualization and the workspace for analysis. Image file format information is needed to check whether an input/output file is correct, and a common workspace for analysis is needed because the client is physically separated from the server. We solved the file-format problem by extending the rules of the OptionControlFile of Eos. To solve the workspace problem, we developed two types of system. The first uses only the local environment: the user runs a web server provided by Eos, accesses it through a web browser, and manipulates local files with the GUI in the browser. The second employs PIONE (Process-rule for Input/Output Negotiation Environment), a platform we are developing that works in a heterogeneous distributed environment. Users can place resources such as microscopic images and text files into the server-side environment supported by PIONE, and experts can write PIONE rule definitions, which define a workflow of image processing. PIONE then runs each image-processing step on a suitable computer, following the defined rules. PIONE also supports interactive manipulation, so a user can try a command with various setting values; here we contribute the auto-generation of a GUI for a PIONE workflow. As an advanced function, we developed a module that logs user actions. The logs include information such as the setting values used in image processing and the sequence of commands executed. Used effectively, such logs offer many advantages: for example, when an expert discovers some know-how of image processing, other users can share the logs containing that know-how, and by analyzing logs we may derive recommended workflows for image analysis. To build a social platform of image processing for electron microscopists, we have developed the system infrastructure as well. © The Author 2014. Published by Oxford University Press on behalf of The Japanese Society of Microscopy. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
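    The OptionControlFile mechanism described above lends itself to a compact illustration. The following is a toy sketch (the CSV column names and the command name are invented, not the real OptionControlFile schema): it reads a CSV interface definition for a command and auto-generates an HTML form, which is the essence of what Zephyr does.

```python
# Toy illustration of auto-generating a web form from a CSV
# interface-definition file, in the spirit of Zephyr's use of
# "OptionControlFile" (field names here are invented).
import csv
import io

OPTION_CONTROL = """\
option,description,type,default
-i,input image file,file,
-o,output image file,file,
-sigma,gaussian blur width,float,1.5
"""

def form_for(command, control_csv):
    """Build an HTML form for `command` from its CSV option definitions."""
    rows = csv.DictReader(io.StringIO(control_csv))
    fields = [
        '  <label>{o[option]} ({o[description]}): '
        '<input name="{o[option]}" value="{o[default]}"></label>'.format(o=o)
        for o in rows
    ]
    return ('<form action="/run/{0}">\n{1}\n'
            '  <input type="submit" value="Execute">\n</form>'
            .format(command, "\n".join(fields)))

print(form_for("exampleFilterCommand", OPTION_CONTROL))
```

    With one such definition file per command, a form for any of the 400-plus Eos commands can be produced without hand-writing any HTML.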

  20. 77 FR 71850 - Self-Regulatory Organizations; International Securities Exchange, LLC; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-04

    ... end-user defined filters (the ``Service''). Spread Crawler, which was developed by MEB, listens to a... based on a registered end-user input (i.e. custom-set parameters for particular symbols or industry...-user(s), if any, would be interested in seeing this order. These filtering rules are contained in a...

  1. Perceived constraints by non-traditional users on the Mt. Baker-Snoqualmie National Forest

    Treesearch

    Elizabeth A. Covelli; Robert C. Burns; Alan Graefe

    2007-01-01

    The purpose of this study was to investigate the constraints that non-traditional users face, along with the negotiation strategies that are employed in order to start, continue, or increase participation in recreation on a national forest. Non-traditional users were defined as respondents who were not Caucasian. Additionally, both constraints and negotiation...

  2. Perceived Security Determinants in E-Commerce among Turkish University Students

    ERIC Educational Resources Information Center

    Yenisey, M.M.; Ozok, A.A.; Salvendy, G.

    2005-01-01

    Perceived security is defined as the level of security that users feel while they are shopping on e-commerce sites. The aims of this study were to determine items that positively influence this feeling of security by users during shopping, and to develop guidelines for perceived security in e-commerce. An experiment allowed users with different…

  3. Configuration control of seven-degree-of-freedom arms

    NASA Technical Reports Server (NTRS)

    Seraji, Homayoun (Inventor); Long, Mark K. (Inventor); Lee, Thomas S. (Inventor)

    1992-01-01

    A seven degree of freedom robot arm with a six degree of freedom end effector is controlled by a processor employing a 6 by 7 Jacobian matrix for defining location and orientation of the end effector in terms of the rotation angles of the joints, a 1 (or more) by 7 Jacobian matrix for defining 1 (or more) user specified kinematic functions constraining location or movement of selected portions of the arm in terms of the joint angles, the processor combining the two Jacobian matrices to produce an augmented 7 (or more) by 7 Jacobian matrix, the processor effecting control by computing in accordance with forward kinematics from the augmented 7 by 7 Jacobian matrix and from the seven joint angles of the arm a set of seven desired joint angles for transmittal to the joint servo loops of the arm. One of the kinematic functions constrains the orientation of the elbow plane of the arm. Another one of the kinematic functions minimizes a sum of gravitational torques on the joints. Still another kinematic function constrains the location of the arm to perform collision avoidance. Generically, one kinematic function minimizes a sum of selected mechanical parameters of at least some of the joints associated with weighting coefficients which may be changed during arm movement. The mechanical parameters may be velocity errors or gravity torques associated with individual joints.

  4. Configuration control of seven degree of freedom arms

    NASA Technical Reports Server (NTRS)

    Seraji, Homayoun (Inventor)

    1995-01-01

    A seven-degree-of-freedom robot arm with a six-degree-of-freedom end effector is controlled by a processor employing a 6-by-7 Jacobian matrix for defining location and orientation of the end effector in terms of the rotation angles of the joints, a 1 (or more)-by-7 Jacobian matrix for defining 1 (or more) user-specified kinematic functions constraining location or movement of selected portions of the arm in terms of the joint angles, the processor combining the two Jacobian matrices to produce an augmented 7 (or more)-by-7 Jacobian matrix, the processor effecting control by computing in accordance with forward kinematics from the augmented 7-by-7 Jacobian matrix and from the seven joint angles of the arm a set of seven desired joint angles for transmittal to the joint servo loops of the arm. One of the kinematic functions constrains the orientation of the elbow plane of the arm. Another one of the kinematic functions minimizes a sum of gravitational torques on the joints. Still another one of the kinematic functions constrains the location of the arm to perform collision avoidance. Generically, one of the kinematic functions minimizes a sum of selected mechanical parameters of at least some of the joints associated with weighting coefficients which may be changed during arm movement. The mechanical parameters may be velocity errors or position errors or gravity torques associated with individual joints.

  5. MO-E-18C-01: Open Access Web-Based Peer-To-Peer Training and Education in Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pawlicki, T; Brown, D; Dunscombe, P

    Purpose: Current training and education delivery models have limitations which result in gaps in clinical proficiency with equipment, procedures, and techniques. Educational and training opportunities offered by vendors and professional societies are by their nature not available at point of need or for the life of clinical systems. The objective of this work is to leverage modern communications technology to provide peer-to-peer training and education for radiotherapy professionals, in the clinic and on demand, as they undertake their clinical duties. Methods: We have developed a free-of-charge web site ( https://i.treatsafely.org ) using the Google App Engine and datastore (NDB, GQL), Python with AJAX-RPC, and Javascript. The site is a radiotherapy-specific hosting service to which user-created videos illustrating clinical or physics processes and other relevant educational material can be uploaded. Efficient navigation to the material of interest is provided through several RT-specific search tools, and videos can be scored by users, thus providing comprehensive peer review of the site content. The site also supports multilingual narration/translation of videos, a quiz function for competence assessment, and a library function allowing groups or institutions to define their standard operating procedures based on the video content. Results: The website went live in August 2013 and currently has over 680 registered users from 55 countries; 27.2% from the United States, 9.8% from India, 8.3% from the United Kingdom, 7.3% from Brazil, and 47.5% from other countries. The users include physicists (57.4%), oncologists (12.5%), therapists (8.2%), and dosimetrists (4.8%). There are 75 videos to date, with narration in languages including English, Portuguese, Mandarin, and Thai. Conclusion: Based on the initial acceptance of the site, we conclude that this open access web-based peer-to-peer tool is fulfilling an important need in radiotherapy training and education. Site functionality should expand in the future to include document sharing and continuing education credits.

  6. Some implications of an event-based definition of exposure to the risk of road accident.

    PubMed

    Elvik, Rune

    2015-03-01

    This paper proposes a new definition of exposure to the risk of road accident as any event, limited in space and time, representing a potential for an accident to occur by bringing road users close to each other in time or space, or by requiring a road user to take action to avoid leaving the roadway. A typology of events representing a potential for an accident is proposed. Each event can be interpreted as a trial as defined in probability theory. Risk is the proportion of events that result in an accident. Defining exposure as events demanding the attention of road users implies that road users will learn from repeated exposure to these events, which in turn implies that there will normally be a negative relationship between exposure and risk. Four hypotheses regarding the relationship between exposure and risk are proposed. Preliminary tests support these hypotheses. Advantages and disadvantages of defining exposure as specific events are discussed. It is argued that developments in vehicle technology are likely to make events both observable and countable, thus ensuring that exposure is an operational concept. Copyright © 2014 Elsevier Ltd. All rights reserved.
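    Under the event-based definition above, risk reduces to a simple proportion: the share of exposure events that end in an accident. A minimal sketch, with invented numbers purely for illustration:

```python
def event_based_risk(events):
    """events: iterable of booleans, True if the event led to an accident.

    Risk is the proportion of exposure events (trials) resulting in an accident.
    """
    events = list(events)
    if not events:
        raise ValueError("no exposure events observed")
    return sum(events) / len(events)

# 1 accident observed in 200 traffic-conflict events:
risk = event_based_risk([True] + [False] * 199)  # 0.005
```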

  7. FACTOR FINDER CD-ROM

    EPA Science Inventory

    The Factor Finder CD-ROM is a user-friendly, searchable tool used to locate exposure factors and sociodemographic data for user-defined populations. Factor Finder improves the ability of exposure assessors, risk assessors, and others to efficiently locate exposure-related informatio...

  8. Computer systems and methods for the query and visualization of multidimensional databases

    DOEpatents

    Stolte, Chris; Tang, Diane L; Hanrahan, Patrick

    2015-03-03

    A computer displays a graphical user interface on its display. The graphical user interface includes a schema information region and a data visualization region. The schema information region includes multiple operand names, each operand corresponding to one or more fields of a multi-dimensional database that includes at least one data hierarchy. The data visualization region includes a columns shelf and a rows shelf. The computer detects user actions to associate one or more first operands with the columns shelf and to associate one or more second operands with the rows shelf. The computer generates a visual table in the data visualization region in accordance with the user actions. The visual table includes one or more panes. Each pane has an x-axis defined based on data for the one or more first operands, and each pane has a y-axis defined based on data for the one or more second operands.

  9. Computer systems and methods for the query and visualization of multidimensional databases

    DOEpatents

    Stolte, Chris; Tang, Diane L.; Hanrahan, Patrick

    2015-11-10

    A computer displays a graphical user interface on its display. The graphical user interface includes a schema information region and a data visualization region. The schema information region includes a plurality of fields of a multi-dimensional database that includes at least one data hierarchy. The data visualization region includes a columns shelf and a rows shelf. The computer detects user actions to associate one or more first fields with the columns shelf and to associate one or more second fields with the rows shelf. The computer generates a visual table in the data visualization region in accordance with the user actions. The visual table includes one or more panes. Each pane has an x-axis defined based on data for the one or more first fields, and each pane has a y-axis defined based on data for the one or more second fields.
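    The shelf model in these two patents can be illustrated with a toy sketch: each pane of the visual table pairs one columns-shelf field (its x-axis) with one rows-shelf field (its y-axis). The field names and the `build_panes` helper are hypothetical, not from the patent claims.

```python
from itertools import product

def build_panes(columns_shelf, rows_shelf):
    """Cross the fields on the two shelves to produce the table's panes."""
    return [{"x_axis": c, "y_axis": r} for c, r in product(columns_shelf, rows_shelf)]

# One field on the columns shelf, two on the rows shelf -> two stacked panes:
panes = build_panes(["Month"], ["Sales", "Profit"])
```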

  10. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system components, part 2

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The technical effort and computer code enhancements performed during the sixth year of the Probabilistic Structural Analysis Methods program are summarized. Various capabilities are described to probabilistically combine structural response and structural resistance to compute component reliability. A library of structural resistance models is implemented in the Numerical Evaluations of Stochastic Structures Under Stress (NESSUS) code that included fatigue, fracture, creep, multi-factor interaction, and other important effects. In addition, a user interface was developed for user-defined resistance models. An accurate and efficient reliability method was developed and was successfully implemented in the NESSUS code to compute component reliability based on user-selected response and resistance models. A risk module was developed to compute component risk with respect to cost, performance, or user-defined criteria. The new component risk assessment capabilities were validated and demonstrated using several examples. Various supporting methodologies were also developed in support of component risk assessment.
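    The response/resistance combination described above can be illustrated with a crude Monte Carlo sketch: reliability is estimated as the fraction of trials in which sampled resistance (strength) exceeds sampled response (stress). The distributions and parameters below are invented and are not taken from NESSUS.

```python
import random

def reliability(response_sampler, resistance_sampler, n=100_000, seed=42):
    """Estimate P(resistance > response) by Monte Carlo sampling."""
    rng = random.Random(seed)
    survived = sum(
        resistance_sampler(rng) > response_sampler(rng) for _ in range(n)
    )
    return survived / n

# Normally distributed stress (mean 100) vs. strength (mean 130), both sigma 10:
rel = reliability(lambda r: r.gauss(100, 10), lambda r: r.gauss(130, 10))
```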

  11. Implementation and evaluation of LMS mobile application: scele mobile based on user-centered design

    NASA Astrophysics Data System (ADS)

    Banimahendra, R. D.; Santoso, H. B.

    2018-03-01

    Mobile technology is now developing rapidly, and many activities, including learning, are increasingly expected to be available on mobile devices. This motivates implementing a mobile application as a learning medium. This study describes the process of developing and evaluating the Moodle-based mobile Learning Management System (LMS) application called Student Centered e-Learning Environment (SCeLE). It discusses the process of defining features, implementing those features in the application, and evaluating the application. We defined the features using user research and a literature study, implemented the application on a user-centered design basis, and finally evaluated the application using usability testing and the System Usability Scale (SUS). The purpose of this study is to determine the extent to which this application can help users complete their tasks and to provide recommendations for further research and development.

  12. Profile of Executive and Memory Function Associated with Amphetamine and Opiate Dependence

    PubMed Central

    Ersche, Karen D; Clark, Luke; London, Mervyn; Robbins, Trevor W; Sahakian, Barbara J

    2007-01-01

    Cognitive function was assessed in chronic drug users on neurocognitive measures of executive and memory function. Current amphetamine users were contrasted with current opiate users, and these two groups were compared with former users of these substances (abstinent for at least one year). Four groups of participants were recruited: amphetamine-dependent individuals, opiate-dependent individuals, former users of amphetamines and/or opiates, and healthy non-drug-taking controls. Participants were administered the Tower of London (TOL) planning task and the 3D-IDED attentional set-shifting task to assess executive function, and Paired Associates Learning and Delayed Pattern Recognition Memory tasks to assess visual memory function. The three groups of substance users showed significant impairments on TOL planning, Pattern Recognition Memory and Paired Associates Learning. Current amphetamine users displayed a greater degree of impairment than current opiate users. Consistent with previous research showing that healthy men perform better on visuo-spatial tests than women, our male controls remembered significantly more paired associates than their female counterparts. This relationship was reversed in drug users. While performance of female drug users was normal, male drug users showed significant impairment compared to both their female counterparts and male controls. There was no difference in performance between current and former drug users. Neither years of drug abuse nor years of drug abstinence were associated with performance. Chronic drug users display pronounced neuropsychological impairment in the domains of executive and memory function. Impairment persists after several years of drug abstinence and may reflect neuropathology in frontal and temporal cortices. PMID:16160707

  13. Risk perception in consumer product use.

    PubMed

    Weegels, M F; Kanis, H

    2000-05-01

    In the literature, at least two distinct connotations of risk can be found: so-called objective risk, defined as the ratio of a particular number of accidents to a measure of exposure, and subjective risk, defined as the perception and awareness of risks by the person(s) involved. This article explores the significance of risk perception and awareness in understanding and clarifying how and why accidents involving consumer products occur. Based on empirical evidence from video-recorded reconstructions of accidents with consumer products, the risk perception and awareness of users in relation to featural and functional product characteristics, and their influence on actual product use culminating in an accident, are addressed. In contrast with what is usually assumed in the literature, the findings show that the majority of the subjects had no idea that they were running any risk of injuring themselves while they operated the product. In several accidents, the product either offered functionalities not anticipated in the design or did not adequately reflect its condition. The implications of the findings for design practice as well as for risk research are discussed.

  14. Accommodation requirements for microgravity science and applications research on space station

    NASA Technical Reports Server (NTRS)

    Uhran, M. L.; Holland, L. R.; Wear, W. O.

    1985-01-01

    Scientific research conducted in the microgravity environment of space represents a unique opportunity to explore and exploit the benefits of materials processing in the virtual absence of gravity-induced forces. NASA has initiated the preliminary design of a permanently manned space station that will support technological advances in process science and stimulate the development of new and improved materials having applications across the commercial spectrum. A study is performed to define, from the researchers' perspective, the requirements for laboratory equipment to accommodate microgravity experiments on the space station. The accommodation requirements focus on the microgravity science disciplines including combustion science, electronic materials, metals and alloys, fluids and transport phenomena, glasses and ceramics, and polymer science. User requirements have been identified in eleven research classes, each of which contains an envelope of functional requirements for related experiments having similar characteristics, objectives, and equipment needs. Based on these functional requirements, seventeen items of experiment apparatus and twenty items of core supporting equipment have been defined which represent currently identified equipment requirements for a pressurized laboratory module at the initial operating capability of the NASA space station.

  15. SMASH - semi-automatic muscle analysis using segmentation of histology: a MATLAB application.

    PubMed

    Smith, Lucas R; Barton, Elisabeth R

    2014-01-01

    Histological assessment of skeletal muscle tissue is commonly applied to many areas of skeletal muscle physiological research. Histological parameters including fiber distribution, fiber type, centrally nucleated fibers, and capillary density are all frequently quantified measures of skeletal muscle. These parameters reflect functional properties of muscle and undergo adaptation in many muscle diseases and injuries. While standard operating procedures have been developed to guide analysis of many of these parameters, the software to freely, efficiently, and consistently analyze them is not readily available. In order to provide this service to the muscle research community we developed an open source MATLAB script to analyze immunofluorescent muscle sections incorporating user controls for muscle histological analysis. The software consists of multiple functions designed to provide tools for the analysis selected. Initial segmentation and fiber filter functions segment the image and remove non-fiber elements based on user-defined parameters to create a fiber mask. Establishing parameters set by the user, the software outputs data on fiber size and type, centrally nucleated fibers, and other structures. These functions were evaluated on stained soleus muscle sections from 1-year-old wild-type and mdx mice, a model of Duchenne muscular dystrophy. In accordance with previously published data, fiber size was not different between groups, but mdx muscles had much higher fiber size variability. The mdx muscle had a significantly greater proportion of type I fibers, but type I fibers did not change in size relative to type II fibers. Centrally nucleated fibers were highly prevalent in mdx muscle and were significantly larger than peripherally nucleated fibers. The MATLAB code described and provided along with this manuscript is designed for image processing of skeletal muscle immunofluorescent histological sections. 
The program allows for semi-automated fiber detection along with user correction. The output of the code provides data in accordance with established standards of practice. The results of the program have been validated using a small set of wild-type and mdx muscle sections. This program is the first freely available and open source image processing program designed to automate analysis of skeletal muscle histological sections.
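    A rough Python analogue of the fiber-mask step described above (SMASH itself is a MATLAB application, and the threshold and minimum-area parameters here are invented): treat low-boundary-signal pixels as fiber interior, label connected regions, and drop regions too small to be fibers.

```python
import numpy as np
from scipy import ndimage

def fiber_mask(boundary_image, threshold=0.5, min_fiber_area=50):
    """Segment fibers as connected regions inside stained boundaries.

    boundary_image: 2-D array of boundary-stain intensity in [0, 1].
    Returns (mask, n_fibers): boolean fiber mask and fiber count.
    """
    interior = boundary_image < threshold          # fibers = low boundary signal
    labels, n = ndimage.label(interior)            # connected components
    areas = ndimage.sum(interior, labels, range(1, n + 1))
    keep = {i + 1 for i, a in enumerate(areas) if a >= min_fiber_area}
    mask = np.isin(labels, list(keep))             # drop non-fiber specks
    return mask, len(keep)
```

    The user-tunable `threshold` and `min_fiber_area` play the role of the user-defined parameters the abstract mentions for removing non-fiber elements.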

  16. Transport Modeling of Hydrogen in Metals for Application to Hydrogen Assisted Cracking of Metals.

    DTIC Science & Technology

    1995-04-04

    Consists of a Fortran "user element" subroutine for use with the ABAQUS finite element program. Documentation of the 1-D user element subroutine is...trapping theory. The use of the ABAQUS finite element "User Element" subroutines for solving 1-D problems is then outlined in full detail. This is followed...reflect the new ordering given by Eq. (57). ABAQUS User Element Subroutines: ABAQUS executes a Fortran subroutine named UEL for each "user defined" finite element.

  17. A new implementation of the programming system for structural synthesis (PROSSS-2)

    NASA Technical Reports Server (NTRS)

    Rogers, James L., Jr.

    1984-01-01

    This new implementation of the PROgramming System for Structural Synthesis (PROSSS-2) combines a general-purpose finite element computer program for structural analysis, a state-of-the-art optimization program, and several user-supplied, problem-dependent computer programs. The result is flexibility of the optimization procedure and organization, and versatility in the formulation of constraints and design variables. The analysis-optimization process results in a minimized objective function, typically the mass. The analysis and optimization programs are executed repeatedly by looping through the system until the process is stopped by a user-defined termination criterion. However, some of the analysis, such as model definition, needs to be performed only once, and the results are saved for future use. The user must write some small, simple FORTRAN programs to interface between the analysis and optimization programs. One of these programs, the front processor, converts the design variables output from the optimizer into a format suitable for input to the analyzer. Another, the end processor, retrieves the behavior variables and, optionally, their gradients from the analysis program and evaluates the objective function and constraints and, optionally, their gradients. These quantities are output in a format suitable for input into the optimizer. These user-supplied programs are problem-dependent because they depend primarily upon which finite elements are being used in the model. PROSSS-2 differs from the original PROSSS in that the optimizer and front and end processors have been integrated into the finite element computer program. This was done to reduce the complexity and increase the portability of the system, and to take advantage of the data handling features found in the finite element program.
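    The looping structure described above (front processor, analyzer, end processor, optimizer, repeated until a user-defined termination criterion is met) can be sketched generically. All function names are placeholders for the user-supplied and library programs, not actual PROSSS-2 routines:

```python
def optimize(design_vars, analyzer, optimizer, front, end,
             max_cycles=50, tol=1e-6):
    """Loop: design vars -> analysis -> objective/constraints -> new design vars."""
    prev_obj = float("inf")
    obj = prev_obj
    for _ in range(max_cycles):
        model_input = front(design_vars)       # front processor: reformat for analyzer
        behavior = analyzer(model_input)       # e.g. finite element analysis
        obj, constraints = end(behavior)       # end processor: objective + constraints
        if abs(prev_obj - obj) < tol:          # user-defined termination criterion
            break
        prev_obj = obj
        design_vars = optimizer(design_vars, obj, constraints)
    return design_vars, obj
```

    With a toy quadratic "analysis" and an optimizer that halves the design variable each cycle, the loop converges to a near-zero objective.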

  18. User-Centered Evaluation of Visual Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholtz, Jean C.

    Visual analytics systems are becoming very popular. More domains now use interactive visualizations to analyze the ever-increasing amount and heterogeneity of data. More novel visualizations are being developed for more tasks and users. We need to ensure that these systems can be evaluated to determine that they are both useful and usable. A user-centered evaluation for visual analytics needs to be developed for these systems. While many of the typical human-computer interaction (HCI) evaluation methodologies can be applied as is, others will need modification. Additionally, new functionality in visual analytics systems needs new evaluation methodologies. There is a difference between usability evaluations and user-centered evaluations. Usability looks at the efficiency, effectiveness, and user satisfaction of users carrying out tasks with software applications. User-centered evaluation looks more specifically at the utility provided to the users by the software. This is reflected in the evaluations done and in the metrics used. In the visual analytics domain this is very challenging as users are most likely experts in a particular domain, the tasks they do are often not well defined, the software they use needs to support large amounts of different kinds of data, and often the tasks last for months. These difficulties are discussed more in the section on User-centered Evaluation. Our goal is to provide a discussion of user-centered evaluation practices for visual analytics, including existing practices that can be carried out and new methodologies and metrics that need to be developed and agreed upon by the visual analytics community. The material provided here should be of use for both researchers and practitioners in the field of visual analytics.
    Researchers and practitioners in HCI who are interested in visual analytics will find this information useful, as well as a discussion of changes that need to be made to current HCI practices to make them more suitable to visual analytics. A history of analysis and analysis techniques and problems is provided as well as an introduction to user-centered evaluation and various evaluation techniques for readers from different disciplines. The understanding of these techniques is imperative if we wish to support analysis in the visual analytics software we develop. Currently the evaluations that are conducted and published for visual analytics software are very informal and consist mainly of comments from users or potential users. Our goal is to help researchers in visual analytics to conduct more formal user-centered evaluations. While these are time-consuming and expensive to carry out, the outcomes of these studies will have a defining impact on the field of visual analytics and help point the direction for future features and visualizations to incorporate. While many researchers view work in user-centered evaluation as a less-than-exciting area in which to work, the opposite is true. First of all, the goal of user-centered evaluation is to help visual analytics software developers, researchers, and designers improve their solutions and discover creative ways to better accommodate their users. Working with the users is extremely rewarding as well. While we use the term “users” in almost all situations, there are a wide variety of users that all need to be accommodated. Moreover, the domains that use visual analytics are varied and expanding. Just understanding the complexities of a number of these domains is exciting. Researchers are trying out different visualizations and interactions as well. And of course, the size and variety of data are expanding rapidly. User-centered evaluation in this context is rapidly changing.
    There are no standard processes and metrics, and thus those of us working on user-centered evaluation must be creative in our work with both the users and with the researchers and developers.

  19. Adoption of Library 2.0 Functionalities by Academic Libraries and Users: A Knowledge Management Perspective

    ERIC Educational Resources Information Center

    Kim, Yong-Mi; Abbas, June

    2010-01-01

    This study investigates the adoption of Library 2.0 functionalities by academic libraries and users through a knowledge management perspective. Based on 230 randomly selected academic library Web sites and 184 users, the authors found RSS and blogs are widely adopted by academic libraries while users widely utilized the bookmark function.…

  20. An unsupervised method for summarizing egocentric sport videos

    NASA Astrophysics Data System (ADS)

    Habibi Aghdam, Hamed; Jahani Heravi, Elnaz; Puig, Domenec

    2015-12-01

    People are getting more interested in recording their sport activities using head-worn or hand-held cameras. This type of video, called egocentric sport video, has different motion and appearance patterns compared with life-logging videos. While a life-logging video can be defined in terms of well-defined human-object interactions, it is not trivial to describe egocentric sport videos using well-defined activities. For this reason, summarizing egocentric sport videos based on human-object interaction might fail to produce meaningful results. In this paper, we propose an unsupervised method for summarizing egocentric videos by identifying the key-frames of the video. Our method utilizes both appearance and motion information and it automatically finds the number of the key-frames. Our blind user study on the new dataset collected from YouTube shows that in 93.5% of cases, the users choose the proposed method as their first video summary choice. In addition, our method is within the top 2 choices of the users in 99% of studies.
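    One simple way to find the number of key-frames automatically, in the spirit of the method described (though not the authors' actual algorithm), is to start a new key-frame whenever the combined appearance/motion feature vector drifts far enough from the last selected key-frame:

```python
import numpy as np

def select_keyframes(features, drift_threshold=1.0):
    """features: (n_frames, d) array of per-frame feature vectors.

    Returns the indices of selected key-frames; the count is determined
    by the data rather than fixed in advance.
    """
    keyframes = [0]  # the first frame always starts a segment
    for i in range(1, len(features)):
        # New key-frame once we drift beyond the threshold from the last one:
        if np.linalg.norm(features[i] - features[keyframes[-1]]) > drift_threshold:
            keyframes.append(i)
    return keyframes
```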

  1. Systems Modeling to Implement Integrated System Health Management Capability

    NASA Technical Reports Server (NTRS)

    Figueroa, Jorge F.; Walker, Mark; Morris, Jonathan; Smith, Harvey; Schmalzel, John

    2007-01-01

    ISHM capability includes: detection of anomalies, diagnosis of causes of anomalies, prediction of future anomalies, and user interfaces that enable integrated awareness (past, present, and future) by users. This is achieved by focused management of data, information and knowledge (DIaK) that will likely be distributed across networks. Management of DIaK implies storage, sharing (timely availability), maintaining, evolving, and processing. Processing of DIaK encapsulates strategies, methodologies, algorithms, etc. focused on achieving high ISHM Functional Capability Level (FCL). High FCL means a high degree of success in detecting anomalies, diagnosing causes, predicting future anomalies, and enabling health integrated awareness by the user. A model that enables ISHM capability, and hence, DIaK management, is denominated the ISHM Model of the System (IMS). We describe aspects of the IMS that focus on processing of DIaK. Strategies, methodologies, and algorithms require proper context. We describe an approach to define and use contexts, implementation in an object-oriented software environment (G2), and validation using actual test data from a methane thruster test program at NASA SSC. Context is linked to existence of relationships among elements of a system. For example, the context to use a strategy to detect leak is to identify closed subsystems (e.g. bounded by closed valves and by tanks) that include pressure sensors, and check if the pressure is changing. We call these subsystems Pressurizable Subsystems. If pressure changes are detected, then all members of the closed subsystem become suspect of leakage. In this case, the context is defined by identifying a subsystem that is suitable for applying a strategy. Contexts are defined in many ways. Often, a context is defined by relationships of function (e.g. liquid flow, maintaining pressure, etc.), form (e.g. part of the same component, connected to other components, etc.), or space (e.g. 
    physically close, touching the same common element, etc.). The context might be defined dynamically (if conditions for the context appear and disappear dynamically) or statically. Although this approach is akin to case-based reasoning, we are implementing it using a software environment that embodies tools to define and manage relationships (of any nature) among objects in a very intuitive manner. Context for higher-level inferences (that use detected anomalies or events), primarily for diagnosis and prognosis, is related to causal relationships. This is useful to develop root-cause analysis trees showing an event linked to its possible causes and effects. The innovation pertaining to RCA trees encompasses use of previously defined subsystems as well as individual elements in the tree. This approach allows more powerful implementations of RCA capability in object-oriented environments. For example, if a pressurizable subsystem is leaking, its root-cause representation within an RCA tree will show that the cause is that all elements of that subsystem are suspect of leak. Such a tree would apply to all instances of leak-events detected and all elements in all pressurizable subsystems in the system. Example subsystems in our environment to build IMS include: Pressurizable Subsystem, Fluid-Fill Subsystem, Flow-Thru-Valve Subsystem, and Fluid Supply Subsystem. The software environment for IMS is designed to potentially allow definition of any relationship suitable to create a context to achieve ISHM capability.
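    The Pressurizable Subsystem context described above can be caricatured in a few lines: a closed subsystem whose pressure is falling makes every member element a leak suspect. The class name mirrors the text, but the method, threshold, and sensor interface are illustrative, not from the IMS software:

```python
class PressurizableSubsystem:
    """A closed subsystem (bounded by closed valves/tanks) with a pressure sensor."""

    def __init__(self, elements, pressure_sensor):
        self.elements = elements          # components inside the closed boundary
        self.read_pressure = pressure_sensor  # callable: time -> pressure

    def leak_suspects(self, dt=1.0, drop_threshold=0.5):
        """If pressure drops in a closed volume, every member becomes a suspect."""
        p0 = self.read_pressure(0.0)
        p1 = self.read_pressure(dt)
        if p0 - p1 > drop_threshold:      # pressure falling in a closed subsystem
            return list(self.elements)    # all members are suspect of leak
        return []
```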

  2. Action Information Management System (AIMS): a User's View

    NASA Technical Reports Server (NTRS)

    Wiskerchen, M.

    1984-01-01

    The initial approach used in establishing a user-defined information system to fulfill the needs of users at NASA Headquarters was unsuccessful in bringing this pilot endeavor to full project status. The persistence of several users and the full involvement of the Ames Research Center were the ingredients needed to make the AIMS project a success. The lesson learned from this effort is that NASA should always work from its organizational strengths as a Headquarters-Center partnership.

  3. Proceedings of the Workshop on Government Oil Spill Modeling

    NASA Technical Reports Server (NTRS)

    Bishop, J. M. (Compiler)

    1980-01-01

    Oil spill model users and modelers were brought together for the purpose of fostering joint communication and increasing understanding of mutual problems. The workshop concentrated on defining user needs, presentations on ongoing modeling programs, and discussions of supporting research for these modeling efforts. Specific user recommendations include the development of an oil spill model user library which identifies and describes available models. The development of models for the long-term fate and effect of spilled oil was examined.

  4. An interactive program to display user-generated or file-based maps on a personal computer monitor

    USGS Publications Warehouse

    Langer, W.H.; Stephens, R.W.

    1987-01-01

    PC MAP-MAKER is an ADVANCED BASIC program written to provide users of IBM XT, IBM AT, and compatible computers with a straightforward, flexible method to display geographical data on a color or monochrome PC (personal computer) monitor. Data can be political boundaries such as State and county boundaries; natural curvilinear features such as rivers, drainage areas, and geological contacts; and points such as well locations and mineral localities. Essentially any point defined by a latitude and longitude and any line defined by a series of latitude and longitude values can be displayed using the program. PC MAP-MAKER allows users to view tabular data from U.S. Geological Survey files such as WATSTORE (National Water Data Storage and Retrieval System) in a map format in a time much shorter than required by sending the data to a line plotter. The screen image can be saved to disk for recall at a later date, and hard copies can be printed with a dot matrix printer. The program is user-friendly, using menus or prompts to guide user input. It is fully documented and structured to allow the user to tailor the program to the user's specific needs. The documentation includes a tutorial designed to introduce users to the capabilities of the program using the State of Colorado as a demonstration map area. (Author's abstract)
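    A program like PC MAP-MAKER must map latitude/longitude pairs onto screen pixels before drawing; a simple equirectangular version of that transform is sketched below (the abstract does not state which projection the program actually uses, and the screen size is only an example):

```python
def latlon_to_pixel(lat, lon, bounds, screen=(640, 350)):
    """Map a lat/lon point into pixel coordinates.

    bounds = (lat_min, lat_max, lon_min, lon_max); returns (col, row),
    with row 0 at the top of the screen as on a raster display.
    """
    lat_min, lat_max, lon_min, lon_max = bounds
    width, height = screen
    col = (lon - lon_min) / (lon_max - lon_min) * (width - 1)
    row = (lat_max - lat) / (lat_max - lat_min) * (height - 1)
    return round(col), round(row)

# A point in Colorado (roughly 37-41 N, 102-109 W) on a 640x350 screen:
px = latlon_to_pixel(39.0, -105.0, (37.0, 41.0, -109.0, -102.0))
```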

  5. Drugs As Instruments: Describing and Testing a Behavioral Approach to the Study of Neuroenhancement

    PubMed Central

    Brand, Ralf; Wolff, Wanja; Ziegler, Matthias

    2016-01-01

    Neuroenhancement (NE) is the non-medical use of psychoactive substances to produce a subjective enhancement in psychological functioning and experience. So far, empirical investigations of individuals' motivation for NE have, however, been hampered by the lack of a theoretical foundation. This study aimed to apply drug instrumentalization theory to user motivation for NE. We argue that NE should be defined and analyzed from a behavioral perspective rather than in terms of the characteristics of substances used for NE. In the empirical study we explored user behavior by analyzing relationships between drug options (over-the-counter products, prescription drugs, illicit drugs) and postulated drug instrumentalization goals (e.g., improved cognitive performance, counteracting fatigue, improved social interaction). Questionnaire data from 1438 university students were subjected to exploratory and confirmatory factor analysis to address the question of whether analysis of drug instrumentalization should be based on the assumption that users are aiming to achieve a certain goal and choose their drug accordingly or whether NE behavior is more strongly rooted in a decision to try or use a certain drug option. We used factor mixture modeling to explore whether users could be separated into qualitatively different groups defined by a shared “goal × drug option” configuration. Our results indicate, first, that individuals' decisions about NE are ultimately based on personal attitudes to drug options (e.g., willingness to use an over-the-counter product but not to abuse prescription drugs) rather than motivated by the desire to achieve a specific goal (e.g., fighting tiredness) for which different drug options might be tried. Second, data analyses suggested two qualitatively different classes of users. 
Both predominantly used over-the-counter products, but “neuroenhancers” might be characterized by a higher propensity to instrumentalize over-the-counter products for virtually all investigated goals whereas “fatigue-fighters” might be inclined to use over-the-counter products exclusively to fight fatigue. We believe that psychological investigations like these are essential, especially for designing programs to prevent risky behavior. PMID:27582720

  6. Suicide ideation of individuals in online social networks.

    PubMed

    Masuda, Naoki; Kurahashi, Issei; Onari, Hiroko

    2013-01-01

    Suicide accounts for the largest number of deaths among Japanese in their twenties and thirties. Suicide is also a major cause of death for young people in many other countries. Although social isolation has been implicated in the tendency toward suicidal behavior, the impact of social isolation on suicide in the context of explicit social networks of individuals is scarcely explored. To address this question, we examined a large data set obtained from a social networking service dominant in Japan. The social network is composed of a set of friendship ties between pairs of users created by mutual endorsement. We carried out logistic regression to identify users' characteristics, both related and unrelated to social networks, which contribute to suicide ideation. We defined suicide ideation of a user as membership in at least one active user-defined community related to suicide. We found that the number of communities to which a user belongs, the intransitivity (i.e., paucity of triangles including the user), and the fraction of suicidal neighbors in the social network contributed the most to suicide ideation, in this order. Other characteristics including age and gender contributed little to suicide ideation. We also found qualitatively the same results for depressive symptoms.

  7. Elevation Difference and Bouguer Anomaly Analysis Tool (EDBAAT) User's Guide

    USGS Publications Warehouse

    Smittle, Aaron M.; Shoberg, Thomas G.

    2017-06-16

    This report describes a software tool that imports gravity anomaly point data from the Gravity Database of the United States (GDUS) of the National Geospatial-Intelligence Agency and University of Texas at El Paso along with elevation data from The National Map (TNM) of the U.S. Geological Survey that lie within a user-specified geographic area of interest. Further, the tool integrates these two sets of data spatially and analyzes the consistency of the elevation of each gravity station from the GDUS with TNM elevation data; it also evaluates the consistency of gravity anomaly data within the GDUS data repository. The tool bins the GDUS data based on user-defined criteria of elevation misfit between the GDUS and TNM elevation data. It also provides users with a list of points from the GDUS data that have Bouguer anomaly values considered outliers (two standard deviations or greater) with respect to other nearby GDUS anomaly data; “nearby” can be defined by the user at the time of execution. These outputs should allow users to quickly and efficiently choose which points from the GDUS would be most useful in reconnaissance studies or in augmenting and extending the range of individual gravity studies.
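
    The outlier screen described above can be sketched in a few lines (a simplified illustration, not the tool's actual implementation; the data layout and brute-force neighbor search are assumptions):

```python
from statistics import mean, stdev

def flag_outliers(stations, radius, k=2.0):
    """Flag stations whose Bouguer anomaly deviates from the mean of
    nearby stations (within `radius`) by k standard deviations or more.
    `stations` is a list of (x, y, anomaly) tuples."""
    flagged = []
    for i, (x, y, g) in enumerate(stations):
        nearby = [g2 for j, (x2, y2, g2) in enumerate(stations)
                  if j != i and (x - x2) ** 2 + (y - y2) ** 2 <= radius ** 2]
        if len(nearby) < 2:
            continue  # too few neighbours to estimate a spread
        m, s = mean(nearby), stdev(nearby)
        if s > 0 and abs(g - m) >= k * s:
            flagged.append(i)
    return flagged

pts = [(0, 0, 10.0), (1, 0, 10.5), (0, 1, 9.5), (1, 1, 50.0)]
print(flag_outliers(pts, radius=2.0))  # -> [3]
```

    The `radius` argument plays the role of the user-definable "nearby" criterion.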

  8. Using Multicriteria Analysis in Issues Concerning Adaptation of Historic Facilities for the Needs of Public Utility Buildings with a Function of a Theatre

    NASA Astrophysics Data System (ADS)

    Obracaj, Piotr; Fabianowski, Dariusz

    2017-10-01

    Adapting historic facilities for public utility purposes requires resolving many complex, often conflicting expectations of future users. This mainly concerns the function, which includes construction, technology and aesthetic issues. Added to these is the proper protection of historic value, different in each case. The procedure leading to the expected solution is a multicriteria one, usually difficult to define precisely and demanding considerable design experience. An innovative approach has been used for the analysis, namely the modified EA FAHP (Extent Analysis Fuzzy Analytic Hierarchy Process) Chang’s method of multicriteria analysis for the assessment of complex functional and spatial issues. Selection of the optimal spatial form of an adapted historic building intended for a multi-functional public utility facility was analysed. The assumed functional flexibility covers education, conferences, and chamber performances such as drama and concerts in different stage-audience layouts.

  9. Space Transportation System (STS) propellant scavenging system study. Volume 1: Technical report

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The objectives are to define the most efficient and cost effective methods for scavenging cryogenic and storable propellants and then define the requirements for these scavenging systems. For cryogenic propellants, scavenging is the transfer of propellants from the Shuttle orbiter external tank (ET) and/or main propulsion subsystems (MPS) propellant lines into storage tanks located in the orbiter payload bay for delivery to the user station by a space based transfer stage or the Space Transportation System (STS) by direct insertion. For storable propellants, scavenging is the direct transfer from the orbital maneuvering subsystem (OMS) and/or tankage in the payload bay to users in LEO as well as users in the vicinity of the Space Station.

  10. On-line data display

    NASA Astrophysics Data System (ADS)

    Lang, Sherman Y. T.; Brooks, Martin; Gauthier, Marc; Wein, Marceli

    1993-05-01

    A data display system for embedded realtime systems has been developed for use as an operator's user interface and debugging tool. The motivation for development of the On-Line Data Display (ODD) has come from several sources. In particular the design reflects the needs of researchers developing an experimental mobile robot within our laboratory. A proliferation of specialized user interfaces revealed a need for a flexible communications and graphical data display system. At the same time the system had to be readily extensible for arbitrary graphical display formats which would be required for data visualization needs of the researchers. The system defines a communication protocol transmitting 'datagrams' between tasks executing on the realtime system and virtual devices displaying the data in a meaningful way on a graphical workstation. The communication protocol multiplexes logical channels on a single data stream. The current implementation consists of a server for the Harmony realtime operating system and an application written for the Macintosh computer. Flexibility requirements resulted in a highly modular server design, and a layered modular object-oriented design for the Macintosh part of the system. Users assign data types to specific channels at run time. Then devices are instantiated by the user and connected to channels to receive datagrams. The current suite of device types does not provide enough functionality for most users' specialized needs. Instead the system design allows the creation of new device types with modest programming effort. The protocol, design and use of the system are discussed.
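
    The idea of multiplexing logical channels of datagrams onto a single stream can be illustrated with a minimal framing scheme (purely illustrative; the abstract does not document ODD's wire format, so the header layout here is an assumption):

```python
import struct

def pack_datagram(channel, payload):
    """Frame one datagram: 2-byte channel id, 4-byte payload length,
    then the payload bytes (big-endian header)."""
    return struct.pack(">HI", channel, len(payload)) + payload

def unpack_stream(data):
    """Demultiplex a byte stream back into (channel, payload) datagrams."""
    out, off = [], 0
    while off < len(data):
        channel, length = struct.unpack_from(">HI", data, off)
        off += 6  # header size: 2 + 4 bytes
        out.append((channel, data[off:off + length]))
        off += length
    return out

# Two logical channels interleaved on one stream.
stream = pack_datagram(1, b"velocity=1.2") + pack_datagram(2, b"pose")
print(unpack_stream(stream))  # -> [(1, b'velocity=1.2'), (2, b'pose')]
```

    On the receiving side, each channel id would be routed to whatever virtual display device the user has connected to that channel.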

  11. An Analysis of the Use of Medical Applications Required for Complex Humanitarian Disasters and Emergencies via Hastily Formed Networks (HFN) in the Field

    DTIC Science & Technology

    2005-09-01

    define criteria for deployment in cooperation with HFN and explore concepts of operations. The end product of this research will serve as a baseline...When choosing an EMR for a humanitarian mission, the user must be careful not only to choose a product for its ease of use or its functionality as it...January 2000, SSL was not allowed to be shipped outside of the U.S. with 128-bit encryption due to the fact that any software product that contained strong

  12. Develop 3G Application with The J2ME SATSA API

    NASA Astrophysics Data System (ADS)

    JunWu, Xu; JunLing, Liang

    This paper describes research in the use of the Security and Trust Services API for J2ME (SATSA) to develop mobile applications for 3G networks. SATSA defines a set of APIs that allows J2ME applications to communicate with and access functionality, secure storage and cryptographic operations provided by security elements such as smart cards and Wireless Identification Modules (WIM). A Java Card application could also work as an authentication module in a J2ME-based e-bank application. The e-bank application would allow its users to access their bank accounts using their cell phones.

  13. High throughput reconfigurable data analysis system

    NASA Technical Reports Server (NTRS)

    Bearman, Greg (Inventor); Pelletier, Michael J. (Inventor); Seshadri, Suresh (Inventor); Pain, Bedabrata (Inventor)

    2008-01-01

    The present invention relates to a system and method for performing rapid and programmable analysis of data. The present invention relates to a reconfigurable detector comprising at least one array of a plurality of pixels, where each of the plurality of pixels can be selected to receive and read-out an input. The pixel array is divided into at least one pixel group for conducting a common predefined analysis. Each of the pixels has a programmable circuitry programmed with a dynamically configurable user-defined function to modify the input. The present detector also comprises a summing circuit designed to sum the modified input.

  14. Large Field Visualization with Demand-Driven Calculation

    NASA Technical Reports Server (NTRS)

    Moran, Patrick J.; Henze, Chris

    1999-01-01

    We present a system designed for the interactive definition and visualization of fields derived from large data sets: the Demand-Driven Visualizer (DDV). The system allows the user to write arbitrary expressions to define new fields, and then apply a variety of visualization techniques to the result. Expressions can include differential operators and numerous other built-in functions, all of which are evaluated at specific field locations completely on demand. The payoff of following a demand-driven design philosophy throughout becomes particularly evident when working with large time-series data, where the costs of eager evaluation alternatives can be prohibitive.
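
    The demand-driven strategy, computing and caching base-field samples only at the locations a derived expression actually touches, can be sketched as follows (illustrative only; DDV's own field and operator types are assumptions here):

```python
from functools import lru_cache

reads = {"n": 0}  # counts how many base-field samples were actually computed

@lru_cache(maxsize=None)
def base_field(i, j):
    """Stand-in for an expensive read/compute from a large data set;
    memoized so each location is produced at most once."""
    reads["n"] += 1
    return 0.5 * i + float(j)

def derivative_x(i, j):
    """A user-defined derived field (central difference in x).  Base-field
    values are pulled lazily, only where this expression demands them."""
    return (base_field(i + 1, j) - base_field(i - 1, j)) / 2.0

print(derivative_x(1, 0))  # -> 0.5, after computing just two base samples
```

    An eager design would instead materialize the whole base field up front, which is exactly the cost the abstract argues becomes prohibitive for large time-series data.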

  15. The NASA Integrated Information Technology Architecture

    NASA Technical Reports Server (NTRS)

    Baldridge, Tim

    1997-01-01

    This document defines an Information Technology Architecture for the National Aeronautics and Space Administration (NASA), where Information Technology (IT) refers to the hardware, software, standards, protocols and processes that enable the creation, manipulation, storage, organization and sharing of information. An architecture provides an itemization and definition of these IT structures, a view of the relationship of the structures to each other and, most importantly, an accessible view of the whole. It is a fundamental assumption of this document that a useful, interoperable and affordable IT environment is key to the execution of the core NASA scientific and project competencies and business practices. This Architecture represents the highest level system design and guideline for NASA IT related activities and has been created on the authority of the NASA Chief Information Officer (CIO) and will be maintained under the auspices of that office. It addresses all aspects of general purpose, research, administrative and scientific computing and networking throughout the NASA Agency and is applicable to all NASA administrative offices, projects, field centers and remote sites. Through the establishment of five Objectives and six Principles this Architecture provides a blueprint for all NASA IT service providers: civil service, contractor and outsourcer. The most significant of the Objectives and Principles are the commitment to customer-driven IT implementations and the commitment to a simpler, cost-efficient, standards-based, modular IT infrastructure. In order to ensure that the Architecture is presented and defined in the context of the mission, project and business goals of NASA, this Architecture consists of four layers in which each subsequent layer builds on the previous layer. 
    They are: 1) the Business Architecture: the operational functions of the business, or Enterprise, 2) the Systems Architecture: the specific Enterprise activities within the context of IT systems, 3) the Technical Architecture: a common, vendor-independent framework for design, integration and implementation of IT systems and 4) the Product Architecture: vendor-specific IT solutions. The Systems Architecture is effectively a description of the end-user "requirements". Generalized end-user requirements are discussed and subsequently organized into specific mission and project functions. The Technical Architecture depicts the framework, and relationship, of the specific IT components that enable the end-user functionality as described in the Systems Architecture. The primary components as described in the Technical Architecture are: 1) Applications: Basic Client Component, Object Creation Applications, Collaborative Applications, Object Analysis Applications, 2) Services: Messaging, Information Broker, Collaboration, Distributed Processing, and 3) Infrastructure: Network, Security, Directory, Certificate Management, Enterprise Management and File System. This Architecture also provides specific Implementation Recommendations, the most significant of which is the recognition of IT as core to NASA activities and defines a plan, which is aligned with the NASA strategic planning processes, for keeping the Architecture alive and useful.

  16. Vibration Pattern Imager (VPI): A control and data acquisition system for scanning laser vibrometers

    NASA Technical Reports Server (NTRS)

    Rizzi, Stephen A.; Brown, Donald E.; Shaffer, Thomas A.

    1993-01-01

    The Vibration Pattern Imager (VPI) system was designed to control and acquire data from scanning laser vibrometer sensors. The PC-based system uses a digital signal processing (DSP) board and an analog I/O board to control the sensor and to process the data. The VPI system was originally developed for use with the Ometron VPI Sensor, but can be readily adapted to any commercially available sensor which provides an analog output signal and requires analog inputs for control of mirror positioning. The sensor itself is not part of the VPI system. A graphical interface program, which runs on a PC under the MS-DOS operating system, functions in an interactive mode and communicates with the DSP and I/O boards in a user-friendly fashion through the aid of pop-up menus. Two types of data may be acquired with the VPI system: single point or 'full field.' In the single point mode, time series data is sampled by the A/D converter on the I/O board (at a user-defined sampling rate for a selectable number of samples) and is stored by the PC. The position of the measuring point (adjusted by mirrors in the sensor) is controlled via a mouse input. The mouse input is translated to output voltages by the D/A converter on the I/O board to control the mirror servos. In the 'full field' mode, the measurement point is moved over a user-selectable rectangular area. The time series data is sampled by the A/D converter on the I/O board (at a user-defined sampling rate for a selectable number of samples) and converted to a root-mean-square (rms) value by the DSP board. The rms 'full field' velocity distribution is then uploaded for display and storage on the PC.
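
    The per-point reduction performed in 'full field' mode, collapsing each sampled time series to a single rms velocity, is simply (a sketch; sampling and DSP-board details are abstracted away):

```python
import math

def rms(samples):
    """Root-mean-square of one grid point's sampled velocity time series,
    i.e. the per-point reduction performed in 'full field' mode."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

print(rms([3.0, -3.0, 3.0, -3.0]))  # -> 3.0
```

    Repeating this over the user-selected rectangular grid yields the rms velocity distribution that is uploaded to the PC.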

  17. Factors influencing the postoperative use of analgesics in dogs and cats by Canadian veterinarians.

    PubMed

    Dohoo, S E; Dohoo, I R

    1996-09-01

    Four hundred and seventeen Canadian veterinarians were surveyed to determine their postoperative use of analgesics in dogs and cats following 6 categories of surgeries, and their opinion toward pain perception and perceived complications associated with the postoperative use of potent opioid analgesics. Three hundred and seventeen (76%) returned the questionnaire. An analgesic user was defined as a veterinarian who administers analgesics to at least 50% of dogs or 50% of cats following abdominal surgery, excluding ovariohysterectomy. The veterinarians responding exhibited a bimodal distribution of analgesic use, with 49.5% being defined as analgesic users. These veterinarians tended to use analgesics in 100% of animals following abdominal surgery. Veterinarians defined as analgesic nonusers rarely used postoperative analgesics following any abdominal surgery. Pain perception was defined as the average of pain rankings (on a scale of 1 to 10) following abdominal surgery, or the value for dogs or cats if the veterinarian worked with only 1 of the 2 species. Maximum concern about the risks associated with the postoperative use of potent opioid agonists was defined as the highest ranking assigned to any of the 7 risks evaluated in either dogs or cats. Logistic regression analysis identified the pain perception score and the maximum concern regarding the use of potent opioid agonists in the postoperative period as the 2 factors that distinguished analgesic users from analgesic nonusers. This model correctly classified 68% of veterinarians as analgesic users or nonusers. Linear regression analysis identified gender and the presence of an animal health technologist in the practice as the 2 factors that influenced pain perception by veterinarians. 
Linear regression analysis identified working with an animal health technologist, graduation within the past 10 years, and attendance at continuing education as factors that influenced maximum concern about the postoperative use of opioid agonists.

  18. Emergency Department Frequent Users for Acute Alcohol Intoxication.

    PubMed

    Klein, Lauren R; Martel, Marc L; Driver, Brian E; Reing, Mackenzie; Cole, Jon B

    2018-03-01

    A subset of frequent users of emergency services are those who use the emergency department (ED) for acute alcohol intoxication. This population and their ED encounters have not been previously described. This was a retrospective, observational, cohort study of patients presenting to the ED for acute alcohol intoxication between 2012 and 2016. We collected all data from the electronic medical record. Frequent users for alcohol intoxication were defined as those with greater than 20 visits for acute intoxication without additional medical chief complaints in the previous 12 months. We used descriptive statistics to evaluate characteristics of frequent users for alcohol intoxication, as well as their ED encounters. We identified 32,121 patient encounters. Of those, 325 patients were defined as frequent users for alcohol intoxication, comprising 11,370 of the encounters during the study period. The median maximum number of encounters per person for alcohol intoxication in a one-year period was 47 encounters (range 20 to 169). Frequent users were older (47 years vs. 39 years), and more commonly male (86% vs. 71%). Frequent users for alcohol intoxication had higher rates of medical and psychiatric comorbidities including liver disease, chronic kidney disease, ischemic vascular disease, dementia, chronic obstructive pulmonary disease, history of traumatic brain injury, schizophrenia, and bipolar disorder. In this study, we identified a group of ED frequent users who use the ED for acute alcohol intoxication. This population had higher rates of medical and psychiatric comorbidities compared to non-frequent users.
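
    The study's frequent-user criterion, more than 20 intoxication-only visits in the previous 12 months, can be expressed directly (an illustrative sketch; the function name and 365-day window convention are assumptions):

```python
from datetime import date, timedelta

def is_frequent_user(visit_dates, as_of, threshold=20):
    """Apply the study definition: more than `threshold` intoxication-only
    ED visits in the 12 months preceding `as_of`."""
    window_start = as_of - timedelta(days=365)
    return sum(window_start <= d <= as_of for d in visit_dates) > threshold

# 21 weekly visits within the window meets the definition.
visits = [date(2016, 1, 1) + timedelta(days=7 * i) for i in range(21)]
print(is_frequent_user(visits, date(2016, 12, 31)))  # -> True
```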

  19. KODAMA and VPC based Framework for Ubiquitous Systems and its Experiment

    NASA Astrophysics Data System (ADS)

    Takahashi, Kenichi; Amamiya, Satoshi; Iwao, Tadashige; Zhong, Guoqiang; Kainuma, Tatsuya; Amamiya, Makoto

    Recently, agent technologies have attracted a lot of interest as an emerging programming paradigm. With such agent technologies, services are provided through collaboration among agents. At the same time, the spread of mobile technologies and communication infrastructures has made it possible to access the network anytime and from anywhere. Using agents and mobile technologies to realize ubiquitous computing systems, we propose a new framework based on KODAMA and VPC. KODAMA provides distributed management mechanisms by using the concept of community and communication infrastructure to deliver messages among agents without agents being aware of the physical network. VPC provides a method of defining peer-to-peer services based on agent communication with policy packages. By merging the characteristics of both KODAMA and VPC functions, we propose a new framework for ubiquitous computing environments. It provides distributed management functions according to the concept of agent communities, agent communications which are abstracted from the physical environment, and agent collaboration with policy packages. Using our new framework, we conducted a large-scale experiment in shopping malls in Nagoya, which sent advertisement e-mails to users' cellular phones according to user location and attributes. The empirical results showed that our new framework worked effectively for sales in shopping malls.

  20. Precise Network Modeling of Systems Genetics Data Using the Bayesian Network Webserver.

    PubMed

    Ziebarth, Jesse D; Cui, Yan

    2017-01-01

    The Bayesian Network Webserver (BNW, http://compbio.uthsc.edu/BNW ) is an integrated platform for Bayesian network modeling of biological datasets. It provides a web-based network modeling environment that seamlessly integrates advanced algorithms for probabilistic causal modeling and reasoning with Bayesian networks. BNW is designed for precise modeling of relatively small networks that contain less than 20 nodes. The structure learning algorithms used by BNW guarantee the discovery of the best (most probable) network structure given the data. To facilitate network modeling across multiple biological levels, BNW provides a very flexible interface that allows users to assign network nodes into different tiers and define the relationships between and within the tiers. This function is particularly useful for modeling systems genetics datasets that often consist of multiscalar heterogeneous genotype-to-phenotype data. BNW enables users to, within seconds or minutes, go from having a simply formatted input file containing a dataset to using a network model to make predictions about the interactions between variables and the potential effects of experimental interventions. In this chapter, we will introduce the functions of BNW and show how to model systems genetics datasets with BNW.
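
    The tier constraint described above, nodes in a later tier may not be parents of nodes in an earlier tier, amounts to an edge blacklist handed to the structure-learning step. A minimal sketch (BNW's internal representation is not documented here, so this encoding is an assumption):

```python
def tier_blacklist(tiers):
    """Given tiers ordered upstream-to-downstream (e.g. genotype before
    phenotype), list every forbidden directed edge: one pointing from a
    node in a later tier back to a node in an earlier tier."""
    banned = []
    for earlier in range(len(tiers)):
        for later in range(earlier + 1, len(tiers)):
            for src in tiers[later]:
                for dst in tiers[earlier]:
                    banned.append((src, dst))
    return banned

# Genotype -> expression -> trait: no edge may point back upstream.
print(tier_blacklist([["snp"], ["expr"], ["trait"]]))
# -> [('expr', 'snp'), ('trait', 'snp'), ('trait', 'expr')]
```

    This is what makes tiers useful for systems genetics data: causal direction from genotype to phenotype is enforced rather than learned.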

  1. OrthoVenn: a web server for genome wide comparison and annotation of orthologous clusters across multiple species.

    PubMed

    Wang, Yi; Coleman-Derr, Devin; Chen, Guoping; Gu, Yong Q

    2015-07-01

    Genome wide analysis of orthologous clusters is an important component of comparative genomics studies. Identifying the overlap among orthologous clusters can enable us to elucidate the function and evolution of proteins across multiple species. Here, we report a web platform named OrthoVenn that is useful for genome wide comparisons and visualization of orthologous clusters. OrthoVenn provides coverage of vertebrates, metazoa, protists, fungi, plants and bacteria for the comparison of orthologous clusters and also supports uploading of customized protein sequences from user-defined species. An interactive Venn diagram, summary counts, and functional summaries of the disjunction and intersection of clusters shared between species are displayed as part of the OrthoVenn result. OrthoVenn also includes in-depth views of the clusters using various sequence analysis tools. Furthermore, OrthoVenn identifies orthologous clusters of single copy genes and allows for a customized search of clusters of specific genes through key words or BLAST. OrthoVenn is an efficient and user-friendly web server freely accessible at http://probes.pw.usda.gov/OrthoVenn or http://aegilops.wheat.ucdavis.edu/OrthoVenn. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  2. Experiments to evolve toward a tangible user interface for computer-aided design parts assembly

    NASA Astrophysics Data System (ADS)

    Legardeur, Jeremy; Garreau, Ludovic; Couture, Nadine

    2004-05-01

    In this paper, we present the concepts of the ESKUA (Experimentation of a Kinesics System Usable for Assembly) platform that allows designers to carry out the assembly of mechanical CAD (Computer Aided Design) parts. This platform, based on a tangible user interface, allows assembly constraints to be taken into account from the beginning of the design phase, especially during the manipulation of CAD models. Our goal is to propose a working environment where the designer is confronted with real assembly constraints which are currently masked by existing CAD software functionalities. Thus, the platform is based on the handling of physical objects, called tangible interactors, which enable a physical perception of the assembly constraints. To this end, we have defined a typology of interactors based on concepts proposed in Design for Assembly methods. We present here the results of studies that led to the evolution of this first set of interactors. One concerns an experiment evaluating the cognitive aspects of interactor use. The other concerns an analysis of existing mechanical products and fasteners. We show how these studies led to the evolution of the interactors based on the use of functional surfaces.

  3. Automated Concurrent Blackboard System Generation in C++

    NASA Technical Reports Server (NTRS)

    Kaplan, J. A.; McManus, J. W.; Bynum, W. L.

    1999-01-01

    In his 1992 Ph.D. thesis, "Design and Analysis Techniques for Concurrent Blackboard Systems", John McManus defined several performance metrics for concurrent blackboard systems and developed a suite of tools for creating and analyzing such systems. These tools allow a user to analyze a concurrent blackboard system design and predict the performance of the system before any code is written. The design can be modified until simulated performance is satisfactory. Then, the code generator can be invoked to generate automatically all of the code required for the concurrent blackboard system except for the code implementing the functionality of each knowledge source. We have completed the port of the source code generator and a simulator for a concurrent blackboard system. The source code generator generates the necessary C++ source code to implement the concurrent blackboard system using Parallel Virtual Machine (PVM) running on a heterogeneous network of UNIX(trademark) workstations. The concurrent blackboard simulator uses the blackboard specification file to predict the performance of the concurrent blackboard design. The only part of the source code for the concurrent blackboard system that the user must supply is the code implementing the functionality of the knowledge sources.

  4. Effect of statins on intracerebral hemorrhage outcome and recurrence.

    PubMed

    FitzMaurice, Emilie; Wendell, Lauren; Snider, Ryan; Schwab, Kristin; Chanderraj, Rishi; Kinnecom, Cathrine; Nandigam, Kaveer; Rost, Natalia S; Viswanathan, Anand; Rosand, Jonathan; Greenberg, Steven M; Smith, Eric E

    2008-07-01

    3-Hydroxy-3-methylglutaryl coenzyme A (HMG-CoA) reductase inhibitors, or statins, have been associated with improved outcome after ischemic stroke and subarachnoid hemorrhage but an increased risk of incident intracerebral hemorrhage (ICH). We investigated (1) whether statin use before ICH was associated with functional independence at 90 days, and (2) whether survivors exposed to statins after ICH had an increased risk of recurrence. We analyzed 629 consecutive ICH patients with 90-day outcome data enrolled in a prospective cohort study between 1998 and 2005. Statin use was determined by patient interview at the time of ICH and supplemented by medical record review. Independent status was defined as Glasgow Outcome Scale 4 or 5. ICH survivors were followed by telephone interview every 6 months. Statins were used by 149/629 (24%) before ICH. There was no effect of pre-ICH statin use on the rates of functional independence (28% versus 29%, P=0.84) or mortality (46% versus 45%, P=0.93). Medical comorbidities and warfarin use were more common in statin users. Hematoma volumes were similar (median 28 cm(3) in pre-ICH statin users compared to 22 cm(3) in nonusers, P=0.18). The multivariable-adjusted odds ratio for independent status in pre-ICH statin users was 1.16 (95% CI 0.65 to 2.10, P=0.62). ICH survivors treated with statins after discharge did not have a higher risk of recurrence (adjusted HR 0.82, 95% CI 0.34 to 1.99, P=0.66). Pre-ICH statin use is not associated with improved ICH functional outcome or mortality. Post-ICH statin use is not associated with an increased risk of ICH recurrence.

  5. CosmoCalc: An Excel add-in for cosmogenic nuclide calculations

    NASA Astrophysics Data System (ADS)

    Vermeesch, Pieter

    2007-08-01

    As dating methods using Terrestrial Cosmogenic Nuclides (TCN) become more popular, the need arises for general-purpose, easy-to-use data reduction software. The CosmoCalc Excel add-in calculates TCN production rate scaling factors (using Lal, Stone, Dunai, and Desilets methods); topographic, snow, and self-shielding factors; and exposure ages, erosion rates, and burial ages and visualizes the results on banana-style plots. It uses an internally consistent TCN production equation that is based on the quadruple exponential approach of Granger and Smith (2000). CosmoCalc was designed to be as user-friendly as possible. Although the user interface is extremely simple, the program is also very flexible, and nearly all default parameter values can be changed. To facilitate the comparison of different scaling factors, a set of converter tools is provided, allowing the user to easily convert cut-off rigidities to magnetic inclinations, elevations to atmospheric depths, and so forth. Because it is important to use a consistent set of scaling factors for the sample measurements and the production rate calibration sites, CosmoCalc defines the production rates implicitly, as a function of the original TCN concentrations of the calibration site. The program is best suited for 10Be, 26Al, 3He, and 21Ne calculations, although basic functionality for 36Cl and 14C is also provided. CosmoCalc can be downloaded along with a set of test data from http://cosmocalc.googlepages.com.
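
    One of the converter tools mentioned, elevation to atmospheric depth, is commonly implemented with the standard-atmosphere pressure equation used in TCN scaling (e.g. Stone, 2000). The sketch below shows that standard formula, not necessarily CosmoCalc's exact implementation:

```python
import math

def atmospheric_depth(elevation_m):
    """Standard-atmosphere conversion of elevation (m) to atmospheric
    depth (g/cm^2): sea-level pressure 1013.25 hPa, sea-level temperature
    288.15 K, lapse rate 0.0065 K/m; 1 hPa corresponds to ~1.019716 g/cm^2
    of overburden."""
    p_hpa = 1013.25 * math.exp(
        (-0.03417 / 0.0065)
        * (math.log(288.15) - math.log(288.15 - 0.0065 * elevation_m))
    )
    return p_hpa * 1.019716

print(round(atmospheric_depth(0.0), 1))  # -> 1033.2 g/cm^2 at sea level
```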

  6. A CZT-based blood counter for quantitative molecular imaging.

    PubMed

    Espagnet, Romain; Frezza, Andrea; Martin, Jean-Pierre; Hamel, Louis-André; Lechippey, Laëtitia; Beauregard, Jean-Mathieu; Després, Philippe

    2017-12-01

    Robust quantitative analysis in positron emission tomography (PET) and in single-photon emission computed tomography (SPECT) typically requires the time-activity curve as an input function for the pharmacokinetic modeling of tracer uptake. For this purpose, a new automated tool for the determination of blood activity as a function of time is presented. The device, compact enough to be used on the patient bed, relies on a peristaltic pump for continuous blood withdrawal at user-defined rates. Gamma detection is based on a 20 × 20 × 15 mm³ cadmium zinc telluride (CZT) detector, read by custom-made electronics and a field-programmable gate array-based signal processing unit. A graphical user interface (GUI) allows users to select parameters and easily perform acquisitions. This paper presents the overall design of the device as well as the results related to the detector performance in terms of stability, sensitivity and energy resolution. Results from a patient study are also reported. The device achieved a sensitivity of 7.1 cps/(kBq/mL) and a minimum detectable activity of 2.5 kBq/mL for 18F. The gamma counter also demonstrated excellent stability, with a deviation in count rates below 0.05% over 6 h. An energy resolution of 8% was achieved at 662 keV. The patient study was conclusive and demonstrated that the compact gamma blood counter developed has the sensitivity and the stability required to conduct quantitative molecular imaging studies in PET and SPECT.
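    Given the reported sensitivity, converting a measured count rate back into a blood activity concentration is a one-line division. A sketch using the paper's 18F figure; it assumes background-subtracted counts and the same counting geometry as the calibration, and the function name is ours, not the device's:

```python
def activity_concentration(count_rate_cps, sensitivity=7.1):
    """Blood activity concentration (kBq/mL) from a net count rate (cps),
    using the reported 18F sensitivity of 7.1 cps per kBq/mL."""
    return count_rate_cps / sensitivity

a = activity_concentration(35.5)   # -> 5.0 kBq/mL
```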

  7. Space Segment (SS) and the Navigation User Segment (US) Interface Control Document (ICD)

    DOT National Transportation Integrated Search

    1993-10-10

    This Interface Control Document (ICD) defines the requirements related to the interface between the Space Segment (SS) of the Global Positioning System (GPS) and the Navigation User Segment of the GPS. 2880k, 154p.

  8. Impact of Truck Loading on Design and Analysis of Asphaltic Pavement Structures : Phase II

    DOT National Transportation Integrated Search

    2011-02-01

    In this study, Schapery's nonlinear viscoelastic constitutive model is implemented into the commercial finite element (FE) software ABAQUS via a user-defined subroutine (user material, or UMAT) to analyze asphalt pavement subjected to heavy truck loa...
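    The study's ABAQUS UMAT is not reproduced here. As a rough illustration of the hereditary-integral machinery such a model rests on, the following sketch evaluates only the linear special case (all of Schapery's nonlinearizing functions g0, g1, g2 set to 1) with a one-term Prony creep compliance; function and parameter names are illustrative.

```python
import math

def creep_strain(times, stress, D0, D1, tau1):
    """Linear viscoelastic hereditary integral, a special case of Schapery's
    model with g0 = g1 = g2 = 1:
        eps(t) = D0*sigma(t) + sum_k dD(t - t_k) * (sigma_k - sigma_{k-1})
    where dD(t) = D1*(1 - exp(-t/tau1)) is a one-term Prony creep compliance
    and increments are evaluated at interval midpoints. Illustrative only;
    not the ABAQUS UMAT from the study."""
    eps = []
    for n, t in enumerate(times):
        e = D0 * stress[n]                              # instantaneous part
        for k in range(1, n + 1):
            tm = t - 0.5 * (times[k] + times[k - 1])    # elapsed since increment
            e += D1 * (1.0 - math.exp(-tm / tau1)) * (stress[k] - stress[k - 1])
        eps.append(e)
    return eps
```

    For a unit stress step, the strain relaxes toward D0 + D1, the long-time creep compliance.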

  9. An Inter-Personal Information Sharing Model Based on Personalized Recommendations

    NASA Astrophysics Data System (ADS)

    Kamei, Koji; Funakoshi, Kaname; Akahani, Jun-Ichi; Satoh, Tetsuji

    In this paper, we propose an inter-personal information sharing model among individuals based on personalized recommendations. In the proposed model, we define an information resource as shared between people when both of them consider it important --- not merely when they both possess it. In other words, the model defines the importance of information resources based on personalized recommendations from identifiable acquaintances. The proposed method is based on a collaborative filtering system that focuses on evaluations from identifiable acquaintances. It utilizes both user evaluations for documents and their contents. Specifically, each user profile is represented as a matrix of credibility to the other users' evaluations on each domain of interests. We extended the content-based collaborative filtering method to distinguish other users to whom the documents should be recommended. We also applied a concept-based vector space model to represent the domain of interests instead of the previous method which represented them by a term-based vector space model. We introduce a personalized concept-base compiled from each user's information repository to improve the information retrieval in the user's environment. Furthermore, the concept-spaces change from user to user since they reflect the personalities of the users. Because of different concept-spaces, the similarity between a document and a user's interest varies for each user. As a result, a user receives recommendations from other users who have different view points, achieving inter-personal information sharing based on personalized recommendations. This paper also describes an experimental simulation of our information sharing model. In our laboratory, five participants accumulated a personal repository of e-mails and web pages from which they built their own concept-base. Then we estimated the user profiles according to personalized concept-bases and sets of documents which others evaluated.
We simulated inter-personal recommendation based on the user profiles and evaluated the performance of the recommendation method by comparing the recommended documents to the result of the content-based collaborative filtering.
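    The combination of content similarity and acquaintance credibility described above can be sketched as follows. This is a schematic reading, not the paper's implementation: the per-domain credibility matrix is reduced to a single scalar per acquaintance, and all names are hypothetical.

```python
import math

def cosine(u, v):
    """Cosine similarity between two sparse concept vectors (dict: concept -> weight)."""
    dot = sum(u[k] * v.get(k, 0.0) for k in u)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def recommendation_score(doc_vec, interest_vec, acquaintance_ratings, credibility):
    """Combine the document's similarity to the receiver's own interests with
    ratings from identifiable acquaintances, each weighted by the receiver's
    credibility in that acquaintance."""
    content = cosine(doc_vec, interest_vec)
    social = sum(credibility.get(a, 0.0) * r for a, r in acquaintance_ratings.items())
    return content + social
```

    Because each user's concept vectors come from a personalized concept-base, the same document scores differently for different receivers, which is the behavior the model relies on.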

  10. Involving service users in trials: developing a standard operating procedure

    PubMed Central

    2013-01-01

    Background Many funding bodies require researchers to actively involve service users in research to improve relevance, accountability and quality. Current guidance to researchers mainly discusses general principles. Formal guidance about how to involve service users operationally in the conduct of trials is lacking. We aimed to develop a standard operating procedure (SOP) to support researchers to involve service users in trials and rigorous studies. Methods Researchers with experience of involving service users and service users who were contributing to trials collaborated with the West Wales Organisation for Rigorous Trials in Health, a registered clinical trials unit, to develop the SOP. Drafts were prepared in a Task and Finish Group, reviewed by all co-authors and amendments made. Results We articulated core principles, which defined equality of service users with all other research team members and collaborative processes underpinning the SOP, plus guidance on how to achieve these. We developed a framework for involving service users in research that defined minimum levels of collaboration plus additional consultation and decision-making opportunities. We recommended service users be involved throughout the life of a trial, including planning and development, data collection, analysis and dissemination, and listed tasks for collaboration. We listed people responsible for involving service users in studies and promoting an inclusive culture. We advocate actively involving service users as early as possible in the research process, with a minimum of two on all formal trial groups and committees. We propose that researchers protect at least 1% of their total research budget as a minimum resource to involve service users and allow enough time to facilitate active involvement. 
Conclusions This SOP provides guidance to researchers to involve service users successfully in developing and conducting clinical trials and creating a culture of actively involving service users in research at all stages. The UK Clinical Research Collaboration should encourage clinical trials units actively to involve service users and research funders should provide sufficient funds and time for this in research grants. PMID:23866730

  11. User-guided segmentation for volumetric retinal optical coherence tomography images

    PubMed Central

    Yin, Xin; Chao, Jennifer R.; Wang, Ruikang K.

    2014-01-01

    Despite the existence of automatic segmentation techniques, trained graders still rely on manual segmentation to provide retinal layers and features from clinical optical coherence tomography (OCT) images for accurate measurements. To bridge the gap between this time-consuming reliance on manual segmentation and currently available automatic segmentation techniques, this paper proposes a user-guided segmentation method to perform the segmentation of retinal layers and features in OCT images. With this method, by interactively navigating three-dimensional (3-D) OCT images, the user first manually draws user-defined (or sketched) lines at regions where the retinal layers appear very irregular and where the automatic segmentation method often fails to provide satisfactory results. The algorithm is then guided by these sketched lines to trace the entire 3-D retinal layer and anatomical features, using novel layer and edge detectors that are based on robust likelihood estimation. The layer and edge boundaries are finally obtained to achieve segmentation. Segmentation of retinal layers in mouse and human OCT images demonstrates the reliability and efficiency of the proposed user-guided segmentation method. PMID:25147962
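    The paper's detectors are likelihood-based and 3-D; as a toy stand-in for the general idea of tracing a boundary outward from a user-supplied seed, the following sketch runs dynamic programming over a 2-D edge-cost image, allowing one pixel of vertical drift per column. Names and the cost model are ours, not the paper's.

```python
def trace_boundary(cost, seed_row):
    """Trace a layer boundary left-to-right across a 2-D cost image (low cost =
    strong edge) by dynamic programming, starting from a user-chosen seed row
    in the first column and allowing +/-1 pixel vertical moves per column."""
    rows, cols = len(cost), len(cost[0])
    INF = float("inf")
    acc = [[INF] * cols for _ in range(rows)]     # accumulated path cost
    back = [[0] * cols for _ in range(rows)]      # backpointers
    acc[seed_row][0] = cost[seed_row][0]
    for c in range(1, cols):
        for r in range(rows):
            for p in (r - 1, r, r + 1):
                if 0 <= p < rows and acc[p][c - 1] + cost[r][c] < acc[r][c]:
                    acc[r][c] = acc[p][c - 1] + cost[r][c]
                    back[r][c] = p
    r = min(range(rows), key=lambda i: acc[i][-1])
    path = [r]
    for c in range(cols - 1, 0, -1):              # backtrack to the seed column
        r = back[r][c]
        path.append(r)
    return path[::-1]
```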

  12. User-guided segmentation for volumetric retinal optical coherence tomography images.

    PubMed

    Yin, Xin; Chao, Jennifer R; Wang, Ruikang K

    2014-08-01

    Despite the existence of automatic segmentation techniques, trained graders still rely on manual segmentation to provide retinal layers and features from clinical optical coherence tomography (OCT) images for accurate measurements. To bridge the gap between this time-consuming reliance on manual segmentation and currently available automatic segmentation techniques, this paper proposes a user-guided segmentation method to perform the segmentation of retinal layers and features in OCT images. With this method, by interactively navigating three-dimensional (3-D) OCT images, the user first manually draws user-defined (or sketched) lines at regions where the retinal layers appear very irregular and where the automatic segmentation method often fails to provide satisfactory results. The algorithm is then guided by these sketched lines to trace the entire 3-D retinal layer and anatomical features, using novel layer and edge detectors that are based on robust likelihood estimation. The layer and edge boundaries are finally obtained to achieve segmentation. Segmentation of retinal layers in mouse and human OCT images demonstrates the reliability and efficiency of the proposed user-guided segmentation method.

  13. Identifying the perceptive users for online social systems

    PubMed Central

    Liu, Xiao-Lu; Guo, Qiang; Han, Jing-Ti

    2017-01-01

    In this paper, we present the perceptive user: one who can identify high-quality objects early in their lifespan. By tracking the ratings given to the rewarded objects, we present a method to identify user perceptibility, defined as the capability of a user to identify these objects early in their lifespan. Moreover, we investigate the behavior patterns of the perceptive users along three dimensions: user activity, correlation characteristics of user rating series, and user reputation. The experimental results for the empirical networks indicate that high-perceptibility users show significantly different behavior patterns from the others: having larger degree, stronger correlation of rating series, and higher reputation. Furthermore, in view of the hysteresis in finding the rewarded objects, we present a general framework for identifying high-perceptibility users based on user behavior patterns. The experimental results show that this work is helpful for deeply understanding the collective behavior patterns of online users. PMID:28704382
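    A schematic reading of the perceptibility definition above: the fraction of a user's ratings of rewarded objects that fell within the object's early lifespan. The data layout and cutoff are assumptions for illustration, not the paper's exact formulation.

```python
def perceptibility(user_ratings, rewarded, early_cutoff):
    """Fraction of a user's ratings of rewarded (high-quality) objects given
    within the object's early lifespan. user_ratings: list of
    (object_id, rating_order), where rating_order is how many ratings the
    object had received when this user rated it."""
    hits = [o for o, order in user_ratings if o in rewarded and order <= early_cutoff]
    rated_rewarded = [o for o, _ in user_ratings if o in rewarded]
    return len(hits) / len(rated_rewarded) if rated_rewarded else 0.0
```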

  14. Identifying the perceptive users for online social systems.

    PubMed

    Liu, Jian-Guo; Liu, Xiao-Lu; Guo, Qiang; Han, Jing-Ti

    2017-01-01

    In this paper, we present the perceptive user: one who can identify high-quality objects early in their lifespan. By tracking the ratings given to the rewarded objects, we present a method to identify user perceptibility, defined as the capability of a user to identify these objects early in their lifespan. Moreover, we investigate the behavior patterns of the perceptive users along three dimensions: user activity, correlation characteristics of user rating series, and user reputation. The experimental results for the empirical networks indicate that high-perceptibility users show significantly different behavior patterns from the others: having larger degree, stronger correlation of rating series, and higher reputation. Furthermore, in view of the hysteresis in finding the rewarded objects, we present a general framework for identifying high-perceptibility users based on user behavior patterns. The experimental results show that this work is helpful for deeply understanding the collective behavior patterns of online users.

  15. Associations Between Microbiota, Mitochondrial Function, and Cognition in Chronic Marijuana Users.

    PubMed

    Panee, Jun; Gerschenson, Mariana; Chang, Linda

    2018-03-01

    Marijuana (MJ) use is associated with cognitive deficits. Mitochondrial (mt) dysfunction and gut dysbiosis also affect cognition. We examined whether cognition is related to peripheral blood mononuclear cells' (PBMCs) mt function and fecal microbiota in chronic MJ users. Nineteen chronic MJ users and 20 non-users were evaluated using the Cognition Battery in the NIH Toolbox; mt function (ATP production, basal and maximal respiration) was measured in PBMCs using the Seahorse XFe96 Analyzer; and the abundances of Prevotella and Bacteroides (associated with plant-based and animal product-based diets, respectively) were calculated from stool microbiota analysis. The average Prevotella:Bacteroides ratio was ~13-fold higher in non-users than users. Lifetime MJ use correlated inversely with the Prevotella:Bacteroides ratio (p = 0.05), mt function (p = 0.0027-0.0057), and Flanker Inhibitory Control and Attention (p = 0.041). Prevotella abundance correlated positively, while Bacteroides abundance correlated inversely, with mt function across all participants (p = 0.0004-0.06). Prevotella abundance also correlated positively with scores of Fluid Cognition, Flanker Inhibitory Control and Attention, List Sorting, and Dimension Change Card Sort in MJ users, but not in non-users (interaction-p = 0.018-0.05). Similarly, mt function correlated positively with scores of Fluid Cognition and Flanker Inhibitory Control and Attention in MJ users, but not in non-users (interaction-p = 0.0018-0.08). These preliminary findings suggest that MJ use is associated with alterations of gut microbiota and mt function, which may further contribute to cognitive deficits. We posit that MJ-associated low vegetable/fruit intake may contribute to these changes. Future studies are needed to delineate the relationships among diet, microbiota, mt function, and cognition in MJ users.

  16. Playing Games with Optimal Competitive Scheduling

    NASA Technical Reports Server (NTRS)

    Frank, Jeremy; Crawford, James; Khatib, Lina; Brafman, Ronen

    2005-01-01

    This paper is concerned with the problem of allocating a unit capacity resource to multiple users within a pre-defined time period. The resource is indivisible, so that at most one user can use it at any given time. However, different users may use it at different times. The users have independent, selfish preferences for when and for how long they are allocated this resource. Thus, they value different resource access durations differently, and they value different time slots differently. We seek an optimal allocation schedule for this resource.
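    The abstract does not specify the paper's scheduling mechanism. As a simple baseline for this kind of allocation, assuming each user submits fixed (start, end, value) requests, classical weighted interval scheduling picks a non-overlapping subset of maximal total value:

```python
import bisect

def best_schedule(requests):
    """Weighted interval scheduling: choose non-overlapping (start, end, value)
    requests for a unit-capacity resource so that total value is maximal.
    Back-to-back allocations (one request ending exactly when the next starts)
    are allowed. A baseline sketch, not the paper's method."""
    reqs = sorted(requests, key=lambda r: r[1])          # sort by end time
    ends = [r[1] for r in reqs]
    best = [0] * (len(reqs) + 1)                         # best[i]: optimum over first i
    for i, (s, e, v) in enumerate(reqs):
        j = bisect.bisect_right(ends, s, 0, i)           # last request ending <= s
        best[i + 1] = max(best[i], best[j] + v)          # skip vs. take request i
    return best[-1]

# e.g. best_schedule([(0, 3, 5), (2, 5, 6), (4, 7, 5)]) -> 10
```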

  17. The Use of Social Media Networks and Mobile Phone Applications for Reporting Suspicious and Criminal Activities to Mass Transit Law Enforcement Agencies

    DTIC Science & Technology

    2013-12-01

    The IACP defines social media as a category of Internet-based resources that integrate user-generated content and user participation, including Facebook and MySpace. Whether such reporting tools produce a reduction in crime is unresolved, warranting future study in this area.

  18. TERSSE: Definition of the Total Earth Resources System for the Shuttle Era. Volume 7: User Models: A System Assessment

    NASA Technical Reports Server (NTRS)

    1974-01-01

    User models, defined as any explicit process or procedure used to transform information extracted from remotely sensed data into a form useful as a resource management information input, are discussed. The role of the user models as information, technological, and operations interfaces between the TERSSE and the resource managers is emphasized. It is recommended that guidelines and management strategies be developed for a systems approach to user model development.

  19. Private Graphs - Access Rights on Graphs for Seamless Navigation

    NASA Astrophysics Data System (ADS)

    Dorner, W.; Hau, F.; Pagany, R.

    2016-06-01

    After the success of GNSS (Global Navigational Satellite Systems) and navigation services for public streets, indoor seems to be the next big development in navigational services, relying on RTLS - Real Time Locating Services (e.g. WIFI) and allowing seamless navigation. In contrast to navigation and routing services on public streets, seamless navigation will cause an additional challenge: how can routing data be made accessible to defined users, or access be restricted for defined areas or parts of the graph to a defined user group? The paper will present case studies and data from literature, where seamless and especially indoor navigation solutions are presented (hospitals, industrial complexes, building sites), but the problem of restricted access rights was touched on only from a real-world, not a technical, perspective. The analysis of case studies will show that the objective of navigation and the different target groups for navigation solutions demand well-defined access rights and require solutions for making only parts of a graph available to a user or application to solve a navigational task. The paper will therefore introduce the concept of private graphs, defined as a graph for navigational purposes covering the street, road, or floor network of an area behind a public street, and suggest different approaches for making graph data available for navigational purposes while considering access rights as well as data protection, privacy, and security issues.
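    A minimal sketch of the private-graph idea: each edge carries an access tag, and a user may traverse it only if one of their groups holds the corresponding right. All names and the data layout are hypothetical; the paper proposes the concept, not this representation.

```python
from collections import deque

def route(edges, rights, user_groups, src, dst):
    """BFS route over a 'private graph'. edges: {node: [(neighbor, tag), ...]};
    rights: {tag: set of groups allowed}; only edges whose tag admits one of
    the user's groups are traversable. Returns a node path or None."""
    q, seen = deque([[src]]), {src}
    while q:
        path = q.popleft()
        if path[-1] == dst:
            return path
        for nxt, tag in edges.get(path[-1], []):
            if nxt not in seen and rights.get(tag, set()) & user_groups:
                seen.add(nxt)
                q.append(path + [nxt])
    return None
```

    The same graph then yields different routes (or none at all) for different user groups, which is exactly the access-dependent behavior seamless indoor navigation requires.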

  20. Providing Effective Access to Shared Resources: A COIN Approach

    NASA Technical Reports Server (NTRS)

    Airiau, Stephane; Wolpert, David H.

    2004-01-01

    Managers of systems of shared resources typically have many separate goals. Examples are efficient utilization of the resources among their users and ensuring no user's satisfaction in the system falls below a preset minimal level. Since such goals will usually conflict with one another, either implicitly or explicitly the manager must determine the relative importance of the goals, encapsulating that into an overall utility function rating the possible behaviors of the entire system. Here we demonstrate a distributed, robust, and adaptive way to optimize that overall function. Our approach is to interpose adaptive agents between each user and the system, where each such agent is working to maximize its own private utility function. In turn, each such agent's function should be both relatively easy for the agent to learn to optimize, and "aligned" with the overall utility function of the system manager - an overall function that is based on but in general different from the satisfaction functions of the individual users. To ensure this we enhance the COllective INtelligence (COIN) framework to incorporate user satisfaction functions in the overall utility function of the system manager and accordingly in the associated private utility functions assigned to the users' agents. We present experimental evaluations of different COIN-based private utility functions and demonstrate that those COIN-based functions outperform some natural alternatives.
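    As a sketch of the alignment idea (not the paper's exact formulation), one common COIN-style choice of private utility is a "difference" utility: the global utility minus its value with the agent's action clamped to a fixed default, which keeps the private signal aligned with the global one while filtering out other agents' noise.

```python
def difference_utility(global_utility, joint_action, agent, default_action):
    """COIN-style 'difference' private utility for one agent:
        D_i = G(z) - G(z with agent i's action replaced by a default).
    global_utility: callable on a {agent: action} dict. Toy formulation,
    not the paper's user-satisfaction-enhanced variant."""
    clamped = dict(joint_action)
    clamped[agent] = default_action
    return global_utility(joint_action) - global_utility(clamped)
```

    With, say, a global utility counting distinct resources in use, an agent's difference utility is positive exactly when its own choice adds value, so learning on the private signal pushes in the global direction.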

  1. Providing Effective Access to Shared Resources: A COIN Approach

    NASA Technical Reports Server (NTRS)

    Airiau, Stephane; Wolpert, David H.; Sen, Sandip; Tumer, Kagan

    2003-01-01

    Managers of systems of shared resources typically have many separate goals. Examples are efficient utilization of the resources among their users and ensuring no user's satisfaction in the system falls below a preset minimal level. Since such goals will usually conflict with one another, either implicitly or explicitly the manager must determine the relative importance of the goals, encapsulating that into an overall utility function rating the possible behaviors of the entire system. Here we demonstrate a distributed, robust, and adaptive way to optimize that overall function. Our approach is to interpose adaptive agents between each user and the system, where each such agent is working to maximize its own private utility function. In turn, each such agent's function should be both relatively easy for the agent to learn to optimize, and 'aligned' with the overall utility function of the system manager - an overall function that is based on but in general different from the satisfaction functions of the individual users. To ensure this we enhance the COllective INtelligence (COIN) framework to incorporate user satisfaction functions in the overall utility function of the system manager and accordingly in the associated private utility functions assigned to the users' agents. We present experimental evaluations of different COIN-based private utility functions and demonstrate that those COIN-based functions outperform some natural alternatives.

  2. Generic functional requirements for a NASA general-purpose data base management system

    NASA Technical Reports Server (NTRS)

    Lohman, G. M.

    1981-01-01

    Generic functional requirements for a general-purpose, multi-mission data base management system (DBMS) for application to remotely sensed scientific data bases are detailed. The motivation for utilizing DBMS technology in this environment is explained. The major requirements include: (1) a DBMS for scientific observational data; (2) a multi-mission capability; (3) user-friendly; (4) extensive and integrated information about data; (5) robust languages for defining data structures and formats; (6) scientific data types and structures; (7) flexible physical access mechanisms; (8) ways of representing spatial relationships; (9) a high level nonprocedural interactive query and data manipulation language; (10) data base maintenance utilities; (11) high-rate input/output and large data volume storage; and (12) adaptability to a distributed data base and/or data base machine configuration. Detailed functions are specified in a top-down hierarchic fashion. Implementation, performance, and support requirements are also given.

  3. Improvements on non-equilibrium and transport Green function techniques: The next-generation TRANSIESTA

    NASA Astrophysics Data System (ADS)

    Papior, Nick; Lorente, Nicolás; Frederiksen, Thomas; García, Alberto; Brandbyge, Mads

    2017-03-01

    We present novel methods implemented within the non-equilibrium Green function (NEGF) code TRANSIESTA, based on density functional theory (DFT). Our flexible, next-generation DFT-NEGF code handles devices with one or multiple electrodes (Ne ≥ 1) with individual chemical potentials and electronic temperatures. We describe its novel methods for electrostatic gating, contour optimizations, and assertion of charge conservation, as well as the newly implemented algorithms for optimized and scalable matrix inversion, performance-critical pivoting, and hybrid parallelization. Additionally, a generic NEGF "post-processing" code (TBTRANS/PHTRANS) for electron and phonon transport is presented with several novelties such as Hamiltonian interpolations, Ne ≥ 1 electrode capability, bond-currents, a generalized interface for user-defined tight-binding transport, transmission projection using eigenstates of a projected Hamiltonian, and fast inversion algorithms for large-scale simulations easily exceeding 10⁶ atoms on workstation computers. The new features of both codes are demonstrated and benchmarked for relevant test systems.
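    TRANSIESTA's self-consistent DFT-NEGF machinery cannot be condensed here, but the central transport quantity it evaluates, the Landauer transmission, reduces to a one-line Breit-Wigner formula for a single level coupled to two wide-band electrodes. A minimal sketch (function name and parameters are illustrative):

```python
def transmission_single_level(E, eps0, gamma_L, gamma_R):
    """Landauer transmission T(E) = Gamma_L * Gamma_R * |G(E)|^2 for a single
    level eps0 coupled to two wide-band electrodes, with retarded Green
    function G(E) = 1 / (E - eps0 + i*(Gamma_L + Gamma_R)/2). This scalar
    core is what full NEGF codes generalize to matrix Hamiltonians and
    Ne >= 1 electrodes."""
    G = 1.0 / complex(E - eps0, 0.5 * (gamma_L + gamma_R))
    return gamma_L * gamma_R * abs(G) ** 2
```

    On resonance with symmetric couplings (E = eps0, Gamma_L = Gamma_R) the transmission reaches its unitary limit of 1.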

  4. Semantic Web Service Delivery in Healthcare Based on Functional and Non-Functional Properties.

    PubMed

    Schweitzer, Marco; Gorfer, Thilo; Hörbst, Alexander

    2017-01-01

    Over the past decades, much effort has been devoted to the trans-institutional exchange of healthcare data through electronic health records (EHR) in order to obtain a lifelong, shared, accessible health record of a patient. Besides basic information exchange, there is a growing need for Information and Communication Technology (ICT) to support the use of the collected health data in an individual, case-specific, workflow-based manner. This paper presents results on how workflows can be used to process data from electronic health records, following a semantic web service approach that enables automatic discovery, composition, and invocation of suitable web services. Based on this solution, the user (physician) can define their needs from a domain-specific perspective, whereas the ICT system fulfills those needs with modular web services. By also involving non-functional properties in the service selection, this approach is even more suitable for the dynamic medical domain.

  5. SNPHunter: a bioinformatic software for single nucleotide polymorphism data acquisition and management.

    PubMed

    Wang, Lin; Liu, Simin; Niu, Tianhua; Xu, Xin

    2005-03-18

    Single nucleotide polymorphisms (SNPs) provide an important tool in pinpointing susceptibility genes for complex diseases and in unveiling human molecular evolution. Selection and retrieval of an optimal SNP set from publicly available databases have emerged as the foremost bottlenecks in designing large-scale linkage disequilibrium studies, particularly in case-control settings. We describe the architectural structure and implementations of a novel software program, SNPHunter, which allows for both ad hoc-mode and batch-mode SNP search, automatic SNP filtering, and retrieval of SNP data, including physical position, function class, flanking sequences at user-defined lengths, and heterozygosity from NCBI dbSNP. The SNP data extracted from dbSNP via SNPHunter can be exported and saved in plain text format for further downstream analyses. As an illustration, we applied SNPHunter for selecting SNPs for 10 major candidate genes for type 2 diabetes, including CAPN10, FABP4, IL6, NOS3, PPARG, TNF, UCP2, CRP, ESR1, and AR. SNPHunter constitutes an efficient and user-friendly tool for SNP screening, selection, and acquisition. The executable and user's manual are available at http://www.hsph.harvard.edu/ppg/software.htm.

  6. A new paradigm on battery powered embedded system design based on User-Experience-Oriented method

    NASA Astrophysics Data System (ADS)

    Wang, Zhuoran; Wu, Yue

    2014-03-01

    Battery sustainable time has recently been an active research topic in the development of battery-powered embedded products such as tablets and smart phones; it is determined by battery capacity and power consumption. Despite numerous efforts to improve battery capacity in the field of materials engineering, power consumption also plays an important role and is easier to improve in delivering a desirable user experience, especially considering the moderate advancement of batteries over the decades. In this study, a new top-down modelling method, the User-Experience-Oriented Battery Powered Embedded System Design Paradigm, is proposed to estimate the target average power consumption, to guide the hardware and software design, and eventually to approach the theoretical lowest power consumption at which the application is still able to provide full functionality. Starting from the 10-hour sustainable-time standard, the average working current is defined from the battery design capacity and set as a target. An implementation is then illustrated from the hardware perspective, summarized as Auto-Gating power management, and from the software perspective, which introduces a new algorithm, SleepVote, to guide system task design and scheduling.
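    The top-down current target follows directly from the battery's design capacity and the 10-hour standard. A trivial sketch of that budgeting step (ignoring battery derating and conversion losses; the function name is ours):

```python
def target_average_current_ma(capacity_mah, target_hours=10.0):
    """Average current (mA) the whole system may draw if a battery of the
    given design capacity (mAh) must last target_hours."""
    return capacity_mah / target_hours

# A 4000 mAh tablet battery under the 10 h target:
budget = target_average_current_ma(4000)   # -> 400.0 mA average budget
```

    Hardware (Auto-Gating) and software (SleepVote) design then work downward from this single number, apportioning it across components and duty cycles.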

  7. Preliminary results of BRAVO project: brain computer interfaces for Robotic enhanced Action in Visuo-motOr tasks.

    PubMed

    Bergamasco, Massimo; Frisoli, Antonio; Fontana, Marco; Loconsole, Claudio; Leonardis, Daniele; Troncossi, Marco; Foumashi, Mohammad Mozaffari; Parenti-Castelli, Vincenzo

    2011-01-01

    This paper presents the preliminary results of the project BRAVO (Brain computer interfaces for Robotic enhanced Action in Visuo-motOr tasks). The objective of this project is to define a new approach to the development of assistive and rehabilitative robots for motor impaired users to perform complex visuomotor tasks that require a sequence of reaches, grasps and manipulations of objects. BRAVO aims at developing new robotic interfaces and HW/SW architectures for rehabilitation and regain/restoration of motor function in patients with upper limb sensorimotor impairment through extensive rehabilitation therapy and active assistance in the execution of Activities of Daily Living. The final system developed within this project will include a robotic arm exoskeleton and a hand orthosis that will be integrated together for providing force assistance. The main novelty that BRAVO introduces is the control of the robotic assistive device through the active prediction of intention/action. The system will actually integrate the information about the movement carried out by the user with a prediction of the performed action through an interpretation of current gaze of the user (measured through eye-tracking), brain activation (measured through BCI) and force sensor measurements. © 2011 IEEE

  8. EDGE3: A web-based solution for management and analysis of Agilent two color microarray experiments

    PubMed Central

    Vollrath, Aaron L; Smith, Adam A; Craven, Mark; Bradfield, Christopher A

    2009-01-01

    Background The ability to generate transcriptional data on the scale of entire genomes has been a boon both in the improvement of biological understanding and in the amount of data generated. The latter, the amount of data generated, has implications when it comes to effective storage, analysis and sharing of these data. A number of software tools have been developed to store, analyze, and share microarray data. However, a majority of these tools do not offer all of these features nor do they specifically target the commonly used two color Agilent DNA microarray platform. Thus, the motivating factor for the development of EDGE3 was to incorporate the storage, analysis and sharing of microarray data in a manner that would provide a means for research groups to collaborate on Agilent-based microarray experiments without a large investment in software-related expenditures or extensive training of end-users. Results EDGE3 has been developed with two major functions in mind. The first function is to provide a workflow process for the generation of microarray data by a research laboratory or a microarray facility. The second is to store, analyze, and share microarray data in a manner that doesn't require complicated software. To satisfy the first function, EDGE3 has been developed as a means to establish a well-defined experimental workflow and information system for microarray generation. To satisfy the second function, the software application utilized as the user interface of EDGE3 is a web browser. Within the web browser, a user is able to access the entire functionality, including, but not limited to, the ability to perform a number of bioinformatics-based analyses, to collaborate between research groups through a user-based security model, and to access the raw data files and quality control files generated by the software used to extract the signals from an array image.
Conclusion Here, we present EDGE3, an open-source, web-based application that allows for the storage, analysis, and controlled sharing of transcription-based microarray data generated on the Agilent DNA platform. In addition, EDGE3 provides a means for managing RNA samples and arrays during the hybridization process. EDGE3 is freely available for download at . PMID:19732451

  9. EDGE(3): a web-based solution for management and analysis of Agilent two color microarray experiments.

    PubMed

    Vollrath, Aaron L; Smith, Adam A; Craven, Mark; Bradfield, Christopher A

    2009-09-04

    The ability to generate transcriptional data on the scale of entire genomes has been a boon both in the improvement of biological understanding and in the amount of data generated. The latter, the amount of data generated, has implications when it comes to effective storage, analysis and sharing of these data. A number of software tools have been developed to store, analyze, and share microarray data. However, a majority of these tools do not offer all of these features nor do they specifically target the commonly used two color Agilent DNA microarray platform. Thus, the motivating factor for the development of EDGE(3) was to incorporate the storage, analysis and sharing of microarray data in a manner that would provide a means for research groups to collaborate on Agilent-based microarray experiments without a large investment in software-related expenditures or extensive training of end-users. EDGE(3) has been developed with two major functions in mind. The first function is to provide a workflow process for the generation of microarray data by a research laboratory or a microarray facility. The second is to store, analyze, and share microarray data in a manner that doesn't require complicated software. To satisfy the first function, EDGE3 has been developed as a means to establish a well defined experimental workflow and information system for microarray generation. To satisfy the second function, the software application utilized as the user interface of EDGE(3) is a web browser. Within the web browser, a user is able to access the entire functionality, including, but not limited to, the ability to perform a number of bioinformatics based analyses, collaborate between research groups through a user-based security model, and access to the raw data files and quality control files generated by the software used to extract the signals from an array image. 
Here, we present EDGE(3), an open-source, web-based application that allows for the storage, analysis, and controlled sharing of transcription-based microarray data generated on the Agilent DNA platform. In addition, EDGE(3) provides a means for managing RNA samples and arrays during the hybridization process. EDGE(3) is freely available for download at http://edge.oncology.wisc.edu/.

  10. Semantic Annotation of Video Fragments as Learning Objects: A Case Study with "YouTube" Videos and the Gene Ontology

    ERIC Educational Resources Information Center

    Garcia-Barriocanal, Elena; Sicilia, Miguel-Angel; Sanchez-Alonso, Salvador; Lytras, Miltiadis

    2011-01-01

    Web 2.0 technologies can be considered a loosely defined set of Web application styles that foster a kind of media consumer more engaged, and usually active in creating and maintaining Internet contents. Thus, Web 2.0 applications have resulted in increased user participation and massive user-generated (or user-published) open multimedia content,…

  11. Functional Assessment for Human-Computer Interaction: A Method for Quantifying Physical Functional Capabilities for Information Technology Users

    ERIC Educational Resources Information Center

    Price, Kathleen J.

    2011-01-01

    The use of information technology is a vital part of everyday life, but for a person with functional impairments, technology interaction may be difficult at best. Information technology is commonly designed to meet the needs of a theoretical "normal" user. However, there is no such thing as a "normal" user. A user's capabilities will vary over…

  12. Identity management and privacy languages technologies: Improving user control of data privacy

    NASA Astrophysics Data System (ADS)

    García, José Enrique López; García, Carlos Alberto Gil; Pacheco, Álvaro Armenteros; Organero, Pedro Luis Muñoz

Identity management solutions can bring confidence to internet services, but this confidence could be improved if users had more control over the privacy policies governing their attributes. Privacy languages can help with this task because of their ability to define privacy policies for data in a very flexible way. An integration problem therefore arises: making identity management and privacy languages work together. Although several proposals for accomplishing this have already been defined, this paper suggests some topics and improvements that could be considered.

  13. Distributed Energy Resources Customer Adoption Model - Graphical User Interface, Version 2.1.8

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ewald, Friedrich; Stadler, Michael; Cardoso, Goncalo F

The DER-CAM Graphical User Interface has been redesigned to consist of a dynamic tree structure on the left side of the application window to allow users to quickly navigate between different data categories and views. Views can either be tables with model parameters and input data, the optimization results, or a graphical interface to draw circuit topology and visualize investment results. The model parameters and input data consist of tables where values are assigned to specific keys. The aggregation of all model parameters and input data amounts to the data required to build a DER-CAM model, and is passed to the GAMS solver when users initiate the DER-CAM optimization process. Passing data to the GAMS solver relies on the use of a Java server that handles DER-CAM requests, queuing, and results delivery. This component of the DER-CAM GUI can be deployed either locally or remotely, and constitutes an intermediate step between the user data input and manipulation, and the execution of a DER-CAM optimization in the GAMS engine. The results view shows the results of the DER-CAM optimization and distinguishes between a single and a multi-objective process. The single optimization runs the DER-CAM optimization once and presents the results as a combination of summary charts and hourly dispatch profiles. The multi-objective optimization process consists of a sequence of runs initiated by the GUI, including: 1) CO2 minimization, 2) cost minimization, 3) a user-defined number of points in-between objectives 1) and 2). The multi-objective results view includes both access to the detailed results of each point generated by the process as well as the generation of a Pareto Frontier graph to illustrate the trade-off between objectives. DER-CAM GUI 2.1.8 also introduces the ability to graphically generate circuit topologies, enabling support for DER-CAM 5.0.0.
This feature consists of: 1) The drawing area, where users can manually create nodes and define their properties (e.g. point of common coupling, slack bus, load) and connect them through edges representing either power lines, transformers, or heat pipes, all with user-defined characteristics (e.g., length, ampacity, inductance, or heat loss); 2) The tables, which display the user-defined topology in the final numerical form that will be passed to the DER-CAM optimization. Finally, the DER-CAM GUI is also deployed with a database schema that allows users to provide different energy load profiles, solar irradiance profiles, and tariff data that can be stored locally and later used in any DER-CAM model. However, no real data will be delivered with this version.
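The two-part topology workflow described above (draw nodes and edges with their properties, then flatten everything to tables for the solver) can be sketched with a minimal data model. All class, field, and label names below are illustrative assumptions, not DER-CAM's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    kind: str            # e.g. "pcc", "slack", "load" (labels assumed)

@dataclass
class Edge:
    src: str
    dst: str
    kind: str            # "line", "transformer", or "heat_pipe"
    length_m: float = 0.0
    ampacity_a: float = 0.0

@dataclass
class Topology:
    nodes: dict = field(default_factory=dict)
    edges: list = field(default_factory=list)

    def add_node(self, node):
        self.nodes[node.name] = node

    def connect(self, src, dst, kind, **props):
        # Only previously drawn nodes can be connected.
        assert src in self.nodes and dst in self.nodes
        self.edges.append(Edge(src, dst, kind, **props))

    def to_table(self):
        # Flatten to the tabular form handed to the optimization.
        return [(e.src, e.dst, e.kind, e.length_m, e.ampacity_a)
                for e in self.edges]

# Example: a two-node feeder flattened for the solver.
grid = Topology()
grid.add_node(Node("pcc", "pcc"))
grid.add_node(Node("load1", "load"))
grid.connect("pcc", "load1", "line", length_m=10.0, ampacity_a=100.0)
table = grid.to_table()
```

The separation between the drawing objects and the flattened table mirrors the GUI's split between the drawing area and the numeric tables.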

  14. Information management challenges of the EOS Data and Information System

    NASA Technical Reports Server (NTRS)

    Mcdonald, Kenneth R.; Blake, Deborah J.

    1991-01-01

An overview of the current information management concepts that are embodied in the plans for the Earth Observing System Data and Information System (EOSDIS) is presented, and some of the technology development and application areas that are envisioned to be particularly challenging are introduced. The Information Management System (IMS) is the EOSDIS element that provides the primary interface between the science users and the data products and services of EOSDIS. The goals of IMS are to define a clear and complete set of functional requirements and to apply innovative methods and technologies to satisfy them. The information management functions are described in detail, and some applicable technologies are discussed. Some of the general issues affecting the successful development and operation of the information management element are addressed.

  15. The standard operating procedure of the DOE-JGI Metagenome Annotation Pipeline (MAP v.4)

    DOE PAGES

    Huntemann, Marcel; Ivanova, Natalia N.; Mavromatis, Konstantinos; ...

    2016-02-24

The DOE-JGI Metagenome Annotation Pipeline (MAP v.4) performs structural and functional annotation for metagenomic sequences that are submitted to the Integrated Microbial Genomes with Microbiomes (IMG/M) system for comparative analysis. The pipeline runs on nucleotide sequences provided via the IMG submission site. Users must first define their analysis projects in GOLD and then submit the associated sequence datasets consisting of scaffolds/contigs with optional coverage information and/or unassembled reads in fasta and fastq file formats. The MAP processing consists of feature prediction including identification of protein-coding genes, non-coding RNAs and regulatory RNAs, as well as CRISPR elements. Structural annotation is followed by functional annotation including assignment of protein product names and connection to various protein family databases.

  16. The standard operating procedure of the DOE-JGI Metagenome Annotation Pipeline (MAP v.4)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huntemann, Marcel; Ivanova, Natalia N.; Mavromatis, Konstantinos

The DOE-JGI Metagenome Annotation Pipeline (MAP v.4) performs structural and functional annotation for metagenomic sequences that are submitted to the Integrated Microbial Genomes with Microbiomes (IMG/M) system for comparative analysis. The pipeline runs on nucleotide sequences provided via the IMG submission site. Users must first define their analysis projects in GOLD and then submit the associated sequence datasets consisting of scaffolds/contigs with optional coverage information and/or unassembled reads in fasta and fastq file formats. The MAP processing consists of feature prediction including identification of protein-coding genes, non-coding RNAs and regulatory RNAs, as well as CRISPR elements. Structural annotation is followed by functional annotation including assignment of protein product names and connection to various protein family databases.

  17. Executive functions in men and postmenopausal women.

    PubMed

    Castonguay, Nathalie; Lussier, Maxime; Bugaiska, Aurélia; Lord, Catherine; Bherer, Louis

    2015-01-01

This study was designed to assess sex differences in older adults (55-65 years old) in executive functions and to examine the influence of hormone therapy (HT) in postmenopausal women. We assessed task performance in memory, visuospatial, and executive functions in 29 women using HT, 29 women who never used HT, and 30 men. Men outperformed never users in task switching and updating. HT users outperformed never users in updating. HT users outperformed never users and men in visual divided attention. The present study supports previous findings that sex and HT impact cognition and brings new insights into sex- and HT-related differences in executive functions.

  18. Subclinical Depressive Symptoms and Continued Cannabis Use: Predictors of Negative Outcomes in First Episode Psychosis

    PubMed Central

    González-Ortega, Itxaso; Alberich, Susana; Echeburúa, Enrique; Aizpuru, Felipe; Millán, Eduardo; Vieta, Eduard; Matute, Carlos; González-Pinto, Ana

    2015-01-01

Background Although depressive symptoms in first episode psychosis (FEP) have been associated with cannabis abuse, their influence on the long-term functional course of FEP patients who abuse cannabis is unknown. The aims of the study were to examine the influence of subclinical depressive symptoms on the long-term outcome in FEP patients who were cannabis users and to assess the influence of these subclinical depressive symptoms on the ability to quit cannabis use. Methods 64 FEP patients who were cannabis users at baseline were followed up for 5 years. Two groups were defined: (a) patients with subclinical depressive symptoms at least once during follow-up (DPG), and (b) patients without subclinical depressive symptoms during follow-up (NDPG). Psychotic symptoms were measured using the Positive and Negative Syndrome Scale (PANSS), depressive symptoms using the Hamilton Depression Rating Scale (HDRS)-17, and psychosocial functioning was assessed using the Global Assessment of Functioning (GAF). A linear mixed-effects model was used to analyze the combined influence of cannabis use and subclinical depressive symptomatology on the clinical outcome. Results Subclinical depressive symptoms were associated with continued abuse of cannabis during follow-up (β = 4.45; 95% confidence interval [CI]: 1.78 to 11.17; P = .001) and with worse functioning (β = -5.50; 95% CI: -9.02 to -0.33; P = .009). Conclusions Subclinical depressive symptoms and continued cannabis abuse during follow-up could be predictors of negative outcomes in FEP patients. PMID:25875862

  19. Method and apparatus for configuration control of redundant robots

    NASA Technical Reports Server (NTRS)

    Seraji, Homayoun (Inventor)

    1991-01-01

    A method and apparatus to control a robot or manipulator configuration over the entire motion based on augmentation of the manipulator forward kinematics is disclosed. A set of kinematic functions is defined in Cartesian or joint space to reflect the desirable configuration that will be achieved in addition to the specified end-effector motion. The user-defined kinematic functions and the end-effector Cartesian coordinates are combined to form a set of task-related configuration variables as generalized coordinates for the manipulator. A task-based adaptive scheme is then utilized to directly control the configuration variables so as to achieve tracking of some desired reference trajectories throughout the robot motion. This accomplishes the basic task of desired end-effector motion, while utilizing the redundancy to achieve any additional task through the desired time variation of the kinematic functions. The present invention can also be used for optimization of any kinematic objective function, or for satisfaction of a set of kinematic inequality constraints, as in an obstacle avoidance problem. In contrast to pseudoinverse-based methods, the configuration control scheme ensures cyclic motion of the manipulator, which is an essential requirement for repetitive operations. The control law is simple and computationally very fast, and does not require either the complex manipulator dynamic model or the complicated inverse kinematic transformation. The configuration control scheme can alternatively be implemented in joint space.
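The augmentation described in the abstract can be written compactly. The symbols below are ours, chosen to mirror the text: an n-joint arm, m end-effector coordinates, and n − m user-defined kinematic functions.

```latex
% Augmented (configuration) variables for an n-joint manipulator:
% x(q) \in \mathbb{R}^{m} are the end-effector Cartesian coordinates and
% \phi(q) \in \mathbb{R}^{n-m} are the user-defined kinematic functions
% encoding the additional task (posture, obstacle clearance, etc.).
Y(q) \;=\; \begin{bmatrix} x(q) \\ \phi(q) \end{bmatrix} \in \mathbb{R}^{n},
\qquad Y\bigl(q(t)\bigr) \longrightarrow Y_d(t),
% where Y_d(t) is the desired reference trajectory tracked directly by
% the task-based adaptive controller throughout the motion.
```

Because Y has full dimension n, controlling it fixes the whole configuration, which is why the scheme yields cyclic motion where pseudoinverse-based methods need not.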

  20. The population health record: concepts, definition, design, and implementation.

    PubMed

    Friedman, Daniel J; Parrish, R Gibson

    2010-01-01

In 1997, the American Medical Informatics Association proposed a US information strategy that included a population health record (PopHR). Despite subsequent progress on the conceptualization, development, and implementation of electronic health records and personal health records, minimal progress has occurred on the PopHR. Adapting International Organization for Standardization electronic health record standards, we define the PopHR as a repository of statistics, measures, and indicators regarding the state of and influences on the health of a defined population, in computer processable form, stored and transmitted securely, and accessible by multiple authorized users. The PopHR is based upon an explicit population health framework and a standardized logical information model. PopHR purpose and uses, content and content sources, functionalities, business objectives, information architecture, and system architecture are described. Barriers to implementation and enabling factors and a three-stage implementation strategy are delineated.

  1. Cutting solid figures by plane - analytical solution and spreadsheet implementation

    NASA Astrophysics Data System (ADS)

    Benacka, Jan

    2012-07-01

In some secondary mathematics curricula, there is a topic called Stereometry that deals with investigating the position and finding the intersection, angle, and distance of lines and planes defined within a prism or pyramid. A coordinate system is not used. The metric tasks are solved using Pythagoras' theorem, trigonometric functions, and the sine and cosine rules. The basic problem is to find the section of the figure by a plane that is defined by three points related to the figure. In this article, a formula is derived that gives the positions of the intersection points of such a plane and the figure edges, that is, the vertices of the section polygon. Spreadsheet implementations of the formula for cuboids and right rectangular pyramids are presented. The user can check his/her graphical solution, or proceed if he/she is not able to complete the section.
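As a rough illustration of the kind of computation such a formula performs (not the article's spreadsheet formula itself), the sketch below intersects the cutting plane, defined by three points, with each cuboid edge parametrically; all names are ours.

```python
import itertools

def plane_from_points(p, q, r):
    # Normal n = (q - p) x (r - p); the plane is n . x = d.
    u = [q[i] - p[i] for i in range(3)]
    v = [r[i] - p[i] for i in range(3)]
    n = (u[1]*v[2] - u[2]*v[1], u[2]*v[0] - u[0]*v[2], u[0]*v[1] - u[1]*v[0])
    d = sum(n[i] * p[i] for i in range(3))
    return n, d

def cuboid_section(a, b, c, p, q, r):
    """Vertices of the section polygon of an a x b x c cuboid cut by the
    plane through points p, q, r (a sketch, not the paper's formula)."""
    n, d = plane_from_points(p, q, r)
    verts = list(itertools.product((0, a), (0, b), (0, c)))
    # Cuboid edges join vertices differing in exactly one coordinate.
    edges = [(v, w) for v, w in itertools.combinations(verts, 2)
             if sum(x != y for x, y in zip(v, w)) == 1]
    points = []
    for v, w in edges:
        fv = sum(n[i] * v[i] for i in range(3)) - d
        fw = sum(n[i] * w[i] for i in range(3)) - d
        if fv == fw:          # edge parallel to (or inside) the plane
            continue
        t = fv / (fv - fw)    # parametric position of the crossing
        if 0 <= t <= 1:
            points.append(tuple(v[i] + t * (w[i] - v[i]) for i in range(3)))
    return points

# Unit cube cut by the plane x + y + z = 0.5: a corner triangle.
section = cuboid_section(1, 1, 1, (0.5, 0, 0), (0, 0.5, 0), (0, 0, 0.5))
```

When the plane passes exactly through a vertex, two edges report the same point; a real implementation would deduplicate and order the polygon vertices.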

  2. Confocal depth-resolved fluorescence micro-X-ray absorption spectroscopy for the study of cultural heritage materials: a new mobile endstation at the Beijing Synchrotron Radiation Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Guang; Chu, Shengqi; Sun, Tianxi

A confocal fluorescence endstation for depth-resolved micro-X-ray absorption spectroscopy is described. A polycapillary half-lens defines the incident beam path and a second polycapillary half-lens at 90° defines the probe sample volume. An automatic alignment program based on an evolutionary algorithm is employed to make the alignment procedure efficient. This depth-resolved system was examined on a general X-ray absorption spectroscopy (XAS) beamline at the Beijing Synchrotron Radiation Facility. Sacrificial red glaze (AD 1368–1644) china was studied to show the capability of the instrument. As a mobile endstation to be applied on multiple beamlines, the confocal system can improve the function and flexibility of general XAS beamlines, and extend their capabilities to a wider user community.

  3. Method of Simulating Flow-Through Area of a Pressure Regulator

    NASA Technical Reports Server (NTRS)

    Hass, Neal E. (Inventor); Schallhorn, Paul A. (Inventor)

    2011-01-01

    The flow-through area of a pressure regulator positioned in a branch of a simulated fluid flow network is generated. A target pressure is defined downstream of the pressure regulator. A projected flow-through area is generated as a non-linear function of (i) target pressure, (ii) flow-through area of the pressure regulator for a current time step and a previous time step, and (iii) pressure at the downstream location for the current time step and previous time step. A simulated flow-through area for the next time step is generated as a sum of (i) flow-through area for the current time step, and (ii) a difference between the projected flow-through area and the flow-through area for the current time step multiplied by a user-defined rate control parameter. These steps are repeated for a sequence of time steps until the pressure at the downstream location is approximately equal to the target pressure.
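A minimal numerical sketch of the iteration described above. The secant-style projection below is an assumed form of the patent's non-linear function of the two most recent areas and pressures, and the toy linear pressure model stands in for a real flow-network solver.

```python
def simulate_regulator_area(pressure_of, a0, a1, p_target, rate=0.5,
                            tol=1e-6, max_steps=200):
    """Iterate the regulator's flow-through area until the downstream
    pressure reaches p_target. `rate` is the user-defined rate control
    parameter from the abstract; `pressure_of` maps area -> downstream
    pressure (supplied here as an assumption in place of a network solver).
    """
    a_prev, a_cur = a0, a1
    p_prev, p_cur = pressure_of(a_prev), pressure_of(a_cur)
    for _ in range(max_steps):
        if abs(p_cur - p_target) < tol or p_cur == p_prev:
            break
        # Projected area: secant-style extrapolation from the current and
        # previous (area, pressure) pairs toward the target pressure.
        a_proj = a_prev + (a_cur - a_prev) * (p_target - p_prev) / (p_cur - p_prev)
        # Relaxation update from the abstract: move a fraction `rate` of
        # the way from the current area toward the projected area.
        a_next = a_cur + rate * (a_proj - a_cur)
        a_prev, p_prev = a_cur, p_cur
        a_cur, p_cur = a_next, pressure_of(a_next)
    return a_cur

# Toy monotone model: downstream pressure proportional to opening area.
area = simulate_regulator_area(lambda a: 100.0 * a, 0.1, 0.2, p_target=50.0)
```

With the linear toy model the projection is exact each step, so the relaxation factor alone sets the convergence rate.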

  4. SpaceWire Protocol ID: What Does It Mean To You?

    NASA Technical Reports Server (NTRS)

    Rakow, Glenn; Schnurr, Richard; Gilley, Daniel; Parks, Steve

    2006-01-01

SpaceWire is becoming a popular solution for satellite high-speed data buses because it is a simple standard that provides great flexibility for a wide range of system requirements. It is simple in packet format and protocol, allowing users to easily tailor their implementation for their specific application. Some of the attractive aspects of SpaceWire that make it easy to implement also make it hard for future reuse. Protocol reuse is difficult because SpaceWire does not have a defined mechanism to communicate with the higher layers of the protocol stack. This has forced users of SpaceWire to define unique packet formats and define how these packets are to be processed. Each mission writes its own Interface Control Document (ICD) and tailors SpaceWire for its specific requirements, making reuse difficult. Part of the reason for this habit may be that engineers typically optimize designs for their own requirements in the absence of a standard. This is an inefficient use of project resources and costs more to develop missions. A new packet format for SpaceWire has been defined as a solution for this problem. This new packet format is a complement to the SpaceWire standard that will support protocol development upon SpaceWire. The new packet definition does not replace the current packet structure, i.e., does not make the standard obsolete, but merely extends the standard for those who want to develop protocols over SpaceWire. The SpaceWire packet is defined with the first part being the Destination Address, which may be one or more bytes. This is followed by the packet cargo, which is user defined. The cargo is terminated with an End-Of-Packet (EOP) marker. This packet structure offers low packet overhead and allows the user to define how the contents are to be formatted. It also provides for many different addressing schemes, which provide flexibility in the system. This packet flexibility is typically an attractive part of SpaceWire. 
The new extended packet format adds one new field to the packet that greatly enhances the capability of SpaceWire. This new field, called the Protocol Identifier (ID), is used to identify the packet contents and the associated processing for the packet. This feature, along with the restriction in the packet format that uses the Protocol ID, allows a deterministic method of decoding packets that was not previously possible. The first part of the packet is still the Destination Address, which still conforms to the original standard but with one restriction: the first byte seen at the destination by the user needs to be a logical address, independent of the addressing scheme used. The second field is defined as the Protocol ID, which is usually one byte in length. The packet cargo (user defined) follows the Protocol ID. After the packet cargo is the EOP, which defines the end of packet. The value of the Protocol ID is assigned by the SpaceWire working group and the protocol description published for others to use. The development of protocols for SpaceWire is currently the area of greatest activity by the SpaceWire working group. The first protocol definition by the working group has been completed and is now in the process of formal standardization. There are many other protocols in development for missions that have not yet received formal Protocol ID assignment, but even if the protocols are not formally assigned a value, this effort will provide synergism for future developments.
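The extended layout described above ([destination address][protocol ID][cargo][EOP]) can be modeled in a few lines. Two hedges: in real SpaceWire the EOP is a link-level control code, not a data byte, so a sentinel stands in for it here; and the 32-254 logical-address range is our reading of the standard's addressing scheme.

```python
# EOP is a link-level control code in SpaceWire, not a data byte;
# a Python sentinel object stands in for it in this sketch.
EOP = object()

def build_packet(logical_address, protocol_id, cargo):
    """Assemble [destination address][protocol ID][cargo][EOP]."""
    if not 32 <= logical_address <= 254:   # assumed logical-address range
        raise ValueError("destination must be a logical address")
    return [logical_address, protocol_id, *cargo, EOP]

def parse_packet(symbols):
    """Deterministic decode enabled by the restriction: the first symbol
    is always a logical address, the second always the Protocol ID."""
    addr, proto, *rest = symbols
    assert rest and rest[-1] is EOP, "packet not terminated by EOP"
    return addr, proto, bytes(rest[:-1])

# Round-trip a packet for a hypothetical protocol with ID 1.
addr, proto, cargo = parse_packet(build_packet(40, 1, b"\x10\x20"))
```

The point of the Protocol ID is visible in `parse_packet`: a receiver can dispatch on `proto` without any mission-specific ICD knowledge of the cargo layout.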

  5. Phase 2 STS new user development program. Volume 4: Guidance/instructions for representatives

    NASA Technical Reports Server (NTRS)

    Mcdowell, J. R.

    1976-01-01

    The overall STS New User Development (NUD) Function is shown. The user development function is directly responsible for implementing the strategy derived to develop a specific, prospective new user. The burden of actually developing the user falls on the NUD representative, and the success of his user contacts will depend upon how well he is prepared to interface with the user. The guidance/instructions as to what a NUD representative needs to know about the prospective user, and the type of data he should provide when calling on a potential user, are presented.

  6. Heavy Cannabis Use Associated With Reduction in Activated and Inflammatory Immune Cell Frequencies in Antiretroviral Therapy-Treated Human Immunodeficiency Virus-Infected Individuals.

    PubMed

    Manuzak, Jennifer A; Gott, Toni M; Kirkwood, Jay S; Coronado, Ernesto; Hensley-McBain, Tiffany; Miller, Charlene; Cheu, Ryan K; Collier, Ann C; Funderburg, Nicholas T; Martin, Jeffery N; Wu, Michael C; Isoherranen, Nina; Hunt, Peter W; Klatt, Nichole R

    2018-06-01

    Cannabis is a widely used drug in the United States, and the frequency of cannabis use in the human immunodeficiency virus (HIV)-infected population is disproportionately high. Previous human and macaque studies suggest that cannabis may have an impact on plasma viral load; however, the relationship between cannabis use and HIV-associated systemic inflammation and immune activation has not been well defined. The impact of cannabis use on peripheral immune cell frequency, activation, and function was assessed in 198 HIV-infected, antiretroviral-treated individuals by flow cytometry. Individuals were categorized into heavy, medium, or occasional cannabis users or noncannabis users based on the amount of the cannabis metabolite 11-nor-carboxy-tetrahydrocannabinol (THC-COOH) detected in plasma by mass spectrometry. Heavy cannabis users had decreased frequencies of human leukocyte antigen (HLA)-DR+CD38+CD4+ and CD8+ T-cell frequencies, compared to frequencies of these cells in non-cannabis-using individuals. Heavy cannabis users had decreased frequencies of intermediate and nonclassical monocyte subsets, as well as decreased frequencies of interleukin 23- and tumor necrosis factor-α-producing antigen-presenting cells. While the clinical implications are unclear, our findings suggest that cannabis use is associated with a potentially beneficial reduction in systemic inflammation and immune activation in the context of antiretroviral-treated HIV infection.

  7. A Functional Role of RB-Dependent Pathway in the Control of Quiescence in Adult Epidermal Stem Cells Revealed by Genomic Profiling

    PubMed Central

    Lorz, Corina; García-Escudero, Ramón; Segrelles, Carmen; Garín, Marina I.; Ariza, José M.; Santos, Mirentxu; Ruiz, Sergio; Lara, María F.; Martínez-Cruz, Ana B.; Costa, Clotilde; Buitrago-Pérez, Águeda; Saiz-Ladera, Cristina; Dueñas, Marta

    2010-01-01

    Continuous cell renewal in mouse epidermis is at the expense of a pool of pluripotent cells that lie in a well defined niche in the hair follicle known as the bulge. To identify mechanisms controlling hair follicle stem cell homeostasis, we developed a strategy to isolate adult bulge stem cells in mice and to define their transcriptional profile. We observed that a large number of transcripts are underexpressed in hair follicle stem cells when compared to non-stem cells. Importantly, the majority of these downregulated genes are involved in cell cycle. Using bioinformatics tools, we identified the E2F transcription factor family as a potential element involved in the regulation of these transcripts. To determine their functional role, we used engineered mice lacking Rb gene in epidermis, which showed increased expression of most E2F family members and increased E2F transcriptional activity. Experiments designed to analyze epidermal stem cell functionality (i.e.: hair regrowth and wound healing) imply a role of the Rb-E2F axis in the control of stem cell quiescence in epidermis. Electronic supplementary material The online version of this article (doi:10.1007/s12015-010-9139-0) contains supplementary material, which is available to authorized users. PMID:20376578

  8. Sequential addition of short DNA oligos in DNA-polymerase-based synthesis reactions

    DOEpatents

    Gardner, Shea N; Mariella, Jr., Raymond P; Christian, Allen T; Young, Jennifer A; Clague, David S

    2013-06-25

    A method of preselecting a multiplicity of DNA sequence segments that will comprise the DNA molecule of user-defined sequence, separating the DNA sequence segments temporally, and combining the multiplicity of DNA sequence segments with at least one polymerase enzyme wherein the multiplicity of DNA sequence segments join to produce the DNA molecule of user-defined sequence. Sequence segments may be of length n, where n is an odd integer. In one embodiment the length of desired hybridizing overlap is specified by the user and the sequences and the protocol for combining them are guided by computational (bioinformatics) predictions. In one embodiment sequence segments are combined from multiple reading frames to span the same region of a sequence, so that multiple desired hybridizations may occur with different overlap lengths.
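The temporal, order-controlled joining described in the patent can be caricatured in a few lines. The fixed-overlap check below is a simplification (the actual overlap lengths and sequences are guided by bioinformatics predictions), and the function name is ours.

```python
def assemble(segments, overlap):
    """Join preselected segments in the order they are 'added', requiring
    each new segment to hybridize with the end of the growing strand over
    `overlap` bases before it is incorporated."""
    seq = segments[0]
    for seg in segments[1:]:
        if seq[-overlap:] != seg[:overlap]:
            raise ValueError("segment does not hybridize with growing strand")
        seq += seg[overlap:]   # polymerase extends past the overlap
    return seq

# Three 5-mers with 3-base overlaps build a 9-base product.
product = assemble(["ATGCG", "GCGTA", "GTACC"], overlap=3)
```

Adding segments one at a time, as in the patent's sequential protocol, is what lets each junction be checked (here, literally) before the next segment joins.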

  9. The EOSDIS Products Usability for Disaster Response.

    NASA Astrophysics Data System (ADS)

    Kafle, D. N.; Wanchoo, L.; Won, Y. I.; Michael, K.

    2016-12-01

The Earth Observing System (EOS) Data and Information System (EOSDIS) is a key core capability in NASA's Earth Science Data System Program. The EOSDIS science operations are performed within a distributed system of interconnected nodes: the Science Investigator-led Processing Systems (SIPS), and the distributed, discipline-specific, Earth science Distributed Active Archive Centers (DAACs), which have specific responsibilities for the production, archiving, and distribution of Earth science data products. NASA also established the Land, Atmosphere Near real-time Capability for EOS (LANCE) program, through which near real-time (NRT) products are produced and distributed within a latency of no more than 3 hours. These data, including NRT, have been widely used by scientists and researchers for studying Earth system science, climate change, natural variability, and enhanced climate predictions including disaster assessments. The Subcommittee on Disaster Reduction (SDR) has defined 15 major types of disasters, such as flood, hurricane, earthquake, volcano, and tsunami. The focus of the study is to categorize both NRT and standard data products based on applicability to the SDR-defined disaster types. This will identify which datasets from current NASA satellite missions/instruments are best suited for disaster response. The distribution metrics of the products used for studying selected disasters that have occurred over the last 5 years will be analyzed, including volume, number of files, number of users, user domains, and user countries. This data usage analysis will provide information to the data centers' staff that can help them develop the functionality and allocate the resources needed for enhanced access and timely availability of the data products that are critical for time-sensitive analyses.

  10. [Profiles of high-frequency users of primary care services and associations with depressive anxiety disorders in Cali, Colombia].

    PubMed

    Rodriguez-Lopez, Mérida; Arrivillaga, Marcela; Holguín, Jorge; León, Hoover; Ávila, Alfonso; Hernández, Carlos; Rincón-Hoyos, Hernán G

    2016-01-01

To determine the profiles of highly frequent users of primary care services and the associations of these profiles with depressive anxiety disorders in Cali, Colombia. In this case-control study, high-frequency cases were defined as those involving patients with a percentile >75 with regard to the frequency of spontaneous use of outpatient facilities in the last 12 months; controls were defined as those with a percentile <25. A multiple correspondence analysis was used to describe patient profiles, and the influence of depression and anxiety on frequent attendance was determined via logistic regression. Among the 780 participating patients, differences in the profiles among frequent users and controls were related to predisposing factors such as sex, age, and education; capacity factors such as the time required to visit the institution and the means of transport used; and need factors such as health perceptions, social support, family function, and the presence of anxiety or depressive disorders. A depression or anxiety disorder was found to associate positively with frequent attendance (adjusted odds ratio [aOR]: 1.99, 95% confidence interval [CI]: 1.19-3.31) and a referral system (aOR: 1.61, 95% CI: 1.01-2.76), but negatively with mild or no family dysfunction (aOR: 0.79; 95% CI: 0.48-0.88) after adjusting for age, sex, ethnicity, and health service-providing institutions. The profiles of high-frequency patients differ from those of control patients with respect to factors related to capacity, need, and willingness; in particular, the latter were independently associated with frequent attendance. Notably, the presence of an anxious or depressive disorder doubled the risk of high-frequency attendance at a primary care facility.

  11. Orbital Debris Engineering Model (ORDEM) v.3

    NASA Technical Reports Server (NTRS)

    Matney, Mark; Krisko, Paula; Xu, Yu-Lin; Horstman, Matthew

    2013-01-01

A model of the manmade orbital debris environment is required by spacecraft designers, mission planners, and others in order to understand and mitigate the effects of the environment on their spacecraft or systems. The manmade environment is dynamic and can be altered significantly by intent (e.g., the Chinese anti-satellite weapon test of January 2007) or accident (e.g., the collision of the Iridium 33 and Cosmos 2251 spacecraft in February 2009). Engineering models are used to portray the manmade debris environment in Earth orbit. The availability of new sensor and in situ data, the re-analysis of older data, and the development of new analytical and statistical techniques have enabled the construction of this more comprehensive and sophisticated model. The primary output of this model is the flux [#debris/area/time] as a function of debris size and year. ORDEM may be operated in spacecraft mode or telescope mode. In the former case, an analyst defines an orbit for a spacecraft and "flies" the spacecraft through the orbital debris environment. In the latter case, an analyst defines a ground-based sensor (telescope or radar) in terms of latitude, azimuth, and elevation, and the model provides the number of orbital debris objects traversing the sensor's field of view. An upgraded graphical user interface (GUI) is integrated with the software. This upgraded GUI uses project-oriented organization and provides the user with graphical representations of numerous output data products. These range from the conventional flux as a function of debris size for chosen analysis orbits (or views), for example, to the more complex color-contoured two-dimensional (2D) directional flux diagrams in local spacecraft elevation and azimuth.
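The model's primary output, flux as a function of debris size, is typically delivered as a tabulated curve that analysts interpolate. A purely illustrative sketch of that interpolation in log-log space (the table values below are invented, not actual ORDEM data):

```python
import math

# Hypothetical cumulative flux table: debris size (m) -> flux (#/m^2/yr).
# Values are illustrative only, NOT actual ORDEM output.
SIZE_M = [1e-5, 1e-4, 1e-3, 1e-2, 1e-1, 1.0]
FLUX = [1e2, 1e0, 1e-2, 1e-4, 1e-6, 1e-8]

def flux_at(size_m: float) -> float:
    """Log-log linear interpolation of cumulative flux at a given debris size."""
    if not SIZE_M[0] <= size_m <= SIZE_M[-1]:
        raise ValueError("size outside table range")
    for i in range(len(SIZE_M) - 1):
        if SIZE_M[i] <= size_m <= SIZE_M[i + 1]:
            t = (math.log10(size_m) - math.log10(SIZE_M[i])) / (
                math.log10(SIZE_M[i + 1]) - math.log10(SIZE_M[i]))
            logf = (1 - t) * math.log10(FLUX[i]) + t * math.log10(FLUX[i + 1])
            return 10 ** logf
```

Log-log interpolation is used because debris flux falls off steeply (roughly as a power law) with increasing size.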

  12. Quality of human-computer interaction - results of a national usability survey of hospital-IT in Germany

    PubMed Central

    2011-01-01

Background Due to the increasing functionality of medical information systems, it is hard to imagine day-to-day work in hospitals without IT support. Therefore, the design of dialogues between humans and information systems is one of the most important issues to be addressed in health care. This survey presents an analysis of the current quality level of human-computer interaction of healthcare IT in German hospitals, focused on the users' point of view. Methods To evaluate the usability of clinical IT according to the design principles of EN ISO 9241-10, the IsoMetrics Inventory, an assessment tool, was used. The focus of this paper has been put on suitability for the task, training effort, and conformity with user expectations, differentiated by information system. Effectiveness was evaluated with a focus on the interoperability and functionality of different IT systems. Results 4521 persons from 371 hospitals visited the start page of the study, and 1003 persons from 158 hospitals completed the questionnaire. The results show relevant variations between different information systems. Conclusions Specialised information systems with defined functionality received better assessments than clinical information systems in general. This could be attributed to the improved customisation of these specialised systems for specific working environments. The results can be used as reference data for the evaluation and benchmarking of human-computer engineering in the clinical health IT context in future studies. PMID:22070880

  13. Design and validation of low-cost assistive glove for hand assessment and therapy during activity of daily living-focused robotic stroke therapy.

    PubMed

    Nathan, Dominic E; Johnson, Michelle J; McGuire, John R

    2009-01-01

    Hand and arm impairment is common after stroke. Robotic stroke therapy will be more effective if hand and upper-arm training is integrated to help users practice reaching and grasping tasks. This article presents the design, development, and validation of a low-cost, functional electrical stimulation grasp-assistive glove for use with task-oriented robotic stroke therapy. Our glove measures grasp aperture while a user completes simple-to-complex real-life activities, and when combined with an integrated functional electrical stimulator, it assists in hand opening and closing. A key function is a new grasp-aperture prediction model, which uses the position of the end-effectors of two planar robots to define the distance between the thumb and index finger. We validated the accuracy and repeatability of the glove and its capability to assist in grasping. Results from five nondisabled subjects indicated that the glove is accurate and repeatable for both static hand-open and -closed tasks when compared with goniometric measures and for dynamic reach-to-grasp tasks when compared with motion analysis measures. Results from five subjects with stroke showed that with the glove, they could open their hands but without it could not. We present a glove that is a low-cost solution for in vivo grasp measurement and assistance.
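The grasp-aperture prediction model described above reduces, at its core, to the distance between the end-effectors of the two planar robots guiding the thumb and index finger. A minimal sketch of that computation (coordinates in meters; names are hypothetical, not the article's actual code):

```python
import math

def grasp_aperture(thumb_xyz, index_xyz):
    """Predicted grasp aperture: Euclidean distance between the two
    planar-robot end-effector positions (thumb and index finger)."""
    return math.dist(thumb_xyz, index_xyz)

# Example: thumb at the origin, index 3 cm away in x and 4 cm in y -> ~5 cm aperture.
print(grasp_aperture((0.0, 0.0, 0.0), (0.03, 0.04, 0.0)))  # ~0.05
```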

  14. Integrating Patient-Reported Outcomes into Spine Surgical Care through Visual Dashboards: Lessons Learned from Human-Centered Design.

    PubMed

    Hartzler, Andrea L; Chaudhuri, Shomir; Fey, Brett C; Flum, David R; Lavallee, Danielle

    2015-01-01

The collection of patient-reported outcomes (PROs) draws attention to issues of importance to patients: physical function and quality of life. The integration of PRO data into clinical decisions and discussions with patients requires thoughtful design of user-friendly interfaces that consider user experience and present data in personalized ways to enhance patient care. Whereas most prior work on PROs focuses on capturing data from patients, little research details how to design effective user interfaces that facilitate use of these data in clinical practice. We share lessons learned from engaging health care professionals to inform the design of visual dashboards, an emerging type of health information technology (HIT). We employed human-centered design (HCD) methods to create visual displays of PROs to support patient care and quality improvement. HCD aims to optimize the design of interactive systems through iterative input from representative users who are likely to use the system in the future. Through three major steps, we engaged health care professionals in targeted, iterative design activities to inform the development of a PRO Dashboard that visually displays patient-reported pain and disability outcomes following spine surgery. Design activities to engage health care administrators, providers, and staff guided our work from design concept to specifications for dashboard implementation. Stakeholder feedback from these health care professionals shaped user interface design features, including predefined overviews that illustrate at-a-glance trends and quarterly snapshots, granular data filters that enable users to dive into detailed PRO analytics, and user-defined views to share and reuse. Feedback also revealed important considerations for quality indicators and privacy-preserving sharing and use of PROs.
Our work illustrates a range of engagement methods guided by human-centered principles and design recommendations for optimizing PRO Dashboards for patient care and quality improvement. Engaging health care professionals as stakeholders is a critical step toward the design of user-friendly HIT that is accepted, usable, and has the potential to enhance quality of care and patient outcomes.

  15. Integrated System Health Management: Foundational Concepts, Approach, and Implementation.

    NASA Technical Reports Server (NTRS)

    Figueroa, Fernando; Schmalzel, John; Walker, Mark; Venkatesh, Meera; Kapadia, Ravi; Morris, Jon; Turowski, Mark; Smith, Harvey

    2009-01-01

Implementation of integrated system health management (ISHM) capability is fundamentally linked to the management of data, information, and knowledge (DIaK) with the purposeful objective of determining the health of a system. It is akin to having a team of experts who are all individually and collectively observing and analyzing a complex system, and communicating effectively with each other in order to arrive at an accurate and reliable assessment of its health. We present concepts, procedures, and a specific approach as a foundation for implementing a credible ISHM capability. The capability stresses integration of DIaK from all elements of a system. The intent is also to make possible implementation of on-board ISHM capability, in contrast to a remote capability. The information presented is the result of many years of research, development, and maturation of technologies, and of prototype implementations in operational systems (rocket engine test facilities). The paper addresses the following topics: (1) the ISHM model of a system; (2) detection of anomaly indicators; (3) determination and confirmation of anomalies; (4) diagnosis of causes and determination of effects; (5) the consistency-checking cycle; (6) management of health information; (7) user interfaces; and (8) an example implementation. ISHM has been defined from many perspectives. We define it as a capability that might be achieved by various approaches. We describe a specific approach that has been matured throughout many years of development and pilot implementations. ISHM is a capability that is achieved by integrating DIaK that might be distributed throughout the system elements (which inherently implies the capability to manage DIaK associated with distributed sub-systems). DIaK must be available to any element of a system at the right time and in accordance with a meaningful context.
ISHM Functional Capability Level (FCL) is measured by how well a system performs the following functions: (1) detect anomalies, (2) diagnose causes, (3) predict future anomalies/failures, and (4) provide the user with an integrated awareness about the condition of every element in the system and guide user decisions.
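The first FCL function, anomaly detection, can be illustrated with a toy sketch: flag sensor samples that fall outside an expected operating band as anomaly indicators. This is only a schematic of the idea, not the paper's actual algorithm:

```python
def detect_anomaly_indicators(readings, low, high):
    """Flag samples outside the expected operating band as anomaly
    indicators (illustrative only); returns the offending indices."""
    return [i for i, r in enumerate(readings) if not (low <= r <= high)]

# Hypothetical temperature channel with a nominal band of 15-25 degC:
temps = [20.1, 20.4, 35.9, 20.2, 19.8, -3.0]
print(detect_anomaly_indicators(temps, low=15.0, high=25.0))  # [2, 5]
```

In a full ISHM implementation, such indicators would then feed the confirmation, diagnosis, and prediction steps listed above.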

  16. Advanced Data Format (ADF) Software Library and Users Guide

    NASA Technical Reports Server (NTRS)

    Smith, Matthew; Smith, Charles A. (Technical Monitor)

    1998-01-01

The "CFD General Notation System" (CGNS) consists of a collection of conventions, and conforming software, for the storage and retrieval of Computational Fluid Dynamics (CFD) data. It facilitates the exchange of data between sites and applications, and helps stabilize the archiving of aerodynamic data. This effort was initiated in order to streamline the procedures for exchanging data and software between NASA and its customers, but the goal is to develop CGNS into a national standard for the exchange of aerodynamic data. The CGNS development team is comprised of members from the Boeing Commercial Airplane Group, NASA-Ames, NASA-Langley, NASA-Lewis, McDonnell-Douglas Corporation (now Boeing-St. Louis), Air Force-Wright Lab., and ICEM-CFD Engineering. The elements of CGNS address all activities associated with the storage of data on external media and its movement to and from application programs. These elements include: 1) the Advanced Data Format (ADF) database manager, consisting of both a file format specification and its I/O software, which handles the actual reading and writing of data from and to external storage media; 2) the Standard Interface Data Structures (SIDS), which specify the intellectual content of CFD data and the conventions governing naming and terminology; 3) the SIDS-to-ADF file mapping conventions, which specify the exact location where the CFD data defined by the SIDS are to be stored within the ADF file(s); and 4) the CGNS mid-level library, which provides CFD-knowledgeable routines suitable for direct installation into application codes. The ADF is a generic database manager with minimal intrinsic capability. It was written for the purpose of storing large numerical datasets in an efficient, platform-independent manner. To be effective, it must be used in conjunction with external agreements on how the data will be organized within the ADF database, such as those defined by the SIDS.
There are currently 34 user callable functions that comprise the ADF Core library and are described in the Users Guide. The library is written in C, but each function has a FORTRAN counterpart.

  17. TIM Version 3.0 beta Technical Description and User Guide - Appendix B - Example input file for TIMv3.0

    EPA Pesticide Factsheets

    Terrestrial Investigation Model, TIM, has several appendices to its user guide. This is the appendix that includes an example input file in its preserved format. Both parameters and comments defining them are included.

  18. User manual for SPLASH (Single Panel Lamp and Shroud Helper).

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larsen, Marvin Elwood

    2006-02-01

The radiant heat test facility develops test sets providing well-characterized thermal environments, often representing fires. Many of the components and procedures have become standardized to such an extent that the development of a specialized design tool to determine optimal configurations for radiant heat experiments was appropriate. SPLASH (Single Panel Lamp and Shroud Helper) is that tool. SPLASH is implemented as a user-friendly, Windows-based program that allows a designer to describe a test setup in terms of parameters such as number of lamps, power, position, and separation distance. This document is a user manual for that software. Any incidental descriptions of theory are only for the purpose of defining the model inputs. The theory for the underlying model is described in SAND2005-2947 (Ref. [1]). SPLASH provides a graphical user interface to define lamp panel and shroud designs parametrically, solves the resulting radiation enclosure problem for up to 2500 surfaces, and provides post-processing to facilitate understanding and documentation of analyzed designs.
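The "radiation enclosure problem" mentioned above is, in standard gray-diffuse form, the linear radiosity system J_i = eps_i*sigma*T_i^4 + (1-eps_i)*sum_j F_ij*J_j. A small sketch of solving that system (generic textbook formulation, not SPLASH's actual solver):

```python
import numpy as np

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/m^2/K^4

def solve_radiosity(eps, T, F):
    """Solve the gray-diffuse enclosure radiosity system
    J_i = eps_i*sigma*T_i^4 + (1-eps_i)*sum_j F_ij*J_j
    for surface radiosities J (W/m^2).  eps: emissivities,
    T: temperatures (K), F: view-factor matrix."""
    eps = np.asarray(eps, dtype=float)
    T = np.asarray(T, dtype=float)
    F = np.asarray(F, dtype=float)
    A = np.eye(len(eps)) - (1.0 - eps)[:, None] * F
    b = eps * SIGMA * T**4
    return np.linalg.solve(A, b)

# Two black surfaces (eps = 1) facing each other: the radiosity of each
# reduces to its blackbody emissive power, sigma*T^4.
J = solve_radiosity([1.0, 1.0], [300.0, 500.0], [[0.0, 1.0], [1.0, 0.0]])
```

For an N-surface enclosure (up to 2500 in SPLASH), this is an N-by-N dense linear solve once the view factors F are computed from the lamp/shroud geometry.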

  19. Software-defined Quantum Networking Ecosystem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humble, Travis S.; Sadlier, Ronald

The software enables a user to perform modeling and simulation of software-defined quantum networks. The software addresses the problem of how to synchronize transmission of quantum and classical signals through multi-node networks and to demonstrate quantum information protocols such as quantum teleportation. The software approaches this problem by generating a graphical model of the underlying network and attributing properties to each node and link in the graph. The graphical model is then simulated using a combination of discrete-event simulators to calculate the expected state of each node and link in the graph at a future time. A user interacts with the software by providing an initial network model and instantiating methods for the nodes to transmit information with each other. This includes writing application scripts in Python that make use of the software library interfaces. A user then initiates the application scripts, which invoke the software simulation. The user then uses the built-in diagnostic tools to query the state of the simulation and to collect statistics on synchronization.
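The discrete-event approach described above can be sketched with a plain event queue over a graph of links. The node/link model and delay values below are hypothetical, not the software's actual interfaces:

```python
import heapq

def simulate(events, links):
    """Minimal discrete-event sketch of transmissions over a network
    graph.  `events` are (time, src, dst) send requests; `links` maps
    (src, dst) to a propagation delay.  Returns the arrival log
    (arrival_time, src, dst) sorted by time."""
    queue = list(events)
    heapq.heapify(queue)          # process sends in time order
    log = []
    while queue:
        t, src, dst = heapq.heappop(queue)
        log.append((t + links[(src, dst)], src, dst))  # schedule arrival
    return sorted(log)

# Two hypothetical links with different delays; both signals arrive at t=1.5.
links = {("A", "B"): 1.5, ("B", "C"): 0.5}
print(simulate([(0.0, "A", "B"), (1.0, "B", "C")], links))
# [(1.5, 'A', 'B'), (1.5, 'B', 'C')]
```

A real simulator would attach quantum and classical channel state to each node and link; the event queue above only shows the synchronization skeleton.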

  20. System and method for creating expert systems

    NASA Technical Reports Server (NTRS)

    Hughes, Peter M. (Inventor); Luczak, Edward C. (Inventor)

    1998-01-01

    A system and method provides for the creation of a highly graphical expert system without the need for programming in code. An expert system is created by initially building a data interface, defining appropriate Mission, User-Defined, Inferred, and externally-generated GenSAA (EGG) data variables whose data values will be updated and input into the expert system. Next, rules of the expert system are created by building appropriate conditions of the rules which must be satisfied and then by building appropriate actions of rules which are to be executed upon corresponding conditions being satisfied. Finally, an appropriate user interface is built which can be highly graphical in nature and which can include appropriate message display and/or modification of display characteristics of a graphical display object, to visually alert a user of the expert system of varying data values, upon conditions of a created rule being satisfied. The data interface building, rule building, and user interface building are done in an efficient manner and can be created without the need for programming in code.
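The condition/action rule structure described above can be sketched as a miniature rule engine. The variable names and rules below are hypothetical, not GenSAA's actual representation:

```python
# A miniature condition/action rule engine: each rule pairs a condition
# over the current data-variable values with an action to execute when
# the condition is satisfied (illustrative sketch only).
def evaluate_rules(variables, rules):
    """Fire every rule whose condition holds for `variables`;
    return the list of triggered action results."""
    fired = []
    for condition, action in rules:
        if condition(variables):
            fired.append(action(variables))
    return fired

rules = [
    (lambda v: v["battery_v"] < 24.0,
     lambda v: f"ALERT: battery low ({v['battery_v']} V)"),
    (lambda v: v["temp_c"] > 80.0,
     lambda v: "ALERT: overtemperature"),
]
print(evaluate_rules({"battery_v": 23.1, "temp_c": 41.0}, rules))
# ['ALERT: battery low (23.1 V)']
```

In the system described above, the actions would update message displays or the display characteristics of graphical objects rather than return strings.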

  1. Systematic Assessment of the Impact of User Roles on Network Flow Patterns

    DTIC Science & Technology

    2017-09-01

…and evaluating users based on roles provide the best approach for defining normal digital behaviors? People are individuals, with different interests… activities on the network. We evaluate the assumption that users sharing similar roles exhibit similar network behaviors, and contrast the level of similarity…

  2. Feasibility and effectiveness of a combined individual and psychoeducational group intervention in psychiatric residential facilities: A controlled, non-randomized study.

    PubMed

    Magliano, Lorenza; Puviani, Marta; Rega, Sonia; Marchesini, Nadia; Rossetti, Marisa; Starace, Fabrizio

    2016-01-30

This controlled, non-randomized study explored the feasibility of introducing a Combined Individual and Group Intervention (CIGI) for users with mental disorders in residential facilities, and tested whether users who received the CIGI had better functioning at two-year follow-up than users who received the treatment as usual (TAU). In the CIGI, a structured cognitive-behavioral approach called VADO (in English, Skills Assessment and Definition of Goals) was used to set specific goals with each user, while Falloon's psychoeducational treatment was applied with the users as a group. Thirty-one professionals attended a training course in CIGI, open to users' voluntary participation, and applied it for two years with all users living in 8 residential facilities of the Mental Health Department of Modena, Italy. In the same department, 5 other residential facilities providing TAU were used as controls. ANOVA for repeated measures showed a significant interaction effect between users' functioning at the baseline and follow-up assessments and the intervention. In particular, the change in global functioning was greater in the 55 CIGI users than in the 44 TAU users. These results suggest that CIGI can be successfully introduced in residential facilities and may be useful for improving functioning in users with severe mental disorders. Copyright © 2016. Published by Elsevier Ireland Ltd.

  3. Characteristics and drug utilization patterns for heavy users of prescription drugs among the elderly: a Danish register-based drug utilization study.

    PubMed

    Øymoen, Anita; Pottegård, Anton; Almarsdóttir, Anna Birna

    2015-06-01

The objectives of this study were to (1) identify and characterize heavy users of prescription drugs among persons aged 60 years and above; (2) investigate the association of demographic, socioeconomic, and health-related variables with being a heavy drug user; and (3) study the most frequently used drugs among heavy drug users and the development in use over time. This is a descriptive study. Heavy drug users were defined as the accumulated top 1 percentile who accounted for the largest share of prescription drug use measured in number of dispensed defined daily doses (DDDs). The nationwide Danish registers were used to obtain data. Multivariable binary logistic regression was used to determine which factors were associated with being a heavy drug user. Heavy drug users among persons aged 60 years and above accounted for 6.8, 6.0, and 5.5% of prescription drug use in 2002, 2007, and 2012, respectively. Male gender, age 60-69 years, being divorced, shorter education, low annual income, and recent hospitalization were all significantly associated with being in the top 1 percentile group of drug users (p < 0.05). The ten most frequently used drug classes among heavy drug users accounted for 75.4% of their use in 2012, and five of these were cardiovascular drugs. The development over time for the ten most used drug classes followed the same pattern among heavy drug users and in the general population. There is a skewed utilization of prescription drugs. Contrary to earlier findings, being male was associated with heavy prescription drug use both with respect to the number of drugs used and drug expenditure.

  4. Remapping residual coordination for controlling assistive devices and recovering motor functions.

    PubMed

    Pierella, Camilla; Abdollahi, Farnaz; Farshchiansadegh, Ali; Pedersen, Jessica; Thorp, Elias B; Mussa-Ivaldi, Ferdinando A; Casadio, Maura

    2015-12-01

The concept of human motor redundancy has attracted much attention since the early studies of motor control, as it highlights the ability of the motor system to generate a great variety of movements to achieve any well-defined goal. The abundance of degrees of freedom in the human body may be a fundamental resource in the learning and remapping problems that are encountered in human-machine interface (HMI) development. An HMI can act at different levels, decoding brain signals or body signals to control an external device. The transformation from neural signals to device commands is the core of research on brain-machine interfaces (BMIs). However, while BMIs bypass completely the final path of the motor system, body-machine interfaces (BoMIs) take advantage of motor skills that are still available to the user and have the potential to enhance these skills through their consistent use. BoMIs empower people with severe motor disabilities with the possibility to control external devices, and they concurrently offer the opportunity to focus on achieving rehabilitative goals. In this study we describe a theoretical paradigm for the use of a BoMI in rehabilitation. The proposed BoMI remaps the user's residual upper-body mobility to the two coordinates of a cursor on a computer screen. This mapping is obtained by principal component analysis (PCA). We hypothesize that the BoMI can be specifically programmed to engage the users in functional exercises aimed at partial recovery of motor skills, while simultaneously controlling the cursor and carrying out functional tasks, e.g. playing games. Specifically, PCA allows us to select not only the subspace that is most comfortable for the user to act upon, but also the degrees of freedom and coordination patterns that the user has more difficulty engaging. In this article, we describe a family of map modifications that can be made to change the motor behavior of the user.
Depending on the characteristics of the impairment of each high-level spinal cord injury (SCI) survivor, we can make modifications to restore a higher level of symmetric mobility (left versus right), or to increase the strength and range of motion of the upper body that was spared by the injury. Results showed that this approach restored symmetry between left and right side of the body, with an increase of mobility and strength of all the degrees of freedom in the participants involved in the control of the interface. This is a proof of concept that our BoMI may be used concurrently to control assistive devices and reach specific rehabilitative goals. Engaging the users in functional and entertaining tasks while practicing the interface and changing the map in the proposed ways is a novel approach to rehabilitation treatments facilitated by portable and low-cost technologies. Copyright © 2015 Elsevier Ltd. All rights reserved.
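The PCA-based remapping from residual body motion to two cursor coordinates can be sketched with a plain SVD. The sensor count and calibration data below are hypothetical; the actual interface calibrates on real body-sensor signals:

```python
import numpy as np

def fit_bomi_map(calibration, k=2):
    """Fit a PCA map from high-dimensional body-sensor samples
    (rows of `calibration`) to a k-D cursor space (sketch of the idea)."""
    mean = calibration.mean(axis=0)
    # Principal directions come from the SVD of the centered calibration data.
    _, _, Vt = np.linalg.svd(calibration - mean, full_matrices=False)
    components = Vt[:k]                      # top-k principal axes
    def to_cursor(sample):
        """Project a body-sensor sample onto the k cursor coordinates."""
        return (np.asarray(sample) - mean) @ components.T
    return to_cursor

rng = np.random.default_rng(0)
calib = rng.normal(size=(200, 8))   # hypothetical: 8 sensor channels, 200 samples
to_cursor = fit_bomi_map(calib)
xy = to_cursor(calib[0])            # 2-D cursor coordinates for one sample
```

The map modifications the article describes would correspond to altering which components are kept, or re-weighting them, to deliberately engage weaker degrees of freedom.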

  5. Effects of Chronic Active Cannabis Use on Visuomotor Integration, in Relation to Brain Activation and Cortisol Levels

    PubMed Central

    King, G.R.; Ernst, T.; Deng, W.; Stenger, A.; Gonzales, R.M.K; Nakama, H.; Chang, L.

    2012-01-01

Cannabis is the most abused illegal substance in the United States. Alterations in brain function and motor behavior have been reported in chronic cannabis users, but the results have been variable. The current study aimed to determine whether chronic active cannabis use in humans may alter psychomotor function, brain activation, and hypothalamic-pituitary-adrenal (HPA) axis function in men and women. 30 cannabis users (16 men and 14 women, 18 to 45 years old) and 30 non-drug-user controls (16 men and 14 women, 19 to 44 years old) were evaluated with neuropsychological tests designed to assess motor behavior and with functional MRI (fMRI), using a 3 Tesla scanner, during a visually paced finger-sequencing task, cued by a flashing checkerboard (at 2 or 4 Hz). Salivary cortisol was measured to assess HPA function. Male, but not female, cannabis users had significantly slower performance on psychomotor speed tests. As a group, cannabis users had greater activation in BA 6 than controls, while controls had greater activation in the visual area BA 17 than cannabis users. Cannabis users also had higher salivary cortisol levels than controls (p = 0.002). Chronic active cannabis use is associated with slower and less efficient psychomotor function, especially in the male users, as indicated by a shift from regions involved with automated visually guided responses to more executive or attentional control areas. These brain activities may be attenuated by the higher cortisol levels in the cannabis users, which in turn may lead to less efficient visual-motor function. PMID:22159107

  6. Other drug use does not impact cognitive impairments in chronic ketamine users.

    PubMed

    Zhang, Chenxi; Tang, Wai Kwong; Liang, Hua Jun; Ungvari, Gabor Sandor; Lin, Shih-Ku

    2018-05-01

Ketamine abuse causes cognitive impairments, which negatively impact users' abstinence, prognosis, and quality of life. Findings of cognitive impairments in chronic ketamine users have been inconsistent across studies, possibly due to small sample sizes and the confounding effects of concomitant use of other illicit drugs. This study investigated cognitive impairment and its related factors in chronic ketamine users with a large sample size, and explored the impact of other drug use on cognitive functions. Cognitive functions, including working, verbal, and visual memory and executive functions, were assessed in ketamine users (286 non-heavy other drug users and 279 heavy other drug users) and 261 healthy controls. Correlations between cognitive impairment and patterns of ketamine use were analysed. Verbal and visual memory were impaired, but working memory and executive functions were intact, for all ketamine users. No significant cognitive differences were found between the two ketamine groups. A greater number of days of ketamine use in the past month was associated with worse visual memory performance in non-heavy other drug users. A higher dose of ketamine use was associated with worse short-term verbal memory in heavy other drug users. Verbal and visual memory are impaired in chronic ketamine users. Other drug use appears to have no impact on ketamine users' cognitive performance. Copyright © 2018. Published by Elsevier B.V.

  7. Making Sense of Complexity with FRE, a Scientific Workflow System for Climate Modeling (Invited)

    NASA Astrophysics Data System (ADS)

    Langenhorst, A. R.; Balaji, V.; Yakovlev, A.

    2010-12-01

A workflow is a description of a sequence of activities that is both precise and comprehensive. Capturing the workflow of climate experiments provides a record which can be queried or compared, and allows reproducibility of the experiments - sometimes even to the bit level of the model output. This reproducibility helps to verify the integrity of the output data, and enables easy perturbation experiments. GFDL's Flexible Modeling System Runtime Environment (FRE) is a production-level software project which defines and implements building blocks of the workflow as command line tools. The scientific, numerical, and technical input needed to complete the workflow of an experiment is recorded in an experiment description file in XML format. Several key features add convenience and automation to the FRE workflow: ● Experiment inheritance makes it possible to define a new experiment with only a reference to the parent experiment and the parameters to override. ● Testing is a basic element of the FRE workflow: experiments define short test runs which are verified before the main experiment is run, and a set of standard experiments is verified with new code releases. ● FRE is flexible enough to support everything from short runs producing mere megabytes of data to high-resolution experiments that run on thousands of processors for months, producing terabytes of output data. Experiments run in segments of model time; after each segment, the state is saved and the model can be checkpointed at that level. Segment length is defined by the user, but the number of segments per system job is calculated to fit optimally within the batch scheduler requirements. FRE provides job control across multiple segments, and tools to monitor and alter the state of long-running experiments. ● Experiments are entered into a Curator Database, which stores queryable metadata about the experiment and the experiment's output.
● FRE includes a set of standardized post-processing functions as well as the ability to incorporate user-level functions. FRE post-processing can take us all the way to the preparation of graphical output for a scientific audience, and the publication of data on a public portal. ● Recent FRE development includes incorporating a distributed workflow to support remote computing.
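Experiment inheritance, as described in the first bullet above, can be sketched as a parameter-override walk up the parent chain. FRE stores experiments in XML; the dictionary form and parameter names below are only illustrative:

```python
def resolve_experiment(name, experiments):
    """Resolve an experiment's parameters by walking its inheritance
    chain: child values override parent values (a sketch of the FRE
    idea; FRE itself records experiments in an XML description file)."""
    exp = experiments[name]
    params = {}
    if exp.get("inherit"):
        params.update(resolve_experiment(exp["inherit"], experiments))
    params.update(exp.get("params", {}))
    return params

# Hypothetical experiments: "high_res" overrides only the resolution.
experiments = {
    "control":  {"params": {"resolution": "c48", "years": 10, "dt": 1800}},
    "high_res": {"inherit": "control", "params": {"resolution": "c192"}},
}
print(resolve_experiment("high_res", experiments))
# {'resolution': 'c192', 'years': 10, 'dt': 1800}
```

The child experiment needs to state only what differs from its parent; everything else is filled in from the chain.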

  8. Preliminary Human Factors Guidelines for Automated Highway System Designers, Second Edition - Volume 2: User-System Transactions

    DOT National Transportation Integrated Search

    1998-04-01

    Human factors can be defined as "designing to match the capabilities and limitations of the human user." The objectives of this human-centered design process are to maximize the effectiveness and efficiency of system performance, ensure a high level ...

  9. Orbital service module systems analysis study documentation. Volume 2: Technical report

    NASA Technical Reports Server (NTRS)

    1978-01-01

Near-term, cost-effective concepts were defined to augment the power and duration capability offered to shuttle payload users. Feasible concept options that could evolve to provide free-flying power and other services to users in the 1984 time frame were also examined.

  10. From scenarios to domain models: processes and representations

    NASA Astrophysics Data System (ADS)

    Haddock, Gail; Harbison, Karan

    1994-03-01

    The domain specific software architectures (DSSA) community has defined a philosophy for the development of complex systems. This philosophy improves productivity and efficiency by increasing the user's role in the definition of requirements, increasing the systems engineer's role in the reuse of components, and decreasing the software engineer's role to the development of new components and component modifications only. The scenario-based engineering process (SEP), the first instantiation of the DSSA philosophy, has been adopted by the next generation controller project. It is also the chosen methodology of the trauma care information management system project, and the surrogate semi-autonomous vehicle project. SEP uses scenarios from the user to create domain models and define the system's requirements. Domain knowledge is obtained from a variety of sources including experts, documents, and videos. This knowledge is analyzed using three techniques: scenario analysis, task analysis, and object-oriented analysis. Scenario analysis results in formal representations of selected scenarios. Task analysis of the scenario representations results in descriptions of tasks necessary for object-oriented analysis and also subtasks necessary for functional system analysis. Object-oriented analysis of task descriptions produces domain models and system requirements. This paper examines the representations that support the DSSA philosophy, including reference requirements, reference architectures, and domain models. The processes used to create and use the representations are explained through use of the scenario-based engineering process. Selected examples are taken from the next generation controller project.

  11. The economics of time shared computing: Congestion, user costs and capacity

    NASA Technical Reports Server (NTRS)

    Agnew, C. E.

    1982-01-01

    Time shared systems permit the fixed costs of computing resources to be spread over large numbers of users. However, bottleneck results in the theory of closed queueing networks can be used to show that this economy of scale will be offset by the increased congestion that results as more users are added to the system. If one considers the total costs, including the congestion cost, there is an optimal number of users for a system which equals the saturation value usually used to define system capacity.
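
    The saturation value mentioned above comes from asymptotic bounds on closed queueing networks; a minimal sketch of the calculation, with illustrative service demands and think time (not figures from the paper):

```python
def saturation_point(demands, think_time=0.0):
    """Asymptotic-bound saturation point N* of a closed queueing network.

    demands: per-resource service demands (seconds of service per
    interaction); think_time: user think time Z. Beyond
    N* = (sum(D) + Z) / max(D), throughput is limited by the bottleneck
    resource and response time grows roughly linearly with each added
    user, which is the congestion cost the paper weighs against the
    economy of scale.
    """
    d_max = max(demands)
    return (sum(demands) + think_time) / d_max

# Illustrative numbers: 0.2 s of CPU and 0.05 s of disk demand per
# interaction, with 10 s of user think time.
n_star = saturation_point([0.2, 0.05], think_time=10.0)  # 51.25 users
```

With these (made-up) demands the CPU is the bottleneck, so adding users past about 51 only lengthens the CPU queue.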

  12. A software for managing after-hours activities in research user facilities

    DOE PAGES

    Camino, F. E.

    2017-05-01

    Here, we present an after-hours activity management program for shared facilities, which handles the processes required for after-hours access (request, approval, extension, etc.). It implements the concept of permitted after-hours activities: a list of well-defined activities that each user may perform after hours. The program gives users an easy and unambiguous way to know which activities they are allowed to perform after hours. In addition, the program can enhance its safety efficacy by interacting with the lab and instrument access control systems commonly present in user facilities.
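
    At its core, the permitted-activities concept reduces to a per-user lookup that is consulted only outside business hours; a minimal sketch with hypothetical users, activities, and hours (none taken from the software described):

```python
from datetime import time

# Hypothetical permitted-activities table: user -> approved after-hours
# activities. In the real system this would come from the request /
# approval workflow.
PERMITTED = {
    "alice": {"SEM imaging", "optical microscopy"},
    "bob": {"optical microscopy"},
}

BUSINESS_HOURS = (time(8, 0), time(18, 0))

def may_perform(user, activity, at):
    """Allow any activity during business hours; after hours, allow
    only activities on the user's approved list."""
    start, end = BUSINESS_HOURS
    if start <= at <= end:
        return True
    return activity in PERMITTED.get(user, set())

assert may_perform("bob", "SEM imaging", time(12, 0))      # business hours
assert not may_perform("bob", "SEM imaging", time(22, 0))  # not approved
assert may_perform("alice", "SEM imaging", time(22, 0))    # approved
```

A check of this shape is also what lets the program drive instrument access control systems directly.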

  13. A software for managing after-hours activities in research user facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Camino, F. E.

    Here, we present an after-hours activity management program for shared facilities, which handles the processes required for after-hours access (request, approval, extension, etc.). It implements the concept of permitted after-hours activities: a list of well-defined activities that each user may perform after hours. The program gives users an easy and unambiguous way to know which activities they are allowed to perform after hours. In addition, the program can enhance its safety efficacy by interacting with the lab and instrument access control systems commonly present in user facilities.

  14. A pseudo-noise transponder design for low data rate users of the tracking and data relay satellite system

    NASA Technical Reports Server (NTRS)

    Birch, J. N.

    1971-01-01

    A compromise optimum design for the low data rate users of the Tracking and Data Relay Satellite System (TDRSS) is presented. Design goals for the TDRSS are employed in this report to arrive at the transponder design. Multipath, RFI, antenna pattern anomalies, other user signals, and other definable degrading factors are included as trade-off parameters in the design. Synchronization, emergency voice, user stabilization, polarization diversity, and error control coding are also considered, and their impact on the transponder design is evaluated.

  15. Virtual Worlds for Virtual Organizing

    NASA Astrophysics Data System (ADS)

    Rhoten, Diana; Lutters, Wayne

    The members and resources of a virtual organization are dispersed across time and space, yet they function as a coherent entity through the use of technologies, networks, and alliances. As virtual organizations proliferate and become increasingly important in society, many may exploit the technical architectures of virtual worlds, which are the confluence of computer-mediated communication, telepresence, and virtual reality originally created for gaming. A brief socio-technical history describes their early origins and the waves of progress followed by stasis that brought us to the current period of renewed enthusiasm. Examination of contemporary examples demonstrates how three genres of virtual worlds have enabled new arenas for virtual organizing: developer-defined closed worlds, user-modifiable quasi-open worlds, and user-generated open worlds. Among expected future trends are an increase in collaboration born virtually rather than imported from existing organizations, a tension between high-fidelity recreations of the physical world and hyper-stylized imaginations of fantasy worlds, and the growth of specialized worlds optimized for particular sectors, companies, or cultures.

  16. Advanced Proficiency EHR Training: Effect on Physicians’ EHR Efficiency, EHR Satisfaction and Job Satisfaction

    PubMed Central

    Dastagir, M. Tariq; Chin, Homer L.; McNamara, Michael; Poteraj, Kathy; Battaglini, Sarah; Alstot, Lauren

    2012-01-01

    The best way to train clinicians to optimize their use of the Electronic Health Record (EHR) remains unclear. Approaches range from web-based training, class-room training, EHR functionality training, case-based training, role-based training, process-based training, mock-clinic training and “on the job” training. Similarly, the optimal timing of training remains unclear--whether to engage in extensive pre go-live training vs. minimal pre go-live training followed by more extensive post go-live training. In addition, the effectiveness of non-clinician trainers, clinician trainers, and peer-trainers, remains unclearly defined. This paper describes a program in which relatively experienced clinician users of an EHR underwent an intensive 3-day Peer-Led EHR advanced proficiency training, and the results of that training based on participant surveys. It highlights the effectiveness of Peer-Led Proficiency Training of existing experienced clinician EHR users in improving self-reported efficiency and satisfaction with an EHR and improvements in perceived work-life balance and job satisfaction. PMID:23304282

  17. Advanced proficiency EHR training: effect on physicians' EHR efficiency, EHR satisfaction and job satisfaction.

    PubMed

    Dastagir, M Tariq; Chin, Homer L; McNamara, Michael; Poteraj, Kathy; Battaglini, Sarah; Alstot, Lauren

    2012-01-01

    The best way to train clinicians to optimize their use of the Electronic Health Record (EHR) remains unclear. Approaches range from web-based training, class-room training, EHR functionality training, case-based training, role-based training, process-based training, mock-clinic training and "on the job" training. Similarly, the optimal timing of training remains unclear--whether to engage in extensive pre go-live training vs. minimal pre go-live training followed by more extensive post go-live training. In addition, the effectiveness of non-clinician trainers, clinician trainers, and peer-trainers, remains unclearly defined. This paper describes a program in which relatively experienced clinician users of an EHR underwent an intensive 3-day Peer-Led EHR advanced proficiency training, and the results of that training based on participant surveys. It highlights the effectiveness of Peer-Led Proficiency Training of existing experienced clinician EHR users in improving self-reported efficiency and satisfaction with an EHR and improvements in perceived work-life balance and job satisfaction.

  18. Healthwatch-2 System Overview

    NASA Technical Reports Server (NTRS)

    Barszcz, Eric; Mosher, Marianne; Huff, Edward M.

    2004-01-01

    Healthwatch-2 (HW-2) is a research tool designed to facilitate the development and testing of in-flight health monitoring algorithms. HW-2 software is written in C/C++ and executes on an x86-based computer running the Linux operating system. The executive module has interfaces for collecting various signal data, such as vibration, torque, tachometer, and GPS. It is designed to perform in-flight time or frequency averaging based on specifications defined in a user-supplied configuration file. Averaged data are then passed to a user-supplied algorithm written as a Matlab function. This allows researchers a convenient method for testing in-flight algorithms. In addition to its in-flight capabilities, HW-2 software is also capable of reading archived flight data and processing it as if collected in-flight. This allows algorithms to be developed and tested in the laboratory before being flown. Currently HW-2 has passed its checkout phase and is collecting data on a Bell OH-58C helicopter operated by the U.S. Army at NASA Ames Research Center.
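
    The abstract does not show the averaging itself; a minimal sketch of one technique commonly used on rotorcraft vibration data, time-synchronous averaging, assuming the tachometer has already been used to resample the signal to a fixed number of samples per shaft revolution (an assumption, not HW-2's documented pipeline):

```python
def time_synchronous_average(signal, samples_per_rev):
    """Average a vibration signal over whole shaft revolutions.

    Components synchronous with the shaft reinforce, while
    non-synchronous content and noise average toward zero. Any partial
    trailing revolution is discarded.
    """
    n_revs = len(signal) // samples_per_rev
    avg = [0.0] * samples_per_rev
    for r in range(n_revs):
        for i in range(samples_per_rev):
            avg[i] += signal[r * samples_per_rev + i]
    return [v / n_revs for v in avg]

# Two identical revolutions average to the per-revolution pattern.
avg = time_synchronous_average([1, 2, 3, 1, 2, 3], 3)  # [1.0, 2.0, 3.0]
```

The averaged output is exactly the kind of per-revolution data that would then be handed to a user-supplied analysis function.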

  19. ChromA: signal-based retention time alignment for chromatography–mass spectrometry data

    PubMed Central

    Hoffmann, Nils; Stoye, Jens

    2009-01-01

    Summary: We describe ChromA, a web-based alignment tool for chromatography–mass spectrometry data from the metabolomics and proteomics domains. Users can supply their data in open and standardized file formats for retention time alignment using dynamic time warping with different configurable local distance and similarity functions. Additionally, user-defined anchors can be used to constrain and speedup the alignment. A neighborhood around each anchor can be added to increase the flexibility of the constrained alignment. ChromA offers different visualizations of the alignment for easier qualitative interpretation and comparison of the data. For the multiple alignment of more than two data files, the center-star approximation is applied to select a reference among input files to align to. Availability: ChromA is available at http://bibiserv.techfak.uni-bielefeld.de/chroma. Executables and source code under the L-GPL v3 license are provided for download at the same location. Contact: stoye@techfak.uni-bielefeld.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:19505941
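
    The core alignment step, dynamic time warping with a configurable local distance function, can be sketched as follows (a toy scalar version; ChromA operates on chromatographic/mass-spectral data and additionally supports anchors and neighborhoods):

```python
def dtw(a, b, dist=lambda x, y: abs(x - y)):
    """Dynamic time warping distance between two sequences with a
    configurable local distance function, the mechanism used for
    retention time alignment."""
    inf = float("inf")
    n, m = len(a), len(b)
    D = [[inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            D[i][j] = dist(a[i - 1], b[j - 1]) + min(
                D[i - 1][j],      # expansion in a
                D[i][j - 1],      # expansion in b
                D[i - 1][j - 1],  # match
            )
    return D[n][m]

# Identical traces align at zero cost; so does a trace whose peak is
# merely delayed, which is exactly the retention time shift DTW absorbs.
assert dtw([0, 1, 2, 1, 0], [0, 1, 2, 1, 0]) == 0.0
assert dtw([0, 1, 2, 1, 0], [0, 0, 1, 2, 1, 0]) == 0.0
```

User-defined anchors constrain which cells (i, j) of the matrix may be visited, which is what speeds up the full alignment.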

  20. 77 FR 37720 - Options Price Reporting Authority; Notice of Filing and Immediate Effectiveness of Proposed...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-22

    ... ``devices'' and/or ``UserIDs.'' \\4\\ OPRA defines a ``Subscriber,'' in general, as an entity or person that... Exchanges on an Unlisted Trading Privilege Basis'' (``Nasdaq/UTP Plan''). CTA and the Nasdaq/UTP Plan define... capacity.\\9\\ Second, because the definition of the term ``associated person'' is defined differently in the...

  1. COMT val158met and 5-HTTLPR Genetic Polymorphisms Moderate Executive Control in Cannabis Users

    PubMed Central

    Verdejo-García, Antonio; Beatriz Fagundo, Ana; Cuenca, Aida; Rodriguez, Joan; Cuyás, Elisabet; Langohr, Klaus; de Sola Llopis, Susana; Civit, Ester; Farré, Magí; Peña-Casanova, Jordi; de la Torre, Rafael

    2013-01-01

    The adverse effects of cannabis use on executive functions are still controversial, fostering the need for novel biomarkers able to unveil individual differences in the cognitive impact of cannabis consumption. Two common genetic polymorphisms have been linked to the neuroadaptive impact of Δ9-tetrahydrocannabinol (THC) exposure and to executive functions in animals: the catechol-O-methyltransferase (COMT) gene val158met polymorphism and the SLC6A4 gene 5-HTTLPR polymorphism. We aimed to test if these polymorphisms moderate the harmful effects of cannabis use on executive function in young cannabis users. We recruited 144 participants: 86 cannabis users and 58 non-drug user controls. Both groups were genotyped and matched for genetic makeup, sex, age, education, and IQ. We used a computerized neuropsychological battery to assess different aspects of executive functions: sustained attention (CANTAB Rapid Visual Information Processing Test, RVIP), working memory (N-back), monitoring/shifting (CANTAB ID/ED set shifting), planning (CANTAB Stockings of Cambridge, SOC), and decision-making (Iowa Gambling Task, IGT). We used general linear model-based analyses to test performance differences between cannabis users and controls as a function of genotypes. We found that: (i) daily cannabis use is not associated with executive function deficits; and (ii) COMT val158met and 5-HTTLPR polymorphisms moderate the link between cannabis use and executive performance. Cannabis users carrying the COMT val/val genotype exhibited lower accuracy of sustained attention, associated with a more strict response bias, than val/val non-users. Cannabis users carrying the COMT val allele also committed more monitoring/shifting errors than cannabis users carrying the met/met genotype. Finally, cannabis users carrying the 5-HTTLPR s/s genotype had worse IGT performance than s/s non-users. COMT and SLC6A4 genes moderate the impact of cannabis use on executive functions. PMID:23449176

  2. Exploring the impact of wheelchair design on user function in a rural South African setting.

    PubMed

    Visagie, Surona; Duffield, Svenje; Unger, Mariaan

    2015-01-01

    Wheelchairs provide mobility that can enhance function and community integration. Function in a wheelchair is influenced by wheelchair design. To explore the impact of wheelchair design on user function and the variables that guided wheelchair prescription in the study setting. A mixed-method, descriptive design using convenience sampling was implemented. Quantitative data were collected from 30 wheelchair users using the Functioning Every Day with a Wheelchair Scale and a Wheelchair Specification Checklist. Qualitative data were collected, through interviews, from ten therapists who prescribed wheelchairs to these users. The Kruskal-Wallis test was used to identify relationships, and content analysis was undertaken to identify emerging themes in the qualitative data. Wheelchairs with urban designs were issued to 25 (83%) participants. Wheelchair size, fit, support, and functional features created challenges concerning transport, operating the wheelchair, performing personal tasks, and indoor and outdoor mobility. Users of wheelchairs designed for semi-rural environments achieved significantly better scores for the appropriateness of the prescribed wheelchair than users of wheelchairs designed for urban use (p < 0.01). Therapists most often prescribed the basic four-wheel folding frame design because of a lack of funding, lack of assessment, lack of skills, and user choice. Issuing urban-type wheelchairs to users living in rural settings might have a negative effect on users' functional outcomes. Comprehensive assessment, further training, and research on the long-term cost and quality-of-life implications of providing a suitable wheelchair versus a cheaper, less suitable option are recommended.

  3. PRISE2: software for designing sequence-selective PCR primers and probes.

    PubMed

    Huang, Yu-Ting; Yang, Jiue-in; Chrobak, Marek; Borneman, James

    2014-09-25

    PRISE2 is a new software tool for designing sequence-selective PCR primers and probes. To achieve high level of selectivity, PRISE2 allows the user to specify a collection of target sequences that the primers are supposed to amplify, as well as non-target sequences that should not be amplified. The program emphasizes primer selectivity on the 3' end, which is crucial for selective amplification of conserved sequences such as rRNA genes. In PRISE2, users can specify desired properties of primers, including length, GC content, and others. They can interactively manipulate the list of candidate primers, to choose primer pairs that are best suited for their needs. A similar process is used to add probes to selected primer pairs. More advanced features include, for example, the capability to define a custom mismatch penalty function. PRISE2 is equipped with a graphical, user-friendly interface, and it runs on Windows, Macintosh or Linux machines. PRISE2 has been tested on two very similar strains of the fungus Dactylella oviparasitica, and it was able to create highly selective primers and probes for each of them, demonstrating the ability to create useful sequence-selective assays. PRISE2 is a user-friendly, interactive software package that can be used to design high-quality selective primers for PCR experiments. In addition to choosing primers, users have an option to add a probe to any selected primer pair, enabling design of Taqman and other primer-probe based assays. PRISE2 can also be used to design probes for FISH and other hybridization-based assays.
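
    The emphasis on 3'-end selectivity can be illustrated with a mismatch penalty of the kind PRISE2 lets users define; the weighting scheme and window size below are hypothetical, chosen only to show the shape of such a function:

```python
def mismatch_penalty(primer, target, three_prime_weight=4.0, window=5):
    """Score how badly a primer mismatches a candidate binding site,
    weighting mismatches near the 3' end more heavily, since 3'-end
    mismatches are what prevent the polymerase from extending and thus
    drive selective amplification. Sequences are compared 5' to 3' and
    assumed equal in length."""
    penalty = 0.0
    n = len(primer)
    for i, (p, t) in enumerate(zip(primer, target)):
        if p != t:
            # positions within `window` bases of the 3' end get extra weight
            penalty += three_prime_weight if n - i <= window else 1.0
    return penalty

# A 3'-end mismatch against a non-target costs far more than a 5'-end one,
# so primers discriminating at the 3' end rank higher.
assert mismatch_penalty("ACGTACGT", "ACGTACGA") > mismatch_penalty("CCGTACGT", "ACGTACGT")
```

Ranking candidate primers by their penalty against every non-target sequence is one simple way to turn such a function into a selectivity filter.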

  4. The behavioral intervention technology model: an integrated conceptual and technological framework for eHealth and mHealth interventions.

    PubMed

    Mohr, David C; Schueller, Stephen M; Montague, Enid; Burns, Michelle Nicole; Rashidi, Parisa

    2014-06-05

    A growing number of investigators have commented on the lack of models to inform the design of behavioral intervention technologies (BITs). BITs, which include a subset of mHealth and eHealth interventions, employ a broad range of technologies, such as mobile phones, the Web, and sensors, to support users in changing behaviors and cognitions related to health, mental health, and wellness. We propose a model that conceptually defines BITs, from the clinical aim to the technological delivery framework. The BIT model defines both the conceptual and technological architecture of a BIT. Conceptually, a BIT model should answer the questions why, what, how (conceptual and technical), and when. While BITs generally have a larger treatment goal, such goals generally consist of smaller intervention aims (the "why") such as promotion or reduction of specific behaviors, and behavior change strategies (the conceptual "how"), such as education, goal setting, and monitoring. Behavior change strategies are instantiated with specific intervention components or "elements" (the "what"). The characteristics of intervention elements may be further defined or modified (the technical "how") to meet the needs, capabilities, and preferences of a user. Finally, many BITs require specification of a workflow that defines when an intervention component will be delivered. The BIT model includes a technological framework (BIT-Tech) that can integrate and implement the intervention elements, characteristics, and workflow to deliver the entire BIT to users over time. This implementation may be either predefined or include adaptive systems that can tailor the intervention based on data from the user and the user's environment. The BIT model provides a step towards formalizing the translation of developer aims into intervention components, larger treatments, and methods of delivery in a manner that supports research and communication between investigators on how to design, develop, and deploy BITs.
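
    The why/what/how/when decomposition can be made concrete as a small data structure; the field names and the example intervention below are illustrative, not taken from the paper:

```python
from dataclasses import dataclass, field

@dataclass
class InterventionElement:
    name: str              # the "what": a concrete intervention component
    strategy: str          # the conceptual "how": e.g. goal setting, monitoring
    characteristics: dict  # the technical "how": tailoring of the element
    schedule: str          # the "when": workflow rule for delivery

@dataclass
class BIT:
    aim: str               # the "why": the smaller intervention aim
    elements: list = field(default_factory=list)

# A hypothetical BIT targeting physical activity.
bit = BIT(aim="increase physical activity")
bit.elements.append(InterventionElement(
    name="step-count message",
    strategy="self-monitoring",
    characteristics={"medium": "SMS", "tone": "encouraging"},
    schedule="daily at 18:00",
))
```

An adaptive delivery framework would then rewrite `characteristics` and `schedule` from user data rather than leaving them predefined.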

  5. Prefrontal N-acetylaspartate is strongly associated with memory performance in (abstinent) ecstasy users: preliminary report.

    PubMed

    Reneman, L; Majoie, C B; Schmand, B; van den Brink, W; den Heeten, G J

    2001-10-01

    3,4-methylenedioxymethamphetamine (MDMA or "Ecstasy") is known to damage brain serotonin neurons in animals and possibly humans. Because serotonergic damage may adversely affect memory, we compared verbal memory function between MDMA users and MDMA-naïve control subjects and evaluated the relationship between verbal memory function and neuronal dysfunction in the MDMA users. An auditory verbal memory task (Rey Auditory Verbal Learning Test) was used to study eight abstinent MDMA users and seven control subjects. In addition, 1H-MRS was used in different brain regions of all MDMA users to measure N-acetylaspartate/creatine (NAA/Cr) ratios, a marker for neuronal viability. The MDMA users recalled significantly fewer words than control subjects on delayed (p = .03) but not immediate recall (p = .08). In MDMA users, delayed memory function was strongly associated with NAA/Cr only in the prefrontal cortex (R² = .76, p = .01). Greater decrements in memory function predicted lower NAA/Cr levels, and by inference greater neuronal dysfunction, in the prefrontal cortex of MDMA users.

  6. Exposure Related Dose Estimating Model

    EPA Science Inventory

    ERDEM is a physiologically based pharmacokinetic (PBPK) modeling system consisting of a general model and an associated front end. An actual model is defined when the user prepares an input command file. Such a command file defines the chemicals, compartments and processes that...

  7. Organizing for low cost space transportation

    NASA Technical Reports Server (NTRS)

    Lee, C. M.

    1977-01-01

    The paper describes the management concepts and organizational structure NASA is establishing to operate the Space Transportation System. Policies which would encourage public and commercial organizations and private individuals to use the new STS are discussed, and design criteria for experiments, spacecraft, and other system elements are considered. The design criteria are intended to facilitate cost reductions for space operations. NASA plans for the transition from currently used expendable launch vehicles to Shuttle use, and Shuttle pricing policies, are explained in detail. Hardware development is basically complete, management functions have been defined, pricing policies have been published, and procedures for user contact and services have been placed into operation.

  8. Planning Orbiter Flights

    NASA Technical Reports Server (NTRS)

    Harris, H. M.; Bergam, M. J.; Kim, S. L.; Smith, E. A.

    1987-01-01

    Shuttle Mission Design and Operations Software (SMDOS) assists in the design and operation of missions involving spacecraft in low orbits around Earth by providing orbital and graphics information. SMDOS performs the following five functions: displays two world and two polar maps, or any user-defined window 5 degrees high in latitude by 5 degrees wide in longitude, in one of eight standard projections; designates Earth sites by points or polygon shapes; plots the spacecraft ground track with 1-min demarcation lines; displays, by means of different colors, the availability of the Tracking and Data Relay Satellite to the Shuttle; and calculates the available times and orbits for viewing a particular site, with the corresponding look angles. SMDOS is written in Laboratory Microsystems FORTH (1979 standard).

  9. Gross anatomy of network security

    NASA Technical Reports Server (NTRS)

    Siu, Thomas J.

    2002-01-01

    Information security involves many branches of effort, including information assurance, host level security, physical security, and network security. Computer network security methods and implementations are given a top-down description to permit a medically focused audience to anchor this information to their daily practice. The depth of detail of network functionality and security measures, like that of the study of human anatomy, can be highly involved. Presented at the level of major gross anatomical systems, this paper will focus on network backbone implementation and perimeter defenses, then diagnostic tools, and finally the user practices (the human element). Physical security measures, though significant, have been defined as beyond the scope of this presentation.

  10. Fabrication and Testing of Binary-Phase Fourier Gratings for Nonuniform Array Generation

    NASA Technical Reports Server (NTRS)

    Keys, Andrew S.; Crow, Robert W.; Ashley, Paul R.; Nelson, Tom R., Jr.; Parker, Jack H.; Beecher, Elizabeth A.

    2004-01-01

    This effort describes the fabrication and testing of binary-phase Fourier gratings designed to generate an incoherent array of output source points with nonuniform, user-defined intensities, symmetric about the zeroth order. Like Dammann fanout gratings, these binary-phase Fourier gratings employ only two phase levels to generate a defined output array. Unlike Dammann fanout gratings, these gratings generate an array of nonuniform, user-defined intensities when projected into the far-field regime. The paper describes the process of design, fabrication, and testing for two different versions of the binary-phase grating: one designed for a 12 micron wavelength, referred to as the Long-Wavelength Infrared (LWIR) grating, and one designed for a 5 micron wavelength, referred to as the Mid-Wavelength Infrared (MWIR) grating.
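
    The symmetry about the zeroth order follows from a 0/π binary phase grating having a real-valued transmission function, so the Fourier coefficients come in conjugate pairs. A toy scalar sketch that computes far-field order intensities from one grating period (an illustration of the principle, not the authors' design code):

```python
import cmath
import math

def far_field_orders(phase_profile, n_orders=3):
    """Far-field intensities of a binary phase grating: diffraction
    order m carries |c_m|^2, where c_m is the m-th Fourier coefficient
    of exp(i*phi(x)) over one grating period. phase_profile samples one
    period with values 0 or pi."""
    n = len(phase_profile)
    t = [cmath.exp(1j * p) for p in phase_profile]  # complex transmission
    orders = {}
    for m in range(-n_orders, n_orders + 1):
        c = sum(t[k] * cmath.exp(-2j * cmath.pi * m * k / n)
                for k in range(n)) / n
        orders[m] = abs(c) ** 2
    return orders

# A 50% duty-cycle 0/pi grating suppresses the zeroth order and puts
# equal power into the +1 and -1 orders (about 40.5% each).
profile = [0.0] * 32 + [math.pi] * 32
I = far_field_orders(profile)
assert I[0] < 1e-20
assert abs(I[1] - I[-1]) < 1e-12
```

Designing a nonuniform array then amounts to searching over the phase transition points until the |c_m|² values hit the user-defined target intensities.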

  11. IVOA Credential Delegation Protocol Version 1.0

    NASA Astrophysics Data System (ADS)

    Plante, Raymond; Graham, Matthew; Rixon, Guy; Taffoni, Giuliano; Plante, Raymond; Graham, Matthew

    2010-02-01

    The credential delegation protocol allows a client program to delegate a user's credentials to a service so that the service may make requests of other services in that user's name. The protocol defines a REST service that works alongside other IVO services to enable such delegation in a secure manner. In addition to defining the specifics of the service protocol, this document describes how a delegation service is registered in an IVOA registry along with the services it supports. The specification also explains how one can determine from a service registration that it requires the use of a supporting delegation service.

  12. DNASynth: a software application to optimization of artificial gene synthesis

    NASA Astrophysics Data System (ADS)

    Muczyński, Jan; Nowak, Robert M.

    2017-08-01

    DNASynth is a client-server software application whose client runs in a web browser. The aim of this program is to support and optimize the process of artificial gene synthesis using the Ligase Chain Reaction (LCR). Thanks to LCR it is possible to obtain a DNA strand coding a user-defined peptide. The DNA sequence is calculated by an optimization algorithm that considers optimal codon usage, minimal energy of secondary structures, and a minimal number of required LCR steps. Additionally, the absence of sequences characteristic of a user-defined set of restriction enzymes is guaranteed. The presented software was tested on synthetic and real data.
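
    Two of the stated constraints, codon-usage preference and absence of restriction sites, can be sketched as a greedy reverse translation; the partial codon table, frequencies, and the EcoRI example site are illustrative, not DNASynth's actual data or algorithm (which also optimizes secondary structure and LCR step count):

```python
# Illustrative (partial) codon table with made-up usage frequencies.
CODONS = {
    "M": {"ATG": 1.0},
    "E": {"GAA": 0.68, "GAG": 0.32},
    "F": {"TTC": 0.58, "TTT": 0.42},
}

def reverse_translate(peptide, forbidden_sites=("GAATTC",)):
    """Greedy reverse translation: pick the most frequent codon for each
    residue, falling back to the next-best codon if the choice would
    create a forbidden restriction site (EcoRI here, as an example)."""
    dna = ""
    for aa in peptide:
        for codon, _ in sorted(CODONS[aa].items(), key=lambda kv: -kv[1]):
            candidate = dna + codon
            if not any(site in candidate for site in forbidden_sites):
                dna = candidate
                break
        else:
            raise ValueError(f"cannot encode {aa} without a forbidden site")
    return dna

# "MEF": the preferred F codon TTC would complete GAA+TTC = GAATTC, an
# EcoRI site, so the fallback codon TTT is used instead.
seq = reverse_translate("MEF")
assert seq == "ATGGAATTT"
assert "GAATTC" not in seq
```

A real optimizer would search globally rather than greedily, since a greedy choice can paint later residues into a corner.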

  13. Tracking and data relay satellite system configuration and tradeoff study, part 1. Volume 1: Summary volume

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Study efforts directed at defining all TDRS system elements are summarized. Emphasis was placed on synthesis of a space segment design optimized to support low and medium data rate user spacecraft and launched with Delta 2914. A preliminary design of the satellite was developed and conceptual designs of the user spacecraft terminal and TDRS ground station were defined. As a result of the analyses and design effort it was determined that (1) a 3-axis-stabilized tracking and data relay satellite launched on a Delta 2914 provides telecommunications services considerably in excess of that required by the study statement; and (2) the design concept supports the needs of the space shuttle and has sufficient growth potential and flexibility to provide telecommunications services to high data rate users. Recommendations for further study are included.

  14. A Universal Portable Appliance for Stellarator W7-X Power Supply Controlling

    NASA Astrophysics Data System (ADS)

    Xu, Wei-hua; Wolfgang, Foerster; Guenter, Mueller

    2001-06-01

    In the Wendelstein 7-X (W7-X) project, the popular fieldbus Profibus has been chosen as the uniform connection between the central control system and all subordinate systems. A universal embedded control system has been developed for W7-X power supply control. A Siemens 80C167CR microcontroller is used as the central control unit of the system. With a user-defined printed circuit board (PCB), several control buses (Profibus, CAN, IEEE 488, RS485, and RS232) have been connected to the microcontroller, and the corresponding hardware interfaces for these buses have been designed. A graphic liquid crystal display (LCD) and a user-defined keyboard serve as the user interface. The control software will be developed in a C-like language, C166, for the controller.

  15. TERSSE: Definition of the Total Earth Resources System for the Shuttle Era. Volume 8: User's Mission and System Requirements Data (appendix A of Volume 3)

    NASA Technical Reports Server (NTRS)

    1974-01-01

    A computer printout is presented of the mission requirements for the TERSSE missions and their associated user tasks. The data included in the database represent a broad-based attempt to define the amount, extent, and type of information needed for an earth resources management program in the era of the space shuttle. An effort was made to consider all aspects of remote sensing and resource management; because of its broad scope, the data are not intended to be used without verification for in-depth studies of particular missions and/or users. The database represents the quantitative structure necessary to define the TERSSE architecture and requirements, and to provide an overall integrated view of the earth resources technology requirements of the 1980s.

  16. Executive function deficits in short-term abstinent cannabis users.

    PubMed

    McHale, Sue; Hunt, Nigel

    2008-07-01

    Few cognitive tasks are adequately sensitive to show the small decrements in performance in abstinent chronic cannabis users. In this series of three experiments we set out to demonstrate a variety of tasks that are sufficiently sensitive to show differences in visual memory, verbal memory, everyday memory and executive function between controls and cannabis users. A series of three studies explored cognitive function deficits in cannabis users (phonemic verbal fluency, visual recognition and immediate and delayed recall, and prospective memory) in short-term abstinent cannabis users. Participants were selected using snowball sampling, with cannabis users being compared to a standard control group and a tobacco-use control group. The cannabis users, compared to both control groups, had deficits on verbal fluency, visual recognition, delayed visual recall, and short- and long-interval prospective memory. There were no differences for immediate visual recall. These findings suggest that cannabis use leads to impaired executive function. Further research needs to explore the longer term impact of cannabis use.

  17. The effects of synthetic cannabinoids on executive function.

    PubMed

    Cohen, K; Kapitány-Fövény, M; Mama, Y; Arieli, M; Rosca, P; Demetrovics, Z; Weinstein, A

    2017-04-01

    There is a growing use of novel psychoactive substances (NPSs), including synthetic cannabinoids. Synthetic cannabinoid products have effects similar to those of natural cannabis, but the new synthetic cannabinoids are more potent and dangerous, and their use has resulted in various adverse effects. The purpose of the study was to assess whether persistent use of synthetic cannabinoids is associated with impairments of executive function in chronic users. A total of 38 synthetic cannabinoid users, 43 recreational cannabis users, and 41 non-user subjects were studied in two centers in Hungary and Israel. Computerized cognitive function tests, the classical Stroop word-color task, an n-back task, and a free-recall memory task were used. Synthetic cannabinoid users performed significantly worse than both recreational and non-cannabis users on the n-back task (less accuracy), the Stroop task (overall slower responses and less accuracy), and the long-term memory task (fewer words recalled). Additionally, they showed higher ratings of depression and anxiety compared with both the recreational-user and non-user groups. This study showed impairment of executive function in synthetic cannabinoid users compared with recreational users of cannabis and non-users. This may have major implications for our understanding of the long-term consequences of synthetic cannabinoid-based drugs.

  18. Robotic Telepresence: Perception, Performance, and User Experience

    DTIC Science & Technology

    2012-02-01

    defined as “a human-computer-machine condition in which a user receives sufficient information about a remote, real-world site through a machine so...that the user feels physically present at the remote, real-world site ” (Aliberti and Bruen, 2006). Telepresence often includes capabilities for a more...outdoor route reconnaissance course (figures 4 and 5) was located at the Molnar MOUT (Military Operations in Urban Terrain) site in Fort Benning, GA. It

  19. Pilot Inventory Complex Adaptive System (PICAS): An Artificial Life Approach to Managing Pilot Retention.

    DTIC Science & Technology

    1999-03-01

    mates) and base their behaviors on this interactive information. This alone defines the nature of a complex adaptive system and it is based on this...world policy initiatives. 2.3.4. User Interaction Building the model with extensive user interaction gives the entire system a more appealing feel...complex behavior that hopefully mimics trends observed in reality . User interaction also allows for easier justification of assumptions used within

  20. Illinois Occupational Skill Standards: Information Technology End User Applications Cluster.

    ERIC Educational Resources Information Center

    Illinois Occupational Skill Standards and Credentialing Council, Carbondale.

    These skill standards for the information technology end user applications cluster are intended to be a guide to workforce preparation program providers in defining content for their programs and to employers to establish the skills and standards necessary for job acquisition. An introduction provides the Illinois perspective; Illinois…
