Science.gov

Sample records for online model server

  1. Serverification of Molecular Modeling Applications: The Rosetta Online Server That Includes Everyone (ROSIE)

    PubMed Central

    Conchúir, Shane Ó.; Der, Bryan S.; Drew, Kevin; Kuroda, Daisuke; Xu, Jianqing; Weitzner, Brian D.; Renfrew, P. Douglas; Sripakdeevong, Parin; Borgo, Benjamin; Havranek, James J.; Kuhlman, Brian; Kortemme, Tanja; Bonneau, Richard; Gray, Jeffrey J.; Das, Rhiju

    2013-01-01

    The Rosetta molecular modeling software package provides experimentally tested and rapidly evolving tools for the 3D structure prediction and high-resolution design of proteins, nucleic acids, and a growing number of non-natural polymers. Despite its free availability to academic users and improving documentation, use of Rosetta has largely remained confined to developers and their immediate collaborators due to the code’s difficulty of use, the requirement for large computational resources, and the unavailability of servers for most of the Rosetta applications. Here, we present a unified web framework for Rosetta applications called ROSIE (Rosetta Online Server that Includes Everyone). ROSIE provides (a) a common user interface for Rosetta protocols, (b) a stable application programming interface for developers to add additional protocols, (c) a flexible back-end to allow leveraging of computer cluster resources shared by RosettaCommons member institutions, and (d) centralized administration by the RosettaCommons to ensure continuous maintenance. This paper describes the ROSIE server infrastructure, a step-by-step ‘serverification’ protocol for use by Rosetta developers, and the deployment of the first nine ROSIE applications by six separate developer teams: Docking, RNA de novo, ERRASER, Antibody, Sequence Tolerance, Supercharge, Beta peptide design, NCBB design, and VIP redesign. As illustrated by the number and diversity of these applications, ROSIE offers a general and speedy paradigm for serverification of Rosetta applications that incurs negligible cost to developers and lowers barriers to Rosetta use for the broader biological community. ROSIE is available at http://rosie.rosettacommons.org. PMID:23717507

  2. Design of Accelerator Online Simulator Server Using Structured Data

    SciTech Connect

    Shen, Guobao; Chu, Chungming; Wu, Juhao; Kraimer, Martin; /Argonne

    2012-07-06

    Model-based control plays an important role for a modern accelerator during beam commissioning, beam study, and even daily operation. With a realistic model, beam behaviour can be predicted and therefore effectively controlled. The approach used by most current high-level application environments is to use a built-in simulation engine and feed a realistic model into that simulation engine. Instead of this traditional monolithic structure, a new approach using a client-server architecture is under development. An online simulator server is accessed via network-accessible structured data. With this approach, a user can easily access multiple simulation codes. This paper describes the design, implementation, and current status of PVData, which defines the structured data, and PVAccess, which provides network access to the structured data.
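
    As an illustration of what network access to structured data can look like from client code, the sketch below reads a structured process variable over pvAccess using the third-party p4p Python library. This is a present-day illustration only, not the interface described in the paper, and the PV name is a hypothetical placeholder.

      # Sketch: read structured model data over pvAccess with the p4p client.
      # The PV name is hypothetical; real PV names are defined by the facility.
      from p4p.client.thread import Context

      ctxt = Context('pva')                  # pvAccess protocol context
      twiss = ctxt.get('SIM:MODEL:TWISS')    # hypothetical structured PV
      print(twiss)                           # p4p unwraps the PVData structure
      ctxt.close()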

  3. RCD+: Fast loop modeling server

    PubMed Central

    López-Blanco, José Ramón; Canosa-Valls, Alejandro Jesús; Li, Yaohang; Chacón, Pablo

    2016-01-01

    Modeling loops is a critical and challenging step in protein modeling and prediction. We have developed a quick online service (http://rcd.chaconlab.org) for ab initio loop modeling combining a coarse-grained conformational search with a full-atom refinement. Our original Random Coordinate Descent (RCD) loop closure algorithm has been greatly improved to enrich the sampling distribution towards near-native conformations. These improvements include a new workflow optimization, MPI-parallelization and fast backbone angle sampling based on neighbor-dependent Ramachandran probability distributions. The server starts by efficiently searching the vast conformational space from only the loop sequence information and the environment atomic coordinates. The generated closed loop models are subsequently ranked using a fast distance-orientation dependent energy filter. Top-ranked loops are refined with the Rosetta energy function to obtain accurate all-atom predictions that can be interactively inspected in a user-friendly web interface. Using standard benchmarks, the average root mean squared deviation (RMSD) is 0.8 and 1.4 Å for 8- and 12-residue loops, respectively, in the challenging modeling scenario in which the side chains of the loop environment are fully remodeled. These results are not only very competitive with those obtained with publicly available state-of-the-art methods, but they are also obtained ∼10-fold faster. PMID:27151199

  4. RCD+: Fast loop modeling server.

    PubMed

    López-Blanco, José Ramón; Canosa-Valls, Alejandro Jesús; Li, Yaohang; Chacón, Pablo

    2016-07-01

    Modeling loops is a critical and challenging step in protein modeling and prediction. We have developed a quick online service (http://rcd.chaconlab.org) for ab initio loop modeling combining a coarse-grained conformational search with a full-atom refinement. Our original Random Coordinate Descent (RCD) loop closure algorithm has been greatly improved to enrich the sampling distribution towards near-native conformations. These improvements include a new workflow optimization, MPI-parallelization and fast backbone angle sampling based on neighbor-dependent Ramachandran probability distributions. The server starts by efficiently searching the vast conformational space from only the loop sequence information and the environment atomic coordinates. The generated closed loop models are subsequently ranked using a fast distance-orientation dependent energy filter. Top-ranked loops are refined with the Rosetta energy function to obtain accurate all-atom predictions that can be interactively inspected in a user-friendly web interface. Using standard benchmarks, the average root mean squared deviation (RMSD) is 0.8 and 1.4 Å for 8- and 12-residue loops, respectively, in the challenging modeling scenario in which the side chains of the loop environment are fully remodeled. These results are not only very competitive with those obtained with publicly available state-of-the-art methods, but they are also obtained ∼10-fold faster. PMID:27151199

  5. A Scalability Model for ECS's Data Server

    NASA Technical Reports Server (NTRS)

    Menasce, Daniel A.; Singhal, Mukesh

    1998-01-01

    This report presents in four chapters a model for the scalability analysis of the Data Server subsystem of the Earth Observing System Data and Information System (EOSDIS) Core System (ECS). The model analyzes whether the planned architecture of the Data Server will support an increase in the workload with the possible upgrade and/or addition of processors, storage subsystems, and networks. The report includes a summary of the architecture of ECS's Data Server as well as a high-level description of the Ingest and Retrieval operations as they relate to ECS's Data Server. This description forms the basis for the development of the scalability model of the Data Server and the methodology used to solve it.
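
    The report's central scalability question, whether the planned architecture keeps up as the workload grows, can be illustrated with a generic queueing-style utilization check. The sketch below is not the report's model; the arrival rates and service demands are hypothetical numbers.

      # Generic utilization check for a scaled workload (illustrative only;
      # not the ECS report's model). All numbers are hypothetical.
      def utilization(arrival_rate, service_demand, n_devices):
          """Per-device utilization when work is spread evenly over n_devices."""
          return arrival_rate * service_demand / n_devices

      base_rate = 5.0        # requests per second (hypothetical)
      demand = 0.8           # seconds of device time per request (hypothetical)
      for scale in (1, 2, 4, 8):
          for devices in (4, 8, 16):
              u = utilization(base_rate * scale, demand, devices)
              status = "ok" if u < 1.0 else "saturated"
              print(f"workload x{scale}, {devices} devices: utilization {u:.2f} ({status})")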

  6. PlanetServer: Innovative approaches for the online analysis of hyperspectral satellite data from Mars

    NASA Astrophysics Data System (ADS)

    Oosthoek, J. H. P.; Flahaut, J.; Rossi, A. P.; Baumann, P.; Misev, D.; Campalani, P.; Unnithan, V.

    2014-06-01

    PlanetServer is a WebGIS system, currently under development, enabling the online analysis of Compact Reconnaissance Imaging Spectrometer (CRISM) hyperspectral data from Mars. It is part of the EarthServer project, which builds infrastructure for online access and analysis of huge Earth Science datasets. Core functionality consists of the rasdaman Array Database Management System (DBMS) for storage, and the Open Geospatial Consortium (OGC) Web Coverage Processing Service (WCPS) for data querying. Various WCPS queries have been designed to access spatial and spectral subsets of the CRISM data. The client WebGIS, consisting mainly of the OpenLayers JavaScript library, uses these queries to enable online spatial and spectral analysis. Currently the PlanetServer demonstration consists of two CRISM Full Resolution Target (FRT) observations surrounding the NASA Curiosity rover landing site. A detailed analysis of one of these observations is performed in the Case Study section. The current PlanetServer functionality is described step by step, and is tested by focusing on detecting mineralogical evidence described in earlier Gale crater studies. Both the PlanetServer methodology and its possible use for mineralogical studies will be further discussed. Future work includes batch ingestion of CRISM data and further development of the WebGIS and analysis tools.
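
    For readers unfamiliar with WCPS, the sketch below shows the general shape of issuing such a query over HTTP from a script. The endpoint, coverage name and axis label are hypothetical placeholders rather than the actual PlanetServer coverages.

      # Sketch of a WCPS request issued over HTTP. Endpoint, coverage name and
      # axis label are hypothetical; real queries depend on how the CRISM
      # coverages are registered in rasdaman.
      import urllib.parse
      import urllib.request

      wcps = ('for c in (crism_frt_observation) '
              'return encode(c[ band(233) ], "png")')
      url = ('http://example.org/rasdaman/ows/wcps?query='
             + urllib.parse.quote(wcps))
      with urllib.request.urlopen(url) as resp, open('band233.png', 'wb') as out:
          out.write(resp.read())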

  7. "Just Another Tool for Online Studies" (JATOS): An Easy Solution for Setup and Management of Web Servers Supporting Online Studies.

    PubMed

    Lange, Kristian; Kühn, Simone; Filevich, Elisa

    2015-01-01

    We present here "Just Another Tool for Online Studies" (JATOS): an open source, cross-platform web application with a graphical user interface (GUI) that greatly simplifies setting up and communicating with a web server to host online studies that are written in JavaScript. JATOS is easy to install on all three major platforms (Microsoft Windows, Mac OS X, and Linux) and seamlessly pairs with a database for secure data storage. It can be installed on a server or locally, allowing researchers to try the application and test the feasibility of their studies within a browser environment before setting up a dedicated server. All communication with the JATOS server takes place via a GUI (with no need to use a command line interface), making JATOS an especially accessible tool for researchers without a strong IT background. We describe JATOS' main features and implementation and provide a detailed tutorial along with example studies to help interested researchers set up their online studies. JATOS can be found at www.jatos.org. PMID:26114751

  8. The FoldX web server: an online force field

    PubMed Central

    Schymkowitz, Joost; Borg, Jesper; Stricher, Francois; Nys, Robby; Rousseau, Frederic; Serrano, Luis

    2005-01-01

    FoldX is an empirical force field that was developed for the rapid evaluation of the effect of mutations on the stability, folding and dynamics of proteins and nucleic acids. The core functionality of FoldX, namely the calculation of the free energy of a macromolecule based on its high-resolution 3D structure, is now publicly available through a web server. The current release allows the calculation of the stability of a protein, calculation of the positions of the protons and the prediction of water bridges, prediction of metal binding sites and the analysis of the free energy of complex formation. Alanine scanning, the systematic truncation of side chains to alanine, is also included. In addition, some reporting functions have been added: it is now possible to print the atomic interaction networks that constitute the protein, to print the structural and energetic details of the interactions per atom or per residue, and to generate a general quality report of the PDB structure. This core functionality will be further extended as more FoldX applications are developed. PMID:15980494

  9. MO/DSD online information server and global information repository access

    NASA Technical Reports Server (NTRS)

    Nguyen, Diem; Ghaffarian, Kam; Hogie, Keith; Mackey, William

    1994-01-01

    Often in the past, standards and new technology information have been available only in hardcopy form, with reproduction and mailing costs proving rather significant. In light of NASA's current budget constraints and in the interest of efficient communications, the Mission Operations and Data Systems Directorate (MO&DSD) New Technology and Data Standards Office recognizes the need for an online information server (OLIS). This server would allow: (1) dissemination of standards and new technology information throughout the Directorate more quickly and economically; (2) online browsing and retrieval of documents that have been published for and by MO&DSD; and (3) searching for current and past study activities on related topics within NASA before issuing a task. This paper explores a variety of available information servers and searching tools, their current capabilities and limitations, and the application of these tools to MO&DSD. Most importantly, the discussion focuses on the way this concept could be easily applied toward improving dissemination of standards and new technologies and improving documentation processes.

  10. Telematics-based online client-server/client collaborative environment for radiotherapy planning simulations.

    PubMed

    Kum, Oyeon

    2007-11-01

    Customized cancer radiation treatment planning for each patient is very useful for both the patient and the doctor because it provides the ability to deliver higher doses to a more accurately defined tumor and, at the same time, lower doses to organs at risk and normal tissues. This can be realized by building an accurate planning simulation system to provide better treatment strategies based on each patient's tomographic data such as CT, MRI, PET, or SPECT. In this study, we develop a real-time online client-server/client collaborative environment between the client (health care professionals or hospitals) and the server/client over a secure network using telematics (the integrated use of telecommunications and medical informatics). The implementation is based on a point-to-point communication scheme between client and server/client following the WYSIWIS (what you see is what I see) paradigm. After uploading the patient tomographic data, the client is able to collaborate with the server/client for treatment planning. Consequently, the level of health care services can be improved, specifically for small radiotherapy clinics in rural or remote areas that do not possess much experience or equipment such as a treatment planning simulator. The telematics service of the system can also be used to provide continuing medical education in radiotherapy. Moreover, the system is easy to use: a client familiar with the Windows(TM) operating system can use it, because it is designed and built around a user-friendly concept. The system does not require the client to perform hardware and software maintenance or updates; these are performed automatically by the server. PMID:17943336

  11. The PDB_REDO server for macromolecular structure model optimization

    PubMed Central

    Joosten, Robbie P.; Long, Fei; Murshudov, Garib N.; Perrakis, Anastassis

    2014-01-01

    The refinement and validation of a crystallographic structure model is the last step before the coordinates and the associated data are submitted to the Protein Data Bank (PDB). The success of the refinement procedure is typically assessed by validating the models against geometrical criteria and the diffraction data, and is an important step in ensuring the quality of the PDB public archive [Read et al. (2011), Structure, 19, 1395–1412]. The PDB_REDO procedure aims for ‘constructive validation’, aspiring to consistent and optimal refinement parameterization and pro-active model rebuilding, not only correcting errors but striving for optimal interpretation of the electron density. A web server for PDB_REDO has been implemented, allowing thorough, consistent and fully automated optimization of the refinement procedure in REFMAC and partial model rebuilding. The goal of the web server is to help practicing crystallographers to improve their model prior to submission to the PDB. For this, additional steps were implemented in the PDB_REDO pipeline, both in the refinement procedure, e.g. testing of resolution limits and k-fold cross-validation for small test sets, and as new validation criteria, e.g. the density-fit metrics implemented in EDSTATS and ligand validation as implemented in YASARA. Innovative ways to present the refinement and validation results to the user are also described, which together with auto-generated Coot scripts can guide users to subsequent model inspection and improvement. It is demonstrated that using the server can lead to substantial improvement of structure models before they are submitted to the PDB. PMID:25075342

  12. Group-oriented coordination models for distributed client-server computing

    NASA Technical Reports Server (NTRS)

    Adler, Richard M.; Hughes, Craig S.

    1994-01-01

    This paper describes group-oriented control models for distributed client-server interactions. These models transparently coordinate requests for services that involve multiple servers, such as queries across distributed databases. Specific capabilities include: decomposing and replicating client requests; dispatching request subtasks or copies to independent, networked servers; and combining server results into a single response for the client. The control models were implemented by combining request broker and process group technologies with an object-oriented communication middleware tool. The models are illustrated in the context of a distributed operations support application for space-based systems.

  13. A Predictive Performance Model to Evaluate the Contention Cost in Application Servers

    SciTech Connect

    Chen, Shiping; Gorton, Ian

    2002-12-04

    In multi-tier enterprise systems, application servers are key components that implement business logic and provide application services. To support a large number of simultaneous accesses from clients over the Internet and intranet, most application servers use replication and multi-threading to handle concurrent requests. While multiple processes and multiple threads enhance the processing bandwidth of servers, they also increase the contention for resources in application servers. This paper investigates this issue empirically based on a middleware benchmark. A cost model is proposed to estimate the overall performance of application servers, including the contention overhead. This model is then used to determine the optimal degree of concurrency of application servers for a specific client load. A case study based on CORBA is presented to validate our model and demonstrate its application.
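
    As a generic illustration of trading concurrency against contention (not the cost model proposed in the paper), the sketch below sweeps thread counts using Gunther's Universal Scalability Law and reports the concurrency that maximizes relative capacity. The contention and coherency coefficients are hypothetical.

      # Concurrency sweep with Gunther's Universal Scalability Law; a generic
      # published model, NOT the paper's cost model. Coefficients are hypothetical.
      def relative_capacity(n, sigma=0.05, kappa=0.002):
          return n / (1.0 + sigma * (n - 1) + kappa * n * (n - 1))

      best = max(range(1, 129), key=relative_capacity)
      for n in (1, 2, 4, 8, 16, 32, 64, 128):
          print(f"{n:3d} threads -> relative capacity {relative_capacity(n):6.2f}")
      print("optimal concurrency under these assumptions:", best)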

  14. Real-Time Robust Adaptive Modeling and Scheduling for an Electronic Commerce Server

    NASA Astrophysics Data System (ADS)

    Du, Bing; Ruan, Chun

    With the increasing importance and pervasiveness of Internet services, it has become a challenge for the proliferating electronic commerce services to provide performance guarantees under extreme overload. This paper describes a real-time optimization modeling and scheduling approach for performance guarantees of electronic commerce servers. We show that an electronic commerce server may be simulated as a multi-tank system. A robust adaptive server model is subject to unknown additive load disturbances and uncertain model matching. Overload control techniques are based on adaptive admission control to achieve timing guarantees. We evaluate the performance of the model using a complex simulation that is subjected to varying model parameters and massive overload.
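
    A minimal sketch of feedback-based admission control is shown below. It is not the robust adaptive controller described in the paper; the response-time model, target and gain are hypothetical.

      # Minimal admission-control feedback loop (illustrative only; not the
      # paper's controller). Target, gain and the load model are hypothetical.
      import random

      target_s = 0.5       # response-time guarantee (seconds)
      admit_prob = 1.0     # fraction of requests admitted
      gain = 0.1           # proportional adjustment gain

      def measured_response_time(load_fraction):
          # Stand-in for a real measurement: response time grows with load.
          return 0.2 / max(1e-6, 1.0 - 0.9 * load_fraction)

      for step in range(20):
          offered_load = random.uniform(0.5, 1.5)   # hypothetical demand
          resp = measured_response_time(min(1.0, offered_load * admit_prob))
          error = (resp - target_s) / target_s
          admit_prob = min(1.0, max(0.05, admit_prob - gain * error))
          print(f"step {step:2d}: response {resp:.2f} s, admit {admit_prob:.2f}")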

  15. Performance model of the Argonne Voyager multimedia server

    SciTech Connect

    Disz, T.; Olson, R.; Stevens, R.

    1997-07-01

    The Argonne Voyager Multimedia Server is being developed in the Futures Lab of the Mathematics and Computer Science Division at Argonne National Laboratory. As a network-based service for recording and playing multimedia streams, it is important that the Voyager system be capable of sustaining certain minimal levels of performance in order for it to be a viable system. In this article, the authors examine the performance characteristics of the server. As they examine the architecture of the system, they try to determine where bottlenecks lie, show actual vs potential performance, and recommend areas for improvement through custom architectures and system tuning.

  16. BeEP Server: using evolutionary information for quality assessment of protein structure models

    PubMed Central

    Palopoli, Nicolas; Lanzarotti, Esteban; Parisi, Gustavo

    2013-01-01

    The BeEP Server (http://www.embnet.qb.fcen.uba.ar/embnet/beep.php) is an online resource aimed to help in the endgame of protein structure prediction. It is able to rank submitted structural models of a protein through an explicit use of evolutionary information, a criterion differing from structural or energetic considerations commonly used in other assessment programs. The idea behind BeEP (Best Evolutionary Pattern) is to benefit from the substitution pattern derived from structural constraints present in a set of homologous proteins adopting a given protein conformation. The BeEP method uses a model of protein evolution that takes into account the structure of a protein to build site-specific substitution matrices. The suitability of these substitution matrices is assessed through maximum likelihood calculations from which position-specific and global scores can be derived. These scores estimate how well the structural constraints derived from each structural model are represented in a sequence alignment of homologous proteins. Our assessment on a subset of proteins from the Critical Assessment of techniques for protein Structure Prediction (CASP) experiment has shown that BeEP is capable of discriminating the models and selecting one or more native-like structures. Moreover, BeEP is not explicitly parameterized to find structural similarities between models and given targets, potentially helping to explore the conformational ensemble of the native state. PMID:23729471

  17. A distributed clients/distributed servers model for STARCAT

    NASA Technical Reports Server (NTRS)

    Pirenne, B.; Albrecht, M. A.; Durand, D.; Gaudet, S.

    1992-01-01

    STARCAT, the Space Telescope ARchive and CATalogue user interface, has been around for a number of years already. During this time it has been enhanced and augmented in a number of different areas. Here we dwell on a new capability allowing geographically distributed user interfaces to connect to geographically distributed data servers. This new concept permits users anywhere on the Internet, running STARCAT on their local hardware, to access whichever of the three existing HST archive sites is available, to get information on the CFHT archive through a transparent connection to the CADC in British Columbia, or to get the La Silla weather by connecting to the ESO database in Munich, all within the same session. Similarly, PreView (or quick look) images and spectra will also flow directly to the user from wherever they are available. Moving towards an 'X'-based STARCAT is another goal being pursued: a graphic/image server and a help/doc server are currently being added to it. They should further enhance user independence and access transparency.

  18. Cooperative Server Clustering for a Scalable GAS Model on Petascale Cray XT5 Systems

    SciTech Connect

    Yu, Weikuan; Que, Xinyu; Tipparaju, Vinod; Graham, Richard L; Vetter, Jeffrey S

    2010-05-01

    Global Address Space (GAS) programming models are attractive because they retain the easy-to-use addressing model that is the characteristic of shared-memory style load and store operations. The scalability of GAS models depends directly on the design and implementation of runtime libraries on the targeted platforms. In this paper, we examine the memory requirement of a popular GAS run-time library, Aggregate Remote Memory Copy Interface (ARMCI) on petascale Cray XT 5 systems. Then we describe a new technique, cooperative server clustering, that enhances the memory scalability of ARMCI communication servers. In cooperative server clustering, ARMCI servers are organized into clusters, and cooperatively process incoming communication requests among them. A request intervention scheme is also designed to expedite the return of responses to the initiating processes. Our experimental results demonstrate that, with very little impact on ARMCI communication latency and bandwidth, cooperative server clustering is able to significantly reduce the memory requirement of ARMCI communication servers, thereby enabling highly scalable scientific applications. In particular, it dramatically reduces the total execution time of a scientific application, NWChem, by 45% on 2400 processes.

  19. Cooperative Server Clustering for a Scalable GAS Model on Petascale Cray XT5 Systems

    SciTech Connect

    Yu, Weikuan; Que, Xinyu; Graham, Richard L; Vetter, Jeffrey S

    2010-01-01

    Global Address Space (GAS) programming models are attractive because they retain the easy-to-use addressing model that is the characteristic of shared-memory style load and store operations. The scalability of GAS models depends directly on the design and implementation of runtime libraries on the targeted platforms. In this paper, we examine the memory requirement of a popular GAS runtime library, Aggregate Remote Memory Copy Interface (ARMCI), on petascale Cray XT5 systems. Then we describe a new technique, cooperative server clustering, that enhances the memory scalability of ARMCI communication servers. In cooperative server clustering, ARMCI servers are organized into clusters, and cooperatively process incoming communication requests among them. A request intervention scheme is also designed to expedite the return of responses to the initiating processes. Our experimental results demonstrate that, with very little impact on ARMCI communication latency and bandwidth, cooperative server clustering is able to significantly reduce the memory requirement of ARMCI communication servers, thereby enabling highly scalable scientific applications. In particular, it dramatically reduces the total execution time of a scientific application, NWChem, by 45% on 2400 processes.

  20. Impact of malicious servers over trust and reputation models in wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Verma, Vinod Kumar; Singh, Surinder; Pathak, N. P.

    2016-03-01

    This article deals with the impact of malicious servers on different trust and reputation models in wireless sensor networks. First, we analysed five trust and reputation models, namely BTRM-WSN, Eigen trust, peer trust, power trust, and the linguistic fuzzy trust model. Further, we proposed a wireless sensor network design for the optimisation of these models. Finally, the influence of malicious servers on the behaviour of the above-mentioned trust and reputation models is discussed. Statistical analysis has been carried out to prove the validity of our proposal.

  1. A New Web-based Application Optimization Model in Multicore Web Server

    NASA Astrophysics Data System (ADS)

    You, Guohua; Zhao, Ying

    More and more web servers adopt multi-core CPUs to improve performance as multi-core technology develops. However, web applications cannot efficiently exploit the potential of multi-core web servers because of the traditional request-processing algorithms and thread-scheduling strategies in the operating system. In this paper, a new web-based application optimization model was proposed, which classifies and schedules the dynamic and static requests on a scheduling core and processes the dynamic requests on the other cores. Based on this model, a simulation program called SIM was developed. Experiments have been done to validate the new model, and the results show that the new model can effectively improve the performance of multi-core web servers and avoid the ping-pong effect.
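
    The classify-then-dispatch idea can be sketched as follows: one core classifies incoming requests, serves static ones itself, and hands dynamic ones to a pool of workers on other cores. This is an illustrative sketch, not the SIM program; the URL patterns and handlers are hypothetical.

      # Sketch: classify requests on one core, dispatch dynamic work to other
      # cores. Illustrative only; URLs and handlers are hypothetical.
      from concurrent.futures import ProcessPoolExecutor

      STATIC_SUFFIXES = ('.html', '.css', '.js', '.png', '.jpg')

      def is_static(url):
          return url.lower().endswith(STATIC_SUFFIXES)

      def handle_static(url):
          return f"served {url} from cache"        # handled on the scheduling core

      def handle_dynamic(url):
          return f"generated response for {url}"   # runs on another core

      def dispatch(urls):
          results = []
          with ProcessPoolExecutor() as workers:   # worker processes on other cores
              for url in urls:
                  if is_static(url):
                      results.append(handle_static(url))
                  else:
                      results.append(workers.submit(handle_dynamic, url).result())
          return results

      if __name__ == '__main__':
          print(dispatch(['/index.html', '/cart?add=42', '/logo.png']))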

  2. RHIC injector complex online model status and plans

    SciTech Connect

    Schoefer, V.; Ahrens, L.; Brown, K.; Morris, J.; Nemesure, S.

    2009-05-04

    An online modeling system is being developed for the RHIC injector complex, which consists of the Booster, the AGS and the transfer lines connecting the Booster to the AGS and the AGS to RHIC. Historically the injectors have been operated using static values from design specifications or offline model runs, but tighter beam optics constraints required by polarized proton operations (e.g., accelerating with near-integer tunes) have necessitated a more dynamic system. An online model server for the AGS has been implemented using MAD-X [1] as the model engine, with plans to extend the system to the Booster and the injector transfer lines and to add the option of calculating optics using the Polymorphic Tracking Code (PTC [2]) as the model engine.

  3. Cybersecurity, massive data processing, community interaction, and other developments at WWW-based computational X-ray Server

    NASA Astrophysics Data System (ADS)

    Stepanov, Sergey

    2013-03-01

    X-Ray Server (x-server.gmca.aps.anl.gov) is a WWW-based computational server for modeling of X-ray diffraction, reflection and scattering data. The modeling software operates directly on the server and can be accessed remotely either from web browsers or from user software. In the latter case the server can be deployed as a software library or a data-fitting engine. As the server recently surpassed the milestones of 15 years online and 1.5 million calculations, it has accumulated a number of technical solutions that are discussed in this paper. The developed approaches to detecting physical model limits and user calculation failures, solutions to spam and firewall problems, ways to involve the community in replenishing databases, and methods to teach users automated access to the server programs may be helpful for X-ray researchers interested in using the server or sharing their own software online.
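
    For the library-style use mentioned above, a client script typically submits a parameter set over HTTP and parses the returned page. The sketch below shows only the general pattern; the URL and form-field names are hypothetical placeholders, not the documented X-Ray Server interface.

      # Sketch of calling a web-based calculation server from a script.
      # The URL and parameter names below are hypothetical placeholders.
      import urllib.parse
      import urllib.request

      params = urllib.parse.urlencode({
          'wavelength': 1.5405,    # hypothetical parameter (Angstrom)
          'crystal': 'Si',         # hypothetical parameter
          'geometry': 'symmetric'  # hypothetical parameter
      }).encode()

      req = urllib.request.Request('https://example.org/cgi/xray-simulation', data=params)
      with urllib.request.urlopen(req) as resp:
          print(resp.read(500))    # first part of the returned page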

  4. Evaluation of a server-client architecture for accelerator modeling and simulation

    SciTech Connect

    Bowling, B.A.; Akers, W.; Shoaee, H.; Watson, W.; van Zeijts, J.; Witherspoon, S.

    1997-02-01

    Traditional approaches to computational modeling and simulation often utilize a batch method for code execution using file-formatted input/output. This method of code implementation was generally chosen for several factors, including CPU throughput and availability, complexity of the required modeling problem, and presentation of computation results. With the advent of faster computer hardware and the advances in networking and software techniques, other program architectures for accelerator modeling have recently been employed. Jefferson Laboratory has implemented a client/server solution for accelerator beam transport modeling utilizing a query-based I/O. The goal of this code is to provide modeling information for control system applications and to serve as a computation engine for general modeling tasks, such as machine studies. This paper performs a comparison between the batch execution and server/client architectures, focusing on design and implementation issues, performance, and general utility towards accelerator modeling demands. © 1997 American Institute of Physics.

  5. Evaluation of a server-client architecture for accelerator modeling and simulation

    SciTech Connect

    Bowling, B. A.; Akers, W.; Shoaee, H.; Watson, W.; Zeijts, J. van; Witherspoon, S.

    1997-02-01

    Traditional approaches to computational modeling and simulation often utilize a batch method for code execution using file-formatted input/output. This method of code implementation was generally chosen for several factors, including CPU throughput and availability, complexity of the required modeling problem, and presentation of computation results. With the advent of faster computer hardware and the advances in networking and software techniques, other program architectures for accelerator modeling have recently been employed. Jefferson Laboratory has implemented a client/server solution for accelerator beam transport modeling utilizing a query-based I/O. The goal of this code is to provide modeling information for control system applications and to serve as a computation engine for general modeling tasks, such as machine studies. This paper performs a comparison between the batch execution and server/client architectures, focusing on design and implementation issues, performance, and general utility towards accelerator modeling demands.

  6. Evaluation of a server-client architecture for accelerator modeling and simulation

    SciTech Connect

    Bowling, B.A.; Akers, W.; Shoaee, H.; Watson, W.; Zeijts, J. van; Witherspoon, S.

    1997-11-01

    Traditional approaches to computational modeling and simulation often utilize a batch method for code execution using file-formatted input/output. This method of code implementation was generally chosen for several factors, including CPU throughput and availability, complexity of the required modeling problem, and presentation of computation results. With the advent of faster computer hardware and the advances in networking and software techniques, other program architectures for accelerator modeling have recently been employed. Jefferson Laboratory has implemented a client/server solution for accelerator beam transport modeling utilizing a query-based I/O. The goal of this code is to provide modeling information for control system applications and to serve as a computation engine for general modeling tasks, such as machine studies. This paper performs a comparison between the batch execution and server/client architectures, focusing on design and implementation issues, performance, and general utility towards accelerator modeling demands.

  7. The RHIC/AGS Online Model Environment: Design and Overview.

    SciTech Connect

    Satogata, T.; Brown, K.; Pilat, F.; Tafti, A. A.; Tepikian, S.; van Zeijts, J.

    1999-03-29

    An integrated online modeling environment is currently under development for use by AGS and RHIC physicists and commissioners. This environment combines the modeling efforts of both groups in a CDEV [1] client-server design, providing access to expected machine optics and physics parameters based on live and design machine settings. An abstract modeling interface has been designed as a set of adapters [2] around core computational modeling engines such as MAD and UAL/Teapot++ [3]. This approach allows us to leverage existing survey, lattice, and magnet infrastructure, as well as easily incorporate new model engine developments. This paper describes the architecture of the RHIC/AGS modeling environment, including the application interface through CDEV and general tools for graphical interaction with the model using Tcl/Tk. Separate papers at this conference address the specifics of implementation and modeling experience for AGS and RHIC.

  8. Model of the reliability analysis of the distributed computer systems with architecture "client-server"

    NASA Astrophysics Data System (ADS)

    Kovalev, I. V.; Zelenkov, P. V.; Karaseva, M. V.; Tsarev, M. Yu; Tsarev, R. Yu

    2015-01-01

    The paper considers the problem of analysing the reliability of distributed computer systems with a client-server architecture. A distributed computer system is a set of hardware and software implementing the following main functions: processing, storage, transmission and data protection. This paper discusses distributed computer systems with the "client-server" architecture. The paper presents a scheme of the distributed computer system's functioning, represented as a graph whose vertices are the functional states of the system and whose arcs are transitions from one state to another depending on the prevailing conditions. In the reliability analysis we consider reliability indicators such as the probability of the system transitioning into the stopping and accident states, as well as the intensity of these transitions. The proposed model allows us to obtain relations for the reliability parameters of the distributed computer system without any assumptions about the distribution laws of the random variables or the number of elements in the system.

  9. THttpServer class in ROOT

    NASA Astrophysics Data System (ADS)

    Adamczewski-Musch, Joern; Linev, Sergey

    2015-12-01

    The new THttpServer class in ROOT implements an HTTP server for arbitrary ROOT applications. It is based on the embeddable Civetweb HTTP server and provides direct access to all objects registered with the server. Object data can be provided in different formats: binary, XML, GIF/PNG, and JSON. A generic user interface for THttpServer has been implemented with HTML/JavaScript based on the JavaScript ROOT development. With any modern web browser one can list, display, and monitor objects available on the server. THttpServer is used in the Go4 framework to provide an HTTP interface to the online analysis.
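
    A minimal PyROOT sketch of the usage described above: start THttpServer on a port, register an object, and keep processing events so the embedded web server can answer requests. The port number and histogram are arbitrary examples.

      # Minimal PyROOT sketch: publish a histogram via THttpServer.
      import time
      import ROOT

      serv = ROOT.THttpServer("http:8080")       # embedded Civetweb engine
      hist = ROOT.TH1F("demo", "Monitored histogram", 100, -5, 5)
      serv.Register("/monitoring", hist)         # browsable at http://localhost:8080

      for _ in range(1000):
          hist.FillRandom("gaus", 100)           # stand-in for real analysis work
          ROOT.gSystem.ProcessEvents()           # let the server handle requests
          time.sleep(0.1)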

  10. DelPhi Web Server: A comprehensive online suite for electrostatic calculations of biological macromolecules and their complexes

    PubMed Central

    Sarkar, Subhra; Witham, Shawn; Zhang, Jie; Zhenirovskyy, Maxim; Rocchia, Walter; Alexov, Emil

    2011-01-01

    Here we report a web server, the DelPhi web server, which utilizes the DelPhi program to calculate electrostatic energies and the corresponding electrostatic potential, ionic distributions, and dielectric map. The server provides extra services to fix structural defects, such as missing atoms in the structural file, and allows for the generation of missing hydrogen atoms. The hydrogen placement and the corresponding DelPhi calculations can be done with user-selected force field parameters, either Charmm22, Amber98 or OPLS. Upon completion of the calculations, the user is given the option to download the fixed and protonated structural file, together with the parameter and DelPhi output files for further analysis. Using the Jmol viewer, the user can view the corresponding structural file, manipulate it, and change the presentation. In addition, if the potential map is requested to be calculated, the potential can be mapped onto the molecular surface. The DelPhi web server is available from http://compbio.clemson.edu/delphi_webserver. PMID:24683424

  11. Business Models in Emerging Online Services

    NASA Astrophysics Data System (ADS)

    Lyons, Kelly; Playford, Corrie; Messinger, Paul R.; Niu, Run H.; Stroulia, Eleni

    Due to advances in technology and the rapid growth of online services, a significant number of new and inventive web-based service models and delivery methods have been introduced. Although online resources and services are having an impact on more traditional service delivery mechanisms, it is not yet clear how these emerging mechanisms for online service delivery will result in profitable business models. In this paper, we consider emerging business models for online services and their implications for how services are delivered, used, and paid for. We demonstrate the changing roles of user/consumer and provider/seller. We also discuss the applicability of different business models for various domains.

  12. Shanghai urban green landscape model system based on MapServer

    NASA Astrophysics Data System (ADS)

    Rui, Jianxun; Shi, Beiqi; Shen, Di; Yao, Weiqin

    2008-10-01

    Based on remote sensing (RS) and GIS, aerial image data of Shanghai from 2003 are taken as the data source. According to urban green landscape theory, the green landscapes are classified into park, street green landscape, affiliated green landscape, residential green landscape, production green landscape and protective green landscape, etc. Several spatio-temporal models, including space expansion models and ecological analysis models for urban green landscape, have been constructed and calculated. Then, based on the ORDBMS platform PostgreSQL and OGIS MapServer, an urban green landscape database including spatial data for the above six types of green landscapes and a model system for Shanghai have been developed. Finally, using the statistical analysis functions of the model system, this paper discusses and reveals the impacts of urban space development on green landscape pattern, structure and function. At the same time, the general distribution characteristics of green landscape patterns have been studied at three levels: the green patch level, the type level and the mosaic structure of different green landscapes. The urban green landscape model system of Shanghai based on MapServer provides a powerful interactive platform for governments to make urban planning decisions and for landscape studies.

  13. A Comprehensive Availability Modeling and Analysis of a Virtualized Servers System Using Stochastic Reward Nets

    PubMed Central

    Kim, Dong Seong; Park, Jong Sou

    2014-01-01

    It is important to assess availability of virtualized systems in IT business infrastructures. Previous work on availability modeling and analysis of the virtualized systems used a simplified configuration and assumption in which only one virtual machine (VM) runs on a virtual machine monitor (VMM) hosted on a physical server. In this paper, we show a comprehensive availability model using stochastic reward nets (SRN). The model takes into account (i) the detailed failures and recovery behaviors of multiple VMs, (ii) various other failure modes and corresponding recovery behaviors (e.g., hardware faults, failure and recovery due to Mandelbugs and aging-related bugs), and (iii) dependency between different subcomponents (e.g., between physical host failure and VMM, etc.) in a virtualized servers system. We also show numerical analysis on steady state availability, downtime in hours per year, transaction loss, and sensitivity analysis. This model provides a new finding on how to increase system availability by combining both software rejuvenations at VM and VMM in a wise manner. PMID:25165732
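
    The paper's SRN model covers many failure modes and dependencies; as a greatly simplified illustration of the underlying idea, the sketch below solves a three-state continuous-time Markov chain (system up, VM failed, VMM failed) for its steady-state availability. The failure and repair rates are hypothetical.

      # Greatly simplified steady-state availability sketch; NOT the paper's
      # SRN model. All rates are hypothetical, in events per hour.
      import numpy as np

      lam_vm, mu_vm = 1 / 720.0, 2.0      # VM fails ~monthly, restarts in 30 min
      lam_vmm, mu_vmm = 1 / 2160.0, 1.0   # VMM fails ~quarterly, recovers in 1 h

      # Generator matrix Q over states [UP, VM_DOWN, VMM_DOWN]
      Q = np.array([
          [-(lam_vm + lam_vmm), lam_vm, lam_vmm],
          [mu_vm,               -mu_vm, 0.0    ],
          [mu_vmm,              0.0,    -mu_vmm],
      ])

      # Solve pi * Q = 0 subject to sum(pi) = 1
      A = np.vstack([Q.T, np.ones(3)])
      b = np.array([0.0, 0.0, 0.0, 1.0])
      pi, *_ = np.linalg.lstsq(A, b, rcond=None)

      availability = pi[0]
      print(f"steady-state availability: {availability:.6f}")
      print(f"expected downtime: {(1 - availability) * 8760:.2f} hours/year")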

  14. Online homology modelling as a means of bridging the sequence-structure gap.

    PubMed

    Sheehan, David; O'Sullivan, Siobhán

    2011-01-01

    For even the best-studied species, there is a large gap in their representation in the protein databank (PDB) compared to within sequence databases. Typically, less than 2% of sequences are represented in the PDB. This is partly due to the considerable experimental challenge and manual input required to solve three-dimensional structures by methods such as X-ray diffraction and multi-dimensional nuclear magnetic resonance (NMR) spectroscopy in comparison to high-throughput sequencing. This gap is made even wider by the high level of redundancy within the PDB and the under-representation of some protein categories such as membrane-associated proteins, which comprise approximately 25% of proteins encoded in genomes. A traditional route to closing the sequence-structure gap is offered by homology modelling, whereby the sequence of a target protein is modelled on a template represented in the PDB using in silico energy minimisation approaches. More recently, online homology servers have become available which automatically generate models from proffered sequences. However, many online servers give little indication of the structural plausibility of the generated model. In this paper, the online homology server Geno3D will be described. This server uses similar software to that used in modelling structures during structure determination and thus generates data allowing determination of the structural plausibility of models. For illustration, modelling of a chemotaxis protein (CheY) from Pseudomonas entomophila L48 (accession YP_609298) on a template (PDB id. 1mvo), the phosphorylation domain of an outer membrane protein PhoP from Bacillus subtilis, will be described. PMID:22064508

  15. PRISM: a web server and repository for prediction of protein-protein interactions and modeling their 3D complexes.

    PubMed

    Baspinar, Alper; Cukuroglu, Engin; Nussinov, Ruth; Keskin, Ozlem; Gursoy, Attila

    2014-07-01

    The PRISM web server enables fast and accurate prediction of protein-protein interactions (PPIs). The prediction algorithm is knowledge-based. It combines structural similarity and accounts for evolutionary conservation in the template interfaces. The predicted models are stored in its repository. Given two protein structures, PRISM will provide a structural model of their complex if a matching template interface is available. Users can download the complex structure, retrieve the interface residues and visualize the complex model. The PRISM web server is user friendly, free and open to all users at http://cosbi.ku.edu.tr/prism. PMID:24829450

  16. Scientific Inquiry: A Model for Online Searching.

    ERIC Educational Resources Information Center

    Harter, Stephen P.

    1984-01-01

    Explores scientific inquiry as philosophical and behavioral model for online search specialist and information retrieval process. Nature of scientific research is described and online analogs to research concepts of variable, hypothesis formulation and testing, operational definition, validity, reliability, assumption, and cyclical nature of…

  17. A Model for Enhancing Online Course Development

    ERIC Educational Resources Information Center

    Knowles, Evelyn; Kalata, Kathleen

    2008-01-01

    In order to meet the growing demand for quality online education, Park University has adopted a model that provides a common framework for all of its online courses. Evelyn Knowles and Kathleen Kalata discuss the circumstances leading to the current system and describe the university's implementation of a course development process that ensures…

  18. HPC Server Performance and Power Consumption for Atmospheric Modeling on GPUs Configured with Different CPU Platforms

    NASA Astrophysics Data System (ADS)

    Posey, Stan; Messmer, Peter; Appleyard, Jeremy

    2015-04-01

    Current trends in high performance computing (HPC) are moving towards the use of graphics processing units (GPUs) to achieve speedups through the extraction of fine-grain parallelism in application software. GPUs have been developed exclusively for computational tasks as massively parallel co-processors to the CPU, and as of 2014 the latest NVIDIA GPU architecture can operate with as many as three CPU platforms. In addition to the conventional use of the x86 CPU architecture with GPUs starting from the mid-2000s, the POWER and ARM-64 architectures have recently become available as x86 alternatives. Today computational efficiency and increased performance per energy cost are key drivers behind HPC decisions to implement GPU-based servers for atmospheric modeling. The choice of a server CPU platform will influence the performance and overall power consumption of a system, as well as the available configurations of CPU-to-GPU ratio. It follows that such system design configurations continue to be a critical factor behind scientific decisions to implement models at higher resolutions and possibly with an increased use of ensembles. This presentation will examine the current state of GPU developments for atmospheric modeling with examples from the COSMO dycore and from various WRF physics, and for different CPU platforms. The examples provided will be relevant to science-scale HPC practice of CPU-GPU system configurations based on the model resolution requirements of a particular simulation. Performance results will compare use of the latest available CPUs from the three available CPU architectures, both with and without GPU acceleration. Finally, an outlook is provided on GPU hardware, software, tools, and programmability for each of the available CPU platforms.

  19. Secure IRC Server

    Energy Science and Technology Software Center (ESTSC)

    2003-08-25

    The IRCD is an IRC server that was originally distributed by the IRCD Hybrid developer team for use as a server for IRC messaging over the public Internet. By supporting the IRC protocol defined in the IRC RFC, IRCD allows users to create and join channels for group or one-to-one text-based instant messaging. It stores information about channels (e.g., whether a channel is public, secret, or invite-only, the topic set, membership) and users (who is online and what channels they are members of). It receives messages for a specific user or channel and forwards these messages to the targeted destination. Since server-to-server communication is also supported, these targeted destinations may be connected to different IRC servers. Messages are exchanged over TCP connections that remain open between the client and the server. The IRCD is being used within the Pervasive Computing Collaboration Environment (PCCE) as the 'chat server' for message exchange over public and private channels. After an LBNLSecureMessaging (PCCE chat) client has been authenticated, the client connects to IRCD with its assigned nickname or 'nick.' The client can then create or join channels for group discussions or one-to-one conversations. These channels can have an initial mode of public or invite-only and the mode may be changed after creation. If a channel is public, anyone online can join the discussion; if a channel is invite-only, users can only join if existing members of the channel explicitly invite them. Users can be invited to any type of channel and users may be members of multiple channels simultaneously. For use with the PCCE environment, the IRCD application (which was written in C) was ported to Linux and has been tested and installed under Linux Redhat 7.2. The source code was also modified with SSL so that all messages exchanged over the network are encrypted. This modified IRC server also verifies with an authentication server that the client is who he or she claims to be and

  20. Secure IRC Server

    SciTech Connect

    Perry, Marcia

    2003-08-25

    The IRCD is an IRC server that was originally distributed by the IRCD Hybrid developer team for use as a server for IRC messaging over the public Internet. By supporting the IRC protocol defined in the IRC RFC, IRCD allows users to create and join channels for group or one-to-one text-based instant messaging. It stores information about channels (e.g., whether a channel is public, secret, or invite-only, the topic set, membership) and users (who is online and what channels they are members of). It receives messages for a specific user or channel and forwards these messages to the targeted destination. Since server-to-server communication is also supported, these targeted destinations may be connected to different IRC servers. Messages are exchanged over TCP connections that remain open between the client and the server. The IRCD is being used within the Pervasive Computing Collaboration Environment (PCCE) as the 'chat server' for message exchange over public and private channels. After an LBNLSecureMessaging (PCCE chat) client has been authenticated, the client connects to IRCD with its assigned nickname or 'nick.' The client can then create or join channels for group discussions or one-to-one conversations. These channels can have an initial mode of public or invite-only and the mode may be changed after creation. If a channel is public, anyone online can join the discussion; if a channel is invite-only, users can only join if existing members of the channel explicitly invite them. Users can be invited to any type of channel and users may be members of multiple channels simultaneously. For use with the PCCE environment, the IRCD application (which was written in C) was ported to Linux and has been tested and installed under Linux Redhat 7.2. The source code was also modified with SSL so that all messages exchanged over the network are encrypted. This modified IRC server also verifies with an authentication server that the client is who he or she claims to be and that

  1. Remote information service access system based on a client-server-service model

    DOEpatents

    Konrad, Allan M.

    1999-01-01

    A local host computing system, a remote host computing system as connected by a network, and service functionalities: a human interface service functionality, a starter service functionality, and a desired utility service functionality, and a Client-Server-Service (CSS) model is imposed on each service functionality. In one embodiment, this results in nine logical components and three physical components (a local host, a remote host, and an intervening network), where two of the logical components are integrated into one Remote Object Client component, and that Remote Object Client component and the other seven logical components are deployed among the local host and remote host in a manner which eases compatibility and upgrade problems, and provides an illusion to a user that a desired utility service supported on a remote host resides locally on the user's local host, thereby providing ease of use and minimal software maintenance for users of that remote service.

  2. Remote information service access system based on a client-server-service model

    DOEpatents

    Konrad, A.M.

    1997-12-09

    A local host computing system, a remote host computing system as connected by a network, and service functionalities: a human interface service functionality, a starter service functionality, and a desired utility service functionality, and a Client-Server-Service (CSS) model is imposed on each service functionality. In one embodiment, this results in nine logical components and three physical components (a local host, a remote host, and an intervening network), where two of the logical components are integrated into one Remote Object Client component, and that Remote Object Client component and the other seven logical components are deployed among the local host and remote host in a manner which eases compatibility and upgrade problems, and provides an illusion to a user that a desired utility service supported on a remote host resides locally on the user's local host, thereby providing ease of use and minimal software maintenance for users of that remote service. 16 figs.

  3. Remote information service access system based on a client-server-service model

    DOEpatents

    Konrad, Allan M.

    1997-01-01

    A local host computing system, a remote host computing system as connected by a network, and service functionalities: a human interface service functionality, a starter service functionality, and a desired utility service functionality, and a Client-Server-Service (CSS) model is imposed on each service functionality. In one embodiment, this results in nine logical components and three physical components (a local host, a remote host, and an intervening network), where two of the logical components are integrated into one Remote Object Client component, and that Remote Object Client component and the other seven logical components are deployed among the local host and remote host in a manner which eases compatibility and upgrade problems, and provides an illusion to a user that a desired utility service supported on a remote host resides locally on the user's local host, thereby providing ease of use and minimal software maintenance for users of that remote service.

  4. Remote information service access system based on a client-server-service model

    DOEpatents

    Konrad, Allan M.

    1996-01-01

    A local host computing system, a remote host computing system as connected by a network, and service functionalities: a human interface service functionality, a starter service functionality, and a desired utility service functionality, and a Client-Server-Service (CSS) model is imposed on each service functionality. In one embodiment, this results in nine logical components and three physical components (a local host, a remote host, and an intervening network), where two of the logical components are integrated into one Remote Object Client component, and that Remote Object Client component and the other seven logical components are deployed among the local host and remote host in a manner which eases compatibility and upgrade problems, and provides an illusion to a user that a desired utility service supported on a remote host resides locally on the user's local host, thereby providing ease of use and minimal software maintenance for users of that remote service.

  5. Remote information service access system based on a client-server-service model

    DOEpatents

    Konrad, A.M.

    1996-08-06

    A local host computing system, a remote host computing system as connected by a network, and service functionalities: a human interface service functionality, a starter service functionality, and a desired utility service functionality, and a Client-Server-Service (CSS) model is imposed on each service functionality. In one embodiment, this results in nine logical components and three physical components (a local host, a remote host, and an intervening network), where two of the logical components are integrated into one Remote Object Client component, and that Remote Object Client component and the other seven logical components are deployed among the local host and remote host in a manner which eases compatibility and upgrade problems, and provides an illusion to a user that a desired utility service supported on a remote host resides locally on the user's local host, thereby providing ease of use and minimal software maintenance for users of that remote service. 16 figs.

  6. ACHESYM: an algorithm and server for standardized placement of macromolecular models in the unit cell.

    PubMed

    Kowiel, Marcin; Jaskolski, Mariusz; Dauter, Zbigniew

    2014-12-01

    Despite the existence of numerous useful conventions in structural crystallography, for example for the choice of the asymmetric part of the unit cell or of reciprocal space, surprisingly no standards are in use for the placement of the molecular model in the unit cell, often leading to inconsistencies or confusion. A conceptual solution for this problem has been proposed for macromolecular crystal structures based on the idea of the anti-Cheshire unit cell. Here, a program and server (called ACHESYM; http://achesym.ibch.poznan.pl) are presented for the practical implementation of this concept. In addition, the first task of ACHESYM is to find an optimal (compact) macromolecular assembly if more than one polymer chain exists. ACHESYM processes PDB (atomic parameters and TLS matrices) and mmCIF (diffraction data) input files to produce a new coordinate set and to reindex the reflections and modify their phases, if necessary. PMID:25478846

  7. An online educational atmospheric global circulation model

    NASA Astrophysics Data System (ADS)

    Navarro, T.; Schott, C.; Forget, F.

    2015-10-01

    As part of the online courses on exoplanets of the Observatoire de Paris, an online tool designed to visualize outputs of the Laboratoire de Météorologie Dynamique (LMD) Global Circulation Model (GCM) for various atmospheric circulation regimes has been developed. It allows students to visualize 1D and 2D plots, along with animations of atmospheric quantities such as temperature, winds, surface pressure and mass flux, from a state-of-the-art model.

  8. How Much? Cost Models for Online Education.

    ERIC Educational Resources Information Center

    Lorenzo, George

    2001-01-01

    Reviews some of the research being done in the area of cost models for online education. Describes a cost analysis handbook; an activity-based costing model that was based on an economic model for traditional instruction at the Indiana University Purdue University Indianapolis; and blending other costing models. (LRW)

  9. Structural modeling of G-protein coupled receptors: An overview on automatic web-servers.

    PubMed

    Busato, Mirko; Giorgetti, Alejandro

    2016-08-01

    Despite the significant efforts and discoveries during the last few years in G protein-coupled receptor (GPCR) expression and crystallization, the receptors with known structures to date represent only a small fraction of human GPCRs. The lack of experimental three-dimensional structures of the receptors is a strong limitation that hampers a deep understanding of their function. Computational techniques are thus a valid alternative strategy to model three-dimensional structures. Indeed, recent advances in the field, together with extraordinary developments in crystallography, in particular its ability to capture GPCRs in different activation states, have led to encouraging results in the generation of accurate models. This prompted the community of modelers to make their methods publicly available through dedicated databases and web servers. Here, we present an extensive overview of these services, focusing on their advantages, drawbacks and their role in successful applications. Future challenges in the field of GPCR modeling, such as the prediction of long loop regions and the modeling of receptor activation states, are presented as well. PMID:27102413

  10. Drug-target interaction prediction: databases, web servers and computational models.

    PubMed

    Chen, Xing; Yan, Chenggang Clarence; Zhang, Xiaotian; Zhang, Xu; Dai, Feng; Yin, Jian; Zhang, Yongdong

    2016-07-01

    Identification of drug-target interactions is an important process in drug discovery. Although high-throughput screening and other biological assays are becoming available, experimental methods for drug-target interaction identification remain extremely costly, time-consuming and challenging. Therefore, various computational models have been developed to predict potential drug-target associations on a large scale. In this review, databases and web servers involved in drug-target identification and drug discovery are summarized. In addition, we introduce state-of-the-art computational models for drug-target interaction prediction, including network-based and machine learning-based methods. For the machine learning-based methods, particular attention is paid to supervised and semi-supervised models, which differ essentially in their use of negative samples. Although many effective computational models have significantly improved drug-target interaction prediction, both network-based and machine learning-based methods have their own disadvantages. Furthermore, we discuss future directions for network-based drug discovery and network approaches to personalized drug discovery based on personalized medicine, genome sequencing, tumor clone-based networks and cancer hallmark-based networks. Finally, we discuss new evaluation and validation frameworks and the formulation of the drug-target interaction prediction problem as a more realistic regression task based on quantitative bioactivity data. PMID:26283676

  11. The HADDOCK2.2 Web Server: User-Friendly Integrative Modeling of Biomolecular Complexes.

    PubMed

    van Zundert, G C P; Rodrigues, J P G L M; Trellet, M; Schmitz, C; Kastritis, P L; Karaca, E; Melquiond, A S J; van Dijk, M; de Vries, S J; Bonvin, A M J J

    2016-02-22

    The prediction of the quaternary structure of biomolecular macromolecules is of paramount importance for fundamental understanding of cellular processes and drug design. In the era of integrative structural biology, one way of increasing the accuracy of modeling methods used to predict the structure of biomolecular complexes is to include as much experimental or predictive information as possible in the process. This has been at the core of our information-driven docking approach HADDOCK. We present here the updated version 2.2 of the HADDOCK portal, which offers new features such as support for mixed molecule types, additional experimental restraints and improved protocols, all of this in a user-friendly interface. With well over 6000 registered users and 108,000 jobs served, an increasing fraction of which on grid resources, we hope that this timely upgrade will help the community to solve important biological questions and further advance the field. The HADDOCK2.2 Web server is freely accessible to non-profit users at http://haddock.science.uu.nl/services/HADDOCK2.2. PMID:26410586

  12. Aviation System Analysis Capability Quick Response System Report Server User's Guide

    NASA Technical Reports Server (NTRS)

    Roberts, Eileen R.; Villani, James A.; Wingrove, Earl R., III

    1996-01-01

    This report is a user's guide for the Aviation System Analysis Capability Quick Response System (ASAC QRS) Report Server. The ASAC QRS is an automated online capability to access selected ASAC models and data repositories. It supports analysis by the aviation community. This system was designed by the Logistics Management Institute for the NASA Ames Research Center. The ASAC QRS Report Server allows users to obtain information stored in the ASAC Data Repositories.

  13. Modelling Typical Online Language Learning Activity

    ERIC Educational Resources Information Center

    Montoro, Carlos; Hampel, Regine; Stickler, Ursula

    2014-01-01

    This article presents the methods and results of a four-year-long research project focusing on the language learning activity of individual learners using online tasks conducted at the University of Guanajuato (Mexico) in 2009-2013. An activity-theoretical model (Blin, 2010; Engeström, 1987) of the typical language learning activity was used to…

  14. Technology and Online Education: Models for Change

    ERIC Educational Resources Information Center

    Cook, Catherine W.; Sonnenberg, Christian

    2014-01-01

    This paper contends that technology changes advance online education. A number of mobile computing and transformative technologies will be examined and incorporated into a descriptive study. The object of the study will be to design innovative mobile awareness models seeking to understand technology changes for mobile devices and how they can be…

  15. Development of Education Support System for Numerical Electromagnetic Analysis Based on Server-Client Model using Java

    NASA Astrophysics Data System (ADS)

    Ohchi, Masashi; Furukawa, Tatsuya; Tanaka, Shin-Ichiro

    Among several numerical methods, the Finite Element Method (FEM) has been adopted in various engineering problems. Against this background, it is necessary to instruct university students in numerical analysis. The authors have designed and implemented a numerical analysis education support system for learning electromagnetic fields, with a Graphical User Interface (GUI), based on the server-client model using Java. The paper describes a feasibility study in a third-year student laboratory class.

  16. Exploring a New Model for Preprint Server: A Case Study of CSPO

    ERIC Educational Resources Information Center

    Hu, Changping; Zhang, Yaokun; Chen, Guo

    2010-01-01

    This paper describes the introduction of an open-access preprint server in China covering 43 disciplines. The system includes mandatory deposit for state-funded research and reports on the repository and its effectiveness and outlines a novel process of peer-review of preprints in the repository, which can be incorporated into the established…

  17. Online Instructors as Thinking Advisors: A Model for Online Learner Adaptation

    ERIC Educational Resources Information Center

    Benedetti, Christopher

    2015-01-01

    This article examines the characteristics and challenges of online instruction and presents a model for improving learner adaptation in an online classroom. Instruction in an online classroom presents many challenges, including learner individualization. Individual differences in learning styles and preferences are often not considered in the…

  18. CovalentDock Cloud: a web server for automated covalent docking.

    PubMed

    Ouyang, Xuchang; Zhou, Shuo; Ge, Zemei; Li, Runtao; Kwoh, Chee Keong

    2013-07-01

    Covalent binding is an important mechanism by which many drugs gain their function. We developed a computational algorithm to model this chemical event and extended it into a web server, CovalentDock Cloud, to make it accessible directly online without any local installation or configuration. It provides a simple yet user-friendly web interface for performing covalent docking experiments and analysis online. The web server accepts the structures of both the ligand and the receptor, uploaded by the user or retrieved from online databases with a valid access ID. It identifies the potential covalent binding patterns, carries out the covalent docking experiments and provides visualization of the results for user analysis. This web server is free and open to all users at http://docking.sce.ntu.edu.sg/. PMID:23677616
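
    The submit-and-retrieve workflow described above (upload ligand and receptor, let the server detect covalent binding patterns and dock, then collect results) can be scripted against any such web service. A minimal sketch follows; the endpoint paths, form-field names and response fields (submit, status, job_id, state) are assumptions for illustration only, not the documented CovalentDock Cloud API.

```python
# Hypothetical sketch of scripting a covalent-docking web submission.
# Endpoint paths and field names below are assumptions, not the real API.
import time
import requests

BASE = "http://docking.sce.ntu.edu.sg"            # server cited in the abstract
SUBMIT = BASE + "/submit"                         # assumed endpoint
STATUS = BASE + "/status/{job_id}"                # assumed endpoint

def submit_job(ligand_path, receptor_path):
    """Upload ligand and receptor structures and return a job identifier."""
    with open(ligand_path, "rb") as lig, open(receptor_path, "rb") as rec:
        resp = requests.post(SUBMIT, files={"ligand": lig, "receptor": rec})
    resp.raise_for_status()
    return resp.json()["job_id"]                  # assumed response field

def wait_for_result(job_id, poll_seconds=30):
    """Poll until the docking job finishes, then return the result payload."""
    while True:
        resp = requests.get(STATUS.format(job_id=job_id))
        resp.raise_for_status()
        data = resp.json()
        if data.get("state") == "done":           # assumed status field
            return data
        time.sleep(poll_seconds)
```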

  19. NetExplore: a web server for modeling small network motifs

    PubMed Central

    Papatsenko, Dmitri; Lemischka, Ihor R.

    2015-01-01

    Motivation: Quantitative and qualitative assessment of biological data often produces small essential recurrent networks, containing 3–5 components called network motifs. In this context, model solutions for small network motifs represent very high interest. Results: Software package NetExplore has been created in order to generate, classify and analyze solutions for network motifs including up to six network components. NetExplore allows plotting and visualization of the solution's phase spaces and bifurcation diagrams. Availability and implementation: The current version of NetExplore has been implemented in Perl-CGI and is accessible at the following locations: http://line.bioinfolab.net/nex/NetExplore.htm and http://nex.autosome.ru/nex/NetExplore.htm. Contact: dmitri.papatsenko@gmail.com Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25637559

  20. Oceanotron server for marine in-situ observations : a thematic data model implementation as a basis for the extensibility

    NASA Astrophysics Data System (ADS)

    Loubrieu, T.; Donnart, J. C.; Bregent, S.; Blower, J.; Griffith, G.

    2012-04-01

    Oceanotron (https://forge.ifremer.fr/plugins/mediawiki/wiki/oceanotron/index.php/Accueil) is an open-source data server dedicated to the dissemination of marine in-situ observations. For extensibility, it relies on an ocean business data model. IFREMER hosts the CORIOLIS marine in-situ data centre (http://www.coriolis.eu.org) and, as the French NODC (National Oceanographic Data Centre, http://www.ifremer.fr/sismer/index_UK.htm), several other in-situ observation databases. As such, IFREMER participates in numerous ocean data management projects. IFREMER wished to capitalize on its thematic data management expertise in a dedicated data dissemination server called Oceanotron, whose development, coordinated by IFREMER, started in 2010. Given the diversity of data repository formats (RDBMS, netCDF, ODV, MEDATLAS, ...) and the temperamental nature of the standard interoperability interface profiles (OGC/WMS, OGC/WFS, OGC/SOS, OpenDAP, ...), the architecture of the software relies on an ocean business data model dedicated to marine in-situ observation features. This data model builds on the CSML conceptual modelling work (http://csml.badc.rl.ac.uk/) and the UNIDATA Common Data Model (http://www.unidata.ucar.edu/software/netcdf-java/CDM/), and focuses on the most common marine observation features: vertical profiles, point series, trajectories and points. The data model has been implemented in Java and can be used as an API. The Oceanotron server orchestrates different types of modules that handle the business data model objects:
    - StorageUnits, which read specific data repository formats (netCDF/OceanSites, netCDF/ARGO, ...);
    - TransformationUnits, which apply useful ocean-related transformations to the features (for example, conversion of vertical coordinates from pressure in dB to metres below the sea surface);
    - FrontDesks, which receive external requests and send results over interoperable protocols (OpenDAP, WMS, ...). These
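
    The three module types listed above compose a simple read-transform-serve pipeline around the shared business data model. A minimal sketch of that composition follows, written in Python for readability even though the actual server is implemented in Java; the class and method names (ObservationFeature, read, apply, serve) are illustrative assumptions, not Oceanotron's real API.

```python
# Sketch of the Oceanotron-style module pipeline: storage units read native
# repository formats, transformation units operate on business-model features
# (profiles, point series, trajectories, points), front desks expose protocols.
from abc import ABC, abstractmethod

class ObservationFeature:
    """Business-model object, e.g. a vertical profile or a point series."""
    def __init__(self, kind, coords, values):
        self.kind, self.coords, self.values = kind, coords, values

class StorageUnit(ABC):
    @abstractmethod
    def read(self, query) -> list[ObservationFeature]: ...

class TransformationUnit(ABC):
    @abstractmethod
    def apply(self, feature: ObservationFeature) -> ObservationFeature: ...

class FrontDesk(ABC):
    @abstractmethod
    def serve(self, features: list[ObservationFeature]) -> bytes: ...

def handle_request(storage: StorageUnit, transforms: list[TransformationUnit],
                   front_desk: FrontDesk, query) -> bytes:
    """Orchestration step: read features, apply transformations, serve result."""
    features = storage.read(query)
    for transform in transforms:
        features = [transform.apply(f) for f in features]
    return front_desk.serve(features)
```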

  1. SimRNAweb: a web server for RNA 3D structure modeling with optional restraints.

    PubMed

    Magnus, Marcin; Boniecki, Michał J; Dawson, Wayne; Bujnicki, Janusz M

    2016-07-01

    RNA function in many biological processes depends on the formation of three-dimensional (3D) structures. However, RNA structure is difficult to determine experimentally, which has prompted the development of predictive computational methods. Here, we introduce a user-friendly online interface for modeling RNA 3D structures using SimRNA, a method that uses a coarse-grained representation of RNA molecules, utilizes the Monte Carlo method to sample the conformational space, and relies on a statistical potential to describe the interactions in the folding process. SimRNAweb makes SimRNA accessible to users who do not normally use high performance computational facilities or are unfamiliar with using the command line tools. The simplest input consists of an RNA sequence to fold RNA de novo. Alternatively, a user can provide a 3D structure in the PDB format, for instance a preliminary model built with some other technique, to jump-start the modeling close to the expected final outcome. The user can optionally provide secondary structure and distance restraints, and can freeze a part of the starting 3D structure. SimRNAweb can be used to model single RNA sequences and RNA-RNA complexes (up to 52 chains). The webserver is available at http://genesilico.pl/SimRNAweb. PMID:27095203

  2. Development of an integrated modelling framework: comparing client-server and demand-driven control flow for model execution

    NASA Astrophysics Data System (ADS)

    Schmitz, Oliver; Karssenberg, Derek; de Jong, Kor; de Kok, Jean-Luc; de Jong, Steven M.

    2014-05-01

    The construction of hydrological models at the catchment or global scale depends on the integration of component models representing various environmental processes, often operating at different spatial and temporal discretisations. A flexible construction of spatio-temporal model components, a means to specify aggregation or disaggregation to bridge discretisation discrepancies, ease of coupling these into complex integrated models, and support for stochastic modelling and the assessment of model outputs are the desired functionalities for the development of integrated models. These functionalities are preferably combined into one modelling framework such that domain specialists can perform exploratory model development without the need to change their working environment. We implemented an integrated modelling framework in the Python programming language, providing support for 1) model construction and 2) model execution. The framework enables modellers to represent spatio-temporal processes or to specify spatio-temporal (dis)aggregation with map algebra operations provided by the PCRaster library. Model algebra operations can be used by the modeller to specify the exchange of data and therefore the coupling of components. The framework determines the control flow for the ordered execution based on the time steps and couplings of the model components given by the modeller. We implemented two different control flow mechanisms. First, a client-server approach is used with a central entity controlling the execution of the component models and steering the data exchange. Second, a demand-driven approach is used that triggers the execution of a component model when data is requested by a coupled component model. We show that both control flow mechanisms allow for the execution of stochastic, multi-scale integrated models. We examine the implications of each control flow mechanism on the terminology used by the modeller to specify integrated models, and illustrate the
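
    The two control-flow mechanisms compared above can be illustrated with a small sketch: a central driver that steps every coupled component and moves data between them (client-server), versus a pull-based driver in which a component runs only when another component requests its output (demand-driven). The framework itself is written in Python, but the Component class, its step() interface and both drivers below are illustrative assumptions, not the framework's actual API.

```python
# Toy contrast of client-server vs demand-driven execution of coupled models.
class Component:
    def __init__(self, name, timestep, inputs=()):
        self.name, self.timestep, self.inputs = name, timestep, list(inputs)
        self.state = {}

    def step(self, t, incoming):
        # Placeholder for map-algebra operations on spatio-temporal data.
        self.state[t] = {"from": [c.name for c in self.inputs], "data": incoming}
        return self.state[t]

def run_client_server(components, end_time):
    """Central driver: advances every component in order and moves the data."""
    outputs = {}
    for t in range(end_time):
        for comp in sorted(components, key=lambda c: c.timestep):
            if t % comp.timestep == 0:
                incoming = [outputs.get((p.name, t)) for p in comp.inputs]
                outputs[(comp.name, t)] = comp.step(t, incoming)
    return outputs

def run_demand_driven(component, t, cache=None):
    """Pull-based driver: a component runs only when its output is requested."""
    cache = {} if cache is None else cache
    key = (component.name, t)
    if key not in cache:
        incoming = [run_demand_driven(p, t, cache) for p in component.inputs]
        cache[key] = component.step(t, incoming)
    return cache[key]
```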

  3. Parallel Computing Using Web Servers and "Servlets".

    ERIC Educational Resources Information Center

    Lo, Alfred; Bloor, Chris; Choi, Y. K.

    2000-01-01

    Describes parallel computing and presents inexpensive ways to implement a virtual parallel computer with multiple Web servers. Highlights include performance measurement of parallel systems; models for using Java and intranet technology including single server, multiple clients and multiple servers, single client; and a comparison of CGI (common…

  4. Online maintaining appearance model using particle filter

    NASA Astrophysics Data System (ADS)

    Chen, Siying; Lan, Tian; Wang, Jianyu; Ni, Guoqiang

    2008-03-01

    Tracking by foreground matching depends heavily on the appearance model to establish object correspondences among frames. Essentially, the appearance model should encode both the part that differs between object and background, to guarantee robustness, and the stable part, to ensure tracking consistency. This paper provides a solution for online maintenance of appearance models by adjusting the features in the model. Object appearance is co-modeled by a subset of Haar features, selected from an over-complete feature dictionary, which encodes the discriminative part of the object appearance, and a color histogram, which describes the stable appearance. During the particle filtering process, feature values from object observations and from background patches are sampled efficiently with the aid of "foreground" and "background" particles, respectively. Based on these sampled values, top-ranked discriminative features are added and invalid features are removed, so that the object remains distinguishable from the current background according to the evolving appearance model. The tracker based on this online appearance-model maintenance technique has been tested on people- and car-tracking tasks, and promising experimental results are obtained.
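
    The maintenance step described above boils down to scoring candidate features by how well they separate foreground samples from background samples, dropping features that have stopped discriminating, and adding the strongest candidates. A minimal sketch follows; the Fisher-style separability score and the thresholds are assumptions for illustration, not the paper's exact criterion.

```python
# Sketch of online feature-set maintenance for a discriminative appearance model.
import numpy as np

def discriminative_score(fg_values, bg_values):
    """Fisher-style separability of one feature's responses (assumed criterion)."""
    fg, bg = np.asarray(fg_values), np.asarray(bg_values)
    return (fg.mean() - bg.mean()) ** 2 / (fg.var() + bg.var() + 1e-9)

def update_feature_set(active, candidates, fg_samples, bg_samples,
                       keep_threshold=0.5, add_count=2):
    """Drop weak active features, then add the strongest candidate features."""
    scores = {f: discriminative_score(fg_samples[f], bg_samples[f])
              for f in set(active) | set(candidates)}
    kept = [f for f in active if scores[f] >= keep_threshold]
    new = sorted((f for f in candidates if f not in kept),
                 key=lambda f: scores[f], reverse=True)[:add_count]
    return kept + new
```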

  5. EarthServer - an FP7 project to enable the web delivery and analysis of 3D/4D models

    NASA Astrophysics Data System (ADS)

    Laxton, John; Sen, Marcus; Passmore, James

    2013-04-01

    EarthServer aims at open access and ad-hoc analytics on big Earth Science data, based on the OGC geoservice standards Web Coverage Service (WCS) and Web Coverage Processing Service (WCPS). The WCS model defines "coverages" as a unifying paradigm for multi-dimensional raster data, point clouds, meshes, etc., thereby addressing a wide range of Earth Science data including 3D/4D models. WCPS allows declarative SQL-style queries on coverages. The project is developing a pilot implementing these standards, and will also investigate the use of GeoSciML to describe coverages. Integration of WCPS with XQuery will in turn allow coverages to be queried in combination with their metadata and GeoSciML descriptions. The unified service will support navigation, extraction, aggregation, and ad-hoc analysis of coverage data from SQL. Clients will range from mobile devices to high-end immersive virtual reality, and will enable 3D model visualisation using web browser technology coupled with developing web standards. EarthServer is establishing open-source client and server technology intended to scale to Petabyte/Exabyte volumes, based on distributed processing, supercomputing, and cloud virtualization. The implementation is based on the existing rasdaman server technology. Services using rasdaman technology are being installed to serve the atmospheric, oceanographic, geological, cryospheric, planetary and general earth observation communities. The geology service (http://earthserver.bgs.ac.uk/) is provided by BGS and at present includes satellite imagery, superficial thickness data, onshore DTMs and 3D models for the Glasgow area. It is intended to extend the available data sets to include 3D voxel models. Use of the WCPS standard allows queries to be constructed against single or multiple coverages. For example, on a single coverage, data for a particular area or with a particular range of pixel values can be selected. Queries on multiple surfaces can be
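
    To make the "declarative SQL-style queries on coverages" concrete, the sketch below issues a WCPS query that subsets a coverage by geographic extent and returns the result as CSV. The coverage name (surface_elevation), the axis labels and the exact endpoint parameters are assumptions for illustration; the real identifiers should be taken from the service's capabilities document.

```python
# Hedged sketch of a WCPS subset query against a coverage server.
import requests

ENDPOINT = "http://earthserver.bgs.ac.uk/rasdaman/ows"   # assumed service path

# Select a spatial subset of a hypothetical elevation coverage, encoded as CSV.
wcps_query = (
    'for c in (surface_elevation) '
    'return encode(c[Lat(55.8:55.9), Long(-4.3:-4.2)], "csv")'
)

response = requests.get(ENDPOINT, params={
    "service": "WCS",
    "version": "2.0.1",
    "request": "ProcessCoverages",   # WCPS extension operation (assumed binding)
    "query": wcps_query,
})
response.raise_for_status()
print(response.text[:200])
```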

  6. CORAL Server and CORAL Server Proxy: Scalable Access to Relational Databases from CORAL Applications

    NASA Astrophysics Data System (ADS)

    Valassi, A.; Bartoldus, R.; Kalkhof, A.; Salnikov, A.; Wache, M.

    2011-12-01

    The CORAL software is widely used at CERN by the LHC experiments to access the data they store on relational databases, such as Oracle. Two new components have recently been added to implement a model involving a middle tier "CORAL server" deployed close to the database and a tree of "CORAL server proxies", providing data caching and multiplexing, deployed close to the client. A first implementation of the two new components, released in the summer 2009, is now deployed in the ATLAS online system to read the data needed by the High Level Trigger, allowing the configuration of a farm of several thousand processes. This paper reviews the architecture of the software, its development status and its usage in ATLAS.

  7. CORAL Server and CORAL Server Proxy: Scalable Access to Relational Databases from CORAL Applications

    SciTech Connect

    Valassi, A.; Bartoldus, R.; Kalkhof, A.; Salnikov, A.; Wache, M.; /Mainz U., Inst. Phys.

    2012-04-19

    The CORAL software is widely used at CERN by the LHC experiments to access the data they store on relational databases, such as Oracle. Two new components have recently been added to implement a model involving a middle tier 'CORAL server' deployed close to the database and a tree of 'CORAL server proxies', providing data caching and multiplexing, deployed close to the client. A first implementation of the two new components, released in the summer 2009, is now deployed in the ATLAS online system to read the data needed by the High Level Trigger, allowing the configuration of a farm of several thousand processes. This paper reviews the architecture of the software, its development status and its usage in ATLAS.

  8. Incorporating 3-dimensional models in online articles

    PubMed Central

    Cevidanes, Lucia H. S.; Ruellasa, Antonio C. O.; Jomier, Julien; Nguyen, Tung; Pieper, Steve; Budin, Francois; Styner, Martin; Paniagua, Beatriz

    2015-01-01

    Introduction The aims of this article were to introduce the capability to view and interact with 3-dimensional (3D) surface models in online publications, and to describe how to prepare surface models for such online 3D visualizations. Methods Three-dimensional image analysis methods include image acquisition, construction of surface models, registration in a common coordinate system, visualization of overlays, and quantification of changes. Cone-beam computed tomography scans were acquired as volumetric images that can be visualized as 3D projected images or used to construct polygonal meshes or surfaces of specific anatomic structures of interest. The anatomic structures of interest in the scans can be labeled with color (3D volumetric label maps), and then the scans are registered in a common coordinate system using a target region as the reference. The registered 3D volumetric label maps can be saved in .obj, .ply, .stl, or .vtk file formats and used for overlays, quantification of differences in each of the 3 planes of space, or color-coded graphic displays of 3D surface distances. Results All registered 3D surface models in this study were saved in .vtk file format and loaded in the Elsevier 3D viewer. In this study, we describe possible ways to visualize the surface models constructed from cone-beam computed tomography images using 2D and 3D figures. The 3D surface models are available in the article’s online version for viewing and downloading using the reader’s software of choice. These 3D graphic displays are represented in the print version as 2D snapshots. Overlays and color-coded distance maps can be displayed using the reader’s software of choice, allowing graphic assessment of the location and direction of changes or morphologic differences relative to the structure of reference. The interpretation of 3D overlays and quantitative color-coded maps requires basic knowledge of 3D image analysis. Conclusions When submitting manuscripts, authors can

  9. An evolving model of online bipartite networks

    NASA Astrophysics Data System (ADS)

    Zhang, Chu-Xu; Zhang, Zi-Ke; Liu, Chuang

    2013-12-01

    Understanding the structure and evolution of online bipartite networks is a significant task, since they play a crucial role in various e-commerce services nowadays. Recently, various attempts have been made to propose different models, resulting in either power-law or exponential degree distributions. However, many empirical results show that the user degree distribution actually follows a shifted power-law distribution, the so-called Mandelbrot's law, which cannot be fully described by previous models. In this paper, we propose an evolving model that considers two different user behaviors: random and preferential attachment. Extensive empirical results on two real bipartite networks, Delicious and CiteULike, show that the theoretical model can well characterize the structure of real networks for both user and object degree distributions. In addition, we introduce a structural parameter p to demonstrate that the hybrid user behavior leads to the shifted power-law degree distribution, and that the region of the power-law tail increases as p increases. The proposed model might shed some light on the underlying laws governing the structure of real online bipartite networks.
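
    The hybrid behavior described above mixes uniform random selection with degree-proportional (preferential) selection. The toy simulation below illustrates how such a mixture shifts the object degree distribution; the exact growth rule and the precise role of the parameter p in the paper may differ, so this is only an illustration of the mechanism, not a reproduction of the model.

```python
# Toy simulation mixing random and preferential attachment to objects.
import random
from collections import Counter

def simulate(num_events=100_000, num_objects=2_000, p=0.7, seed=1):
    rng = random.Random(seed)
    degree = Counter()
    past = []                       # multiset of past selections -> preferential draws
    for _ in range(num_events):
        if past and rng.random() < p:
            obj = rng.choice(past)            # preferential: proportional to degree
        else:
            obj = rng.randrange(num_objects)  # uniform random attachment
        degree[obj] += 1
        past.append(obj)
    return degree

if __name__ == "__main__":
    histogram = Counter(simulate().values())  # degree -> number of objects
    for k in sorted(histogram)[:10]:
        print(k, histogram[k])
```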

  10. Dynamic online sewer modelling in Helsingborg.

    PubMed

    Hernebring, C; Jönsson, L E; Thorén, U B; Møller, A

    2002-01-01

    Within the last decade, the sewer system in Helsingborg, Sweden has been rehabilitated in many ways along with the reconstruction of the WWTP Oresundsverket in order to obtain a high degree of nitrogen and phosphorus removal. In that context a holistic view has been applied in order to optimise the corrective measures as seen from the effects in the receiving waters. A sewer catchment model has been used to evaluate several operation strategies and the effect of introducing RTC. Recently, a MOUSE ONLINE system was installed. In this phase the objective is to establish a stable communication with the SCADA system and to generate short-term flow forecasts. PMID:11936663

  11. EXpectation Propagation LOgistic REgRession (EXPLORER): Distributed Privacy-Preserving Online Model Learning

    PubMed Central

    Wang, Shuang; Jiang, Xiaoqian; Wu, Yuan; Cui, Lijuan; Cheng, Samuel; Ohno-Machado, Lucila

    2013-01-01

    We developed an EXpectation Propagation LOgistic REgRession (EXPLORER) model for distributed privacy-preserving online learning. The proposed framework provides a high-level guarantee for protecting sensitive information, since the information exchanged between the server and the client is the encrypted posterior distribution of coefficients. Through experimental results, EXPLORER shows the same performance (e.g., discrimination, calibration, feature selection, etc.) as the traditional frequentist logistic regression model, but provides more flexibility in model updating. That is, EXPLORER can be updated one point at a time rather than having to retrain on the entire data set when new observations are recorded. The proposed EXPLORER supports asynchronous communication, which relieves the participants from coordinating with one another and prevents service breakdown due to the absence of participants or interrupted communications. PMID:23562651

  12. Consumer's Online Shopping Influence Factors and Decision-Making Model

    NASA Astrophysics Data System (ADS)

    Yan, Xiangbin; Dai, Shiliang

    Previous research on online consumer behavior has mostly been confined to perceived risk, which is used to explain the barriers to purchasing online. However, perceived benefit is another important factor that influences consumers' decisions when shopping online. As a result, an integrated consumer online shopping decision-making model is developed that contains three elements: Consumer, Product, and Web Site. The model proposes the factors that influence consumers' intentions during the online shopping process and divides them into two dimensions: the mental level and the material level. We tested those factors with surveys of more than 200 respondents, both online volunteers and offline paper surveys. With the help of SEM, the experimental results show that the proposed model and method can be used to analyze consumers' online shopping decision-making process effectively.

  13. University Business Models and Online Practices: A Third Way

    ERIC Educational Resources Information Center

    Rubin, Beth

    2013-01-01

    Higher Education is in a state of change, and the existing business models do not meet the needs of stakeholders. This article contrasts the current dominant business models of universities, comparing the traditional non-profit against the for-profit online model, examining the structural features and online teaching practices that underlie each.…

  14. Optimal Self-Tuning PID Controller Based on Low Power Consumption for a Server Fan Cooling System.

    PubMed

    Lee, Chengming; Chen, Rongshun

    2015-01-01

    Recently, saving the cooling power in servers by controlling the fan speed has attracted considerable attention because of the increasing demand for high-density servers. This paper presents an optimal self-tuning proportional-integral-derivative (PID) controller, combining a PID neural network (PIDNN) with fan-power-based optimization of the transient-state temperature response in the time domain, for a server fan cooling system. Because the thermal model of the cooling system is nonlinear and complex, a server mockup system simulating a 1U rack server was constructed, and a fan power model was created using a third-order nonlinear curve fit to determine the cooling power consumed by the fan speed control. The PIDNN with a time-domain criterion is used to tune all PID gains online in an optimized manner. The proposed controller was validated through step-response experiments in which the server operated from the low- to the high-power state. The results show that up to 14% of a server's fan cooling power can be saved if the fan control permits a slight temperature response overshoot in the electronic components, which may provide a time-saving strategy for tuning the PID controller to control the server fan speed during low fan power consumption. PMID:26007725
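
    The control idea described above pairs a PID loop that holds component temperature at a setpoint with a third-order polynomial model of fan power used to quantify the cooling cost. The minimal sketch below shows only that pairing; the gains, polynomial coefficients and speed limits are illustrative assumptions, not the paper's tuned values, and the neural-network tuning layer is omitted.

```python
# Sketch: PID fan-speed control plus a cubic fan-power model for the cooling cost.
def fan_power(speed_rpm, a3=5e-12, a2=1.0e-8, a1=1.0e-4, a0=0.2):
    """Third-order fit of fan power (W) versus fan speed (RPM); coefficients assumed."""
    return a3 * speed_rpm**3 + a2 * speed_rpm**2 + a1 * speed_rpm + a0

def pid_step(error, state, kp=500.0, ki=20.0, kd=50.0, dt=1.0):
    """One PID update; returns the fan-speed command and the updated state."""
    integral = state["integral"] + error * dt
    derivative = (error - state["prev_error"]) / dt
    command = kp * error + ki * integral + kd * derivative
    state.update(integral=integral, prev_error=error)
    return max(1000.0, min(12000.0, command)), state   # clamp to fan range (RPM)

state = {"integral": 0.0, "prev_error": 0.0}
setpoint_c, temperature_c = 70.0, 82.0                 # degrees Celsius
speed, state = pid_step(temperature_c - setpoint_c, state)
print(f"fan speed {speed:.0f} RPM, estimated power {fan_power(speed):.2f} W")
```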

  15. A Conceptual Model for Engagement of the Online Learner

    ERIC Educational Resources Information Center

    Angelino, Lorraine M.; Natvig, Deborah

    2009-01-01

    Engagement of the online learner is one approach to reduce attrition rates. Attrition rates for classes taught through distance education are 10-20% higher than classes taught in a face-to-face setting. This paper introduces a Model for Engagement and provides strategies to engage the online learner. The Model depicts various opportunities where…

  16. Planning for Online Education: A Systems Model

    ERIC Educational Resources Information Center

    Picciano, Anthony G.

    2015-01-01

    The purpose of this article is to revisit the basic principles of technology planning as applied to online education initiatives. While not meant to be an exhaustive treatment of the topic, the article is timely because many colleges and universities are considering the development and expansion of online education as part of their planning…

  17. BION web server: predicting non-specifically bound surface ions

    PubMed Central

    Alexov, Emil

    2013-01-01

    Motivation: Ions are essential components of the cell and are frequently found bound to various macromolecules, in particular to proteins. The binding of an ion to a protein greatly affects the protein's biophysical characteristics and needs to be taken into account in any modeling approach. However, ions' bound positions cannot easily be revealed experimentally, especially if they are loosely bound to the macromolecular surface. Results: Here, we report a web server, the BION web server, which addresses the demand for tools that predict surface-bound ions, for which specific interactions are not crucial and which are therefore difficult to predict. BION is an easy-to-use web server that requires only a coordinate file as input, and the user is provided with various, but easy to navigate, options. The coordinate file with predicted bound ions is displayed in the output and is available for download. Availability: http://compbio.clemson.edu/bion_server/ Supplementary information: Supplementary data are available at Bioinformatics online. Contact: ealexov@clemson.edu PMID:23380591

  18. Generic OPC UA Server Framework

    NASA Astrophysics Data System (ADS)

    Nikiel, Piotr P.; Farnham, Benjamin; Filimonov, Viatcheslav; Schlenker, Stefan

    2015-12-01

    This paper describes a new approach for generic design and efficient development of OPC UA servers. Development starts with creation of a design file, in XML format, describing an object-oriented information model of the target system or device. Using this model, the framework generates an executable OPC UA server application, which exposes the per-design OPC UA address space, without the developer writing a single line of code. Furthermore, the framework generates skeleton code into which the developer adds the necessary logic for integration to the target system or device. This approach allows both developers unfamiliar with the OPC UA standard, and advanced OPC UA developers, to create servers for the systems they are experts in while greatly reducing design and development effort as compared to developments based purely on COTS OPC UA toolkits. Higher level software may further benefit from the explicit OPC UA server model by using the XML design description as the basis for generating client connectivity configuration and server data representation. Moreover, having the XML design description at hand facilitates automatic generation of validation tools. In this contribution, the concept and implementation of this framework is detailed along with examples of actual production-level usage in the detector control system of the ATLAS experiment at CERN and beyond.
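
    The key step described above is generating server code and developer skeletons from an XML design file that models the target device. The sketch below illustrates that design-file-driven generation pattern in Python; the XML layout and the emitted skeleton shape are assumptions for illustration, since the actual framework generates code against an OPC UA toolkit rather than the stubs shown here.

```python
# Sketch of design-file-driven code generation: parse an XML device model and
# emit skeleton classes for the developer to fill in with device logic.
import xml.etree.ElementTree as ET

DESIGN = """
<design>
  <class name="PowerSupply">
    <variable name="voltage" type="double"/>
    <variable name="current" type="double"/>
    <method name="switchOn"/>
  </class>
</design>
"""

def generate_skeletons(design_xml: str) -> str:
    root = ET.fromstring(design_xml)
    lines = []
    for cls in root.findall("class"):
        lines.append(f"class {cls.get('name')}:")
        for var in cls.findall("variable"):
            lines.append(f"    {var.get('name')}: float = 0.0  # modelled as {var.get('type')}")
        for method in cls.findall("method"):
            lines.append(f"    def {method.get('name')}(self):")
            lines.append("        raise NotImplementedError  # device logic goes here")
        lines.append("")
    return "\n".join(lines)

print(generate_skeletons(DESIGN))
```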

  19. On-Line Distance Learning: A Model for Developing Countries.

    ERIC Educational Resources Information Center

    Khan, Abdul W.

    2000-01-01

    Discusses issues related to open and distance-learning (ODL) in developing countries, using the virtual campus initiative of the Indira Gandhi National Open University (India) as an example and model of on-line program delivery and on-line, for-profit telelearning centers. Suggests strategies to enable open and distance-learning institutions to…

  20. Optimal Self-Tuning PID Controller Based on Low Power Consumption for a Server Fan Cooling System

    PubMed Central

    Lee, Chengming; Chen, Rongshun

    2015-01-01

    Recently, saving the cooling power in servers by controlling the fan speed has attracted considerable attention because of the increasing demand for high-density servers. This paper presents an optimal self-tuning proportional-integral-derivative (PID) controller, combining a PID neural network (PIDNN) with fan-power-based optimization of the transient-state temperature response in the time domain, for a server fan cooling system. Because the thermal model of the cooling system is nonlinear and complex, a server mockup system simulating a 1U rack server was constructed, and a fan power model was created using a third-order nonlinear curve fit to determine the cooling power consumed by the fan speed control. The PIDNN with a time-domain criterion is used to tune all PID gains online in an optimized manner. The proposed controller was validated through step-response experiments in which the server operated from the low- to the high-power state. The results show that up to 14% of a server's fan cooling power can be saved if the fan control permits a slight temperature response overshoot in the electronic components, which may provide a time-saving strategy for tuning the PID controller to control the server fan speed during low fan power consumption. PMID:26007725

  1. Using servers to enhance control system capability

    SciTech Connect

    M. Bickley; B.A. Bowling; D.A. Bryan; J. van Zeijts; K.S. White; S. Witherspoon

    1999-03-01

    Many traditional control systems include a distributed collection of front end machines to control hardware. Back end tools are used to view, modify and record the signals generated by these front end machines. Software servers, which are a middleware layer between the front and back ends, can improve a control system in several ways. Servers can enable on-line processing of raw data, and consolidation of functionality. In many cases, data retrieved from the front end must be processed in order to convert the raw data into useful information. These calculations are often redundantly performed by different programs, frequently offline. Servers can monitor the raw data and rapidly perform calculations, producing new signals which can be treated like any other control system signal, and can be used by any back end application. Algorithms can be incorporated to actively modify signal values in the control system based upon changes of other signals, essentially producing feedback in a control system. Servers thus increase the flexibility of a control system. Lastly, servers running on inexpensive UNIX workstations can relay or cache frequently needed information, reducing the load on front end hardware by functioning as concentrators. Rather than many back end tools connecting directly to the front end machines, increasing the work load of these machines, they instead connect to the server. Servers like those discussed above have been used successfully at the Thomas Jefferson National Accelerator Facility to provide functionality such as beam steering, fault monitoring, storage of machine parameters, and on-line data processing. The authors discuss the potential uses of such servers, and share the results of work performed to date.
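
    The middleware role described above, a server that watches raw front-end signals, computes a derived quantity and republishes it as an ordinary control-system signal, is sketched below with a toy in-memory signal bus. The signal names and the bus itself are illustrative assumptions and do not represent a specific control-system API (such as EPICS) or the Jefferson Lab implementation.

```python
# Sketch of a middleware server that derives a new control-system signal.
class SignalBus:
    """Toy publish/read bus standing in for the control system."""
    def __init__(self):
        self.values, self.subscribers = {}, {}

    def publish(self, name, value):
        self.values[name] = value
        for callback in self.subscribers.get(name, []):
            callback(name, value)

    def subscribe(self, name, callback):
        self.subscribers.setdefault(name, []).append(callback)

class DerivedSignalServer:
    """Combines two raw beam-position readings into a steering-error signal."""
    def __init__(self, bus):
        self.bus, self.raw = bus, {}
        for signal in ("bpm:x:upstream", "bpm:x:downstream"):
            bus.subscribe(signal, self.on_update)

    def on_update(self, name, value):
        self.raw[name] = value
        if len(self.raw) == 2:
            error = self.raw["bpm:x:downstream"] - self.raw["bpm:x:upstream"]
            self.bus.publish("beam:x:steering_error", error)

bus = SignalBus()
server = DerivedSignalServer(bus)
bus.publish("bpm:x:upstream", 0.12)
bus.publish("bpm:x:downstream", 0.35)
print(bus.values["beam:x:steering_error"])   # approximately 0.23
```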

  2. USING SERVERS TO ENHANCE CONTROL SYSTEM CAPABILITY.

    SciTech Connect

    BICKLEY,M.; BOWLING,B.A.; BRYAN,D.A.; ZEIJTS,J.; WHITE,K.S.; WITHERSPOON,S.

    1999-03-29

    Many traditional control systems include a distributed collection of front end machines to control hardware. Back end tools are used to view, modify, and record the signals generated by these front end machines. Software servers, which are a middleware layer between the front and back ends, can improve a control system in several ways. Servers can enable on-line processing of raw data, and consolidation of functionality. In many cases data retrieved from the front end must be processed in order to convert the raw data into useful information. These calculations are often redundantly performed by different programs, frequently offline. Servers can monitor the raw data and rapidly perform calculations, producing new signals which can be treated like any other control system signal, and can be used by any back end application. Algorithms can be incorporated to actively modify signal values in the control system based upon changes of other signals, essentially producing feedback in a control system. Servers thus increase the flexibility of a control system. Lastly, servers running on inexpensive UNIX workstations can relay or cache frequently needed information, reducing the load on front end hardware by functioning as concentrators. Rather than many back end tools connecting directly to the front end machines, increasing the work load of these machines, they instead connect to the server. Servers like those discussed above have been used successfully at the Thomas Jefferson National Accelerator Facility to provide functionality such as beam steering, fault monitoring, storage of machine parameters, and on-line data processing. The authors discuss the potential uses of such servers, and share the results of work performed to date.

  3. RMS ENVELOPE BACK-PROPAGATION IN THE XAL ONLINE MODEL

    SciTech Connect

    Allen, Christopher K; Sako, Hiroyuki; Ikegami, Masanori

    2009-01-01

    The ability to back-propagate RMS envelopes was added to the J-PARC XAL online model. Specifically, given an arbitrary downstream location, the online model can propagate the RMS envelopes backward to an arbitrary upstream location. This feature provides support for algorithms estimating upstream conditions from downstream data. The upgrade required significant refactoring, which we outline. We also show simulations using the new feature.
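
    For a purely linear element with transfer matrix M, forward propagation of the second-moment (RMS envelope) matrix is sigma_out = M sigma_in M^T, so back-propagation amounts to applying the inverse map. The sketch below illustrates only that matrix algebra; the XAL element model and space-charge effects make the actual calculation more involved, and the drift matrix and moment values used here are placeholders.

```python
# Sketch of the linear-optics relation underlying RMS envelope back-propagation.
import numpy as np

def propagate(sigma_in: np.ndarray, M: np.ndarray) -> np.ndarray:
    """Forward propagation of the second-moment matrix through an element."""
    return M @ sigma_in @ M.T

def back_propagate(sigma_out: np.ndarray, M: np.ndarray) -> np.ndarray:
    """Recover upstream second moments from downstream ones."""
    M_inv = np.linalg.inv(M)
    return M_inv @ sigma_out @ M_inv.T

# Drift of length L in one transverse plane (x, x'); illustrative values only.
L = 1.5
M_drift = np.array([[1.0, L],
                    [0.0, 1.0]])
sigma_up = np.array([[4.0e-6, 0.0],
                     [0.0, 1.0e-6]])
sigma_down = propagate(sigma_up, M_drift)
print(np.allclose(back_propagate(sigma_down, M_drift), sigma_up))   # True
```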

  4. The Targeted Open Online Course (TOOC) Model

    ERIC Educational Resources Information Center

    Baker, Credence; Gentry, James

    2014-01-01

    In an era of increasingly hyped Massive Open Online Courses (MOOCs) that seem to evoke feelings of both promise and peril for higher education, many institutions are struggling to find their niche among top-tier Ivy League schools offering courses to thousands of participants for free. While the effectiveness of MOOCs in terms of learning outcomes…

  5. Online Educational Delivery Models: A Descriptive View

    ERIC Educational Resources Information Center

    Hill, Phil

    2012-01-01

    Although there has been a long history of distance education, the creation of online education occurred just over a decade and a half ago--a relatively short time in academic terms. Early course delivery via the web had started by 1994, soon followed by a more structured approach using the new category of course management systems. Since that…

  6. ProTSAV: A protein tertiary structure analysis and validation server.

    PubMed

    Singh, Ankita; Kaushik, Rahul; Mishra, Avinash; Shanker, Asheesh; Jayaram, B

    2016-01-01

    Quality assessment of predicted model structures of proteins is as important as protein tertiary structure prediction itself. A highly efficient quality assessment of predicted model structures directs further research on function. Here we present a new server, ProTSAV, capable of evaluating predicted model structures based on several popular online servers and standalone tools. ProTSAV furnishes the user with a single quality score for an individual protein structure, along with a graphical representation and ranking in the case of multiple protein structures. The server is validated on ~64,446 protein structures, including experimental structures from the RCSB and predicted model structures for CASP targets and from public decoy sets. ProTSAV succeeds in predicting the quality of protein structures with a specificity of 100% and a sensitivity of 98% on experimentally solved structures, and achieves a specificity of 88% and a sensitivity of 91% on predicted protein structures of CASP11 targets under 2 Å. The server overcomes the limitations of any single server/method and is robust in quality assessment. ProTSAV is freely available at http://www.scfbio-iitd.res.in/software/proteomics/protsav.jsp. PMID:26478257

  7. The Client Server Design of the Gemini Data Handling System

    NASA Astrophysics Data System (ADS)

    Hill, Norman; Gaudet, Séverin; Dunn, Jennifer; Jaeger, Shannon; Cockayne, Steve

    The Gemini Telescopes Data Handling System (DHS) developed by the Canadian Astronomy Data Centre (CADC) has diverse requirements to support the operation of the Gemini telescopes. The DHS is implemented as a group of servers, where each performs separate functions. The servers use a client server model to communicate between themselves and with other Gemini software systems. This paper describes the client server model of the Gemini Data Handling System.

  8. oGNM: online computation of structural dynamics using the Gaussian Network Model.

    PubMed

    Yang, Lee-Wei; Rader, A J; Liu, Xiong; Jursa, Cristopher Jon; Chen, Shann Ching; Karimi, Hassan A; Bahar, Ivet

    2006-07-01

    An assessment of the equilibrium dynamics of biomolecular systems, and in particular of their most cooperative fluctuations accessible under native-state conditions, is a first step towards understanding molecular mechanisms relevant to biological function. We present a web-based system, oGNM, that enables users to calculate online the shape and dispersion of normal modes of motion for proteins, oligonucleotides and their complexes, or associated biological units, using the Gaussian Network Model (GNM). Computations with the new engine are 5-6 orders of magnitude faster than those using conventional normal mode analyses. Two case studies illustrate the utility of oGNM. The first shows that the thermal fluctuations predicted for 1250 non-homologous proteins correlate well with X-ray crystallographic data over a broad range (7.3-15 Å) of inter-residue interaction cutoff distances, and the correlations improve with increasing observation temperatures. The second study, focused on 64 oligonucleotides and oligonucleotide-protein complexes, shows that good agreement with experiments is achieved by representing each nucleotide by three GNM nodes (as opposed to one node per residue in proteins) along with uniform interaction ranges for all components of the complexes. These results open the way to a rapid assessment of the dynamics of DNA/RNA-containing complexes. The server can be accessed at http://ignm.ccbb.pitt.edu/GNM_Online_Calculation.htm. PMID:16845002
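
    The GNM calculation the server performs reduces to building a Kirchhoff (connectivity) matrix from node coordinates with a distance cutoff and taking its eigendecomposition; predicted fluctuations come from the non-zero modes. The sketch below uses a toy helical chain of nodes as a stand-in for real coordinates; the node placement and cutoff are placeholders, while the matrix construction follows the standard GNM formulation.

```python
# Sketch of a Gaussian Network Model calculation: Kirchhoff matrix -> modes -> fluctuations.
import numpy as np

def kirchhoff(coords: np.ndarray, cutoff: float = 10.0) -> np.ndarray:
    diff = coords[:, None, :] - coords[None, :, :]
    dist = np.sqrt((diff ** 2).sum(-1))
    gamma = -(dist <= cutoff).astype(float)   # -1 for contacting pairs
    np.fill_diagonal(gamma, 0.0)
    np.fill_diagonal(gamma, -gamma.sum(axis=1))  # diagonal = node degree
    return gamma

def gnm_fluctuations(coords: np.ndarray, cutoff: float = 10.0) -> np.ndarray:
    """Relative mean-square fluctuations from the non-zero GNM modes."""
    vals, vecs = np.linalg.eigh(kirchhoff(coords, cutoff))
    vals, vecs = vals[1:], vecs[:, 1:]        # discard the zero mode
    return (vecs ** 2 / vals).sum(axis=1)

t = np.arange(60)
coords = np.column_stack([8 * np.cos(t / 3), 8 * np.sin(t / 3), 1.5 * t])  # toy chain
print(gnm_fluctuations(coords)[:5])
```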

  9. The NEOS server.

    SciTech Connect

    Czyzyk, J.; Mesnier, M. P.; More, J. J.; Mathematics and Computer Science

    1998-07-01

    The Network-Enabled Optimization System (NEOS) is an Internet based optimization service. The NEOS Server introduces a novel approach for solving optimization problems. Users of the NEOS Server submit a problem and their choice of optimization solver over the Internet. The NEOS Server computes all information (for example, derivatives and sparsity patterns) required by the solver, links the optimization problem with the solver, and returns a solution.

  10. CentiServer: A Comprehensive Resource, Web-Based Application and R Package for Centrality Analysis

    PubMed Central

    Jalili, Mahdi; Salehzadeh-Yazdi, Ali; Asgari, Yazdan; Arab, Seyed Shahriar; Yaghmaie, Marjan; Ghavamzadeh, Ardeshir; Alimoghaddam, Kamran

    2015-01-01

    Various disciplines are trying to address one of the most noteworthy questions and broadly used concepts in biology: essentiality. Centrality is a primary index and a promising method for identifying essential nodes, particularly in biological networks. The newly created CentiServer is a comprehensive online resource that provides over 110 definitions of different centrality indices, their computational methods, and algorithms in the form of an encyclopedia. In addition, CentiServer allows users to calculate 55 centralities with the help of an interactive web-based application tool and provides numerical results in comma-separated-value (csv) format or a mapped graphical format as a graph modeling language (GML) file. The standalone version of this application has been developed in the form of an R package. The web-based application (CentiServer) and R package (centiserve) are freely available at http://www.centiserver.org/ PMID:26571275
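
    The kind of analysis the resource supports, computing several centrality indices for a network and ranking nodes to flag candidate essential nodes, is sketched below using networkx purely as an illustration; the actual tools are the CentiServer web application and the centiserve R package, and the example graph is a standard toy network.

```python
# Illustration of multi-index centrality analysis on a small example network.
import networkx as nx

G = nx.karate_club_graph()                    # small benchmark network
centralities = {
    "degree": nx.degree_centrality(G),
    "closeness": nx.closeness_centrality(G),
    "betweenness": nx.betweenness_centrality(G),
    "eigenvector": nx.eigenvector_centrality(G, max_iter=1000),
}

# Rank nodes by betweenness to flag candidate "essential" nodes.
top = sorted(centralities["betweenness"].items(),
             key=lambda kv: kv[1], reverse=True)[:5]
for node, score in top:
    print(node, round(score, 3))
```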

  11. Data Transfer Software-SAS MetaData Server & Phoenix Integration Model Center

    Energy Science and Technology Software Center (ESTSC)

    2010-04-15

    This software is a plug-in that interfaces between Phoenix Integration's Model Center and Base SAS 9.2 applications. The end use of the plug-in is to link input and output data that reside in SAS tables or MS SQL to and from "legacy" software programs without recoding. The potential end users are those who need to run legacy code and want data stored in a SQL database.

  12. Online Ph.D. Program Delivery Models and Student Success

    ERIC Educational Resources Information Center

    Jorissen, Shari L.; Keen, James P.; Riedel, Eric S.

    2015-01-01

    The purpose of this study was to provide information to an online university that offers Ph.D. programs in three formats: knowledge area modules (or KAM, a type of faculty-led, self-directed doctoral study), course-based model, and mixed model (a combination of the KAM and course-based models). The investigators sought to determine why students…

  13. Building an Online Wisdom Community: A Transformational Design Model

    ERIC Educational Resources Information Center

    Gunawardena, Charlotte N.; Jennings, Barbara; Ortegano-Layne, Ludmila C.; Frechette, Casey; Carabajal, Kayleigh; Lindemann, Ken; Mummert, Julia

    2004-01-01

    This paper discusses the development of a new instructional design model based on socioconstructivist learning theories and distance education principles for the design of online wisdom communities and the efficacy of the model drawing on evaluation results from its implementation in Fall 2002. The model, Final Outcome Centered Around Learner…

  14. A Model for Measuring Effectiveness of an Online Course

    ERIC Educational Resources Information Center

    Mashaw, Bijan

    2012-01-01

    As a result of this research, a quantitative model and a procedure have been developed to create an online mentoring effectiveness index (EI). To develop the model, mentoring and teaching effectiveness are defined, and then the constructs and factors of effectiveness are identified. The model's construction is based on the theory that…

  15. Beyond clients and servers.

    PubMed Central

    van Mulligen, E.; Timmers, T.

    1994-01-01

    Computer scientists working in medical informatics face the problem that software offered by industry is increasingly being adopted for clinical use by medical professionals. A new challenge arises: how to combine these off-the-shelf commercial solutions with typical medical software that has existed for some years and has proved reliable [1]. With the HERMES project, this challenge was accepted, and possible solutions for integrating existing legacy systems with state-of-the-art commercial solutions have been investigated. After a period of prototyping to assess possible alternative solutions, a system based on an indirect client-server model was implemented with the help of industry. In this paper, its architecture is described together with the most important features currently covered. Based on the HERMES architecture, systems for both clinical data analysis and patient care (cardiology) are currently being developed. PMID:7949988

  16. Optimal allocation of file servers in a local network environment

    NASA Technical Reports Server (NTRS)

    Woodside, C. M.; Tripathi, S. K.

    1986-01-01

    Files associated with workstations in a local area network are to be allocated among two or more file servers. Assuming statistically identical workstations and file servers and a performance model which is a closed multiclass separable queueing network, an optimal allocation is found. It is shown that all the files of each workstation should be placed on one file server, with the workstations divided as equally as possible among the file servers.
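
    The stated optimal allocation, keep each workstation's files together on a single file server and split the workstations as evenly as possible across servers, can be realised for statistically identical workstations with a simple round-robin assignment. The sketch below is only an illustration of that rule; the queueing-network analysis that justifies it is not reproduced here.

```python
# Sketch of the allocation rule: whole workstations assigned, servers balanced.
def allocate(workstations: list[str], servers: list[str]) -> dict[str, list[str]]:
    assignment = {s: [] for s in servers}
    for i, ws in enumerate(workstations):
        assignment[servers[i % len(servers)]].append(ws)  # round-robin balancing
    return assignment

print(allocate([f"ws{i}" for i in range(7)], ["fs1", "fs2"]))
# {'fs1': ['ws0', 'ws2', 'ws4', 'ws6'], 'fs2': ['ws1', 'ws3', 'ws5']}
```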

  17. Performance of a distributed superscalar storage server

    NASA Technical Reports Server (NTRS)

    Finestead, Arlan; Yeager, Nancy

    1993-01-01

    The RS/6000 performed well in our test environment. The potential exists for the RS/6000 to act as a departmental server for a small number of users, rather than as a high-speed archival server. Multiple UniTree Disk Servers utilizing one UniTree Name Server could be developed, which would allow for a cost-effective archival system. Our performance tests were clearly limited by the network bandwidth. The performance data gathered by the LibUnix testing show that UniTree is capable of exceeding Ethernet speeds on an RS/6000 Model 550. The performance of FTP might be significantly faster across a higher-bandwidth network. The UniTree Name Server also showed signs of being a potential bottleneck. UniTree sites that require a high ratio of file creations and deletions to reads and writes would run into this bottleneck. It is possible to improve UniTree Name Server performance by bypassing the UniTree LibUnix library altogether, communicating directly with the UniTree Name Server and optimizing creations. Although testing was performed in a less-than-ideal environment, the performance statistics stated in this paper should give end users a realistic idea of what performance they can expect in this type of setup.

  18. KoBaMIN: a knowledge-based minimization web server for protein structure refinement

    PubMed Central

    Rodrigues, João P. G. L. M.; Levitt, Michael; Chopra, Gaurav

    2012-01-01

    The KoBaMIN web server provides an online interface to a simple, consistent and computationally efficient protein structure refinement protocol based on minimization of a knowledge-based potential of mean force. The server can be used to refine either a single protein structure or an ensemble of proteins starting from their unrefined coordinates in PDB format. The refinement method is particularly fast and accurate due to the underlying knowledge-based potential derived from structures deposited in the PDB; as such, the energy function implicitly includes the effects of solvent and the crystal environment. Our server allows for an optional but recommended step that optimizes stereochemistry using the MESHI software. The KoBaMIN server also allows comparison of the refined structures with a provided reference structure to assess the changes brought about by the refinement protocol. The performance of KoBaMIN has been benchmarked widely on a large set of decoys, all models generated at the seventh worldwide experiments on critical assessment of techniques for protein structure prediction (CASP7) and it was also shown to produce top-ranking predictions in the refinement category at both CASP8 and CASP9, yielding consistently good results across a broad range of model quality values. The web server is fully functional and freely available at http://csb.stanford.edu/kobamin. PMID:22564897

  19. KoBaMIN: a knowledge-based minimization web server for protein structure refinement.

    PubMed

    Rodrigues, João P G L M; Levitt, Michael; Chopra, Gaurav

    2012-07-01

    The KoBaMIN web server provides an online interface to a simple, consistent and computationally efficient protein structure refinement protocol based on minimization of a knowledge-based potential of mean force. The server can be used to refine either a single protein structure or an ensemble of proteins starting from their unrefined coordinates in PDB format. The refinement method is particularly fast and accurate due to the underlying knowledge-based potential derived from structures deposited in the PDB; as such, the energy function implicitly includes the effects of solvent and the crystal environment. Our server allows for an optional but recommended step that optimizes stereochemistry using the MESHI software. The KoBaMIN server also allows comparison of the refined structures with a provided reference structure to assess the changes brought about by the refinement protocol. The performance of KoBaMIN has been benchmarked widely on a large set of decoys, all models generated at the seventh worldwide experiments on critical assessment of techniques for protein structure prediction (CASP7) and it was also shown to produce top-ranking predictions in the refinement category at both CASP8 and CASP9, yielding consistently good results across a broad range of model quality values. The web server is fully functional and freely available at http://csb.stanford.edu/kobamin. PMID:22564897

  20. A Comparison Between Publish-and-Subscribe and Client-Server Models in Distributed Control System Networks

    NASA Technical Reports Server (NTRS)

    Boulanger, Richard P., Jr.; Kwauk, Xian-Min; Stagnaro, Mike; Kliss, Mark (Technical Monitor)

    1998-01-01

    The BIO-Plex control system requires real-time, flexible, and reliable data delivery. There is no simple "off-the-shelf" solution. However, several commercial packages will be evaluated using a testbed at ARC for publish-and-subscribe and client-server communication architectures. A point-to-point communication architecture is not suitable for the real-time BIO-Plex control system. A client-server architecture provides more flexible data delivery; however, it does not provide direct communication among nodes on the network. A publish-and-subscribe implementation allows direct information exchange among nodes on the net, providing the best time-critical communication. In this work, Network Data Delivery Service (NDDS) from Real-Time Innovations, Inc. (RTI) will be used to implement the publish-and-subscribe architecture. It offers update guarantees and deadlines for real-time data delivery. BridgeVIEW, a data acquisition and control software package from National Instruments, will be tested for the client-server arrangement. A microwave incinerator located at ARC will be instrumented with a fieldbus network of control devices. BridgeVIEW will be used to implement an enterprise server. An enterprise network consisting of several nodes at ARC and a WAN connecting ARC and RISC will then be set up to evaluate the proposed control system architectures. Several network configurations will be evaluated for fault tolerance, quality of service, reliability, and efficiency. Data acquired from these network evaluation tests will then be used to determine preliminary design criteria for the BIO-Plex distributed control system.
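
    For readers comparing the two architectures discussed above, the following toy sketch contrasts them in miniature. It is purely illustrative: the classes and topic names are invented and do not correspond to NDDS, BridgeVIEW, or any fieldbus API.

```python
# Minimal publish-and-subscribe bus: nodes exchange data directly by topic.
class Bus:
    def __init__(self):
        self.subscribers = {}              # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, value):
        for callback in self.subscribers.get(topic, []):
            callback(value)                # delivered to every interested node

# Minimal client-server equivalent: every read is mediated by the server.
class Server:
    def __init__(self):
        self.latest = {}

    def write(self, tag, value):           # acquisition node pushes a value
        self.latest[tag] = value

    def read(self, tag):                   # client polls the server for it
        return self.latest.get(tag)

bus = Bus()
bus.subscribe("incinerator/temperature", lambda v: print("alarm check:", v))
bus.publish("incinerator/temperature", 412.0)

srv = Server()
srv.write("incinerator/temperature", 412.0)
print("polled:", srv.read("incinerator/temperature"))
```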

  1. XAL-Based Applications and Online Models for LCLS

    SciTech Connect

    Chu, P.; Woodley, M.; Iverson, R.; Krejcik, P.; White, G.; Wu, J.; Gan, Q.; /Beijing, Inst. High Energy Phys.

    2009-12-11

    XAL, a high-level accelerator application framework originally developed at the Spallation Neutron Source (SNS), Oak Ridge National Laboratory, has been adopted by the Linac Coherent Light Source (LCLS) project. The work includes relational database schema modifications to better suit XAL configuration data requirements, the addition of new device types for LCLS online modeling purposes, a longitudinal coordinate system change to better represent the LCLS electron beam rather than the proton or ion beams of the original SNS XAL design, intensive benchmarking of the online model against MAD and the present SLC modeling system, and various new features added to the XAL framework. Storing online model data in a relational database and providing universal access methods for other applications are also described here.

  2. A last updating evolution model for online social networks

    NASA Astrophysics Data System (ADS)

    Bu, Zhan; Xia, Zhengyou; Wang, Jiandong; Zhang, Chengcui

    2013-05-01

    As information technology has advanced, people are turning to electronic media more frequently for communication, and social relationships are increasingly found on online channels. However, there is very limited knowledge about the actual evolution of online social networks. In this paper, we propose and study a novel evolution network model with the new concept of “last updating time”, which exists in many real-life online social networks. The last updating evolution network model can maintain the robustness of scale-free networks and can improve the network's resilience against intentional attacks. Moreover, we found that it has the “small-world effect”, which is an inherent property of most social networks. Simulation experiments based on this model show that the results are consistent with real-life data, which indicates that our model is valid.

  3. Servers Made to Order

    SciTech Connect

    Anderson, Daryl L.

    2007-11-01

    Virtualization is a hot buzzword right now, and it’s no wonder federal agencies are coming around to the idea of consolidating their servers and storage. Traditional servers do nothing for about 80% of their lifecycle, yet draw nearly half of their peak energy consumption, which wastes capacity and power. Server virtualization creates logical "machines" on a single physical server. At the Pacific Northwest National Laboratory in Richland, Washington, using virtualization technology is proving to be a cost-effective way to make better use of current server hardware resources while reducing hardware lifecycle costs and cooling demands, and saving precious data center space. And as an added bonus, virtualization also ties in with the Laboratory’s mission to be responsible stewards of the environment as well as of the Department of Energy’s assets. This article explains why even the smallest IT shops can benefit from the Laboratory’s best practices.

  4. Teaching Communications Online Using the Master Teacher Model.

    ERIC Educational Resources Information Center

    Wilhelm, William J.

    2003-01-01

    Indiana State University's business school online education program uses the Master Teacher Model adopted from the Open University. The model employs a hierarchy of instructors: program director, master teacher, and adjunct faculty. Methods include case analyses, threaded discussions, and e-mail for interactive group communication. (Contains 10…

  5. FULLY COUPLED "ONLINE" CHEMISTRY WITHIN THE WRF MODEL

    EPA Science Inventory

    A fully coupled "online" Weather Research and Forecasting/Chemistry (WRF/Chem) model has been developed. The air quality component of the model is fully consistent with the meteorological component; both components use the same transport scheme (mass and scalar preserving), the s...

  6. R3D Align web server for global nucleotide to nucleotide alignments of RNA 3D structures.

    PubMed

    Rahrig, Ryan R; Petrov, Anton I; Leontis, Neocles B; Zirbel, Craig L

    2013-07-01

    The R3D Align web server provides online access to 'RNA 3D Align' (R3D Align), a method for producing accurate nucleotide-level structural alignments of RNA 3D structures. The web server provides a streamlined and intuitive interface, input data validation and output that is more extensive and easier to read and interpret than related servers. The R3D Align web server offers a unique Gallery of Featured Alignments, providing immediate access to pre-computed alignments of large RNA 3D structures, including all ribosomal RNAs, as well as guidance on effective use of the server and interpretation of the output. By accessing the non-redundant lists of RNA 3D structures provided by the Bowling Green State University RNA group, R3D Align connects users to structure files in the same equivalence class and the best-modeled representative structure from each group. The R3D Align web server is freely accessible at http://rna.bgsu.edu/r3dalign/. PMID:23716643

  7. R3D Align web server for global nucleotide to nucleotide alignments of RNA 3D structures

    PubMed Central

    Rahrig, Ryan R.; Petrov, Anton I.; Leontis, Neocles B.; Zirbel, Craig L.

    2013-01-01

    The R3D Align web server provides online access to ‘RNA 3D Align’ (R3D Align), a method for producing accurate nucleotide-level structural alignments of RNA 3D structures. The web server provides a streamlined and intuitive interface, input data validation and output that is more extensive and easier to read and interpret than related servers. The R3D Align web server offers a unique Gallery of Featured Alignments, providing immediate access to pre-computed alignments of large RNA 3D structures, including all ribosomal RNAs, as well as guidance on effective use of the server and interpretation of the output. By accessing the non-redundant lists of RNA 3D structures provided by the Bowling Green State University RNA group, R3D Align connects users to structure files in the same equivalence class and the best-modeled representative structure from each group. The R3D Align web server is freely accessible at http://rna.bgsu.edu/r3dalign/. PMID:23716643

  8. Remote diagnosis server

    NASA Technical Reports Server (NTRS)

    Deb, Somnath (Inventor); Ghoshal, Sudipto (Inventor); Malepati, Venkata N. (Inventor); Kleinman, David L. (Inventor); Cavanaugh, Kevin F. (Inventor)

    2004-01-01

    A network-based diagnosis server for monitoring and diagnosing a system, the server being remote from the system it is observing, comprises a sensor for generating signals indicative of a characteristic of a component of the system, a network-interfaced sensor agent coupled to the sensor for receiving signals therefrom, a broker module coupled to the network for sending signals to and receiving signals from the sensor agent, a handler application connected to the broker module for transmitting signals to and receiving signals therefrom, and a reasoner application in communication with the handler application for processing and responding to signals received from the handler application, wherein the sensor agent, broker module, handler application, and reasoner application operate simultaneously relative to each other, such that the diagnosis server performs continuous monitoring and diagnosis of the components of the system in real time. The diagnosis server is readily adaptable to various different systems.

  9. An individuality model for online signatures using global Fourier descriptors

    NASA Astrophysics Data System (ADS)

    Kholmatov, Alisher; Yanikoglu, Berrin

    2008-03-01

    The discriminative capability of a biometric is based on its individuality/uniqueness and is an important factor in choosing a biometric for a large-scale deployment. Individuality studies have been carried out rigorously for only certain biometrics, in particular fingerprint and iris, while work on establishing handwriting and signature individuality has been mainly at the feature level. In this study, we present a preliminary individuality model for online signatures using the Fourier domain representation of the signature. Using the normalized Fourier coefficients as global features describing the signature, we derive a formula for the probability of coincidentally matching a given signature. Estimating model parameters from a large database and making certain simplifying assumptions, the probability that two arbitrary signatures match in 13 of the coefficients is calculated as 4.7×10^-4. When compared with the results of a verification algorithm that parallels the theoretical model, the results show that the theoretical model fits the random forgery test results fairly well. While online signatures are sometimes dismissed as not very secure, our results show that the probability of successfully guessing an online signature is very low. Combined with the fact that signature is a behavioral biometric with adjustable complexity, these results support the use of online signatures for biometric authentication.
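
    As a rough illustration of the global-Fourier-descriptor idea, one can compare two signature trajectories by their leading normalised Fourier coefficients and count how many agree within a tolerance. This is only a sketch: the normalisation, tolerance, and example trajectories below are our own choices, not the authors' exact procedure.

```python
import numpy as np

def fourier_descriptors(x, y, n_coeffs=13):
    """Leading Fourier-coefficient magnitudes of a pen trajectory, normalised
    so the comparison is translation- and scale-invariant (illustrative only)."""
    z = np.asarray(x) + 1j * np.asarray(y)          # complex trajectory
    coeffs = np.fft.fft(z - z.mean())               # subtracting the mean removes translation
    mags = np.abs(coeffs[1:n_coeffs + 1])
    return mags / (np.linalg.norm(mags) + 1e-12)    # unit norm removes scale

def matching_coefficients(sig_a, sig_b, tol=0.05):
    """Count how many of the leading coefficients agree within `tol`."""
    fa, fb = fourier_descriptors(*sig_a), fourier_descriptors(*sig_b)
    return int(np.sum(np.abs(fa - fb) < tol))

t = np.linspace(0, 2 * np.pi, 256, endpoint=False)
signature = (np.cos(t), np.sin(2 * t))
different = (np.cos(3 * t), np.sin(5 * t))
print(matching_coefficients(signature, signature))   # 13: identical signatures
print(matching_coefficients(signature, different))   # fewer coefficients agree
```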

  10. Keeping Our Network Safe: A Model of Online Protection Behaviour

    ERIC Educational Resources Information Center

    Lee, Doohwang; Larose, Robert; Rifon, Nora

    2008-01-01

    The objective of this study is to develop and test a model of online protection behaviour, particularly regarding the use of virus protection. Hypotheses are proposed concerning the predictors of the intention to engage in virus protection behaviour. Using a survey of 273 college students who use the Internet, a test of the hypotheses is conducted…

  11. Designing a Predictive Model of Student Satisfaction in Online Learning

    ERIC Educational Resources Information Center

    Parahoo, Sanjai K; Santally, Mohammad Issack; Rajabalee, Yousra; Harvey, Heather Lea

    2016-01-01

    Higher education institutions consider student satisfaction to be one of the major elements in determining the quality of their programs. The objective of the study was to develop a model of student satisfaction to identify the influencers that emerged in online higher education settings. The study adopted a mixed method approach to identify…

  12. A Distributed Online Curriculum and Courseware Development Model

    ERIC Educational Resources Information Center

    Durdu, Pinar Onay; Yalabik, Nese; Cagiltay, Kursat

    2009-01-01

    A distributed online curriculum and courseware development model (DONC[superscript 2]) is developed and tested in this study. Courseware development teams which may work in different institutions who need to develop high quality, reduced cost, on time products will be the users of DONC[superscript 2]. The related features from the disciplines of…

  13. A Performance-Based Development Model for Online Faculty

    ERIC Educational Resources Information Center

    Fang, Berlin

    2007-01-01

    Faculty development in distance education does not happen in a vacuum. It is often interwoven with efforts to increase adoption of distance education programs and increase the effectiveness of online teaching. Training might not be the only way to meet these needs. This article presents a new faculty-development model, based on a systematic…

  14. Building a Model Explaining the Social Nature of Online Learning

    ERIC Educational Resources Information Center

    Tsai, I-Chun; Kim, Bosung; Liu, Pei-Ju; Goggins, Sean P.; Kumalasari, Christiana; Laffey, James M.

    2008-01-01

    Based on a framework emphasizing the social nature of learning, this research examines a model of how social constructs affect satisfaction within online learning using path analysis for students in higher education. The social constructs evaluated in this study include sense of community (SOC), social ability (SA), perceived ease of use (PEU) and…

  15. Collaborative Online Teaching: A Model for Gerontological Social Work Education

    ERIC Educational Resources Information Center

    Fulton, Amy E.; Walsh, Christine A.; Azulai, Anna; Gulbrandsen, Cari; Tong, Hongmei

    2015-01-01

    Social work students and faculty are increasingly embracing online education and collaborative teaching. Yet models to support these activities have not been adequately developed. This paper describes how a team of instructors developed, delivered, and evaluated an undergraduate gerontological social work course using a collaborative online…

  16. Home media server content management

    NASA Astrophysics Data System (ADS)

    Tokmakoff, Andrew A.; van Vliet, Harry

    2001-07-01

    With the advent of set-top boxes, the convergence of TV (broadcasting) and PC (Internet) is set to enter the home environment. Currently, a great deal of activity is occurring in developing standards (TV-Anytime Forum) and devices (TiVo) for local storage on Home Media Servers (HMS). These devices lie at the heart of convergence of the triad: communications/networks - content/media - computing/software. Besides massive storage capacity and being a communications 'gateway', the home media server is characterised by the ability to handle metadata and software that provides an easy to use on-screen interface and intelligent search/content handling facilities. In this paper, we describe a research prototype HMS that is being developed within the GigaCE project at the Telematica Instituut. Our prototype demonstrates advanced search and retrieval (video browsing), adaptive user profiling and an innovative 3D component of the Electronic Program Guide (EPG) which represents online presence. We discuss the use of MPEG-7 for representing metadata, the use of MPEG-21 working draft standards for content identification, description and rights expression, and the use of HMS peer-to-peer content distribution approaches. Finally, we outline explorative user behaviour experiments that aim to investigate the effectiveness of the prototype HMS during development.

  17. Blast furnace on-line simulation model

    NASA Astrophysics Data System (ADS)

    Saxén, Henrik

    1990-10-01

    A mathematical model of the ironmaking blast furnace (BF) is presented. The model describes the steady-state operation of the furnace in one spatial dimension using real process data sampled at the steelworks. The measurement data are reconciled by an interface routine which yields boundary conditions obeying the conservation laws of atoms and energy. The simulation model, which provides a picture of the internal conditions of the BF, can be used to evaluate the current state of the process and to predict the effect of operating actions on the performance of the furnace.

  18. A simple generative model of collective online behavior

    PubMed Central

    Gleeson, James P.; Cellai, Davide; Onnela, Jukka-Pekka; Porter, Mason A.; Reed-Tsochas, Felix

    2014-01-01

    Human activities increasingly take place in online environments, providing novel opportunities for relating individual behaviors to population-level outcomes. In this paper, we introduce a simple generative model for the collective behavior of millions of social networking site users who are deciding between different software applications. Our model incorporates two distinct mechanisms: one is associated with recent decisions of users, and the other reflects the cumulative popularity of each application. Importantly, although various combinations of the two mechanisms yield long-time behavior that is consistent with data, the only models that reproduce the observed temporal dynamics are those that strongly emphasize the recent popularity of applications over their cumulative popularity. This demonstrates—even when using purely observational data without experimental design—that temporal data-driven modeling can effectively distinguish between competing microscopic mechanisms, allowing us to uncover previously unidentified aspects of collective online behavior. PMID:25002470
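
    A toy simulation of the two mechanisms the model combines, recency-driven choice versus cumulative-popularity-driven choice, can be written in a few lines. The parameter names and values below are illustrative assumptions, not the paper's calibrated model.

```python
import random

def simulate(n_users=10000, n_apps=50, recency_weight=0.9, window=100):
    """Each new user copies a recent adopter's choice with probability
    `recency_weight`, otherwise picks an app proportionally to its
    cumulative popularity (illustrative toy model)."""
    counts = [1] * n_apps                  # seed every app with one adopter
    recent = list(range(n_apps))           # sliding window of recent choices
    for _ in range(n_users):
        if random.random() < recency_weight:
            choice = random.choice(recent)                              # recent-popularity mechanism
        else:
            choice = random.choices(range(n_apps), weights=counts)[0]   # cumulative-popularity mechanism
        counts[choice] += 1
        recent.append(choice)
        recent = recent[-window:]
    return counts

print(sorted(simulate(), reverse=True)[:5])   # a handful of apps dominate
```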

  19. A simple generative model of collective online behavior.

    PubMed

    Gleeson, James P; Cellai, Davide; Onnela, Jukka-Pekka; Porter, Mason A; Reed-Tsochas, Felix

    2014-07-22

    Human activities increasingly take place in online environments, providing novel opportunities for relating individual behaviors to population-level outcomes. In this paper, we introduce a simple generative model for the collective behavior of millions of social networking site users who are deciding between different software applications. Our model incorporates two distinct mechanisms: one is associated with recent decisions of users, and the other reflects the cumulative popularity of each application. Importantly, although various combinations of the two mechanisms yield long-time behavior that is consistent with data, the only models that reproduce the observed temporal dynamics are those that strongly emphasize the recent popularity of applications over their cumulative popularity. This demonstrates--even when using purely observational data without experimental design--that temporal data-driven modeling can effectively distinguish between competing microscopic mechanisms, allowing us to uncover previously unidentified aspects of collective online behavior. PMID:25002470

  20. Submersed Aquatic Vegetation Modeling Output Online

    USGS Publications Warehouse

    Yin, Yao; Rogala, Jim; Sullivan, John; Rohweder, Jason J.

    2005-01-01

    Introduction: The ability to predict the distribution of submersed aquatic vegetation in the Upper Mississippi River on the basis of physical or chemical variables is useful to resource managers. Wildlife managers have a keen interest in advanced estimates of food quantity such as American wildcelery (Vallisneria americana) population status to give out more informed advisories to hunters before the fall hunting season. Predictions for distribution of submerged aquatic vegetation beds can potentially increase hunter observance of voluntary avoidance zones where foraging birds are left alone to feed undisturbed. In years when submersed aquatic vegetation is predicted to be scarce in important wildlife habitats, managers can get the message out to hunters well before the hunting season (Jim Nissen, Upper Mississippi River National Wildlife and Fish Refuge, La Crosse District Manager, La Crosse, Wisconsin, personal communication). We developed a statistical model to predict the probability of occurrence of submersed aquatic vegetation in Pool 8 of the Upper Mississippi River on the basis of a few hydrological, physical, and geomorphic variables. Our model takes into consideration flow velocity, wind fetch, bathymetry, growing-season daily water level, and light extinction coefficient in the river (fig. 1) and calculates the probability of submersed aquatic vegetation existence in Pool 8 in individual 5- x 5-m grid cells. The model was calibrated using the data collected in 1998 (516 sites), 1999 (595 sites), and 2000 (649 sites) using a stratified random sampling protocol (Yin and others, 2000b). To validate the model, we chose the data from the Long Term Resource Monitoring Program (LTRMP) transect sampling in backwater areas (Rogers and Owens 1995; Yin and others, 2000a) and ran the model for each 5- x 5-m grid cell in every growing season from 1991 to 2001. We tallied all the cells and came up with an annual average percent frequency of submersed aquatic vegetation

  1. Online Knowledge-Based Model for Big Data Topic Extraction

    PubMed Central

    Khan, Muhammad Taimoor; Durrani, Mehr; Khalid, Shehzad; Aziz, Furqan

    2016-01-01

    Lifelong machine learning (LML) models learn with experience, maintaining a knowledge-base, without user intervention. Unlike traditional single-domain models, they can easily scale up to explore big data. The existing LML models have high data dependency, consume more resources, and do not support streaming data. This paper proposes an online LML model (OAMC) to support streaming data with reduced data dependency. By engineering the knowledge-base and introducing new knowledge features, the learning pattern of the model is improved for data arriving in pieces. OAMC improves accuracy, measured as topic coherence, by 7% for streaming data while reducing the processing cost by half. PMID:27195004

  2. Visible Geology - Interactive online geologic block modelling

    NASA Astrophysics Data System (ADS)

    Cockett, R.

    2012-12-01

    Geology is a highly visual science, and many disciplines require spatial awareness and manipulation. For example, interpreting cross-sections, geologic maps, or plotting data on a stereonet all require various levels of spatial abilities. These skills are often not focused on in undergraduate geoscience curricula and many students struggle with spatial relations, manipulations, and penetrative abilities (e.g. Titus & Horsman, 2009). A newly developed program, Visible Geology, allows for students to be introduced to many geologic concepts and spatial skills in a virtual environment. Visible Geology is a web-based, three-dimensional environment where students can create and interrogate their own geologic block models. The program begins with a blank model; users then add geologic beds (with custom thickness and color) and can add geologic deformation events like tilting, folding, and faulting. Additionally, simple intrusive dikes can be modelled, as well as unconformities. Students can also explore the interaction of geology with topography by drawing elevation contours to produce their own topographic models. Students can not only spatially manipulate their model, but can create cross-sections and boreholes to practice their visual penetrative abilities. Visible Geology is easy to access and use, with no downloads required, so it can be incorporated into current, paper-based, lab activities. Sample learning activities are being developed that target introductory and structural geology curricula with learning objectives such as relative geologic history, fault characterization, apparent dip and thickness, interference folding, and stereonet interpretation. Visible Geology provides a richly interactive and immersive environment for students to explore geologic concepts and practice their spatial skills. (Figure: screenshot of Visible Geology showing folding and faulting interactions on a ridge topography.)

  3. Characterizing and Modeling the Dynamics of Online Popularity

    NASA Astrophysics Data System (ADS)

    Ratkiewicz, Jacob; Fortunato, Santo; Flammini, Alessandro; Menczer, Filippo; Vespignani, Alessandro

    2010-10-01

    Online popularity has an enormous impact on opinions, culture, policy, and profits. We provide a quantitative, large scale, temporal analysis of the dynamics of online content popularity in two massive model systems: the Wikipedia and an entire country’s Web space. We find that the dynamics of popularity are characterized by bursts, displaying characteristic features of critical systems such as fat-tailed distributions of magnitude and interevent time. We propose a minimal model combining the classic preferential popularity increase mechanism with the occurrence of random popularity shifts due to exogenous factors. The model recovers the critical features observed in the empirical analysis of the systems analyzed here, highlighting the key factors needed in the description of popularity dynamics.

  4. PEM public key certificate cache server

    NASA Astrophysics Data System (ADS)

    Cheung, T.

    1993-12-01

    Privacy Enhanced Mail (PEM) provides privacy enhancement services to users of Internet electronic mail. Confidentiality, authentication, message integrity, and non-repudiation of origin are provided by applying cryptographic measures to messages transferred between end systems by the Message Transfer System. PEM supports both symmetric and asymmetric key distribution. However, the prevalent implementation uses a public key certificate-based strategy, modeled after the X.509 directory authentication framework. This scheme provides an infrastructure compatible with X.509. According to RFC 1422, public key certificates can be stored in directory servers, transmitted via non-secure message exchanges, or distributed via other means. Directory services provide a specialized distributed database for OSI applications. The directory contains information about objects and then provides structured mechanisms for accessing that information. Since directory services are not widely available now, a good approach is to manage certificates in a centralized certificate server. This document describes the detailed design of a centralized certificate cache server. This server manages a cache of certificates and a cache of Certificate Revocation Lists (CRLs) for PEM applications. PEM applications contact the server to obtain/store certificates and CRLs. The server software is programmed in C and ELROS. To use this server, ISODE has to be configured and installed properly. The ISODE library 'libisode.a' has to be linked together with this library because ELROS uses the transport layer functions provided by 'libisode.a'. The X.500 DAP library that is included with the ELROS distribution has to be linked in also, since the server uses the DAP library functions to communicate with directory servers.
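
    The access pattern described, PEM applications querying a central server that falls back to the directory on a miss, is the familiar read-through cache. A schematic sketch follows; the class and names are hypothetical and do not reflect the actual C/ELROS implementation.

```python
class CertificateCache:
    """Read-through cache: return a stored certificate if present, otherwise
    fetch it from the directory service and remember it (illustrative only)."""
    def __init__(self, directory_lookup):
        self.store = {}
        self.directory_lookup = directory_lookup

    def get(self, distinguished_name):
        if distinguished_name not in self.store:            # cache miss
            self.store[distinguished_name] = self.directory_lookup(distinguished_name)
        return self.store[distinguished_name]

fake_directory = {"cn=alice": "-----BEGIN CERTIFICATE----- ..."}
cache = CertificateCache(fake_directory.get)
print(cache.get("cn=alice"))   # fetched from the directory, then cached
print(cache.get("cn=alice"))   # served from the cache
```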

  5. FFAS server: novel features and applications.

    PubMed

    Jaroszewski, Lukasz; Li, Zhanwen; Cai, Xiao-hui; Weber, Christoph; Godzik, Adam

    2011-07-01

    The Fold and Function Assignment System (FFAS) server [Jaroszewski et al. (2005) FFAS03: a server for profile-profile sequence alignments. Nucleic Acids Research, 33, W284-W288] implements the algorithm for protein profile-profile alignment introduced originally in [Rychlewski et al. (2000) Comparison of sequence profiles. Strategies for structural predictions using sequence information. Protein Science: a Publication of the Protein Society, 9, 232-241]. Here, we present updates, changes and novel functionality added to the server since 2005 and discuss its new applications. The sequence database used to calculate sequence profiles was enriched by adding sets of publicly available metagenomic sequences. The profile of a user's protein can now be compared with ∼20 additional profile databases, including several complete proteomes, human proteins involved in genetic diseases and a database of microbial virulence factors. A newly developed interface uses a system of tabs, allowing the user to navigate multiple results pages, and also includes novel functionality, such as a dotplot graph viewer, modeling tools, an improved 3D alignment viewer and links to the database of structural similarities. The FFAS server was also optimized for speed: running times were reduced by an order of magnitude. The FFAS server, http://ffas.godziklab.org, has no log-in requirement, albeit there is an option to register and store results in individual, password-protected directories. Source code and Linux executables for the FFAS program are available for download from the FFAS server. PMID:21715387

  6. Online robust model estimation during in vivo needle insertions.

    PubMed

    Barbé, Laurent; Bayle, Bernard; de Mathelin, Michel

    2006-01-01

    Soft tissue modeling is of key importance in medical robotics and simulation. In the case of percutaneous operations, a fine model of layer transitions and target tissues is required. However, the nature and the variety of these tissues are such that this problem is extremely complex. In this article, we propose a method to estimate the interaction between in vivo tissues and a surgical needle. The online robust estimation of a varying-parameter model is achieved during an insertion under standard operating conditions. PMID:16404010

  7. Dali server update.

    PubMed

    Holm, Liisa; Laakso, Laura M

    2016-07-01

    The Dali server (http://ekhidna2.biocenter.helsinki.fi/dali) is a network service for comparing protein structures in 3D. In favourable cases, comparing 3D structures may reveal biologically interesting similarities that are not detectable by comparing sequences. The Dali server has been running in various places for over 20 years and is used routinely by crystallographers on newly solved structures. The latest update of the server provides enhanced analytics for the study of sequence and structure conservation. The server performs three types of structure comparisons: (i) Protein Data Bank (PDB) search compares one query structure against those in the PDB and returns a list of similar structures; (ii) pairwise comparison compares one query structure against a list of structures specified by the user; and (iii) all against all structure comparison returns a structural similarity matrix, a dendrogram and a multidimensional scaling projection of a set of structures specified by the user. Structural superimpositions are visualized using the Java-free WebGL viewer PV. The structural alignment view is enhanced by sequence similarity searches against Uniprot. The combined structure-sequence alignment information is compressed to a stack of aligned sequence logos. In the stack, each structure is structurally aligned to the query protein and represented by a sequence logo. PMID:27131377

  8. An Instrument Development Model for Online Surveys in Human Resource Development and Adult Education

    ERIC Educational Resources Information Center

    Strachota, Elaine M.; Conceicao, Simone C. O.; Schmidt, Steven W.

    2006-01-01

    This article describes the use of a schematic model for developing and distributing online surveys. Two empirical studies that developed and implemented online surveys to collect data to measure satisfaction in various aspects of human resource development and adult education exemplify the use of the model to conduct online survey research. The…

  9. Feedback control by online learning an inverse model.

    PubMed

    Waegeman, Tim; Wyffels, Francis; Schrauwen, Francis

    2012-10-01

    A model, predictor, or error estimator is often used by a feedback controller to control a plant. Creating such a model is difficult when the plant exhibits nonlinear behavior. In this paper, a novel online learning control framework is proposed that does not require explicit knowledge about the plant. This framework uses two learning modules, one for creating an inverse model, and the other for actually controlling the plant. Except for their inputs, they are identical. The inverse model learns by the exploration performed by the not yet fully trained controller, while the actual controller is based on the currently learned model. The proposed framework allows fast online learning of an accurate controller. The controller can be applied on a broad range of tasks with different dynamic characteristics. We validate this claim by applying our control framework on several control tasks: 1) the heating tank problem (slow nonlinear dynamics); 2) flight pitch control (slow linear dynamics); and 3) the balancing problem of a double inverted pendulum (fast linear and nonlinear dynamics). The results of these experiments show that fast learning and accurate control can be achieved. Furthermore, a comparison is made with some classical control approaches, and observations concerning convergence and stability are made. PMID:24808008

  10. A Hybrid Evaluation Model for Evaluating Online Professional Development

    ERIC Educational Resources Information Center

    Hahs-Vaughn, Debbie; Zygouris-Coe, Vicky; Fiedler, Rebecca

    2007-01-01

    Online professional development is multidimensional. It encompasses: a) an online, web-based format; b) professional development; and most likely c) specific objectives tailored to and created for the respective online professional development course. Evaluating online professional development is therefore also multidimensional and as such both…

  11. On-line lower-order modeling via neural networks.

    PubMed

    Ho, H F; Rad, A B; Wong, Y K; Lo, W L

    2003-10-01

    This paper presents a novel method to determine the parameters of a first-order plus dead-time model using neural networks. The outputs of the neural networks are the gain, dominant time constant, and apparent time delay. By combining this algorithm with a conventional PI or PID controller, we also present an adaptive controller which requires very little a priori knowledge about the plant under control. The simplicity of the scheme for real-time control provides a new approach for implementing neural network applications for a variety of on-line industrial control problems. Simulation and experimental results demonstrate the feasibility and adaptive property of the proposed scheme. PMID:14582882
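
    To make the first-order-plus-dead-time (FOPDT) idea concrete, the sketch below evaluates the FOPDT step response for given estimates of the gain K, dominant time constant tau, and apparent delay theta, and plugs them into the classical Ziegler-Nichols open-loop PI rule. The parameter values are invented and the tuning rule is shown only as one common choice, not as the neural-network-based scheme proposed in the paper.

```python
import math

def fopdt_step_response(K, tau, theta, t):
    """Unit-step response of a first-order-plus-dead-time model:
    y(t) = K * (1 - exp(-(t - theta)/tau)) for t >= theta, 0 before the delay."""
    if t < theta:
        return 0.0
    return K * (1.0 - math.exp(-(t - theta) / tau))

def zn_open_loop_pi(K, tau, theta):
    """Classical Ziegler-Nichols open-loop PI settings (one common rule)."""
    Kp = 0.9 * tau / (K * theta)   # proportional gain
    Ti = theta / 0.3               # integral time
    return Kp, Ti

K, tau, theta = 2.0, 10.0, 1.5     # hypothetical estimates of the plant
print([round(fopdt_step_response(K, tau, theta, t), 3) for t in (0, 2, 5, 20)])
print(zn_open_loop_pi(K, tau, theta))
```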

  12. Time dependent optimal switching controls in online selling models

    SciTech Connect

    Bradonjic, Milan; Cohen, Albert

    2010-01-01

    We present a method to incorporate dishonesty in online selling via a stochastic optimal control problem. In our framework, the seller wishes to maximize her average wealth level W at a fixed time T of her choosing. The corresponding Hamilton-Jacobi-Bellman (HJB) equation is analyzed for a basic case. For more general models, the admissible control set is restricted to a jump process that switches between extreme values. We propose a new approach, in which the optimal control problem is reduced to a multivariable optimization problem.

  13. Responses and Influences: A Model of Online Information Use for Learning

    ERIC Educational Resources Information Center

    Hughes, Hilary

    2006-01-01

    Introduction: Explores the complexity of online information use for learning in the culturally-diverse, information and communication technologies-intensive, higher education context. It presents a Model of responses and influences in online information use for learning, which aims to increase awareness of the complexity of online information use…

  14. OPC Data Acquisition Server for CPDev Engineering Environment

    NASA Astrophysics Data System (ADS)

    Rzońca, Dariusz; Sadolewski, Jan; Trybus, Bartosz

    An OPC server has been created for the CPDev engineering environment, which provides classified process data to OPC client applications. Hierarchical Coloured Petri nets are used at the design stage to model the communications of the server with CPDev target controllers. The implementation involves a universal interface for acquiring data via different communication protocols such as Modbus or .NET Remoting.

  15. Dynamic Web Pages: Performance Impact on Web Servers.

    ERIC Educational Resources Information Center

    Kothari, Bhupesh; Claypool, Mark

    2001-01-01

    Discussion of Web servers and requests for dynamic pages focuses on experimentally measuring and analyzing the performance of the three dynamic Web page generation technologies: CGI, FastCGI, and Servlets. Develops a multivariate linear regression model and predicts Web server performance under some typical dynamic requests. (Author/LRW)
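
    A minimal version of the kind of multivariate linear regression described above can be fitted with ordinary least squares. The predictors and measurements below are invented for illustration; they are not the CGI/FastCGI/Servlet data from the study.

```python
import numpy as np

# Hypothetical measurements: request rate, dynamic-page fraction, response size (KB).
X = np.array([[10, 0.2, 5], [50, 0.5, 8], [100, 0.7, 12], [200, 0.9, 20]], dtype=float)
y = np.array([12.0, 45.0, 90.0, 210.0])   # observed mean response time (ms), invented

X1 = np.column_stack([np.ones(len(X)), X])        # add an intercept column
coef, *_ = np.linalg.lstsq(X1, y, rcond=None)     # ordinary least-squares fit

new_load = np.array([1.0, 150, 0.8, 15])          # predict an unseen load point
print("predicted response time (ms):", float(new_load @ coef))
```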

  16. Microsoft SQL Server 6.0® Workbook

    SciTech Connect

    Augustenborg, E.C.

    1996-09-01

    This workbook was prepared for introductory training in the use of Microsoft SQL Server Version 6.0. The examples are all taken from the PUBS database that Microsoft distributes for training purposes or from the Microsoft Online Documentation. The merits of the relational database are presented.

  17. Enhanced networked server management with random remote backups

    NASA Astrophysics Data System (ADS)

    Kim, Song-Kyoo

    2003-08-01

    In this paper, the model is focused on available server management in network environments. The (remote) backup servers are hooked up by VPN (Virtual Private Network) and replace broken main servers immediately. A virtual private network (VPN) is a way to use a public network infrastructure to hook up long-distance servers within a single network infrastructure. The servers can be represented as "machines", and the system then deals with unreliable main machines and random auxiliary spare (remote backup) machines. When the system performs mandatory routine maintenance, auxiliary machines are used for backups during idle periods. Unlike other existing models, the availability of auxiliary machines changes for each activation in this enhanced model. Analytically tractable results are obtained by using several mathematical techniques, and the results are demonstrated in the framework of optimized networked server allocation problems.

  18. PDS: A Performance Database Server

    DOE PAGESBeta

    Berry, Michael W.; Dongarra, Jack J.; Larose, Brian H.; Letsche, Todd A.

    1994-01-01

    The process of gathering, archiving, and distributing computer benchmark data is a cumbersome task usually performed by computer users and vendors with little coordination. Most important, there is no publicly available central depository of performance data for all ranges of machines from personal computers to supercomputers. We present an Internet-accessible performance database server (PDS) that can be used to extract current benchmark data and literature. As an extension to the X-Windows-based user interface (Xnetlib) to the Netlib archival system, PDS provides an on-line catalog of public domain computer benchmarks such as the LINPACK benchmark, Perfect benchmarks, and the NAS parallel benchmarks. PDS does not reformat or present the benchmark data in any way that conflicts with the original methodology of any particular benchmark; it is thereby devoid of any subjective interpretations of machine performance. We believe that all branches (research laboratories, academia, and industry) of the general computing community can use this facility to archive performance metrics and make them readily available to the public. PDS can provide a more manageable approach to the development and support of a large dynamic database of published performance metrics.

  19. An online model composition tool for system biology models

    PubMed Central

    2013-01-01

    Background: There are multiple representation formats for Systems Biology computational models, and the Systems Biology Markup Language (SBML) is one of the most widely used. SBML is used to capture, store, and distribute computational models by Systems Biology data sources (e.g., the BioModels Database) and researchers. Therefore, there is a need for all-in-one web-based solutions that support advanced SBML functionalities such as uploading, editing, composing, visualizing, simulating, querying, and browsing computational models. Results: We present the design and implementation of the Model Composition Tool (Interface) within the PathCase-SB (PathCase Systems Biology) web portal. The tool helps users compose systems biology models to facilitate the complex process of merging systems biology models. We also present three tools that support the model composition tool, namely, (1) the Model Simulation Interface, which generates a visual plot of the simulation according to the user's input; (2) the iModel Tool, a platform for users to upload their own models to compose; and (3) the SimCom Tool, which provides a side-by-side comparison of models being composed in the same pathway. Finally, we provide a web site that hosts BioModels Database models and a separate web site that hosts SBML Test Suite models. Conclusions: The model composition tool (and the other three tools) can be used with little or no knowledge of the SBML document structure. For this reason, students or anyone who wants to learn about systems biology will benefit from the described functionalities. SBML Test Suite models will be a nice starting point for beginners, and, for more advanced purposes, users will be able to access and employ models of the BioModels Database as well. PMID:24006914

  20. Client/server study

    NASA Technical Reports Server (NTRS)

    Dezhgosha, Kamyar; Marcus, Robert; Brewster, Stephen

    1995-01-01

    The goal of this project is to find cost-effective and efficient strategies/solutions to integrate existing databases, manage network, and improve productivity of users in a move towards client/server and Integrated Desktop Environment (IDE) at NASA LeRC. The project consisted of two tasks as follows: (1) Data collection, and (2) Database Development/Integration. Under task 1, survey questionnaires and a database were developed. Also, an investigation on commercially available tools for automated data-collection and net-management was performed. As requirements evolved, the main focus has been task 2 which involved the following subtasks: (1) Data gathering/analysis of database user requirements, (2) Database analysis and design, making recommendations for modification of existing data structures into relational database or proposing a common interface to access heterogeneous databases(INFOMAN system, CCNS equipment list, CCNS software list, USERMAN, and other databases), (3) Establishment of a client/server test bed at Central State University (CSU), (4) Investigation of multi-database integration technologies/ products for IDE at NASA LeRC, and (5) Development of prototypes using CASE tools (Object/View) for representative scenarios accessing multi-databases and tables in a client/server environment. Both CSU and NASA LeRC have benefited from this project. CSU team investigated and prototyped cost-effective/practical solutions to facilitate NASA LeRC move to a more productive environment. CSU students utilized new products and gained skills that could be a great resource for future needs of NASA.

  1. Online NARMAX model for electron fluxes at GEO

    NASA Astrophysics Data System (ADS)

    Boynton, R. J.; Balikhin, M. A.; Billings, S. A.

    2015-03-01

    Multi-input single-output (MISO) nonlinear autoregressive moving average with exogenous inputs (NARMAX) models have been derived to forecast the > 0.8 MeV and > 2 MeV electron fluxes at geostationary Earth orbit (GEO). The NARMAX algorithm is able to identify mathematical models for a wide class of nonlinear systems from input-output data. The models employ solar wind parameters as inputs to provide an estimate of the average electron flux for the following day, i.e. the 1-day forecast. The identified models are shown to provide a reliable forecast for both the > 0.8 and > 2 MeV electron fluxes and are capable of providing real-time warnings of when the electron fluxes will be dangerously high for satellite systems. These models, named the SNB3GEO > 0.8 and > 2 MeV electron flux models, have been implemented online at http://www.ssg.group.shef.ac.uk/USSW/UOSSW.html.
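
    A heavily simplified, linear NARX-style illustration of the forecasting setup is sketched below: tomorrow's flux is regressed on lagged flux and lagged solar wind speed. The inputs, lags, and synthetic data are our own assumptions; this is not the SNB3GEO model.

```python
import numpy as np

def fit_narx(flux, vsw, lag=2):
    """Least-squares fit of flux[t+1] on `lag` past values of flux and
    solar wind speed (a linear-in-parameters toy, not true NARMAX)."""
    rows, targets = [], []
    for t in range(lag, len(flux) - 1):
        rows.append(np.r_[flux[t - lag + 1:t + 1], vsw[t - lag + 1:t + 1], 1.0])
        targets.append(flux[t + 1])
    theta, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
    return theta

rng = np.random.default_rng(0)
vsw = 400 + 50 * rng.standard_normal(200)             # synthetic solar wind speed
flux = np.zeros(200)
for t in range(1, 200):
    flux[t] = 0.8 * flux[t - 1] + 0.01 * vsw[t - 1] + rng.standard_normal()

theta = fit_narx(flux, vsw)
lag = 2
x_next = np.r_[flux[-lag:], vsw[-lag:], 1.0] @ theta  # 1-day-ahead forecast
print("next-day flux estimate:", round(float(x_next), 2))
```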

  2. Modelling Influence and Opinion Evolution in Online Collective Behaviour

    PubMed Central

    Gend, Pascal; Rentfrow, Peter J.; Hendrickx, Julien M.; Blondel, Vincent D.

    2016-01-01

    Opinion evolution and judgment revision are mediated through social influence. Based on a large crowdsourced in vitro experiment (n = 861), it is shown how a consensus model can be used to predict opinion evolution in online collective behaviour. It is the first time the predictive power of a quantitative model of opinion dynamics has been tested against a real dataset. Unlike previous research on the topic, the model was validated on data which did not serve to calibrate it. This avoids favoring more complex models over simpler ones and prevents overfitting. The model is parametrized by the influenceability of each individual, a factor representing to what extent individuals incorporate external judgments. The prediction accuracy depends on prior knowledge of the participants’ past behaviour. Several situations reflecting data availability are compared. When the data is scarce, the data from previous participants is used to predict how a new participant will behave. Judgment revision includes unpredictable variations which limit the potential for prediction. A first measure of unpredictability is proposed. The measure is based on a specific control experiment. More than two thirds of the prediction errors are found to occur due to unpredictability of the human judgment revision process rather than to model imperfection. PMID:27336834
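
    The consensus-style revision at the heart of such models can be stated in one line: the revised judgment is a weighted average of the individual's own judgment and the social information, with the weight given by that individual's influenceability. The notation and numbers below are ours, a sketch rather than the study's calibrated model.

```python
def revise(own_judgment, social_judgment, alpha):
    """Consensus-style update: alpha = 0 ignores others, alpha = 1 adopts
    the external judgment outright (illustrative, not the fitted model)."""
    return (1.0 - alpha) * own_judgment + alpha * social_judgment

print(revise(10.0, 16.0, 0.25))   # mildly influenceable  -> 11.5
print(revise(10.0, 16.0, 0.80))   # highly influenceable  -> 14.8
```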

  3. Online decision support based on modeling with the aim of increased irrigation efficiency

    NASA Astrophysics Data System (ADS)

    Dövényi-Nagy, Tamás; Bakó, Károly; Molnár, Krisztina; Rácz, Csaba; Vasvári, Gyula; Nagy, János; Dobos, Attila

    2015-04-01

    to allow the integration of several publicly available models and algorithms adapted to the local climate (Rácz et al., 2013). The service, the server-side framework, the scripts and the front-end providing access to the measured and modeled data are based on our own developments or on freely available and/or open-source software and services such as Apache, PHP, MySQL and the Google Maps API. MetAgro is intended to serve three different areas of usage: research, education and practice. These users differ in educational background, knowledge of models and possibilities to access relevant input data. The system and interfaces must reflect these differences, which is accomplished by graceful degradation of the modeling: choosing the location of the farm and the crop already gives some general results, but with every additional parameter given the results become more reliable. The system 'MetAgro' provides a basis for improved decision-making with regard to irrigation on cropland. Based on experiences and feedback, the online application proved to be useful in the design and practice of reasonable irrigation. In addition to its use in irrigation practice, MetAgro is also a valuable tool for research and education.

  4. Frame architecture for video servers

    NASA Astrophysics Data System (ADS)

    Venkatramani, Chitra; Kienzle, Martin G.

    1999-11-01

    Video is inherently frame-oriented, and most applications, such as commercial video processing, require manipulating video in terms of frames. However, typical video servers treat videos as byte streams and perform random access based on approximate byte offsets to be supplied by the client. They do not provide a frame- or timecode-oriented API, which is essential for many applications. This paper describes a frame-oriented architecture for video servers. It also describes the implementation in the context of IBM's VideoCharger server. The latter part of the paper describes an application that uses the frame architecture and provides fast and slow-motion scanning capabilities to the server.
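
    The gap the paper addresses, between byte-oriented access and the frame- or timecode-oriented access that applications want, can be pictured as a thin mapping layer. The index structure and method names below are hypothetical and are not the VideoCharger API.

```python
class FrameIndex:
    """Maps frame numbers and timecodes to byte offsets so a client can
    request frames instead of guessing byte positions (illustrative only)."""
    def __init__(self, frame_offsets, fps=30):
        self.frame_offsets = frame_offsets     # byte offset of each frame, e.g. from an index file
        self.fps = fps

    def offset_for_frame(self, frame_no):
        return self.frame_offsets[frame_no]

    def offset_for_timecode(self, hh, mm, ss, ff):
        frame_no = ((hh * 60 + mm) * 60 + ss) * self.fps + ff
        return self.offset_for_frame(frame_no)

index = FrameIndex(frame_offsets=list(range(0, 30000, 100)), fps=30)
print(index.offset_for_frame(12))             # byte offset of frame 12
print(index.offset_for_timecode(0, 0, 1, 5))  # frame at timecode 00:00:01:05
```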

  5. TMOC: A Model for Lecturers' Training to Management of Online Courses in Higher-Education

    ERIC Educational Resources Information Center

    Ghilay, Yaron; Ghilay, Ruth

    2014-01-01

    The study examined a new model called TMOC: Training to Management of Online Courses. The model is designed to train lecturers in higher-education to successfully create, deliver and develop online courses. The research was based on a sample of lecturers, who studied in a course based on the new model at the Mofet Institute in Tel-Aviv (n = 20).…

  6. Towards the development of an on-line model error identification system for land surface models

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Due to the complexity of potential error sources in land surface models, the accurate specification of model error parameters has emerged as a major challenge in the development of effective land data assimilation systems for hydrologic and hydro-climatic applications. While several on-line procedur...

  7. PACS image security server

    NASA Astrophysics Data System (ADS)

    Cao, Fei; Huang, H. K.

    2004-04-01

    Medical image security in a PACS environment has become a pressing issue as communication of images increasingly extends over open networks, and hospitals are currently hard-pressed by the Health Insurance Portability and Accountability Act (HIPAA) to become HIPAA compliant in ensuring health data security. Other security-related guidelines and technical standards continue to be brought to public attention in healthcare. However, there is no infrastructure or systematic method to implement and deploy these standards in a PACS. In this paper, we first review the DICOM Part 15 standard for secure communication of medical images and the HIPAA impacts on PACS security, as well as our previous work on image security. Then we outline a security infrastructure in a HIPAA-mandated PACS environment using a dedicated PACS image security server. The server manages its own database of all image security information. It acts as an image authority for checking and certifying image origin and integrity upon request by a user, as a secure DICOM gateway to outside connections, and also as a PACS operation monitor for HIPAA-supporting information.

  8. BUILDING ROBUST APPEARANCE MODELS USING ON-LINE FEATURE SELECTION

    SciTech Connect

    PORTER, REID B.; LOVELAND, ROHAN; ROSTEN, ED

    2007-01-29

    In many tracking applications, adapting the target appearance model over time can improve performance. This approach is most popular in high frame rate video applications where latent variables related to the object's appearance (e.g., orientation and pose) vary slowly from one frame to the next. In these cases the appearance model and the tracking system are tightly integrated, and latent variables are often included as part of the tracking system's dynamic model. In this paper we describe our efforts to track cars in low frame rate data (1 frame/second) acquired from a highly unstable airborne platform. Due to the low frame rate and poor image quality, the appearance of a particular vehicle varies greatly from one frame to the next. This leads us to a different problem: how can we build the best appearance model from all instances of a vehicle we have seen so far? The best appearance model should maximize the future performance of the tracking system and maximize the chances of reacquiring the vehicle once it leaves the field of view. We propose an online feature selection approach to this problem and investigate the performance and computational trade-offs with a real-world dataset.

  9. Modelling human mobility patterns using photographic data shared online.

    PubMed

    Barchiesi, Daniele; Preis, Tobias; Bishop, Steven; Moat, Helen Susannah

    2015-08-01

    Humans are inherently mobile creatures. The way we move around our environment has consequences for a wide range of problems, including the design of efficient transportation systems and the planning of urban areas. Here, we gather data about the position in space and time of about 16 000 individuals who uploaded geo-tagged images from locations within the UK to the Flickr photo-sharing website. Inspired by the theory of Lévy flights, which has previously been used to describe the statistical properties of human mobility, we design a machine learning algorithm to infer the probability of finding people in geographical locations and the probability of movement between pairs of locations. Our findings are in general agreement with official figures in the UK and on travel flows between pairs of major cities, suggesting that online data sources may be used to quantify and model large-scale human mobility patterns. PMID:26361545

  10. Modelling human mobility patterns using photographic data shared online

    PubMed Central

    Barchiesi, Daniele; Preis, Tobias; Bishop, Steven; Moat, Helen Susannah

    2015-01-01

    Humans are inherently mobile creatures. The way we move around our environment has consequences for a wide range of problems, including the design of efficient transportation systems and the planning of urban areas. Here, we gather data about the position in space and time of about 16 000 individuals who uploaded geo-tagged images from locations within the UK to the Flickr photo-sharing website. Inspired by the theory of Lévy flights, which has previously been used to describe the statistical properties of human mobility, we design a machine learning algorithm to infer the probability of finding people in geographical locations and the probability of movement between pairs of locations. Our findings are in general agreement with official figures in the UK and on travel flows between pairs of major cities, suggesting that online data sources may be used to quantify and model large-scale human mobility patterns. PMID:26361545
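
    As a back-of-the-envelope illustration of the Lévy-flight intuition mentioned in the two records above, heavy-tailed step lengths can be drawn by inverse-transform sampling from a power-law tail. This is only a sketch; the paper's actual machine-learning algorithm is not reproduced, and the exponent below is an arbitrary choice.

```python
import random

def levy_step(alpha=1.5, x_min=1.0):
    """Draw a step length from a Pareto (power-law) tail P(X > x) ~ x**(-alpha),
    the heavy-tailed behaviour characteristic of Levy-flight-like mobility."""
    u = random.random()
    return x_min * (1.0 - u) ** (-1.0 / alpha)

steps = [levy_step() for _ in range(100000)]
print("median step:", round(sorted(steps)[len(steps) // 2], 2))
print("max step:   ", round(max(steps), 1))   # occasional very long jumps dominate
```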

  11. Online Nucleosynthesis

    NASA Astrophysics Data System (ADS)

    Meyer Jordan, Bradley, IV; The, Lih-Sin; Robbins, Stuart

    2004-05-01

    Nuclear-reaction network codes are important to astronomers seeking to explore nucleosynthetic implications of astrophysical models and to nuclear physicists seeking to understand the role of nuclear properties or reaction rates in element formation. However, many users do not have the time or inclination to download and compile the codes, to manage the requisite input files, or to explore the often complex output with their own graphics programs. To help make nucleosynthesis calculations more readily available, we have placed the Clemson Nucleosynthesis code on the world-wide web at http://www.ces.clemson.edu/physics/nucleo/nuclearNetwork. At this web site, any Internet user may set his or her own reaction network, nuclear properties and reaction rates, and thermodynamic trajectories. The user then submits the nucleosynthesis calculation, which runs on a dedicated server professionally maintained at Clemson University. Once the calculation is completed, the user may explore the results through dynamically produced and downloadable tables and graphs. Online help guides the user through the necessary steps. We hope this web site will prove a user-friendly and helpful tool for professional scientists as well as for students seeking to explore element formation.

  12. Improvement plans for the RHIC/AGS on-line model environments

    SciTech Connect

    Brown, K. A.; Ahrens, L.; Beebe-Wang, J.; Morris, J.; Nemesure, S.; Robert-Demolaize, G.; Satogata, T.; Schoefer, V.; Tepikian, S.

    2009-08-31

    The on-line models for the Relativistic Heavy Ion Collider (RHIC) and the RHIC pre-injectors (the AGS and the AGS Booster) can be thought of as containing our best collective knowledge of these accelerators. As we improve these on-line models we are building the framework for a sophisticated model-based controls system. Currently the RHIC on-line model is an integral part of the controls system, providing the interface for tune control, chromaticity control, and non-linear chromaticity control. What we discuss in this paper is our vision of the future of the on-line model environment for RHIC and the RHIC pre-injectors. Although these on-line models are primarily used as Courant-Snyder parameter calculators using live machine settings, we envision expanding these environments to encompass many other problem domains.
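
    As a generic illustration of what a Courant-Snyder parameter calculator does (this is standard accelerator optics, not the RHIC/AGS code), the sketch below extracts the beta function, alpha and the fractional tune from a 2x2 one-turn transfer matrix; the example matrix is made up.

      # Generic sketch: Courant-Snyder (Twiss) parameters and tune from a
      # 2x2 one-turn transfer matrix M with det(M) = 1.
      import numpy as np

      def twiss_from_one_turn(M):
          cos_mu = 0.5 * (M[0, 0] + M[1, 1])
          if abs(cos_mu) >= 1.0:
              raise ValueError("motion is unstable: |Tr(M)/2| >= 1")
          # choose the sign of sin(mu) so that beta = M12/sin(mu) is positive
          sin_mu = np.sign(M[0, 1]) * np.sqrt(1.0 - cos_mu**2)
          beta = M[0, 1] / sin_mu
          alpha = (M[0, 0] - M[1, 1]) / (2.0 * sin_mu)
          tune = np.arccos(cos_mu) / (2.0 * np.pi)
          if sin_mu < 0:
              tune = 1.0 - tune
          return beta, alpha, tune

      # Example: a rotation-like one-turn matrix with beta = 5 m, tune = 0.31
      mu = 2 * np.pi * 0.31
      M = np.array([[np.cos(mu), 5.0 * np.sin(mu)],
                    [-np.sin(mu) / 5.0, np.cos(mu)]])
      print(twiss_from_one_turn(M))   # -> beta ~ 5.0, alpha ~ 0.0, tune ~ 0.31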

  13. New Model, New Strategies: Instructional Design for Building Online Wisdom Communities

    ERIC Educational Resources Information Center

    Gunawardena, Charlotte N.; Ortegano-Layne, Ludmila; Carabajal, Kayleigh; Frechette, Casey; Lindemann, Ken; Jennings, Barbara

    2006-01-01

    We discuss the development of an instructional design model, WisCom (Wisdom Communities), based on socio-constructivist and sociocultural learning philosophies and distance education principles for the development of online wisdom communities, and the application and evaluation of the model in an online graduate course in the USA. The WisCom model…

  14. Logic Models as a Way to Support Online Students and Their Projects

    ERIC Educational Resources Information Center

    Strycker, Jesse

    2016-01-01

    As online enrollment continues to grow, students may need additional pedagogical supports to increase their likelihood of success in online environments that don't offer the same supports as those found in face to face classrooms. Logic models are a way to provide such support to students by helping to model project expectations, allowing students…

  15. Introducing the R2D2 Model: Online Learning for the Diverse Learners of This World

    ERIC Educational Resources Information Center

    Bonk, Curtis J.; Zhang, Ke

    2006-01-01

    The R2D2 method--read, reflect, display, and do--is a new model for designing and delivering distance education, and in particular, online learning. Such a model is especially important to address the diverse preferences of online learners of varied generations and varied Internet familiarity. Four quadrants can be utilized separately or as part…

  16. Best Practices for Designing Online Learning Environments for 3D Modeling Curricula: A Delphi Study

    ERIC Educational Resources Information Center

    Mapson, Kathleen Harrell

    2011-01-01

    The purpose of this study was to develop an inventory of best practices for designing online learning environments for 3D modeling curricula. Due to the instructional complexity of three-dimensional modeling, few have sought to develop this type of course for online teaching and learning. Considering this, the study aimed to collectively aggregate…

  17. Quasi-Facial Communication for Online Learning Using 3D Modeling Techniques

    ERIC Educational Resources Information Center

    Wang, Yushun; Zhuang, Yueting

    2008-01-01

    Online interaction with 3D facial animation is an alternative way of face-to-face communication for distance education. 3D facial modeling is essential for virtual educational environments establishment. This article presents a novel 3D facial modeling solution that facilitates quasi-facial communication for online learning. Our algorithm builds…

  18. Disconfirming User Expectations of the Online Service Experience: Inferred versus Direct Disconfirmation Modeling.

    ERIC Educational Resources Information Center

    O'Neill, Martin; Palmer, Adrian; Wright, Christine

    2003-01-01

    Disconfirmation models of online service measurement seek to define service quality as the difference between user expectations of the service to be received and perceptions of the service actually received. Two such models-inferred and direct disconfirmation-for measuring quality of the online experience are compared (WebQUAL, SERVQUAL). Findings…

  19. Purge Lock Server

    SciTech Connect

    Fox, Kevin

    2012-08-21

    The software provides a simple web api to allow users to request a time window where a file will not be removed from cache. HPSS provides the concept of a "purge lock". When a purge lock is set on a file, the file will not be removed from disk, entering tape only state. A lot of network file protocols assume a file is on disk so it is good to purge lock a file before transferring using one of those protocols. HPSS's purge lock system is very coarse grained though. A file is either purge locked or not. Nothing enforces quotas, timely unlocking of purge locks, or managing the races inherent with multiple users wanting to lock/unlock the same file. The Purge Lock Server lets you, through a simple REST API, specify a list of files to purge lock and an expire time, and the system will ensure things happen properly.
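
    The record only says that the REST API accepts a list of files to purge lock and an expiry time. A client call might therefore look roughly like the sketch below; the endpoint URL, path and JSON field names are invented for illustration and are not documented parts of the Purge Lock Server.

      # Hypothetical client call; the URL, path and JSON field names are
      # placeholders -- the record only says the API takes a file list and an
      # expiry time.
      import json
      import urllib.request

      payload = {
          "files": ["/hpss/project/run42/data_0001.h5",
                    "/hpss/project/run42/data_0002.h5"],
          "expires_in_seconds": 4 * 3600,      # keep on disk for four hours
      }
      req = urllib.request.Request(
          "https://purgelock.example.org/api/locks",   # placeholder endpoint
          data=json.dumps(payload).encode("utf-8"),
          headers={"Content-Type": "application/json"},
          method="POST",
      )
      with urllib.request.urlopen(req) as resp:
          print(resp.status, resp.read().decode())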

  20. Purge Lock Server

    Energy Science and Technology Software Center (ESTSC)

    2012-08-21

    The software provides a simple web api to allow users to request a time window where a file will not be removed from cache. HPSS provides the concept of a "purge lock". When a purge lock is set on a file, the file will not be removed from disk, entering tape only state. A lot of network file protocols assume a file is on disk so it is good to purge lock a file before transferring using one of those protocols. HPSS's purge lock system is very coarse grained though. A file is either purge locked or not. Nothing enforces quotas, timely unlocking of purge locks, or managing the races inherent with multiple users wanting to lock/unlock the same file. The Purge Lock Server lets you, through a simple REST API, specify a list of files to purge lock and an expire time, and the system will ensure things happen properly.

  1. Pathological Buying Online as a Specific Form of Internet Addiction: A Model-Based Experimental Investigation.

    PubMed

    Trotzke, Patrick; Starcke, Katrin; Müller, Astrid; Brand, Matthias

    2015-01-01

    The study aimed to investigate different factors of vulnerability for pathological buying in the online context and to determine whether online pathological buying has parallels to a specific Internet addiction. According to a model of specific Internet addiction by Brand and colleagues, potential vulnerability factors may consist of a predisposing excitability from shopping and as mediating variable, specific Internet use expectancies. Additionally, in line with models on addiction behavior, cue-induced craving should also constitute an important factor for online pathological buying. The theoretical model was tested in this study by investigating 240 female participants with a cue-reactivity paradigm, which was composed of online shopping pictures, to assess excitability from shopping. Craving (before and after the cue-reactivity paradigm) and online shopping expectancies were measured. The tendency for pathological buying and online pathological buying were screened with the Compulsive Buying Scale (CBS) and the Short Internet Addiction Test modified for shopping (s-IATshopping). The results demonstrated that the relationship between individual's excitability from shopping and online pathological buying tendency was partially mediated by specific Internet use expectancies for online shopping (model's R² = .742, p < .001). Furthermore, craving and online pathological buying tendencies were correlated (r = .556, p < .001), and an increase in craving after the cue presentation was observed solely in individuals scoring high for online pathological buying (t(28) = 2.98, p < .01, d = 0.44). Both screening instruments were correlated (r = .517, p < .001), and diagnostic concordances as well as divergences were indicated by applying the proposed cut-off criteria. In line with the model for specific Internet addiction, the study identified potential vulnerability factors for online pathological buying and suggests potential parallels. The presence of craving in

  2. Implementing Adaptive Performance Management in Server Applications

    SciTech Connect

    Liu, Yan; Gorton, Ian

    2007-06-11

    Performance and scalability are critical quality attributes for server applications in Internet-facing business systems. These applications operate in dynamic environments with rapidly fluctuating user loads and resource levels, and unpredictable system faults. Adaptive (autonomic) systems research aims to augment such server applications with intelligent control logic that can detect and react to sudden environmental changes. However, developing this adaptive logic is complex in itself. In addition, executing the adaptive logic consumes processing resources, and hence may (paradoxically) adversely affect application performance. In this paper we describe an approach for developing high-performance adaptive server applications and the supporting technology. The Adaptive Server Framework (ASF) is built on standard middleware services, and can be used to augment legacy systems with adaptive behavior without needing to change the application business logic. Crucially, ASF provides built-in control loop components to optimize the overall application performance, which comprises both the business and adaptive logic. The control loop is based on performance models and allows systems designers to tune the performance levels simply by modifying high level declarative policies. We demonstrate the use of ASF in a case study.
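
    The framework described above drives adaptation from a control loop that compares measured performance against high-level declarative policies. A deliberately simplified, generic sketch of that idea (not the ASF API; the policy values and monitor/actuator hooks are placeholders) is:

      # Simplified illustration of a policy-driven adaptation loop in the
      # spirit of the framework above; not ASF's actual interfaces.
      import random
      import time

      POLICY = {"target_latency_ms": 200, "min_workers": 2, "max_workers": 32}

      def measure_latency_ms():
          # stand-in for a real monitoring probe
          return random.gauss(220, 40)

      def control_loop(workers=4, cycles=10):
          for _ in range(cycles):
              latency = measure_latency_ms()
              if latency > POLICY["target_latency_ms"] and workers < POLICY["max_workers"]:
                  workers += 1      # scale out when the policy is violated
              elif latency < 0.8 * POLICY["target_latency_ms"] and workers > POLICY["min_workers"]:
                  workers -= 1      # scale in when there is comfortable headroom
              print(f"latency={latency:6.1f} ms  workers={workers}")
              time.sleep(0.1)

      control_loop()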

  3. Effective Live Online Faculty Development Workshops: One Model

    ERIC Educational Resources Information Center

    Blyth, Russell D.; May, Michael K.; Rainbolt, Julianne G.

    2006-01-01

    This article describes live, online faculty development workshops that show faculty how to use software packages (to date, GAP and Maple) in teaching college-level mathematics. The authors' primary goal in this article is to encourage others in any discipline to run similar online workshops by providing a resource for their successful operation,…

  4. Optimizing Success: A Model for Persistence in Online Education

    ERIC Educational Resources Information Center

    Glazer, Hilda R.; Murphy, John A.

    2015-01-01

    The first-year experience for students enrolled in an online degree program, particularly the orientation and the first course experience, is critical to success and completion. The experience of one online university in improving persistence through enhancing orientation and the first academic course is presented. Factors impacting persistence…

  5. A Maturity Model for Online Classes across Academic Disciplines

    ERIC Educational Resources Information Center

    Neequaye, Barbara Burris

    2013-01-01

    The number of academic institutions offering courses online has increased with courses being offered across almost all academic disciplines. Faculty members are often confronted with the responsibility of converting a face-to-face course to an online course while simultaneously dealing with new technologies and the interrelationship between the…

  6. Modeling Best Practice through Online Learning: Building Relationships

    ERIC Educational Resources Information Center

    Cerniglia, Ellen G.

    2011-01-01

    Students may fear that they will feel unsupported and isolated when engaged in online learning. They don't know how they will be able to build relationships with their teacher and classmates solely based on written words, without facial expressions, tone of voice, and other nonverbal communication cues. Traditionally, online learning required…

  7. Examining Workload Models in Online and Blended Teaching

    ERIC Educational Resources Information Center

    Tynan, Belinda; Ryan, Yoni; Lamont-Mills, Andrea

    2015-01-01

    Over the past decade, most Australian universities have moved increasingly towards "blended" and online course delivery for both undergraduate and graduate programs. In almost all cases, elements of online teaching are part of routine teaching loads. Yet detailed and accurate workload data associated with "e-teaching" are not…

  8. A Structural Equation Model of Predictors of Online Learning Retention

    ERIC Educational Resources Information Center

    Lee, Youngju; Choi, Jaeho

    2013-01-01

    This study examined the effects of internal academic locus of control (ALOC), learning strategies, flow experience, and student satisfaction on student retention in online learning courses. A total number of 282 adult students at the Korea National Open University participated in the study by completing an online survey adopted from previous…

  9. Modeling a multivariable reactor and on-line model predictive control.

    PubMed

    Yu, D W; Yu, D L

    2005-10-01

    A nonlinear first-principles model is developed for a laboratory-scale multivariable chemical reactor rig in this paper, and on-line model predictive control (MPC) is implemented on the rig. The reactor has three variables (temperature, pH, and dissolved oxygen) with nonlinear dynamics and is therefore used as a pilot system for the biochemical industry. A nonlinear discrete-time model is derived for each of the three output variables and their model parameters are estimated from the real data using an adaptive optimization method. The developed model is used in a nonlinear MPC scheme. An accurate multistep-ahead prediction is obtained for MPC, where the extended Kalman filter is used to estimate unknown system states. The on-line control is implemented and a satisfactory tracking performance is achieved. The MPC is compared with three decentralized PID controllers and the advantage of the nonlinear MPC over the PID is clearly shown. PMID:16294779
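
    The essence of MPC is the receding horizon: at each step, predict the effect of candidate input sequences over a short horizon, apply only the first move of the best sequence, then repeat. A minimal sketch of that idea on a toy scalar linear model (not the reactor model, Kalman filter or optimiser used in the paper; all constants are invented) is:

      # Minimal receding-horizon (MPC-style) sketch on a toy scalar model
      # x[k+1] = a*x[k] + b*u[k]; illustration of the idea only.
      import itertools

      A, B = 0.9, 0.5
      HORIZON = 3
      U_CANDIDATES = [-1.0, 0.0, 1.0]

      def predict_cost(x, u_seq, setpoint):
          cost = 0.0
          for u in u_seq:
              x = A * x + B * u
              cost += (x - setpoint) ** 2 + 0.01 * u ** 2
          return cost

      def mpc_step(x, setpoint):
          best = min(itertools.product(U_CANDIDATES, repeat=HORIZON),
                     key=lambda seq: predict_cost(x, seq, setpoint))
          return best[0]                       # apply only the first move

      x, setpoint = 0.0, 2.0
      for k in range(15):
          u = mpc_step(x, setpoint)
          x = A * x + B * u                    # "plant" update
          print(f"k={k:2d}  u={u:+.1f}  x={x:5.2f}")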

  10. A novel information cascade model in online social networks

    NASA Astrophysics Data System (ADS)

    Tong, Chao; He, Wenbo; Niu, Jianwei; Xie, Zhongyu

    2016-02-01

    The spread and diffusion of information has become one of the hot issues in today's social network analysis. To analyze the spread of online social network information and the attribute of cascade, in this paper, we discuss the spread of two kinds of users' decisions for city-wide activities, namely the "want to take part in the activity" and "be interested in the activity", based on the users' attention in "DouBan" and the data of the city-wide activities. We analyze the characteristics of the activity-decision's spread in these aspects: the scale and scope of the cascade subgraph, the structure characteristic of the cascade subgraph, the topological attribute of spread tree, and the occurrence frequency of cascade subgraph. On this basis, we propose a new information spread model. Based on the classical independent diffusion model, we introduce three mechanisms, equal probability, similarity of nodes, and popularity of nodes, which can generate and affect the spread of information. Besides, by conducting the experiments in six different kinds of network data set, we compare the effects of three mechanisms above mentioned, totally six specific factors, on the spread of information, and put forward that the node's popularity plays an important role in the information spread.
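
    The paper builds on the classical independent cascade model and compares equal-probability, node-similarity and node-popularity activation mechanisms. The sketch below is a plain independent cascade with a popularity-weighted activation probability as one hypothetical variant of those mechanisms; the toy graph and probabilities are made up.

      # Independent-cascade sketch with a popularity-weighted activation
      # probability, as a toy stand-in for the mechanisms discussed above.
      import random

      GRAPH = {                       # adjacency list of a small directed network
          "a": ["b", "c"],
          "b": ["c", "d"],
          "c": ["d", "e"],
          "d": ["e"],
          "e": [],
      }
      POPULARITY = {n: 1 + len(GRAPH[n]) for n in GRAPH}   # toy popularity score

      def cascade(seeds, base_p=0.2):
          active, frontier = set(seeds), list(seeds)
          while frontier:
              nxt = []
              for u in frontier:
                  for v in GRAPH[u]:
                      if v in active:
                          continue
                      p = min(1.0, base_p * POPULARITY[v])  # popularity boosts adoption
                      if random.random() < p:
                          active.add(v)
                          nxt.append(v)
              frontier = nxt
          return active

      random.seed(1)
      print("activated:", cascade({"a"}))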

  11. Pathological Buying Online as a Specific Form of Internet Addiction: A Model-Based Experimental Investigation

    PubMed Central

    Trotzke, Patrick; Starcke, Katrin; Müller, Astrid; Brand, Matthias

    2015-01-01

    The study aimed to investigate different factors of vulnerability for pathological buying in the online context and to determine whether online pathological buying has parallels to a specific Internet addiction. According to a model of specific Internet addiction by Brand and colleagues, potential vulnerability factors may consist of a predisposing excitability from shopping and as mediating variable, specific Internet use expectancies. Additionally, in line with models on addiction behavior, cue-induced craving should also constitute an important factor for online pathological buying. The theoretical model was tested in this study by investigating 240 female participants with a cue-reactivity paradigm, which was composed of online shopping pictures, to assess excitability from shopping. Craving (before and after the cue-reactivity paradigm) and online shopping expectancies were measured. The tendency for pathological buying and online pathological buying were screened with the Compulsive Buying Scale (CBS) and the Short Internet Addiction Test modified for shopping (s-IATshopping). The results demonstrated that the relationship between individual’s excitability from shopping and online pathological buying tendency was partially mediated by specific Internet use expectancies for online shopping (model’s R² = .742, p < .001). Furthermore, craving and online pathological buying tendencies were correlated (r = .556, p < .001), and an increase in craving after the cue presentation was observed solely in individuals scoring high for online pathological buying (t(28) = 2.98, p < .01, d = 0.44). Both screening instruments were correlated (r = .517, p < .001), and diagnostic concordances as well as divergences were indicated by applying the proposed cut-off criteria. In line with the model for specific Internet addiction, the study identified potential vulnerability factors for online pathological buying and suggests potential parallels. The presence of craving in

  12. NEOS server 4.0 administrative guide.

    SciTech Connect

    Dolan, E. D.

    2001-07-13

    The NEOS Server 4.0 provides a general Internet-based client/server as a link between users and software applications. The administrative guide covers the fundamental principles behind the operation of the NEOS Server, installation and trouble-shooting of the Server software, and implementation details of potential interest to a NEOS Server administrator. The guide also discusses making new software applications available through the Server, including areas of concern to remote solver administrators such as maintaining security, providing usage instructions, and enforcing reasonable restrictions on jobs. The administrative guide is intended both as an introduction to the NEOS Server and as a reference for use when running the Server.

  13. SPAM Detection Server Model Inspired by the Dionaea Muscipula Closure Mechanism: An Alternative Approach for Natural Computing Challenges

    NASA Astrophysics Data System (ADS)

    de Souza Pereira Lopes, Rodrigo Arthur; Carrari R. Lopes, Lia; Mustaro, Pollyana Notargiacomo

    Natural computing has been an increasingly evolving field in the last few years. Focusing on the interesting behaviours offered by nature and biological processes, this work intends to apply the metaphor of the carnivorous plant "Dionaea muscipula" as a complementary defence system against a recurring problem regarding internet and e-mails: spam. The metaphor model presents relevant aspects for further implementation and debate.

  14. Shape prior modeling using sparse representation and online dictionary learning.

    PubMed

    Zhang, Shaoting; Zhan, Yiqiang; Zhou, Yan; Uzunbas, Mustafa; Metaxas, Dimitris N

    2012-01-01

    The recently proposed sparse shape composition (SSC) opens a new avenue for shape prior modeling. Instead of assuming any parametric model of shape statistics, SSC incorporates shape priors on-the-fly by approximating a shape instance (usually derived from appearance cues) by a sparse combination of shapes in a training repository. Theoretically, one can increase the modeling capability of SSC by including as many training shapes as possible in the repository. However, this strategy confronts two limitations in practice. First, since SSC involves an iterative sparse optimization at run-time, the more shape instances contained in the repository, the less run-time efficiency SSC has. Therefore, a compact and informative shape dictionary is preferred to a large shape repository. Second, in medical imaging applications, training shapes seldom come in one batch. It is very time consuming and sometimes infeasible to reconstruct the shape dictionary every time new training shapes appear. In this paper, we propose an online learning method to address these two limitations. Our method starts from constructing an initial shape dictionary using the K-SVD algorithm. When new training shapes come, instead of re-constructing the dictionary from the ground up, we update the existing one using a block-coordinate descent approach. Using the dynamically updated dictionary, sparse shape composition can be gracefully scaled up to model shape priors from a large number of training shapes without sacrificing run-time efficiency. Our method is validated on lung localization in X-Ray and cardiac segmentation in MRI time series. Compared to the original SSC, it shows comparable performance while being significantly more efficient. PMID:23286160
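
    A rough analogue of the incremental-update idea, using a standard library rather than the authors' K-SVD / block-coordinate-descent code, is scikit-learn's MiniBatchDictionaryLearning: fit an initial dictionary, then update it with partial_fit as new training shapes arrive, and sparse-code new instances against it. The data here is random and purely illustrative.

      # Rough analogue with scikit-learn (not the authors' implementation):
      # learn a dictionary from an initial batch of shape vectors, update it
      # incrementally, and sparse-code new instances.
      import numpy as np
      from sklearn.decomposition import MiniBatchDictionaryLearning

      rng = np.random.RandomState(0)
      initial_shapes = rng.randn(200, 60)     # 200 training shapes, 60 landmark coords
      new_shapes = rng.randn(20, 60)          # shapes arriving later

      dico = MiniBatchDictionaryLearning(n_components=25, batch_size=10,
                                         transform_algorithm="omp",
                                         transform_n_nonzero_coefs=5,
                                         random_state=0)
      dico.fit(initial_shapes)                # initial dictionary
      dico.partial_fit(new_shapes)            # incremental update, no full rebuild

      codes = dico.transform(new_shapes)      # sparse codes of the new instances
      reconstruction = codes @ dico.components_
      print("mean sparse reconstruction error:",
            float(np.mean((reconstruction - new_shapes) ** 2)))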

  15. WMS Server 2.0

    NASA Technical Reports Server (NTRS)

    Plesea, Lucian; Wood, James F.

    2012-01-01

    This software is a simple, yet flexible server of raster map products, compliant with the Open Geospatial Consortium (OGC) Web Map Service (WMS) 1.1.1 protocol. The server is a full implementation of the OGC WMS 1.1.1 as a fastCGI client and using Geospatial Data Abstraction Library (GDAL) for data access. The server can operate in a proxy mode, where all or part of the WMS requests are done on a back server. The server has explicit support for a colocated tiled WMS, including rapid response of black (no-data) requests. It generates JPEG and PNG images, including 16-bit PNG. The GDAL back-end support allows great flexibility on the data access. The server is a port to a Linux/GDAL platform from the original IRIX/IL platform. It is simpler to configure and use, and depending on the storage format used, it has better performance than other available implementations. The WMS server 2.0 is a high-performance WMS implementation due to the fastCGI architecture. The use of GDAL data back end allows for great flexibility. The configuration is relatively simple, based on a single XML file. It provides scaling and cropping, as well as blending of multiple layers based on layer transparency.
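
    A WMS 1.1.1 GetMap request, as served by software like this, is an ordinary HTTP GET with a fixed set of parameters defined by the OGC specification. The sketch below builds such a URL; the host name and layer name are placeholders, while the parameter names follow the WMS 1.1.1 protocol.

      # Building a standard WMS 1.1.1 GetMap request; endpoint and layer name
      # are placeholders, parameter names are those of the OGC spec.
      from urllib.parse import urlencode

      params = {
          "SERVICE": "WMS",
          "VERSION": "1.1.1",
          "REQUEST": "GetMap",
          "LAYERS": "global_mosaic",          # placeholder layer
          "STYLES": "",
          "SRS": "EPSG:4326",
          "BBOX": "-180,-90,180,90",
          "WIDTH": 1024,
          "HEIGHT": 512,
          "FORMAT": "image/jpeg",
      }
      url = "https://wms.example.org/wms?" + urlencode(params)
      print(url)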

  16. PEP-FOLD: an updated de novo structure prediction server for both linear and disulfide bonded cyclic peptides

    PubMed Central

    Thévenet, Pierre; Shen, Yimin; Maupetit, Julien; Guyon, Frédéric; Derreumaux, Philippe; Tufféry, Pierre

    2012-01-01

    In the context of the renewed interest of peptides as therapeutics, it is important to have an on-line resource for 3D structure prediction of peptides with well-defined structures in aqueous solution. We present an updated version of PEP-FOLD allowing the treatment of both linear and disulphide bonded cyclic peptides with 9–36 amino acids. The server makes possible to define disulphide bonds and any residue–residue proximity under the guidance of the biologists. Using a benchmark of 34 cyclic peptides with one, two and three disulphide bonds, the best PEP-FOLD models deviate by an average RMS of 2.75 Å from the full NMR structures. Using a benchmark of 37 linear peptides, PEP-FOLD locates lowest-energy conformations deviating by 3 Å RMS from the NMR rigid cores. The evolution of PEP-FOLD comes as a new on-line service to supersede the previous server. The server is available at: http://bioserv.rpbs.univ-paris-diderot.fr/PEP-FOLD. PMID:22581768

  17. Remote Sensing Data Analytics for Planetary Science with PlanetServer/EarthServer

    NASA Astrophysics Data System (ADS)

    Rossi, Angelo Pio; Figuera, Ramiro Marco; Flahaut, Jessica; Martinot, Melissa; Misev, Dimitar; Baumann, Peter; Pham Huu, Bang; Besse, Sebastien

    2016-04-01

    Planetary Science datasets, beyond the change in the last two decades from physical volumes to internet-accessible archives, still face the problem of large-scale processing and analytics (e.g. Rossi et al., 2014, Gaddis and Hare, 2015). PlanetServer, the Planetary Science Data Service of the EC-funded EarthServer-2 project (#654367) tackles the planetary Big Data analytics problem with an array database approach (Baumann et al., 2014). It is developed to serve a large amount of calibrated, map-projected planetary data online, mainly through Open Geospatial Consortium (OGC) Web Coverage Processing Service (WCPS) (e.g. Rossi et al., 2014; Oosthoek et al., 2013; Cantini et al., 2014). The focus of the H2020 evolution of PlanetServer is still on complex multidimensional data, particularly hyperspectral imaging and topographic cubes and imagery. In addition to hyperspectral and topographic data from Mars (Rossi et al., 2014), the use of WCPS is applied to diverse datasets on the Moon, as well as Mercury. Other Solar System Bodies are going to be progressively available. Derived parameters such as summary products and indices can be produced through WCPS queries, as well as derived imagery colour combination products, dynamically generated and accessed also through OGC Web Coverage Service (WCS). Scientific questions translated into queries can be posed to a large number of individual coverages (data products), locally, regionally or globally. The new PlanetServer system uses the open-source NASA WorldWind (e.g. Hogan, 2011) virtual globe as its visualisation engine, and the array database Rasdaman Community Edition as core server component. Analytical tools and client components of relevance for multiple communities and disciplines are shared across services such as the Earth Observation and Marine Data Services of EarthServer. The Planetary Science Data Service of EarthServer is accessible on http://planetserver.eu. All its code base is going to be available on GitHub, on
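
    A WCPS query of the broad "for ... return encode(...)" form lets the server compute a derived index (for example a band ratio) before anything is downloaded. The sketch below is purely illustrative: the endpoint, coverage name and band names are placeholders, and only the overall query shape follows the OGC WCPS language used by the service above.

      # Illustrative WCPS request only; endpoint, coverage and band names are
      # placeholders for the kind of server-side band arithmetic described.
      from urllib.parse import urlencode

      wcps_query = """
      for c in (mars_hyperspectral_cube)
      return encode(
        (c.band_233 - c.band_13) / (c.band_233 + c.band_13),
        "png")
      """
      params = {
          "service": "WCS",
          "version": "2.0.1",
          "request": "ProcessCoverages",
          "query": " ".join(wcps_query.split()),
      }
      print("https://planetserver.example.org/rasdaman/ows?" + urlencode(params))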

  18. PROMALS web server for accurate multiple protein sequence alignments.

    PubMed

    Pei, Jimin; Kim, Bong-Hyun; Tang, Ming; Grishin, Nick V

    2007-07-01

    Multiple sequence alignments are essential in homology inference, structure modeling, functional prediction and phylogenetic analysis. We developed a web server that constructs multiple protein sequence alignments using PROMALS, a progressive method that improves alignment quality by using additional homologs from PSI-BLAST searches and secondary structure predictions from PSIPRED. PROMALS shows higher alignment accuracy than other advanced methods, such as MUMMALS, ProbCons, MAFFT and SPEM. The PROMALS web server takes FASTA format protein sequences as input. The output includes a colored alignment augmented with information about sequence grouping, predicted secondary structures and positional conservation. The PROMALS web server is available at: http://prodata.swmed.edu/promals/ PMID:17452345

  19. Compute Server Performance Results

    NASA Technical Reports Server (NTRS)

    Stockdale, I. E.; Barton, John; Woodrow, Thomas (Technical Monitor)

    1994-01-01

    Parallel-vector supercomputers have been the workhorses of high performance computing. As expectations of future computing needs have risen faster than projected vector supercomputer performance, much work has been done investigating the feasibility of using Massively Parallel Processor systems as supercomputers. An even more recent development is the availability of high performance workstations which have the potential, when clustered together, to replace parallel-vector systems. We present a systematic comparison of floating point performance and price-performance for various compute server systems. A suite of highly vectorized programs was run on systems including traditional vector systems such as the Cray C90, and RISC workstations such as the IBM RS/6000 590 and the SGI R8000. The C90 system delivers 460 million floating point operations per second (FLOPS), the highest single processor rate of any vendor. However, if the price-performance ratio (PPR) is considered to be most important, then the IBM and SGI processors are superior to the C90 processors. Even without code tuning, the IBM and SGI PPRs of 260 and 220 FLOPS per dollar exceed the C90 PPR of 160 FLOPS per dollar when running our highly vectorized suite.

  20. Online Synchronous vs. Asynchronous Software Training through the Behavioral Modeling Approach: A Longitudinal Field Experiment

    ERIC Educational Resources Information Center

    Chen, Charlie C.; Shaw, Ruey-shiang

    2006-01-01

    The continued and increasing use of online training raises the question of whether the most effective training methods applied in live instruction will carry over to different online environments in the long run. Behavior Modeling (BM) approach--teaching through demonstration--has been proven as the most effective approach in a face-to-face (F2F)…

  1. Open Online Language Courses: The Multi-Level Model of the Spanish N(ottingham)OOC

    ERIC Educational Resources Information Center

    Goria, Cecilia; Lagares, Manuel

    2015-01-01

    Research into open education has identified a "high number of participants" and "unpredictable mixed abilities" as factors responsible for the relatively weak presence of language Massive Open Online Courses (MOOCs). This contribution presents a model for open online language courses that aims to bridge this gap. The tangible…

  2. The Radical Model--A Painless Way To Teach On-Line.

    ERIC Educational Resources Information Center

    Romm, C.; Taylor, W.

    The information technology/information systems (IT/IS) education sector needs to come up with creative ways of thinking about on-line education. In this paper, the major themes in the literature on on-line education to date are highlighted with a view to identifying issues that are either missing or under-emphasized. Next, the "radical model of…

  3. Organizational Models for Online Education: District, State, or Charter School? Policy and Planning Series #109

    ERIC Educational Resources Information Center

    Cavalluzzo, Linda

    2004-01-01

    Opportunities for online K-12 education are expanding dramatically throughout the United States. This paper describes some of the organizational models that have been developed to provide online education to public school students, including their key strengths and challenges. The review is intended to help state and local school officials weigh…

  4. Talking about Reading as Thinking: Modeling the Hidden Complexities of Online Reading Comprehension

    ERIC Educational Resources Information Center

    Coiro, Julie

    2011-01-01

    This article highlights four cognitive processes key to online reading comprehension and how one might begin to transform existing think-aloud strategy models to encompass the challenges of reading for information on the Internet. Informed by principles of cognitive apprenticeship and an emerging taxonomy of online reading comprehension…

  5. Structural Equation Modeling towards Online Learning Readiness, Academic Motivations, and Perceived Learning

    ERIC Educational Resources Information Center

    Horzum, Mehmet Baris; Kaymak, Zeliha Demir; Gungoren, Ozlem Canan

    2015-01-01

    The relationship between online learning readiness, academic motivations, and perceived learning was investigated via structural equation modeling in the research. The population of the research consisted of 750 students who studied using the online learning programs of Sakarya University. 420 of the students who volunteered for the research and…

  6. New Learning Models: The Evolution of Online Learning into Innovative K-12 Blended Programs

    ERIC Educational Resources Information Center

    Patrick, Susan

    2011-01-01

    The author traces the growth of K-12 online learning in the United States from its modest genesis in the mid-1990s with 50,000 students to the more than 4 million enrollments today, the fastest scaling ever of any innovation in K-12 education. The evolution from one-size-fits-all online courses to innovative, blended instructional models that are…

  7. The SDSS data archive server

    SciTech Connect

    Neilsen, Eric H., Jr.; /Fermilab

    2007-10-01

    The Sloan Digital Sky Survey (SDSS) Data Archive Server (DAS) provides public access to data files produced by the SDSS data reduction pipeline. This article discusses challenges in public distribution of data of this volume and complexity, and how the project addressed them. The Sloan Digital Sky Survey (SDSS) is an astronomical survey covering roughly one quarter of the night sky. It contains images of this area, a catalog of almost 300 million objects detected in those images, and spectra of more than a million of these objects. The catalog of objects includes a variety of data on each object. These data include not only basic information but also fit parameters for a variety of models, classifications by sophisticated object classification algorithms, statistical parameters, and more. If the survey contains the spectrum of an object, the catalog includes a variety of other parameters derived from its spectrum. Data processing and catalog generation, described more completely in the SDSS Early Data Release paper, consist of several stages: collection of imaging data, processing of imaging data, selection of spectroscopic targets from catalogs generated from the imaging data, collection of spectroscopic data, processing of spectroscopic data, and loading of processed data into a database. Each of these stages is itself a complex process. For example, the software that processes the imaging data determines and removes some instrumental signatures in the raw images to create 'corrected frames', models the point spread function, models and removes the sky background, detects objects, measures object positions, measures the radial profile and other morphological parameters for each object, measures the brightness of each object using a variety of methods, classifies the objects, calibrates the brightness measurements against survey standards, and produces a variety of quality assurance plots and diagnostic tables. The complexity of the spectroscopic data

  8. Online Mars Digital Elevation Model Derived from Profiles

    NASA Astrophysics Data System (ADS)

    Delacourt, C.; Gros, N.; Allemand, P.; Baratoux, D.

    The topography of Mars is a key parameter for understanding the geological evolution of the planet. Since 1997, the Mars Orbital Laser Altimeter (MOLA), launched in the frame of Mars Global Surveyor, has acquired more than 600 million topographic measurements. However, despite the high vertical accuracy of those profiles, the main limitation of this technique appears when topographic maps are required. To create a Digital Elevation Model (DEM) or a topographic map, an interpolation on individual MOLA measurements on regular grids is required. Calculating the global full-resolution Martian DEM requires very intensive computation and large disk capacities. Only a few teams throughout the world have computed a full resolution DEM from MOLA data. Even if a scientist is interested in a small area of Mars, numerous profiles have to be processed and extracted from 44 CD-ROMs. To facilitate the exploitation of the high potential of MOLA data, we propose an Internet application that allows any user to extract the individual MOLA measurements from all profiles over a given area and to process local DEMs of the surface of Mars with adjustable parameters of computation. This facility, developed in Interactive Data Language by Research Systems, Inc., allows users to select the zone of interest and the resolution of the output DEM. After online processing, various products in standard formats can be downloaded on the user's computer, including DEMs, individual MOLA points, list and map of the MOLA ground tracks used for the DEM generation, and a quality map. This map is computed by evaluating the distance between each point of the DEM and the closest measurements of the altimeter. Furthermore, IDL tools are supplied to facilitate data integration and use.
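
    The two core steps described above, gridding scattered altimeter shots into a DEM and deriving a quality map from the distance to the nearest measurement, can be sketched generically with SciPy (this is not the IDL application itself, and the synthetic points stand in for real MOLA shots):

      # Generic sketch of the two steps described above: grid scattered
      # altimeter points into a DEM and build a quality map from the distance
      # to the nearest measurement.
      import numpy as np
      from scipy.interpolate import griddata
      from scipy.spatial import cKDTree

      rng = np.random.default_rng(0)
      pts = rng.uniform(0, 10, size=(5000, 2))            # fake (lon, lat) shots
      elev = np.sin(pts[:, 0]) + 0.1 * pts[:, 1]          # fake elevations (km)

      lon, lat = np.meshgrid(np.linspace(0, 10, 200), np.linspace(0, 10, 200))
      grid_xy = np.column_stack([lon.ravel(), lat.ravel()])

      dem = griddata(pts, elev, grid_xy, method="linear").reshape(lon.shape)
      quality = cKDTree(pts).query(grid_xy)[0].reshape(lon.shape)  # gap size

      print("DEM shape:", dem.shape, " worst gap:", float(quality.max()))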

  9. KFC Server: interactive forecasting of protein interaction hot spots.

    PubMed

    Darnell, Steven J; LeGault, Laura; Mitchell, Julie C

    2008-07-01

    The KFC Server is a web-based implementation of the KFC (Knowledge-based FADE and Contacts) model, a machine learning approach for the prediction of binding hot spots, or the subset of residues that account for most of a protein interface's binding free energy. The server facilitates the automated analysis of a user submitted protein-protein or protein-DNA interface and the visualization of its hot spot predictions. For each residue in the interface, the KFC Server characterizes its local structural environment, compares that environment to the environments of experimentally determined hot spots and predicts if the interface residue is a hot spot. After the computational analysis, the user can visualize the results using an interactive job viewer able to quickly highlight predicted hot spots and surrounding structural features within the protein structure. The KFC Server is accessible at http://kfc.mitchell-lab.org. PMID:18539611

  10. Programmatic, Systematic, Automatic: An Online Course Accessibility Support Model

    ERIC Educational Resources Information Center

    Bastedo, Kathleen; Sugar, Amy; Swenson, Nancy; Vargas, Jessica

    2013-01-01

    Over the past few years, there has been a noticeable increase in the number of requests for online course material accommodations at the University of Central Florida (UCF). In response to these requests, UCF's Center for Distributed Learning (CDL) formed new teams, reevaluated its processes, and initiated a partnership with UCF's…

  11. A Model for Social Presence in Online Classrooms

    ERIC Educational Resources Information Center

    Wei, Chun-Wang; Chen, Nian-Shing; Kinshuk,

    2012-01-01

    It is now possible to create flexible learning environments without time and distance barriers on the internet. However, research has shown that learners typically experience isolation and alienation in online learning environments. These negative experiences can be reduced by enhancing social presence. In order to better facilitate the perceived…

  12. Interdisciplinary Gerontology Education Online: A Developmental Process Model

    ERIC Educational Resources Information Center

    St. Hill, Halcyon; Edwards, Nancy

    2004-01-01

    Distance education online in gerontology in academic settings is designed to reflect content relevant to gerontology practices, academic standards, teaching strategies, and technology that embrace content delivery while enhancing learning. A balance with community services and needs for older adult populations, academic integrity, stakeholders,…

  13. Designing Online Workshops: Using an Experiential Learning Model

    ERIC Educational Resources Information Center

    Lynch, Sherry K.; Kogan, Lori R.

    2004-01-01

    This article describes 4 online workshops designed to assist college students with improving their time management, textbook reading, memory and concentration, and overall academic performance. These workshops were created to work equally well with imaginative, analytic, common-sense, and dynamic learners. Positive student feedback indicated that…

  14. Free Textbooks: An Online Company Tries a Controversial Publishing Model

    ERIC Educational Resources Information Center

    Rampell, Catherine

    2008-01-01

    The high prices of textbooks, which are approaching $1,000 per year for an average student, have those students and their professors crying for mercy. Flat World Knowledge, a new digital-textbook publisher, has the answer to this problem. Starting next year, the publisher will offer online, peer-reviewed, interactive, user-editable textbooks, free…

  15. Enhanced Online Access Requires Redesigned Delivery Options and Cost Models

    ERIC Educational Resources Information Center

    Stern, David

    2007-01-01

    Rapidly developing online information technologies provide dramatically new capabilities and opportunities, and place new responsibilities on all involved to recreate networks for scholarly communication. Collaborations between all segments of the information network are made possible and necessary as we attempt to find a balanced and mutually…

  16. A new hybrid model for exploring the adoption of online nursing courses.

    PubMed

    Tung, Feng-Cheng; Chang, Su-Chao

    2008-04-01

    With the advancement in educational technology and internet access in recent years, nursing academia is searching for ways to widen nurses' educational opportunities. The online nursing courses are drawing more attention as well. The online nursing courses are very important e-learning tools for nursing students. The research combines the innovation diffusion theory and technology acceptance model, and adds two research variables, perceived financial cost and computer self-efficacy to propose a new hybrid technology acceptance model to study nursing students' behavioral intentions to use the online nursing courses. Based on 267 questionnaires collected from six universities in Taiwan, the research finds that studies strongly support this new hybrid technology acceptance model in predicting nursing students' behavioral intentions to use the online nursing courses. This research finds that compatibility, perceived usefulness, perceived ease of use, perceived financial cost and computer self-efficacy are critical factors for nursing students' behavioral intentions to use the online nursing courses. By explaining nursing students' behavioral intentions from a user's perspective, the findings of this research help to develop more user friendly online nursing courses and also provide insight into the best way to promote new e-learning tools for nursing students. This research finds that compatibility is the most important research variable that affects the behavioral intention to use the online nursing courses. PMID:17706842

  17. Fault-tolerant PACS server

    NASA Astrophysics Data System (ADS)

    Cao, Fei; Liu, Brent J.; Huang, H. K.; Zhou, Michael Z.; Zhang, Jianguo; Zhang, X. C.; Mogel, Greg T.

    2002-05-01

    Failure of a PACS archive server could cripple an entire PACS operation. Last year we demonstrated that it was possible to design a fault-tolerant (FT) server with 99.999% uptime. The FT design was based on a triple modular redundancy with a simple majority vote to automatically detect and mask a faulty module. The purpose of this presentation is to report on its continuous developments in integrating with external mass storage devices, and to delineate laboratory failover experiments. An FT PACS Simulator with generic PACS software has been used in the experiment. To simulate a PACS clinical operation, image examinations are transmitted continuously from the modality simulator to the DICOM gateway and then to the FT PACS server and workstations. The hardware failures in network, FT server module, disk, RAID, and DLT are manually induced to observe the failover recovery of the FT PACS to resume its normal data flow. We then test and evaluate the FT PACS server in its reliability, functionality, and performance.
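
    The fault-tolerant design above rests on triple modular redundancy with a simple majority vote: a faulty module is masked as long as two of the three modules agree. A toy vote function (not the PACS code, purely the masking principle) looks like this:

      # Toy illustration of the majority-vote idea behind triple modular
      # redundancy; a single corrupted result is masked.
      from collections import Counter

      def majority_vote(results):
          """Return the value reported by at least two of the three modules."""
          value, count = Counter(results).most_common(1)[0]
          if count < 2:
              raise RuntimeError("no majority: more than one module has failed")
          return value

      # one module returns a corrupted answer; the fault is masked
      print(majority_vote(["study_1234", "study_1234", "study_9999"]))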

  18. Addressing Diverse Learner Preferences and Intelligences with Emerging Technologies: Matching Models to Online Opportunities

    ERIC Educational Resources Information Center

    Zhang, Ke; Bonk, Curtis J.

    2008-01-01

    This paper critically reviews various learning preferences and human intelligence theories and models with a particular focus on the implications for online learning. It highlights a few key models, Gardner's multiple intelligences, Fleming and Mills' VARK model, Honey and Mumford's Learning Styles, and Kolb's Experiential Learning Model, and…

  19. Comparing Server Energy Use and Efficiency Using Small Sample Sizes

    SciTech Connect

    Coles, Henry C.; Qin, Yong; Price, Phillip N.

    2014-11-01

    This report documents a demonstration that compared the energy consumption and efficiency of a limited sample size of server-type IT equipment from different manufacturers by measuring power at the server power supply power cords. The results are specific to the equipment and methods used. However, it is hoped that those responsible for IT equipment selection can use the methods described to choose models that optimize energy use efficiency. The demonstration was conducted in a data center at Lawrence Berkeley National Laboratory in Berkeley, California. It was performed with five servers of similar mechanical and electronic specifications; three from Intel and one each from Dell and Supermicro. Server IT equipment is constructed using commodity components, server manufacturer-designed assemblies, and control systems. Server compute efficiency is constrained by the commodity component specifications and integration requirements. The design freedom, outside of the commodity component constraints, provides room for the manufacturer to offer a product with competitive efficiency that meets market needs at a compelling price. A goal of the demonstration was to compare and quantify the server efficiency for three different brands. The efficiency is defined as the average compute rate (computations per unit of time) divided by the average energy consumption rate. The research team used an industry standard benchmark software package to provide a repeatable software load to obtain the compute rate and provide a variety of power consumption levels. Energy use when the servers were in an idle state (not providing computing work) was also measured. At high server compute loads, all brands, using the same key components (processors and memory), had similar results; therefore, from these results, it could not be concluded that one brand is more efficient than the other brands. The test results show that the power consumption variability caused by the key components as a

  20. KOSMOS: a universal morph server for nucleic acids, proteins and their complexes

    PubMed Central

    Seo, Sangjae; Kim, Moon Ki

    2012-01-01

    KOSMOS is the first online morph server to be able to address the structural dynamics of DNA/RNA, proteins and even their complexes, such as ribosomes. The key functions of KOSMOS are the harmonic and anharmonic analyses of macromolecules. In the harmonic analysis, normal mode analysis (NMA) based on an elastic network model (ENM) is performed, yielding vibrational modes and B-factor calculations, which provide insight into the potential biological functions of macromolecules based on their structural features. Anharmonic analysis involving elastic network interpolation (ENI) is used to generate plausible transition pathways between two given conformations by optimizing a topology-oriented cost function that guarantees a smooth transition without steric clashes. The quality of the computed pathways is evaluated based on their various facets, including topology, energy cost and compatibility with the NMA results. There are also two unique features of KOSMOS that distinguish it from other morph servers: (i) the versatility in the coarse-graining methods and (ii) the various connection rules in the ENM. The models enable us to analyze macromolecular dynamics with the maximum degrees of freedom by combining a variety of ENMs from full-atom to coarse-grained, backbone and hybrid models with one connection rule, such as distance-cutoff, number-cutoff or chemical-cutoff. KOSMOS is available at http://bioengineering.skku.ac.kr/kosmos. PMID:22669912
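
    The harmonic analysis described above is normal mode analysis on an elastic network model built with a connection rule such as a distance cutoff. A minimal Gaussian-network-model sketch (one simple flavour of elastic network model, not the KOSMOS implementation; the coordinates here are a fake C-alpha trace) builds the Kirchhoff matrix, diagonalises it, and derives relative fluctuations (B-factor-like values) from the non-zero modes:

      # Minimal Gaussian-network-model sketch: Kirchhoff matrix with a
      # distance cutoff, normal modes, and relative fluctuations from the
      # pseudo-inverse diagonal.
      import numpy as np

      def gnm_bfactors(coords, cutoff=7.0):
          d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
          kirchhoff = -(d < cutoff).astype(float)
          np.fill_diagonal(kirchhoff, 0.0)
          np.fill_diagonal(kirchhoff, -kirchhoff.sum(axis=1))
          vals, vecs = np.linalg.eigh(kirchhoff)
          nonzero = vals > 1e-8                      # drop the zero (rigid) mode
          pinv_diag = (vecs[:, nonzero] ** 2 / vals[nonzero]).sum(axis=1)
          return pinv_diag / pinv_diag.mean()        # relative fluctuations

      rng = np.random.default_rng(0)
      ca = np.cumsum(rng.normal(scale=2.0, size=(50, 3)), axis=0)  # fake CA trace
      print(gnm_bfactors(ca)[:5])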

  1. KOSMOS: a universal morph server for nucleic acids, proteins and their complexes.

    PubMed

    Seo, Sangjae; Kim, Moon Ki

    2012-07-01

    KOSMOS is the first online morph server to be able to address the structural dynamics of DNA/RNA, proteins and even their complexes, such as ribosomes. The key functions of KOSMOS are the harmonic and anharmonic analyses of macromolecules. In the harmonic analysis, normal mode analysis (NMA) based on an elastic network model (ENM) is performed, yielding vibrational modes and B-factor calculations, which provide insight into the potential biological functions of macromolecules based on their structural features. Anharmonic analysis involving elastic network interpolation (ENI) is used to generate plausible transition pathways between two given conformations by optimizing a topology-oriented cost function that guarantees a smooth transition without steric clashes. The quality of the computed pathways is evaluated based on their various facets, including topology, energy cost and compatibility with the NMA results. There are also two unique features of KOSMOS that distinguish it from other morph servers: (i) the versatility in the coarse-graining methods and (ii) the various connection rules in the ENM. The models enable us to analyze macromolecular dynamics with the maximum degrees of freedom by combining a variety of ENMs from full-atom to coarse-grained, backbone and hybrid models with one connection rule, such as distance-cutoff, number-cutoff or chemical-cutoff. KOSMOS is available at http://bioengineering.skku.ac.kr/kosmos. PMID:22669912

  2. Revision and Validation of a Culturally-Adapted Online Instructional Module Using Edmundson's CAP Model: A DBR Study

    ERIC Educational Resources Information Center

    Tapanes, Marie A.

    2011-01-01

    In the present study, the Cultural Adaptation Process Model was applied to an online module to include adaptations responsive to the online students' culturally-influenced learning styles and preferences. The purpose was to provide the online learners with a variety of course material presentations, where the e-learners had the opportunity to…

  3. Toward a Social Conflict Evolution Model: Examining the Adverse Power of Conflictual Social Interaction in Online Learning

    ERIC Educational Resources Information Center

    Xie, Kui; Miller, Nicole C.; Allison, Justin R.

    2013-01-01

    This case study examined an authentic online learning phenomenon where social conflict, including harsh critique and negative tone, weaved throughout peer-moderated online discussions in an online class. Open coding and content analysis were performed on 1306 message units and course artifacts. The results revealed that a model of social…

  4. Distributed control system for demand response by servers

    NASA Astrophysics Data System (ADS)

    Hall, Joseph Edward

    Within the broad topical designation of smart grid, research in demand response, or demand-side management, focuses on investigating possibilities for electrically powered devices to adapt their power consumption patterns to better match generation and more efficiently integrate intermittent renewable energy sources, especially wind. Devices such as battery chargers, heating and cooling systems, and computers can be controlled to change the time, duration, and magnitude of their power consumption while still meeting workload constraints such as deadlines and rate of throughput. This thesis presents a system by which a computer server, or multiple servers in a data center, can estimate the power imbalance on the electrical grid and use that information to dynamically change the power consumption as a service to the grid. Implementation on a testbed demonstrates the system with a hypothetical but realistic usage case scenario of an online video streaming service in which there are workloads with deadlines (high-priority) and workloads without deadlines (low-priority). The testbed is implemented with real servers, estimates the power imbalance from the grid frequency with real-time measurements of the live outlet, and uses a distributed, real-time algorithm to dynamically adjust the power consumption of the servers based on the frequency estimate and the throughput of video transcoder workloads. Analysis of the system explains and justifies multiple design choices, compares the significance of the system in relation to similar publications in the literature, and explores the potential impact of the system.
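
    The core idea above is to map an estimate of grid power imbalance, inferred from frequency deviation, to a change in the servers' power draw by throttling low-priority work. A deliberately simplified proportional rule (not the thesis' distributed algorithm or testbed code; all constants are placeholders) is:

      # Simplified proportional demand-response rule in the spirit of the
      # summary above; constants and the throttle hook are placeholders.
      NOMINAL_HZ = 60.0
      DROOP_W_PER_HZ = 400.0       # how strongly power responds to frequency
      P_MIN, P_MAX = 150.0, 450.0  # watts allowed for low-priority work

      def target_power(measured_hz, baseline_w=300.0):
          # under-frequency (supply deficit) -> shed load;
          # over-frequency (surplus) -> absorb load
          adjustment = DROOP_W_PER_HZ * (measured_hz - NOMINAL_HZ)
          return min(P_MAX, max(P_MIN, baseline_w + adjustment))

      for hz in (59.95, 60.00, 60.05):
          print(f"grid at {hz:.2f} Hz -> run low-priority transcoding at "
                f"{target_power(hz):.0f} W")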

  5. Hybrid metrology implementation: server approach

    NASA Astrophysics Data System (ADS)

    Osorio, Carmen; Timoney, Padraig; Vaid, Alok; Elia, Alex; Kang, Charles; Bozdog, Cornel; Yellai, Naren; Grubner, Eyal; Ikegami, Toru; Ikeno, Masahiko

    2015-03-01

    Hybrid metrology (HM) is the practice of combining measurements from multiple toolset types in order to enable or improve metrology for advanced structures. HM is implemented in two phases: Phase-1 includes readiness of the infrastructure to transfer processed data from the first toolset to the second. Phase-2 infrastructure allows simultaneous transfer and optimization of raw data between toolsets such as spectra, images, traces - co-optimization. We discuss the extension of Phase-1 to include direct high-bandwidth communication between toolsets using a hybrid server, enabling seamless fab deployment and further laying the groundwork for Phase-2 high volume manufacturing (HVM) implementation. An example of the communication protocol shows the information that can be used by the hybrid server, differentiating its capabilities from that of a host-based approach. We demonstrate qualification and production implementation of the hybrid server approach using CD-SEM and OCD toolsets for complex 20nm and 14nm applications. Finally we discuss the roadmap for Phase-2 HM implementation through use of the hybrid server.

  6. Participatory storytelling online: a complementary model of patient satisfaction.

    PubMed

    Born, Karen; Rizo, Carlos; Seeman, Neil

    2009-01-01

    Measuring patient satisfaction is an important quality improvement technique. The World Wide Web offers new approaches to understanding patient satisfaction and stories about healthcare encounters. In this paper, we suggest that there is a wealth of patients' stories being told online, in real time, on social networking and on social rating websites. This patient-generated, publicly available information can complement existing patient satisfaction data and can provide insights into patients' values, perspectives and expectations - and can suggest ways to improve the patient's experience along the continuum of care. PMID:20057238

  7. Modelling unsupervised online-learning of artificial grammars: linking implicit and statistical learning.

    PubMed

    Rohrmeier, Martin A; Cross, Ian

    2014-07-01

    Humans rapidly learn complex structures in various domains. Findings of above-chance performance of some untrained control groups in artificial grammar learning studies raise questions about the extent to which learning can occur in an untrained, unsupervised testing situation with both correct and incorrect structures. The plausibility of unsupervised online-learning effects was modelled with n-gram, chunking and simple recurrent network models. A novel evaluation framework was applied, which alternates forced binary grammaticality judgments and subsequent learning of the same stimulus. Our results indicate a strong online learning effect for n-gram and chunking models and a weaker effect for simple recurrent network models. Such findings suggest that online learning is a plausible effect of statistical chunk learning that is possible when ungrammatical sequences contain a large proportion of grammatical chunks. Such common effects of continuous statistical learning may underlie statistical and implicit learning paradigms and raise implications for study design and testing methodologies. PMID:24905545

  8. On-line and Model-based Approaches to the Visual Control of Action

    PubMed Central

    Zhao, Huaiyong; Warren, William H.

    2014-01-01

    Two general approaches to the visual control of action have emerged in the last few decades, known as the on-line and model-based approaches. The key difference between them is whether action is controlled by current visual information or on the basis of an internal world model. In this paper, we evaluate three hypotheses: strong on-line control, strong model-based control, and a hybrid solution that combines on-line control with weak off-line strategies. We review experimental research on the control of locomotion and manual actions, which indicates that (a) an internal world model is neither sufficient nor necessary to control action at normal levels of performance; (b) current visual information is necessary and sufficient to control action at normal levels; and (c) under certain conditions (e.g. occlusion) action is controlled by less accurate, simple strategies such as heuristics, visual-motor mappings, or spatial memory. We conclude that the strong model-based hypothesis is not sustainable. Action is normally controlled on-line when current information is available, consistent with the strong on-line control hypothesis. In exceptional circumstances, action is controlled by weak, context-specific, off-line strategies. This hybrid solution is comprehensive, parsimonious, and able to account for a variety of tasks under a range of visual conditions. PMID:25454700

  9. Nuke@ - a nuclear information internet server

    SciTech Connect

    Slone, B.J. III.; Richardson, C.E.

    1994-12-31

    To facilitate Internet communications between nuclear utilities, vendors, agencies, and other interested parties, an Internet server is being established. This server will provide the nuclear industry with its first file-transfer protocol (ftp) connection point, its second mail server, and a potential telnet connection location.

  10. Protein structure prediction and analysis using the Robetta server

    PubMed Central

    Kim, David E.; Chivian, Dylan; Baker, David

    2004-01-01

    The Robetta server (http://robetta.bakerlab.org) provides automated tools for protein structure prediction and analysis. For structure prediction, sequences submitted to the server are parsed into putative domains and structural models are generated using either comparative modeling or de novo structure prediction methods. If a confident match to a protein of known structure is found using BLAST, PSI-BLAST, FFAS03 or 3D-Jury, it is used as a template for comparative modeling. If no match is found, structure predictions are made using the de novo Rosetta fragment insertion method. Experimental nuclear magnetic resonance (NMR) constraints data can also be submitted with a query sequence for RosettaNMR de novo structure determination. Other current capabilities include the prediction of the effects of mutations on protein–protein interactions using computational interface alanine scanning. The Rosetta protein design and protein–protein docking methodologies will soon be available through the server as well. PMID:15215442
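
    The decision flow described above can be summarized in a few lines; the sketch below only illustrates that logic, with placeholder callables standing in for Robetta's actual template search and modeling stages.

```python
def predict_structure(sequence, find_template, comparative_model, de_novo_model):
    """Dispatch a domain sequence to comparative or de novo modeling.

    find_template, comparative_model and de_novo_model are placeholder
    callables standing in for the template search (BLAST/PSI-BLAST/FFAS03/
    3D-Jury) and the two modeling routes described in the abstract.
    """
    template = find_template(sequence)           # None if no confident match
    if template is not None:
        return comparative_model(sequence, template)
    return de_novo_model(sequence)               # Rosetta fragment-insertion route

# Toy usage with stand-in callables.
print(predict_structure(
    "MKVLAT",                                   # illustrative sequence
    find_template=lambda seq: None,             # pretend no confident match was found
    comparative_model=lambda seq, t: "comparative model",
    de_novo_model=lambda seq: "de novo model",
))
```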

  11. The Robust Learning Model (RLM): A Comprehensive Approach to a New Online University

    ERIC Educational Resources Information Center

    Neumann, Yoram; Neumann, Edith F.

    2010-01-01

    This paper outlines the components of the Robust Learning Model (RLM) as a conceptual framework for creating a new online university offering numerous degree programs at all degree levels. The RLM is a multi-factorial model based on the basic belief that successful learning outcomes depend on multiple factors employed together in a holistic…

  12. Development of an Instructional Model for Online Task-Based Interactive Listening for EFL Learners

    ERIC Educational Resources Information Center

    Tian, Xingbin; Suppasetseree, Suksan

    2013-01-01

    College English in China has shifted from cultivating reading ability to comprehensive communicative abilities with an emphasis on listening and speaking. For this reason, new teaching models should be built on modern information technology. However, little research on developing models for the online teaching of listening skills has been…

  13. Evaluating Online Instruction: Adapting a Training Model to E-Learning in Higher Education.

    ERIC Educational Resources Information Center

    Hallett, Karen; Essex, Christopher

    This paper presents a model for the evaluation of postsecondary online distance education courses and programs. To better address the unique nature and audience for these courses and programs, and the related institutional needs for assessing their success or failure, the focus is on a model from corporate training that provides a comprehensive,…

  14. Using the Constructivist Tridimensional Design Model for Online Continuing Education for Health Care Clinical Faculty

    ERIC Educational Resources Information Center

    Seo, Kay Kyeong-Ju; Engelhard, Chalee

    2014-01-01

    This article presents a new paradigm for continuing education of Clinical Instructors (CIs): the Constructivist Tridimensional (CTD) model for the design of an online curriculum. Based on problem-based learning, self-regulated learning, and adult learning theory, the CTD model was designed to facilitate interactive, collaborative, and authentic…

  15. From honeybees to Internet servers: biomimicry for distributed management of Internet hosting centers.

    PubMed

    Nakrani, Sunil; Tovey, Craig

    2007-12-01

    An Internet hosting center hosts services on its server ensemble. The center must allocate servers dynamically amongst services to maximize revenue earned from hosting fees. The finite server ensemble, unpredictable request arrival behavior and server reallocation cost make server allocation optimization difficult. Server allocation closely resembles honeybee forager allocation amongst flower patches to optimize nectar influx. The resemblance inspires a honeybee biomimetic algorithm. This paper describes details of the honeybee self-organizing model in terms of information flow and feedback, analyzes the homology between the two problems and derives the resulting biomimetic algorithm for hosting centers. The algorithm is assessed for effectiveness and adaptiveness by comparative testing against benchmark and conventional algorithms. Computational results indicate that the new algorithm is highly adaptive to widely varying external environments and quite competitive against benchmark assessment algorithms. Other swarm intelligence applications are briefly surveyed, and some general speculations are offered regarding their various degrees of success. PMID:18037727
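
    A minimal sketch of the general idea, assuming a simple scheme in which each server independently considers switching and is recruited to a service with probability proportional to that service's revenue per server (the "waggle-dance strength"); this illustrates the biomimetic flavour only, not the authors' algorithm.

```python
import random

def reallocate(allocation, revenue, p_switch=0.1):
    """One biomimicry-style reallocation step.

    allocation: dict service -> number of servers currently assigned
    revenue:    dict service -> revenue earned over the last interval
    Each server independently considers switching; the destination service
    is chosen with probability proportional to its revenue per server.
    """
    per_server = {s: revenue[s] / max(allocation[s], 1) for s in allocation}
    total = sum(per_server.values()) or 1.0
    new_alloc = dict(allocation)
    for service, n in allocation.items():
        for _ in range(n):
            if random.random() < p_switch:
                r = random.uniform(0, total)
                cum = 0.0
                for target, strength in per_server.items():
                    cum += strength
                    if r <= cum:
                        break
                new_alloc[service] -= 1
                new_alloc[target] += 1
    return new_alloc

print(reallocate({"web": 6, "video": 2}, {"web": 10.0, "video": 40.0}))
```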

  16. Online coupled regional meteorology-chemistry models in Europe: current status and prospects

    NASA Astrophysics Data System (ADS)

    Baklanov, A.; Schluenzen, K. H.; Suppan, P.; Baldasano, J.; Brunner, D.; Aksoyoglu, S.; Carmichael, G.; Douros, J.; Flemming, J.; Forkel, R.; Galmarini, S.; Gauss, M.; Grell, G.; Hirtl, M.; Joffre, S.; Jorba, O.; Kaas, E.; Kaasik, M.; Kallos, G.; Kong, X.; Korsholm, U.; Kurganskiy, A.; Kushta, J.; Lohmann, U.; Mahura, A.; Manders-Groot, A.; Maurizi, A.; Moussiopoulos, N.; Rao, S. T.; Savage, N.; Seigneur, C.; Sokhi, R.; Solazzo, E.; Solomos, S.; Sørensen, B.; Tsegas, G.; Vignati, E.; Vogel, B.; Zhang, Y.

    2013-05-01

    The simulation of the coupled evolution of atmospheric dynamics, pollutant transport, chemical reactions and atmospheric composition is one of the most challenging tasks in environmental modelling, climate change studies, and weather forecasting for the next decades as they all involve strongly integrated processes. Weather strongly influences air quality (AQ) and atmospheric transport of hazardous materials, while atmospheric composition can influence both weather and climate by directly modifying the atmospheric radiation budget or indirectly affecting cloud formation. Until recently, however, due to the scientific complexities and lack of computational power, atmospheric chemistry and weather forecasting have developed as separate disciplines, leading to the development of separate modelling systems that are only loosely coupled. The continuous increase in computer power has now reached a stage that enables us to perform online coupling of regional meteorological models with atmospheric chemical transport models. The focus on integrated systems is timely, since recent research has shown that meteorology and chemistry feedbacks are important in the context of many research areas and applications, including numerical weather prediction (NWP), AQ forecasting as well as climate and Earth system modelling. However, the relative importance of online integration and its priorities, requirements and levels of detail necessary for representing different processes and feedbacks can greatly vary for these related communities: (i) NWP, (ii) AQ forecasting and assessments, (iii) climate and earth system modelling. Additional applications are likely to benefit from online modelling, e.g.: simulation of volcanic ash or forest fire plumes, pollen warnings, dust storms, oil/gas fires, geo-engineering tests involving changes in the radiation balance. The COST Action ES1004 - European framework for online integrated air quality and meteorology modelling (EuMetChem) - aims at

  17. Creating a GIS data server on the World Wide Web: The GISST example

    SciTech Connect

    Pace, P.J.; Evers, T.K.

    1996-01-01

    In an effort to facilitate user access to Geographic Information Systems (GIS) data, the GIS and Computer Modeling Group from the Computational Physics and Engineering Division at the Oak Ridge National Laboratory (ORNL) in Oak Ridge, Tennessee (TN), has developed a World Wide Web server named GISST. The server incorporates a highly interactive and dynamic forms-based interface to browse and download a variety of GIS data types. This paper describes the server's design considerations, development, resulting implementation and future enhancements.

  18. Using Application Servers to Build Distributed Data Systems

    NASA Astrophysics Data System (ADS)

    King, T. A.; Walker, R. J.; Joy, S. P.

    2004-12-01

    Space and Earth scientists increasingly require data products from multiple sensors. Frequently these data are widely distributed and each source may have very different types of data products. For instance a single space science research project can require data from more than one instrument on more than one spacecraft, data from Earth based sensors and results from theoretical models. These data and model results are housed at many locations around the world. The location of the data may change with time as spacecraft complete their missions. Unless care is taken in providing access to these data, using them will require a great deal of effort on the part of individual scientists. Today's data system designers are challenged to link these distributed sources and make them work together as one. One approach to providing universal support is to base the core functionality of each data provider on common technology. An emerging technology platform is Sun's Java Application Server. With an application server approach all services offered by the data center are provided through Java servlets that can be invoked through the application server while responding to a request for a specific URL. The benefits of using an application server include a well established framework for development, broad corporate support for the technology and increased sharing of implementations between data centers. We will illustrate the use of an application server by describing the system currently being deployed at the Planetary Plasma Interactions Node of NASA's Planetary Data System.
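
    The servlet-per-URL pattern described here can be illustrated with a small Python analogue (not the Java Application Server itself): each service is bound to a URL path and invoked when that path is requested. The paths and service names below are invented for illustration.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical services, each bound to a URL path (the servlet-per-URL idea).
SERVICES = {
    "/catalog": lambda: b"list of available data products",
    "/status":  lambda: b"data node is up",
}

class ServiceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        service = SERVICES.get(self.path)
        if service is None:
            self.send_error(404, "unknown service")
            return
        body = service()
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), ServiceHandler).serve_forever()
```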

  19. On-line coupling of volcanic ash and aerosols transport with multiscale meteorological models

    NASA Astrophysics Data System (ADS)

    Marti, Alejandro; Folch, Arnau; Jorba, Oriol

    2014-05-01

    Large explosive volcanic eruptions can inject significant amounts of tephra and aerosols (e.g. SO2) into the atmosphere inducing a multi-scale array of physical, chemical and biological feedbacks within the environment. Effective coupled Numerical Weather Prediction (NWP) models capable of forecasting on-line the spatial and temporal distribution of volcanic ash and aerosols are necessary to assess the magnitude of these feedback effects. However, due to several limitations (users from different communities, operational constraints, computational power, etc.), tephra transport models and NWP models have evolved independently. Within the framework of NEMOH (an Initial Training Network of the European Commission FP7 Program), we aim to quantify the feedback effects of volcanic ash clouds and aerosols emitted during large-magnitude eruptions on regional meteorology. As a first step, we have focused on the differences between the off-line hypothesis, currently assumed by tephra transport models (e.g. FALL3D), and the on-line approach, where transport and sedimentation of volcanic ash is coupled on-line to the NMMB (Non-hydrostatic Multiscale Meteorological model on a B grid) meteorological model, an evolution of the WRF-NMME meteorological model. We compared the spatiotemporal transport of volcanic ash particles simulated with the on-line coupled FALL3D-NMMB/BSC-CTM model with those from the off-line FALL3D model, by using the 2011 Cordón-Caulle eruption as a test-case and validating results against satellite data. Additionally, this presentation introduces the forthcoming steps to implement a sulfate aerosol module within the chemical transport module of the FALL3D-NMMB/BSC-CTM model, in order to quantify the feedback effects on the atmospheric radiative budget, particularly during large-magnitude explosive volcanic eruptions. Keywords: volcanic ash, SO2, FALL3D, NMMB, meteorology, on-line coupling, NEMOH.

  20. SAbPred: a structure-based antibody prediction server.

    PubMed

    Dunbar, James; Krawczyk, Konrad; Leem, Jinwoo; Marks, Claire; Nowak, Jaroslaw; Regep, Cristian; Georges, Guy; Kelm, Sebastian; Popovic, Bojana; Deane, Charlotte M

    2016-07-01

    SAbPred is a server that makes predictions of the properties of antibodies focusing on their structures. Antibody informatics tools can help improve our understanding of immune responses to disease and aid in the design and engineering of therapeutic molecules. SAbPred is a single platform containing multiple applications which can: number and align sequences; automatically generate antibody variable fragment homology models; annotate such models with estimated accuracy alongside sequence and structural properties including potential developability issues; predict paratope residues; and predict epitope patches on protein antigens. The server is available at http://opig.stats.ox.ac.uk/webapps/sabpred. PMID:27131379

  1. Telerobotic control of a mobile coordinated robotic server. M.S. Thesis Annual Technical Report

    NASA Technical Reports Server (NTRS)

    Lee, Gordon

    1993-01-01

    The annual report on telerobotic control of a mobile coordinated robotic server is presented. The goal of this effort is to develop advanced control methods for flexible space manipulator systems. As such, an adaptive fuzzy logic controller was developed in which model structure as well as parameter constraints are not required for compensation. The work builds upon previous work on fuzzy logic controllers. Fuzzy logic controllers have been growing in importance in the field of automatic feedback control. Hardware controllers using fuzzy logic have become available as an alternative to the traditional PID controllers. Software has also been introduced to aid in the development of fuzzy logic rule-bases. The advantages of using fuzzy logic controllers include the ability to merge the experience and intuition of expert operators into the rule-base and the fact that a model of the system is not required to construct the controller. A drawback of the classical fuzzy logic controller, however, is the many parameters that need to be tuned off-line prior to application in the closed loop. In this report, an adaptive fuzzy logic controller is developed requiring no system model or model structure. The rule-base is defined to approximate a state-feedback controller, while a second fuzzy logic algorithm varies the controller's defining parameters on-line. Results indicate the approach is viable for on-line adaptive control of systems when the model is too complex or uncertain for application of other more classical control techniques.
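
    A minimal sketch of the flavour of such a controller, assuming triangular membership functions over the tracking error and its rate, singleton consequents, and a crude on-line adaptation of the output gain; the rule table and adaptation law here are illustrative stand-ins, not the report's adaptive fuzzy algorithm.

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_control(error, d_error, gain):
    """Tiny two-input fuzzy controller (negative/zero/positive sets)."""
    sets = {"N": (-2.0, -1.0, 0.0), "Z": (-1.0, 0.0, 1.0), "P": (0.0, 1.0, 2.0)}
    out = {"N": -1.0, "Z": 0.0, "P": 1.0}        # singleton consequents
    num = den = 0.0
    for e_lab, e_mf in sets.items():
        for de_lab, de_mf in sets.items():
            w = min(tri(error, *e_mf), tri(d_error, *de_mf))
            num += w * out[e_lab]                # simple rule table: follow the error sign
            den += w
    return gain * (num / den if den else 0.0)

def adapt_gain(gain, error, rate=0.05):
    """Crude on-line adaptation: stiffen the output gain while error persists."""
    return gain + rate * abs(error)

gain = 1.0
u = fuzzy_control(error=0.4, d_error=-0.1, gain=gain)
gain = adapt_gain(gain, error=0.4)
print(u, gain)
```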

  2. A decade of web server updates at the bioinformatics links directory: 2003–2012

    PubMed Central

    Brazas, Michelle D.; Yim, David; Yeung, Winston; Ouellette, B. F. Francis

    2012-01-01

    The 2012 Bioinformatics Links Directory update marks the 10th special Web Server issue from Nucleic Acids Research. Beginning with content from their 2003 publication, the Bioinformatics Links Directory in collaboration with Nucleic Acids Research has compiled and published a comprehensive list of freely accessible, online tools, databases and resource materials for the bioinformatics and life science research communities. The past decade has exhibited significant growth and change in the types of tools, databases and resources being put forth, reflecting both technology changes and the nature of research over that time. With the addition of 90 web server tools and 12 updates from the July 2012 Web Server issue of Nucleic Acids Research, the Bioinformatics Links Directory at http://bioinformatics.ca/links_directory/ now contains an impressive 134 resources, 455 databases and 1205 web server tools, mirroring the continued activity and efforts of our field. PMID:22700703

  3. File servers, networking, and supercomputers

    NASA Technical Reports Server (NTRS)

    Moore, Reagan W.

    1992-01-01

    One of the major tasks of a supercomputer center is managing the massive amount of data generated by application codes. A data flow analysis of the San Diego Supercomputer Center is presented that illustrates the hierarchical data buffering/caching capacity requirements and the associated I/O throughput requirements needed to sustain file service and archival storage. Usage paradigms are examined for both tightly-coupled and loosely-coupled file servers linked to the supercomputer by high-speed networks.

  4. File servers, networking, and supercomputers

    NASA Technical Reports Server (NTRS)

    Moore, Reagan W.

    1991-01-01

    One of the major tasks of a supercomputer center is managing the massive amount of data generated by application codes. A data flow analysis of the San Diego Supercomputer Center is presented that illustrates the hierarchical data buffering/caching capacity requirements and the associated I/O throughput requirements needed to sustain file service and archival storage. Usage paradigms are examined for both tightly-coupled and loosely-coupled file servers linked to the supercomputer by high-speed networks.

  5. The NASA Technical Report Server

    NASA Astrophysics Data System (ADS)

    Nelson, M. L.; Gottlich, G. L.; Bianco, D. J.; Paulson, S. S.; Binkley, R. L.; Kellogg, Y. D.; Beaumont, C. J.; Schmunk, R. B.; Kurtz, M. J.; Accomazzi, A.; Syed, O.

    The National Aeronautics and Space Act of 1958 established the National Aeronautics and Space Administration (NASA) and charged it to "provide for the widest practicable and appropriate dissemination of information concerning...its activities and the results thereof". The search for innovative methods to distribute NASA's information led a grass-roots team to create the NASA Technical Report Server (NTRS), which uses the World Wide Web and other popular Internet-based information systems.

  6. Client-Server Password Recovery

    NASA Astrophysics Data System (ADS)

    Chmielewski, Łukasz; Hoepman, Jaap-Henk; van Rossum, Peter

    Human memory is not perfect - people constantly memorize new facts and forget old ones. One example is forgetting a password, a common problem raised at IT help desks. We present several protocols that allow a user to automatically recover a password from a server using partial knowledge of the password. These protocols can be easily adapted to the personal entropy setting [7], where a user can recover a password only if he can answer a large enough subset of personal questions.

  7. Web server for priority ordered multimedia services

    NASA Astrophysics Data System (ADS)

    Celenk, Mehmet; Godavari, Rakesh K.; Vetnes, Vermund

    2001-10-01

    In this work, our aim is to provide finer priority levels in the design of a general-purpose Web multimedia server with provisions for CM services. The types of services provided include reading/writing a web page, downloading/uploading an audio/video stream, navigating the Web through browsing, and interactive video teleconferencing. The selected priority encoding levels for such operations follow the order of admin read/write, hot page CM and Web multicasting, CM read, Web read, CM write and Web write. Hot pages are the most requested CM streams (e.g., the newest movies, video clips, and HDTV channels) and Web pages (e.g., portal pages of the commercial Internet search engines). Maintaining a list of these hot Web pages and CM streams in a content addressable buffer enables a server to multicast hot streams with lower latency and higher system throughput. Cold Web pages and CM streams are treated as regular Web and CM requests. Interactive CM operations such as pause (P), resume (R), fast-forward (FF), and rewind (RW) have to be executed without allocation of extra resources. The proposed multimedia server model is a part of the distributed network with load balancing schedulers. The SM is connected to an integrated disk scheduler (IDS), which supervises an allocated disk manager. The IDS follows the same priority handling as the SM, and implements a SCAN disk-scheduling method for an improved disk access and a higher throughput. Different disks are used for the Web and CM services in order to meet the QoS requirements of CM services. The IDS output is forwarded to an Integrated Transmission Scheduler (ITS). The ITS creates a priority ordered buffering of the retrieved Web pages and CM data streams that are fed into an auto regressive moving average (ARMA) based traffic shaping circuitry before being transmitted through the network.
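
    The priority ordering named above can be made concrete with a heap-based dispatcher; the class, level names and payloads below are illustrative only, not the paper's server implementation.

```python
import heapq
import itertools

# Priority levels from the abstract, highest priority first.
PRIORITY = {
    "admin_rw": 0, "hot_cm_multicast": 1, "cm_read": 2,
    "web_read": 3, "cm_write": 4, "web_write": 5,
}

class PriorityScheduler:
    """Serve requests strictly by priority level, FIFO within a level."""
    def __init__(self):
        self._heap = []
        self._seq = itertools.count()    # tie-breaker preserves arrival order

    def submit(self, kind, payload):
        heapq.heappush(self._heap, (PRIORITY[kind], next(self._seq), payload))

    def next_request(self):
        if not self._heap:
            return None
        _, _, payload = heapq.heappop(self._heap)
        return payload

s = PriorityScheduler()
s.submit("web_read", "GET /index.html")
s.submit("hot_cm_multicast", "stream movie-42")
print(s.next_request())    # the hot CM stream is served first
```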

  8. ON-LINE CALCULATOR: JOHNSON ETTINGER VAPOR INTRUSION MODEL

    EPA Science Inventory

    On-Site was developed to provide modelers and model reviewers with prepackaged tools ("calculators") for performing site assessment calculations. The philosophy behind OnSite is that the convenience of the prepackaged calculators helps provide consistency for simple calculations,...

  9. ON-LINE CALCULATOR: FORWARD CALCULATION JOHNSON ETTINGER MODEL

    EPA Science Inventory

    On-Site was developed to provide modelers and model reviewers with prepackaged tools ("calculators") for performing site assessment calculations. The philosophy behind OnSite is that the convenience of the prepackaged calculators helps provide consistency for simple calculations,...

  10. Required Collaborative Work in Online Courses: A Predictive Modeling Approach

    ERIC Educational Resources Information Center

    Smith, Marlene A.; Kellogg, Deborah L.

    2015-01-01

    This article describes a predictive model that assesses whether a student will have greater perceived learning in group assignments or in individual work. The model produces correct classifications 87.5% of the time. The research is notable in that it is the first in the education literature to adopt a predictive modeling methodology using data…

  11. Adventures in the evolution of a high-bandwidth network for central servers

    SciTech Connect

    Swartz, K.L.; Cottrell, L.; Dart, M.

    1994-08-01

    In a small network, clients and servers may all be connected to a single Ethernet without significant performance concerns. As the number of clients on a network grows, the necessity of splitting the network into multiple sub-networks, each with a manageable number of clients, becomes clear. Less obvious is what to do with the servers. Group file servers on subnets and multihomed servers offer only partial solutions -- many other types of servers do not lend themselves to a decentralized model, and tend to collect on another, well-connected but overloaded Ethernet. The higher speed of FDDI seems to offer an easy solution, but in practice both expense and interoperability problems render FDDI a poor choice. Ethernet switches appear to permit cheaper and more reliable networking to the servers while providing an aggregate network bandwidth greater than a simple Ethernet. This paper studies the evolution of the server networks at SLAC. Difficulties encountered in the deployment of FDDI are described, as are the tools and techniques used to characterize the traffic patterns on the server network. Performance of Ethernet, FDDI, and switched Ethernet networks is analyzed, as are reliability and maintainability issues for these alternatives. The motivations for re-designing the SLAC general server network to use a switched Ethernet instead of FDDI are described, as are the reasons for choosing FDDI for the farm and firewall networks at SLAC. Guidelines are developed which may help in making this choice for other networks.

  12. A Distributed Model for Managing Academic Staff in an International Online Academic Programme

    ERIC Educational Resources Information Center

    Kalman, Yoram M.; Leng, Paul H.

    2007-01-01

    Online delivery of programmes of Higher Education typically involves a distributed community of students interacting with a single university site, at which the teachers, learning resources and administration of the programme are located. The alternative model, of a fully "Virtual University", which assumes no physical campus, poses problems of…

  13. Mentoring Professors: A Model for Developing Quality Online Instructors and Courses in Higher Education

    ERIC Educational Resources Information Center

    Barczyk, Casimir; Buckenmeyer, Janet; Feldman, Lori

    2010-01-01

    This article presents a four-stage model for mentoring faculty in higher education to deliver high quality online instruction. It provides a timeline that shows the stages of program implementation. Known as the Distance Education Mentoring Program, its major outcomes include certified instructors, student achievement, and the attainment of a…

  14. Phenomenological Study of Business Models Used to Scale Online Enrollment at Institutions of Higher Education

    ERIC Educational Resources Information Center

    Williams, Dana E.

    2012-01-01

    The purpose of this qualitative phenomenological study was to explore factors for selecting a business model for scaling online enrollment by institutions of higher education. The goal was to explore the lived experiences of academic industry experts involved in the selection process. The research question for this study was: What were the lived…

  15. Taking the Epistemic Step: Toward a Model of On-Line Access to Conversational Implicatures

    ERIC Educational Resources Information Center

    Breheny, Richard; Ferguson, Heather J.; Katsos, Napoleon

    2013-01-01

    There is a growing body of evidence showing that conversational implicatures are rapidly accessed in incremental utterance interpretation. To date, studies showing incremental access have focussed on implicatures related to linguistic triggers, such as "some" and "or". We discuss three kinds of on-line model that can account for this data. A model…

  16. Online Discussion and College Student Learning: Toward a Model of Influence

    ERIC Educational Resources Information Center

    Johnson, Genevieve M.; Howell, Andrew J.; Code, Jillianne R.

    2005-01-01

    As technology revolutionizes instruction, conceptual models of influence are necessary to guide implementation and evaluation of specific applications such as online peer discussion. Students in an educational psychology course analyzed five case studies that applied and integrated course content. Some students (n= 42) used "WebCT Discussions" to…

  17. Using the Jigsaw Model to Facilitate Cooperative Learning in an Online Course

    ERIC Educational Resources Information Center

    Weidman, Rob; Bishop, M. J.

    2009-01-01

    This study examined whether the jigsaw model might be used in an online higher education course to produce the key characteristics of successful cooperative learning: interdependence, individual accountability, development of social skills, and promotive interaction. The authors employed a qualitative case study design to examine a 6-week online…

  18. Analysis of Feedback Processes in Online Group Interaction: A Methodological Model

    ERIC Educational Resources Information Center

    Espasa, Anna; Guasch, Teresa; Alvarez, Ibis M.

    2013-01-01

    The aim of this article is to present a methodological model to analyze students' group interaction to improve their essays in online learning environments, based on asynchronous and written communication. In these environments teacher and student scaffolds for discussion are essential to promote interaction. One of these scaffolds can be the…

  19. The Practitioner's Model: Designing a Professional Development Program for Online Teaching

    ERIC Educational Resources Information Center

    Weaver, Debbi; Robbie, Diane; Borland, Rosemary

    2008-01-01

    This article describes the experiences of staff responsible for developing and delivering professional development (PD) in online teaching in three universities in the same Australian state. Each university draws on a similar pool of staff and students, and operates under the same government regulations, but has used different models of policy and…

  20. Using the Community of Inquiry Model to Investigate Students' Knowledge Construction in Asynchronous Online Discussions

    ERIC Educational Resources Information Center

    Liu, Chien-Jen; Yang, Shu Ching

    2014-01-01

    This study used the Community of Inquiry (CoI) model proposed by Garrison to investigate students' level of knowledge construction in asynchronous discussions. The participants included 36 senior students (27 males) majoring in information management. The students attended 18 weeks of an online information ethics course. In this study, four types…

  1. The POD Model: Using Communities of Practice Theory to Conceptualise Student Teachers' Professional Learning Online

    ERIC Educational Resources Information Center

    Clarke, Linda

    2009-01-01

    This paper focuses on the broad outcomes of a research project which aimed to analyse and model student teachers' learning in the online components of an initial teacher education course. It begins with discussion of the methodological approach adopted for the case study, which combined conventional data gathering techniques with those which are…

  2. The Development of a Content Analysis Model for Assessing Students' Cognitive Learning in Asynchronous Online Discussions

    ERIC Educational Resources Information Center

    Yang, Dazhi; Richardson, Jennifer C.; French, Brian F.; Lehman, James D.

    2011-01-01

    The purpose of this study was to develop and validate a content analysis model for assessing students' cognitive learning in asynchronous online discussions. It adopted a fully mixed methods design, in which qualitative and quantitative methods were employed sequentially for data analysis and interpretation. Specifically, the design was a…

  3. Mathematical defense method of networked servers with controlled remote backups

    NASA Astrophysics Data System (ADS)

    Kim, Song-Kyoo

    2006-05-01

    The networked server defense model is focused on reliability and availability in security respects. The (remote) backup servers are hooked up by VPN (Virtual Private Network) with high-speed optical network and replace broken main servers immediately. The networked servers can be represented as "machines"; the system then deals with a main (unreliable) machine, spare machines, and auxiliary spare machines. During vacation periods, when the system performs a mandatory routine maintenance, auxiliary machines are being used for back-ups; the information on the system is naturally delayed. An analog of the N-policy is used to restrict the usage of auxiliary machines to some reasonable quantity. The results are demonstrated in the network architecture by using stochastic optimization techniques.
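
    A crude Monte Carlo stand-in for the kind of reliability/availability question this model addresses, assuming simple per-step failure and repair probabilities and a fixed pool of remote backups; the N-policy restriction on auxiliary machines and the vacation periods are not modelled here, so this is an illustration only, not the paper's stochastic model.

```python
import random

def availability(n_main=4, n_backup=2, p_fail=0.02, p_repair=0.25,
                 steps=50_000, seed=1):
    """Toy Monte Carlo estimate of service availability.

    n_main servers fail independently with probability p_fail per step and
    are repaired with probability p_repair per step; a pool of n_backup
    remote backups immediately stands in for broken mains. Service counts
    as up while the number of broken mains does not exceed the backup pool.
    """
    random.seed(seed)
    broken = [False] * n_main
    up_steps = 0
    for _ in range(steps):
        for i in range(n_main):
            if broken[i]:
                if random.random() < p_repair:
                    broken[i] = False
            elif random.random() < p_fail:
                broken[i] = True
        up_steps += sum(broken) <= n_backup
    return up_steps / steps

print(f"estimated availability: {availability():.4f}")
```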

  4. LassoProt: server to analyze biopolymers with lassos

    PubMed Central

    Dabrowski-Tumanski, Pawel; Niemyska, Wanda; Pasznik, Pawel; Sulkowska, Joanna I.

    2016-01-01

    The LassoProt server, http://lassoprot.cent.uw.edu.pl/, enables analysis of biopolymers with entangled configurations called lassos. The server offers various ways of visualizing lasso configurations, as well as their time trajectories, with all the results and plots downloadable. Broad spectrum of applications makes LassoProt a useful tool for biologists, biophysicists, chemists, polymer physicists and mathematicians. The server and our methods have been validated on the whole PDB, and the results constitute the database of proteins with complex lassos, supported with basic biological data. This database can serve as a source of information about protein geometry and entanglement-function correlations, as a reference set in protein modeling, and for many other purposes. PMID:27131383

  5. Migration of legacy mumps applications to relational database servers.

    PubMed

    O'Kane, K C

    2001-07-01

    An extended implementation of the Mumps language is described that facilitates vendor neutral migration of legacy Mumps applications to SQL-based relational database servers. Implemented as a compiler, this system translates Mumps programs to operating system independent, standard C code for subsequent compilation to fully stand-alone, binary executables. Added built-in functions and support modules extend the native hierarchical Mumps database with access to industry standard, networked, relational database management servers (RDBMS) thus freeing Mumps applications from dependence upon vendor specific, proprietary, unstandardized database models. Unlike Mumps systems that have added captive, proprietary RDBMS access, the programs generated by this development environment can be used with any RDBMS system that supports common network access protocols. Additional features include a built-in web server interface and the ability to interoperate directly with programs and functions written in other languages. PMID:11501636

  6. LassoProt: server to analyze biopolymers with lassos.

    PubMed

    Dabrowski-Tumanski, Pawel; Niemyska, Wanda; Pasznik, Pawel; Sulkowska, Joanna I

    2016-07-01

    The LassoProt server, http://lassoprot.cent.uw.edu.pl/, enables analysis of biopolymers with entangled configurations called lassos. The server offers various ways of visualizing lasso configurations, as well as their time trajectories, with all the results and plots downloadable. Broad spectrum of applications makes LassoProt a useful tool for biologists, biophysicists, chemists, polymer physicists and mathematicians. The server and our methods have been validated on the whole PDB, and the results constitute the database of proteins with complex lassos, supported with basic biological data. This database can serve as a source of information about protein geometry and entanglement-function correlations, as a reference set in protein modeling, and for many other purposes. PMID:27131383

  7. On-line control models for the Stanford Linear Collider

    SciTech Connect

    Sheppard, J.C.; Helm, R.H.; Lee, M.J.; Woodley, M.D.

    1983-03-01

    Models for computer control of the SLAC three-kilometer linear accelerator and damping rings have been developed as part of the control system for the Stanford Linear Collider. Some of these models have been tested experimentally and implemented in the control program for routine linac operations. This paper will describe the development and implementation of these models, as well as some of the operational results.

  8. SciServer Compute brings Analysis to Big Data in the Cloud

    NASA Astrophysics Data System (ADS)

    Raddick, Jordan; Medvedev, Dmitry; Lemson, Gerard; Souter, Barbara

    2016-06-01

    SciServer Compute uses Jupyter Notebooks running within server-side Docker containers attached to big data collections to bring advanced analysis to big data "in the cloud." SciServer Compute is a component in the SciServer Big-Data ecosystem under development at JHU, which will provide a stable, reproducible, sharable virtual research environment. SciServer builds on the popular CasJobs and SkyServer systems that made the Sloan Digital Sky Survey (SDSS) archive one of the most-used astronomical instruments. SciServer extends those systems with server-side computational capabilities and very large scratch storage space, and further extends their functions to a range of other scientific disciplines. Although big datasets like SDSS have revolutionized astronomy research, for further analysis, users are still restricted to downloading the selected data sets locally – but increasing data sizes make this local approach impractical. Instead, researchers need online tools that are co-located with data in a virtual research environment, enabling them to bring their analysis to the data. SciServer supports this using the popular Jupyter notebooks, which allow users to write their own Python and R scripts and execute them on the server with the data (extensions to Matlab and other languages are planned). We have written special-purpose libraries that enable querying the databases and other persistent datasets. Intermediate results can be stored in large scratch space (hundreds of TBs) and analyzed directly from within Python or R with state-of-the-art visualization and machine learning libraries. Users can store science-ready results in their permanent allocation on SciDrive, a Dropbox-like system for sharing and publishing files. Communication between the various components of the SciServer system is managed through SciServer's new Single Sign-on Portal. We have created a number of demos to illustrate the capabilities of SciServer Compute, including Python and R scripts
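
    A hedged illustration of the "bring the analysis to the data" pattern from a Compute notebook, assuming the SciServer Python client exposes a CasJobs query helper roughly as below; the module, function name, query context and table are assumptions based on the description above, so consult the SciServer documentation for the actual interface.

```python
# Hypothetical use of the SciServer Python client from a Compute notebook.
from SciServer import CasJobs          # assumed client module

sql = """
SELECT TOP 1000 objID, ra, dec, r
FROM PhotoObj
WHERE r BETWEEN 15 AND 16
"""
# Assumed signature; results are expected as a DataFrame-like table that can
# be analyzed in place on the server, next to the data.
stars = CasJobs.executeQuery(sql, context="DR12")
print(stars.describe())
```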

  9. MARSIS data and simulation exploited using array databases: PlanetServer/EarthServer for sounding radars

    NASA Astrophysics Data System (ADS)

    Cantini, Federico; Pio Rossi, Angelo; Orosei, Roberto; Baumann, Peter; Misev, Dimitar; Oosthoek, Jelmer; Beccati, Alan; Campalani, Piero; Unnithan, Vikram

    2014-05-01

    parallel computing has been developed and tested on a Tier 0 class HPC cluster computer located at CINECA, Bologna, Italy, to produce accurate simulations for the entire MARSIS dataset. Although the necessary computational resources have not yet been secured, through the HPC cluster at Jacobs University in Bremen it was possible to simulate a significant subset of orbits covering the area of the Medusae Fossae Formation (MFF), a seemingly soft, easily eroded deposit that extends for nearly 1,000 km along the equator of Mars (e.g. Watters et al., 2007; Carter et al., 2009). Besides the MARSIS data, simulations of the MARSIS surface clutter signal are included in the db to further improve its scientific value. Simulations will be available through the project portal to end users/scientists and they will eventually be provided in the PSA/PDS archives. References: Baumann, P. On the management of multidimensional discrete data. VLDB J. 4 (3), 401-444, Special Issue on Spatial Database Systems, 1994. Carter, L. M., Campbell, B. A., Watters, T. R., Phillips, R. J., Putzig, N. E., Safaeinili, A., Plaut, J., Okubo, C., Egan, A. F., Biccari, D., Orosei, R. (2009). Shallow radar (SHARAD) sounding observations of the Medusae Fossae Formation, Mars. Icarus, 199(2), 295-302. Nouvel, J.-F., Herique, A., Kofman, W., Safaeinili, A. 2004. Radar signal simulation: Surface modeling with the Facet Method. Radio Science 39, 1013. Oosthoek, J.H.P, Flahaut J., Rossi, A. P., Baumann, P., Misev, D., Campalani, P., Unnithan, V. (2013) PlanetServer: Innovative Approaches for the Online Analysis of Hyperspectral Satellite Data from Mars, Advances in Space Research. DOI: 10.1016/j.asr.2013.07.002 Picardi, G., and 33 colleagues 2005. Radar Soundings of the Subsurface of Mars. Science 310, 1925-1928. Rossi, A. P., Baumann, P., Oosthoek, J., Beccati, A., Cantini, F., Misev, D. Orosei, R., Flahaut, J., Campalani, P., Unnithan, V. (2014),Geophys. Res. Abs., Vol. 16, #EGU2014-5149, this meeting. Watters, T. R

  11. HMMER web server: interactive sequence similarity searching

    PubMed Central

    Finn, Robert D.; Clements, Jody; Eddy, Sean R.

    2011-01-01

    HMMER is a software suite for protein sequence similarity searches using probabilistic methods. Previously, HMMER has mainly been available only as a computationally intensive UNIX command-line tool, restricting its use. Recent advances in the software, HMMER3, have resulted in a 100-fold speed gain relative to previous versions. It is now feasible to make efficient profile hidden Markov model (profile HMM) searches via the web. A HMMER web server (http://hmmer.janelia.org) has been designed and implemented such that most protein database searches return within a few seconds. Methods are available for searching either a single protein sequence, multiple protein sequence alignment or profile HMM against a target sequence database, and for searching a protein sequence against Pfam. The web server is designed to cater to a range of different user expertise and accepts batch uploading of multiple queries at once. All search methods are also available as RESTful web services, thereby allowing them to be readily integrated as remotely executed tasks in locally scripted workflows. We have focused on minimizing search times and the ability to rapidly display tabular results, regardless of the number of matches found, developing graphical summaries of the search results to provide quick, intuitive appraisement of them. PMID:21593126
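
    A hedged sketch of scripting the RESTful interface mentioned above with the requests library; the endpoint path, parameter names and response layout are assumptions for illustration only, so consult the HMMER web server documentation for the real API.

```python
import requests

seq = ">query\nMKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQAPILSRVGDGTQDNLSGAEKAVQVKVKALPDAQ"
resp = requests.post(
    "http://hmmer.janelia.org/search/phmmer",    # assumed endpoint path
    data={"seq": seq, "seqdb": "pdb"},           # assumed parameter names
    headers={"Accept": "application/json"},
    timeout=60,
)
resp.raise_for_status()
hits = resp.json()
print("keys in response:", list(hits))           # inspect the (assumed) JSON structure
```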

  12. Cyberdemocracy and Online Politics: A New Model of Interactivity

    ERIC Educational Resources Information Center

    Ferber, Paul; Foltz, Franz; Pugliese, Rudy

    2007-01-01

    Building on McMillan's two-way model of interactivity, this study presents a three-way model of interactive communication, which is used to assess political Web sites' progress toward the ideals of cyberdemocracy and the fostering of public deliberation. Results of a 3-year study of state legislature Web sites, an analysis of the community…

  13. Web Server Security on Open Source Environments

    NASA Astrophysics Data System (ADS)

    Gkoutzelis, Dimitrios X.; Sardis, Manolis S.

    Administering critical resources has never been more difficult than it is today. In a changing world of software innovation where major changes occur on a daily basis, it is crucial for webmasters and server administrators to shield their data against an unknown arsenal of attacks in the hands of their attackers. Up until now this kind of defense was a privilege of the few; out-budgeted, low-cost solutions left the defender vulnerable to the uprising of innovative attacking methods. Luckily, the digital revolution of the past decade left its mark, changing the way we face security forever: open source infrastructure today covers all the prerequisites for a secure web environment in a way we could never imagine fifteen years ago. Online security of large corporations, military and government bodies is more and more handled by open source applications, thus driving the technological trend of the 21st century in adopting open solutions to E-Commerce and privacy issues. This paper describes substantial security precautions in facing privacy and authentication issues in a totally open source web environment. Our goal is to state and face the most known problems in data handling and consequently propose the most appealing techniques to face these challenges through an open solution.

  14. A decision-making process model of young online shoppers.

    PubMed

    Lin, Chin-Feng; Wang, Hui-Fang

    2008-12-01

    Based on the concepts of brand equity, means-end chain, and Web site trust, this study proposes a novel model called the consumption decision-making process of adolescents (CDMPA) to understand adolescents' Internet consumption habits and behavioral intention toward particular sporting goods. The findings of the CDMPA model can help marketers understand adolescents' consumption preferences and habits for developing effective Internet marketing strategies. PMID:19025465

  15. Online estimation algorithm for a biaxial ankle kinematic model with configuration dependent joint axes.

    PubMed

    Tsoi, Y H; Xie, S Q

    2011-02-01

    The kinematics of the human ankle is commonly modeled as a biaxial hinge joint model. However, significant variations in axis orientations have been found between different individuals and also between different foot configurations. For ankle rehabilitation robots, information regarding the ankle kinematic parameters can be used to estimate the ankle and subtalar joint displacements. This can in turn be used as auxiliary variables in adaptive control schemes to allow modification of the robot stiffness and damping parameters to reduce the forces applied at stiffer foot configurations. Due to the large variations observed in the ankle kinematic parameters, an online identification algorithm is required to provide estimates of the model parameters. An online parameter estimation routine based on the recursive least-squares (RLS) algorithm was therefore developed in this research. An extension of the conventional biaxial ankle kinematic model, which allows variation in axis orientations with different foot configurations had also been developed and utilized in the estimation algorithm. Simulation results showed that use of the extended model in the online algorithm is effective in capturing the foot orientation of a biaxial ankle model with variable joint axis orientations. Experimental results had also shown that a modified RLS algorithm that penalizes a deviation of model parameters from their nominal values can be used to obtain more realistic parameter estimates while maintaining a level of estimation accuracy comparable to that of the conventional RLS routine. PMID:21280877
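
    The recursive least-squares idea can be sketched compactly; the version below starts from nominal parameters and uses a small initial covariance so that estimates are pulled toward the nominal values, which is in the spirit of the modified RLS described above but is a generic sketch rather than the authors' algorithm.

```python
import numpy as np

class RegularizedRLS:
    """Recursive least squares that starts at nominal parameters.

    A small initial covariance acts as a penalty on deviating from the
    nominal parameter vector; larger `penalty` means a stiffer estimate.
    """
    def __init__(self, theta_nominal, penalty=10.0, forgetting=1.0):
        self.theta = np.asarray(theta_nominal, dtype=float)
        self.P = np.eye(len(self.theta)) / penalty
        self.lam = forgetting

    def update(self, phi, y):
        phi = np.asarray(phi, dtype=float)
        err = y - phi @ self.theta
        k = (self.P @ phi) / (self.lam + phi @ self.P @ phi)
        self.theta = self.theta + k * err
        self.P = (self.P - np.outer(k, phi @ self.P)) / self.lam
        return self.theta

# Example: identify y = 2*x1 - x2 starting from a nominal guess of [1.5, -0.5].
rls = RegularizedRLS([1.5, -0.5], penalty=5.0)
rng = np.random.default_rng(0)
for _ in range(200):
    x = rng.normal(size=2)
    rls.update(x, 2.0 * x[0] - 1.0 * x[1] + 0.01 * rng.normal())
print(rls.theta)
```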

  16. Oceanotron, Scalable Server for Marine Observations

    NASA Astrophysics Data System (ADS)

    Loubrieu, T.; Bregent, S.; Blower, J. D.; Griffiths, G.

    2013-12-01

    Ifremer, the French marine institute, is deeply involved in data management for different ocean in-situ observation programs (ARGO, OceanSites, GOSUD, ...) and other European programs aiming at networking ocean in-situ observation data repositories (myOcean, seaDataNet, Emodnet). To capitalize on the effort of implementing advanced data dissemination services (visualization, download with subsetting) for these programs and, generally speaking, for water-column observation repositories, Ifremer decided to develop the oceanotron server (2010). Given the diversity of data repository formats (RDBMS, netCDF, ODV, ...) and the temperamental nature of the standard interoperability interface profiles (OGC/WMS, OGC/WFS, OGC/SOS, OpeNDAP, ...), the server is designed to manage plugins: - StorageUnits: which enable reading specific data repository formats (netCDF/OceanSites, RDBMS schema, ODV binary format). - FrontDesks: which receive external requests and send results for interoperable protocols (OGC/WMS, OGC/SOS, OpenDAP). In between, a third type of plugin may be inserted: - TransformationUnits: which enable ocean-business-related transformations of the features (for example conversion of vertical coordinates from pressure in dB to meters under the sea surface). The server is released under an open-source license so that partners can develop their own plugins. Within the MyOcean project, the University of Reading has plugged in a WMS implementation as an oceanotron FrontDesk. The modules are connected together by sharing the same information model for marine observations (or sampling features: vertical profiles, point series and trajectories), dataset metadata and queries. The shared information model is based on the OGC/Observation & Measurement and Unidata/Common Data Model initiatives. The model is implemented in Java (http://www.ifremer.fr/isi/oceanotron/javadoc/). This inner-interoperability level enables capitalizing on ocean business expertise in software development without being indentured to
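
    The plugin roles can be illustrated with small interface stubs; the real server and its shared information model are implemented in Java, so the Python classes and method names below are only an analogue of the StorageUnit / TransformationUnit / FrontDesk split, not oceanotron's actual API.

```python
from abc import ABC, abstractmethod

class StorageUnit(ABC):
    @abstractmethod
    def read_features(self, query):
        """Read sampling features (profiles, point series, trajectories)."""

class TransformationUnit(ABC):
    @abstractmethod
    def transform(self, features):
        """Apply a domain transformation, e.g. pressure-to-depth conversion."""

class FrontDesk(ABC):
    @abstractmethod
    def handle(self, request, storage, transforms):
        """Answer an interoperable-protocol request (WMS, SOS, OPeNDAP, ...)."""

class NetCDFStorage(StorageUnit):
    def read_features(self, query):
        # Hard-coded toy profile standing in for a netCDF/OceanSites read.
        return [{"type": "profile", "pressure_db": [0, 10, 20], "temp": [15.1, 14.8, 14.2]}]

class PressureToDepth(TransformationUnit):
    def transform(self, features):
        for f in features:
            f["depth_m"] = [p * 0.99 for p in f.pop("pressure_db")]  # crude conversion
        return features

storage, transform = NetCDFStorage(), PressureToDepth()
print(transform.transform(storage.read_features(query=None)))
```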

  17. Architectural Improvements and New Processing Tools for the Open XAL Online Model

    SciTech Connect

    Allen, Christopher K; Pelaia II, Tom; Freed, Jonathan M

    2015-01-01

    The online model is the component of Open XAL providing accelerator modeling, simulation, and dynamic synchronization to live hardware. Significant architectural changes and feature additions have been recently made in two separate areas: 1) the managing and processing of simulation data, and 2) the modeling of RF cavities. Simulation data and data processing have been completely decoupled. A single class manages all simulation data while standard tools were developed for processing the simulation results. RF accelerating cavities are now modeled as composite structures where parameter and dynamics computations are distributed. The beam and hardware models both maintain their relative phase information, which allows for dynamic phase slip and elapsed time computation.

  18. Exploring the Earth System through online interactive models

    NASA Astrophysics Data System (ADS)

    Coogan, L. A.

    2013-12-01

    Upper level Earth Science students commonly have a strong background of mathematical training from Math courses; however, their ability to use mathematical models to solve Earth Science problems is commonly limited. Their difficulty arises, in part, from the nature of the subject matter. There is a large body of background 'conceptual' and 'observational' understanding and knowledge required in the Earth Sciences before in-depth quantification becomes useful. For example, it is difficult to answer questions about geological processes until you can identify minerals and rocks and understand the general geodynamic implications of their associations. However, science is fundamentally quantitative. To become scientists students have to translate their conceptual understanding into quantifiable models. Thus, it is desirable for students to become comfortable with using mathematical models to test hypotheses. With the aim of helping to bridge the gap between conceptual understanding and quantification, I have started to build an interactive teaching website based around quantitative models of Earth System processes. The site is aimed at upper-level undergraduate students and spans a range of topics that will continue to grow as time allows. The mathematical models are all built for the students, allowing them to spend their time thinking about how the 'model world' changes in response to their manipulation of the input variables. The web site is divided into broad topics or chapters (Background, Solid Earth, Ocean and Atmosphere, Earth history); within each chapter there are different subtopics (e.g. Solid Earth: Core, Mantle, Crust), and each of these has individual webpages. Each webpage, or topic, starts with an introduction to the topic, followed by an interactive model in which the students can use sliders to control the inputs and watch how the results change. This interaction between student and model is guided by a series of multiple choice questions that

  19. Peer Assessment with Online Tools to Improve Student Modeling

    ERIC Educational Resources Information Center

    Atkins, Leslie J.

    2012-01-01

    Introductory physics courses often require students to develop precise models of phenomena and represent these with diagrams, including free-body diagrams, light-ray diagrams, and maps of field lines. Instructors expect that students will adopt a certain rigor and precision when constructing these diagrams, but we want that rigor and precision to…

  20. Understanding the Effectiveness of Online Peer Assessment: A Path Model

    ERIC Educational Resources Information Center

    Lu, Jingyan; Zhang, Zhidong

    2012-01-01

    Peer assessment has been implemented in schools as both a learning tool and an assessment tool. Earlier studies have explored the effectiveness of peer assessment from different perspectives, such as domain knowledge and skills, peer assessment skills, and attitude changes. However, there is no holistic model describing the effects of cognitive…

  1. Towards a Social Networks Model for Online Learning & Performance

    ERIC Educational Resources Information Center

    Chung, Kon Shing Kenneth; Paredes, Walter Christian

    2015-01-01

    In this study, we develop a theoretical model to investigate the association between social network properties, "content richness" (CR) in academic learning discourse, and performance. CR is the extent to which one contributes content that is meaningful, insightful and constructive to aid learning and by social network properties we…

  2. Promoting Continuous Quality Improvement in Online Teaching: The META Model

    ERIC Educational Resources Information Center

    Dittmar, Eileen; McCracken, Holly

    2012-01-01

    Experienced e-learning faculty members share strategies for implementing a comprehensive postsecondary faculty development program essential to continuous improvement of instructional skills. The high-impact META Model (centered around Mentoring, Engagement, Technology, and Assessment) promotes information sharing and content creation, and fosters…

  3. UniTree Name Server internals

    SciTech Connect

    Mecozzi, D.; Minton, J.

    1996-01-01

    The UniTree Name Server (UNS) is one of several servers which make up the UniTree storage system. The Name Server is responsible for mapping names to capabilities. Names are generally human-readable ASCII strings of any length. Capabilities are unique 256-bit identifiers that point to files, directories, or symbolic links. The Name Server implements a UNIX style hierarchical directory structure to facilitate name-to-capability mapping. The principal task of the Name Server is to manage the directories which make up the UniTree directory structure. The principal clients of the Name Server are the FTP Daemon, NFS and a few UniTree utility routines. However, the Name Server is a generalized server and will accept messages from any client. The purpose of this paper is to describe the internal workings of the UniTree Name Server. In cases where it seems appropriate, the motivation for a particular choice of algorithm as well as a description of the algorithm itself will be given.
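
    The name-to-capability mapping over a hierarchical namespace can be sketched as follows, with capabilities as random 256-bit identifiers; this is a toy illustration of the mapping described above, not UniTree's internal data structures.

```python
import os

class NameServer:
    """Toy name-to-capability map with a UNIX-style directory hierarchy."""
    def __init__(self):
        self.root = {}                        # nested dicts model directories

    def _walk(self, path, create=False):
        node = self.root
        parts = [p for p in path.strip("/").split("/") if p]
        for part in parts[:-1]:
            if part not in node:
                if not create:
                    raise KeyError(path)
                node[part] = {}
            node = node[part]
        return node, parts[-1]

    def bind(self, path):
        """Create a new 256-bit capability for a name and return it."""
        node, name = self._walk(path, create=True)
        node[name] = os.urandom(32)
        return node[name]

    def resolve(self, path):
        node, name = self._walk(path)
        return node[name]

ns = NameServer()
cap = ns.bind("/home/user/data.txt")
assert ns.resolve("/home/user/data.txt") == cap
print(cap.hex())
```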

  4. WSKE: Web Server Key Enabled Cookies

    NASA Astrophysics Data System (ADS)

    Masone, Chris; Baek, Kwang-Hyun; Smith, Sean

    In this paper, we present the design and prototype of a new approach to cookie management: if a server deposits a cookie only after authenticating itself via the SSL handshake, the browser will return the cookie only to a server that can authenticate itself, via SSL, to the same keypair. This approach can enable usable but secure client authentication. This approach can improve the usability of server authentication by clients. This approach is superior to the prior work on Active Cookies in that it defends against both DNS spoofing and IP spoofing—and does not require binding a user's interaction with a server to individual IP addresses.
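
    The core WSKE policy can be sketched as a cookie jar keyed by the depositing server's public key rather than by hostname or IP; the class below is a toy illustration of that policy, with the SSL handshake itself left out of scope.

```python
class KeyedCookieJar:
    """Return cookies only to the keypair that deposited them."""
    def __init__(self):
        self._store = {}                          # (server_pubkey, name) -> value

    def deposit(self, server_pubkey, name, value):
        self._store[(server_pubkey, name)] = value

    def cookies_for(self, server_pubkey):
        return {n: v for (k, n), v in self._store.items() if k == server_pubkey}

jar = KeyedCookieJar()
jar.deposit(b"bank-public-key", "session", "abc123")
print(jar.cookies_for(b"bank-public-key"))     # {'session': 'abc123'}
print(jar.cookies_for(b"phisher-public-key"))  # {} -- a spoofed DNS name or IP gets nothing
```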

  5. National Medical Terminology Server in Korea

    NASA Astrophysics Data System (ADS)

    Lee, Sungin; Song, Seung-Jae; Koh, Soonjeong; Lee, Soo Kyoung; Kim, Hong-Gee

    Interoperable EHR (Electronic Health Record) necessitates at least the use of standardized medical terminologies. This paper describes a medical terminology server, LexCare Suite, which houses terminology management applications, such as a terminology editor, and a terminology repository populated with international standard terminology systems such as the Systematized Nomenclature of Medicine (SNOMED). The server is intended to satisfy the need of local primary-to-tertiary hospitals for quality terminology systems. Our partner general hospitals have used the server to test its applicability. This paper describes the server and the results of the applicability test.

  6. Opportunities for the Mashup of Heterogeneous Data Server via Semantic Web Technology

    NASA Astrophysics Data System (ADS)

    Ritschel, Bernd; Seelus, Christoph; Neher, Günther; Iyemori, Toshihiko; Koyama, Yukinobu; Yatagai, Akiyo; Murayama, Yasuhiro; King, Todd; Hughes, John; Fung, Shing; Galkin, Ivan; Hapgood, Michael; Belehaki, Anna

    2015-04-01

    The European Union ESPAS, Japanese IUGONET and GFZ ISDC data servers were developed for the ingestion, archiving and distribution of geo- and space-science domain data. The main parts of the data managed by these servers are related to near-Earth space and geomagnetic field observations. A smart mashup of the data servers would allow seamless browsing of and access to data and related context information. However, achieving a high level of interoperability is a challenge because the servers are based on different data models and software frameworks. This paper focuses on the latest experiments and results for the mashup of the data servers using the semantic Web approach. Besides the mashup of domain and terminological ontologies, the options for connecting data managed by relational databases using D2R server and SPARQL technology are addressed. A successful realization of the data server mashup would have a positive impact not only on the data users of the specific scientific domain but also on related projects, such as the development of a new interoperable version of NASA's Planetary Data System (PDS) or the ICSU World Data System alliance. ESPAS data server: https://www.espas-fp7.eu/portal/ IUGONET data server: http://search.iugonet.org/iugonet/ GFZ ISDC data server (semantic Web based prototype): http://rz-vm30.gfz-potsdam.de/drupal-7.9/ NASA PDS: http://pds.nasa.gov ICSU-WDS: https://www.icsu-wds.org
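
    Connecting relational holdings through D2R and querying them with SPARQL, as described above, generally comes down to issuing SPARQL queries against the exposed endpoints. The sketch below (Python with the SPARQLWrapper package) shows the general pattern; the endpoint URL and vocabulary terms are placeholders, not the actual ESPAS, IUGONET or ISDC endpoints.

      # Sketch of querying a D2R/SPARQL endpoint; endpoint and terms are placeholders.
      from SPARQLWrapper import SPARQLWrapper, JSON

      endpoint = SPARQLWrapper("http://example.org/d2r/sparql")   # hypothetical endpoint
      endpoint.setReturnFormat(JSON)
      endpoint.setQuery("""
          PREFIX dct: <http://purl.org/dc/terms/>
          SELECT ?dataset ?title WHERE {
              ?dataset dct:title ?title .
              FILTER regex(?title, "geomagnetic", "i")
          } LIMIT 20
      """)

      for row in endpoint.query().convert()["results"]["bindings"]:
          print(row["dataset"]["value"], "-", row["title"]["value"])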

  7. The NASA Technical Report Server

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Gottlich, Gretchen L.; Bianco, David J.; Paulson, Sharon S.; Binkley, Robert L.; Kellogg, Yvonne D.; Beaumont, Chris J.; Schmunk, Robert B.; Kurtz, Michael J.; Accomazzi, Alberto

    1995-01-01

    The National Aeronautics and Space Act of 1958 established NASA and charged it to "provide for the widest practicable and appropriate dissemination of information concerning its activities and the results thereof." The search for innovative methods to distribute NASA's information led a grass-roots team to create the NASA Technical Report Server (NTRS), which uses the World Wide Web and other popular Internet-based information systems as search engines. The NTRS is an inter-center effort which provides uniform access to various distributed publication servers residing on the Internet. Users have immediate desktop access to technical publications from NASA centers and institutes. The NTRS comprises several units, some constructed especially for inclusion in NTRS, and others that are existing NASA publication services that NTRS reuses. This paper presents the NTRS architecture, usage metrics, and the lessons learned while implementing and maintaining the service. The NTRS is largely constructed with freely available software running on existing hardware. NTRS builds upon existing hardware and software, and the resulting additional exposure for the body of literature contained ensures that NASA's institutional knowledge base will continue to receive the widest practicable and appropriate dissemination.

  8. HDF-EOS Web Server

    NASA Technical Reports Server (NTRS)

    Ullman, Richard; Bane, Bob; Yang, Jingli

    2008-01-01

    A shell script has been written as a means of automatically making HDF-EOS-formatted data sets available via the World Wide Web. ("HDF-EOS" and variants thereof are defined in the first of the two immediately preceding articles.) The shell script chains together some software tools developed by the Data Usability Group at Goddard Space Flight Center to perform the following actions: extract metadata in Object Definition Language (ODL) from an HDF-EOS file; convert the metadata from ODL to Extensible Markup Language (XML); reformat the XML metadata into human-readable Hypertext Markup Language (HTML); publish the HTML metadata and the original HDF-EOS file to a Web server and an Open-source Project for a Network Data Access Protocol (OPeNDAP) server; and reformat the XML metadata and submit the resulting file to the EOS Clearinghouse, which is a Web-based metadata clearinghouse that facilitates searching for, and exchange of, Earth-science data.
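
    The chain of steps the script performs (extract ODL metadata, convert to XML, render HTML, publish) can be outlined in miniature. The sketch below is a much-simplified, standard-library stand-in: it reads flat KEY = VALUE pairs rather than real nested ODL, and it is not the Goddard tool chain itself, only an illustration of the pipeline's shape.

      # Much-simplified stand-in for the ODL -> XML -> HTML -> publish chain.
      import os
      import shutil
      import xml.etree.ElementTree as ET
      from xml.sax.saxutils import escape

      def parse_flat_metadata(text):
          # Real ODL has nested GROUP/OBJECT blocks; this toy reader handles KEY = VALUE lines.
          pairs = {}
          for line in text.splitlines():
              if "=" in line:
                  key, value = line.split("=", 1)
                  pairs[key.strip()] = value.strip().strip('"')
          return pairs

      def to_xml(pairs):
          root = ET.Element("metadata")
          for key, value in pairs.items():
              ET.SubElement(root, "attribute", name=key).text = value
          return ET.tostring(root, encoding="unicode")

      def to_html(pairs):
          rows = "".join(f"<tr><td>{escape(k)}</td><td>{escape(v)}</td></tr>"
                         for k, v in pairs.items())
          return f"<html><body><table>{rows}</table></body></html>"

      def publish(granule_path, html, out_dir="served"):
          # Stand-in for pushing the granule and its HTML summary to the web/OPeNDAP host.
          os.makedirs(out_dir, exist_ok=True)
          shutil.copy(granule_path, out_dir)
          with open(os.path.join(out_dir, "metadata.html"), "w") as f:
              f.write(html)

      pairs = parse_flat_metadata('SHORTNAME = "EXAMPLE"\nDAYNIGHTFLAG = "Day"')
      print(to_xml(pairs))
      print(to_html(pairs))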

  9. On-line simulations of models for backward masking.

    PubMed

    Francis, Gregory

    2003-11-01

    Five simulations of quantitative models of visual backward masking are available on the Internet at http://www.psych.purdue.edu/-gfrancis/Publications/BackwardMasking/. The simulations can be run in a Web browser that supports the Java programming language. This article describes the motivation for making the simulations available and gives a brief introduction as to how the simulations are used. The source code is available on the Web page, and this article describes how the code is organized. PMID:14748495

  10. Peer Assessment with Online Tools to Improve Student Modeling

    NASA Astrophysics Data System (ADS)

    Atkins, Leslie J.

    2012-11-01

    Introductory physics courses often require students to develop precise models of phenomena and represent these with diagrams, including free-body diagrams, light-ray diagrams, and maps of field lines. Instructors expect that students will adopt a certain rigor and precision when constructing these diagrams, but we want that rigor and precision to be an aid to sense-making rather than meeting seemingly arbitrary requirements set by the instructor. By giving students the authority to develop their own models and establish requirements for their diagrams, the sense that these are arbitrary requirements diminishes and students are more likely to see modeling as a sense-making activity. The practice of peer assessment can help students take ownership; however, it can be difficult for instructors to manage. Furthermore, it is not without risk: students can be reluctant to critique their peers, they may view this as the job of the instructor, and there is no guarantee that students will employ greater rigor and precision as a result of peer assessment. In this article, we describe one approach for peer assessment that can establish norms for diagrams in a way that is student driven, where students retain agency and authority in assessing and improving their work. We show that such an approach does indeed improve students' diagrams and abilities to assess their own work, without sacrificing students' authority and agency.

  11. An epidemic model of rumor diffusion in online social networks

    NASA Astrophysics Data System (ADS)

    Cheng, Jun-Jun; Liu, Yun; Shen, Bo; Yuan, Wei-Guo

    2013-01-01

    So far, in some standard rumor-spreading models, the transition probability from ignorants to spreaders is treated as a constant. In practice, however, whether an individual is infected by a neighboring spreader depends greatly on the trustworthiness of the tie between them. To address this, we introduce a stochastic epidemic model of rumor diffusion in which the infection probability is defined as a function of the strength of ties. Moreover, we investigate numerically the behavior of the model on a real scale-free social site with exponent γ = 2.2. We verify that the strength of ties plays a critical role in the rumor diffusion process. In particular, preferentially selecting weak ties does not make the rumor spread faster or wider, but the efficiency of diffusion is greatly affected after removing them. Another significant finding is that the maximum number of spreaders max(S) is very sensitive to the immune probability μ and the decay probability v. We show that a smaller μ or v leads to a larger spreading of the rumor, and their relationship can be described by the function ln(max(S)) = Av + B, in which the intercept B and the slope A can be fitted as power-law functions of μ. Our findings may offer some useful insights, helping guide applications in practice and reduce the damage caused by rumors.
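
    A minimal sketch of the central mechanism, an infection probability that depends on tie strength rather than being constant, is given below. The network generator, tie-strength values and parameters are illustrative assumptions, not the social-site data or fits used in the study.

      # Minimal stochastic rumor-spreading step with tie-strength-dependent infection.
      import random
      import networkx as nx

      def spread_step(g, state, mu=0.1, beta=0.5):
          """state[n] in {'ignorant', 'spreader', 'stifler'}; edge attribute 'w' is tie strength."""
          new_state = dict(state)
          for u in [n for n in g if state[n] == "spreader"]:
              for v in g.neighbors(u):
                  w = g[u][v].get("w", 1.0)
                  if state[v] == "ignorant" and random.random() < beta * w:
                      new_state[v] = "spreader"      # infection probability scales with tie strength
              if random.random() < mu:
                  new_state[u] = "stifler"           # spreader becomes immune/stifler
          return new_state

      g = nx.barabasi_albert_graph(1000, 3)          # stand-in scale-free network
      for u, v in g.edges():
          g[u][v]["w"] = random.random()             # assumed tie strengths in (0, 1)
      state = {n: "ignorant" for n in g}
      state[0] = "spreader"
      for _ in range(50):
          state = spread_step(g, state)
      print(sum(s == "spreader" for s in state.values()), "spreaders after 50 steps")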

  12. Online motor fault detection and diagnosis using a hybrid FMM-CART model.

    PubMed

    Seera, Manjeevan; Lim, Chee Peng

    2014-04-01

    In this brief, a hybrid model combining the fuzzy min-max (FMM) neural network and the classification and regression tree (CART) for online motor fault detection and diagnosis tasks is described. The hybrid model, known as FMM-CART, exploits the advantages of both FMM and CART for undertaking data classification and rule extraction problems. To evaluate the applicability of the proposed FMM-CART model, an evaluation with a benchmark data set pertaining to electrical motor bearing faults is first conducted. The results obtained are equivalent to those reported in the literature. Then, a laboratory experiment for detecting and diagnosing eccentricity faults in an induction motor is performed. In addition to producing accurate results, useful rules in the form of a decision tree are extracted to provide explanation and justification for the predictions from FMM-CART. The experimental outcome positively shows the potential of FMM-CART in undertaking online motor fault detection and diagnosis tasks. PMID:24807956

  13. Evaluation of surface ozone simulated by the WRF/CMAQ online modelling system

    NASA Astrophysics Data System (ADS)

    Marougianni, Garyfalia; Katragkou, Eleni; Giannaros, Theodoros; Poupkou, Anastasia; Melas, Dimitris; Zanis, Prodromos; Feidas, Haralambos

    2013-04-01

    In this work we evaluate the online model WRF/CMAQ with respect to surface ozone and compare its performance with an off-line modelling system (WRF/CAMx) that has been operationally used by the Aristotle University of Thessaloniki (AUTH) for chemical weather forecasting in the Mediterranean. The online model consists of the mesoscale meteorological model WRF3.3 and the air quality model CMAQ5.0.1, which are coupled at every time-step. The modelling domain covers Europe with a resolution of 30 km (identical projection for meteorological and chemistry simulations to avoid interpolation errors) and CMAQ has 17 vertical layers extending up to 15 km. Anthropogenic emissions are prepared according to the SNAP nomenclature and the biogenic emissions are provided by the Natural Emission Model (NEMO) developed by AUTH. A 2-month simulation is performed by WRF/CMAQ covering the period of June-July 2010. Average monthly concentration values obtained from the MACCII service (IFS-Mozart) are used as chemical boundary conditions for the simulations. For the WRF simulations, boundary conditions are provided by the ECMWF. The same boundaries, chemical mechanism (CBV), emissions and model setup are used in the off-line WRF/CAMx in order to allow a more direct comparison of model results. To evaluate the performance of the WRF/CMAQ online model, simulated ozone concentrations are compared against near-surface ozone measurements from the EMEP network. The model has also been validated with the climatic observational database compiled in the framework of the GEOCLIMA project (http://www.geoclima.eu/). In the evaluation analysis, only those stations that fulfill the criterion of 75% data availability for near-surface ozone are used. Various statistical metrics are used for the model evaluation, including the correlation coefficient (R), normalized standard deviation (NSD) and modified normalized mean bias (MNMB). The final aim is to investigate whether the state-of-the-art WRF
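
    The statistical metrics mentioned, the correlation coefficient (R), normalized standard deviation (NSD) and modified normalized mean bias (MNMB), can be computed as in the short sketch below; this is a generic implementation for paired model and observation series, not the authors' evaluation code.

      # Generic evaluation metrics for paired model (f) and observation (o) series.
      import numpy as np

      def correlation(f, o):
          return np.corrcoef(f, o)[0, 1]

      def nsd(f, o):
          # Normalized standard deviation: sigma_model / sigma_obs.
          return np.std(f) / np.std(o)

      def mnmb(f, o):
          # Modified normalized mean bias: (2/N) * sum((f - o) / (f + o)).
          f, o = np.asarray(f, float), np.asarray(o, float)
          return 2.0 * np.mean((f - o) / (f + o))

      obs   = np.array([42.0, 55.0, 61.0, 48.0])   # e.g. hourly surface ozone, ppb (made-up values)
      model = np.array([39.0, 58.0, 70.0, 45.0])
      print(correlation(model, obs), nsd(model, obs), mnmb(model, obs))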

  14. PlanetServer/EarthServer: Big Data analytics in Planetary Science

    NASA Astrophysics Data System (ADS)

    Pio Rossi, Angelo; Oosthoek, Jelmer; Baumann, Peter; Beccati, Alan; Cantini, Federico; Misev, Dimitar; Orosei, Roberto; Flahaut, Jessica; Campalani, Piero; Unnithan, Vikram

    2014-05-01

    Planetary data are freely available in PDS/PSA archives and the like (e.g. Heather et al., 2013). Their exploitation by the community is somewhat limited by the variable availability of calibrated/higher-level datasets. An additional complexity of these multi-experiment, multi-mission datasets is related to the heterogeneity of the data themselves, rather than their volume. Orbital data, so far, are best suited for inclusion in array databases (Baumann et al., 1994). Most lander- or rover-based remote sensing experiments (and possibly in-situ ones as well) are suitable for similar approaches, although the complexity of coordinate reference systems (CRS) is higher in the latter case. PlanetServer, the Planetary Service of the EC FP7 e-infrastructure project EarthServer (http://earthserver.eu), is a state-of-the-art online data exploration and analysis system based on Open Geospatial Consortium (OGC) standards for Mars orbital data. It provides access to topographic, panchromatic, multispectral and hyperspectral calibrated data. While its core focus has been on hyperspectral data analysis through the OGC Web Coverage Processing Service (Oosthoek et al., 2013; Rossi et al., 2013), the Service has progressively expanded to also host sounding radar data (Cantini et al., this volume). Additionally, both single-swath and mosaicked imagery and topographic data deriving from the HRSC experiment are being added to the Service (e.g. Jaumann et al., 2007; Gwinner et al., 2009). The current Mars-centric focus can be extended to other planetary bodies; most components are general purpose, making application to the Moon, Mercury or similar bodies possible. The Planetary Service of EarthServer is accessible at http://www.planetserver.eu References: Baumann, P. (1994) VLDB J. 4 (3), 401-444, Special Issue on Spatial Database Systems. Cantini, F. et al. (2014) Geophys. Res. Abs., Vol. 16, #EGU2014-3784, this volume. Heather, D., et al. (2013) EuroPlanet Sci. Congr. #EPSC2013-626. Gwinner, K

  15. Constructs of Student-Centered Online Learning on Learning Satisfaction of a Diverse Online Student Body: A Structural Equation Modeling Approach

    ERIC Educational Resources Information Center

    Ke, Fengfeng; Kwak, Dean

    2013-01-01

    The present study investigated the relationships between constructs of web-based student-centered learning and the learning satisfaction of a diverse online student body. Hypotheses on the constructs of student-centered learning were tested using structural equation modeling. The results indicated that five key constructs of student-centered…

  16. Using Structural Equation Modeling to Validate Online Game Players' Motivations Relative to Self-Concept and Life Adaptation

    ERIC Educational Resources Information Center

    Yang, Shu Ching; Huang, Chiao Ling

    2013-01-01

    This study aimed to validate a systematic instrument to measure online players' motivations for playing online games (MPOG) and examine how the interplay of differential motivations impacts young gamers' self-concept and life adaptation. Confirmatory factor analysis determined that a hierarchical model with a two-factor structure of…

  17. Students' Performance at Tutorial Online of Social Studies through the Use of Learning Cycle Model

    ERIC Educational Resources Information Center

    Farisi, Mohammad Imam

    2014-01-01

    The purpose of the study is to describe student's performance in tutorial online (tuton) of Social Studies through developing the 5Es--Engage Explore, Explain, Elaborate, and Evaluate--Learning Cycle Model (the 5Es-LCM). The study conducted at UT-Online portal uses the Research and Development (R&D) method. The research subjects consisted…

  18. When Disney Meets the Research Park: Metaphors and Models for Engineering an Online Learning Community of Tomorrow

    ERIC Educational Resources Information Center

    Chenail, Ronald J.

    2004-01-01

    It is suggested that educators look to an environment in which qualitative research can be learned in more flexible and creative ways--an online learning community known as the Research Park Online (RPO). This model, based upon Walt Disney's 1966 plan for his "Experimental Prototype Community of Tomorrow" (EPCOT) and university cooperative…

  19. Online, On Demand Access to Coastal Digital Elevation Models

    NASA Astrophysics Data System (ADS)

    Long, J.; Bristol, S.; Long, D.; Thompson, S.

    2014-12-01

    Process-based numerical models for coastal waves, water levels, and sediment transport are initialized with digital elevation models (DEM) constructed by interpolating and merging bathymetric and topographic elevation data. These gridded surfaces must seamlessly span the land-water interface and may cover large regions where the individual raw data sources are collected at widely different spatial and temporal resolutions. In addition, the datasets are collected from different instrument platforms with varying accuracy and may or may not overlap in coverage. The lack of available tools and difficulties in constructing these DEMs lead scientists to 1) rely on previously merged, outdated, or over-smoothed DEMs; 2) discard more recent data that covers only a portion of the DEM domain; and 3) use inconsistent methodologies to generate DEMs. The objective of this work is to address the immediate need of integrating land and water-based elevation data sources and streamline the generation of a seamless data surface that spans the terrestrial-marine boundary. To achieve this, the U.S. Geological Survey (USGS) is developing a web processing service to format and initialize geoprocessing tasks designed to create coastal DEMs. The web processing service is maintained within the USGS ScienceBase data management system and has an associated user interface. Through the map-based interface, users define a geographic region that identifies the bounds of the desired DEM and a time period of interest. This initiates a query for elevation datasets within federal science agency data repositories. A geoprocessing service is then triggered to interpolate, merge, and smooth the data sources creating a DEM based on user-defined configuration parameters. Uncertainty and error estimates for the DEM are also returned by the geoprocessing service. Upon completion, the information management platform provides access to the final gridded data derivative and saves the configuration parameters
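
    The core geoprocessing step, interpolating scattered bathymetric and topographic points onto one seamless grid, can be sketched with SciPy's gridding routine. This is a generic illustration of the merge-and-interpolate idea, not the USGS ScienceBase geoprocessing service itself; the coordinates and elevations are made up.

      # Merge scattered topographic (positive) and bathymetric (negative) points
      # into one seamless elevation grid spanning the land-water interface.
      import numpy as np
      from scipy.interpolate import griddata

      topo  = np.array([[0.0, 0.0, 3.2], [0.5, 0.2, 2.1], [1.0, 0.8, 4.0]])    # x, y, z (m)
      bathy = np.array([[0.2, 1.0, -5.5], [0.8, 1.5, -8.0], [1.5, 1.2, -3.1]])
      points = np.vstack([topo, bathy])

      xi = np.linspace(0.0, 1.5, 50)
      yi = np.linspace(0.0, 1.5, 50)
      gx, gy = np.meshgrid(xi, yi)

      dem = griddata(points[:, :2], points[:, 2], (gx, gy), method="linear")
      print(np.nanmin(dem), np.nanmax(dem))   # grid spans below and above sea level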

  20. Evaluation of Major Online Diabetes Risk Calculators and Computerized Predictive Models

    PubMed Central

    Stiglic, Gregor; Pajnkihar, Majda

    2015-01-01

    Classical paper-and-pencil based risk assessment questionnaires are often accompanied by online versions of the questionnaire to reach a wider population. This study focuses on the loss, especially in risk estimation performance, that can be inflicted by direct transformation from the paper to online versions of risk estimation calculators by ignoring the possibilities of more complex and accurate calculations that online calculators can perform. We empirically compare the risk estimation performance between four major diabetes risk calculators and two more advanced predictive models. National Health and Nutrition Examination Survey (NHANES) data from 1999–2012 was used to evaluate the performance of detecting diabetes and pre-diabetes. The American Diabetes Association risk test achieved the best predictive performance in the category of classical paper-and-pencil based tests, with an Area Under the ROC Curve (AUC) of 0.699 for undiagnosed diabetes (0.662 for pre-diabetes) and 47% of persons selected for screening (47% for pre-diabetes). Our results demonstrate a significant difference in performance, with the additional benefit of a lower number of persons selected for screening, when statistical methods are used. The best AUC overall was obtained in diabetes risk prediction using logistic regression, with an AUC of 0.775 (0.734) and an average of 34% (48%) of persons selected for screening. However, generalized boosted regression models might be a better option from the economical point of view, as the number of persons selected for screening of 30% (47%) is significantly lower for diabetes risk assessment in comparison to logistic regression (p < 0.001), with a significantly higher AUC (p < 0.001) of 0.774 (0.740) for the pre-diabetes group. Our results demonstrate a serious lack of predictive performance in four major online diabetes risk calculators. Therefore, one should take great care and consider optimizing the online versions of questionnaires that were
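
    The comparison the authors draw, simple point-score questionnaires versus statistical models such as logistic regression or boosted trees judged by AUC, follows the standard pattern sketched below (scikit-learn on synthetic data; this is not the NHANES analysis itself).

      # Generic risk-model comparison by AUC on held-out data (synthetic stand-in for NHANES).
      from sklearn.datasets import make_classification
      from sklearn.ensemble import GradientBoostingClassifier
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score
      from sklearn.model_selection import train_test_split

      X, y = make_classification(n_samples=5000, n_features=8, weights=[0.9], random_state=0)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

      for name, model in [("logistic regression", LogisticRegression(max_iter=1000)),
                          ("boosted trees", GradientBoostingClassifier())]:
          model.fit(X_tr, y_tr)
          auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
          print(f"{name}: AUC = {auc:.3f}")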

  1. A Web Server for MACCS Magnetometer Data

    NASA Technical Reports Server (NTRS)

    Engebretson, Mark J.

    1998-01-01

    NASA Grant NAG5-3719 was provided to Augsburg College to support the development of a web server for the Magnetometer Array for Cusp and Cleft Studies (MACCS), a two-dimensional array of fluxgate magnetometers located at cusp latitudes in Arctic Canada. MACCS was developed as part of the National Science Foundation's GEM (Geospace Environment Modeling) Program, which was designed in part to complement NASA's Global Geospace Science programs during the decade of the 1990s. This report describes the successful use of these grant funds to support a working web page that provides both daily plots and file access to any user accessing the worldwide web. The MACCS home page can be accessed at http://space.augsburg.edu/space/MaccsHome.html.

  2. IIR filtering based adaptive active vibration control methodology with online secondary path modeling using PZT actuators

    NASA Astrophysics Data System (ADS)

    Boz, Utku; Basdogan, Ipek

    2015-12-01

    Structural vibration is a major cause of noise problems, discomfort and mechanical failures in aerospace, automotive and marine systems, which are mainly composed of plate-like structures. In order to reduce structural vibrations on these structures, active vibration control (AVC) is an effective approach. Adaptive filtering methodologies are preferred in AVC due to their ability to adjust themselves for varying dynamics of the structure during the operation. The filtered-X LMS (FXLMS) algorithm is a simple adaptive filtering algorithm widely implemented in active control applications. Proper implementation of FXLMS requires the availability of a reference signal to mimic the disturbance and a model of the dynamics between the control actuator and the error sensor, namely the secondary path. However, the controller output can interfere with the reference signal, and the secondary path dynamics may change during operation. The interference problem can be resolved by using an infinite impulse response (IIR) filter, which feeds back one or more previous control signals to the controller output, and the changing secondary path dynamics can be updated using an online modeling technique. In this paper, the IIR filtering based filtered-U LMS (FULMS) controller is combined with an online secondary path modeling algorithm to suppress the vibrations of a plate-like structure. The results are validated through numerical and experimental studies. The results show that the FULMS with online secondary path modeling approach provides better vibration rejection and a higher convergence rate than its FXLMS counterpart.
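
    The FXLMS update at the heart of such controllers is compact: the reference signal is filtered through a model of the secondary path before it drives the weight update. The sketch below is a minimal single-channel FXLMS loop on synthetic signals, with an assumed FIR secondary path and step size; the FULMS/IIR variant and the online secondary-path modelling used in the paper add further terms.

      # Minimal single-channel filtered-x LMS (FXLMS) loop on synthetic data.
      import numpy as np

      rng = np.random.default_rng(0)
      N, L = 5000, 16
      s_true = np.array([0.5, 0.3, 0.1])      # assumed secondary path (actuator -> error sensor)
      s_hat = s_true.copy()                   # its identified model (taken as perfect here)

      x = rng.standard_normal(N)                               # reference signal
      d = np.convolve(x, [0.0, 0.8, -0.4, 0.2])[:N]            # primary disturbance at the sensor

      w = np.zeros(L)                                          # adaptive control filter weights
      x_buf = np.zeros(L)                                      # recent reference samples
      y_buf = np.zeros(len(s_true))                            # recent control outputs
      fx_buf = np.zeros(L)                                     # filtered-reference history
      mu, err = 0.01, np.zeros(N)

      for n in range(N):
          x_buf = np.roll(x_buf, 1); x_buf[0] = x[n]
          y = w @ x_buf                                        # control signal
          y_buf = np.roll(y_buf, 1); y_buf[0] = y
          err[n] = d[n] - s_true @ y_buf                       # residual at the error sensor
          fx_buf = np.roll(fx_buf, 1)
          fx_buf[0] = s_hat @ x_buf[:len(s_hat)]               # reference filtered by path model
          w += mu * err[n] * fx_buf                            # FXLMS weight update

      print("error power, first vs last 500 samples:", err[:500].var(), err[-500:].var())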

  3. A low-order coupled chemistry meteorology model for testing online and offline data assimilation schemes

    NASA Astrophysics Data System (ADS)

    Haussaire, J.-M.; Bocquet, M.

    2015-08-01

    Bocquet and Sakov (2013) have introduced a low-order model based on the coupling of the chaotic Lorenz-95 model which simulates winds along a mid-latitude circle, with the transport of a tracer species advected by this zonal wind field. This model, named L95-T, can serve as a playground for testing data assimilation schemes with an online model. Here, the tracer part of the model is extended to a reduced photochemistry module. This coupled chemistry meteorology model (CCMM), the L95-GRS model, mimics continental and transcontinental transport and the photochemistry of ozone, volatile organic compounds and nitrogen oxides. Its numerical implementation is described. The model is shown to reproduce the major physical and chemical processes being considered. L95-T and L95-GRS are specifically designed and useful for testing advanced data assimilation schemes, such as the iterative ensemble Kalman smoother (IEnKS) which combines the best of ensemble and variational methods. These models provide useful insights prior to the implementation of data assimilation methods on larger models. We illustrate their use with data assimilation schemes on preliminary, yet instructive numerical experiments. In particular, online and offline data assimilation strategies can be conveniently tested and discussed with this low-order CCMM. The impact of observed chemical species concentrations on the wind field can be quantitatively estimated. The impacts of the wind chaotic dynamics and of the chemical species non-chaotic but highly nonlinear dynamics on the data assimilation strategies are illustrated.
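
    The Lorenz-95 wind component underlying L95-T is only a few lines long, which is exactly what makes such low-order models attractive test beds. Below is a minimal sketch of the standard Lorenz-95 system with forcing F = 8 and a fourth-order Runge-Kutta step; the tracer and chemistry couplings of L95-T and L95-GRS are not included.

      # Standard Lorenz-95 model: dX_i/dt = (X_{i+1} - X_{i-2}) * X_{i-1} - X_i + F.
      import numpy as np

      def l95_tendency(x, forcing=8.0):
          return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

      def rk4_step(x, dt=0.05, forcing=8.0):
          k1 = l95_tendency(x, forcing)
          k2 = l95_tendency(x + 0.5 * dt * k1, forcing)
          k3 = l95_tendency(x + 0.5 * dt * k2, forcing)
          k4 = l95_tendency(x + dt * k3, forcing)
          return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

      x = 8.0 * np.ones(40)          # 40 grid points along the mid-latitude circle
      x[19] += 0.01                  # small perturbation to trigger chaotic behaviour
      for _ in range(1000):
          x = rk4_step(x)
      print(x[:5])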

  4. Get the Word Out with List Servers

    ERIC Educational Resources Information Center

    Goldberg, Laurence

    2006-01-01

    In this article, the author details the use of electronic mail list servers in the school district. In this district of about 7,300 students in suburban Philadelphia (Abington SD), electronic mail list servers are now being used, along with other methods of communication, to disseminate information quickly and widely. They began by manually maintaining…

  5. You're a What? Process Server

    ERIC Educational Resources Information Center

    Torpey, Elka

    2012-01-01

    In this article, the author talks about the role and functions of a process server. The job of a process server is to hand deliver legal documents to the people involved in court cases. These legal documents range from a summons to appear in court to a subpoena for producing evidence. Process serving can involve risk, as some people take out their…

  6. Interfaces for Distributed Systems of Information Servers.

    ERIC Educational Resources Information Center

    Kahle, Brewster; And Others

    1992-01-01

    Describes two systems--Wide Area Information Servers (WAIS) and Rosebud--that provide protocol-based mechanisms for accessing remote full-text information servers. Design constraints, human interface design, and implementation are examined for five interfaces to these systems developed to run on the Macintosh or Unix terminals. Sample screen…

  7. ModelView for ModelDB: Online Presentation of Model Structure.

    PubMed

    McDougal, Robert A; Morse, Thomas M; Hines, Michael L; Shepherd, Gordon M

    2015-10-01

    ModelDB (modeldb.yale.edu), a searchable repository of source code of more than 950 published computational neuroscience models, seeks to promote model reuse and reproducibility. Code sharing is a first step; however, model source code is often large and not easily understood. To aid users, we have developed ModelView, a web application for ModelDB that presents a graphical view of model structure augmented with contextual information for NEURON and NEURON-runnable (e.g. NeuroML, PyNN) models. Web presentation provides a rich, simulator-independent environment for interacting with graphs. The necessary data is generated by combining manual curation, text-mining the source code, querying ModelDB, and simulator introspection. Key features of the user interface along with the data analysis, storage, and visualization algorithms are explained. With this tool, researchers can examine and assess the structure of hundreds of models in ModelDB in a standardized presentation without installing any software, downloading the model, or reading model source code. PMID:25896640

  8. The Development and Implementation of an Online Professional Development Model for Pre-Service Teacher Education

    ERIC Educational Resources Information Center

    Denton, Jon J.; Davis, Trina J.; Smith, Ben L.; Beason, Lynn; Strader, R. Arlen

    2005-01-01

    "Accelerate Online/OPTIONS" is a three component program for certifying secondary mathematics and science teachers in Texas. "Accelerate Online/OPTIONS" provides those possessing or pursuing science degrees with an online program of education that can be completed in 12-18 months. The On-line Curriculum consists of 35 online modules developed to…

  9. Applying the Dualistic Model of Passion to Post-Secondary Online Instruction: A Comparative Study

    ERIC Educational Resources Information Center

    Greenberger, Scott W.

    2013-01-01

    With the growth of online education, online student attrition and failure rates will continue to be a concern for post-secondary institutions. Although many factors may contribute to such phenomena, the role of the online instructor is clearly an important factor. Exploring how online instructors perceive their role as online teachers,…

  10. Building Component Library: An Online Repository to Facilitate Building Energy Model Creation; Preprint

    SciTech Connect

    Fleming, K.; Long, N.; Swindler, A.

    2012-05-01

    This paper describes the Building Component Library (BCL), the U.S. Department of Energy's (DOE) online repository of building components that can be directly used to create energy models. This comprehensive, searchable library consists of components and measures as well as the metadata which describes them. The library is also designed to allow contributors to easily add new components, providing a continuously growing, standardized list of components for users to draw upon.

  11. PockDrug-Server: a new web server for predicting pocket druggability on holo and apo proteins.

    PubMed

    Hussein, Hiba Abi; Borrel, Alexandre; Geneix, Colette; Petitjean, Michel; Regad, Leslie; Camproux, Anne-Claude

    2015-07-01

    Predicting protein pocket's ability to bind drug-like molecules with high affinity, i.e. druggability, is of major interest in the target identification phase of drug discovery. Therefore, pocket druggability investigations represent a key step of compound clinical progression projects. Currently computational druggability prediction models are attached to one unique pocket estimation method despite pocket estimation uncertainties. In this paper, we propose 'PockDrug-Server' to predict pocket druggability, efficient on both (i) estimated pockets guided by the ligand proximity (extracted by proximity to a ligand from a holo protein structure) and (ii) estimated pockets based solely on protein structure information (based on amino atoms that form the surface of potential binding cavities). PockDrug-Server provides consistent druggability results using different pocket estimation methods. It is robust with respect to pocket boundary and estimation uncertainties, thus efficient using apo pockets that are challenging to estimate. It clearly distinguishes druggable from less druggable pockets using different estimation methods and outperformed recent druggability models for apo pockets. It can be carried out from one or a set of apo/holo proteins using different pocket estimation methods proposed by our web server or from any pocket previously estimated by the user. PockDrug-Server is publicly available at: http://pockdrug.rpbs.univ-paris-diderot.fr. PMID:25956651

  12. The PredictProtein server

    PubMed Central

    Rost, Burkhard; Liu, Jinfeng

    2003-01-01

    PredictProtein (PP, http://cubic.bioc.columbia.edu/pp/) is an internet service for sequence analysis and the prediction of aspects of protein structure and function. Users submit a protein sequence or alignment; the server returns a multiple sequence alignment, PROSITE sequence motifs, low-complexity regions (SEG), ProDom domain assignments, nuclear localisation signals, regions lacking regular structure and predictions of secondary structure, solvent accessibility, globular regions, transmembrane helices, coiled-coil regions, structural switch regions and disulfide bonds. Upon request, fold recognition by prediction-based threading is available. For all services, users can submit their query either by electronic mail or interactively from the World Wide Web. PMID:12824312

  13. The Argonne Voyager multimedia server

    SciTech Connect

    Disz, T.; Judson, I.; Olson, R.; Stevens, R.

    1997-07-01

    With the growing presence of multimedia-enabled systems, one will see an integration of collaborative computing concepts into the everyday environments of future scientific and technical workplaces. Desktop teleconferencing is in common use today, while more complex desktop teleconferencing technology that relies on the availability of multipoint (greater than two nodes) enabled tools is now starting to become available on PCs. A critical problem when using these collaboration tools is the inability to easily archive multistream, multipoint meetings and make the content available to others. Ideally one would like the ability to capture, record, playback, index, annotate and distribute multimedia stream data as easily as one currently handles text or still image data. While the ultimate goal is still some years away, the Argonne Voyager project is aimed at exploring and developing media server technology needed to provide a flexible virtual multipoint recording/playback capability. In this article the authors describe the motivating requirements, architecture implementation, operation, performance, and related work.

  14. GLIMMPSE: Online Power Computation for Linear Models with and without a Baseline Covariate.

    PubMed

    Kreidler, Sarah M; Muller, Keith E; Grunwald, Gary K; Ringham, Brandy M; Coker-Dukowitz, Zacchary T; Sakhadeo, Uttara R; Barón, Anna E; Glueck, Deborah H

    2013-09-01

    GLIMMPSE is a free, web-based software tool that calculates power and sample size for the general linear multivariate model with Gaussian errors (http://glimmpse.SampleSizeShop.org/). GLIMMPSE provides a user-friendly interface for the computation of power and sample size. We consider models with fixed predictors, and models with fixed predictors and a single Gaussian covariate. Validation experiments demonstrate that GLIMMPSE matches the accuracy of previously published results, and performs well against simulations. We provide several online tutorials based on research in head and neck cancer. The tutorials demonstrate the use of GLIMMPSE to calculate power and sample size. PMID:24403868
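
    The kind of calculation GLIMMPSE automates can be illustrated with a much simpler univariate analogue: solving for the per-group sample size that achieves a target power in a two-group comparison. The sketch below uses the statsmodels package and is only an analogue; GLIMMPSE itself handles the general linear multivariate model, optionally with a Gaussian covariate.

      # Univariate analogue of a power/sample-size calculation (statsmodels).
      from statsmodels.stats.power import TTestIndPower

      analysis = TTestIndPower()
      n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.80)
      print(f"required n per group: {n_per_group:.1f}")

      achieved = analysis.power(effect_size=0.5, nobs1=64, alpha=0.05)
      print(f"power with n = 64 per group: {achieved:.3f}")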

  15. Online collaboration and model sharing in volcanology via VHub.org

    NASA Astrophysics Data System (ADS)

    Valentine, G.; Patra, A. K.; Bajo, J. V.; Bursik, M. I.; Calder, E.; Carn, S. A.; Charbonnier, S. J.; Connor, C.; Connor, L.; Courtland, L. M.; Gallo, S.; Jones, M.; Palma Lizana, J. L.; Moore-Russo, D.; Renschler, C. S.; Rose, W. I.

    2013-12-01

    VHub (short for VolcanoHub, and accessible at vhub.org) is an online platform for barrier-free access to high-end modeling and simulation and collaboration in research and training related to volcanoes, the hazards they pose, and risk mitigation. The underlying concept is to provide a platform, building upon the successful HUBzero software infrastructure (hubzero.org), that enables workers to collaborate online and to easily share information, modeling and analysis tools, and educational materials with colleagues around the globe. Collaboration occurs around several different points: (1) modeling and simulation; (2) data sharing; (3) education and training; (4) volcano observatories; and (5) project-specific groups. VHub promotes modeling and simulation in two ways: (1) some models can be implemented on VHub for online execution, with VHub providing a central warehouse for such models that should result in broader dissemination; and (2) VHub provides a platform that supports the more complex CFD models by enabling the sharing of code development and problem-solving knowledge, benchmarking datasets, and the development of validation exercises. VHub also provides a platform for sharing of data and datasets. The VHub development team is implementing the iRODS data sharing middleware (see irods.org). iRODS allows a researcher to access data that are located at participating data sources around the world (a cloud of data) as if the data were housed in a single virtual database. Projects associated with VHub are also going to introduce the use of data-driven workflow tools to support multistage analysis processes where computing and data are integrated for model validation, hazard analysis, etc. Audio-video recordings of seminars, PowerPoint slide sets, and educational simulations are all items that can be placed onto VHub for use by the community or by selected collaborators. An important point is that the manager of a given educational resource (or any other

  16. Development and testing of a Hyperlearning Model for design of an online critical care course.

    PubMed

    Jeffries, Pamela R

    2005-08-01

    Many U.S. colleges and universities are discovering innovative and exciting ways of using information technology to promote the process of teaching and learning and to extend education to new populations of students. Nurse educators in academia and service settings are developing interactive e-learning programs or courses to meet this need, and to either enhance practice concepts and basic skills or orient new associates to the clinical organization. In continuing education programs, students need flexibility and convenience to concurrently meet their personal and academic goals, and consumer demand for online instruction is increasing. The challenge is to prepare a comprehensive, high-quality, cost-effective e-learning course to meet educational standards and competencies. To meet this challenge, an instructional design model, the Hyperlearning Model, was developed based on Chickering and Gamson's principles of best practices in undergraduate education, to guide the development of an online course for basic critical care content. In this article, I describe the creation and testing of an instructional design model for developing content in this online course. PMID:16130343

  17. Online and Certifiable Spectroscopy Courses Using Information and Communication Tools. a Model for Classrooms and Beyond

    NASA Astrophysics Data System (ADS)

    Krishnan, Mangala Sunder

    2015-06-01

    Online education tools and flipped (reverse) class models for teaching and learning and pedagogic and andragogic approaches to self-learning have become quite mature in the last few years because of the revolution in video, interactive software and social learning tools. Open Educational Resources of dependable quality and variety are also becoming available throughout the world, making the current era truly a renaissance period for higher education using the Internet. In my presentation, I shall highlight structured course content preparation online in several areas of spectroscopy and also the design and development of virtual lab tools and kits for studying optical spectroscopy. Both elementary and advanced courses on molecular spectroscopy are currently under development jointly with researchers in other institutions in India. I would like to explore participation from teachers throughout the world in the teaching-learning process using flipped class methods for topics such as experimental and theoretical microwave spectroscopy of semi-rigid and non-rigid molecules, molecular complexes and aggregates. In addition, courses in Raman and infrared spectroscopy experimentation and advanced electronic spectroscopy are also envisaged for free, online access. The National Programme on Technology Enhanced Learning (NPTEL) and the National Mission on Education through Information and Communication Technology (NMEICT) are two large Government of India funded initiatives for producing certified and self-learning courses with financial support for moderated discussion forums. The learning tools and interactive presentations so developed can be used in classrooms throughout the world using the flipped mode of teaching. They are very much sought after by learners and researchers who are in other areas of learning but want to contribute to research and development through inter-disciplinary learning. NPTEL is currently experimenting with Massive Open Online Course (MOOC

  18. pKNOT v.2: the protein KNOT web server.

    PubMed

    Lai, Yan-Long; Chen, Chih-Chieh; Hwang, Jenn-Kang

    2012-07-01

    Knotted proteins have recently received lots of attention due to their interesting topological novelty as well as their puzzling folding mechanisms. We previously published a pKNOT server, which provides a structural database of knotted proteins, analysis tools for detecting and analyzing knotted regions from structures, as well as a Java-based 3D graphics viewer for visualizing knotted structures. However, a convenient platform for performing similar tasks directly from protein sequences has been lacking. In the current version of the web server, referred to as pKNOT v.2, we implement a homology modeling tool such that the server can now accept protein sequences in addition to 3D structures or Protein Data Bank (PDB) IDs and return knot analysis. In addition, we have updated the database of knotted proteins from the current PDB with a combination of automatic and manual procedures. We believe that the updated pKNOT server with its extended functionalities will provide better service to biologists interested in the research of knotted proteins. The pKNOT v.2 is available from http://pknot.life.nctu.edu.tw/. PMID:22693223

  19. CCTOP: a Consensus Constrained TOPology prediction web server.

    PubMed

    Dobson, László; Reményi, István; Tusnády, Gábor E

    2015-07-01

    The Consensus Constrained TOPology prediction (CCTOP; http://cctop.enzim.ttk.mta.hu) server is a web-based application providing transmembrane topology prediction. In addition to utilizing 10 different state-of-the-art topology prediction methods, the CCTOP server incorporates topology information from existing experimental and computational sources available in the PDBTM, TOPDB and TOPDOM databases using the probabilistic framework of hidden Markov model. The server provides the option to precede the topology prediction with signal peptide prediction and transmembrane-globular protein discrimination. The initial result can be recalculated by (de)selecting any of the prediction methods or mapped experiments or by adding user specified constraints. CCTOP showed superior performance to existing approaches. The reliability of each prediction is also calculated, which correlates with the accuracy of the per protein topology prediction. The prediction results and the collected experimental information are visualized on the CCTOP home page and can be downloaded in XML format. Programmable access of the CCTOP server is also available, and an example of client-side script is provided. PMID:25943549

  20. CCTOP: a Consensus Constrained TOPology prediction web server

    PubMed Central

    Dobson, László; Reményi, István; Tusnády, Gábor E.

    2015-01-01

    The Consensus Constrained TOPology prediction (CCTOP; http://cctop.enzim.ttk.mta.hu) server is a web-based application providing transmembrane topology prediction. In addition to utilizing 10 different state-of-the-art topology prediction methods, the CCTOP server incorporates topology information from existing experimental and computational sources available in the PDBTM, TOPDB and TOPDOM databases using the probabilistic framework of hidden Markov model. The server provides the option to precede the topology prediction with signal peptide prediction and transmembrane-globular protein discrimination. The initial result can be recalculated by (de)selecting any of the prediction methods or mapped experiments or by adding user specified constraints. CCTOP showed superior performance to existing approaches. The reliability of each prediction is also calculated, which correlates with the accuracy of the per protein topology prediction. The prediction results and the collected experimental information are visualized on the CCTOP home page and can be downloaded in XML format. Programmable access of the CCTOP server is also available, and an example of client-side script is provided. PMID:25943549

  1. RNA-Redesign: a web server for fixed-backbone 3D design of RNA.

    PubMed

    Yesselman, Joseph D; Das, Rhiju

    2015-07-01

    RNA is rising in importance as a design medium for interrogating fundamental biology and for developing therapeutic and bioengineering applications. While there are several online servers for design of RNA secondary structure, there are no tools available for the rational design of 3D RNA structure. Here we present RNA-Redesign (http://rnaredesign.stanford.edu), an online 3D design tool for RNA. This resource utilizes fixed-backbone design to optimize the sequence identity and nucleobase conformations of an RNA to match a desired backbone, analogous to fundamental tools that underlie rational protein engineering. The resulting sequences suggest thermostabilizing mutations that can be experimentally verified. Further, sequence preferences that differ between natural and computationally designed sequences can suggest whether natural sequences possess functional constraints besides folding stability, such as cofactor binding or conformational switching. Finally, for biochemical studies, the designed sequences can suggest experimental tests of 3D models, including concomitant mutation of base triples. In addition to the designs generated, detailed graphical analysis is presented through an integrated and user-friendly environment. PMID:25964298

  2. RNA-Redesign: a web server for fixed-backbone 3D design of RNA

    PubMed Central

    Yesselman, Joseph D.; Das, Rhiju

    2015-01-01

    RNA is rising in importance as a design medium for interrogating fundamental biology and for developing therapeutic and bioengineering applications. While there are several online servers for design of RNA secondary structure, there are no tools available for the rational design of 3D RNA structure. Here we present RNA-Redesign (http://rnaredesign.stanford.edu), an online 3D design tool for RNA. This resource utilizes fixed-backbone design to optimize the sequence identity and nucleobase conformations of an RNA to match a desired backbone, analogous to fundamental tools that underlie rational protein engineering. The resulting sequences suggest thermostabilizing mutations that can be experimentally verified. Further, sequence preferences that differ between natural and computationally designed sequences can suggest whether natural sequences possess functional constraints besides folding stability, such as cofactor binding or conformational switching. Finally, for biochemical studies, the designed sequences can suggest experimental tests of 3D models, including concomitant mutation of base triples. In addition to the designs generated, detailed graphical analysis is presented through an integrated and user-friendly environment. PMID:25964298

  3. The Online Theology Classroom: Strategies for Engaging a Community of Distance Learners in a Hybrid Model of Online Education

    ERIC Educational Resources Information Center

    Hege, Brent A. R.

    2011-01-01

    One factor contributing to success in online education is the creation of a safe and vibrant virtual community and sustained, lively engagement with that community of learners. In order to create and engage such a community instructors must pay special attention to the relationship between technology and pedagogy, specifically in terms of issues…

  4. The PhyloFacts FAT-CAT web server: ortholog identification and function prediction using fast approximate tree classification.

    PubMed

    Afrasiabi, Cyrus; Samad, Bushra; Dineen, David; Meacham, Christopher; Sjölander, Kimmen

    2013-07-01

    The PhyloFacts 'Fast Approximate Tree Classification' (FAT-CAT) web server provides a novel approach to ortholog identification using subtree hidden Markov model-based placement of protein sequences to phylogenomic orthology groups in the PhyloFacts database. Results on a data set of microbial, plant and animal proteins demonstrate FAT-CAT's high precision at separating orthologs and paralogs and robustness to promiscuous domains. We also present results documenting the precision of ortholog identification based on subtree hidden Markov model scoring. The FAT-CAT phylogenetic placement is used to derive a functional annotation for the query, including confidence scores and drill-down capabilities. PhyloFacts' broad taxonomic and functional coverage, with >7.3 M proteins from across the Tree of Life, enables FAT-CAT to predict orthologs and assign function for most sequence inputs. Four pipeline parameter presets are provided to handle different sequence types, including partial sequences and proteins containing promiscuous domains; users can also modify individual parameters. PhyloFacts trees matching the query can be viewed interactively online using the PhyloScope Javascript tree viewer and are hyperlinked to various external databases. The FAT-CAT web server is available at http://phylogenomics.berkeley.edu/phylofacts/fatcat/. PMID:23685612

  5. A study of critical reasoning in online learning: application of the Occupational Performance Process Model.

    PubMed

    Mitchell, Anita Witt; Batorski, Rosemary E

    2009-01-01

    This study examined the effect of an online guided independent study on critical reasoning skills. Twenty-one first-semester Master of Occupational Therapy students completed an online assignment designed to facilitate application of the Occupational Performance Process Model (Fearing & Clark) and kept reflective journals. Data from the journals were analyzed in relation to the three sets of questions, question type and results of the Watson-Glaser Critical Thinking Appraisal (WGCTA). This assignment appeared to be effective for enhancing awareness and use of critical reasoning skills. Differences in patterns of critical reasoning between students with high and low WGCTA scores and results of an inductive analysis of the journal entries are discussed. Future research investigating the types of feedback that effectively facilitate development of critical reasoning and whether students with high and low WGCTA scores might benefit from different types of instruction and/or feedback is recommended. PMID:19343703

  6. Taking the epistemic step: toward a model of on-line access to conversational implicatures.

    PubMed

    Breheny, Richard; Ferguson, Heather J; Katsos, Napoleon

    2013-03-01

    There is a growing body of evidence showing that conversational implicatures are rapidly accessed in incremental utterance interpretation. To date, studies showing incremental access have focussed on implicatures related to linguistic triggers, such as 'some' and 'or'. We discuss three kinds of on-line model that can account for this data. A model built around the notion of linguistic alternatives stored in the lexicon would only account for linguistically triggered implicatures of the kind already studied and not so-called 'particularised' implicatures that are not associated with specific linguistic items. A second model built around the idea of focus alternatives could handle both linguistically triggered implicatures and so-called particularised implicatures but would be insensitive to the role that information about the speaker's mental state plays in deriving implicatures. A third more fully 'Gricean' model takes account of the speaker's mental state in accessing these implications. In this paper we present a visual world study using a new interactive paradigm where two communicators (one confederate) describe visually-presented events to each other as their eye movements are monitored. In this way, we directly compare the suitability of these three kinds of model. We show hearers can access contextually specific particularised implicatures in on-line comprehension. Moreover, we show that in doing so, hearers are sensitive to the relevant mental states of the speaker. We conclude with a discussion of how a more 'Gricean' model may be developed and of how our findings inform a long-standing debate on the immediacy of on-line perspective taking in language comprehension. PMID:23291422

  7. An Assessment of a Physical Chemistry Online Activity

    NASA Astrophysics Data System (ADS)

    Hamby Towns, Marcy; Kreke, Kelley; Sauder, Deborah; Stout, Roland; Long, George; Zielinski, Theresa Julia

    1998-12-01

    A questionnaire and list server archive were used to investigate the perceptions of students and faculty who took part in a physical chemistry online project. Students at four universities worked cooperatively in their own classrooms and collaborated as a larger team on the Internet via a list server to determine the best mathematical model to describe the PV behavior of a gas at a specified temperature. The strengths of the project were the interaction among students, the use of Mathcad and modern technology, and the experience of authentic problem-solving. The weaknesses were the problems with the technology, the facilitation of interaction, and the students' ability to ask questions to solve an ill-defined problem. The suggestions for improvement focused on facilitating interuniversity interaction between students, clarifying tasks and goals, and improving the implementation of the online activities. We discuss how our evaluation of the project guided and informed the design of a subsequent online project, and our planning for future projects. In addition, we describe the professional learning community that evolved among faculty who participated in this project.
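
    The students' core task, choosing the mathematical model that best describes P-V data at fixed temperature, amounts to nonlinear fitting of candidate equations of state. A minimal sketch is shown below using SciPy and made-up data for one mole of gas; Mathcad was the tool actually used in the project.

      # Fit ideal-gas and van der Waals models to (V, P) data at fixed T and compare residuals.
      import numpy as np
      from scipy.optimize import curve_fit

      R, T, n = 0.08314, 300.0, 1.0                     # L*bar/(mol*K), K, mol

      def ideal(V, scale):
          return scale * n * R * T / V

      def van_der_waals(V, a, b):
          return n * R * T / (V - n * b) - a * n**2 / V**2

      V = np.array([0.5, 1.0, 2.0, 4.0, 8.0])           # L (made-up measurements)
      P = np.array([44.0, 23.5, 12.1, 6.15, 3.10])      # bar (made-up measurements)

      for name, f, p0 in [("ideal gas", ideal, [1.0]),
                          ("van der Waals", van_der_waals, [1.0, 0.03])]:
          popt, _ = curve_fit(f, V, P, p0=p0)
          rss = np.sum((P - f(V, *popt)) ** 2)
          print(f"{name}: parameters = {popt}, residual sum of squares = {rss:.3g}")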

  8. Quadfinder: server for identification and analysis of quadruplex-forming motifs in nucleotide sequences

    PubMed Central

    Scaria, Vinod; Hariharan, Manoj; Arora, Amit; Maiti, Souvik

    2006-01-01

    G-quadruplex secondary structures, which play a structural role in repetitive DNA such as telomeres, may also play a functional role at other genomic locations as targetable regulatory elements which control gene expression. The recent interest in the application of quadruplexes in biological systems prompted us to develop a tool for the identification and analysis of quadruplex-forming nucleotide sequences, especially in RNA. Here we present Quadfinder, an online server for prediction and bioinformatics of uni-molecular quadruplex-forming nucleotide sequences. The server is designed to be user-friendly and needs minimal intervention by the user, while providing flexibility in defining the variants of the motif. The server is freely available at URL . PMID:16845097
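
    A commonly used minimal definition of a uni-molecular quadruplex-forming motif, four runs of at least three G's separated by short loops, can be searched for with a single regular expression, as sketched below. The exact motif grammar and loop-length bounds used by Quadfinder may differ; this is only the generic pattern.

      # Generic G-quadruplex motif search: four G-runs (>= 3 G's) with 1-7 nt loops.
      import re

      G4_PATTERN = re.compile(r"(?:G{3,}[ACGTU]{1,7}){3}G{3,}", re.IGNORECASE)

      seq = "AUGGGCAGGGUUAGGGCAAGGGAUCCGUAA"   # made-up RNA sequence
      for match in G4_PATTERN.finditer(seq):
          print(match.start(), match.group())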

  9. Performance measurements of single server fuzzy queues with unreliable server using left and right method

    NASA Astrophysics Data System (ADS)

    Mueen, Zeina; Ramli, Razamin; Zaibidi, Nerda Zura

    2015-12-01

    There are a number of real-life systems that can be described as queuing systems, and this paper presents a queuing system model applied to a manufacturing example. The queuing model considered is set in a fuzzy environment with retrial queues and an unreliable server. The stability condition of this model is investigated, and the performance measures are obtained by adopting the left and right method. The new approach adopted in this study merges the existing α-cut interval and nonlinear programming techniques, and a numerical example is considered to explain the methodology. From the numerical example, the flexibility of the method is shown graphically, giving the exact real mean number of customers in the system as well as the expected waiting times.
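
    The left-and-right α-cut idea can be illustrated on a much simpler system than the retrial queue with an unreliable server studied in the paper: treat the fuzzy arrival and service rates as triangular fuzzy numbers, cut them at a chosen α, and propagate the resulting intervals through a crisp queueing formula. The sketch below does this for the mean number in an M/M/1 queue and illustrates only the interval mechanics, not the authors' model.

      # Alpha-cut interval propagation through a crisp M/M/1 formula (illustrative only).
      def alpha_cut(triangular, alpha):
          """Interval [left, right] of a triangular fuzzy number (a, m, b) at level alpha."""
          a, m, b = triangular
          return a + alpha * (m - a), b - alpha * (b - m)

      def mm1_mean_number(lam, mu):
          rho = lam / mu
          return rho / (1.0 - rho)            # valid only when rho < 1

      lam_fuzzy = (2.0, 3.0, 4.0)             # fuzzy arrival rate (per hour), assumed values
      mu_fuzzy  = (6.0, 7.0, 8.0)             # fuzzy service rate (per hour), assumed values

      for alpha in (0.0, 0.5, 1.0):
          lam_lo, lam_hi = alpha_cut(lam_fuzzy, alpha)
          mu_lo,  mu_hi  = alpha_cut(mu_fuzzy, alpha)
          # Smallest L: few arrivals, fast service; largest L: many arrivals, slow service.
          L_lo = mm1_mean_number(lam_lo, mu_hi)
          L_hi = mm1_mean_number(lam_hi, mu_lo)
          print(f"alpha = {alpha}: mean number in system in [{L_lo:.3f}, {L_hi:.3f}]")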

  10. The Matpar Server on the HP Exemplar

    NASA Technical Reports Server (NTRS)

    Springer, Paul

    2000-01-01

    This presentation reviews the design of parallel extensions to Matlab for a parallel system. Matlab was found to be too slow on many large problems, and with the Next Generation Space Telescope requiring greater capability, work began in early 1996 on parallel extensions to Matlab, called Matpar. The presentation reviews the architecture, the functionality, and the design of Matpar. The design uses a client-server strategy, with the client code written in C and the object-oriented server code written in C++. The client/server approach for Matpar provides ease of use and good speed.

  11. Web server with ATMEGA 2560 microcontroller

    NASA Astrophysics Data System (ADS)

    Răduca, E.; Ungureanu-Anghel, D.; Nistor, L.; Haţiegan, C.; Drăghici, S.; Chioncel, C.; Spunei, E.; Lolea, R.

    2016-02-01

    This paper presents the design and construction of a web server to command, control and remotely monitor a wide range of industrial or personal equipment and/or sensors. The server runs custom software that can be written by users and works with many types of operating system. The authors realized the web server on two platforms, a microcontroller (µC) board and a network board. The source code was written in the open-source Arduino language (version 1.0.5).

  12. OneRTM: an online real-time modelling platform for the next generation of numerical environmental modelling

    NASA Astrophysics Data System (ADS)

    Wang, Lei; Kingdon, Andrew

    2014-05-01

    Numerical modelling has been applied in many fields to better understand and predict the behaviours of different processes. In our increasingly dynamic world there is an imperative to identify potential stresses and threats in the environment and to respond quickly with sound decisions. However, the limitations of traditional modelling methodologies make it difficult to respond quickly to rapidly developing environmental events, such as floods, droughts and pollution incidents. For example, it is both time consuming and costly to keep model data up-to-date and to disseminate model results and modelled output datasets to end-users. Crucially, it is difficult for people who have limited numerical modelling skills to understand and interact with models and modelled results. In response to these challenges, a proof-of-concept online real-time modelling platform (OneRTM) has been developed as a mechanism for maintaining and disseminating numerical models and datasets. It automatically keeps models current with the most recent input data and links models based on data flow; it makes models and modelled datasets (historic, real-time and forecast) immediately available via the internet as easy-to-understand dynamic GIS layers and graphs; and it provides online modelling functions that allow non-modellers to manipulate models, including running pre-defined scenarios with a few mouse clicks. OneRTM has been successfully applied and tested in Chalk groundwater flow modelling in the Thames Basin, UK. The system hosts and links groundwater recharge and groundwater flow models in the case study area, and automatically publishes the latest groundwater level layers on the internet once the current weather datasets become available. It also provides online functions for generating groundwater hydrographs and running groundwater abstraction scenarios. Although OneRTM is currently tested using groundwater flow modelling as an example, it could be further developed into a platform

  13. An Online Database for Informing Ecological Network Models: http://kelpforest.ucsc.edu

    PubMed Central

    Beas-Luna, Rodrigo; Novak, Mark; Carr, Mark H.; Tinker, Martin T.; Black, August; Caselle, Jennifer E.; Hoban, Michael; Malone, Dan; Iles, Alison

    2014-01-01

    Ecological network models and analyses are recognized as valuable tools for understanding the dynamics and resiliency of ecosystems, and for informing ecosystem-based approaches to management. However, few databases exist that can provide the life history, demographic and species interaction information necessary to parameterize ecological network models. Faced with the difficulty of synthesizing the information required to construct models for kelp forest ecosystems along the West Coast of North America, we developed an online database (http://kelpforest.ucsc.edu/) to facilitate the collation and dissemination of such information. Many of the database's attributes are novel yet the structure is applicable and adaptable to other ecosystem modeling efforts. Information for each taxonomic unit includes stage-specific life history, demography, and body-size allometries. Species interactions include trophic, competitive, facilitative, and parasitic forms. Each data entry is temporally and spatially explicit. The online data entry interface allows researchers anywhere to contribute and access information. Quality control is facilitated by attributing each entry to unique contributor identities and source citations. The database has proven useful as an archive of species and ecosystem-specific information in the development of several ecological network models, for informing management actions, and for education purposes (e.g., undergraduate and graduate training). To facilitate adaptation of the database by other researchers for other ecosystems, the code and technical details on how to customize this database and apply it to other ecosystems are freely available and located at the following link (https://github.com/kelpforest-cameo/databaseui). PMID:25343723

  14. An online database for informing ecological network models: http://kelpforest.ucsc.edu

    USGS Publications Warehouse

    Beas-Luna, Rodrigo; Tinker, M. Tim; Novak, Mark; Carr, Mark H.; Black, August; Caselle, Jennifer E.; Hoban, Michael; Malone, Dan; Iles, Alison C.

    2014-01-01

    Ecological network models and analyses are recognized as valuable tools for understanding the dynamics and resiliency of ecosystems, and for informing ecosystem-based approaches to management. However, few databases exist that can provide the life history, demographic and species interaction information necessary to parameterize ecological network models. Faced with the difficulty of synthesizing the information required to construct models for kelp forest ecosystems along the West Coast of North America, we developed an online database (http://kelpforest.ucsc.edu/) to facilitate the collation and dissemination of such information. Many of the database's attributes are novel yet the structure is applicable and adaptable to other ecosystem modeling efforts. Information for each taxonomic unit includes stage-specific life history, demography, and body-size allometries. Species interactions include trophic, competitive, facilitative, and parasitic forms. Each data entry is temporally and spatially explicit. The online data entry interface allows researchers anywhere to contribute and access information. Quality control is facilitated by attributing each entry to unique contributor identities and source citations. The database has proven useful as an archive of species and ecosystem-specific information in the development of several ecological network models, for informing management actions, and for education purposes (e.g., undergraduate and graduate training). To facilitate adaptation of the database by other researchers for other ecosystems, the code and technical details on how to customize this database and apply it to other ecosystems are freely available and located at the following link (https://github.com/kelpforest-cameo/databaseui).

  15. Offline and online detection of damage using autoregressive models and artificial neural networks

    NASA Astrophysics Data System (ADS)

    Omenzetter, Piotr; de Lautour, Oliver R.

    2007-04-01

    Developed to study long, regularly sampled streams of data, time series analysis methods are being increasingly investigated for use in Structural Health Monitoring. In this research, Autoregressive (AR) models are used in conjunction with Artificial Neural Networks (ANNs) for damage detection, localisation and severity assessment. In the first reported experimental exercise, AR models were used offline to fit the acceleration time histories of a 3-storey test structure in undamaged and various damaged states when excited by earthquake motion simulated on a shake table. Damage was introduced into the structure by replacing the columns with thinner ones. Analytical models of the structure in both damaged and undamaged states were also developed and updated using experimental data in order to determine structural stiffness. The coefficients of AR models were used as damage sensitive features and input into an ANN to build a relationship between them and the remaining structural stiffness. In the second, analytical exercise, a system with gradually progressing damage was numerically simulated and acceleration AR models with exogenous inputs were identified recursively. A trained ANN was then required to trace the structural stiffness online. The results for the offline and online approach showed the efficiency of using AR coefficients as damage sensitive features and good performance of the ANNs for damage detection, localization and quantification.
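
    As a rough sketch of the pipeline described above (AR coefficients as damage-sensitive features feeding an ANN that estimates remaining stiffness), the following Python example uses synthetic acceleration records; the study itself used shake-table data and its own AR/ARX identification, so all signals, model orders and network settings here are illustrative.

```python
# Sketch of the general pipeline on synthetic signals (the study used shake-table
# records and its own AR/ARX identification): fit an AR model to each acceleration
# record, use its coefficients as features, and train an ANN to estimate stiffness.
import numpy as np
from sklearn.neural_network import MLPRegressor
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(0)

def synth_accel(stiffness, n=2000, dt=0.01):
    """Toy acceleration record whose dominant frequency depends on 'stiffness'."""
    t = np.arange(n) * dt
    f = 2.0 * np.sqrt(stiffness)                     # stand-in natural frequency [Hz]
    return np.sin(2 * np.pi * f * t) + 0.2 * rng.standard_normal(n)

def ar_features(signal, order=10):
    """Fit an AR(order) model and return its lag coefficients as a feature vector."""
    return AutoReg(signal, lags=order).fit().params[1:]   # drop the constant term

stiffness = rng.uniform(0.5, 1.0, 60)                # remaining fraction of stiffness
X = np.array([ar_features(synth_accel(k)) for k in stiffness])

ann = MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000, random_state=0).fit(X, stiffness)
print("predicted:", ann.predict(X[:3]).round(2), "true:", stiffness[:3].round(2))
```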

  16. Using a Comprehensive Model to Test and Predict the Factors of Online Learning Effectiveness

    ERIC Educational Resources Information Center

    He, Minyan

    2013-01-01

    As online learning is an important part of higher education, the effectiveness of online learning has been tested with different methods. Although the literature regarding online learning effectiveness has been related to various factors, a more comprehensive review of the factors may result in broader understanding of online learning…

  17. Online Learning in the Workplace: A Hybrid Model of Participation in Networked, Professional Learning

    ERIC Educational Resources Information Center

    Thorpe, Mary; Gordon, Jean

    2012-01-01

    The design and conceptualisation of online learning environments for work-related, professional learning was addressed through research with users of an online environment for social workers. The core questions for the research were to identify the nature of participation in the online environment, the relationship between online participation and…

  18. Microfluidic droplet-based liquid-liquid extraction: online model validation.

    PubMed

    Lubej, Martin; Novak, Uroš; Liu, Mingqiang; Martelanc, Mitja; Franko, Mladen; Plazl, Igor

    2015-05-21

    Droplet-based liquid-liquid extraction in a microchannel was studied, both theoretically and experimentally. A full 3D mathematical model, incorporating convection and diffusion in all spatial directions along with the velocity profile, was developed to depict the governing transport characteristics of droplet-based microfluidics. The finite element method, as the most common macroscale simulation technique, was used to solve the set of differential equations regarding conservation of momentum, mass and solute concentration in a two-domain system coupled by the interfacial surface of the droplet-based flow pattern. The model was numerically verified and validated online by following the concentrations of a solute in two phases within the microchannel. The relative azobenzene concentration profiles in a methanol/n-octane two-phase system at different positions along the channel length were retrieved by means of a thermal lens microscopic (TLM) technique coupled to a microfluidic system, which gave results of high spatial and temporal resolution. Very good agreement between model calculations and online experimental data was achieved without applying any fitting procedure to the model parameters. PMID:25850663

  19. The network-enabled optimization system server

    SciTech Connect

    Mesnier, M.P.

    1995-08-01

    Mathematical optimization is a technology under constant change and advancement, drawing upon the most efficient and accurate numerical methods to date. Further, these methods can be tailored for a specific application or generalized to accommodate a wider range of problems. This perpetual change creates an ever growing field, one that is often difficult to stay abreast of. Hence, the impetus behind the Network-Enabled Optimization System (NEOS) server, which aims to provide users, both novice and expert, with a guided tour through the expanding world of optimization. The NEOS server is responsible for bridging the gap between users and the optimization software they seek. More specifically, the NEOS server will accept optimization problems over the Internet and return a solution to the user either interactively or by e-mail. This paper discusses the current implementation of the server.

  20. Development and Application of the Reactor Coolant On-Line Leakage Evaluation Model for Pressurized Water Reactors

    SciTech Connect

    Liang, Thomas K.S.; Hung, H.-J.; Chang, C.-J.

    2001-12-15

    With the consideration of mass unbalance, coolant shrinking, and compressibility, a model for reactor coolant leakage evaluation has been developed to quantify on-line the system leakage rate with conventional system measurements, regardless of where the leak occurs. This model has been derived from the system of total continuity, and it divides the reactor coolant system (RCS) into two regions, namely, the saturated and subcooled regions. The pressurizer is considered as a saturated region, and the remaining part of the RCS is regarded as a subcooled region. Taking the on-line measurements of the RCS including the RCS pressure, temperature, pressurizer water level, and charging and letdown flow rates, this model can directly evaluate on-line the RCS leakage rate. It is noted that this model is applicable only if the RCS remains subcooled. To verify the applicability of this model, data generated by RELAP5/MOD3 simulation and experimental measurements from the Institute of Nuclear Energy Research, Taiwan, Integral System Test Facility were adopted to assess this model. With further on-line verification against the Maanshan training simulator, this model was finally delivered to the Maanshan nuclear power plant (a three-looped Westinghouse pressurized water reactor) to assist the operator training and on-line evaluation of the RCS leakage rate. The smallest amount of leak flow that can be detected by the ROCK model is 3 gal/min.
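
    The delivered model additionally accounts for coolant shrinking, compressibility and a saturated pressurizer region; the sketch below shows only the core mass-balance idea, leak rate equal to charging minus letdown minus the rate of change of stored coolant mass, with placeholder volumes, densities and flows rather than plant data.

```python
# Core mass-balance idea only; the delivered model additionally treats coolant
# shrinking, compressibility and a saturated pressurizer region. All volumes,
# densities and flow rates below are placeholders, not plant data.

def coolant_mass(level_frac, rho_loops, rho_liquid, rho_vapor,
                 V_loops=250.0, V_pzr=40.0):
    """Stored coolant mass [kg]: subcooled loops plus a two-phase pressurizer."""
    m_loops = rho_loops * V_loops
    m_pzr = V_pzr * (rho_liquid * level_frac + rho_vapor * (1.0 - level_frac))
    return m_loops + m_pzr

def leak_rate(charging, letdown, mass_now, mass_prev, dt):
    """Leak flow [kg/s] = inflow - outflow - d(stored mass)/dt."""
    return charging - letdown - (mass_now - mass_prev) / dt

# Two successive sets of on-line measurements, 60 s apart (illustrative numbers)
m_prev = coolant_mass(0.55, rho_loops=740.0, rho_liquid=600.0, rho_vapor=100.0)
m_now = coolant_mass(0.54, rho_loops=740.0, rho_liquid=600.0, rho_vapor=100.0)

leak = leak_rate(charging=6.0, letdown=5.8, mass_now=m_now, mass_prev=m_prev, dt=60.0)
print(f"Estimated leak flow: {leak:.2f} kg/s")
```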

  1. Conversation Threads Hidden within Email Server Logs

    NASA Astrophysics Data System (ADS)

    Palus, Sebastian; Kazienko, Przemysław

    Email server logs contain records of all email exchanged through the server. Often we would like to analyze those emails not separately but in conversation threads, especially when we need to analyze a social network extracted from the email logs. Unfortunately, each email is in a different record, and those records are not tied to each other in any obvious way. In this paper, a method for discussion thread extraction is proposed, together with experiments on two different data sets - Enron and WrUT.
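
    The paper proposes its own thread-extraction method; as a common baseline, the following standard-library sketch groups messages by their In-Reply-To/References headers and by normalized subject lines. The mailbox file name is hypothetical.

```python
# Baseline thread grouping from a mailbox, using only the Python standard library.
# The paper proposes its own extraction method; this is just a common heuristic:
# follow In-Reply-To/References links, and fall back to the normalized subject.
import mailbox
import re
from collections import defaultdict

def norm_subject(subject):
    """Strip 'Re:'/'Fwd:' prefixes and whitespace so replies share one key."""
    return re.sub(r'^(\s*(re|fwd?)\s*:\s*)+', '', (subject or '').strip(), flags=re.I).lower()

def build_threads(mbox_path):
    parent = {}                      # message-id -> id of the message it replies to
    by_subject = defaultdict(list)
    for msg in mailbox.mbox(mbox_path):
        mid = msg.get('Message-ID')
        refs = (msg.get('In-Reply-To') or msg.get('References') or '').split()
        if mid:
            parent[mid] = refs[-1] if refs else None
            by_subject[norm_subject(msg.get('Subject'))].append(mid)
    return parent, by_subject

parents, subject_groups = build_threads('enron_sample.mbox')   # hypothetical file name
print(f"{len(parents)} messages, {len(subject_groups)} subject-level threads")
```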

  2. Online model-based diagnosis to support autonomous operation of an advanced life support system

    NASA Technical Reports Server (NTRS)

    Biswas, Gautam; Manders, Eric-Jan; Ramirez, John; Mahadevan, Nagabhusan; Abdelwahed, Sherif

    2004-01-01

    This article describes methods for online model-based diagnosis of subsystems of the advanced life support system (ALS). The diagnosis methodology is tailored to detect, isolate, and identify faults in components of the system quickly so that fault-adaptive control techniques can be applied to maintain system operation without interruption. We describe the components of our hybrid modeling scheme and the diagnosis methodology, and then demonstrate the effectiveness of this methodology by building a detailed model of the reverse osmosis (RO) system of the water recovery system (WRS) of the ALS. This model is validated with real data collected from an experimental testbed at NASA JSC. A number of diagnosis experiments run on simulated faulty data are presented and the results are discussed.

  3. Online Detection and Modeling of Safety Boundaries for Aerospace Application Using Bayesian Statistics

    NASA Technical Reports Server (NTRS)

    He, Yuning

    2015-01-01

    The behavior of complex aerospace systems is governed by numerous parameters. For safety analysis it is important to understand how the system behaves with respect to these parameter values. In particular, understanding the boundaries between safe and unsafe regions is of major importance. In this paper, we describe a hierarchical Bayesian statistical modeling approach for the online detection and characterization of such boundaries. Our method for classification with active learning uses a particle filter-based model and a boundary-aware metric for best performance. From a library of candidate shapes incorporated with domain expert knowledge, the location and parameters of the boundaries are estimated using advanced Bayesian modeling techniques. The results of our boundary analysis are then provided in a form understandable by the domain expert. We illustrate our approach using a simulation model of a NASA neuro-adaptive flight control system, as well as a system for the detection of separation violations in the terminal airspace.

  4. Online model-based diagnosis to support autonomous operation of an advanced life support system.

    PubMed

    Biswas, Gautam; Manders, Eric-Jan; Ramirez, John; Mahadevan, Nagabhusan; Abdelwahed, Sherif

    2004-01-01

    This article describes methods for online model-based diagnosis of subsystems of the advanced life support system (ALS). The diagnosis methodology is tailored to detect, isolate, and identify faults in components of the system quickly so that fault-adaptive control techniques can be applied to maintain system operation without interruption. We describe the components of our hybrid modeling scheme and the diagnosis methodology, and then demonstrate the effectiveness of this methodology by building a detailed model of the reverse osmosis (RO) system of the water recovery system (WRS) of the ALS. This model is validated with real data collected from an experimental testbed at NASA JSC. A number of diagnosis experiments run on simulated faulty data are presented and the results are discussed. PMID:15880907

  5. An online spatio-temporal prediction model for dengue fever epidemic in Kaohsiung, Taiwan

    NASA Astrophysics Data System (ADS)

    Cheng, Ming-Hung; Yu, Hwa-Lung; Angulo, Jose; Christakos, George

    2013-04-01

    Dengue Fever (DF) is one of the most serious vector-borne infectious diseases in tropical and subtropical areas. DF epidemics occur in Taiwan annually, especially during the summer and fall seasons. Kaohsiung city has been one of the major DF hotspots for decades. The emergence and re-emergence of the DF epidemic is complex and can be influenced by various factors, including the space-time dynamics of human and vector populations and virus serotypes, as well as the associated uncertainties. This study integrates a stochastic space-time "Susceptible-Infected-Recovered" model under a Bayesian maximum entropy framework (BME-SIR) to perform real-time prediction of disease diffusion across space-time. The proposed model is applied for spatiotemporal prediction of the DF epidemic in Kaohsiung city during 2002, when the historical series of high DF cases was recorded. The online prediction by the BME-SIR model updates the parameters of the SIR model and the infected cases across districts over time. Results show that the proposed model is robust to the initial guess of the unknown model parameters, i.e. transmission and recovery rates, which can depend upon the virus serotypes and various human interventions. This study shows that spatial diffusion can be well characterized by the BME-SIR model, especially at the districts surrounding the disease outbreak locations. The prediction performance at DF hotspots, i.e. Cianjhen and Sanmin, can be degraded by the implementation of various disease control strategies during the epidemics. The proposed online disease prediction BME-SIR model can provide governmental agencies with a valuable reference to identify, control, and efficiently prevent DF spread across space-time in a timely manner.
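
    The study embeds the SIR dynamics in a Bayesian maximum entropy space-time framework with sequential parameter updating; the sketch below shows only the underlying deterministic SIR compartmental model integrated with SciPy, using an illustrative population size and assumed rates.

```python
# Deterministic SIR core only; the study wraps this in a Bayesian maximum entropy
# space-time framework with sequential parameter updating. All rates and the
# population size below are illustrative, not fitted values from the paper.
import numpy as np
from scipy.integrate import solve_ivp

def sir(t, y, beta, gamma, N):
    S, I, R = y
    dS = -beta * S * I / N            # new infections leave the susceptible pool
    dI = beta * S * I / N - gamma * I
    dR = gamma * I
    return [dS, dI, dR]

N = 1_500_000                         # illustrative city-scale population
beta, gamma = 0.4, 0.25               # assumed transmission and recovery rates per day
sol = solve_ivp(sir, (0, 180), [N - 10, 10, 0], args=(beta, gamma, N),
                t_eval=np.linspace(0, 180, 181))

peak_day = int(sol.t[sol.y[1].argmax()])
print(f"Peak of {int(sol.y[1].max())} infected around day {peak_day}")
```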

  6. An information diffusion model based on retweeting mechanism for online social media

    NASA Astrophysics Data System (ADS)

    Xiong, Fei; Liu, Yun; Zhang, Zhen-jiang; Zhu, Jiang; Zhang, Ying

    2012-06-01

    To characterize information propagation on online microblogs, we propose a diffusion model (SCIR) which contains four possible states: Susceptible, contacted, infected and refractory. Agents that have read the information but have not decided to spread it stay in the contacted state. They may become infected or refractory, and both the infected and refractory states are stable. Results show that during the evolution process, more contacted agents appear in scale-free networks than in regular lattices. The degree-based density of infected agents increases monotonically with the degree, but a larger average network degree does not always mean less relaxation time.
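
    The abstract does not give the transition probabilities or the exact update rule, so the following agent-based sketch of the S to C to {I, R} process on a Barabási-Albert (scale-free) network uses assumed per-step probabilities purely for illustration.

```python
# Minimal agent-based S -> C -> {I, R} simulation on a scale-free network.
# The per-step transition probabilities below are assumptions for illustration,
# not values from the paper; infected and refractory agents never change state.
import random
from collections import Counter

import networkx as nx

random.seed(1)
G = nx.barabasi_albert_graph(2000, 3)
state = {n: 'S' for n in G}               # S, C (contacted), I (infected), R (refractory)
state[0] = 'I'                            # seed node that first posts the information

p_read, p_spread, p_drop = 0.3, 0.2, 0.1  # assumed probabilities per time step

for step in range(50):
    new_state = dict(state)
    for node in G:
        if state[node] == 'S':
            if any(state[nb] == 'I' for nb in G[node]) and random.random() < p_read:
                new_state[node] = 'C'     # has read the information, undecided
        elif state[node] == 'C':
            r = random.random()
            if r < p_spread:
                new_state[node] = 'I'     # decides to retweet (stable)
            elif r < p_spread + p_drop:
                new_state[node] = 'R'     # decides not to retweet (stable)
    state = new_state

print(Counter(state.values()))
```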

  7. Assessment of Energy Removal Impacts on Physical Systems: Hydrodynamic Model Domain Expansion and Refinement, and Online Dissemination of Model Results

    SciTech Connect

    Yang, Zhaoqing; Khangaonkar, Tarang; Wang, Taiping

    2010-08-01

    In this report we describe 1) the expansion of the PNNL hydrodynamic model domain to include the continental shelf along the coasts of Washington, Oregon, and Vancouver Island; and 2) the approach and progress in developing the online/Internet dissemination of model results and outreach efforts in support of the Puget Sound Operational Forecast System (PS-OPF). Submittal of this report completes the work on Task 2.1.2, Effects of Physical Systems, Subtask 2.1.2.1, Hydrodynamics, for fiscal year 2010 of the Environmental Effects of Marine and Hydrokinetic Energy project.

  8. PHYML Online—a web server for fast maximum likelihood-based phylogenetic inference

    PubMed Central

    Guindon, Stéphane; Lethiec, Franck; Duroux, Patrice; Gascuel, Olivier

    2005-01-01

    PHYML Online is a web interface to PHYML, a software that implements a fast and accurate heuristic for estimating maximum likelihood phylogenies from DNA and protein sequences. This tool provides the user with a number of options, e.g. nonparametric bootstrap and estimation of various evolutionary parameters, in order to perform comprehensive phylogenetic analyses on large datasets in reasonable computing time. The server and its documentation are available at . PMID:15980534

  9. Online NIR Analysis and Prediction Model for Synthesis Process of Ethyl 2-Chloropropionate

    PubMed Central

    Zhang, Wei; Song, Hang; Lu, Jing; Liu, Wen; Nie, Lirong; Yao, Shun

    2015-01-01

    Online near-infrared spectroscopy was used as a process analysis technique in the synthesis of ethyl 2-chloropropionate for the first time. A partial least squares regression (PLSR) quantitative model of the product solution concentration was then established and optimized. The correlation coefficient (R2) of the PLSR calibration model was 0.9944, and the root mean square error of calibration (RMSEC) was 0.018105 mol/L. These values of R2 and RMSEC indicate that the quantitative calibration model performed well. Moreover, the root mean square error of prediction (RMSEP) for the validation set was 0.036429 mol/L. The results were very similar to those of offline gas chromatographic analysis, which shows that the method is valid. PMID:26366175
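
    A scikit-learn analogue of the calibration/validation workflow described above (PLSR model, R2 and RMSEC on the calibration set, RMSEP on the validation set) is sketched below on synthetic "spectra"; the component count, noise level and concentration range are illustrative rather than taken from the study.

```python
# PLSR calibration/validation sketch on synthetic "spectra"; the study built its
# model on real online NIR measurements, so the component count, noise level and
# concentration range below are illustrative only.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
conc = rng.uniform(0.1, 1.0, 120)                                   # concentration, mol/L
wavelengths = np.linspace(0.0, 1.0, 200)
spectra = np.outer(conc, np.exp(-(wavelengths - 0.5) ** 2 / 0.01))  # one absorption band
spectra += 0.01 * rng.standard_normal(spectra.shape)                # instrument noise

X_cal, X_val, y_cal, y_val = train_test_split(spectra, conc, test_size=0.3, random_state=0)
pls = PLSRegression(n_components=3).fit(X_cal, y_cal)

y_cal_hat = pls.predict(X_cal).ravel()
y_val_hat = pls.predict(X_val).ravel()
rmsec = np.sqrt(mean_squared_error(y_cal, y_cal_hat))
rmsep = np.sqrt(mean_squared_error(y_val, y_val_hat))
print(f"R2 (calibration) = {r2_score(y_cal, y_cal_hat):.4f}")
print(f"RMSEC = {rmsec:.4f} mol/L, RMSEP = {rmsep:.4f} mol/L")
```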

  10. Online Simulations and Forecasts of the Global Aerosol Distribution in the NASA GEOS-5 Model

    NASA Technical Reports Server (NTRS)

    Colarco, Peter

    2006-01-01

    We present an analysis of simulations of the global aerosol system in the NASA GEOS-5 transport, radiation, and chemistry model. The model includes representations of all major tropospheric aerosol species, including dust, sea salt, black carbon, particulate organic matter, and sulfates. The aerosols are run online for the period 2000 through 2005 in a simulation driven by assimilated meteorology from the NASA Goddard Data Assimilation System. Aerosol surface mass concentrations are compared with existing long-term surface measurement networks. Aerosol optical thickness is compared with ground-based AERONET sun photometry and space-based retrievals from MODIS, MISR, and OMI. Particular emphasis is placed here on consistent sampling of model and satellite aerosol optical thickness to account for diurnal variations in aerosol optical properties. Additionally, we illustrate the use of this system for providing chemical weather forecasts in support of various NASA and community field missions.

  11. On-line updating of a distributed flow routing model - River Vistula case study

    NASA Astrophysics Data System (ADS)

    Karamuz, Emilia; Romanowicz, Renata; Napiorkowski, Jaroslaw

    2015-04-01

    This paper presents an application of methods of on-line updating in the River Vistula flow forecasting system. All flow-routing codes make simplifying assumptions and consider only a reduced set of the processes known to occur during a flood. Hence, all models are subject to a degree of structural error that is typically compensated for by calibration of the friction parameters. Calibrated parameter values are not, therefore, physically realistic, as in estimating them we also make allowance for a number of distinctly non-physical effects, such as model structural error and any energy losses or flow processes which occur at sub-grid scales. Calibrated model parameters are therefore area-effective, scale-dependent values which are not drawn from the same underlying statistical distribution as the equivalent at-a-point parameter of the same name. The aim of this paper is the derivation of real-time updated, on-line flow forecasts at certain strategic locations along the river, over a specified time horizon into the future, based on information on the behaviour of the flood wave upstream and available on-line measurements at a site. Depending on the length of the river reach and the slope of the river bed, a realistic forecast lead time, obtained in this manner, may range from hours to days. The information upstream can include observations of river levels and/or rainfall measurements. The proposed forecasting system will integrate distributed modelling, acting as a spatial interpolator with lumped parameter Stochastic Transfer Function models. Daily stage data from gauging stations are typically available at sites 10-60 km apart and test only the average routing performance of hydraulic models and not their ability to produce spatial predictions. Application of a distributed flow routing model makes it possible to interpolate forecasts both in time and space. This work was partly supported by the project "Stochastic flood forecasting system (The River Vistula reach

  12. User Evaluation of the NASA Technical Report Server Recommendation Service

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Bollen, Johan; Calhoun, JoAnne R.; Mackey, Calvin E.

    2004-01-01

    We present the user evaluation of two recommendation server methodologies implemented for the NASA Technical Report Server (NTRS). One methodology for generating recommendations uses log analysis to identify co-retrieval events on full-text documents. For comparison, we used the Vector Space Model (VSM) as the second methodology. We calculated cosine similarities and used the top 10 most similar documents (based on metadata) as 'recommendations'. We then ran an experiment with NASA Langley Research Center (LaRC) staff members to gather their feedback on which method produced the most 'quality' recommendations. We found that in most cases VSM outperformed log analysis of co-retrievals. However, analyzing the data revealed the evaluations may have been structurally biased in favor of the VSM generated recommendations. We explore some possible methods for combining log analysis and VSM generated recommendations and suggest areas of future work.
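
    A minimal version of the VSM side of this comparison (TF-IDF vectors over document metadata, cosine similarity, top-N most similar documents returned as recommendations) can be sketched as follows; the metadata strings are placeholders, not NTRS records.

```python
# Minimal VSM recommender over document metadata: TF-IDF vectors, cosine
# similarity, top-N most similar documents. The metadata below is placeholder text.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

metadata = [
    "parallel extensions to matlab for large linear algebra problems",
    "web server for protein structure refinement",
    "client server architecture for technical report retrieval",
    "recommendation service based on retrieval log analysis",
]

tfidf = TfidfVectorizer(stop_words="english").fit_transform(metadata)
sims = cosine_similarity(tfidf)

query = 3                                  # index of the document we recommend for
top = sims[query].argsort()[::-1][1:3]     # skip the document itself, take the next two
print("Recommendations for doc", query, "->", list(top))
```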

  13. Experience of public procurement of Open Compute servers

    NASA Astrophysics Data System (ADS)

    Bärring, Olof; Guerri, Marco; Bonfillou, Eric; Valsan, Liviu; Grigore, Alexandru; Dore, Vincent; Gentit, Alain; Clement, Benoît; Grossir, Anthony

    2015-12-01

    The Open Compute Project, OCP (http://www.opencompute.org/), was launched by Facebook in 2011 with the objective of building efficient computing infrastructures at the lowest possible cost. The technologies are released as open hardware, with the goal of developing servers and data centres following the model traditionally associated with open source software projects. In 2013 CERN acquired a few OCP servers in order to compare performance and power consumption with standard hardware. The conclusions were that there are sufficient savings to motivate an attempt to procure a large-scale installation. One objective is to evaluate whether the OCP market is sufficiently mature and broad enough to meet the constraints of a public procurement. This paper summarizes this procurement, which started in September 2014 and involved a Request for Information (RFI) to qualify bidders and a Request for Tender (RFT).

  14. User Evaluation of the NASA Technical Report Server Recommendation Service

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Bollen, Johan; Calhoun, JoAnne R.; Mackey, Calvin E.

    2004-01-01

    We present the user evaluation of two recommendation server methodologies implemented for the NASA Technical Report Server (NTRS). One methodology for generating recommendations uses log analysis to identify co-retrieval events on full-text documents. For comparison, we used the Vector Space Model (VSM) as the second methodology. We calculated cosine similarities and used the top 10 most similar documents (based on metadata) as 'recommendations'. We then ran an experiment with NASA Langley Research Center (LaRC) staff members to gather their feedback on which method produced the most 'quality' recommendations. We found that in most cases VSM outperformed log analysis of co-retrievals. However, analyzing the data revealed the evaluations may have been structurally biased in favor of the VSM generated recommendations. We explore some possible methods for combining log analysis and VSM generated recommendations and suggest areas of future work.

  15. Surfing for Data: A Gathering Trend in Data Storage Is the Use of Web-Based Applications that Make It Easy for Authorized Users to Access Hosted Server Content with Just a Computing Device and Browser

    ERIC Educational Resources Information Center

    Technology & Learning, 2005

    2005-01-01

    In recent years, the widespread availability of networks and the flexibility of Web browsers have shifted the industry from a client-server model to a Web-based one. In the client-server model of computing, clients run applications locally, with the servers managing storage, printing functions, and network traffic. Because every client is…

  16. The Medicago truncatula gene expression atlas web server

    PubMed Central

    2009-01-01

    Background Legumes (Leguminosae or Fabaceae) play a major role in agriculture. Transcriptomics studies in the model legume species, Medicago truncatula, are instrumental in helping to formulate hypotheses about the role of legume genes. With the rapid growth of publicly available Affymetrix Medicago Genome Array GeneChip data from a great range of tissues, cell types, growth conditions, and stress treatments, the legume research community desires an effective bioinformatics system to aid efforts to interpret the Medicago genome through functional genomics. We developed the Medicago truncatula Gene Expression Atlas (MtGEA) web server for this purpose. Description The Medicago truncatula Gene Expression Atlas (MtGEA) web server is a centralized platform for analyzing the Medicago transcriptome. Currently, the web server hosts gene expression data from 156 Affymetrix GeneChip® Medicago genome arrays in 64 different experiments, covering a broad range of developmental and environmental conditions. The server enables flexible, multifaceted analyses of transcript data and provides a range of additional information about genes, including different types of annotation and links to the genome sequence, which help users formulate hypotheses about gene function. Transcript data can be accessed using Affymetrix probe identification number, DNA sequence, gene name, functional description in natural language, GO and KEGG annotation terms, and InterPro domain number. Transcripts can also be discovered through co-expression or differential expression analysis. Flexible tools to select a subset of experiments and to visualize and compare expression profiles of multiple genes have been implemented. Data can be downloaded, in part or full, in a tabular form compatible with common analytical and visualization software. The web server will be updated on a regular basis to incorporate new gene expression data and genome annotation, and is accessible at: http

  17. Using Online Space Weather Modeling Resources in a Capstone Undergraduate Course

    NASA Astrophysics Data System (ADS)

    Liemohn, M.

    2012-04-01

    The University of Michigan offers a senior-undergraduate-level course entitled "Space Weather Modeling," taken by all of the space weather concentration students in the Atmospheric, Oceanic, and Space Sciences department. This is the capstone course of our undergraduate series, using the foundational knowledge from the previous courses towards an integrative large-scale numerical modeling study. A fraction of the graduate students also take this course. Because the state-of-the-art modeling capabilities are well beyond what is possible in a single term of programming, this course uses available online model resources, in particular the Community Coordinated Modeling Center (CCMC), a multi-agency facility hosted by NASA's Goddard Space Flight Center. Students learn not only how to use the codes, but also the various options of what equations to solve to model a specific region of space and the various numerical approaches for implementing the equations within a code. The course is project-based, consisting of multiple written reports and oral presentations, and the technical communication skills are an important component of the grading rubric. Students learn how to conduct a numerical modeling study by critiquing several space weather modeling journal articles, and then carry out their own studies with several of the available codes. In the end, they are familiar with the available models and know their ranges of validity and applicability for a wide array of space weather applications.

  18. 3Drefine: an interactive web server for efficient protein structure refinement.

    PubMed

    Bhattacharya, Debswapna; Nowotny, Jackson; Cao, Renzhi; Cheng, Jianlin

    2016-07-01

    3Drefine is an interactive web server for consistent and computationally efficient protein structure refinement with the capability to perform web-based statistical and visual analysis. The 3Drefine refinement protocol utilizes iterative optimization of the hydrogen bonding network combined with atomic-level energy minimization on the optimized model, using composite physics- and knowledge-based force fields for efficient protein structure refinement. The method has been extensively evaluated on blind CASP experiments as well as on large-scale and diverse benchmark datasets and exhibits consistent improvement over the initial structure in both global and local structural quality measures. The 3Drefine web server allows for convenient protein structure refinement through a text or file input submission, email notification, provided example submission and is freely available without any registration requirement. The server also provides comprehensive analysis of submissions through various energy and statistical feedback and interactive visualization of multiple refined models through the JSmol applet that is equipped with numerous protein model analysis tools. The web server has been extensively tested and used by many users. As a result, the 3Drefine web server conveniently provides a useful tool easily accessible to the community. The 3Drefine web server has been made publicly available at the URL: http://sysbio.rnet.missouri.edu/3Drefine/. PMID:27131371

  19. 3Drefine: an interactive web server for efficient protein structure refinement

    PubMed Central

    Bhattacharya, Debswapna; Nowotny, Jackson; Cao, Renzhi; Cheng, Jianlin

    2016-01-01

    3Drefine is an interactive web server for consistent and computationally efficient protein structure refinement with the capability to perform web-based statistical and visual analysis. The 3Drefine refinement protocol utilizes iterative optimization of the hydrogen bonding network combined with atomic-level energy minimization on the optimized model, using composite physics- and knowledge-based force fields for efficient protein structure refinement. The method has been extensively evaluated on blind CASP experiments as well as on large-scale and diverse benchmark datasets and exhibits consistent improvement over the initial structure in both global and local structural quality measures. The 3Drefine web server allows for convenient protein structure refinement through a text or file input submission, email notification, provided example submission and is freely available without any registration requirement. The server also provides comprehensive analysis of submissions through various energy and statistical feedback and interactive visualization of multiple refined models through the JSmol applet that is equipped with numerous protein model analysis tools. The web server has been extensively tested and used by many users. As a result, the 3Drefine web server conveniently provides a useful tool easily accessible to the community. The 3Drefine web server has been made publicly available at the URL: http://sysbio.rnet.missouri.edu/3Drefine/. PMID:27131371

  20. Differential surface models for tactile perception of shape and on-line tracking of features

    NASA Technical Reports Server (NTRS)

    Hemami, H.

    1987-01-01

    Tactile perception of shape involves an on-line controller and a shape perceptor. The purpose of the on-line controller is to maintain gliding or rolling contact with the surface, and collect information, or track specific features of the surface such as edges of a certain sharpness. The shape perceptor uses the information to perceive, estimate the parameters of, or recognize the shape. The differential surface model depends on the information collected and on the a priori information known about the robot and its physical parameters. These differential models are certain functionals that are projections of the dynamics of the robot onto the surface gradient or onto the tangent plane. A number of differential properties may be directly measured from present day tactile sensors. Others may have to be indirectly computed from measurements. Others may constitute design objectives for distributed tactile sensors of the future. A parameterization of the surface leads to linear and nonlinear sequential parameter estimation techniques for identification of the surface. Many interesting compromises between measurement and computation are possible.

  1. Human behavior in online social systems

    NASA Astrophysics Data System (ADS)

    Grabowski, A.

    2009-06-01

    We present and study data concerning human behavior in four online social systems: (i) an Internet community of friends of over 10⁷ people, (ii) a music community website with over 10⁶ users, (iii) a gamers' community server with over 5 × 10⁶ users and (iv) a booklovers' website with over 2.5 × 10⁵ users. The purpose of those systems is different; however, their properties are very similar. We have found that the distribution of human activity (e.g., the sum of books read or songs played) has the form of a power law. Moreover, the relationship between human activity and time has a power-law form, too. We present a simple interest-driven model of the evolution of such systems which explains the emergence of two scaling regimes.
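
    One standard way to quantify such a distribution is the continuous maximum-likelihood estimate of the power-law exponent, alpha = 1 + n / sum(ln(x_i / x_min)); the abstract does not state which fitting procedure the study used, so the sketch below applies this estimator to synthetic activity counts purely as an illustration.

```python
# Standard continuous maximum-likelihood estimate of a power-law exponent,
# alpha = 1 + n / sum(ln(x_i / x_min)), applied to synthetic activity counts.
# The abstract does not say which fitting procedure the study used.
import numpy as np

rng = np.random.default_rng(0)
x_min = 1.0
alpha_true = 2.5

# Draw synthetic "activity" values (e.g. songs played) from a power law via inverse CDF
u = rng.uniform(size=100_000)
activity = x_min * (1 - u) ** (-1 / (alpha_true - 1))

alpha_hat = 1 + activity.size / np.sum(np.log(activity / x_min))
print(f"Estimated exponent: {alpha_hat:.3f} (true {alpha_true})")
```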

  2. Tiled WMS/KML Server V2

    NASA Technical Reports Server (NTRS)

    Plesea, Lucian

    2012-01-01

    This software is a higher-performance implementation of tiled WMS, with integral support for KML and time-varying data. This software is compliant with the Open Geospatial WMS standard, and supports KML natively as a WMS return type, including support for the time attribute. Regionated KML wrappers are generated that match the existing tiled WMS dataset. PNG and JPG formats are supported, and the software is implemented as an Apache 2.0 module that supports a threading execution model that is capable of supporting very high request rates. The module intercepts and responds to WMS requests that match certain patterns and returns the existing tiles. If a KML format that matches an existing pyramid and tile dataset is requested, regionated KML is generated and returned to the requesting application. In addition, KML requests that do not match the existing tile datasets generate a KML response that includes the corresponding JPG WMS request, effectively adding KML support to a backing WMS server.

  3. Server-based approach to web visualization of integrated 3-D medical image data.

    PubMed Central

    Poliakov, A. V.; Albright, E.; Corina, D.; Ojemann, G.; Martin, R. F.; Brinkley, J. F.

    2001-01-01

    Although computer processing power and network bandwidth are rapidly increasing, the average desktop is still not able to rapidly process large datasets such as 3-D medical image volumes. We have therefore developed a server side approach to this problem, in which a high performance graphics server accepts commands from web clients to load, process and render 3-D image volumes and models. The renderings are saved as 2-D snapshots on the server, where they are uploaded and displayed on the client. User interactions with the graphic interface on the client side are translated into additional commands to manipulate the 3-D scene, after which the server re-renders the scene and sends a new image to the client. Example forms-based and Java-based clients are described for a brain mapping application, but the techniques should be applicable to multiple domains where 3-D medical image visualization is of interest. PMID:11825248

  4. PockDrug-Server: a new web server for predicting pocket druggability on holo and apo proteins

    PubMed Central

    Hussein, Hiba Abi; Borrel, Alexandre; Geneix, Colette; Petitjean, Michel; Regad, Leslie; Camproux, Anne-Claude

    2015-01-01

    Predicting a protein pocket's ability to bind drug-like molecules with high affinity, i.e. druggability, is of major interest in the target identification phase of drug discovery. Therefore, pocket druggability investigations represent a key step of compound clinical progression projects. Currently computational druggability prediction models are attached to one unique pocket estimation method despite pocket estimation uncertainties. In this paper, we propose ‘PockDrug-Server’ to predict pocket druggability, efficient on both (i) estimated pockets guided by the ligand proximity (extracted by proximity to a ligand from a holo protein structure) and (ii) estimated pockets based solely on protein structure information (based on amino acids that form the surface of potential binding cavities). PockDrug-Server provides consistent druggability results using different pocket estimation methods. It is robust with respect to pocket boundary and estimation uncertainties, thus efficient using apo pockets that are challenging to estimate. It clearly distinguishes druggable from less druggable pockets using different estimation methods and outperformed recent druggability models for apo pockets. It can be carried out from one or a set of apo/holo proteins using different pocket estimation methods proposed by our web server or from any pocket previously estimated by the user. PockDrug-Server is publicly available at: http://pockdrug.rpbs.univ-paris-diderot.fr. PMID:25956651

  5. College Students' Choice Modeling of Taking On-Line International Business Courses

    ERIC Educational Resources Information Center

    Yeh, Robert S.

    2006-01-01

    To understand students' choice behavior of taking on-line international business courses, a survey study is conducted to collect information regarding students' actual choices of taking on-line courses and potential factors that may have impacts on students' choices of online learning. Potential factors such as enrollment status, demographic…

  6. HydroShare: An online, collaborative environment for the sharing of hydrologic data and models (Invited)

    NASA Astrophysics Data System (ADS)

    Tarboton, D. G.; Idaszak, R.; Horsburgh, J. S.; Ames, D.; Goodall, J. L.; Band, L. E.; Merwade, V.; Couch, A.; Arrigo, J.; Hooper, R. P.; Valentine, D. W.; Maidment, D. R.

    2013-12-01

    HydroShare is an online, collaborative system being developed for sharing hydrologic data and models. The goal of HydroShare is to enable scientists to easily discover and access data and models, retrieve them to their desktop or perform analyses in a distributed computing environment that may include grid, cloud or high performance computing model instances as necessary. Scientists may also publish outcomes (data, results or models) into HydroShare, using the system as a collaboration platform for sharing data, models and analyses. HydroShare is expanding the data sharing capability of the CUAHSI Hydrologic Information System by broadening the classes of data accommodated, creating new capability to share models and model components, and taking advantage of emerging social media functionality to enhance information about and collaboration around hydrologic data and models. One of the fundamental concepts in HydroShare is that of a Resource. All content is represented using a Resource Data Model that separates system and science metadata and has elements common to all resources as well as elements specific to the types of resources HydroShare will support. These will include different data types used in the hydrology community and models and workflows that require metadata on execution functionality. HydroShare will use the integrated Rule-Oriented Data System (iRODS) to manage federated data content and perform rule-based background actions on data and model resources, including parsing to generate metadata catalog information and the execution of models and workflows. This presentation will introduce the HydroShare functionality developed to date, describe key elements of the Resource Data Model and outline the roadmap for future development.

  7. Use of whole building simulation in on-line performance assessment: Modeling and implementation issues

    SciTech Connect

    Haves, Philip; Salsbury, Tim; Claridge, David; Liu, Mingsheng

    2001-06-15

    The application of model-based performance assessment at the whole building level is explored. The information requirements for a simulation to predict the actual performance of a particular real building, as opposed to estimating the impact of design options, are addressed with particular attention to common sources of input error and important deficiencies in most simulation models. The role of calibrated simulations is discussed. The communication requirements for passive monitoring and active testing are identified and the possibilities for using control system communications protocols to link on-line simulation and energy management and control systems are discussed. The potential of simulation programs to act as "plug-and-play" components on building control networks is discussed.

  8. CoCAR: An Online Synchronous Training Model for Empowering ICT Capacity of Teachers of Chinese as a Foreign Language

    ERIC Educational Resources Information Center

    Lan, Yu-Ju; Chang, Kuo-En; Chen, Nian-Shing

    2012-01-01

    In response to the need to cultivate pre-service Chinese as a foreign language (CFL) teachers' information and communication technology (ICT) competency in online synchronous environments, this research adopted a three-stage cyclical model named "cooperation-based cognition, action, and reflection" (CoCAR). The model was implemented in an 18-week…

  9. Measures of Quality in Online Education: An Investigation of the Community of Inquiry Model and the Net Generation

    ERIC Educational Resources Information Center

    Shea, Peter; Bidjerano, Temi

    2008-01-01

    The goal of this article is to present and validate an instrument that reflects the Community of Inquiry Model (Garrison, Anderson, & Archer, 2000, 2001) and inquire into whether the instrument and the model it reflects explain variation in levels of student learning and satisfaction with online courses in a higher education context. Additionally…

  10. A Terminology Server for medical language and medical information systems.

    PubMed

    Rector, A L; Solomon, W D; Nowlan, W A; Rush, T W; Zanstra, P E; Claassen, W M

    1995-03-01

    GALEN is developing a Terminology Server to support the development and integration of clinical systems through a range of key terminological services, built around a language-independent, re-usable, shared system of concepts--the CORE model. The focus is on supporting applications for medical records, clinical user interfaces and clinical information systems, but also includes systems for natural language understanding, clinical decision support, management of coding and classification schemes, and bibliographic retrieval. The Terminology Server integrates three modules: the Concept Module which implements the GRAIL formalism and manages the internal representation of concept entities, the Multilingual Module which manages the mapping of concept entities to natural language, and the Code Conversion Module which manages the mapping of concept entities to and from existing coding and classification schemes. The Terminology Server also provides external referencing to concept entities, coercion between data types, and makes its services available through a uniform applications programming interface. Taken together these services represent a new approach to the development of clinical systems and the sharing of medical knowledge. PMID:9082124

  11. Mobile object retrieval in server-based image databases

    NASA Astrophysics Data System (ADS)

    Manger, D.; Pagel, F.; Widak, H.

    2013-05-01

    The increasing number of mobile phones equipped with powerful cameras leads to huge collections of user-generated images. To utilize the information in the images on site, image retrieval systems that search for similar objects in a user's own image database are becoming more and more popular. As the computational performance and the memory capacity of mobile devices are constantly increasing, this search can often be performed on the device itself. This is feasible, for example, if the images are represented with global image features or if the search is done using EXIF or textual metadata. However, for larger image databases, if multiple users are meant to contribute to a growing image database or if powerful content-based image retrieval methods with local features are required, a server-based image retrieval backend is needed. In this work, we present a content-based image retrieval system with a client-server architecture working with local features. On the server side, the scalability to large image databases is addressed with the popular bag-of-words model with state-of-the-art extensions. The client end of the system focuses on a lightweight user interface presenting the most similar images in the database and highlighting the visual information that is common with the query image. Additionally, new images can be added to the database, making it a powerful and interactive tool for mobile content-based image retrieval.
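
    A stripped-down sketch of the server-side bag-of-visual-words index described above (local descriptors, a k-means vocabulary, histogram comparison) is shown below using OpenCV and scikit-learn; the state-of-the-art extensions mentioned in the abstract (inverted files, tf-idf weighting, geometric verification) are omitted, and the image paths are placeholders.

```python
# Bag-of-visual-words sketch: ORB local descriptors, a k-means vocabulary, and
# histogram matching. A production backend adds inverted files, tf-idf weighting
# and geometric verification, which are omitted here. Image paths are placeholders.
import cv2
import numpy as np
from sklearn.cluster import KMeans

orb = cv2.ORB_create(nfeatures=500)

def descriptors(path):
    """Extract ORB descriptors from a grayscale image file."""
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    _, des = orb.detectAndCompute(img, None)
    return des

db_paths = ["db_0001.jpg", "db_0002.jpg", "db_0003.jpg"]      # hypothetical database images
all_des = np.vstack([descriptors(p) for p in db_paths]).astype(np.float32)

vocab = KMeans(n_clusters=64, n_init=10, random_state=0).fit(all_des)

def bow_histogram(path):
    """Quantize an image's descriptors against the vocabulary and normalize."""
    words = vocab.predict(descriptors(path).astype(np.float32))
    hist = np.bincount(words, minlength=vocab.n_clusters).astype(float)
    return hist / hist.sum()

query = bow_histogram("query_from_phone.jpg")                  # hypothetical query image
scores = [np.minimum(query, bow_histogram(p)).sum() for p in db_paths]  # histogram intersection
print("Best match:", db_paths[int(np.argmax(scores))])
```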

  12. ACFIS: a web server for fragment-based drug discovery

    PubMed Central

    Hao, Ge-Fei; Jiang, Wen; Ye, Yuan-Nong; Wu, Feng-Xu; Zhu, Xiao-Lei; Guo, Feng-Biao; Yang, Guang-Fu

    2016-01-01

    In order to foster innovation and improve the effectiveness of drug discovery, there is a considerable interest in exploring unknown ‘chemical space’ to identify new bioactive compounds with novel and diverse scaffolds. Hence, fragment-based drug discovery (FBDD) was developed rapidly due to its advanced expansive search for ‘chemical space’, which can lead to a higher hit rate and ligand efficiency (LE). However, computational screening of fragments is always hampered by the promiscuous binding model. In this study, we developed a new web server Auto Core Fragment in silico Screening (ACFIS). It includes three computational modules, PARA_GEN, CORE_GEN and CAND_GEN. ACFIS can generate core fragment structure from the active molecule using fragment deconstruction analysis and perform in silico screening by growing fragments to the junction of core fragment structure. An integrated energy calculation rapidly identifies which fragments fit the binding site of a protein. We constructed a simple interface to enable users to view top-ranking molecules in 2D and the binding mode in 3D for further experimental exploration. This makes the ACFIS a highly valuable tool for drug discovery. The ACFIS web server is free and open to all users at http://chemyang.ccnu.edu.cn/ccb/server/ACFIS/. PMID:27150808

  13. ACFIS: a web server for fragment-based drug discovery.

    PubMed

    Hao, Ge-Fei; Jiang, Wen; Ye, Yuan-Nong; Wu, Feng-Xu; Zhu, Xiao-Lei; Guo, Feng-Biao; Yang, Guang-Fu

    2016-07-01

    In order to foster innovation and improve the effectiveness of drug discovery, there is a considerable interest in exploring unknown 'chemical space' to identify new bioactive compounds with novel and diverse scaffolds. Hence, fragment-based drug discovery (FBDD) was developed rapidly due to its advanced expansive search for 'chemical space', which can lead to a higher hit rate and ligand efficiency (LE). However, computational screening of fragments is always hampered by the promiscuous binding model. In this study, we developed a new web server Auto Core Fragment in silico Screening (ACFIS). It includes three computational modules, PARA_GEN, CORE_GEN and CAND_GEN. ACFIS can generate core fragment structure from the active molecule using fragment deconstruction analysis and perform in silico screening by growing fragments to the junction of core fragment structure. An integrated energy calculation rapidly identifies which fragments fit the binding site of a protein. We constructed a simple interface to enable users to view top-ranking molecules in 2D and the binding mode in 3D for further experimental exploration. This makes the ACFIS a highly valuable tool for drug discovery. The ACFIS web server is free and open to all users at http://chemyang.ccnu.edu.cn/ccb/server/ACFIS/. PMID:27150808

  14. An Online Approach for Training International Climate Scientists to Use Computer Models

    NASA Astrophysics Data System (ADS)

    Yarker, M. B.; Mesquita, M. D.; Veldore, V.

    2013-12-01

    With the mounting evidence from the work of the IPCC (2007), climate change has been acknowledged as a significant challenge to Sustainable Development by the international community. It is important that scientists in developing countries have access to knowledge and tools so that well-informed decisions can be made about climate change mitigation and adaptation. However, training researchers to use climate modeling techniques and data analysis has become a challenge, because current capacity building approaches train researchers to use climate models through short-term workshops, which requires a large amount of funding. It has also been observed that many participants who recently completed capacity building courses still view climate and weather models as a metaphorical 'black box', where data goes in and results come out; and there is evidence that these participants lack a basic understanding of the climate system. Both of these issues limit the ability of some scientists to go beyond running a model based on rote memorization of the process. As a result, they are unable to solve problems regarding run-time errors, and thus cannot determine whether or not their model simulation is reasonable. Current research in the field of science education indicates that there are effective strategies to teach learners about science models. They involve having the learner work with, experiment with, modify, and apply models in a way that is significant and informative to the learner. It has also been noted that in the case of computational models, the installation and set-up process alone can be time consuming and confusing for new users, which can hinder their ability to concentrate on using, experimenting with, and applying the model to real-world scenarios. Therefore, developing an online version of capacity building is an alternative approach to the workshop training programs, which makes use of new technologies and allows for a long-term educational process in a way

  15. Top U.S. Sources for an Online Job Search.

    ERIC Educational Resources Information Center

    Dolan, Donna R.; Schumacher, John E.

    1994-01-01

    Discusses how to look for jobs via the Internet and online databases. Highlights include Internet sources for job information, including lists, gopher servers, online "Chronicle of Higher Education," College-Wide Information Systems (CWIS), and "America's Job Bank"; options without Internet access; where to list a resume; and appropriate online…

  16. Migrating an Online Service to WAP - A Case Study.

    ERIC Educational Resources Information Center

    Klasen, Lars

    2002-01-01

    Discusses mobile access via wireless application protocol (WAP) to online services that is offered in Sweden through InfoTorg. Topics include the Swedish online market; filtering HTML data from an Internet/Web server into WML (wireless markup language); mobile phone technology; microbrowsers; WAP protocol; and future possibilities. (LRW)

  17. Online Higher Education Instruction to Foster Critical Thinking When Assessing Environmental Issues - the Brownfield Action Model

    NASA Astrophysics Data System (ADS)

    Bower, Peter; Liddicoat, Joseph; Dittrick, Diane; Maenza-Gmelch, Terryanne; Kelsey, Ryan

    2013-04-01

    According to the Environmental Protection Agency, there are presently over half a million brownfields in the United States, but this number only includes sites for which an Environmental Site Assessment has been conducted. The actual number of brownfields is certainly into the millions and constitutes one of the major environmental issues confronting all communities today. Taught in part online for more than a decade in environmental science courses at over a dozen colleges, universities, and high schools in the United States, Brownfield Action (BA) is an interactive, web-based simulation that combines scientific expertise, constructivist education philosophy, and multimedia to advance the teaching of environmental science (Bower et al., 2011). In the online simulation and classroom, students form geotechnical consulting companies, conduct environmental site assessment investigations, and work collaboratively to solve a problem in environmental forensics. The BA model contains interdisciplinary scientific and social information that is integrated within a digital learning environment that encourages students to construct their knowledge as they learn by doing. As such, the approach improves the depth and coherence of students' understanding of the course material. Like real-world environmental consultants, students are required to develop and apply expertise from a wide range of fields, including environmental science and engineering as well as journalism, medicine, public health, law, civics, economics, and business management. The overall objective is for students to gain an unprecedented appreciation of the complexity, ambiguity, and risk involved in any environmental issue or crisis.

  18. Strategies for Teaching Regional Climate Modeling: Online Professional Development for Scientists and Decision Makers

    NASA Astrophysics Data System (ADS)

    Walton, P.; Yarker, M. B.; Mesquita, M. D. S.; Otto, F. E. L.

    2014-12-01

    There is a clear role for climate science in supporting decision making at a range of scales and in a range of contexts: from global to local, from policy to industry. However clear a role climate science can play, there is also a clear discrepancy in the understanding of how to use the science and associated tools (such as climate models). Despite there being a large body of literature on the science, there is clearly a need to provide greater support in how to apply it appropriately. However, access to high quality professional development courses can be problematic, due to geographic, financial and time constraints. In an attempt to address this gap, we independently developed two online professional courses that focused on helping participants use and apply two regional climate models, WRF and PRECIS. Both courses were designed to support participants' learning through tutor led programs that covered the basic climate scientific principles of regional climate modeling and how to apply model outputs. The fundamental differences between the two courses are: 1) the WRF modeling course expected participants to design their own research question that was then run on a version of the model, whereas 2) the PRECIS course concentrated on the principles of regional modeling and how the climate science informed the modeling process. The two courses were developed to utilise the cost and time management benefits associated with eLearning, with the recognition that this mode of teaching can also be accessed internationally, providing professional development courses in countries that may not be able to provide their own. The development teams saw it as critical that the courses reflected sound educational theory, to ensure that participants had the maximum opportunity to learn successfully. In particular, the role of reflection is central to both course structures to help participants make sense of the science in relation to their own situation. This paper details the different

  19. STDP Installs in Winner-Take-All Circuits an Online Approximation to Hidden Markov Model Learning

    PubMed Central

    Kappel, David; Nessler, Bernhard; Maass, Wolfgang

    2014-01-01

    In order to cross a street without being run over, we need to be able to extract very fast hidden causes of dynamically changing multi-modal sensory stimuli, and to predict their future evolution. We show here that a generic cortical microcircuit motif, pyramidal cells with lateral excitation and inhibition, provides the basis for this difficult but all-important information processing capability. This capability emerges in the presence of noise automatically through effects of STDP on connections between pyramidal cells in Winner-Take-All circuits with lateral excitation. In fact, one can show that these motifs endow cortical microcircuits with functional properties of a hidden Markov model, a generic model for solving such tasks through probabilistic inference. Whereas in engineering applications this model is adapted to specific tasks through offline learning, we show here that a major portion of the functionality of hidden Markov models arises already from online applications of STDP, without any supervision or rewards. We demonstrate the emergent computing capabilities of the model through several computer simulations. The full power of hidden Markov model learning can be attained through reward-gated STDP. This is due to the fact that these mechanisms enable a rejection sampling approximation to theoretically optimal learning. We investigate the possible performance gain that can be achieved with this more accurate learning method for an artificial grammar task. PMID:24675787
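
    The core idea, stripped of all spiking and plasticity detail, can be caricatured in a few lines: a soft winner-take-all competition whose winner moves its weights toward the current input gradually recovers the hidden causes behind the data. The sketch below is a heavily simplified rate-based illustration of that flavour, not the authors' STDP model; all parameters and the toy data are arbitrary.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy data: noisy binary vectors generated from 3 hidden "causes"
    prototypes = rng.random((3, 20)) < 0.3
    X = np.array([np.where(rng.random(20) < 0.9,
                           prototypes[rng.integers(3)],
                           rng.random(20) < 0.3)
                  for _ in range(2000)]).astype(float)

    K, D = 5, X.shape[1]              # number of winner-take-all units, input dimension
    W = rng.normal(0.0, 0.1, (K, D))  # "synaptic" weights
    eta = 0.05                        # learning rate

    for x in X:                       # one-pass, online updates
        u = W @ x                                   # unit activations
        p = np.exp(u - u.max()); p /= p.sum()       # soft competition (lateral inhibition)
        k = rng.choice(K, p=p)                      # stochastic winner
        W[k] += eta * (x - W[k])                    # winner moves toward the input

    # Rows of W should now resemble the hidden prototypes (plus some unused units)
    print(np.round(W, 2))
    ```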

  20. Gas Path On-line Fault Diagnostics Using a Nonlinear Integrated Model for Gas Turbine Engines

    NASA Astrophysics Data System (ADS)

    Lu, Feng; Huang, Jin-quan; Ji, Chun-sheng; Zhang, Dong-dong; Jiao, Hua-bin

    2014-08-01

    Gas turbine engine gas path fault diagnosis is a key technology that assists operators in managing engine units. However, gradual performance degradation is inevitable with usage, and it results in model mismatch and consequent misdiagnosis by the popular model-based approaches. In this paper, an on-line integrated architecture based on nonlinear models is developed for gas turbine engine anomaly detection and fault diagnosis over the course of the engine's life. The two engine models have different performance parameter update rates. One is a nonlinear real-time adaptive performance model using the spherical square-root unscented Kalman filter (SSR-UKF) to produce performance estimates, and the other is a nonlinear baseline model for the measurement estimates. The fault detection and diagnosis logic is designed to discriminate between sensor faults and component faults. This integrated architecture is not only aware of long-term engine health degradation but also effective in detecting gas path performance anomaly shifts while the engine continues to degrade. The benefits of the proposed approach relative to existing architectures are investigated through experiment and analysis.
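
    The detection logic described above can be pictured as comparing measurements against a baseline model's estimates and flagging samples whose normalized residuals are too large. The sketch below illustrates only that residual test with synthetic numbers; it is not the authors' SSR-UKF architecture.

    ```python
    import numpy as np

    def detect_anomaly(measurements, baseline_estimates, sigma, threshold=3.0):
        """Flag samples whose normalized residuals exceed the threshold.

        measurements, baseline_estimates: arrays of shape (n_samples, n_sensors)
        sigma: per-sensor noise standard deviations, shape (n_sensors,)
        """
        residuals = (measurements - baseline_estimates) / sigma
        wssr = np.sum(residuals ** 2, axis=1)          # chi-square-like statistic per sample
        return wssr > threshold ** 2 * measurements.shape[1]

    # Toy usage with synthetic data (not engine measurements)
    rng = np.random.default_rng(1)
    baseline = np.ones((100, 4))
    measured = baseline + rng.normal(0, 0.01, baseline.shape)
    measured[60:] += 0.2                               # simulated gas path performance shift
    flags = detect_anomaly(measured, baseline, sigma=np.full(4, 0.01))
    print("first flagged sample:", int(np.argmax(flags)))
    ```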

  1. Methodology to model the energy and greenhouse gas emissions of electronic software distributions.

    PubMed

    Williams, Daniel R; Tang, Yinshan

    2012-01-17

    A new electronic software distribution (ESD) life cycle analysis (LCA) methodology and model structure were constructed to calculate energy consumption and greenhouse gas (GHG) emissions. In order to counteract the use of high level, top-down modeling efforts, and to increase result accuracy, a focus upon device details and data routes was taken. In order to compare ESD to a relevant physical distribution alternative, physical model boundaries and variables were described. The methodology was compiled from the analysis and operational data of a major online store which provides ESD and physical distribution options. The ESD method included the calculation of power consumption of data center server and networking devices. An in-depth method to calculate server efficiency and utilization was also included to account for virtualization and server efficiency features. Internet transfer power consumption was analyzed taking into account the number of data hops and networking devices used. The power consumed by online browsing and downloading was also factored into the model. The embedded CO2e of server and networking devices was proportioned to each ESD process. Three U.K.-based ESD scenarios were analyzed using the model which revealed potential CO2e savings of 83% when ESD was used over physical distribution. Results also highlighted the importance of server efficiency and utilization methods. PMID:22107078
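
    The methodology amounts to attributing device power and network transfer energy to each download. The toy calculation below follows that structure with purely illustrative placeholder values; none of the figures come from the paper.

    ```python
    # All figures below are illustrative placeholders, not values from the study.
    file_size_gb       = 5.0      # size of the distributed software
    server_power_w     = 400.0    # data-center server power draw
    server_utilization = 0.5      # fraction of the server attributable to this workload
    serve_time_h       = 0.02     # server time spent serving one download
    hops               = 12       # networking devices traversed
    energy_per_hop_kwh_per_gb = 0.001   # network transfer energy per GB per hop
    client_power_w     = 60.0     # power of the downloading client device
    download_time_h    = 0.5
    grid_kg_co2e_per_kwh = 0.45   # grid carbon intensity

    server_kwh  = server_power_w / 1000 * serve_time_h / server_utilization
    network_kwh = file_size_gb * hops * energy_per_hop_kwh_per_gb
    client_kwh  = client_power_w / 1000 * download_time_h

    total_kwh = server_kwh + network_kwh + client_kwh
    print(f"energy per download: {total_kwh:.3f} kWh, "
          f"emissions: {total_kwh * grid_kg_co2e_per_kwh:.3f} kg CO2e")
    ```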

  2. The RAMI On-line Model Checker (ROMC): A tool for the automated evaluation of canopy reflectance models.

    NASA Astrophysics Data System (ADS)

    Widlowski, J.-L.; Robustelli, M.; Taberner, M.; Pinty, B.; Rami Participants, All

    The Radiative transfer Model Intercomparison (RAMI) exercise was first launched in 1999 and then again in 2002 and 2005. RAMI aims at evaluating the performance of canopy reflectance models in the absence of any absolute reference truth. It does so by intercomparing models over a large ensemble of test cases under a variety of spectral and illumination conditions. A series of criteria can be applied to select an ensemble of mutually agreeing 3-D Monte Carlo models to provide a surrogate truth against which all other models can then be compared. We will present an overview of the RAMI activities and show how the results of the latest phase have led to the development of the RAMI Online Model Checker (ROMC). This tool allows both model developers and users to evaluate the performance of their canopy reflectance models (a) against previous RAMI test cases whose results have already been published in the literature, and (b) against test cases that are similar to the RAMI cases but for which no results will be known a priori. As such, the ROMC allows models to be debugged and/or validated autonomously on a limited number of test cases. RAMI-certified graphics that document a model's performance can be downloaded for future use in scientific presentations and/or publications.

  3. A Supervised Learning Process to Validate Online Disease Reports for Use in Predictive Models

    PubMed Central

    Patching, Helena M.M.; Hudson, Laurence M.; Cooke, Warrick; Garcia, Andres J.; Hay, Simon I.; Roberts, Mark; Moyes, Catherine L.

    2015-01-01

    Pathogen distribution models that predict spatial variation in disease occurrence require data from a large number of geographic locations to generate disease risk maps. Traditionally, this process has used data from public health reporting systems; however, using online reports of new infections could speed up the process dramatically. Data from both public health systems and online sources must be validated before they can be used, but no mechanisms exist to validate data from online media reports. We have developed a supervised learning process to validate geolocated disease outbreak data in a timely manner. The process uses three input features: the data source and two metrics derived from the location of each disease occurrence. The location of disease occurrence provides information on the probability of disease occurrence at that location based on environmental and socioeconomic factors and the distance within or outside the current known disease extent. The process also uses validation scores, generated by disease experts who review a subset of the data, to build a training data set. The aim of the supervised learning process is to generate validation scores that can be used as weights going into the pathogen distribution model. After analyzing the three input features and testing the performance of alternative processes, we selected a cascade of ensembles comprising logistic regressors. Parameter values for the training data subset size, number of predictors, and number of layers in the cascade were tested before the process was deployed. The final configuration was tested using data for two contrasting diseases (dengue and cholera), and 66%–79% of data points were assigned a validation score. The remaining data points are scored by the experts, and the results inform the training data set for the next set of predictors, as well as going to the pathogen distribution model. The new supervised learning process has been implemented within our live
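
    As a rough sketch of the general idea, the snippet below trains a single bagged ensemble of logistic regressors (rather than the authors' multi-layer cascade) on hypothetical expert-scored features and uses the predicted probability as a validation score; the feature names and data are invented.

    ```python
    import numpy as np
    from sklearn.ensemble import BaggingClassifier
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    # Hypothetical expert-scored training set:
    # columns = [source reliability, environmental suitability, distance to known extent]
    X_train = rng.random((500, 3))
    y_train = (0.5 * X_train[:, 0] + 0.4 * X_train[:, 1] - 0.3 * X_train[:, 2]
               + rng.normal(0, 0.1, 500)) > 0.3     # 1 = expert judged the report valid

    # One bagged ensemble of logistic regressors (scikit-learn >= 1.2 uses `estimator`)
    ensemble = BaggingClassifier(estimator=LogisticRegression(max_iter=1000),
                                 n_estimators=50, random_state=0)
    ensemble.fit(X_train, y_train)

    # Scores for new, unreviewed online reports; these become weights for the distribution model
    X_new = rng.random((10, 3))
    validation_scores = ensemble.predict_proba(X_new)[:, 1]
    print(np.round(validation_scores, 2))
    ```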

  4. Servers for sequence–structure relationship analysis and prediction

    PubMed Central

    Dosztányi, Zsuzsanna; Magyar, Csaba; Tusnády, Gábor E.; Cserző, Miklós; Fiser, András; Simon, István

    2003-01-01

    We describe several algorithms and public servers that were developed to analyze and predict various features of protein structures. These servers provide information about the covalent state of cysteine (CYSREDOX), as well as about residues involved in non-covalent cross links that play an important role in the structural stability of proteins (SCIDE and SCPRED). We also discuss methods and servers developed to identify helical transmembrane proteins from large databases and rough genomic data, including two of the most popular transmembrane prediction methods, DAS and HMMTOP. Several biologically interesting applications of these servers are also presented. The servers are available through http://www.enzim.hu/servers.html. PMID:12824327

  5. Servers for sequence-structure relationship analysis and prediction.

    PubMed

    Dosztányi, Zsuzsanna; Magyar, Csaba; Tusnády, Gábor E; Cserzo, Miklós; Fiser, András; Simon, István

    2003-07-01

    We describe several algorithms and public servers that were developed to analyze and predict various features of protein structures. These servers provide information about the covalent state of cysteine (CYSREDOX), as well as about residues involved in non-covalent cross links that play an important role in the structural stability of proteins (SCIDE and SCPRED). We also discuss methods and servers developed to identify helical transmembrane proteins from large databases and rough genomic data, including two of the most popular transmembrane prediction methods, DAS and HMMTOP. Several biologically interesting applications of these servers are also presented. The servers are available through http://www.enzim.hu/servers.html. PMID:12824327

  6. Optimizing the NASA Technical Report Server

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Maa, Ming-Hokng

    1996-01-01

    The NASA Technical Report Server (NTRS), a World Wide Web distribution service for NASA technical publications, is modified for performance enhancement, greater protocol support, and human interface optimization. Results include: parallel database queries, significantly decreasing user access times by an average factor of 2.3; access from clients behind firewalls and/or proxies which truncate excessively long Uniform Resource Locators (URLs); access to non-Wide Area Information Server (WAIS) databases and compatibility with the Z39.50 protocol; and a streamlined user interface.

  7. A Server-Based Mobile Coaching System

    PubMed Central

    Baca, Arnold; Kornfeind, Philipp; Preuschl, Emanuel; Bichler, Sebastian; Tampier, Martin; Novatchkov, Hristo

    2010-01-01

    A prototype system for monitoring, transmitting and processing performance data in sports for the purpose of providing feedback has been developed. During training, athletes are equipped with a mobile device and wireless sensors using the ANT protocol in order to acquire biomechanical, physiological and other sports specific parameters. The measured data is buffered locally and forwarded via the Internet to a server. The server provides experts (coaches, biomechanists, sports medicine specialists etc.) with remote data access, analysis and (partly automated) feedback routines. In this way, experts are able to analyze the athlete’s performance and return individual feedback messages from remote locations. PMID:22163490

  8. SERVER DEVELOPMENT FOR NSLS-II PHYSICS APPLICATIONS AND PERFORMANCE ANALYSIS

    SciTech Connect

    Shen, G.; Kraimer, M.

    2011-03-28

    The beam commissioning software framework of the NSLS-II project adopts a client/server based architecture to replace the more traditional monolithic high level application approach. The server software under development is available via an open source sourceforge project named epics-pvdata, which consists of modules pvData, pvAccess, pvIOC, and pvService. Examples of two services that already exist in the pvService module are itemFinder and gather. Each service uses pvData to store in-memory transient data, pvAccess to transfer data over the network, and pvIOC as the service engine. Performance benchmarks for pvAccess and for both the gather and itemFinder services are presented in this paper, along with a performance comparison between pvAccess and Channel Access. For an ultra-low-emittance synchrotron radiation light source like NSLS-II, the control system requirements, especially for beam control, are tight. To control and manipulate the beam effectively, a use case study has been performed to satisfy the requirements, and a theoretical evaluation has been carried out. The analysis shows that model-based control is indispensable for beam commissioning and routine operation. However, there are many challenges, such as how to re-use a design model for on-line model-based control, and how to combine the numerical methods for modeling of a realistic lattice with the analytical techniques for analysis of its properties. To satisfy these requirements and challenges, an adequate system architecture for the software framework for beam commissioning and operation is critical. The existing traditional approaches are self-consistent and monolithic. Some of them have adopted a concept of a middle layer to separate low level hardware processing from numerical algorithm computing, physics modelling, data manipulation and plotting, and error handling. However, none of the existing approaches can satisfy the requirement. A new design has been proposed by introducing service

  9. Online Learning. Symposium.

    ERIC Educational Resources Information Center

    2002

    This document contains three papers from a symposium on online learning that was conducted as part of a conference on human resource development (HRD). "An Instructional Strategy Framework for Online Learning Environments" (Scott D. Johnson, Steven R. Aragon) discusses the pitfalls of modeling online courses after traditional instruction instead…

  10. A Hierarchical Neuronal Model for Generation and Online Recognition of Birdsongs

    PubMed Central

    Yildiz, Izzet B.; Kiebel, Stefan J.

    2011-01-01

    The neuronal system underlying learning, generation and recognition of song in birds is one of the best-studied systems in the neurosciences. Here, we use these experimental findings to derive a neurobiologically plausible, dynamic, hierarchical model of birdsong generation and transform it into a functional model of birdsong recognition. The generation model consists of neuronal rate models and includes critical anatomical components like the premotor song-control nucleus HVC (proper name), the premotor nucleus RA (robust nucleus of the arcopallium), and a model of the syringeal and respiratory organs. We use Bayesian inference of this dynamical system to derive a possible mechanism for how birds can efficiently and robustly recognize the songs of their conspecifics in an online fashion. Our results indicate that the specific way birdsong is generated enables a listening bird to robustly and rapidly perceive embedded information at multiple time scales of a song. The resulting mechanism can be useful for investigating the functional roles of auditory recognition areas and providing predictions for future birdsong experiments. PMID:22194676

  11. MediaMesh: an architecture for integrating isochronous processing algorithms into media servers

    NASA Astrophysics Data System (ADS)

    Amini, Lisa D.; Lepre, Jorge; Kienzle, Martin G.

    1999-12-01

    Current media servers do not provide the generality required to easily integrate arbitrary isochronous processing algorithms into streams of continuous media. Specifically, present day video server architectures primarily focus on disk and network strategies for efficiently managing available resources under stringent QoS guarantees. However, they do not fully consider the problems of integrating the wide variety of algorithms required for interactive multimedia applications. Examples of applications benefiting from a more flexible server environment include watermarking, encrypting or scrambling streams, visual VCR operations, and multiplexing or demultiplexing of live presentations. In this paper, we detail the MediaMesh architecture for integrating arbitrary isochronous processing algorithms into general purpose media servers. Our framework features a programming model through which user-written modules can be dynamically loaded and interconnected in self-managing graphs of stream processing components. Design highlights include novel techniques for distributed stream control, efficient buffer management and QoS management. To demonstrate its applicability, we have implemented the MediaMesh architecture in the context of a commercial video server. We illustrate the viability of the architecture through performance data collected from four processing modules that were implemented to facilitate new classes of applications on our video server.
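
    The programming model described above, user-written modules interconnected into graphs of stream-processing components, can be caricatured as a chain of generators over frame buffers. The names and interfaces below are invented for illustration and are not MediaMesh APIs.

    ```python
    from typing import Callable, Iterable, Iterator

    Module = Callable[[Iterator[bytes]], Iterator[bytes]]

    def watermark(frames: Iterator[bytes]) -> Iterator[bytes]:
        for frame in frames:
            yield frame + b"|WM"                     # stand-in for a real watermarking step

    def scramble(frames: Iterator[bytes]) -> Iterator[bytes]:
        for frame in frames:
            yield bytes(b ^ 0x5A for b in frame)     # stand-in for encryption/scrambling

    def pipeline(source: Iterable[bytes], modules: list[Module]) -> Iterator[bytes]:
        stream: Iterator[bytes] = iter(source)
        for module in modules:                       # modules interconnected into a processing chain
            stream = module(stream)
        return stream

    frames = [b"frame-%d" % i for i in range(3)]
    for out in pipeline(frames, [watermark, scramble]):
        print(out)
    ```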

  12. Combining next-generation sequencing and online databases for microsatellite development in non-model organisms.

    PubMed

    Rico, Ciro; Normandeau, Eric; Dion-Côté, Anne-Marie; Rico, María Inés; Côté, Guillaume; Bernatchez, Louis

    2013-01-01

    Next-generation sequencing (NGS) is revolutionising marker development and the rapidly increasing amount of transcriptomes published across a wide variety of taxa is providing valuable sequence databases for the identification of genetic markers without the need to generate new sequences. Microsatellites are still the most important source of polymorphic markers in ecology and evolution. Motivated by our long-term interest in the adaptive radiation of a non-model species complex of whitefishes (Coregonus spp.), in this study, we focus on microsatellite characterisation and multiplex optimisation using transcriptome sequences generated by Illumina® and Roche-454, as well as online databases of Expressed Sequence Tags (EST) for the study of whitefish evolution and demographic history. We identified and optimised 40 polymorphic loci in multiplex PCR reactions and validated the robustness of our analyses by testing several population genetics and phylogeographic predictions using 494 fish from five lakes and 2 distinct ecotypes. PMID:24296905

  13. Online Robot Dead Reckoning Localization Using Maximum Relative Entropy Optimization With Model Constraints

    SciTech Connect

    Urniezius, Renaldas

    2011-03-14

    The principle of Maximum relative Entropy optimization was analyzed for dead reckoning localization of a rigid body when observation data from two attached accelerometers was collected. Model constraints were derived from the relationships between the sensors. The experiment's results confirmed that the noise on each accelerometer axis can be successfully filtered by utilizing the dependency between channels and the dependency within the time series data. The dependency between channels was used for the a priori calculation, and the a posteriori distribution was derived utilizing the dependency within the time series data. Data from an autocalibration experiment were revisited by removing the initial assumption that the instantaneous rotation axis of the rigid body was known. Performance results confirmed that such an approach could be used for online dead reckoning localization.

  14. Combining next-generation sequencing and online databases for microsatellite development in non-model organisms

    PubMed Central

    Rico, Ciro; Normandeau, Eric; Dion-Côté, Anne-Marie; Rico, María Inés; Côté, Guillaume; Bernatchez, Louis

    2013-01-01

    Next-generation sequencing (NGS) is revolutionising marker development and the rapidly increasing amount of transcriptomes published across a wide variety of taxa is providing valuable sequence databases for the identification of genetic markers without the need to generate new sequences. Microsatellites are still the most important source of polymorphic markers in ecology and evolution. Motivated by our long-term interest in the adaptive radiation of a non-model species complex of whitefishes (Coregonus spp.), in this study, we focus on microsatellite characterisation and multiplex optimisation using transcriptome sequences generated by Illumina® and Roche-454, as well as online databases of Expressed Sequence Tags (EST) for the study of whitefish evolution and demographic history. We identified and optimised 40 polymorphic loci in multiplex PCR reactions and validated the robustness of our analyses by testing several population genetics and phylogeographic predictions using 494 fish from five lakes and 2 distinct ecotypes. PMID:24296905

  15. New Development of the Online Integrated Climate-Chemistry Model Framework (RegCM-CHEM4)

    NASA Astrophysics Data System (ADS)

    Zakey, A. S.; Shalaby, A. K.; Solmon, F.; Giorgi, F.; Tawfik, A. B.; Steiner, A. L.; Baklanov, A.

    2012-04-01

    RegCM-CHEM4 is a new online integrated climate-chemistry model based on the regional climate model RegCM4. RegCM4, developed at the Abdus Salam International Centre for Theoretical Physics (ICTP), is a hydrostatic, sigma-coordinate model. Tropospheric gas-phase chemistry is integrated into the climate model using the condensed version of the Carbon Bond Mechanism (CBM-Z) with lumped species that represent broad categories of organics based on carbon bond structure. The computationally rapid radical balance method (RBM) is coupled as a chemical solver to the gas-phase mechanism. Photolysis rates are determined as a function of meteorological and chemical inputs and interpolated from an array of pre-determined values based on the Tropospheric Ultraviolet-Visible Model (TUV) with cloud cover corrections. Cloud optical depths and cloud altitudes from RegCM-CHEM4 are used in the photolysis calculations, thereby directly coupling the photolysis rates and chemical reactions to meteorological conditions at each model time step. In this study, we evaluate the model over Europe on two different time scales: (1) an event-based analysis of the ozone episode associated with the heat wave of August 2003 and (2) a climatological analysis of a six-year simulation (2000-2005). For the episode analysis, model simulations show good agreement with the European Monitoring and Evaluation Program (EMEP) observations of hourly ozone over different regions in Europe and capture ozone concentrations during and after the summer 2003 heat wave event. Analysis of the full six years of simulation indicates that the coupled chemistry-climate model can reproduce the seasonal cycle of ozone, with an overestimation of ozone in the non-event years of 5-15 ppb depending on the geographic region. Overall, the ozone and ozone precursor evaluation shows the feasibility of using RegCM-CHEM4 for decadal-length simulations of chemistry-climate interactions.

  16. On-Line Model-Based System For Nuclear Plant Monitoring

    NASA Astrophysics Data System (ADS)

    Tsoukalas, Lefteri H.; Lee, G. W.; Ragheb, Magdi; McDonough, T.; Niziolek, F.; Parker, M.

    1989-03-01

    A prototypical on-line model-based system, LASALLE1, developed at the University of Illinois in collaboration with the Illinois Department of Nuclear Safety (IDNS) is described. Its main purpose is to interpret about 300 signals, updated every two minutes at IDNS from the LaSalle Nuclear Power Plant, and to diagnose possible abnormal conditions. It is written in VAX/VMS OPS5 and operates in both the on-line and testing modes. In its knowledge base, operator and plant actions pertaining to the Emergency Operating Procedure (EOP) A-01 are encoded. This is a procedure driven by a reactor's coolant level and pressure signals, with the purpose of shutting down the reactor, maintaining adequate core cooling and reducing the reactor pressure and temperature to cold shutdown conditions (about 90 to 200 °F). The monitoring of the procedure is performed from the perspective of Emergency Preparedness. Two major issues are addressed in this system. First, the management of the short-term or working memory of the system. LASALLE1 must reach its inferences, display its conclusion and update a message file every two minutes before a new set of data arrives from the plant. This was achieved by superimposing additional layers of control over the inferencing strategies inherent in OPS5, and developing special rules for the management of used or outdated information. The second issue is the representation of information and its uncertainty. The concepts of information granularity and performance-level, which are based on a coupling of Probability Theory and the theory of Fuzzy Sets, are used for this purpose. The estimation of the performance-level incorporates a mathematical methodology which accounts for two types of uncertainty encountered in monitoring physical systems: random uncertainty, in the form of probability density functions generated by observations, measurements and sensor data, and fuzzy uncertainty represented by membership functions based on symbolic

  17. Advanced Online Survival Analysis Tool for Predictive Modelling in Clinical Data Science.

    PubMed

    Montes-Torres, Julio; Subirats, José Luis; Ribelles, Nuria; Urda, Daniel; Franco, Leonardo; Alba, Emilio; Jerez, José Manuel

    2016-01-01

    One of the prevailing applications of machine learning is the use of predictive modelling in clinical survival analysis. In this work, we present our view of the current situation of computer tools for survival analysis, stressing the need of transferring the latest results in the field of machine learning to biomedical researchers. We propose a web based software for survival analysis called OSA (Online Survival Analysis), which has been developed as an open access and user friendly option to obtain discrete time, predictive survival models at individual level using machine learning techniques, and to perform standard survival analysis. OSA employs an Artificial Neural Network (ANN) based method to produce the predictive survival models. Additionally, the software can easily generate survival and hazard curves with multiple options to personalise the plots, obtain contingency tables from the uploaded data to perform different tests, and fit a Cox regression model from a number of predictor variables. In the Materials and Methods section, we depict the general architecture of the application and introduce the mathematical background of each of the implemented methods. The study concludes with examples of use showing the results obtained with public datasets. PMID:27532883
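
    OSA's ANN-based discrete-time survival models are not reproduced here, but the standard survival-analysis step it also offers (a Cox regression fit) can be sketched with the lifelines library and its bundled example dataset:

    ```python
    from lifelines import CoxPHFitter
    from lifelines.datasets import load_rossi

    df = load_rossi()                       # example survival dataset (weeks to re-arrest)
    cph = CoxPHFitter()
    cph.fit(df, duration_col="week", event_col="arrest")
    cph.print_summary()                     # hazard ratios for each predictor variable

    # Predicted survival curve for the first individual
    surv = cph.predict_survival_function(df.iloc[:1])
    print(surv.head())
    ```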

  18. Advanced Online Survival Analysis Tool for Predictive Modelling in Clinical Data Science

    PubMed Central

    Montes-Torres, Julio; Subirats, José Luis; Ribelles, Nuria; Urda, Daniel; Franco, Leonardo; Alba, Emilio; Jerez, José Manuel

    2016-01-01

    One of the prevailing applications of machine learning is the use of predictive modelling in clinical survival analysis. In this work, we present our view of the current situation of computer tools for survival analysis, stressing the need of transferring the latest results in the field of machine learning to biomedical researchers. We propose a web based software for survival analysis called OSA (Online Survival Analysis), which has been developed as an open access and user friendly option to obtain discrete time, predictive survival models at individual level using machine learning techniques, and to perform standard survival analysis. OSA employs an Artificial Neural Network (ANN) based method to produce the predictive survival models. Additionally, the software can easily generate survival and hazard curves with multiple options to personalise the plots, obtain contingency tables from the uploaded data to perform different tests, and fit a Cox regression model from a number of predictor variables. In the Materials and Methods section, we depict the general architecture of the application and introduce the mathematical background of each of the implemented methods. The study concludes with examples of use showing the results obtained with public datasets. PMID:27532883

  19. DIANA-microT web server: elucidating microRNA functions through target prediction.

    PubMed

    Maragkakis, M; Reczko, M; Simossis, V A; Alexiou, P; Papadopoulos, G L; Dalamagas, T; Giannopoulos, G; Goumas, G; Koukis, E; Kourtis, K; Vergoulis, T; Koziris, N; Sellis, T; Tsanakas, P; Hatzigeorgiou, A G

    2009-07-01

    Computational microRNA (miRNA) target prediction is one of the key means for deciphering the role of miRNAs in development and disease. Here, we present the DIANA-microT web server as the user interface to the DIANA-microT 3.0 miRNA target prediction algorithm. The web server provides extensive information for predicted miRNA:target gene interactions with a user-friendly interface, providing extensive connectivity to online biological resources. Target gene and miRNA functions may be elucidated through automated bibliographic searches and functional information is accessible through Kyoto Encyclopedia of Genes and Genomes (KEGG) pathways. The web server offers links to nomenclature, sequence and protein databases, and users are facilitated by being able to search for targeted genes using different nomenclatures or functional features, such as the genes possible involvement in biological pathways. The target prediction algorithm supports parameters calculated individually for each miRNA:target gene interaction and provides a signal-to-noise ratio and a precision score that helps in the evaluation of the significance of the predicted results. Using a set of miRNA targets recently identified through the pSILAC method, the performance of several computational target prediction programs was assessed. DIANA-microT 3.0 achieved the highest ratio of correctly predicted targets over all predicted targets (66%). The DIANA-microT web server is freely available at www.microrna.gr/microT. PMID:19406924

  20. Uncertainty preserving patch-based online modeling for 3D model acquisition and integration from passive motion imagery

    NASA Astrophysics Data System (ADS)

    Tang, Hao; Chang, Peng; Molina, Edgardo; Zhu, Zhigang

    2012-06-01

    In both military and civilian applications, abundant data from diverse sources captured on airborne platforms are often available for a region attracting interest. Since the data often includes motion imagery streams collected from multiple platforms flying at different altitudes, with sensors of different field of views (FOVs), resolutions, frame rates and spectral bands, it is imperative that a cohesive site model encompassing all the information can be quickly built and presented to the analysts. In this paper, we propose to develop an Uncertainty Preserving Patch-based Online Modeling System (UPPOMS) leading towards the automatic creation and updating of a cohesive, geo-registered, uncertainty-preserving, efficient 3D site terrain model from passive imagery with varying field-of-views and phenomenologies. The proposed UPPOMS has the following technical thrusts that differentiate our approach from others: (1) An uncertainty-preserving, patch-based 3D model is generated, which enables the integration of images captured with a mixture of NFOV and WFOV and/or visible and infrared motion imagery sensors. (2) Patch-based stereo matching and multi-view 3D integration are utilized, which are suitable for scenes with many low texture regions, particularly in mid-wave infrared images. (3) In contrast to the conventional volumetric algorithms, whose computational and storage costs grow exponentially with the amount of input data and the scale of the scene, the proposed UPPOMS system employs an online algorithmic pipeline, and scales well to large amount of input data. Experimental results and discussions of future work will be provided.

  1. Process modeling and bottleneck mining in online peer-review systems.

    PubMed

    Premchaiswadi, Wichian; Porouhan, Parham

    2015-01-01

    This paper is divided into three main parts. In the first part of the study, we captured, collected and formatted an event log describing the handling of reviews for proceedings of an international conference in Thailand. In the second part, we used several process mining techniques in order to discover process models, social, organizational, and hierarchical structures from the proceeding's event log. In the third part, we detected the deviations and bottlenecks of the peer review process by comparing the observed events (i.e., authentic dataset) with a pre-defined model (i.e., master map). Finally, we investigated the performance information as well as the total waiting time in order to improve the effectiveness and efficiency of the online submission and peer review system for prospective conferences and seminars. Consequently, the main goals of the study were as follows: (1) to convert the collected event log into the appropriate format supported by process mining analysis tools, (2) to discover process models and to construct social networks based on the collected event log, and (3) to find deviations, discrepancies and bottlenecks between the collected event log and the master pre-defined model. The results showed that although each paper was initially sent to three different reviewers, it was not always possible to make a decision after the first round of reviewing; therefore, additional reviewers were invited. In total, all the accepted and rejected manuscripts were reviewed by an average of 3.9 and 3.2 expert reviewers, respectively. Moreover, obvious violations of the rules and regulations relating to careless or inappropriate peer review of a manuscript, committed by the editorial board and other staff, were identified. Nine blocks of activity in the authentic dataset were not completely compatible with the activities defined in the master model. Also, five of the activity traces were not correctly enabled, and seven activities were missed within the
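
    The discovery and conformance steps described above can be sketched with the open-source pm4py library; the library choice and the file name are assumptions for illustration, not the tooling reported by the authors.

    ```python
    import pm4py

    # Event log of the review handling, exported in XES format (hypothetical file name)
    log = pm4py.read_xes("peer_review_log.xes")

    # Discover a process model from the observed behaviour
    net, initial_marking, final_marking = pm4py.discover_petri_net_inductive(log)

    # Replay the log on a reference model to quantify deviations
    # (here the discovered net; in the study a pre-defined master map was used)
    fitness = pm4py.fitness_token_based_replay(log, net, initial_marking, final_marking)
    print(fitness)
    ```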

  2. Implementing bioinformatic workflows within the bioextract server

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Computational workflows in bioinformatics are becoming increasingly important in the achievement of scientific advances. These workflows typically require the integrated use of multiple, distributed data sources and analytic tools. The BioExtract Server (http://bioextract.org) is a distributed servi...

  3. Interfaces for Distributed Systems of Information Servers.

    ERIC Educational Resources Information Center

    Kahle, Brewster M.; And Others

    1993-01-01

    Describes five interfaces to remote, full-text databases accessed through distributed systems of servers. These are WAIStation for the Macintosh, XWAIS for X-Windows, GWAIS for Gnu-Emacs; SWAIS for dumb terminals, and Rosebud for the Macintosh. Sixteen illustrations provide examples of display screens. Problems and needed improvements are…

  4. World Wide Web Server Standards and Guidelines.

    ERIC Educational Resources Information Center

    Stubbs, Keith M.

    This document defines the specific standards and general guidelines which the U.S. Department of Education (ED) will use to make information available on the World Wide Web (WWW). The purpose of providing such guidance is to ensure high quality and consistent content, organization, and presentation of information on ED WWW servers, in order to…

  5. Development of intelligent instruments with embedded HTTP servers for control and data acquisition in a cryogenic setup--The hardware, firmware, and software implementation.

    PubMed

    Antony, Joby; Mathuria, D S; Datta, T S; Maity, Tanmoy

    2015-12-01

    The power of Ethernet for control and automation technology is being largely understood by the automation industry in recent times. Ethernet with HTTP (Hypertext Transfer Protocol) is one of the most widely accepted communication standards today. Ethernet is best known for being able to control through internet from anywhere in the globe. The Ethernet interface with built-in on-chip embedded servers ensures global connections for crate-less model of control and data acquisition systems which have several advantages over traditional crate-based control architectures for slow applications. This architecture will completely eliminate the use of any extra PLC (Programmable Logic Controller) or similar control hardware in any automation network as the control functions are firmware coded inside intelligent meters itself. Here, we describe the indigenously built project of a cryogenic control system built for linear accelerator at Inter University Accelerator Centre, known as "CADS," which stands for "Complete Automation of Distribution System." CADS deals with complete hardware, firmware, and software implementation of the automated linac cryogenic distribution system using many Ethernet based embedded cryogenic instruments developed in-house. Each instrument works as an intelligent meter called device-server which has the control functions and control loops built inside the firmware itself. Dedicated meters with built-in servers were designed out of ARM (Acorn RISC (Reduced Instruction Set Computer) Machine) and ATMEL processors and COTS (Commercially Off-the-Shelf) SMD (Surface Mount Devices) components, with analog sensor front-end and a digital back-end web server implementing remote procedure call over HTTP for digital control and readout functions. At present, 24 instruments which run 58 embedded servers inside, each specific to a particular type of sensor-actuator combination for closed loop operations, are now deployed and distributed across control LAN (Local
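
    In this crate-less model each meter exposes readings and setpoints over its embedded HTTP server. A minimal client-side sketch is shown below; the device address and endpoint paths are entirely hypothetical, as the actual CADS remote procedure call interface is not documented in the abstract.

    ```python
    import requests

    DEVICE = "http://192.168.1.50"          # hypothetical address of one embedded device-server

    # Read a sensor value exposed by the meter's built-in HTTP server
    reading = requests.get(f"{DEVICE}/sensor/temperature", timeout=2.0)
    print("temperature:", reading.text)

    # Adjust a setpoint used by the meter's internal control loop
    response = requests.post(f"{DEVICE}/setpoint", data={"valve_opening": 35}, timeout=2.0)
    print("setpoint accepted:", response.ok)
    ```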

  6. Development of intelligent instruments with embedded HTTP servers for control and data acquisition in a cryogenic setup—The hardware, firmware, and software implementation

    NASA Astrophysics Data System (ADS)

    Antony, Joby; Mathuria, D. S.; Datta, T. S.; Maity, Tanmoy

    2015-12-01

    The power of Ethernet for control and automation technology is being largely understood by the automation industry in recent times. Ethernet with HTTP (Hypertext Transfer Protocol) is one of the most widely accepted communication standards today. Ethernet is best known for being able to control through internet from anywhere in the globe. The Ethernet interface with built-in on-chip embedded servers ensures global connections for crate-less model of control and data acquisition systems which have several advantages over traditional crate-based control architectures for slow applications. This architecture will completely eliminate the use of any extra PLC (Programmable Logic Controller) or similar control hardware in any automation network as the control functions are firmware coded inside intelligent meters itself. Here, we describe the indigenously built project of a cryogenic control system built for linear accelerator at Inter University Accelerator Centre, known as "CADS," which stands for "Complete Automation of Distribution System." CADS deals with complete hardware, firmware, and software implementation of the automated linac cryogenic distribution system using many Ethernet based embedded cryogenic instruments developed in-house. Each instrument works as an intelligent meter called device-server which has the control functions and control loops built inside the firmware itself. Dedicated meters with built-in servers were designed out of ARM (Acorn RISC (Reduced Instruction Set Computer) Machine) and ATMEL processors and COTS (Commercially Off-the-Shelf) SMD (Surface Mount Devices) components, with analog sensor front-end and a digital back-end web server implementing remote procedure call over HTTP for digital control and readout functions. At present, 24 instruments which run 58 embedded servers inside, each specific to a particular type of sensor-actuator combination for closed loop operations, are now deployed and distributed across control LAN (Local

  7. Development of intelligent instruments with embedded HTTP servers for control and data acquisition in a cryogenic setup—The hardware, firmware, and software implementation

    SciTech Connect

    Antony, Joby; Mathuria, D. S.; Datta, T. S.; Maity, Tanmoy

    2015-12-15

    The power of Ethernet for control and automation technology is being largely understood by the automation industry in recent times. Ethernet with HTTP (Hypertext Transfer Protocol) is one of the most widely accepted communication standards today. Ethernet is best known for being able to control through internet from anywhere in the globe. The Ethernet interface with built-in on-chip embedded servers ensures global connections for crate-less model of control and data acquisition systems which have several advantages over traditional crate-based control architectures for slow applications. This architecture will completely eliminate the use of any extra PLC (Programmable Logic Controller) or similar control hardware in any automation network as the control functions are firmware coded inside intelligent meters itself. Here, we describe the indigenously built project of a cryogenic control system built for linear accelerator at Inter University Accelerator Centre, known as “CADS,” which stands for “Complete Automation of Distribution System.” CADS deals with complete hardware, firmware, and software implementation of the automated linac cryogenic distribution system using many Ethernet based embedded cryogenic instruments developed in-house. Each instrument works as an intelligent meter called device-server which has the control functions and control loops built inside the firmware itself. Dedicated meters with built-in servers were designed out of ARM (Acorn RISC (Reduced Instruction Set Computer) Machine) and ATMEL processors and COTS (Commercially Off-the-Shelf) SMD (Surface Mount Devices) components, with analog sensor front-end and a digital back-end web server implementing remote procedure call over HTTP for digital control and readout functions. At present, 24 instruments which run 58 embedded servers inside, each specific to a particular type of sensor-actuator combination for closed loop operations, are now deployed and distributed across control LAN

  8. Hydrological modeling using a dynamic neuro-fuzzy system with on-line and local learning algorithm

    NASA Astrophysics Data System (ADS)

    Hong, Yoon-Seok Timothy; White, Paul A.

    2009-01-01

    This paper introduces the dynamic neuro-fuzzy local modeling system (DNFLMS) that is based on a dynamic Takagi-Sugeno (TS) type fuzzy inference system with an on-line and local learning algorithm for complex dynamic hydrological modeling tasks. The DNFLMS is designed to achieve fast training together with on-line simulation capability, where model adaptation occurs on the arrival of each new item of hydrological data. The DNFLMS applies an on-line, one-pass, training procedure to create and update fuzzy local models dynamically. The extended Kalman filtering algorithm is then implemented to optimize the parameters of the consequence part of each fuzzy model during the training phase. Local generalization in the DNFLMS is employed to optimize the parameters of each fuzzy model separately, region-by-region, using subsets of training data rather than all training data. The proposed DNFLMS is applied to develop a model to forecast the flow of Waikoropupu Springs, located in the Takaka Valley, South Island, New Zealand, and the influence of the operation of the 32 Megawatt Cobb hydropower station on spring flow. It is demonstrated that the proposed DNFLMS is superior in terms of model complexity and computational efficiency when compared with models that adopt global generalization, such as a multi-layer perceptron (MLP) trained with the back propagation learning algorithm and the well-known adaptive neuro-fuzzy inference system (ANFIS).

  9. Modeling the Relationship between Transportation-Related Carbon Dioxide Emissions and Hybrid-Online Courses at a Large Urban University

    ERIC Educational Resources Information Center

    Little, Matthew; Cordero, Eugene

    2014-01-01

    Purpose: This paper aims to investigate the relationship between hybrid classes (where a per cent of the class meetings are online) and transportation-related CO2 emissions at a commuter campus similar to San José State University (SJSU). Design/methodology/approach: A computer model was developed to calculate the number of trips to…
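
    The kind of estimate such a model produces can be illustrated with a few lines of arithmetic; all figures below are placeholders, not SJSU data.

    ```python
    # Illustrative placeholder values, not data from the study
    students            = 200      # enrolled in the hybrid course
    meetings_replaced   = 10       # class meetings moved online per term
    share_who_drive     = 0.6
    round_trip_km       = 30.0
    kg_co2_per_km       = 0.19     # typical passenger-car emission factor

    trips_avoided = students * meetings_replaced * share_who_drive
    kg_co2_saved  = trips_avoided * round_trip_km * kg_co2_per_km
    print(f"avoided trips: {trips_avoided:.0f}, CO2 saved: {kg_co2_saved / 1000:.1f} t per term")
    ```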

  10. The Interdependence of the Factors Influencing the Perceived Quality of the Online Learning Experience: A Causal Model

    ERIC Educational Resources Information Center

    Peltier, James W.; Schibrowsky, John A.; Drago, William

    2007-01-01

    A structural model of the drivers of online education is proposed and tested. The findings help to identify the interrelated nature of the lectures delivered via technology outside of the traditional classroom, the importance of mentoring, the need to develop course structure, the changing roles for instructors and students, and the importance of…

  11. A Theoretical Model and Analysis of the Effect of Self-Regulation on Attrition from Voluntary Online Training

    ERIC Educational Resources Information Center

    Sitzmann, Traci

    2012-01-01

    A theoretical model is presented that examines self-regulatory processes and trainee characteristics as predictors of attrition from voluntary online training in order to determine who is at risk of dropping out and the processes that occur during training that determine when they are at risk of dropping out. Attrition increased following declines…

  12. An Online Process Model of Second-Order Cultivation Effects: How Television Cultivates Materialism and Its Consequences for Life Satisfaction

    ERIC Educational Resources Information Center

    Shrum, L. J.; Lee, Jaehoon; Burroughs, James E.; Rindfleisch, Aric

    2011-01-01

    Two studies investigated the interrelations among television viewing, materialism, and life satisfaction, and their underlying processes. Study 1 tested an online process model for television's cultivation of materialism by manipulating level of materialistic content. Viewing level influenced materialism, but only among participants who reported…

  13. Toward a Theoretical Model of Decision-Making and Resistance to Change among Higher Education Online Course Designers

    ERIC Educational Resources Information Center

    Dodd, Bucky J.

    2013-01-01

    Online course design is an emerging practice in higher education, yet few theoretical models currently exist to explain or predict how the diffusion of innovations occurs in this space. This study used a descriptive, quantitative survey research design to examine theoretical relationships between decision-making style and resistance to change…

  14. Community of Inquiry: A Useful Model for Examining Educational Interactions in Online Graduate Education Courses at Christian Colleges

    ERIC Educational Resources Information Center

    Bartruff, Elizabeth Ann

    2009-01-01

    Using the Community of Inquiry (COI) model as a framework, this case study analyzed the interactions of teacher and students in an online graduate level education course at a small Christian college in the Pacific Northwest. Using transcript content analysis, communication between participants was coded as either contributing to the social,…

  15. A Model for Semi-Informal Online Learning Communities: A Case Study of the NASA INSPIRE Project

    ERIC Educational Resources Information Center

    Keesee, Amanda Glasgow

    2011-01-01

    Scope and Method of Study: The purpose of this study was to develop a model of informal online learning communities based on theory, research and practice. Case study methodology was used to examine the NASA Interdisciplinary National Science Project Incorporating Research and Education Experience (INSPIRE) Project as an example of a successful…

  16. Toward a Model of Sources of Influence in Online Education: Cognitive Learning and the Effects of Web 2.0

    ERIC Educational Resources Information Center

    Carr, Caleb T.; Zube, Paul; Dickens, Eric; Hayter, Carolyn A.; Barterian, Justin A.

    2013-01-01

    To explore the integration of education processes into social media, we tested an initial model of student learning via interactive web tools and theorized three sources of influence: interpersonal, intrapersonal, and masspersonal. Three-hundred thirty-seven students observed an online lecture and then completed a series of scales. Structural…

  17. CTserver: A Computational Thermodynamics Server for the Geoscience Community

    NASA Astrophysics Data System (ADS)

    Kress, V. C.; Ghiorso, M. S.

    2006-12-01

    The CTserver platform is an Internet-based computational resource that provides on-demand services in Computational Thermodynamics (CT) to a diverse geoscience user base. This NSF-supported resource can be accessed at ctserver.ofm-research.org. The CTserver infrastructure leverages a high-quality and rigorously tested software library of routines for computing equilibrium phase assemblages and for evaluating internally consistent thermodynamic properties of materials, e.g. mineral solid solutions and a variety of geological fluids, including magmas. Thermodynamic models are currently available for 167 phases. Recent additions include Duan, Møller and Weare's model for supercritical C-O-H-S, extended to include SO2 and S2 species, and an entirely new associated solution model for O-S-Fe-Ni sulfide liquids. This software library is accessed via the CORBA Internet protocol for client-server communication. CORBA provides a standardized, object-oriented, language and platform independent, fast, low-bandwidth interface to phase property modules running on the server cluster. Network transport, language translation and resource allocation are handled by the CORBA interface. Users access server functionality in two principal ways. Clients written as browser- based Java applets may be downloaded which provide specific functionality such as retrieval of thermodynamic properties of phases, computation of phase equilibria for systems of specified composition, or modeling the evolution of these systems along some particular reaction path. This level of user interaction requires minimal programming effort and is ideal for classroom use. A more universal and flexible mode of CTserver access involves making remote procedure calls from user programs directly to the server public interface. The CTserver infrastructure relieves the user of the burden of implementing and testing the often complex thermodynamic models of real liquids and solids. A pilot application of this distributed

  18. Case analysis online: a strategic management case model for the health industry.

    PubMed

    Walsh, Anne; Bearden, Eithne

    2004-01-01

    Despite the plethora of methods and tools available to support strategic management, the challenge for health executives in the next century will relate to their ability to access and interpret data from multiple and intricate communication networks. Integrated digital networks and satellite systems will expand the scope and ease of sharing information between business divisions, and networked systems will facilitate the use of virtual case discussions across universities. While the internet is frequently used to support clinical decisions in the healthcare industry, few executives rely upon the internet for strategic analysis. Although electronic technologies can easily synthesize data from multiple information channels, research as well as technical issues may deter their application in strategic analysis. As digital models transform access to information, online models may become increasingly relevant in designing strategic solutions. While there are various pedagogical models available to support the strategic management process, this framework was designed to enhance strategic analysis through the application of technology and electronic research. A strategic analysis framework, which incorporated internet research and case analysis in a strategic management course, is described along with design and application issues that emerged during the case analysis process. PMID:15129900

  19. An improved PSO-SVM model for online recognition defects in eddy current testing

    NASA Astrophysics Data System (ADS)

    Liu, Baoling; Hou, Dibo; Huang, Pingjie; Liu, Banteng; Tang, Huayi; Zhang, Wubo; Chen, Peihua; Zhang, Guangxin

    2013-12-01

    Accurate and rapid recognition of defects is essential for structural integrity and health monitoring of in-service devices using eddy current (EC) non-destructive testing. This paper introduces a novel model-free method that includes three main modules: a signal pre-processing module, a classifier module and an optimisation module. In the signal pre-processing module, a two-stage differential structure is proposed to suppress the lift-off fluctuation that could contaminate the EC signal. In the classifier module, a multi-class support vector machine (SVM) based on the one-against-one strategy is utilised for its good accuracy. In the optimisation module, the optimal parameters of the classifier are obtained by an improved particle swarm optimisation (IPSO) algorithm. The proposed IPSO technique improves the convergence performance of the primary PSO through the following strategies: nonlinear processing of the inertia weight, and introduction of a black hole model and a simulated annealing model with extremum disturbance. The good generalisation ability of the IPSO-SVM model has been validated by adding additional specimens to the testing set. Experiments show that the proposed algorithm achieves higher recognition accuracy and efficiency than other well-known classifiers, and its advantages are more pronounced with smaller training sets, which contributes to online application.
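
    As a rough illustration of the PSO-tuned SVM idea described in this record, the sketch below couples a plain particle swarm optimiser to scikit-learn's SVC on synthetic data; it omits the paper's IPSO refinements (nonlinear inertia weight, black hole and simulated annealing steps), and the constants and data are illustrative assumptions, not the paper's setup.

      # Minimal sketch: plain PSO tuning SVM hyperparameters (C, gamma).
      # NOT the paper's IPSO; synthetic data replaces eddy-current defect signals.
      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.model_selection import cross_val_score
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)
      X, y = make_classification(n_samples=300, n_features=10, n_classes=3,
                                 n_informative=6, random_state=0)

      def fitness(pos):
          C, gamma = 10.0 ** pos                 # search in log10 space
          clf = SVC(C=C, gamma=gamma)            # one-vs-one multi-class by default
          return cross_val_score(clf, X, y, cv=3).mean()

      n_particles, n_iter = 12, 20
      pos = rng.uniform(-2, 2, size=(n_particles, 2))    # [log10(C), log10(gamma)]
      vel = np.zeros_like(pos)
      pbest, pbest_val = pos.copy(), np.array([fitness(p) for p in pos])
      gbest = pbest[pbest_val.argmax()].copy()

      for _ in range(n_iter):
          r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
          vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
          pos = np.clip(pos + vel, -3, 3)
          vals = np.array([fitness(p) for p in pos])
          improved = vals > pbest_val
          pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
          gbest = pbest[pbest_val.argmax()].copy()

      print("best log10(C), log10(gamma):", gbest, "CV accuracy:", pbest_val.max())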

  20. Integration of an Online Simulated Prescription Analysis into Undergraduate Pharmacy Teaching Using Supplemental and Replacement Models

    PubMed Central

    Zlotos, Leon; Thompson, Ian D.

    2015-01-01

    Objective. To describe student use and perceptions of online simulated prescription analysis following integration of supplemental and replacement models into pharmacy practice teaching. Methods. Strathclyde Computerized Randomized Interactive Prescription Tutor (SCRIPT) is a simulated prescription analysis tool designed to support a pharmacy practice competency class. In 2008-2009, SCRIPT scenarios were released to coincide with timetabled teaching as the supplemental model. In 2009-2010, SCRIPT also replaced one-sixth of the taught component of the class as the replacement model. Student use and performance were compared, and their perceptions were documented. Results. In both cohorts, the majority of use (over 70%) occurred immediately before assessments. Remote access decreased from 6409 (supplemental) to 3782 (replacement) attempts per 100 students. There was no difference in student performance between the cohorts. Students reported group and individual use and 4 targeted approaches to using SCRIPT. Conclusion. E-learning can reduce staff time in pharmacy practice teaching without affecting student performance. SCRIPT permits flexible learning that suits student preferences. PMID:25995512

  1. Using GOMS models and hypertext to create representations of medical procedures for online display

    NASA Technical Reports Server (NTRS)

    Gugerty, Leo; Halgren, Shannon; Gosbee, John; Rudisill, Marianne

    1991-01-01

    This study investigated two methods to improve the organization and presentation of computer-based medical procedures. A literature review suggested that the GOMS (goals, operators, methods, and selection rules) model can assist in rigorous task analysis, which can then help generate initial design ideas for the human-computer interface. GOMS models are hierarchical in nature, so this study also investigated the effect of hierarchical, hypertext interfaces. We used a 2 x 2 between-subjects design, including the following independent variables: procedure organization - GOMS-model based vs. medical-textbook based; navigation type - hierarchical vs. linear (booklike). After naive subjects studied the online procedures, measures were taken of their memory for the content and the organization of the procedures. This design was repeated for two medical procedures. For one procedure, subjects who studied GOMS-based and hierarchical procedures remembered more about the procedures than other subjects. The results for the other procedure were less clear. However, data for both procedures showed a 'GOMSification effect'. That is, when asked to do a free recall of a procedure, subjects who had studied a textbook procedure often recalled key information in a location inconsistent with the procedure they actually studied, but consistent with the GOMS-based procedure.

  2. On-line Meteorology-Chemistry/Aerosols Modelling and Integration for Risk Assessment: Case Studies

    NASA Astrophysics Data System (ADS)

    Bostanbekov, Kairat; Mahura, Alexander; Nuterman, Roman; Nurseitov, Daniyar; Zakarin, Edige; Baklanov, Alexander

    2016-04-01

    On the regional level, and especially in areas with potentially diverse sources of industrial pollutants, the risk assessment of impacts on environment and population is critically important. During normal operations, the risk is minimal. However, during accidental situations, the risk increases due to releases of harmful pollutants into different environments such as water, soil, and atmosphere, where they undergo continuous transformation and transport. In this study, the Enviro-HIRLAM (Environment High Resolution Limited Area Model) was adapted and employed for assessment of scenarios with accidental and continuous emissions of sulphur dioxide (SO2) for selected case studies during January 2010. The following scenarios were considered: (i) a control reference run; (ii) an accidental release (due to a short-term, one-day fire at an oil storage facility) at the city of Atyrau (Kazakhstan) near the northern part of the Caspian Sea; and (iii) a doubling of the original continuous emissions from three locations of metallurgical enterprises on the Kola Peninsula (Russia). The implemented aerosol microphysics module M7 uses 5 aerosol types - sulphates, sea salt, dust, black carbon and organic carbon - distributed over 7 size modes. Removal processes of aerosols include gravitational settling and wet deposition. As the Enviro-HIRLAM model is an on-line integrated model, both meteorological and chemical processes are modelled simultaneously at each time step. The modelled spatio-temporal variations of meteorological and chemical patterns are analyzed for both the European and Kazakhstan domains. The results of the evaluation of sulphur dioxide concentration and deposition over major populated cities, selected regions and countries are presented employing GIS tools. As an outcome, the results of Enviro-HIRLAM modelling for the accidental release near the Caspian Sea are integrated into the RANDOM (Risk Assessment of Nature Detriment due to Oil spill Migration) system.

  3. Deploying Server-side File System Monitoring at NERSC

    SciTech Connect

    Uselton, Andrew

    2009-05-01

    The Franklin Cray XT4 at the NERSC center was equipped with the server-side I/O monitoring infrastructure Cerebro/LMT, which is described here in detail. Insights gained from the data produced include a better understanding of instantaneous data rates during file system testing, file system behavior during regular production time, and long-term average behaviors. Information and insights gleaned from this monitoring support efforts to proactively manage the I/O infrastructure on Franklin. A simple model for I/O transactions is introduced and compared with the 250 million observations sent to the LMT database from August 2008 to February 2009.

  4. Applications of Server Clustering Technology in Sensor Networks

    NASA Astrophysics Data System (ADS)

    Davis, G.; Foley, S.; Battistuz, B.; Eakins, J.; Vernon, F. L.; Astiz, L.

    2007-12-01

    The Array Network Facility is charged with the acquisition and processing of seismic data from the Earthscope USArray experiment. High resolution data from 400 seismic sensors is streamed in near real-time to the ANF at UCSD in La Jolla, CA where it is automatically processed by machine and reviewed by analysts before being externally distributed to other data centers, including the IRIS Data Management Center. Data streams include six channels of 24-bit seismic data at 40 samples per second and over twenty channels of state-of-health data at 1 sample per second per station. The sheer volume of data acquired and processed overwhelms the capabilities of any one affordable server system. Due to the relatively small buffers on-site (typically four hours) at the seismic stations, it is vital that the real-time systems remain online and acquiring data around the clock in order to meet data distribution requirements in a timely manner. Although the ANF does not have a 24x7x365 operations staff, the logistical difficulty in retrieving data from often remote locations after it expires from the on-site buffers requires the real-time systems to automatically recover from server failures without immediate operator intervention. To accomplish these goals, the ANF has implemented a five node Sun Solaris Cluster with acquisition and processing tasks shared by a mixture of integer and floating point processing units (Sun T2000 and V240/V245 systems). This configuration is an improvement over the typical regional network data center for a number of reasons: - By implementing a shared storage architecture, acquisition, processing, and distribution can be split between multiple systems working on the same data set, thus limiting the impact of a particularly resource-intensive task on the acquisition system. - The Solaris Cluster software monitors the health of the cluster nodes and provides the ability to automatically fail over processes from a failed node to a healthy node

  5. The Live Access Server Scientific Product Generation Through Workflow Orchestration

    NASA Astrophysics Data System (ADS)

    Hankin, S.; Calahan, J.; Li, J.; Manke, A.; O'Brien, K.; Schweitzer, R.

    2006-12-01

    The Live Access Server (LAS) is a well-established Web-application for display and analysis of geo-science data sets. The software, which can be downloaded and installed by anyone, gives data providers an easy way to establish services for their on-line data holdings, so their users can make plots; create and download data sub-sets; compare (difference) fields; and perform simple analyses. Now at version 7.0, LAS has been in operation since 1994. The current "Armstrong" release of LAS V7 consists of three components in a tiered architecture: user interface, workflow orchestration and Web Services. The LAS user interface (UI) communicates with the LAS Product Server via an XML protocol embedded in an HTTP "get" URL. Libraries (APIs) have been developed in Java, JavaScript and perl that can readily generate this URL. As a result of this flexibility it is common to find LAS user interfaces of radically different character, tailored to the nature of specific datasets or the mindset of specific users. When a request is received by the LAS Product Server (LPS -- the workflow orchestration component), business logic converts this request into a series of Web Service requests invoked via SOAP. These "back-end" Web services perform data access and generate products (visualizations, data subsets, analyses, etc.). LPS then packages these outputs into final products (typically HTML pages) via Jakarta Velocity templates for delivery to the end user. "Fine grained" data access is performed by back-end services that may utilize JDBC for data base access; the OPeNDAP "DAPPER" protocol; or (in principle) the OGC WFS protocol. Back-end visualization services are commonly legacy science applications wrapped in Java or Python (or perl) classes and deployed as Web Services accessible via SOAP. Ferret is the default visualization application used by LAS, though other applications such as Matlab, CDAT, and GrADS can also be used. Other back-end services may include generation of Google
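
    The request pattern described above - an XML operation description URL-encoded into an HTTP GET - can be imitated as in the following sketch; the host name, dataset identifier and XML element names are hypothetical placeholders, not the actual LAS V7 request schema.

      # Sketch of the "XML embedded in an HTTP GET" request pattern described above.
      # Host name, dataset id and XML vocabulary are hypothetical, not real LAS syntax.
      from urllib.parse import urlencode

      xml_request = (
          '<lasRequest>'
          '<operation id="plot"/>'
          '<dataset id="sst_monthly"/>'
          '<region lon="120:290" lat="-30:30" time="2006-01"/>'
          '</lasRequest>'
      )
      url = "http://las.example.org/ProductServer.do?" + urlencode({"xml": xml_request})
      print(url)   # a client (browser or script) would then issue an HTTP GET on this URL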

  6. Predictive Modeling to Forecast Student Outcomes and Drive Effective Interventions in Online Community College Courses

    ERIC Educational Resources Information Center

    Smith, Vernon C.; Lange, Adam; Huston, Daniel R.

    2012-01-01

    Community colleges continue to experience growth in online courses. This growth reflects the need to increase the numbers of students who complete certificates or degrees. Retaining online students, not to mention assuring their success, is a challenge that must be addressed through practical institutional responses. By leveraging existing student…

  7. Evaluating Two Models of Collaborative Tests in an Online Introductory Statistics Course

    ERIC Educational Resources Information Center

    Björnsdóttir, Auðbjörg; Garfield, Joan; Everson, Michelle

    2015-01-01

    This study explored the use of two different types of collaborative tests in an online introductory statistics course. A study was designed and carried out to investigate three research questions: (1) What is the difference in students' learning between using consensus and non-consensus collaborative tests in the online environment?, (2) What is…

  8. A Blended Model: Simultaneously Teaching a Quantitative Course Traditionally, Online, and Remotely

    ERIC Educational Resources Information Center

    Lightner, Constance A.; Lightner-Laws, Carin A.

    2016-01-01

    As universities seek to bolster enrollment through distance education, faculty are tasked with maintaining comparable teaching/learning standards in traditional, blended, and online courses. Research has shown that there is an achievement gap between students taking courses exclusively offered online versus those enrolled in face-to-face classes.…

  9. Learning from e-Family History: A Model of Online Family Historian Research Behaviour

    ERIC Educational Resources Information Center

    Friday, Kate

    2014-01-01

    Introduction: This paper reports on doctoral research which investigated the online research behaviour of family historians, from the overall perspective of local studies collections and developing online services for family historians. Method: A hybrid (primarily ethnographic) study was employed using qualitative diaries and shadowing, to examine…

  10. Testing a Model to Predict Online Cheating--Much Ado about Nothing

    ERIC Educational Resources Information Center

    Beck, Victoria

    2014-01-01

    Much has been written about student and faculty opinions on academic integrity in testing. Currently, concerns appear to focus more narrowly on online testing, generally based on anecdotal assumptions that online students are more likely to engage in academic dishonesty in testing than students in traditional on-campus courses. To address such…

  11. Online Calibration Methods for the DINA Model with Independent Attributes in CD-CAT

    ERIC Educational Resources Information Center

    Chen, Ping; Xin, Tao; Wang, Chun; Chang, Hua-Hua

    2012-01-01

    Item replenishing is essential for item bank maintenance in cognitive diagnostic computerized adaptive testing (CD-CAT). In regular CAT, online calibration is commonly used to calibrate the new items continuously. However, until now no reference has publicly become available about online calibration for CD-CAT. Thus, this study investigates the…

  12. A Proposed Model for Authenticating Knowledge Transfer in Online Discussion Forums

    ERIC Educational Resources Information Center

    Tucker, Jan P.; YoungGonzaga, Stephanie; Krause, Jaclyn

    2014-01-01

    Discussion forums are often utilized in the online classroom to build a sense of community, encourage collaboration and exchange, and measure time on task. A review of the literature revealed that there is little research that examines the role of the online discussion forum as a mechanism for knowledge transfer. Researchers reviewed 21 course…

  13. Talk in Virtual Contexts: Reflecting on Participation and Online Learning Models

    ERIC Educational Resources Information Center

    Thorpe, Mary; McCormick, Robert; Kubiak, Chris; Carmichael, Patrick

    2007-01-01

    Computer-mediated conferencing has been adopted, particularly for purposes of online course provision, as a method that can deliver community. Widespread interest in a communities-of-practice approach within both informal and formal learning has strengthened perceptions of the value of creating a community online. A case study of asynchronous…

  14. Machine Beats Experts: Automatic Discovery of Skill Models for Data-Driven Online Course Refinement

    ERIC Educational Resources Information Center

    Matsuda, Noboru; Furukawa, Tadanobu; Bier, Norman; Faloutsos, Christos

    2015-01-01

    How can we automatically determine which skills must be mastered for the successful completion of an online course? Large-scale online courses (e.g., MOOCs) often contain a broad range of contents frequently intended to be a semester's worth of materials; this breadth often makes it difficult to articulate an accurate set of skills and knowledge…

  15. Inter-University Collaboration for Online Teaching Innovation: An Emerging Model

    ERIC Educational Resources Information Center

    Nerlich, Andrea Perkins; Soldner, James L.; Millington, Michael J.

    2012-01-01

    Distance education is constantly evolving and improving. To stay current, effective online instructors must utilize the most innovative, evidence-based teaching methods available to promote student learning and satisfaction in their courses. One emerging teaching method, referred to as blended online learning (BOL), involves collaborative…

  16. Can the Current Model of Higher Education Survive MOOCs and Online Learning?

    ERIC Educational Resources Information Center

    Lucas, Henry C., Jr.

    2013-01-01

    The debate about online education--and Massive Open Online Courses (MOOCs) in particular--generates much confusion because there are so many options for how these technologies can be applied. Institutes of higher education and colleges have to examine these changes or face the risk of no longer being in control of their own fate. To survive, they…

  17. A Bayesian computational model for online character recognition and disability assessment during cursive eye writing

    PubMed Central

    Diard, Julien; Rynik, Vincent; Lorenceau, Jean

    2013-01-01

    This research involves a novel apparatus, in which the user is presented with an illusion inducing visual stimulus. The user perceives illusory movement that can be followed by the eye, so that smooth pursuit eye movements can be sustained in arbitrary directions. Thus, free-flow trajectories of any shape can be traced. In other words, coupled with an eye-tracking device, this apparatus enables “eye writing,” which appears to be an original object of study. We adapt a previous model of reading and writing to this context. We describe a probabilistic model called the Bayesian Action-Perception for Eye On-Line model (BAP-EOL). It encodes probabilistic knowledge about isolated letter trajectories, their size, high-frequency components of the produced trajectory, and pupil diameter. We show how Bayesian inference, in this single model, can be used to solve several tasks, like letter recognition and novelty detection (i.e., recognizing when a presented character is not part of the learned database). We are interested in the potential use of the eye writing apparatus by motor impaired patients: the final task we solve by Bayesian inference is disability assessment (i.e., measuring and tracking the evolution of motor characteristics of produced trajectories). Preliminary experimental results are presented, which illustrate the method, showing the feasibility of character recognition in the context of eye writing. We then show experimentally how a model of the unknown character can be used to detect trajectories that are likely to be new symbols, and how disability assessment can be performed by opportunistically observing characteristics of fine motor control, as letters are being traced. Experimental analyses also help identify specificities of eye writing, as compared to handwriting, and the resulting technical challenges. PMID:24273525

  18. A Bayesian computational model for online character recognition and disability assessment during cursive eye writing.

    PubMed

    Diard, Julien; Rynik, Vincent; Lorenceau, Jean

    2013-01-01

    This research involves a novel apparatus, in which the user is presented with an illusion inducing visual stimulus. The user perceives illusory movement that can be followed by the eye, so that smooth pursuit eye movements can be sustained in arbitrary directions. Thus, free-flow trajectories of any shape can be traced. In other words, coupled with an eye-tracking device, this apparatus enables "eye writing," which appears to be an original object of study. We adapt a previous model of reading and writing to this context. We describe a probabilistic model called the Bayesian Action-Perception for Eye On-Line model (BAP-EOL). It encodes probabilistic knowledge about isolated letter trajectories, their size, high-frequency components of the produced trajectory, and pupil diameter. We show how Bayesian inference, in this single model, can be used to solve several tasks, like letter recognition and novelty detection (i.e., recognizing when a presented character is not part of the learned database). We are interested in the potential use of the eye writing apparatus by motor impaired patients: the final task we solve by Bayesian inference is disability assessment (i.e., measuring and tracking the evolution of motor characteristics of produced trajectories). Preliminary experimental results are presented, which illustrate the method, showing the feasibility of character recognition in the context of eye writing. We then show experimentally how a model of the unknown character can be used to detect trajectories that are likely to be new symbols, and how disability assessment can be performed by opportunistically observing characteristics of fine motor control, as letters are being traced. Experimental analyses also help identify specificities of eye writing, as compared to handwriting, and the resulting technical challenges. PMID:24273525

  19. Prototyping an online wetland ecosystem services model using open model sharing standards

    USGS Publications Warehouse

    Feng, M.; Liu, S.; Euliss, N.H.; Young, Caitlin; Mushet, D.M.

    2011-01-01

    Great interest currently exists for developing ecosystem models to forecast how ecosystem services may change under alternative land use and climate futures. Ecosystem services are diverse and include supporting services or functions (e.g., primary production, nutrient cycling), provisioning services (e.g., wildlife, groundwater), regulating services (e.g., water purification, floodwater retention), and even cultural services (e.g., ecotourism, cultural heritage). Hence, the knowledge base necessary to quantify ecosystem services is broad and derived from many diverse scientific disciplines. Building the required interdisciplinary models is especially challenging as modelers from different locations and times may develop the disciplinary models needed for ecosystem simulations, and these models must be identified and made accessible to the interdisciplinary simulation. Additional difficulties include inconsistent data structures, formats, and metadata required by geospatial models as well as limitations on computing, storage, and connectivity. Traditional standalone and closed network systems cannot fully support sharing and integrating interdisciplinary geospatial models from variant sources. To address this need, we developed an approach to openly share and access geospatial computational models using distributed Geographic Information System (GIS) techniques and open geospatial standards. We included a means to share computational models compliant with Open Geospatial Consortium (OGC) Web Processing Services (WPS) standard to ensure modelers have an efficient and simplified means to publish new models. To demonstrate our approach, we developed five disciplinary models that can be integrated and shared to simulate a few of the ecosystem services (e.g., water storage, waterfowl breeding) that are provided by wetlands in the Prairie Pothole Region (PPR) of North America. © 2010 Elsevier Ltd.
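
    An OGC WPS Execute call of the kind used to invoke such shared models can be composed as a key-value-pair GET request; in the sketch below only the standard WPS 1.0.0 parameter names are real, while the server URL, process identifier and inputs are hypothetical examples.

      # Sketch of an OGC WPS 1.0.0 Execute request using key-value-pair encoding.
      # The server URL, process identifier and inputs are hypothetical examples.
      from urllib.parse import urlencode

      params = {
          "service": "WPS",
          "version": "1.0.0",
          "request": "Execute",
          "Identifier": "WaterStorageModel",                  # hypothetical process name
          "DataInputs": "wetland_id=PPR_0042;precip_mm=512",  # hypothetical inputs
      }
      url = "http://models.example.gov/wps?" + urlencode(params)
      print(url)   # an HTTP GET on this URL would return an ExecuteResponse XML document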

  20. Model-based fault detection and identification with online aerodynamic model structure selection

    NASA Astrophysics Data System (ADS)

    Lombaerts, T.

    2013-12-01

    This publication describes a recursive algorithm for the approximation of time-varying nonlinear aerodynamic models by means of a joint adaptive selection of the model structure and parameter estimation. This procedure is called adaptive recursive orthogonal least squares (AROLS) and is an extension and modification of the previously developed ROLS procedure. This algorithm is particularly useful for model-based fault detection and identification (FDI) of aerospace systems. After the failure, a completely new aerodynamic model can be elaborated recursively with respect to structure as well as parameter values. The performance of the identification algorithm is demonstrated on a simulation data set.
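
    The parameter-estimation half of such a scheme reduces to recursive least squares; the sketch below shows a plain RLS update applied to a toy linear aerodynamic model, without the orthogonal model-structure selection that distinguishes AROLS, and with made-up coefficient values.

      # Plain recursive least squares (RLS), the parameter-update half of the scheme
      # above; the adaptive model-structure selection of AROLS is not reproduced here.
      import numpy as np

      def rls_update(theta, P, x, y, lam=0.99):
          """One RLS step: regressor x, measurement y, forgetting factor lam."""
          x = x.reshape(-1, 1)
          k = P @ x / (lam + x.T @ P @ x)          # gain vector
          theta = theta + (k * (y - x.T @ theta)).ravel()
          P = (P - k @ x.T @ P) / lam              # covariance update
          return theta, P

      rng = np.random.default_rng(1)
      true_theta = np.array([0.08, -1.2, 5.5])     # illustrative lift-coefficient-style parameters
      theta, P = np.zeros(3), np.eye(3) * 1e3
      for _ in range(500):
          x = np.array([1.0, rng.uniform(-0.2, 0.2), rng.uniform(-0.1, 0.1)])
          y = x @ true_theta + rng.normal(scale=0.01)
          theta, P = rls_update(theta, P, x, y)
      print("estimated parameters:", theta)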

  1. Online calculation of global marine halocarbon emissions in the chemistry climate model EMAC

    NASA Astrophysics Data System (ADS)

    Lennartz, Sinikka T.; Krysztofiak-Tong, Gisèle; Sinnhuber, Björn-Martin; Marandino, Christa A.; Tegtmeier, Susann; Krüger, Kirstin; Ziska, Franziska; Quack, Birgit

    2015-04-01

    Marine produced trace gases such as dibromomethane (CH2Br2), bromoform (CHBr3) and methyl iodide (CH3I) significantly impact tropospheric and stratospheric chemistry. Marine emissions are the dominant source of halocarbons to the atmosphere, and therefore, it is crucial to represent them accurately in order to model their impact on atmospheric chemistry. Chemistry climate models are a frequently used tool for quantifying the influence of halocarbons on ozone depletion. In these model simulations, marine emissions of halocarbons have mainly been prescribed from established emission climatologies, thus neglecting the interaction with the actual state of the atmosphere in the model. Here, we calculate halocarbon marine emissions for the first time online by coupling the submodel AIRSEA to the chemical climate model EMAC. Our method combines prescribed water concentrations and varying atmospheric concentrations derived from the model instead of using fixed emission climatologies. This method has a number of conceptual and practical advantages, as the modelled emissions can respond consistently to changes in temperature, wind speed, possible sea ice cover and atmospheric concentration in the model. Differences between the climatology-based and the new approach (2-18%) result from consideration of the actual, time-varying state of the atmosphere and the consideration of air-side transfer velocities. Extensive comparison to observations from aircraft, ships and ground stations reveal that interactively computing the air-sea flux from prescribed water concentrations leads to equally or more accurate atmospheric concentrations in the model compared to using constant emission climatologies. The effect of considering the actual state of the atmosphere is largest for gases with concentrations close to equilibrium in the surface ocean, such as CH2Br2. Halocarbons with comparably long atmospheric lifetimes, e.g. CH2Br2, are reflected more accurately in EMAC when compared to time
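
    The online flux computation described above amounts to a bulk air-sea exchange formula, F = k_w (C_w - C_a/H); a minimal version is sketched below with an illustrative Wanninkhof-style transfer velocity and made-up numbers, not the AIRSEA submodel's actual parameterisation.

      # Minimal bulk air-sea flux sketch: F = k_w * (C_w - C_a / H).
      # The transfer-velocity fit and the numbers are illustrative only.
      def air_sea_flux(c_water, c_air, henry, u10, schmidt):
          """Flux (positive = ocean to atmosphere); concentrations in mol m-3."""
          # Quadratic wind-speed dependence with a Schmidt-number correction.
          k_w = 0.251 * u10**2 * (schmidt / 660.0) ** -0.5   # cm h-1
          k_w *= 0.01 / 3600.0                               # convert to m s-1
          return k_w * (c_water - c_air / henry)             # mol m-2 s-1

      # Example: a CHBr3-like tracer, supersaturated in the surface ocean.
      print(air_sea_flux(c_water=8e-9, c_air=4e-9, henry=0.4, u10=7.5, schmidt=800.0))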

  2. Response of different regional online coupled models to aerosol-radiation interactions

    NASA Astrophysics Data System (ADS)

    Forkel, Renate; Balzarini, Alessandra; Brunner, Dominik; Baró, Rocio; Curci, Gabriele; Hirtl, Marcus; Honzak, Luka; Jiménez-Guerrero, Pedro; Jorba, Oriol; Pérez, Juan L.; Pirovano, Guido; San José, Roberto; Schröder, Wolfram; Tuccella, Paolo; Werhahn, Johannes; Wolke, Ralf; Žabkar, Rahela

    2016-04-01

    The importance of aerosol-meteorology interactions and their representation in online coupled regional atmospheric chemistry-meteorology models was investigated in COST Action ES1004 (EuMetChem, http://eumetchem.info/). Case study results from different models (COSMO-Muscat, COSMO-ART, and different configurations of WRF-Chem), which were applied for Europe as a coordinated exercise for the year 2010, are analyzed with respect to inter-model variability and the response of the different models to direct and indirect aerosol-radiation interactions. The main focus was on two episodes - the Russian heat wave and wildfires episode in July/August 2010 and a period in October 2010 with enhanced cloud cover and rain, including an episode of Saharan dust transport to Europe. Looking at physical plausibility, the decrease in downward solar radiation and daytime temperature due to the direct aerosol effect is robust for all model configurations. The same holds for the pronounced decrease in cloud water content and increase in solar radiation for cloudy conditions and very low aerosol concentrations that was found for WRF-Chem when aerosol-cloud interactions were considered. However, when the differences were tested for statistical significance, no significant differences in mean solar radiation and mean temperature between the baseline case and the simulations including the direct and indirect effect from simulated aerosol concentrations were found over Europe for the October episode. Also for the fire episode, differences between mean temperature and radiation from the simulations with and without the direct aerosol effect were not significant for the major part of the modelling domain. Only for the region with high fire emissions in Russia, the differences in mean solar radiation and temperature due to the direct effect were found to be significant during the second half of the fire episode - however, only for a significance level of 0.1. The few observational data indicate that

  3. MESSA: MEta-Server for protein Sequence Analysis

    PubMed Central

    2012-01-01

    Background Computational sequence analysis, that is, prediction of local sequence properties, homologs, spatial structure and function from the sequence of a protein, offers an efficient way to obtain needed information about proteins under study. Since reliable prediction is usually based on the consensus of many computer programs, meta-servers have been developed to fit such needs. Most meta-servers focus on one aspect of sequence analysis, while others incorporate more information, such as PredictProtein for local sequence feature predictions, SMART for domain architecture and sequence motif annotation, and GeneSilico for secondary and spatial structure prediction. However, as predictions of local sequence properties, three-dimensional structure and function are usually intertwined, it is beneficial to address them together. Results We developed a MEta-Server for protein Sequence Analysis (MESSA) to facilitate comprehensive protein sequence analysis and gather structural and functional predictions for a protein of interest. For an input sequence, the server exploits a number of select tools to predict local sequence properties, such as secondary structure, structurally disordered regions, coiled coils, signal peptides and transmembrane helices; detect homologous proteins and assign the query to a protein family; identify three-dimensional structure templates and generate structure models; and provide predictive statements about the protein's function, including functional annotations, Gene Ontology terms, enzyme classification and possible functionally associated proteins. We tested MESSA on the proteome of Candidatus Liberibacter asiaticus. Manual curation shows that three-dimensional structure models generated by MESSA covered around 75% of all the residues in this proteome and the function of 80% of all proteins could be predicted. Availability MESSA is free for non-commercial use at http://prodata.swmed.edu/MESSA/ PMID:23031578

  4. Trust-Based Access Control Model from Sociological Approach in Dynamic Online Social Network Environment

    PubMed Central

    Kim, Seungjoo

    2014-01-01

    There has been an explosive increase in the population of the OSN (online social network) in recent years. The OSN provides users with many opportunities to communicate among friends and family. Further, it facilitates developing new relationships with previously unknown people having similar beliefs or interests. However, the OSN can expose users to adverse effects such as privacy breaches, the disclosing of uncontrolled material, and the disseminating of false information. Traditional access control models such as MAC, DAC, and RBAC are applied to the OSN to address these problems. However, these models are not suitable for the dynamic OSN environment because user behavior in the OSN is unpredictable and static access control imposes a burden on the users to change the access control rules individually. We propose a dynamic trust-based access control for the OSN to address the problems of traditional static access control. Moreover, we provide novel criteria for evaluating trust factors, drawing on a sociological approach, and a method to calculate dynamic trust values. The proposed method can monitor negative behavior and modify access permission levels dynamically to prevent the indiscriminate disclosure of information. PMID:25374943
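
    The core mechanism - recompute a trust value from observed behaviour and gate permissions on it - can be sketched as follows; the behaviour categories, weights and thresholds are hypothetical illustrations rather than the paper's evaluation criteria.

      # Sketch of dynamic, trust-gated access control. Weights, thresholds and the
      # behaviour categories are hypothetical; they do not reproduce the paper's model.
      PERMISSION_LEVELS = [(0.8, "full"), (0.5, "friends-only"), (0.2, "profile-only"), (0.0, "blocked")]

      def update_trust(trust, events, decay=0.95):
          """Decay old trust, then add weighted contributions from recent behaviour."""
          weights = {"positive_interaction": +0.05, "reported_post": -0.20, "spam_flag": -0.30}
          trust = trust * decay + sum(weights.get(e, 0.0) for e in events)
          return min(1.0, max(0.0, trust))

      def permission(trust):
          return next(level for threshold, level in PERMISSION_LEVELS if trust >= threshold)

      trust = 0.7
      for batch in (["positive_interaction"] * 3, ["reported_post", "spam_flag"]):
          trust = update_trust(trust, batch)
          print(round(trust, 2), "->", permission(trust))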

  5. A novel rumor diffusion model considering the effect of truth in online social media

    NASA Astrophysics Data System (ADS)

    Sun, Ling; Liu, Yun; Zeng, Qing-An; Xiong, Fei

    2015-12-01

    In this paper, we propose a model to investigate how truth affects rumor diffusion in online social media. Our model reveals a relation between rumor and truth — namely, when a rumor is diffusing, the truth about the rumor also diffuses with it. Two patterns of the agents used to identify rumor, self-identification and passive learning are taken into account. Combining theoretical proof and simulation analysis, we find that the threshold value of rumor diffusion is negatively correlated to the connectivity between nodes in the network and the probability β of agents knowing truth. Increasing β can reduce the maximum density of the rumor spreaders and slow down the generation speed of new rumor spreaders. On the other hand, we conclude that the best rumor diffusion strategy must balance the probability of forwarding rumor and the probability of agents losing interest in the rumor. High spread rate λ of rumor would lead to a surge in truth dissemination which will greatly limit the diffusion of rumor. Furthermore, in the case of unknown λ, increasing β can effectively reduce the maximum proportion of agents who do not know the truth, but cannot narrow the rumor diffusion range in a certain interval of β.
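
    A stripped-down version of the competing rumor/truth dynamics can be simulated with a few population fractions; the update rules and rate values below are illustrative only and do not reproduce the paper's model equations.

      # Toy simulation of competing rumor/truth spreading in a well-mixed population.
      # Update rules and rates are illustrative, not the paper's model.
      def simulate(lam=0.4, beta=0.1, mu=0.05, steps=200, n=10000):
          ignorant, rumor, truth = n - 10, 10, 0        # initial counts
          peak = 0.0
          for _ in range(steps):
              new_rumor = lam * ignorant * rumor / n              # rumor reaches ignorants
              new_truth = beta * ignorant * truth / n             # truth reaches ignorants
              debunked = mu * rumor                               # spreaders learn the truth
              ignorant -= new_rumor + new_truth
              rumor += new_rumor - debunked
              truth += new_truth + debunked
              peak = max(peak, rumor / n)
          return peak                                             # peak rumor-spreader density

      for beta in (0.05, 0.2, 0.5):
          print("beta =", beta, "peak rumor density ~", round(simulate(beta=beta), 3))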

  6. SSIC model: A multi-layer model for intervention of online rumors spreading

    NASA Astrophysics Data System (ADS)

    Tian, Ru-Ya; Zhang, Xue-Fu; Liu, Yi-Jun

    2015-06-01

    The SIR model is a classical model to simulate rumor spreading, while the supernetwork is an effective tool for modeling complex systems. Based on an Opinion SuperNetwork involving a Social Sub-network, Environmental Sub-network, Psychological Sub-network, and Viewpoint Sub-network, and drawing from the modeling idea of the SIR model, this paper designs a super SIC model (SSIC model) and its evolution rules, and also analyzes the intervention effects on public opinion of four elements of the supernetwork: opinion agent, opinion environment, agent's psychology and viewpoint. Studies show that the SSIC model based on the supernetwork has effective intervention effects on rumor spreading. It is worth noting that (i) identifying rumor spreaders in the Social Sub-network and isolating them can achieve the desired intervention results, (ii) improving environmental information transparency so that the public knows as much information as possible is a feasible way to intervene against rumors, and (iii) persuading wavering neutrals has better intervention effects than clarifying rumors already spread everywhere, so rumors should be addressed promptly and in time through psychological counseling.

  7. Bilbao Crystallographic Server. II. Representations of crystallographic point groups and space groups.

    PubMed

    Aroyo, Mois I; Kirov, Asen; Capillas, Cesar; Perez-Mato, J M; Wondratschek, Hans

    2006-03-01

    The Bilbao Crystallographic Server is a web site with crystallographic programs and databases freely available on-line (http://www.cryst.ehu.es). The server gives access to general information related to crystallographic symmetry groups (generators, general and special positions, maximal subgroups, Brillouin zones etc.). Apart from the simple tools for retrieving the stored data, there are programs for the analysis of group-subgroup relations between space groups (subgroups and supergroups, Wyckoff-position splitting schemes etc.). There are also software packages studying specific problems of solid-state physics, structural chemistry and crystallography. This article reports on the programs treating representations of point and space groups. There are tools for the construction of irreducible representations, for the study of the correlations between representations of group-subgroup pairs of space groups and for the decompositions of Kronecker products of representations. PMID:16489249

  8. San Mateo County's Server Information Program (S.I.P.): A Community-Based Alcohol Server Training Program.

    ERIC Educational Resources Information Center

    de Miranda, John

    The field of alcohol server awareness and training has grown dramatically in the past several years and the idea of training servers to reduce alcohol problems has become a central fixture in the current alcohol policy debate. The San Mateo County, California Server Information Program (SIP) is a community-based prevention strategy designed to…

  9. ECHMERIT: A new on-line global mercury-chemistry model

    NASA Astrophysics Data System (ADS)

    Jung, G.; Hedgecock, I. M.; Pirrone, N.

    2009-04-01

    Mercury is a volatile metal of concern because, when deposited and transformed to methylmercury, it accumulates within the food web. Due to the long lifetime of elemental mercury, which is the dominant fraction of mercury species in the atmosphere, mercury is prone to long-range transport and is therefore distributed over the globe, transported and hence deposited even in regions far from anthropogenic emission sources. Mercury is released to the atmosphere from a variety of natural and anthropogenic sources, in elemental and oxidised forms, and as particulate mercury. It is then transported, but also transformed chemically in the gaseous phase, as well as in the aqueous phase within cloud and rain droplets. Mercury (particularly its oxidised forms) is removed from the atmosphere through wet and dry deposition processes; a large fraction of deposited mercury is, after chemical or biological reduction, re-emitted to the atmosphere as elemental mercury. To investigate mercury chemistry and transport processes on the global scale, the new global model ECHMERIT has been developed. ECHMERIT simulates meteorology, transport, deposition, photolysis and chemistry on-line. The general circulation model on which ECHMERIT is based is ECHAM5. Sophisticated chemical modules have been implemented, including gas phase chemistry based on the CBM-Z chemistry mechanism, as well as aqueous phase chemistry, both of which have been adapted to include Hg chemistry and Hg species gas-droplet mass transfer. ECHMERIT uses the fast-J photolysis routine. State-of-the-art procedures simulating wet and dry deposition and emissions were adapted and included in the model as well. An overview of the model structure, development, validation and sensitivity studies is presented.

  10. iMODS: internal coordinates normal mode analysis server

    PubMed Central

    López-Blanco, José Ramón; Aliaga, José I.; Quintana-Ortí, Enrique S.; Chacón, Pablo

    2014-01-01

    Normal mode analysis (NMA) in internal (dihedral) coordinates naturally reproduces the collective functional motions of biological macromolecules. iMODS facilitates the exploration of such modes and generates feasible transition pathways between two homologous structures, even with large macromolecules. The distinctive internal coordinate formulation improves the efficiency of NMA and extends its applicability while implicitly maintaining stereochemistry. Vibrational analysis, motion animations and morphing trajectories can be easily carried out at different resolution scales almost interactively. The server is versatile; non-specialists can rapidly characterize potential conformational changes, whereas advanced users can customize the model resolution with multiple coarse-grained atomic representations and elastic network potentials. iMODS supports advanced visualization capabilities for illustrating collective motions, including an improved affine-model-based arrow representation of domain dynamics. The generated all-heavy-atoms conformations can be used to introduce flexibility for more advanced modeling or sampling strategies. The server is free and open to all users with no login requirement at http://imods.chaconlab.org. PMID:24771341

  11. An online mineral dust model within the global/regional NMMB: current progress and plans

    NASA Astrophysics Data System (ADS)

    Perez, C.; Haustein, K.; Janjic, Z.; Jorba, O.; Baldasano, J. M.; Black, T.; Nickovic, S.

    2008-12-01

    While mineral dust distribution and effects are important on global scales, they strongly depend on dust emissions that occur on small spatial and temporal scales. Indeed, the accuracy of the surface wind speed used in dust models is crucial. Due to the high-order power dependency on wind friction velocity and the threshold behaviour of dust emissions, small errors in surface wind speed lead to large dust emission errors. Most global dust models use prescribed wind fields provided by major meteorological centres (e.g., NCEP and ECMWF) and their spatial resolution is currently about 1 degree x 1 degree. Such wind speeds tend to be strongly underestimated over arid and semi-arid areas and do not account for mesoscale systems responsible for a significant fraction of dust emissions regionally and globally. Other significant uncertainties in dust emissions resulting from such approaches are related to the misrepresentation of high subgrid-scale spatial heterogeneity in soil and vegetation boundary conditions, mainly in semi-arid areas. In order to significantly reduce these uncertainties, the Barcelona Supercomputing Center is currently implementing a mineral dust model coupled on-line with the new global/regional NMMB atmospheric model using the ESMF framework under development in NOAA/NCEP/EMC. The NMMB is an evolution of the operational WRF-NMM extending from meso to global scales, including a non-hydrostatic option and improved tracer advection. This model is planned to become the next-generation NCEP mesoscale model for operational weather forecasting in North America. The current implementation is based on the well-established regional dust model and forecast system Eta/DREAM (http://www.bsc.es/projects/earthscience/DREAM/). First successful global simulations show the potential of such an approach and compare well with DREAM regionally. Ongoing developments include improvements in dust size distribution representation, sedimentation, dry deposition, wet
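
    The threshold behaviour noted above is the key numerical sensitivity; a generic White (1979)-style horizontal saltation flux, sketched below with illustrative constants (not the NMMB/BSC scheme), makes the disproportionate response to wind errors explicit.

      # Generic saltation/dust-flux sketch illustrating the threshold behaviour of
      # emissions with respect to friction velocity; constants are illustrative and
      # this is not the NMMB dust module's actual parameterisation.
      def horizontal_flux(ustar, ustar_t, c=1.0, rho_air=1.2, g=9.81):
          """White (1979)-style horizontal saltation flux; zero below the threshold."""
          if ustar <= ustar_t:
              return 0.0
          return c * (rho_air / g) * ustar**3 * (1.0 - ustar_t**2 / ustar**2) * (1.0 + ustar_t / ustar)

      # A ~10% wind change around the threshold changes the flux disproportionately:
      for ustar in (0.35, 0.40, 0.44):
          print(ustar, "->", round(horizontal_flux(ustar, ustar_t=0.35), 4))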

  12. Data decomposition of Monte Carlo particle transport simulations via tally servers

    SciTech Connect

    Romano, Paul K.; Siegel, Andrew R.; Forget, Benoit; Smith, Kord

    2013-11-01

    An algorithm for decomposing large tally data in Monte Carlo particle transport simulations is developed, analyzed, and implemented in a continuous-energy Monte Carlo code, OpenMC. The algorithm is based on a non-overlapping decomposition of compute nodes into tracking processors and tally servers. The former are used to simulate the movement of particles through the domain while the latter continuously receive and update tally data. A performance model for this approach is developed, suggesting that, for a range of parameters relevant to LWR analysis, the tally server algorithm should perform with minimal overhead on contemporary supercomputers. An implementation of the algorithm in OpenMC is then tested on the Intrepid and Titan supercomputers, supporting the key predictions of the model over a wide range of parameters. We thus conclude that the tally server algorithm is a successful approach to circumventing classical on-node memory constraints en route to unprecedentedly detailed Monte Carlo reactor simulations.
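
    The tracking-processor/tally-server split can be mimicked on a single machine with multiprocessing; the sketch below only illustrates the decomposition idea and is not OpenMC's algorithm or data layout.

      # Toy illustration of the tracker / tally-server decomposition: trackers simulate
      # particles and ship scores to a server process that owns the tally array.
      import multiprocessing as mp
      import random

      N_BINS = 8

      def tracker(queue, n_particles, seed):
          random.seed(seed)
          for _ in range(n_particles):
              bin_index = random.randrange(N_BINS)      # stand-in for a geometry/tally lookup
              queue.put((bin_index, random.random()))   # (tally bin, score)
          queue.put(None)                               # this tracker is finished

      def tally_server(queue, n_trackers):
          tallies = [0.0] * N_BINS
          finished = 0
          while finished < n_trackers:
              item = queue.get()
              if item is None:
                  finished += 1
              else:
                  bin_index, score = item
                  tallies[bin_index] += score
          print("accumulated tallies:", [round(t, 2) for t in tallies])

      if __name__ == "__main__":
          q = mp.Queue()
          workers = [mp.Process(target=tracker, args=(q, 1000, s)) for s in range(4)]
          server = mp.Process(target=tally_server, args=(q, len(workers)))
          server.start()
          for w in workers:
              w.start()
          for w in workers:
              w.join()
          server.join()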

  13. PSSweb: protein structural statistics web server.

    PubMed

    Gaillard, Thomas; Stote, Roland H; Dejaegere, Annick

    2016-07-01

    With the increasing number of protein structures available, there is a need for tools capable of automating the comparison of ensembles of structures, a common requirement in structural biology and bioinformatics. PSSweb is a web server for protein structural statistics. It takes as input an ensemble of PDB files of protein structures, performs a multiple sequence alignment and computes structural statistics for each position of the alignment. Different optional functionalities are proposed: structure superposition, Cartesian coordinate statistics, dihedral angle calculation and statistics, and a cluster analysis based on dihedral angles. An interactive report is generated, containing a summary of the results, tables, figures and 3D visualization of superposed structures. The server is available at http://pssweb.org. PMID:27174930

  14. Energy Servers Deliver Clean, Affordable Power

    NASA Technical Reports Server (NTRS)

    2010-01-01

    K.R. Sridhar developed a fuel cell device for Ames Research Center that could use solar power to split water into oxygen for breathing and hydrogen for fuel on Mars. Sridhar saw the potential of the technology, when reversed, to create clean energy on Earth. He founded Bloom Energy, of Sunnyvale, California, to advance the technology. Today, the Bloom Energy Server is providing cost-effective, environmentally friendly energy to a host of companies such as eBay, Google, and The Coca-Cola Company. Bloom's NASA-derived Energy Servers generate energy that is about 67-percent cleaner than a typical coal-fired power plant when using fossil fuels and 100-percent cleaner with renewable fuels.

  15. PSSweb: protein structural statistics web server

    PubMed Central

    Gaillard, Thomas; Stote, Roland H.; Dejaegere, Annick

    2016-01-01

    With the increasing number of protein structures available, there is a need for tools capable of automating the comparison of ensembles of structures, a common requirement in structural biology and bioinformatics. PSSweb is a web server for protein structural statistics. It takes as input an ensemble of PDB files of protein structures, performs a multiple sequence alignment and computes structural statistics for each position of the alignment. Different optional functionalities are proposed: structure superposition, Cartesian coordinate statistics, dihedral angle calculation and statistics, and a cluster analysis based on dihedral angles. An interactive report is generated, containing a summary of the results, tables, figures and 3D visualization of superposed structures. The server is available at http://pssweb.org. PMID:27174930

  16. Multiple-server Flexible Blind Quantum Computation in Networks

    NASA Astrophysics Data System (ADS)

    Kong, Xiaoqin; Li, Qin; Wu, Chunhui; Yu, Fang; He, Jinjun; Sun, Zhiyuan

    2016-06-01

    Blind quantum computation (BQC) can allow a client with limited quantum power to delegate his quantum computation to a powerful server and still keep his own data private. In this paper, we present a multiple-server flexible BQC protocol, where a client who only needs the ability to access quantum channels can delegate the computational task to a number of servers. In particular, the client's quantum computation can still be achieved even when one or more of the delegated quantum servers break down in networks. In other words, when connections to certain quantum servers are lost, clients can adjust flexibly and delegate their quantum computation to other servers. Obviously it is trivial that the computation will be unsuccessful if all servers are interrupted.

  17. Multiple-server Flexible Blind Quantum Computation in Networks

    NASA Astrophysics Data System (ADS)

    Kong, Xiaoqin; Li, Qin; Wu, Chunhui; Yu, Fang; He, Jinjun; Sun, Zhiyuan

    2016-02-01

    Blind quantum computation (BQC) can allow a client with limited quantum power to delegate his quantum computation to a powerful server and still keep his own data private. In this paper, we present a multiple-server flexible BQC protocol, where a client who only needs the ability to access quantum channels can delegate the computational task to a number of servers. In particular, the client's quantum computation can still be achieved even when one or more of the delegated quantum servers break down in networks. In other words, when connections to certain quantum servers are lost, clients can adjust flexibly and delegate their quantum computation to other servers. Obviously it is trivial that the computation will be unsuccessful if all servers are interrupted.

  18. Online Sellers’ Website Quality Influencing Online Buyers’ Purchase Intention

    NASA Astrophysics Data System (ADS)

    Shea Lee, Tan; Ariff, Mohd Shoki Md; Zakuan, Norhayati; Sulaiman, Zuraidah; Zameri Mat Saman, Muhamad

    2016-05-01

    The increasing adoption of the Internet among young users in Malaysia provides high prospects for online sellers. Young users aged between 18 and 25 years old are important to online sellers because they are actively involved in online purchasing, and this group of online buyers is expected to dominate the future online market. Therefore, examining online sellers’ website quality and online buyers’ purchase intention is crucial. Based on the Theory of Planned Behavior (TPB), a conceptual model of online sellers’ website quality and purchase intention of online buyers was developed. The E-tailQ instrument was adapted in this study; it comprises website design, reliability/fulfillment, security, privacy & trust, and customer service. Using an online questionnaire and a convenience sampling procedure, primary data were obtained from 240 online buyers aged between 18 and 25 years old. It was discovered that website design, website reliability/fulfillment, website security, privacy & trust, and website customer service positively and significantly influence the intention of online buyers to continuously purchase via online channels. This study concludes that online sellers’ website quality is important in predicting online buyers’ purchase intention. Recommendations and implications of this study are discussed, focusing on how online sellers should improve their website quality to stay competitive in online business.

  19. Implementing a secure client/server application

    SciTech Connect

    Kissinger, B.A.

    1994-08-01

    There has been an increasing number of attacks and security breaches on computer systems. Particularly vulnerable are systems that exchange user names and passwords directly across a network without encryption. These kinds of systems include many commercial-off-the-shelf client/server applications. A secure technique for authenticating computer users and transmitting passwords through the use of a trusted "broker" and public/private keys is described in this paper.
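
    The underlying idea - never send a password in the clear, but encrypt it under a public key so that only the holder of the private key (here standing in for the broker) can read it - can be shown with the Python cryptography package; this is a generic RSA-OAEP illustration, not the specific trusted-broker protocol of the paper.

      # Generic illustration of sending a password encrypted under a public key, so it
      # never crosses the network in the clear. Standard RSA-OAEP, not the paper's protocol.
      from cryptography.hazmat.primitives import hashes
      from cryptography.hazmat.primitives.asymmetric import rsa, padding

      # In practice the broker publishes its public key; here we generate both halves.
      private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
      public_key = private_key.public_key()

      oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                          algorithm=hashes.SHA256(), label=None)

      ciphertext = public_key.encrypt(b"user:alice password:s3cret", oaep)   # client side
      plaintext = private_key.decrypt(ciphertext, oaep)                      # broker side
      print(plaintext)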

  20. Evidence implementation: Development of an online methodology from the knowledge-to-action model of knowledge translation.

    PubMed

    Lockwood, Craig; Stephenson, Matthew; Lizarondo, Lucylynn; van Den Hoek, Joan; Harrison, Margaret

    2016-08-01

    This paper describes an online facilitation for operationalizing the knowledge-to-action (KTA) model. The KTA model incorporates implementation planning that is optimally suited to the information needs of clinicians. The can-implement© is an evidence implementation process informed by the KTA model. An online counterpart, the can-implement.pro©, was developed to enable greater dissemination and utilization of the can-implement© process. The driver for this work was health professionals' need for facilitation that is iterative, informed by context and localized to the specific needs of users. The literature supporting this paper includes evaluation studies and theoretical concepts relevant to KTA model, evidence implementation and facilitation. Nursing and other health disciplines require a skill set and resources to successfully navigate the complexity of organizational requirements, inter-professional leadership and day-to-day practical management to implement evidence into clinical practice. The can-implement.pro© provides an accessible, inclusive system for evidence implementation projects. There is empirical support for evidence implementation informed by the KTA model, which in this phase of work has been developed for online uptake. Nurses and other clinicians seeking to implement evidence could benefit from the directed actions, planning advice and information embedded in the phases and steps of can-implement.pro©. PMID:27562662

  1. COMPASS server for remote homology inference.

    PubMed

    Sadreyev, Ruslan I; Tang, Ming; Kim, Bong-Hyun; Grishin, Nick V

    2007-07-01

    COMPASS is a method for homology detection and local alignment construction based on the comparison of multiple sequence alignments (MSAs). The method derives numerical profiles from given MSAs, constructs local profile-profile alignments and analytically estimates E-values for the detected similarities. Until now, COMPASS was only available for download and local installation. Here, we present a new web server featuring the latest version of COMPASS, which provides (i) increased sensitivity and selectivity of homology detection; (ii) longer, more complete alignments; and (iii) faster computational speed. After submission of the query MSA or single sequence, the server performs searches versus a user-specified database. The server includes detailed and intuitive control of the search parameters. A flexible output format, structured similarly to BLAST and PSI-BLAST, provides an easy way to read and analyze the detected profile similarities. Brief help sections are available for all input parameters and output options, along with detailed documentation. To illustrate the value of this tool for protein structure-functional prediction, we present two examples of detecting distant homologs for uncharacterized protein families. Available at http://prodata.swmed.edu/compass. PMID:17517780

  2. Development of 2MASS Catalog Server Kit

    NASA Astrophysics Data System (ADS)

    Yamauchi, Chisato

    2011-11-01

    We develop a software kit called "2MASS Catalog Server Kit" to easily construct a high-performance database server for the 2MASS Point Source Catalog (includes 470,992,970 objects) and several all-sky catalogs. Users can perform fast radial search and rectangular search using provided stored functions in SQL similar to SDSS SkyServer. Our software kit utilizes open-source RDBMS, and therefore any astronomers and developers can install our kit on their personal computers for research, observation, etc. Our kit is tuned for optimal coordinate search performance. We implement an effective radial search using an orthogonal coordinate system, which does not need any techniques that depend on HTM or HEALPix. Applying the xyz coordinate system to the database index, we can easily implement a system of fast radial search for relatively small (less than several million rows) catalogs. To enable high-speed search of huge catalogs on RDBMS, we apply three additional techniques: table partitioning, composite expression index, and optimization in stored functions. As a result, we obtain satisfactory performance of radial search for the 2MASS catalog. Our system can also perform fast rectangular search. It is implemented using techniques similar to those applied for radial search. Our implementation approach yields a compact system and offers useful hints for the low-cost development of other huge catalog databases.
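
    The xyz-index technique described above can be sketched in a few lines: convert the search centre to a unit vector, pre-filter with a bounding box on indexed x, y, z columns, and then apply the exact chord-distance cut. The sketch below uses illustrative table and column names, not the kit's actual schema.

        import math

        def radial_search_sql(ra_deg, dec_deg, radius_deg, table="twomass_psc"):
            # Build a cone-search query against unit-vector (x, y, z) columns.
            # Table and column names are illustrative, not the kit's schema.
            ra, dec, r = map(math.radians, (ra_deg, dec_deg, radius_deg))
            cx = math.cos(dec) * math.cos(ra)
            cy = math.cos(dec) * math.sin(ra)
            cz = math.sin(dec)
            d = 2.0 * math.sin(r / 2.0)   # chord length matching the search radius
            # The BETWEEN box lets the RDBMS use the (x, y, z) index; the exact
            # chord-distance test then trims the corners of the box.
            return (
                f"SELECT * FROM {table} "
                f"WHERE x BETWEEN {cx - d} AND {cx + d} "
                f"AND y BETWEEN {cy - d} AND {cy + d} "
                f"AND z BETWEEN {cz - d} AND {cz + d} "
                f"AND (x - ({cx}))*(x - ({cx})) + (y - ({cy}))*(y - ({cy})) "
                f"+ (z - ({cz}))*(z - ({cz})) <= {d * d}"
            )

        print(radial_search_sql(266.417, -29.008, 0.1))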

  3. An expert system to perform on-line controller restructuring for abrupt model changes

    NASA Technical Reports Server (NTRS)

    Litt, Jonathan S.

    1990-01-01

    Work in progress on an expert system used to reconfigure and tune airframe/engine control systems on-line in real time in response to battle damage or structural failures is presented. The closed-loop system is monitored constantly for changes in structure and performance, the detection of which prompts the expert system to choose and apply a particular control restructuring algorithm based on the type and severity of the damage. Each algorithm is designed to handle specific types of failures and each is applicable only in certain situations. The expert system uses information about the system model to identify the failure and to select the technique best suited to compensate for it. A depth-first search is used to find a solution. Once a new controller is designed and implemented, it must be tuned to recover the original closed-loop handling qualities and responsiveness from the degraded system. Ideally, the pilot should not be able to tell the difference between the original and redesigned systems. The key is that the system must have inherent redundancy so that degraded or missing capabilities can be restored by creative use of alternate functionalities. With enough redundancy in the control system, minor battle damage affecting individual control surfaces or actuators, compressor efficiency, etc., can be compensated for such that the closed-loop performance is not noticeably altered. The work is applied to a Black Hawk/T700 system.

  4. Improving education in primary care: development of an online curriculum using the blended learning model

    PubMed Central

    Lewin, Linda Orkin; Singh, Mamta; Bateman, Betzi L; Glover, Pamela Bligh

    2009-01-01

    Background Standardizing the experiences of medical students in a community preceptorship where clinical sites vary by geography and discipline can be challenging. Computer-assisted learning is prevalent in medical education and can help standardize experiences, but often is not used to its fullest advantage. A blended learning curriculum combining web-based modules with face-to-face learning can ensure students obtain core curricular principles. Methods This course was developed and used at The Case Western Reserve University School of Medicine and its associated preceptorship sites in the greater Cleveland area. Leaders of a two-year elective continuity experience at the Case Western Reserve School of Medicine used adult learning principles to develop four interactive online modules presenting basics of office practice, difficult patient interviews, common primary care diagnoses, and disease prevention. They can be viewed at . Students completed surveys rating the content and technical performance of each module and completed a Generalist OSCE exam at the end of the course. Results Participating students rated all aspects of the course highly; particularly those related to charting and direct patient care. Additionally, they scored very well on the Generalist OSCE exam. Conclusion Students found the web-based modules to be valuable and to enhance their clinical learning. The blended learning model is a useful tool in designing web-based curriculum for enhancing the clinical curriculum of medical students. PMID:19515243

  5. Using an online game to evaluate effective methods of communicating ensemble model output to different audiences

    NASA Astrophysics Data System (ADS)

    Stephens, E. M.; Mylne, K.; Spiegelhalter, D.

    2011-12-01

    Effective communication of probabilistic forecasts for weather and climate applications is vital for improved understanding and decision making by the public and other end-users. Probabilistic predictions are frequently produced for uses such as hurricane warnings or climate change impact assessments, usually using ensemble prediction systems, but limited research has been undertaken to explore the best methods of communicating this information. The communication of forecasts produced by ensemble prediction systems is one of the major challenges facing meteorologists today. Most, if not all, ensemble output is not currently communicated to the general public, leading to a widening gap between what information is computed and what is provided. A lack of public understanding and the difficulty in presenting such complex probabilistic information are two reasons often cited for not communicating ensemble weather forecasts to the public. Using an online game we explore these issues by evaluating the ability of participants to make decisions using a number of different methods of presenting probabilistic temperature and rainfall predictions. Participants are segmented demographically to better understand how outcomes vary between audiences of different backgrounds and levels of expertise. The insights gained from this work on day-to-day weather forecasts have implications for effective communication across the wider ensemble modeling community.

  6. SAGExplore: a web server for unambiguous tag mapping in serial analysis of gene expression oriented to gene discovery and annotation.

    PubMed

    Norambuena, Tomás; Malig, Rodrigo; Melo, Francisco

    2007-07-01

    We describe a web server for the accurate mapping of experimental tags in serial analysis of gene expression (SAGE). The core of the server relies on a database of genomic virtual tags built by a recently described method that attempts to reduce the amount of ambiguous assignments for those tags that are not unique in the genome. The method provides a complete annotation of potential virtual SAGE tags within a genome, along with an estimation of their confidence for experimental observation that ranks tags that present multiple matches in the genome. The output of the server consists of a table in HTML format that contains links to a graphic representation of the results and to some external servers and databases, facilitating the tasks of analysis of gene expression and gene discovery. Also, a table in tab delimited text format is produced, allowing the user to export the results into custom databases and software for further analysis. The current server version provides the most accurate and complete SAGE tag mapping source that is available for the yeast organism. In the near future, this server will also allow the accurate mapping of experimental SAGE-tags from other model organisms such as human, mouse, frog and fly. The server is freely available on the web at: http://dna.bio.puc.cl/SAGExplore.html. PMID:17626053

  7. A web-server of cell type discrimination system.

    PubMed

    Wang, Anyou; Zhong, Yan; Wang, Yanhua; He, Qianchuan

    2014-01-01

    Discriminating cell types is a daily request for stem cell biologists. However, there is not a user-friendly system available to date for public users to discriminate the common cell types, embryonic stem cells (ESCs), induced pluripotent stem cells (iPSCs), and somatic cells (SCs). Here, we develop WCTDS, a web-server of cell type discrimination system, to discriminate the three cell types and their subtypes like fetal versus adult SCs. WCTDS is developed as a top layer application of our recent publication regarding cell type discriminations, which employs DNA-methylation as biomarkers and machine learning models to discriminate cell types. Implemented by Django, Python, R, and Linux shell programming, run under a Linux-Apache web server, and communicated through MySQL, WCTDS provides a friendly framework to efficiently receive the user input and to run mathematical models for analyzing data and then to present results to users. This framework is flexible and easy to extend for other applications. Therefore, WCTDS works as a user-friendly framework to discriminate cell types and subtypes and it can also be extended to detect other cell types like cancer cells. PMID:24578634

  8. A Web-Server of Cell Type Discrimination System

    PubMed Central

    Zhong, Yan

    2014-01-01

    Discriminating cell types is a daily request for stem cell biologists. However, there is not a user-friendly system available to date for public users to discriminate the common cell types, embryonic stem cells (ESCs), induced pluripotent stem cells (iPSCs), and somatic cells (SCs). Here, we develop WCTDS, a web-server of cell type discrimination system, to discriminate the three cell types and their subtypes like fetal versus adult SCs. WCTDS is developed as a top layer application of our recent publication regarding cell type discriminations, which employs DNA-methylation as biomarkers and machine learning models to discriminate cell types. Implemented by Django, Python, R, and Linux shell programming, run under a Linux-Apache web server, and communicated through MySQL, WCTDS provides a friendly framework to efficiently receive the user input and to run mathematical models for analyzing data and then to present results to users. This framework is flexible and easy to extend for other applications. Therefore, WCTDS works as a user-friendly framework to discriminate cell types and subtypes and it can also be extended to detect other cell types like cancer cells. PMID:24578634

  9. RNAssess--a web server for quality assessment of RNA 3D structures.

    PubMed

    Lukasiak, Piotr; Antczak, Maciej; Ratajczak, Tomasz; Szachniuk, Marta; Popenda, Mariusz; Adamiak, Ryszard W; Blazewicz, Jacek

    2015-07-01

    Nowadays, various methodologies can be applied to model RNA 3D structure. Thus, the plausible quality assessment of 3D models has a fundamental impact on the progress of structural bioinformatics. Here, we present the RNAssess server, a novel tool dedicated to visual evaluation of RNA 3D models in the context of the known reference structure for a wide range of accuracy levels (from atomic to the whole-molecule perspective). The proposed server is based on the concept of local neighborhood, defined as a set of atoms observed within a sphere localized around a central atom of a particular residue. A distinctive feature of our server is the ability to perform simultaneous visual analysis of the model-reference structure coherence. RNAssess supports the quality assessment through delivering both static and interactive visualizations that allow easy identification of native-like models and/or chosen structural regions of the analyzed molecule. A combination of results provided by RNAssess allows us to rank analyzed models. RNAssess offers a new route to a fast and efficient 3D model evaluation suitable for the RNA-Puzzles challenge. The proposed automated tool is implemented as a free and open to all users web server with a user-friendly interface and can be accessed at: http://rnassess.cs.put.poznan.pl/. PMID:26068469
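
    The local-neighborhood concept is easy to illustrate: collect all atoms within a fixed-radius sphere around a chosen central atom and compare that set between model and reference. The toy sketch below uses hypothetical function names and is not RNAssess code.

        import numpy as np

        def local_neighborhood(coords, center_idx, radius=8.0):
            # Indices of atoms within `radius` angstroms of the atom at `center_idx`;
            # `coords` is an (N, 3) array of atomic coordinates.
            dist = np.linalg.norm(coords - coords[center_idx], axis=1)
            return np.where(dist <= radius)[0]

        # Comparing the neighborhoods around corresponding central atoms of a model
        # and the reference structure gives a per-residue similarity signal.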

  10. The Purpose, Design, and Evolution of Online Interactive Textbooks: The Digital Learning Interactive Model.

    ERIC Educational Resources Information Center

    Smith, Ronald

    2000-01-01

    Describes the Digital Learning Interactive textbook which allows instructors to customize an online textbook to meet the needs of the instructor and the students. Discusses the features and components aimed at engaging students in the Digital Learning Interactive text. (CMK)

  11. HMMER web server: 2015 update

    PubMed Central

    Finn, Robert D.; Clements, Jody; Arndt, William; Miller, Benjamin L.; Wheeler, Travis J.; Schreiber, Fabian; Bateman, Alex; Eddy, Sean R.

    2015-01-01

    The HMMER website, available at http://www.ebi.ac.uk/Tools/hmmer/, provides access to the protein homology search algorithms found in the HMMER software suite. Since the first release of the website in 2011, the search repertoire has been expanded to include the iterative search algorithm, jackhmmer. The continued growth of the target sequence databases means that traditional tabular representations of significant sequence hits can be overwhelming to the user. Consequently, additional ways of presenting homology search results have been developed, allowing them to be summarised according to taxonomic distribution or domain architecture. The taxonomy and domain architecture representations can be used in combination to filter the results according to the needs of a user. Searches can also be restricted prior to submission using a new taxonomic filter, which not only ensures that the results are specific to the requested taxonomic group, but also improves search performance. The repertoire of profile hidden Markov model libraries, which are used for annotation of query sequences with protein families and domains, has been expanded to include the libraries from CATH-Gene3D, PIRSF, Superfamily and TIGRFAMs. Finally, we discuss the relocation of the HMMER webserver to the European Bioinformatics Institute and the potential impact that this will have. PMID:25943547

  12. Applying the Context, Input, Process, Product Evaluation Model for Evaluation, Research, and Redesign of an Online Master's Program

    ERIC Educational Resources Information Center

    Sancar Tokmak, Hatice; Meltem Baturay, H.; Fadde, Peter

    2013-01-01

    This study aimed to evaluate and redesign an online master's degree program consisting of 12 courses from the informatics field using a context, input, process, product (CIPP) evaluation model. Research conducted during the redesign of the online program followed a mixed methodology in which data was collected through a CIPP survey,…

  13. The STAR Online Control System

    NASA Astrophysics Data System (ADS)

    Gilkes, M. L.; Chrin, J.; Olchanski, K.; Pruneau, C. A.; Stone, N. T. B.; Wenaus, T.

    1998-10-01

    The STAR Online Software Group has designed and built a complete control system for the STAR experiment. We support SUN Solaris and Windows NT, and utilize commercial software packages including Orbix (C++) for CORBA IPC, Objectivity/DB (C++) for the configuration database, Borland JBuilder for Java GUI development, EPICS and CDEV for hardware interfacing, and RogueWave libraries (STL, Tools.h++, Threads.h++, Net.h++). The system embodies a unified object-oriented approach to experiment control. Device-specific details are encapsulated in a single server unique to each subsystem (i.e. DAQ, Trigger, sub-detectors). Key online system features include management of subsystem states, configuration management, CORBA messaging, arbitration and synchronization of multiple runs, participation of subsystems in multiple runs, a user interface incorporating ROOT and its C++ interpreter for scripting, JAVA control GUIs with automatic logging, and an online event pool from which consumers can interactively select events.

  14. NMMB/BSC-DUST: an online mineral dust atmospheric model from meso to global scales

    NASA Astrophysics Data System (ADS)

    Haustein, K.; Pérez, C.; Jorba, O.; Baldasano, J. M.; Janjic, Z.; Black, T.; Nickovic, S.

    2009-04-01

    While mineral dust distribution and effects are important at global scales, they strongly depend on dust emissions that are controlled on small spatial and temporal scales. Most global dust models use prescribed wind fields provided by meteorological centers (e.g., NCEP and ECMWF) and their spatial resolution is currently never better than about 1°×1°. Regional dust models offer substantially higher resolution (10-20 km) and are typically coupled with weather forecast models that simulate processes that GCMs either cannot resolve or can resolve only poorly. These include internal circulation features such as the low-level nocturnal jet, which is a crucial feature for dust emission in several dust 'hot spot' sources in North Africa. Based on our modeling experience with the BSC-DREAM regional forecast model (http://www.bsc.es/projects/earthscience/DREAM/) we are currently implementing an improved mineral dust model [Pérez et al., 2008] coupled online with the new global/regional NMMB atmospheric model under development in NOAA/NCEP/EMC [Janjic, 2005]. The NMMB is an evolution of the operational WRF-NMM, extending from meso to global scales. The NMMB will become the next-generation NCEP model for operational weather forecast in 2010. The corresponding unified non-hydrostatic dynamical core ranges from meso to global scale, allowing regional and global simulations. It has an add-on non-hydrostatic module and is based on the Arakawa B-grid and hybrid pressure-sigma vertical coordinates. NMMB is fully embedded into the Earth System Modeling Framework (ESMF), treating dynamics and physics separately and coupling them easily within the ESMF structure. Our main goal is to provide global dust forecasts up to 7 days at mesoscale resolutions. New features of the model include a physically-based dust emission scheme after White [1979], Iversen and White [1982] and Marticorena and Bergametti [1995] that takes the effects of saltation and sandblasting into account.

  15. Using a Global Climate Model in an On-line Climate Change Course

    NASA Astrophysics Data System (ADS)

    Randle, D. E.; Chandler, M. A.; Sohl, L. E.

    2012-12-01

    Seminars on Science: Climate Change is an on-line, graduate-level teacher professional development course offered by the American Museum of Natural History. It is an intensive 6-week course covering a broad range of global climate topics, from the fundamentals of the climate system, to the causes of climate change, the role of paleoclimate investigations, and a discussion of potential consequences and risks. The instructional method blends essays, videos, textbooks, and linked websites, with required participation in electronic discussion forums that are moderated by an experienced educator and a course scientist. Most weeks include additional assignments. Three of these assignments employ computer models, including two weeks spent working with a full-fledged 3D global climate model (GCM). The global climate modeling environment is supplied through a partnership with Columbia University's Educational Global Climate Modeling Project (EdGCM). The objective is to have participants gain hands-on experience with one of the most important, yet misunderstood, aspects of climate change research. Participants in the course are supplied with a USB drive that includes installers for the software and sample data. The EdGCM software includes a version of NASA's global climate model fitted with a graphical user interface and pre-loaded with several climate change simulations. Step-by-step assignments and video tutorials help walk people through these challenging exercises and the course incorporates a special assignment discussion forum to help with technical problems and questions about the NASA GCM. There are several takeaways from our first year and a half of offering this course, which has become one of the most popular out of the twelve courses offered by the Museum. Participants report a high level of satisfaction in using EdGCM. Some report frustration at the initial steps, but overwhelmingly claim that the assignments are worth the effort. Many of the difficulties that

  16. Object Segmentation Methods for Online Model Acquisition to Guide Robotic Grasping

    NASA Astrophysics Data System (ADS)

    Ignakov, Dmitri

    A vision system is an integral component of many autonomous robots. It enables the robot to perform essential tasks such as mapping, localization, or path planning. A vision system also assists with guiding the robot's grasping and manipulation tasks. As an increased demand is placed on service robots to operate in uncontrolled environments, advanced vision systems must be created that can function effectively in visually complex and cluttered settings. This thesis presents the development of segmentation algorithms to assist in online model acquisition for guiding robotic manipulation tasks. Specifically, the focus is placed on localizing door handles to assist in robotic door opening, and on acquiring partial object models to guide robotic grasping. First, a method for localizing a door handle of unknown geometry based on a proposed 3D segmentation method is presented. Following segmentation, localization is performed by fitting a simple box model to the segmented handle. The proposed method functions without requiring assumptions about the appearance of the handle or the door, and without a geometric model of the handle. Next, an object segmentation algorithm is developed, which combines multiple appearance (intensity and texture) and geometric (depth and curvature) cues. The algorithm is able to segment objects without utilizing any a priori appearance or geometric information in visually complex and cluttered environments. The segmentation method is based on the Conditional Random Fields (CRF) framework, and the graph cuts energy minimization technique. A simple and efficient method for initializing the proposed algorithm which overcomes graph cuts' reliance on user interaction is also developed. Finally, an improved segmentation algorithm is developed which incorporates a distance metric learning (DML) step as a means of weighing various appearance and geometric segmentation cues, allowing the method to better adapt to the available data. The improved method
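
    As a rough illustration of the CRF-style energy used in such segmentation work, the toy sketch below combines per-pixel unary costs (standing in for the appearance and geometric cues) with a pairwise smoothness term and minimizes the energy with a simple iterated-conditional-modes sweep instead of graph cuts; it is a schematic, not the thesis's algorithm.

        import numpy as np

        def segment_icm(unary_fg, unary_bg, neighbors, smooth=1.0, sweeps=10):
            # Binary labels minimizing a CRF-style energy
            #   E(l) = sum_i U_i(l_i) + smooth * sum_(i,j) [l_i != l_j]
            # via iterated conditional modes (a stand-in for graph cuts here).
            labels = (unary_fg < unary_bg).astype(int)   # greedy init from the unaries
            for _ in range(sweeps):
                for i in range(len(labels)):
                    cost_fg = unary_fg[i] + smooth * sum(labels[j] != 1 for j in neighbors[i])
                    cost_bg = unary_bg[i] + smooth * sum(labels[j] != 0 for j in neighbors[i])
                    labels[i] = 1 if cost_fg <= cost_bg else 0
            return labels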

  17. 4-Stage Online Presence Model: Model for Module Design and Delivery Using Web 2.0 Technologies to Facilitate Critical Thinking Skills

    ERIC Educational Resources Information Center

    Goh, WeiWei; Dexter, Barbara; Self, Richard

    2014-01-01

    The main purpose of this paper is to present a conceptual model for the use of web 2.0 online technologies in order to develop and enhance students' critical thinking skills at higher education level. Wiki is chosen as the main focus in this paper. The model integrates Salmon's 5-stage model (Salmon, 2002) with Garrison's Community…

  18. An online model correction method based on an inverse problem: Part II—systematic model error correction

    NASA Astrophysics Data System (ADS)

    Xue, Haile; Shen, Xueshun; Chou, Jifan

    2015-11-01

    An online systematic error correction is presented and examined as a technique to improve the accuracy of real-time numerical weather prediction, based on the dataset of model errors (MEs) in past intervals. Given the analyses, the ME in each interval (6 h) between two analyses can be iteratively obtained by introducing an unknown tendency term into the prediction equation, as shown in Part I of this two-paper series. In this part, after analyzing the 5-year (2001-2005) GRAPES-GFS (Global Forecast System of the Global and Regional Assimilation and Prediction System) error patterns and evolution, a systematic model error correction is derived with a least-squares approach using the past MEs. To test the correction, we applied the approach in GRAPES-GFS for July 2009 and January 2010. The datasets associated with the initial condition and SST used in this study were based on NCEP (National Centers for Environmental Prediction) FNL (final) data. The results indicated that the systematically underestimated equator-to-pole geopotential gradient and westerly winds of GRAPES-GFS in the Northern Hemisphere were largely enhanced, and the biases of temperature and wind in the tropics were strongly reduced. Therefore, the correction results in a more skillful forecast with lower mean bias and root-mean-square error and higher anomaly correlation coefficient.
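
    The essence of such a correction can be sketched as fitting a systematic component to archived MEs and subtracting it from new forecasts. The minimal example below (a mean plus a linear lead-time term at a single grid point) is only a schematic stand-in for the least-squares correction described above, not the GRAPES-GFS implementation.

        import numpy as np

        def fit_systematic_error(past_errors, lead_times):
            # Least-squares fit ME(t) ~ a + b*t at one grid point from archived errors;
            # both arguments are 1-D float arrays of the same length.
            A = np.column_stack([np.ones_like(lead_times), lead_times])
            coeff, *_ = np.linalg.lstsq(A, past_errors, rcond=None)
            return coeff   # (a, b)

        def corrected_forecast(raw_forecast, coeff, lead_time):
            a, b = coeff
            return raw_forecast - (a + b * lead_time)   # subtract the estimated bias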

  19. CheD: chemical database compilation tool, Internet server, and client for SQL servers.

    PubMed

    Trepalin, S V; Yarkov, A V

    2001-01-01

    An efficient program, which runs on a personal computer, for the storage, retrieval, and processing of chemical information is presented. The program can work either as a stand-alone application or in conjunction with a specifically written Web server application or with some standard SQL servers, e.g., Oracle, Interbase, and MS SQL. New types of data fields are introduced, e.g., arrays for spectral information storage, HTML and database links, and user-defined functions. CheD has an open architecture; thus, custom data types, controls, and services may be added. A WWW server application for chemical data retrieval features an easy and user-friendly installation on Windows NT or 95 platforms. PMID:11206361

  20. Providing Introductory Psychology Students Access to Online Lecture Notes: The Relationship of Note Use to Performance and Class Attendance

    ERIC Educational Resources Information Center

    Grabe, Mark; Christopherson, Kimberly; Douglas, Jason

    2005-01-01

    The relationships among the frequency of access to online lecture notes, examination performance, and class attendance were investigated. Data on use of online notes were gathered from the log maintained by the server and from student responses to a questionnaire. Students who made any attempt to access online notes viewed notes associated with…

  1. A Low-order Coupled Chemistry Meteorology Model for Testing Online and Offline Advanced Data Assimilation Schemes

    NASA Astrophysics Data System (ADS)

    Bocquet, M.; Haussaire, J. M.

    2015-12-01

    Bocquet and Sakov have recently introduced a low-order model based on the coupling of the chaotic Lorenz-95 model, which simulates winds along a mid-latitude circle, with the transport of a tracer species advected by this wind field. It has been used to test advanced data assimilation methods with an online model that couples meteorology and tracer transport. In the present study, the tracer subsystem of the model is replaced with a reduced photochemistry module meant to emulate reactive air pollution. This coupled chemistry meteorology model, the L95-GRS model, mimics continental and transcontinental transport and photochemistry of ozone, volatile organic compounds and nitrogen dioxides. The L95-GRS model is especially useful for testing advanced data assimilation schemes, such as the iterative ensemble Kalman smoother (IEnKS), which combines the best of ensemble and variational methods. The model provides useful insights prior to any implementation of the data assimilation method on larger models. For instance, online and offline data assimilation strategies based on the ensemble Kalman filter or the IEnKS can easily be evaluated with it. It also allows one to document the impact of species concentration observations on the wind estimation. The model further illustrates a long-standing issue in atmospheric chemistry forecasting: the impact of the chaotic wind dynamics and of the non-chaotic but highly nonlinear chemical species dynamics on the selected data assimilation approach.
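
    For readers unfamiliar with the underlying toy system, the sketch below integrates the Lorenz-95 wind equations and advects a passive tracer with that wind, in the spirit of the Bocquet-Sakov precursor model; the reduced photochemistry module of L95-GRS is omitted and the advection scheme shown here is schematic rather than the authors' exact formulation.

        import numpy as np

        def lorenz95_tendency(x, forcing=8.0):
            # dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F on a periodic ring.
            return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

        def rk4_step(x, dt=0.01):
            # Fourth-order Runge-Kutta step for the wind variables.
            k1 = lorenz95_tendency(x)
            k2 = lorenz95_tendency(x + 0.5 * dt * k1)
            k3 = lorenz95_tendency(x + 0.5 * dt * k2)
            k4 = lorenz95_tendency(x + dt * k3)
            return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

        def advect_tracer(c, x, dt=0.01):
            # First-order upwind advection of a passive tracer c by the 'wind' x
            # (unit grid spacing, purely schematic).
            dcdx_up = np.where(x >= 0, c - np.roll(c, 1), np.roll(c, -1) - c)
            return c - dt * x * dcdx_up

        rng = np.random.default_rng(0)
        wind = 8.0 + 0.1 * rng.standard_normal(40)   # 40 sites on the latitude circle
        tracer = np.ones(40)                          # uniform initial tracer field
        for _ in range(2000):
            tracer = advect_tracer(tracer, wind)
            wind = rk4_step(wind)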

  2. Efficient server selection system for widely distributed multiserver networks

    NASA Astrophysics Data System (ADS)

    Lee, Hyun-pyo; Park, Sung-sik; Lee, Kyoon-Ha

    2001-07-01

    To provide better quality of Internet service, access speeds between subscriber networks and the servers acting as Internet access devices have been rapidly improved through traffic distribution and the installation of high-performance servers. Nevertheless, access quality and content delivery speed have remained unsatisfactory. Adding nodes at the access device offers only limited relief against growing network traffic, because the root cause lies in the middle-mile nodes between a CP (content provider) server and a user node. To address this problem, this paper proposes a new method that selects an effective server for a client by minimizing the number of nodes between server and client while keeping the load balanced among servers, which are clustered by the client's location in a physically distributed multi-site environment. The proposed method uses an NSP (Network Status Prober) and a content server manager to obtain the status of each server and of the distributed network. A new architecture for the server selection algorithm and its implementation are presented, together with the parameters used for selecting the best serving server for a client, and the approach is validated by experiments on the proposed architecture.
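
    The selection rule can be illustrated with a small sketch: among servers whose reported load stays below a threshold, pick the one with the fewest hops to the client, breaking ties by load. The field names and threshold below are hypothetical, not the paper's protocol.

        def select_server(servers, max_load=0.8):
            # Prefer the reachable server with the fewest hops to the client, skipping
            # servers whose reported load exceeds max_load; fall back to least loaded.
            candidates = [s for s in servers if s["load"] <= max_load]
            if not candidates:
                return min(servers, key=lambda s: s["load"])
            return min(candidates, key=lambda s: (s["hops"], s["load"]))

        print(select_server([
            {"name": "edge-1", "hops": 3, "load": 0.42},
            {"name": "edge-2", "hops": 2, "load": 0.91},   # closest, but overloaded
            {"name": "origin", "hops": 7, "load": 0.10},
        ]))   # -> edge-1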

  3. Paying for Express Checkout: Competition and Price Discrimination in Multi-Server Queuing Systems

    PubMed Central

    Deck, Cary; Kimbrough, Erik O.; Mongrain, Steeve

    2014-01-01

    We model competition between two firms selling identical goods to customers who arrive in the market stochastically. Shoppers choose where to purchase based upon both price and the time cost associated with waiting for service. One seller provides two separate queues, each with its own server, while the other seller has a single queue and server. We explore the market impact of the multi-server seller engaging in waiting cost-based-price discrimination by charging a premium for express checkout. Specifically, we analyze this situation computationally and through the use of controlled laboratory experiments. We find that this form of price discrimination is harmful to sellers and beneficial to consumers. When the two-queue seller offers express checkout for impatient customers, the single queue seller focuses on the patient shoppers thereby driving down prices and profits while increasing consumer surplus. PMID:24667809

  4. Paying for express checkout: competition and price discrimination in multi-server queuing systems.

    PubMed

    Deck, Cary; Kimbrough, Erik O; Mongrain, Steeve

    2014-01-01

    We model competition between two firms selling identical goods to customers who arrive in the market stochastically. Shoppers choose where to purchase based upon both price and the time cost associated with waiting for service. One seller provides two separate queues, each with its own server, while the other seller has a single queue and server. We explore the market impact of the multi-server seller engaging in waiting cost-based-price discrimination by charging a premium for express checkout. Specifically, we analyze this situation computationally and through the use of controlled laboratory experiments. We find that this form of price discrimination is harmful to sellers and beneficial to consumers. When the two-queue seller offers express checkout for impatient customers, the single queue seller focuses on the patient shoppers thereby driving down prices and profits while increasing consumer surplus. PMID:24667809

  5. PROMALS3D web server for accurate multiple protein sequence and structure alignments.

    PubMed

    Pei, Jimin; Tang, Ming; Grishin, Nick V

    2008-07-01

    Multiple sequence alignments are essential in computational sequence and structural analysis, with applications in homology detection, structure modeling, function prediction and phylogenetic analysis. We report PROMALS3D web server for constructing alignments for multiple protein sequences and/or structures using information from available 3D structures, database homologs and predicted secondary structures. PROMALS3D shows higher alignment accuracy than a number of other advanced methods. Input of PROMALS3D web server can be FASTA format protein sequences, PDB format protein structures and/or user-defined alignment constraints. The output page provides alignments with several formats, including a colored alignment augmented with useful information about sequence grouping, predicted secondary structures and consensus sequences. Intermediate results of sequence and structural database searches are also available. The PROMALS3D web server is available at: http://prodata.swmed.edu/promals3d/. PMID:18503087

  6. Modelling the Factors that Affect Individuals' Utilisation of Online Learning Systems: An Empirical Study Combining the Task Technology Fit Model with the Theory of Planned Behaviour

    ERIC Educational Resources Information Center

    Yu, Tai-Kuei; Yu, Tai-Yi

    2010-01-01

    Understanding learners' behaviour, perceptions and influence in terms of learner performance is crucial to predict the use of electronic learning systems. By integrating the task-technology fit (TTF) model and the theory of planned behaviour (TPB), this paper investigates the online learning utilisation of Taiwanese students. This paper provides a…

  7. Energy Efficiency in Small Server Rooms: Field Surveys and Findings

    SciTech Connect

    Cheung, Iris; Greenberg, Steve; Mahdavi, Roozbeh; Brown, Richard; Tschudi, William

    2014-08-11

    Fifty-seven percent of US servers are housed in server closets, server rooms, and localized data centers, in what are commonly referred to as small server rooms, which comprise 99 percent of all server spaces in the US. While many mid-tier and enterprise-class data centers are owned by large corporations that consider energy efficiency a goal to minimize business operating costs, small server rooms typically are not similarly motivated. They are characterized by decentralized ownership and management and come in many configurations, which creates a unique set of efficiency challenges. To develop energy efficiency strategies for these spaces, we surveyed 30 small server rooms across eight institutions, and selected four of them for detailed assessments. The four rooms had Power Usage Effectiveness (PUE) values ranging from 1.5 to 2.1. Energy saving opportunities ranged from no- to low-cost measures such as raising cooling set points and better airflow management, to more involved but cost-effective measures including server consolidation and virtualization, and dedicated cooling with economizers. We found that inefficiencies mainly resulted from organizational rather than technical issues. Because of the inherent space and resource limitations, the most effective measure is to operate servers through energy-efficient cloud-based services or well-managed larger data centers, rather than server rooms. Backup power requirement, and IT and cooling efficiency should be evaluated to minimize energy waste in the server space. Utility programs are instrumental in raising awareness and spreading technical knowledge on server operation, and the implementation of energy efficiency measures in small server rooms.
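
    Power Usage Effectiveness is simply total facility energy divided by IT equipment energy, so the figures quoted above are easy to reproduce, as in this small example.

        def pue(total_facility_kwh, it_equipment_kwh):
            # Power Usage Effectiveness = total facility energy / IT equipment energy.
            return total_facility_kwh / it_equipment_kwh

        # A room drawing 15,000 kWh overall for 10,000 kWh of IT load sits at the
        # efficient end (PUE 1.5) of the 1.5-2.1 range reported above.
        print(pue(15000, 10000))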

  8. Server-based Approach to Web Visualization of Integrated Three-dimensional Brain Imaging Data

    PubMed Central

    Poliakov, Andrew V.; Albright, Evan; Hinshaw, Kevin P.; Corina, David P.; Ojemann, George; Martin, Richard F.; Brinkley, James F.

    2005-01-01

    The authors describe a client-server approach to three-dimensional (3-D) visualization of neuroimaging data, which enables researchers to visualize, manipulate, and analyze large brain imaging datasets over the Internet. All computationally intensive tasks are done by a graphics server that loads and processes image volumes and 3-D models, renders 3-D scenes, and sends the renderings back to the client. The authors discuss the system architecture and implementation and give several examples of client applications that allow visualization and analysis of integrated language map data from single and multiple patients. PMID:15561787

  9. Weighted fair queueing scheduling for World Wide Web proxy servers

    NASA Astrophysics Data System (ADS)

    El Abdouni Khayari, Rachid; Sadre, Ramin; Haverkort, Boudewijn R.; Zoschke, Norman

    2002-07-01

    Current World Wide Web servers as well as proxy servers rely for their scheduling on services provided by the underlying operating system. In practice, this means that some form of first-come-first-served (FCFS) scheduling is utilised. Although FCFS is a reasonable scheduling strategy for job sequences that do not show much variance, in the World Wide Web (WWW) it has been shown that the typical object sizes requested exhibit heavy tails. This means that the probability of observing very long jobs (very large objects) is much higher than typically predicted using an exponential model. Under these circumstances, job scheduling on the basis of shortest-job-first (SJF) has been shown to perform much better, in fact, to minimise the total average waiting time, simply by avoiding situations in which short jobs have to wait for very long ones. However, SJF has the disadvantage that long jobs might suffer from starvation. In order to avoid the problems of both FCFS and SJF, we present in this paper a new scheduling algorithm called class-based interleaving weighted fair queueing (CI-WFQ). This algorithm uses the specific characteristics of the job stream being served, that is, the distribution of the sizes of the objects being requested, to set its parameters such that good mean response times are obtained and starvation does not occur. In the paper, the new scheduling approach is introduced and compared, using trace-driven simulations, with existing scheduling approaches.
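
    The motivation for moving beyond FCFS is easy to reproduce with a toy single-server simulation on heavy-tailed job sizes; the sketch below compares FCFS with SJF ordering and is not an implementation of the paper's CI-WFQ algorithm.

        import random

        def mean_wait(job_sizes):
            # Mean waiting time when jobs run one after another in the given order.
            wait, elapsed = 0.0, 0.0
            for size in job_sizes:
                wait += elapsed          # this job waits for everything ahead of it
                elapsed += size
            return wait / len(job_sizes)

        random.seed(0)
        # Pareto-distributed sizes mimic the heavy-tailed object sizes seen on the web.
        jobs = [random.paretovariate(1.2) for _ in range(10000)]
        print("FCFS:", mean_wait(jobs))
        print("SJF :", mean_wait(sorted(jobs)))   # shortest-job-first minimizes the mean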

  10. WAMI: a web server for the analysis of minisatellite maps

    PubMed Central

    2010-01-01

    Background Minisatellites are genomic loci composed of tandem arrays of short repetitive DNA segments. A minisatellite map is a sequence of symbols that represents the tandem repeat array such that the set of symbols is in one-to-one correspondence with the set of distinct repeats. Due to variations in repeat type and organization as well as copy number, the minisatellite maps have been widely used in forensic and population studies. In either domain, researchers need to compare the set of maps to each other, to build phylogenetic trees, to spot structural variations, and to study duplication dynamics. Efficient algorithms for these tasks are required to carry them out reliably and in reasonable time. Results In this paper we present WAMI, a web-server for the analysis of minisatellite maps. It performs the above mentioned computational tasks using efficient algorithms that take the model of map evolution into account. The WAMI interface is easy to use and the results of each analysis task are visualized. Conclusions To the best of our knowledge, WAMI is the first server providing all these computational facilities to the minisatellite community. The WAMI web-interface and the source code of the underlying programs are available at http://www.nubios.nileu.edu.eg/tools/wami. PMID:20525398

  11. Improved wet weather wastewater influent modelling at Viikinmäki WWTP by on-line weather radar information.

    PubMed

    Heinonen, M; Jokelainen, M; Fred, T; Koistinen, J; Hohti, H

    2013-01-01

    Municipal wastewater treatment plant (WWTP) influent is typically dependent on diurnal variation of urban production of liquid waste, infiltration of stormwater runoff and groundwater infiltration. During wet weather conditions the infiltration phenomenon typically increases the risk of overflows in the sewer system as well as the risk of having to bypass the WWTP. Combined sewer infrastructure multiplies the role of rainwater runoff in the total influent. Due to climate change, rain intensity and magnitude are tending to rise as well, which can already be observed in the normal operation of WWTPs. Bypass control can be improved if the WWTP is prepared for the increase of influent, especially if there is some storage capacity prior to the treatment plant. One option for this bypass control is utilisation of on-line weather-radar-based forecast data of rainfall as an input for the on-line influent model. This paper reports the results of the Viikinmäki WWTP wet weather influent modelling project, in which gridded exceedance probabilities of hourly rainfall accumulations for the next 3 h from the Finnish Meteorological Institute are utilised as on-line input data for the influent model. PMID:23925175

  12. Automated data evaluation and modelling of simultaneous (19)F-(1)H medium-resolution NMR spectra for online reaction monitoring.

    PubMed

    Zientek, Nicolai; Laurain, Clément; Meyer, Klas; Paul, Andrea; Engel, Dirk; Guthausen, Gisela; Kraume, Matthias; Maiwald, Michael

    2016-06-01

    Medium-resolution nuclear magnetic resonance spectroscopy (MR-NMR) is currently developing into an important analytical tool for both quality control and process monitoring. In contrast to high-resolution online NMR (HR-NMR), MR-NMR can be operated under rough environmental conditions. A continuous re-circulating stream of reaction mixture from the reaction vessel to the NMR spectrometer enables a non-invasive, volume-integrating online analysis of reactants and products. Here, we investigate the esterification of 2,2,2-trifluoroethanol with acetic acid to 2,2,2-trifluoroethyl acetate both by (1)H HR-NMR (500 MHz) and by (1)H and (19)F MR-NMR (43 MHz) as a model system. The parallel online measurement is realised by splitting the flow, which allows the adjustment of quantitative and independent flow rates both in the HR-NMR probe and in the MR-NMR probe, in addition to a fast bypass line back to the reactor. One of the fundamental acceptance criteria for online MR-NMR spectroscopy is a robust data treatment and evaluation strategy with the potential for automation. The MR-NMR spectra are treated by an automated baseline and phase correction using the minimum entropy method. The evaluation strategies comprise (i) direct integration, (ii) automated line fitting, (iii) indirect hard modelling (IHM) and (iv) partial least squares regression (PLS-R). To assess the potential of these evaluation strategies for MR-NMR, prediction results are compared with the line fitting data derived from the quantitative HR-NMR spectroscopy. Although superior results are obtained from both IHM and PLS-R for (1)H MR-NMR, the latter in particular demands elaborate data pretreatment, whereas IHM models needed no previous alignment. PMID:25854892
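
    One of the four strategies, PLS-R, can be sketched with a generic library implementation; the snippet below calibrates a partial least squares model on synthetic stand-in spectra (not the paper's data) and scores it on held-out samples.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 500))                            # stand-in spectra (200 x 500 points)
        y = X[:, 40:60].sum(axis=1) + 0.1 * rng.normal(size=200)   # synthetic 'concentration' target

        pls = PLSRegression(n_components=5)
        pls.fit(X[:150], y[:150])                 # calibrate on part of the data
        print(pls.score(X[150:], y[150:]))        # R^2 on held-out spectra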

  13. An online model correction method based on an inverse problem: Part I—Model error estimation by iteration

    NASA Astrophysics Data System (ADS)

    Xue, Haile; Shen, Xueshun; Chou, Jifan

    2015-10-01

    Errors inevitably exist in numerical weather prediction (NWP) due to imperfect numerics and physical parameterizations. To eliminate these errors, by considering NWP as an inverse problem, an unknown term in the prediction equations can be estimated inversely by using past data, which are presumed to represent the imperfection of the NWP model (model error, denoted as ME). In this first paper of a two-part series, an iteration method for obtaining the MEs in past intervals is presented, and the results from testing its convergence in idealized experiments are reported. Moreover, two batches of iteration tests were applied in the global forecast system of the Global and Regional Assimilation and Prediction System (GRAPES-GFS) for July-August 2009 and January-February 2010. The datasets associated with the initial conditions and sea surface temperature (SST) were both based on NCEP (National Centers for Environmental Prediction) FNL (final) data. The results showed that the 6-h forecast errors were reduced to 10% of their original value after a 20-step iteration. Then, off-line forecast error corrections were estimated linearly based on the 2-month mean MEs and compared with the forecast errors. The estimated error corrections agreed well with the forecast errors, but the linear growth rate of the estimation was steeper than that of the forecast errors. The advantage of this iteration method is that the MEs can provide the foundation for online correction. A larger proportion of the forecast errors can be expected to be canceled out by properly introducing the model error correction into GRAPES-GFS.
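
    The flavour of the iteration can be conveyed with a toy scalar model: treat the ME as an unknown constant tendency over the interval and repeatedly adjust it so that the corrected forecast matches the analysis. The sketch below is schematic and uses made-up model settings, not GRAPES-GFS.

        import numpy as np

        def forecast(x0, tendency, me, dt=0.01, steps=600):
            # Integrate a toy scalar model dx/dt = f(x) + me over one interval.
            x = x0
            for _ in range(steps):
                x = x + dt * (tendency(x) + me)
            return x

        def estimate_me(x0, analysis, tendency, iterations=20, dt=0.01, steps=600):
            # Repeatedly nudge the unknown tendency term so that the corrected forecast
            # matches the analysis at the end of the interval (cf. the 20-step iteration).
            me = 0.0
            for _ in range(iterations):
                residual = analysis - forecast(x0, tendency, me, dt, steps)
                me += residual / (dt * steps)   # spread the leftover error over the interval
            return me

        # Example with a toy damping model and a 'true' but unknown error of 0.3:
        truth = forecast(1.0, lambda x: -0.5 * x, 0.3)
        print(estimate_me(1.0, truth, lambda x: -0.5 * x))   # converges near 0.3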

  14. Research needs and opportunities in server intervention programs.

    PubMed

    Saltz, R F

    1989-01-01

    Prevention specialists have recently focused on ways to shape the drinking context and environment to reduce the risks of drinking and driving. Server intervention refers to a set of strategies to control drinking in service establishments through changes in management policies, serving practices, and by training servers and other employees to monitor and control patrons' alcohol consumption. Research on server intervention is mixed, but seems to indicate that some server intervention practices can reduce levels of alcohol intoxication by patrons. Further work is needed to determine how such effects can be enhanced. Topics for future research include optimal components of specific training curriculum, policies needed to support and extend server training, importance of "booster" sessions, and the relationship of server intervention to broader social and legal environments that discourage drinking and driving. PMID:2793495

  15. Developing the online survey.

    PubMed

    Gordon, Jeffry S; McNew, Ryan

    2008-12-01

    Institutions of higher education are now using Internet-based technology tools to conduct surveys for data collection. Research shows that the type and quality of responses one receives with online surveys are comparable with what one receives in paper-based surveys. Data collection can take place on Web-based surveys, e-mail-based surveys, and personal digital assistants/Smartphone devices. Web surveys can be subscription templates, software packages installed on one's own server, or created from scratch using Web programming development tools. All of these approaches have their advantages and disadvantages. The survey owner must make informed decisions as to the right technology to implement. The correct choice can save hours of work in sorting, organizing, and analyzing data. PMID:18940417

  16. An online model-based method for state of energy estimation of lithium-ion batteries using dual filters

    NASA Astrophysics Data System (ADS)

    Dong, Guangzhong; Chen, Zonghai; Wei, Jingwen; Zhang, Chenbin; Wang, Peng

    2016-01-01

    The state-of-energy of lithium-ion batteries is an important evaluation index for energy storage systems in electric vehicles and smart grids. To improve the accuracy and reliability of battery state-of-energy estimation, an online model-based estimation approach is proposed that is robust against uncertain dynamic load currents and environmental temperatures. Firstly, a three-dimensional response-surface open-circuit-voltage model is built to improve the battery state-of-energy estimation accuracy, taking various temperatures into account. Secondly, a total-available-energy-capacity model that involves temperatures and discharge rates is reconstructed to improve the accuracy of the battery model. A dual-filter algorithm based on an extended Kalman filter and a particle filter is then developed to establish an online model-based estimator for the battery state-of-energy. The extended Kalman filter is employed to update the parameters of the battery model using real-time battery current and voltage at each sampling interval, while the particle filter is applied to estimate the battery state-of-energy. Finally, the proposed approach is verified by experiments conducted on a LiFePO4 lithium-ion battery under different operating currents and temperatures. Experimental results indicate that the battery model simulates battery dynamics robustly with high accuracy, and the estimates of the dual filters converge to the real state-of-energy within an error of ±4%.
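
    A compact, heavily simplified sketch of the dual-filter structure is given below: a scalar Kalman-style update keeps one battery-model parameter (here an internal resistance) current, while a particle filter tracks the state of energy from the terminal voltage. The battery model, noise settings and variable names are illustrative assumptions, not the paper's implementation.

        import numpy as np

        rng = np.random.default_rng(1)

        def ocv(soe):
            # Toy open-circuit-voltage curve (V) as a function of state of energy.
            return 3.0 + 1.2 * soe

        E_TOTAL = 10.0 * 3600.0        # total available energy in J (illustrative)
        DT, R_V, Q_R = 1.0, 1e-3, 1e-8 # sampling interval, measurement and process noise

        def dual_filter_step(particles, R_est, P, current, v_meas):
            # 1) Scalar Kalman-style update of the internal resistance R:
            #    v = ocv(soe) - R*i, so the sensitivity to R is H = -i.
            soe_est = particles.mean()
            P = P + Q_R
            H = -current
            K = P * H / (H * P * H + R_V)
            R_est = R_est + K * (v_meas - (ocv(soe_est) - R_est * current))
            P = (1.0 - K * H) * P

            # 2) Particle filter for the state of energy: propagate with the energy
            #    balance, weight particles by how well they explain the voltage, resample.
            drawn = current * v_meas * DT / E_TOTAL
            particles = particles - drawn + rng.normal(0.0, 1e-4, particles.size)
            v_pred = ocv(particles) - R_est * current
            w = np.exp(-0.5 * (v_meas - v_pred) ** 2 / R_V)
            w /= w.sum()
            particles = particles[rng.choice(particles.size, particles.size, p=w)]
            return particles, R_est, P

        # One illustrative step starting from a 90-100% initial guess:
        parts = rng.uniform(0.9, 1.0, 500)
        parts, R_hat, P_hat = dual_filter_step(parts, 0.05, 1e-2, current=2.0, v_meas=4.0)
        print(parts.mean(), R_hat)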

  17. A client/server approach to telemedicine.

    PubMed

    Vaughan, B J; Torok, K E; Kelly, L M; Ewing, D J; Andrews, L T

    1995-01-01

    This paper describes the Medical College of Ohio's efforts in developing a client/server telemedicine system. Telemedicine vastly improves the ability of a medical center physician or specialist to interactively consult with a physician at a remote health care facility. The patient receives attention more quickly, he and his family do not need to travel long distances to obtain specialists' services, and the primary care physician can be involved in diagnosis and developing a treatment program [1, 2]. Telemedicine consultations are designed to improve access to health services in underserved urban and rural communities and reduce isolation of rural practitioners [3]. PMID:8563396

  18. HS06 Benchmark for an ARM Server

    NASA Astrophysics Data System (ADS)

    Kluth, Stefan

    2014-06-01

    We benchmarked an ARM cortex-A9 based server system with a four-core CPU running at 1.1 GHz. The system used Ubuntu 12.04 as operating system and the HEPSPEC 2006 (HS06) benchmarking suite was compiled natively with gcc-4.4 on the system. The benchmark was run for various settings of the relevant gcc compiler options. We did not find significant influence from the compiler options on the benchmark result. The final HS06 benchmark result is 10.4.

  19. World wide web implementation of the Langley technical report server

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Gottlich, Gretchen L.; Bianco, David J.

    1994-01-01

    On January 14, 1993, NASA Langley Research Center (LaRC) made approximately 130 formal, 'unclassified, unlimited' technical reports available via the anonymous FTP Langley Technical Report Server (LTRS). LaRC was the first organization to provide a significant number of aerospace technical reports for open electronic dissemination. LTRS has been successful in its first 18 months of operation, with over 11,000 reports distributed and has helped lay the foundation for electronic document distribution for NASA. The availability of World Wide Web (WWW) technology has revolutionized the Internet-based information community. This paper describes the transition of LTRS from a centralized FTP site to a distributed data model using the WWW, and suggests how the general model for LTRS can be applied to other similar systems.

  20. Sex-Related Online Behaviors, Perceived Peer Norms and Adolescents’ Experience with Sexual Behavior: Testing an Integrative Model

    PubMed Central

    Doornwaard, Suzan M.; ter Bogt, Tom F. M.; Reitz, Ellen; van den Eijnden, Regina J. J. M.

    2015-01-01

    Research on the role of sex-related Internet use in adolescents’ sexual development has often isolated the Internet and online behaviors from other, offline influencing factors in adolescents’ lives, such as processes in the peer domain. The aim of this study was to test an integrative model explaining how receptive (i.e., use of sexually explicit Internet material [SEIM]) and interactive (i.e., use of social networking sites [SNS]) sex-related online behaviors interrelate with perceived peer norms in predicting adolescents’ experience with sexual behavior. Structural equation modeling on longitudinal data from 1,132 Dutch adolescents (Mage T1 = 13.95; range 11-17; 52.7% boys) demonstrated concurrent, direct, and indirect effects between sex-related online behaviors, perceived peer norms, and experience with sexual behavior. SEIM use (among boys) and SNS use (among boys and girls) predicted increases in adolescents’ perceptions of peer approval of sexual behavior and/or in their estimates of the numbers of sexually active peers. These perceptions, in turn, predicted increases in adolescents’ level of experience with sexual behavior at the end of the study. Boys’ SNS use also directly predicted increased levels of experience with sexual behavior. These findings highlight the need for multisystemic research and intervention development to promote adolescents’ sexual health. PMID:26086606

  1. Sex-Related Online Behaviors, Perceived Peer Norms and Adolescents' Experience with Sexual Behavior: Testing an Integrative Model.

    PubMed

    Doornwaard, Suzan M; ter Bogt, Tom F M; Reitz, Ellen; van den Eijnden, Regina J J M

    2015-01-01

    Research on the role of sex-related Internet use in adolescents' sexual development has often isolated the Internet and online behaviors from other, offline influencing factors in adolescents' lives, such as processes in the peer domain. The aim of this study was to test an integrative model explaining how receptive (i.e., use of sexually explicit Internet material [SEIM]) and interactive (i.e., use of social networking sites [SNS]) sex-related online behaviors interrelate with perceived peer norms in predicting adolescents' experience with sexual behavior. Structural equation modeling on longitudinal data from 1,132 Dutch adolescents (M(age) T1 = 13.95; range 11-17; 52.7% boys) demonstrated concurrent, direct, and indirect effects between sex-related online behaviors, perceived peer norms, and experience with sexual behavior. SEIM use (among boys) and SNS use (among boys and girls) predicted increases in adolescents' perceptions of peer approval of sexual behavior and/or in their estimates of the numbers of sexually active peers. These perceptions, in turn, predicted increases in adolescents' level of experience with sexual behavior at the end of the study. Boys' SNS use also directly predicted increased levels of experience with sexual behavior. These findings highlight the need for multisystemic research and intervention development to promote adolescents' sexual health. PMID:26086606

  2. Care Models of eHealth Services: A Case Study on the Design of a Business Model for an Online Precare Service

    PubMed Central

    2015-01-01

    Background With a growing population of health care clients in the future, the organization of high-quality and cost-effective service provision becomes an increasing challenge. New online eHealth services are proposed as innovative options for the future. Yet, a major barrier to these services appears to be the lack of new business model designs. Although design efforts generally result in visual models, no such artifacts have been found in the literature on business model design. This paper investigates business model design in eHealth service practices from a design perspective. It adopts a research-by-design approach and seeks to unravel what characteristics of business models determine an online service and what are important value exchanges between health professionals and clients. Objective The objective of the study was to analyze the construction of care models in depth, framing the essential elements of a business model, and to design a new care model that structures these elements for the particular context of an online precare service in practice. Methods This research employs a qualitative method of an in-depth case study in which different perspectives on constructing a care model are investigated. Data are collected by using the visual business modeling toolkit, designed to cocreate and visualize the business model. The cocreated models are transcribed and analyzed per actor perspective, transactions, and value attributes. Results We revealed eight new actors in the business model for providing the service. Essential actors are: the intermediary network coordinator connecting companies, the service-dedicated information technology specialists, and the service-dedicated health specialist. In the transactions for each service provided we found a certain type of contract, such as a license contract and service contracts for precare services and software products. In addition to the efficiency, quality, and convenience, important value attributes

  3. NMMB/BSC-DUST: an online mineral dust atmospheric model from meso to global scales

    NASA Astrophysics Data System (ADS)

    Haustein, K.; Pérez, C.; Jorba, O.; Baldasano, J. M.; Janjic, Z.; Black, T.; Nickovic, S.

    2009-04-01

    While mineral dust distribution and effects are important at global scales, they strongly depend on dust emissions that are controlled on small spatial and temporal scales. Most global dust models use prescribed wind fields provided by meteorological centers (e.g., NCEP and ECMWF), and their spatial resolution is currently never better than about 1°×1°. Regional dust models offer substantially higher resolution (10-20 km) and are typically coupled with weather forecast models that simulate processes that GCMs either cannot resolve or can resolve only poorly. These include internal circulation features such as the low-level nocturnal jet, which is crucial for dust emission in several dust 'hot spot' sources in North Africa. Based on our modeling experience with the BSC-DREAM regional forecast model (http://www.bsc.es/projects/earthscience/DREAM/), we are currently implementing an improved mineral dust model [Pérez et al., 2008] coupled online with the new global/regional NMMB atmospheric model under development at NOAA/NCEP/EMC [Janjic, 2005]. The NMMB is an evolution of the operational WRF-NMM, extending from meso to global scales, and will become the next-generation NCEP model for operational weather forecasting in 2010. Its unified dynamical core allows both regional and global simulations; it includes an add-on non-hydrostatic module and is based on the Arakawa B-grid and hybrid pressure-sigma vertical coordinates. NMMB is fully embedded into the Earth System Modeling Framework (ESMF), treating dynamics and physics separately and coupling them easily within the ESMF structure. Our main goal is to provide global dust forecasts up to 7 days at mesoscale resolutions. New features of the model include a physically based dust emission scheme after White [1979], Iversen and White [1982], and Marticorena and Bergametti [1995] that takes the effects of saltation and sandblasting into account.
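
    The emission scheme cited above builds on the classical horizontal saltation flux of White [1979], in which the flux grows with the cube of the friction velocity once a threshold is exceeded and a sandblasting efficiency converts the horizontal flux into a vertical dust flux. The sketch below illustrates that relationship; the constant C, the sandblasting efficiency, and all input values are illustrative assumptions, not the model's tuned parameters.

        import numpy as np

        def saltation_flux(ustar, ustar_t, rho_air=1.2, C=2.61, g=9.81):
            """Horizontal saltation flux Q [kg m-1 s-1], White (1979)-type formulation."""
            ustar = np.asarray(ustar, dtype=float)
            q = C * rho_air / g * ustar**3 * (1.0 + ustar_t / ustar) * (1.0 - (ustar_t / ustar) ** 2)
            # No emission below the threshold friction velocity.
            return np.where(ustar > ustar_t, q, 0.0)

        def vertical_dust_flux(q_horizontal, alpha=1.0e-5):
            """Vertical dust flux F = alpha * Q, with alpha a soil-dependent
            sandblasting efficiency [m-1]; the value here is only illustrative."""
            return alpha * q_horizontal

        # Example: emission switches on only once u* exceeds the threshold.
        u_star = np.array([0.20, 0.35, 0.60])          # friction velocity [m s-1]
        Q = saltation_flux(u_star, ustar_t=0.25)       # [kg m-1 s-1]
        F = vertical_dust_flux(Q)                      # [kg m-2 s-1]
        print(Q, F)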

  4. Designing a controlled medical vocabulary server: the VOSER project.

    PubMed

    Rocha, R A; Huff, S M; Haug, P J; Warner, H R

    1994-12-01

    The authors describe their experience designing a controlled medical vocabulary server created to support the exchange of patient data and medical decision logic. The first section introduces practical and theoretical premises that guided the design of the vocabulary server. The second section describes a series of structures needed to implement the proposed server, emphasizing their conformance to the design premises. The third section introduces potential applications that provide services to end users and also a group of tools necessary for maintaining the server corpus. In the fourth section, the authors propose an implementation strategy based on a common framework and on the participation of groups from different health-related domains. PMID:7895474
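
    As a rough illustration of the kind of structures such a server might expose (the paper describes its own structures in detail), the sketch below models a concept record with synonyms and cross-terminology mappings plus a minimal lookup service. All names and fields are hypothetical and are not taken from the VOSER design.

        from dataclasses import dataclass, field
        from typing import Dict, List, Optional

        @dataclass
        class Concept:
            concept_id: str                                  # stable, non-reusable identifier
            preferred_term: str
            synonyms: List[str] = field(default_factory=list)
            mappings: Dict[str, str] = field(default_factory=dict)  # e.g. {"ICD-9": "250"}

        class VocabularyServer:
            """Toy in-memory lookup; a production server would add versioning,
            inter-concept relationships, and audit trails."""

            def __init__(self) -> None:
                self._by_id: Dict[str, Concept] = {}
                self._by_term: Dict[str, str] = {}

            def add(self, concept: Concept) -> None:
                self._by_id[concept.concept_id] = concept
                for term in [concept.preferred_term, *concept.synonyms]:
                    self._by_term[term.lower()] = concept.concept_id

            def lookup(self, term: str) -> Optional[Concept]:
                cid = self._by_term.get(term.lower())
                return self._by_id.get(cid) if cid else None

        server = VocabularyServer()
        server.add(Concept("C0011849", "Diabetes mellitus", synonyms=["DM"],
                           mappings={"ICD-9": "250"}))
        print(server.lookup("dm").preferred_term)   # -> Diabetes mellitus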

  5. Research on On-Line Modeling of Fed-Batch Fermentation Process Based on v-SVR

    NASA Astrophysics Data System (ADS)

    Ma, Yongjun

    The fermentation process is complex and nonlinear, and many parameters are not easy to measure directly online, so soft sensor modeling is a good solution. This paper introduces v-support vector regression (v-SVR) for soft sensor modeling of the fed-batch fermentation process. v-SVR is a type of learning machine that controls the trade-off between fitting accuracy and prediction error by adjusting the parameter v. An online training algorithm is discussed in detail to reduce the training complexity of v-SVR. The experimental results show that, with an appropriate v, v-SVR achieves a low error rate and better generalization.
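
    In scikit-learn, ν-SVR is available as NuSVR, where the nu parameter upper-bounds the fraction of training errors and lower-bounds the fraction of support vectors. The sketch below shows a soft-sensor-style regression on synthetic data; it is a generic illustration of ν-SVR, not the paper's online training algorithm, and all variable names and values are placeholders.

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import NuSVR

        # Synthetic stand-in for easily measured fermentation variables (e.g.
        # temperature, pH, feed rate) and a hard-to-measure target (e.g. biomass).
        rng = np.random.default_rng(0)
        X = rng.uniform(size=(200, 3))
        y = 2.0 * X[:, 0] + np.sin(4.0 * X[:, 1]) + 0.1 * rng.standard_normal(200)

        # nu bounds the fraction of errors/support vectors, controlling fit tightness.
        soft_sensor = make_pipeline(StandardScaler(), NuSVR(nu=0.5, C=10.0, kernel="rbf"))
        soft_sensor.fit(X[:150], y[:150])
        print("held-out R^2:", soft_sensor.score(X[150:], y[150:]))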

  6. A secure online image trading system for untrusted cloud environments.

    PubMed

    Munadi, Khairul; Arnia, Fitri; Syaryadhi, Mohd; Fujiyoshi, Masaaki; Kiya, Hitoshi

    2015-01-01

    In conventional image trading systems, images are usually stored unprotected on a server, rendering them vulnerable to untrusted server providers and malicious intruders. This paper proposes a conceptual image trading framework that enables secure storage and retrieval over Internet services. The process involves three parties: an image publisher, a server provider, and an image buyer. The aim is to facilitate secure storage and retrieval of original images for commercial transactions, while preventing untrusted server providers and unauthorized users from gaining access to the true contents. The framework exploits the Discrete Cosine Transform (DCT) coefficients and the moment invariants of images. Original images are visually protected in the DCT domain and stored on a repository server. Small representations of the original images, called thumbnails, are generated and made publicly accessible for browsing. When a buyer is interested in a thumbnail, he/she sends a query to retrieve the visually protected image. The thumbnails and protected images are matched using the DC component of the DCT coefficients and the moment invariant feature. After the matching process, the server returns the corresponding protected image to the buyer. However, the image remains visually protected unless a key is granted. Our target application is the online market, where publishers sell their stock images over the Internet using public cloud servers. PMID:26090324
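
    A greatly simplified sketch of the matching step is given below: per-block DCT DC coefficients are extracted from the query thumbnail and from each candidate protected image, and the closest candidate is returned. For brevity it assumes equal image sizes and a protection scheme that preserves block DC terms, and it omits the moment-invariant feature; all data are synthetic placeholders rather than the paper's actual protocol.

        import numpy as np
        from scipy.fft import dctn

        def dc_feature(img, block=8):
            """Vector of per-block DCT DC coefficients (proportional to block means)."""
            h, w = (d - d % block for d in img.shape)
            img = img[:h, :w].astype(float)
            blocks = img.reshape(h // block, block, w // block, block).swapaxes(1, 2)
            return np.array([dctn(b, norm="ortho")[0, 0]
                             for b in blocks.reshape(-1, block, block)])

        def match(thumbnail, protected_images):
            """Index of the protected image whose DC feature is closest to the query's."""
            q = dc_feature(thumbnail)
            dists = [np.linalg.norm(q - dc_feature(p)) for p in protected_images]
            return int(np.argmin(dists))

        # Toy repository of five images; the query is a slightly perturbed copy of one.
        rng = np.random.default_rng(1)
        library = [rng.integers(0, 256, (64, 64)) for _ in range(5)]
        query = library[3] + rng.normal(0, 2, (64, 64))
        print(match(query, library))   # -> 3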

  7. Enhancing Online Distance Education in Small Rural US Schools: A Hybrid, Learner-Centred Model

    ERIC Educational Resources Information Center

    de la Varre, Claire; Keane, Julie; Irvin, Matthew J.

    2011-01-01

    Online distance education (ODE) has become pervasive and can potentially transform pedagogical practices across primary, secondary and university-based educational systems. ODE is considered a flexible option for non-traditional students such as adult learners and home-schoolers, and a convenient way to deliver remedial courses. ODE is also a…

  8. Enhancing Online Distance Education in Small Rural US Schools: A Hybrid, Learner-Centred Model

    ERIC Educational Resources Information Center

    de la Varre, Claire; Keane, Julie; Irvin, Matthew J.

    2010-01-01

    Online distance education (ODE) has become pervasive and can potentially transform pedagogical practices across primary, secondary and university-based educational systems. ODE is considered a flexible option for non-traditional students such as adult learners and home-schoolers, and a convenient way to deliver remedial courses. ODE is also a…

  9. Social Work Online Education: A Model for Getting Started and Staying Connected

    ERIC Educational Resources Information Center

    Moore, Sharon E.; Golder, Seana; Sterrett, Emma; Faul, Anna C.; Yankeelov, Pam; Weathers Mathis, Lynetta; Barbee, Anita P.

    2015-01-01

    Social work education has been greatly affected by ongoing technological advances in society at large and in the academy. Options for instructional delivery have been broadened tremendously. The University of Louisville is the first in Kentucky to put its master's of social work degree fully online, with a first cohort admitted in 2012. The…

  10. The REEAL Model: A Framework for Faculty Training in Online Discussion Facilitation

    ERIC Educational Resources Information Center

    Bedford, Laurie

    2014-01-01

    Discussion forums are a primary tool for interactions in the online classroom. Discussions are a critical part of the learning process for students, and instructor facilitation should reflect this importance. Effective instructor discussion facilitation encourages students, provides evidence and analysis and links the discussion to subsequent…

  11. A Preliminary Evaluation of Short Blended Online Training Workshop for TPACK Development Using Technology Acceptance Model

    ERIC Educational Resources Information Center

    Alsofyani, Mohammed Modeef; Aris, Baharuddin bin; Eynon, Rebecca; Majid, Norazman Abdul

    2012-01-01

    The use of Short Blended Online Training (SBOT) for the development of Technological Pedagogical and Content Knowledge (TPACK) is a promising approach to facilitate the use of e-learning by academics. Adult learners prefer a blend of pedagogies such as presentation, demonstration, practice, and feedback if they are structured and…

  12. Evaluation of an Online Model for Adjunct and Full-Time Community College Faculty Professional Development

    ERIC Educational Resources Information Center

    Baxter, Thomas D.

    2011-01-01

    The utilization of adjunct faculty, especially in the community college setting, has steadily increased over the last several decades. Staff development for faculty at a community college, however, is often disproportionately targeted toward full-time faculty. This study used a program evaluation to assess an existing online faculty development…

  13. Quality Models in Online and Open Education around the Globe: State of the Art and Recommendations

    ERIC Educational Resources Information Center

    Ossiannilsson, Ebba; Williams, Keith; Camilleri, Anthony F.; Brown, Mark

    2015-01-01

    This report is written for: (1) institutional leaders responsible for quality in online, open and flexible higher education; (2) faculty wanting to have an overview of the field; (3) newcomers that want to develop quality schemes; (4) policy makers in governments, agencies and organisations; and (5) major educational stakeholders in the…

  14. Attitudes towards Online Feedback on Writing: Why Students Mistrust the Learning Potential of Models

    ERIC Educational Resources Information Center

    Strobl, Carola

    2015-01-01

    This exploratory study sheds new light on students' perceptions of online feedback types for a complex writing task, summary writing from spoken input in a foreign language (L2), and investigates how these correlate with their actual learning to write. Students tend to favour clear-cut, instructivist rather than constructivist feedback, and guided…

  15. Activity-Based Costing Models for Alternative Modes of Delivering On-Line Courses

    ERIC Educational Resources Information Center

    Garbett, Chris

    2011-01-01

    In recent years there has been growth in online distance learning courses. This has been prompted by new technology such as the Internet, mobile learning, and video and audio conferencing; the explosion in student numbers in Higher Education; and the need for outreach to a worldwide market. Web-based distance learning is seen as a solution to…

  16. Applying Fuzzy Logic for Learner Modeling and Decision Support in Online Learning Systems

    ERIC Educational Resources Information Center

    Al-Aubidy, Kasim M.

    2005-01-01

    Advances in computers and multimedia technology have changed traditional methods for learning and skills training. Online learning continues to play a major role in the success of any academic program. Such learning can personalize learning needs for students, and it can provide an environment where virtual reality techniques are used to create interactive…

  17. Factors Affecting Perceived Learning, Satisfaction, and Quality in the Online MBA: A Structural Equation Modeling Approach

    ERIC Educational Resources Information Center

    Sebastianelli, Rose; Swift, Caroline; Tamimi, Nabil

    2015-01-01

    The authors examined how six factors related to content and interaction affect students' perceptions of learning, satisfaction, and quality in online master of business administration (MBA) courses. They developed three scale items to measure each factor. Using survey data from MBA students at a private university, the authors estimated structural…
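
    Structural models of this kind can be specified with lavaan-style syntax, for example via the Python package semopy. The sketch below is illustrative only: the latent factors, item names, and survey file are placeholders, not the authors' actual scales or data.

        import pandas as pd
        import semopy

        # Placeholder specification: content- and interaction-style latent factors,
        # each measured by three items, predicting two latent outcomes.
        desc = """
        Content      =~ c1 + c2 + c3
        Interaction  =~ i1 + i2 + i3
        Learning     =~ pl1 + pl2 + pl3
        Satisfaction =~ s1 + s2 + s3
        Learning     ~ Content + Interaction
        Satisfaction ~ Content + Interaction
        """

        df = pd.read_csv("mba_survey.csv")   # hypothetical file, one row per respondent
        model = semopy.Model(desc)
        model.fit(df)
        print(model.inspect())               # path estimates, standard errors, p-values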

  18. Online Help-Seeking in Communities of Practice: Modeling the Acceptance of Conceptual Artifacts

    ERIC Educational Resources Information Center

    Nistor, Nicolae; Schworm, Silke; Werner, Matthias

    2012-01-01

    Interactive online help systems are considered to be a fruitful supplement to traditional IT helpdesks, which are often overloaded. They often comprise user-generated FAQ collections playing the role of technology-based conceptual artifacts. Two main questions arise: how the conceptual artifacts should be used, and which factors influence their…

  19. Online PhD Program Delivery Models and Their Relationship to Student Success

    ERIC Educational Resources Information Center

    Jorissen, Shari L.

    2012-01-01

    Attrition rates are approximately 50% in traditional Ph.D. programs and 10-20% higher in online Ph.D. programs. Understanding the relationship between student factors, measures of student success (retention, graduation, year to degree), and student satisfaction is important to support and improve retention, graduation rates,…

  20. [A Terahertz Spectral Database Based on Browser/Server Technique].

    PubMed

    Zhang, Zhuo-yong; Song, Yue

    2015-09-01

    With the solution of key scientific and technical problems and the development of instrumentation, the application of terahertz technology in various fields has attracted more and more attention. Owing to its unique advantages, terahertz technology shows a broad future in fast, non-destructive detection, as well as many other fields. Terahertz technology combined with other complementary methods can be used to cope with many difficult practical problems that could not be solved before. One critical point for the further development of practical terahertz detection methods is a good and reliable terahertz spectral database. We recently developed a browser/server (B/S)-based terahertz spectral database. We designed the main structure and main functions to fulfill practical requirements. The database now includes more than 240 items, with spectral information collected from three sources: (1) collection and citation from other terahertz spectral databases abroad; (2) the published literature; and (3) spectral data measured in our laboratory. The present paper introduces the basic structure and fundamental functions of the terahertz spectral database developed in our laboratory. One of the key functions of this THz database is the calculation of optical parameters: the absorption coefficient, refractive index, and other optical parameters can be calculated from the input THz time-domain spectra. The other main functions and search methods of the browser/server-based terahertz spectral database are also discussed. The database search system provides users with convenient functions including user registration, queries, display of spectral figures and molecular structures, spectral matching, etc. The THz database system provides an online search function for registered users. Registered users can compare an input THz spectrum with the spectra in the database, according to
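
    The optical-parameter calculation mentioned above typically follows the standard thin-sample extraction used in THz time-domain spectroscopy: the sample and reference waveforms are Fourier-transformed, and the refractive index and absorption coefficient are obtained from the phase and magnitude of their ratio. The sketch below is a generic illustration of that textbook procedure under a thin-slab, no-internal-reflection approximation; it is not the database's actual implementation, and all inputs are synthetic.

        import numpy as np

        c = 299_792_458.0  # speed of light [m/s]

        def thz_optical_parameters(t, e_sample, e_reference, thickness):
            """t in s, fields in arbitrary units, thickness in m.
            Returns (frequencies [Hz], refractive index n, absorption alpha [1/m])."""
            dt = t[1] - t[0]
            freqs = np.fft.rfftfreq(len(t), dt)
            T = np.fft.rfft(e_sample) / np.fft.rfft(e_reference)   # complex transmission
            omega = 2.0 * np.pi * freqs
            phase = -np.unwrap(np.angle(T))                        # phase delay of the sample
            with np.errstate(divide="ignore", invalid="ignore"):
                n = 1.0 + c * phase / (omega * thickness)
                alpha = (2.0 / thickness) * np.log(4.0 * n / (np.abs(T) * (n + 1.0) ** 2))
            return freqs, n, alpha

        # Example with synthetic pulses: the sample pulse is delayed and attenuated.
        t = np.arange(0.0, 20e-12, 0.05e-12)
        e_ref = np.exp(-((t - 5e-12) / 0.5e-12) ** 2)
        e_sam = 0.7 * np.exp(-((t - 6e-12) / 0.5e-12) ** 2)
        freqs, n, alpha = thz_optical_parameters(t, e_sam, e_ref, thickness=1e-3)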