2009-01-01
Oracle 9i, 10g; MySQL; MS SQL Server; MS SQL Server. Operating System Supported: Windows 2003 Server, Windows 2000 Server (32 bit...WebStar (Mac OS X), SunOne, Internet Information Services (IIS). Database Server Supported: MS SQL Server; MS SQL Server; Oracle 9i, 10g...challenges of Web-based surveys are: 1) identifying the best Commercial Off the Shelf (COTS) Web-based survey packages to serve the particular
An Optimization of the Basic School Military Occupational Skill Assignment Process
2003-06-01
Corps Intranet (NMCI)23 supports it. We evaluated the use of Microsoft’s SQL Server, but dismissed this after learning that TBS did not possess a SQL ...Server license or a qualified SQL Server administrator.24 SQL Server would have provided additional security measures not available in MS...administrator. Although not as powerful as SQL Server, MS Access can handle the multi-user environment necessary for this system.25 The training
Degroeve, Sven; Maddelein, Davy; Martens, Lennart
2015-07-01
We present an MS(2) peak intensity prediction server that computes MS(2) charge 2+ and 3+ spectra from peptide sequences for the most common fragment ions. The server integrates the Unimod public domain post-translational modification database for modified peptides. The prediction model is an improvement of the previously published MS(2)PIP model for Orbitrap-LTQ CID spectra. Predicted MS(2) spectra can be downloaded as a spectrum file and can be visualized in the browser for comparisons with observations. In addition, we added prediction models for HCD fragmentation (Q-Exactive Orbitrap) and show that these models compute accurate intensity predictions on par with CID performance. We also show that training prediction models for CID and HCD separately improves the accuracy for each fragmentation method. The MS(2)PIP prediction server is accessible from http://iomics.ugent.be/ms2pip. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
2013-09-01
Malicious Activity Simulation Tool MMORPG Massively Multiplayer Online Role-Playing Game MMS Mission Management Server MOA Memorandum of Agreement MS...conferencing, and massively multiplayer online role- playing games (MMORPG). During all of these Internet-based exchanges and transactions, the Internet user...In its 2011 Internet Crime Report, the Internet Crime Complaint Center (IC3) stated there were more than 300,000 complaints of online criminal
Reactive Aggregate Model Protecting Against Real-Time Threats
2014-09-01
on the underlying functionality of three core components. • MS SQL server 2008 backend database. • Microsoft IIS running on Windows server 2008...services. The capstone tested a Linux-based Apache web server with the following software implementations: • MySQL as a Linux-based backend server for...malicious compromise. 1. Assumptions • GINA could connect to a backend MS SQL database through proper configuration of DotNetNuke. • GINA had access
Performance of the High Sensitivity Open Source Multi-GNSS Assisted GNSS Reference Server.
NASA Astrophysics Data System (ADS)
Sarwar, Ali; Rizos, Chris; Glennon, Eamonn
2015-06-01
The Open Source GNSS Reference Server (OSGRS) exploits the GNSS Reference Interface Protocol (GRIP) to provide assistance data to GPS receivers. Assistance can be in terms of signal acquisition and in the processing of the measurement data. The data transfer protocol is based on an Extensible Mark-up Language (XML) schema. The first version of the OSGRS required a direct hardware connection to a GPS device to acquire the data necessary to generate the appropriate assistance. Scenarios of interest for OSGRS users are weak signal strength indoors, obstructed outdoor environments, and heavy multipath environments. This paper describes an improved version of the OSGRS that provides alternative assistance support from a number of Global Navigation Satellite Systems (GNSS). The underlying protocol to transfer GNSS assistance data from global casters is the Networked Transport of RTCM (Radio Technical Commission for Maritime Services) over Internet Protocol (NTRIP), and/or the RINEX (Receiver Independent Exchange) format. This expands the assistance and support model of the OSGRS to globally available GNSS data servers connected via Internet casters. A variety of formats and versions of RINEX and RTCM streams become available, which strengthens the assistance provisioning capability of the OSGRS platform. The prime motivation for this work was to enhance the system architecture of the OSGRS to take advantage of globally available GNSS data sources. Open source software architectures and assistance models provide acquisition and data processing assistance for GNSS receivers operating in weak signal environments. This paper describes test scenarios to benchmark the OSGRSv2 performance against other Assisted-GNSS solutions. Benchmarking devices include the SPOT satellite messenger, MS-Based & MS-Assisted GNSS, HSGNSS (SiRFstar-III) and Wireless Sensor Networks Assisted-GNSS. Benchmarked parameters include the number of tracked satellites, the Time To First Fix (TTFF), navigation availability and accuracy. Three different configurations of Multi-GNSS assistance servers were used, namely Cloud-Client-Server, Demilitarized Zone (DMZ) Client-Server and PC-Client-Server, with respect to the connectivity location of client and server. The impact on performance of server and/or client initiation, hardware capability, network latency, processing delay and computation time, together with storage, scalability, processing and load-sharing capabilities, was analysed. The performance of the OSGRS is compared against commercial GNSS, Assisted-GNSS and WSN-enabled GNSS devices. The OSGRS system demonstrated lower TTFF and higher availability.
Server-Controlled Identity-Based Authenticated Key Exchange
NASA Astrophysics Data System (ADS)
Guo, Hua; Mu, Yi; Zhang, Xiyong; Li, Zhoujun
We present a threshold identity-based authenticated key exchange protocol that can be applied to an authenticated server-controlled gateway-user key exchange. The objective is to allow a user and a gateway to establish a shared session key with the permission of the back-end servers, while the back-end servers cannot obtain any information about the established session key. Our protocol has potential applications in strong access control of confidential resources. In particular, our protocol possesses the semantic security and demonstrates several highly-desirable security properties such as key privacy and transparency. We prove the security of the protocol based on the Bilinear Diffie-Hellman assumption in the random oracle model.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-24
... electronic servers in close physical proximity to the Exchange's trading and execution system. See id. at 59299. Partial Cabinets A User is able to request a physical cabinet to house its servers and other... Exchange enter the Exchange's trading and execution systems through the same order gateway, regardless of...
Implementation of Medical Information Exchange System Based on EHR Standard
Han, Soon Hwa; Kim, Sang Guk; Jeong, Jun Yong; Lee, Bi Na; Choi, Myeong Seon; Kim, Il Kon; Park, Woo Sung; Ha, Kyooseob; Cho, Eunyoung; Kim, Yoon; Bae, Jae Bong
2010-01-01
Objectives To develop effective ways of sharing patients' medical information, we developed a new medical information exchange system (MIES) based on a registry server, which enabled us to exchange different types of data generated by various systems. Methods To assure that patients' medical information can be effectively exchanged under different system environments, we adopted the standardized data transfer methods and terminologies suggested by the Center for Interoperable Electronic Healthcare Record (CIEHR) of Korea in order to guarantee interoperability. Regarding information security, MIES followed the security guidelines suggested by the CIEHR of Korea. This study aimed to develop essential security systems for the implementation of online services, such as encryption of communication, server security, database security, protection against hacking, contents security, and network security. Results The registry server managed information exchange as well as the registration information of the clinical document architecture (CDA) documents, and the CDA Transfer Server was used to locate and transmit the proper CDA document from the relevant repository. The CDA viewer showed the CDA documents via connection with the information systems of related hospitals. Conclusions This research chose transfer items and defined document standards that follow CDA standards, such that the exchange of CDA documents between different systems became possible through ebXML. The proposed MIES was designed as an independent central registry server model in order to guarantee the essential security of patients' medical information. PMID:21818447
Implementation of Medical Information Exchange System Based on EHR Standard.
Han, Soon Hwa; Lee, Min Ho; Kim, Sang Guk; Jeong, Jun Yong; Lee, Bi Na; Choi, Myeong Seon; Kim, Il Kon; Park, Woo Sung; Ha, Kyooseob; Cho, Eunyoung; Kim, Yoon; Bae, Jae Bong
2010-12-01
To develop effective ways of sharing patients' medical information, we developed a new medical information exchange system (MIES) based on a registry server, which enabled us to exchange different types of data generated by various systems. To assure that patients' medical information can be effectively exchanged under different system environments, we adopted the standardized data transfer methods and terminologies suggested by the Center for Interoperable Electronic Healthcare Record (CIEHR) of Korea in order to guarantee interoperability. Regarding information security, MIES followed the security guidelines suggested by the CIEHR of Korea. This study aimed to develop essential security systems for the implementation of online services, such as encryption of communication, server security, database security, protection against hacking, contents security, and network security. The registry server managed information exchange as well as the registration information of the clinical document architecture (CDA) documents, and the CDA Transfer Server was used to locate and transmit the proper CDA document from the relevant repository. The CDA viewer showed the CDA documents via connection with the information systems of related hospitals. This research chose transfer items and defined document standards that follow CDA standards, such that the exchange of CDA documents between different systems became possible through ebXML. The proposed MIES was designed as an independent central registry server model in order to guarantee the essential security of patients' medical information.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-20
... Change To Eliminate the 100MB Connectivity Option and Fee March 14, 2012. Pursuant to Section 19(b)(1) of... Exchange proposes to eliminate 100MB connectivity between the Exchange and co-located servers, as well as..., Section X(b) to eliminate 100MB connectivity between the Exchange and co-located servers, as well as...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-24
... services allow Users to rent space in the data center so they may locate their electronic servers in close... User is able to request a physical cabinet to house its servers and other equipment in the data center... Exchange enter the Exchange's trading and execution systems through the same order gateway, regardless of...
Mobile Assisted Security in Wireless Sensor Networks
2015-08-03
server from Google’s DNS, Chromecast and the content server do the 3-way TCP handshake, which is followed by Client Hello and Server Hello TLS messages...utilized TLS v1.2, except the NTP servers and Google’s DNS server. In TLS v1.2, after the TCP handshake, the client and server send Client Hello and Server Hello messages in order. In the Client Hello message, the client offers a list of Cipher Suites that it supports. Each Cipher Suite defines the key exchange algorithm
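A minimal sketch of the sequence described in this snippet, assuming Python's standard ssl module; the host name is a placeholder, not one of the servers from the study:

```python
# Minimal sketch: 3-way TCP handshake followed by the TLS v1.2 Client Hello /
# Server Hello exchange. "example.com" is an illustrative placeholder host.
import socket
import ssl

ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2   # require at least TLS 1.2

with socket.create_connection(("example.com", 443)) as tcp:           # 3-way TCP handshake
    with ctx.wrap_socket(tcp, server_hostname="example.com") as tls:  # Client Hello / Server Hello
        # The negotiated cipher suite is the one the server selected from the
        # list of suites offered in the Client Hello.
        print(tls.version(), tls.cipher())
```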
Wang, Chunliang; Ritter, Felix; Smedby, Orjan
2010-07-01
To enhance the functional expandability of a picture archiving and communication system (PACS) workstation and to facilitate the integration of third-party image-processing modules, we propose a browser-server style method. In the proposed solution, the PACS workstation shows the front-end user interface defined in an XML file while the image processing software is running in the background as a server. Inter-process communication (IPC) techniques allow an efficient exchange of image data, parameters, and user input between the PACS workstation and stand-alone image-processing software. Using a predefined communication protocol, the PACS workstation developer or image processing software developer does not need detailed information about the other system, but will still be able to achieve seamless integration between the two systems, and the IPC procedure is totally transparent to the final user. A browser-server style solution was built between OsiriX (PACS workstation software) and MeVisLab (image-processing software). Ten example image-processing modules were easily added to OsiriX by converting existing MeVisLab image processing networks. Image data transfer using shared memory added <10 ms of processing time while the other IPC methods cost 1-5 s in our experiments. The browser-server style communication based on IPC techniques is an appealing method that allows PACS workstation developers and image processing software developers to cooperate while focusing on different interests.
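A minimal sketch of the shared-memory style of IPC credited above with the <10 ms transfer times, using Python's multiprocessing.shared_memory and a NumPy array as a stand-in for the image buffer (segment name and image size are illustrative):

```python
# Minimal sketch: two sides of an IPC exchange sharing one pixel buffer in place,
# so no image data is copied through sockets or temporary files.
import numpy as np
from multiprocessing import shared_memory

image = np.zeros((512, 512), dtype=np.uint16)          # placeholder for an image slice

# "PACS workstation" side: publish the pixel buffer under a well-known segment name.
shm = shared_memory.SharedMemory(create=True, size=image.nbytes, name="pacs_image")
np.ndarray(image.shape, dtype=image.dtype, buffer=shm.buf)[:] = image

# "Image-processing server" side: attach to the same segment and read it in place.
peer = shared_memory.SharedMemory(name="pacs_image")
view = np.ndarray(image.shape, dtype=image.dtype, buffer=peer.buf)
print("mean pixel value:", view.mean())

del view                 # release the buffer view before closing the segment
peer.close()
shm.close()
shm.unlink()
```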
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-15
... premises controlled by the Exchange in order that they may locate their electronic servers in close... the Exchange's trading and execution systems through the same order gateway regardless of whether the... weekends if NOT scheduled at least 1 day in advance. Rack and Stack Installation of one $200 per server...
2004-03-01
with MySQL. This choice was made because MySQL is open source. Any significant database engine such as Oracle or MS-SQL or even MS Access can be used...10 Figure 6. The DoD vs. Commercial Life Cycle...necessarily be interested in SCADA network security 13. MySQL (Database server) – This station represents a typical data server for a web page
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robertson, D.W.; Johnston, W.E.; Hall, D.E.
1990-03-01
We describe the use of the Sun Remote Procedure Call and Unix socket interprocess communication mechanisms to provide the network transport for a distributed, client-server based, image handling system. Clients run under Unix or UNICOS and servers run under Unix or MS-DOS. The use of remote procedure calls across local or wide-area networks to make video movies is addressed.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-14
... Market Maker Standard quote server as a gateway for communicating eQuotes to MIAX. Because of the... connect the Limited Service Ports to independent servers that host their eQuote and purge functionality... same server for all of their Market Maker quoting activity. Currently, Market Makers in the MIAX System...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-26
... rather than forcing them to use their Market Maker Standard quote server as a gateway for communicating e... technical flexibility to connect additional Limited Service Ports to independent servers that host their e... mitigate the risk of using the same server for all of their Market Maker quoting activity. By using the...
Data exchange technology based on handshake protocol for industrial automation system
NASA Astrophysics Data System (ADS)
Astafiev, A. V.; Shardin, T. O.
2018-05-01
This article considers data exchange technology based on a handshake protocol for industrial automation systems. Methods of organizing the technology in client-server applications are analyzed. The main threats to client-server applications that arise during the information interaction of users are indicated. A comparative analysis of analogous systems was also carried out, as a result of which the most suitable option was chosen for further use. The basic schemes for the operation of the handshake protocol are shown, as well as the general scheme of the implemented application, which describes the entire process of interaction between the client and the server.
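A minimal sketch of an application-level handshake of the kind described above (message names and port are illustrative, not taken from the paper): client and server confirm each other before any payload is exchanged.

```python
# Minimal sketch: greeting, acknowledgement, then data exchange.
import socket
import threading
import time

HOST, PORT = "127.0.0.1", 9009

def server():
    with socket.create_server((HOST, PORT)) as srv:
        conn, _ = srv.accept()
        with conn:
            if conn.recv(16) == b"HELLO":            # step 1: client greeting
                conn.sendall(b"HELLO-ACK")           # step 2: server acknowledges
                print("payload:", conn.recv(1024))   # step 3: data exchange begins

threading.Thread(target=server, daemon=True).start()
time.sleep(0.2)                                      # give the server time to start listening

with socket.create_connection((HOST, PORT)) as cli:
    cli.sendall(b"HELLO")
    if cli.recv(16) == b"HELLO-ACK":
        cli.sendall(b"sensor_value=42")
time.sleep(0.2)                                      # let the server thread print before exit
```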
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-16
... the Exchange in order that they may locate their electronic servers in close physical proximity to the... execution systems through the same order gateway regardless of whether the sender is co-located in the... scheduled at least 1 day in advance. Rack and Stack Installation of one $200 per server. server in User's...
Czaplewski, Cezary; Karczynska, Agnieszka; Sieradzan, Adam K; Liwo, Adam
2018-04-30
A server implementation of the UNRES package (http://www.unres.pl) for coarse-grained simulations of protein structures with the physics-based UNRES model, coined the UNRES server, is presented. In contrast to most protein coarse-grained models, owing to its physics-based origin, the UNRES force field can be used in simulations, including those aimed at protein-structure prediction, without ancillary information from structural databases; however, the implementation includes the possibility of using restraints. Local energy minimization, canonical molecular dynamics simulations, replica exchange and multiplexed replica exchange molecular dynamics simulations can be run with the current UNRES server; the latter are suitable for protein-structure prediction. The user-supplied input includes the protein sequence and, optionally, restraints from secondary-structure prediction or small-angle X-ray scattering data, and the simulation type and parameters, which are selected or typed in. Oligomeric proteins, as well as those containing D-amino-acid residues and disulfide links, can be treated. The output is displayed graphically (minimized structures, trajectories, final models, analysis of trajectories/ensembles); however, all output files can be downloaded by the user. The UNRES server can be freely accessed at http://unres-server.chem.ug.edu.pl.
Hu, Peter F; Yang, Shiming; Li, Hsiao-Chi; Stansbury, Lynn G; Yang, Fan; Hagegeorge, George; Miller, Catriona; Rock, Peter; Stein, Deborah M; Mackenzie, Colin F
2017-01-01
Research and practice based on automated electronic patient monitoring and data collection systems are significantly limited by system down time. We asked whether a triple-redundant Monitor of Monitors System (MoMs) to collect and summarize key information from system-wide data sources could achieve high fault tolerance, early diagnosis of system failure, and improved data collection rates. In our Level I trauma center, patient vital signs (VS) monitors were networked to collect real-time patient physiologic data streams from 94 bed units in our various resuscitation, operating, and critical care units. To minimize the impact of server collection failure, three BedMaster® VS servers were used in parallel to collect data from all bed units. To locate and diagnose system failures, we summarized critical information from high-throughput data streams in real time in a dashboard viewer and compared the pre- and post-MoMs phases to evaluate data collection performance as availability time, active collection rates, and gap duration, occurrence, and categories. Single-server collection rates in the 3-month period before MoMs deployment ranged from 27.8% to 40.5%, with a combined 79.1% collection rate. Reasons for gaps included collection server failure, software instability, individual bed setting inconsistency, and monitor servicing. In the 6-month post-MoMs deployment period, average collection rates were 99.9%. A triple-redundant patient data collection system with real-time diagnostic information summarization and representation improved the reliability of massive clinical data collection to nearly 100% in a Level I trauma center. Such a data collection framework may also increase the automation level of hospital-wide information aggregation for optimal allocation of health care resources.
Leaders Are the Network: Applying the Kotter Model in Shaping Future Information Systems
2010-01-01
common operational picture (COP) (Hinson, 2009). Figure 3 demonstrates how CID combines Link 16 and FBCB2 feeds. The CID server polls different...Link 16 Info Exchange...Figure 3. FBCB2-Link 16 Information Exchange. Source: Created by author based on information derived from Hinson...31552-new-army-leader-development-strategy-released/ (accessed July 30, 2010). Hinson, Jason and Summit, Bob, “Combat Identification Server: Blue
Web catalog of oceanographic data using GeoNetwork
NASA Astrophysics Data System (ADS)
Marinova, Veselka; Stefanov, Asen
2017-04-01
Most of the data collected, analyzed and used by the Bulgarian oceanographic data center (BgODC) from scientific cruises, Argo floats, ferry boxes and real-time operating systems are spatially oriented and need to be displayed on a map. The challenge is to make spatial information more accessible to users, decision makers and scientists. In order to meet this challenge, BgODC concentrates its efforts on improving dynamic and standardized access to its geospatial data as well as those from various related organizations and institutions. BgODC is currently implementing a project to create a geospatial portal for distributing metadata and for searching, exchanging and harvesting spatial data. There are many open source software solutions able to create such a spatial data infrastructure (SDI). Finally, GeoNetwork open source was chosen, as it is already widespread. This software is a free, effective and "cheap" solution for implementing an SDI at the organization level. It is platform independent and runs under many operating systems. Filling the catalog goes through these practical steps: • Managing and storing data reliably within an MS SQL spatial database; • Registration of maps and data of various formats and sources in GeoServer (the most popular open source geospatial server, embedded with GeoNetwork); • Adding metadata and publishing geospatial data at the GeoNetwork desktop. GeoServer and GeoNetwork are based on Java, so they require the installation of a servlet engine like Tomcat. The experience gained from the use of GeoNetwork open source confirms that the catalog meets the requirements for data management and is flexible enough to customize. Building the catalog facilitates sustainable data exchange between end users. The catalog is a big step towards implementation of the INSPIRE directive due to the availability of many features necessary for producing "INSPIRE compliant" metadata records. The catalog now contains all available GIS data provided by BgODC for Internet access. Searching data within the catalog is based upon geographic extent, theme type and free-text search.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perry, Marcia
The IRCD is an IRC server that was originally distributed by the IRCD Hybrid developer team for use as a server for IRC messaging over the public Internet. By supporting the IRC protocol defined in the IRC RFC, IRCD allows users to create and join channels for group or one-to-one text-based instant messaging. It stores information about channels (e.g., whether a channel is public, secret, or invite-only, the topic set, membership) and users (who is online and what channels they are members of). It receives messages for a specific user or channel and forwards these messages to the targeted destination. Since server-to-server communication is also supported, these targeted destinations may be connected to different IRC servers. Messages are exchanged over TCP connections that remain open between the client and the server. The IRCD is being used within the Pervasive Computing Collaboration Environment (PCCE) as the 'chat server' for message exchange over public and private channels. After an LBNLSecureMessaging (PCCE chat) client has been authenticated, the client connects to IRCD with its assigned nickname or 'nick.' The client can then create or join channels for group discussions or one-to-one conversations. These channels can have an initial mode of public or invite-only and the mode may be changed after creation. If a channel is public, anyone online can join the discussion; if a channel is invite-only, users can only join if existing members of the channel explicitly invite them. Users can be invited to any type of channel and users may be members of multiple channels simultaneously. For use with the PCCE environment, the IRCD application (which was written in C) was ported to Linux and has been tested and installed under Linux Redhat 7.2. The source code was also modified with SSL so that all messages exchanged over the network are encrypted. This modified IRC server also verifies with an authentication server that the client is who he or she claims to be and that this user is authorized to gain access to the IRCD.
CheD: chemical database compilation tool, Internet server, and client for SQL servers.
Trepalin, S V; Yarkov, A V
2001-01-01
An efficient program, which runs on a personal computer, for the storage, retrieval, and processing of chemical information is presented. The program can work either as a stand-alone application or in conjunction with a specifically written Web server application or with some standard SQL servers, e.g., Oracle, Interbase, and MS SQL. New types of data fields are introduced, e.g., arrays for spectral information storage, HTML and database links, and user-defined functions. CheD has an open architecture; thus, custom data types, controls, and services may be added. A WWW server application for chemical data retrieval features an easy and user-friendly installation on Windows NT or 95 platforms.
SLiMSearch 2.0: biological context for short linear motifs in proteins
Davey, Norman E.; Haslam, Niall J.; Shields, Denis C.
2011-01-01
Short, linear motifs (SLiMs) play a critical role in many biological processes. The SLiMSearch 2.0 (Short, Linear Motif Search) web server allows researchers to identify occurrences of a user-defined SLiM in a proteome, using conservation and protein disorder context statistics to rank occurrences. User-friendly output and visualizations of motif context allow the user to quickly gain insight into the validity of a putatively functional motif occurrence. For each motif occurrence, overlapping UniProt features and annotated SLiMs are displayed. Visualization also includes annotated multiple sequence alignments surrounding each occurrence, showing conservation and protein disorder statistics in addition to known and predicted SLiMs, protein domains and known post-translational modifications. In addition, enrichment of Gene Ontology terms and protein interaction partners are provided as indicators of possible motif function. All web server results are available for download. Users can search motifs against the human proteome or a subset thereof defined by Uniprot accession numbers or GO term. The SLiMSearch server is available at: http://bioware.ucd.ie/slimsearch2.html. PMID:21622654
Japan Data Exchange Network JDXnet and Cloud-type Data Relay Server for Earthquake Observation Data
NASA Astrophysics Data System (ADS)
Takano, K.; Urabe, T.; Tsuruoka, H.; Nakagawa, S.
2015-12-01
In Japan, high-sensitivity seismic observation and broad-band seismic observation are carried out by several organizations such as the Japan Meteorological Agency (JMA), the National Research Institute for Earth Science and Disaster Prevention (NIED), nine national universities, the Japan Agency for Marine-Earth Science and Technology (JAMSTEC), etc. The total number of observation stations is about 1400. The total volume of seismic waveform data collected from all these observation stations is about 1 MByte per second (about 8 to 10 Mbps) using the WIN system (Urabe 1991). JDXnet is the Japan Data eXchange network for earthquake observation data. JDXnet was started in 2007 through the cooperation of researchers at each organization. All the seismic waveform data are available to all organizations in real time. The core of JDXnet is broadcast-type real-time data exchange using the nationwide L2-VPN service offered in the JGN-X of NICT and SINET4 of NII. Before the Tohoku earthquake, the nine national universities collected seismic data at their own data centers and then exchanged them with other universities and institutions via JDXnet. However, in this arrangement, if a university's center stopped, none of that university's data could be used even though some of its observation stations were still alive. Because of this problem, we have prepared a data relay server in the data center of SINET4, i.e., the cloud center. This data relay server collects data directly from the observation stations of the universities and delivers the data to all universities and institutions via JDXnet. By using the relay server in the cloud center, even if some universities are affected by a large disaster, the data of the surviving stations are not lost. If researchers set up seismometers and send data to the relay server, then the data are available to all researchers. This mechanism promotes the joint use of seismometers and joint research activities among researchers nationwide.
Conversation Threads Hidden within Email Server Logs
NASA Astrophysics Data System (ADS)
Palus, Sebastian; Kazienko, Przemysław
Email server logs contain records of all email exchanged through the server. Often we would like to analyze those emails not separately but in conversation threads, especially when we need to analyze a social network extracted from the email logs. Unfortunately, each mail is in a different record and those records are not tied to each other in any obvious way. In this paper a method for discussion thread extraction is proposed, together with experiments on two different data sets - Enron and WrUT.
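A minimal sketch of the general idea (not the algorithm from the paper), assuming the log records have already been parsed into dictionaries carrying the standard Message-ID and In-Reply-To headers: replies are chained back to the message they answer to recover the threads.

```python
# Minimal sketch: group messages into threads by following In-Reply-To links to the root.
from collections import defaultdict

def build_threads(records):
    """records: iterable of dicts like {'id': str, 'in_reply_to': str or None}."""
    parent = {r["id"]: r.get("in_reply_to") for r in records}

    def root(mid):
        while parent.get(mid):
            mid = parent[mid]
        return mid

    threads = defaultdict(list)
    for r in records:
        threads[root(r["id"])].append(r["id"])
    return threads

print(build_threads([
    {"id": "a"}, {"id": "b", "in_reply_to": "a"}, {"id": "c", "in_reply_to": "b"},
]))
```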
www.p2p.edu: Rip, Mix & Burn Your Education.
ERIC Educational Resources Information Center
Gillespie, Thom
2001-01-01
Discusses peer to peer technology which allows uploading files from one hard drive to another. Topics include the client/server model for education; the Napster client/server model; Gnutella; Freenet and other projects to allow the free exchange of information without censorship; bandwidth problems; copyright issues; metadata; and the United…
Proposal for a new CAPE-OPEN Object Model
Process simulation applications require the exchange of significant amounts of data between the flowsheet environment, unit operation model, and thermodynamic server. Packing and unpacking various data types and exchanging data using structured text-based architectures, including...
A web access script language to support clinical application development.
O'Kane, K C; McColligan, E E
1998-02-01
This paper describes the development of a script language to support the implementation of decentralized, clinical information applications on the World Wide Web (Web). The goal of this work is to facilitate construction of low overhead, fully functional clinical information systems that can be accessed anywhere by low cost Web browsers to search, retrieve and analyze stored patient data. The Web provides a model of network access to data bases on a global scale. Although it was originally conceived as a means to exchange scientific documents, Web browsers and servers currently support access to a wide variety of audio, video, graphical and text based data to a rapidly growing community. Access to these services is via inexpensive client software browsers that connect to servers by means of the open architecture of the Internet. In this paper, the design and implementation of a script language that supports the development of low cost, Web-based, distributed clinical information systems for both Inter- and Intra-Net use is presented. The language is based on the Mumps language and, consequently, supports many legacy applications with few modifications. Several enhancements, however, have been made to support modern programming practices and the Web interface. The interpreter for the language also supports standalone program execution on Unix, MS-Windows, OS/2 and other operating systems.
NASA Technical Reports Server (NTRS)
Sundermier, Amy (Inventor)
2002-01-01
A method for acquiring and assembling software components at execution time into a client program, where the components may be acquired from remote networked servers is disclosed. The acquired components are assembled according to knowledge represented within one or more acquired mediating components. A mediating component implements knowledge of an object model. A mediating component uses its implemented object model knowledge, acquired component class information and polymorphism to assemble components into an interacting program at execution time. The interactions or abstract relationships between components in the object model may be implemented by the mediating component as direct invocations or indirect events or software bus exchanges. The acquired components may establish communications with remote servers. The acquired components may also present a user interface representing data to be exchanged with the remote servers. The mediating components may be assembled into layers, allowing arbitrarily complex programs to be constructed at execution time.
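A minimal sketch of the general idea, not the patented implementation: a "mediating component" that acquires component classes by name at execution time and wires them together according to its own object-model knowledge. The Mediator class and the role names are illustrative.

```python
# Minimal sketch: runtime acquisition and assembly of components via a mediator.
import importlib

class Mediator:
    def __init__(self, object_model):
        # object_model: {"role": "module_name:ClassName"} mapping roles to component classes
        self.object_model = object_model

    def assemble(self):
        parts = {}
        for role, spec in self.object_model.items():
            module_name, class_name = spec.split(":")
            cls = getattr(importlib.import_module(module_name), class_name)
            parts[role] = cls()                      # instantiate the acquired component
        return parts

# Example: both roles happen to resolve to standard-library classes here.
app = Mediator({"buffer": "io:StringIO", "queue": "collections:deque"}).assemble()
app["buffer"].write("hello")
print(app["buffer"].getvalue(), len(app["queue"]))
```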
Asynchronous data change notification between database server and accelerator controls system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fu, W.; Morris, J.; Nemesure, S.
2011-10-10
Database data change notification (DCN) is a commonly used feature. Not all database management systems (DBMS) provide an explicit DCN mechanism. Even for those DBMSs which support DCN (such as Oracle and MS SQL Server), some server-side and/or client-side programming may be required to make the DCN system work. This makes the setup of DCN between a database server and interested clients tedious and time consuming. In accelerator control systems, there are many well-established software client/server architectures (such as CDEV, EPICS, and ADO) that can be used to implement data reflection servers that transfer data asynchronously to any client using the standard SET/GET API. This paper describes a method for using such a data reflection server to set up asynchronous DCN (ADCN) between a DBMS and clients. This method works well for all DBMS systems which provide database trigger functionality. Asynchronous data change notification (ADCN) between database server and clients can be realized by combining the use of a database trigger mechanism, which is supported by major DBMS systems, with server processes that use client/server software architectures that are familiar in the accelerator controls community (such as EPICS, CDEV or ADO). This approach makes the ADCN system easy to set up and integrate into an accelerator controls system. Several ADCN systems have been set up and used in the RHIC-AGS controls system.
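A minimal sketch of the approach, not the RHIC-AGS code: a database trigger is assumed to append each change to a change_log table, and a reflection-server loop polls that table and pushes new rows to subscribed clients, standing in for the EPICS/CDEV/ADO SET/GET layer. The table and column names are illustrative.

```python
# Minimal sketch: poll a trigger-populated change_log table and push changes to clients.
import sqlite3
import time

def serve_changes(db_path, notify, poll_s=1.0):
    """notify(change) is called once per new change_log row, in insertion order."""
    last_seen = 0
    con = sqlite3.connect(db_path)
    while True:
        rows = con.execute(
            "SELECT id, tbl, payload FROM change_log WHERE id > ? ORDER BY id",
            (last_seen,),
        ).fetchall()
        for rid, tbl, payload in rows:
            notify({"table": tbl, "payload": payload})   # asynchronous push to clients
            last_seen = rid
        time.sleep(poll_s)
```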
CDC WONDER: a cooperative processing architecture for public health.
Friede, A; Rosen, D H; Reid, J A
1994-01-01
CDC WONDER is an information management architecture designed for public health. It provides access to information and communications without the user's needing to know the location of data or communication pathways and mechanisms. CDC WONDER users have access to extractions from some 40 databases; electronic mail (e-mail); and surveillance data processing. System components include the Remote Client, the Communications Server, the Queue Managers, and Data Servers and Process Servers. The Remote Client software resides in the user's machine; other components are at the Centers for Disease Control and Prevention (CDC). The Remote Client, the Communications Server, and the Applications Server provide access to the information and functions in the Data Servers and Process Servers. The system architecture is based on cooperative processing, and components are coupled via pure message passing, using several protocols. This architecture allows flexibility in the choice of hardware and software. One system limitation is that final results from some subsystems are obtained slowly. Although designed for public health, CDC WONDER could be useful for other disciplines that need flexible, integrated information exchange. PMID:7719813
NASA Astrophysics Data System (ADS)
Reddy, K. Rasool; Rao, Ch. Madhava
2018-04-01
Currently, safety is one of the primary concerns in the transmission of images due to the increasing use of images within industrial applications, so it is necessary to secure the image data from unauthorized individuals. Various strategies have been investigated to secure the data; among them, encryption is certainly one of the most prominent methods. This paper applies a sophisticated Rijndael (AES) algorithm to shield the data from unauthorized persons. The Exponential Key Exchange (EKE) concept is also introduced to exchange the key between client and server. The data are exchanged in a network between client and server through a simple protocol known as the Trivial File Transfer Protocol (TFTP). This protocol is used mainly in embedded servers to transfer data, and it can also provide protection to the data if protection capabilities are integrated. In this paper, a GUI environment for image encryption and decryption is implemented. All these experiments were carried out in a Linux environment using OpenCV-Python scripts.
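A minimal sketch of the encryption step, assuming the pycryptodome package; the file name is a placeholder, and in practice the key would come from the EKE step rather than being generated locally.

```python
# Minimal sketch: AES encryption/decryption of raw image bytes.
from Crypto.Cipher import AES
from Crypto.Random import get_random_bytes

key = get_random_bytes(16)              # placeholder: would be derived via EKE with the server

with open("image.png", "rb") as f:      # placeholder file name
    plaintext = f.read()

cipher = AES.new(key, AES.MODE_GCM)
ciphertext, tag = cipher.encrypt_and_digest(plaintext)

# Decryption on the receiving side, after the ciphertext arrives (e.g., over TFTP).
decrypted = AES.new(key, AES.MODE_GCM, nonce=cipher.nonce).decrypt_and_verify(ciphertext, tag)
assert decrypted == plaintext
```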
Deterministic entanglement distillation for secure double-server blind quantum computation.
Sheng, Yu-Bo; Zhou, Lan
2015-01-15
Blind quantum computation (BQC) provides an efficient method for a client who does not have enough sophisticated technology and knowledge to perform universal quantum computation. The single-server BQC protocol requires the client to have some minimum quantum ability, while the double-server BQC protocol makes the client's device completely classical, resorting to the pure and clean Bell state shared by two servers. Here, we provide a deterministic entanglement distillation protocol in a practical noisy environment for the double-server BQC protocol. This protocol can obtain the pure maximally entangled Bell state, and the success probability can reach 100% in principle. The distilled maximally entangled states can be retained to perform the BQC protocol subsequently. The parties who perform the distillation protocol do not need to exchange classical information and they learn nothing from the client. This makes the protocol unconditionally secure and suitable for future BQC protocols.
Deterministic entanglement distillation for secure double-server blind quantum computation
Sheng, Yu-Bo; Zhou, Lan
2015-01-01
Blind quantum computation (BQC) provides an efficient method for a client who does not have enough sophisticated technology and knowledge to perform universal quantum computation. The single-server BQC protocol requires the client to have some minimum quantum ability, while the double-server BQC protocol makes the client's device completely classical, resorting to the pure and clean Bell state shared by two servers. Here, we provide a deterministic entanglement distillation protocol in a practical noisy environment for the double-server BQC protocol. This protocol can obtain the pure maximally entangled Bell state, and the success probability can reach 100% in principle. The distilled maximally entangled states can be retained to perform the BQC protocol subsequently. The parties who perform the distillation protocol do not need to exchange classical information and they learn nothing from the client. This makes the protocol unconditionally secure and suitable for future BQC protocols. PMID:25588565
Amin, Ruhul; Islam, S K Hafizul; Biswas, G P; Khan, Muhammad Khurram; Kumar, Neeraj
2015-11-01
In the last few years, numerous remote user authentication and session key agreement schemes have been put forward for the Telecare Medical Information System, where the patient and the medical server exchange medical information over the Internet. We have found that most of these schemes are not usable for practical applications due to known security weaknesses. It is also worth noting that an unrestricted number of patients log in to the single medical server across the globe. Therefore, the computation and maintenance overhead would be high and the server may fail to provide services. In this article, we have designed a medical system architecture and a standard mutual authentication scheme for a single medical server, where the patient can securely exchange medical data with the doctor(s) via a trusted central medical server over any insecure network. We then explored the security of the scheme with its resilience to attacks. Moreover, we formally validated the proposed scheme through simulation using the Automated Validation of Internet Security Protocols and Applications software, whose outcomes confirm that the scheme is protected against active and passive attacks. The performance comparison demonstrated that the proposed scheme has lower communication cost than the existing schemes in the literature. In addition, the computation cost of the proposed scheme is nearly equal to that of the existing schemes. The proposed scheme is not only resilient to different security attacks, but also provides efficient login, mutual authentication, session key agreement and verification, and password update phases, along with password recovery.
Patients’ Data Management System Protected by Identity-Based Authentication and Key Exchange
Rivero-García, Alexandra; Santos-González, Iván; Hernández-Goya, Candelaria; Caballero-Gil, Pino; Yung, Moti
2017-01-01
A secure and distributed framework for the management of patients’ information in emergency and hospitalization services is proposed here in order to seek improvements in efficiency and security in this important area. In particular, confidentiality protection, mutual authentication, and automatic identification of patients are provided. The proposed system is based on two types of devices: Near Field Communication (NFC) wristbands assigned to patients, and mobile devices assigned to medical staff. Two other main elements of the system are an intermediate server to manage the involved data, and a second server with a private key generator to define the information required to protect communications. An identity-based authentication and key exchange scheme is essential to provide confidential communication and mutual authentication between the medical staff and the private key generator through an intermediate server. The identification of patients is carried out through a keyed-hash message authentication code. Thanks to the combination of the aforementioned tools, a secure alternative mobile health (mHealth) scheme for managing patients’ data is defined for emergency and hospitalization services. Different parts of the proposed system have been implemented, including mobile application, intermediate server, private key generator and communication channels. Apart from that, several simulations have been performed, and, compared with the current system, significant improvements in efficiency have been observed. PMID:28362328
Patients' Data Management System Protected by Identity-Based Authentication and Key Exchange.
Rivero-García, Alexandra; Santos-González, Iván; Hernández-Goya, Candelaria; Caballero-Gil, Pino; Yung, Moti
2017-03-31
A secure and distributed framework for the management of patients' information in emergency and hospitalization services is proposed here in order to seek improvements in efficiency and security in this important area. In particular, confidentiality protection, mutual authentication, and automatic identification of patients are provided. The proposed system is based on two types of devices: Near Field Communication (NFC) wristbands assigned to patients, and mobile devices assigned to medical staff. Two other main elements of the system are an intermediate server to manage the involved data, and a second server with a private key generator to define the information required to protect communications. An identity-based authentication and key exchange scheme is essential to provide confidential communication and mutual authentication between the medical staff and the private key generator through an intermediate server. The identification of patients is carried out through a keyed-hash message authentication code. Thanks to the combination of the aforementioned tools, a secure alternative mobile health (mHealth) scheme for managing patients' data is defined for emergency and hospitalization services. Different parts of the proposed system have been implemented, including mobile application, intermediate server, private key generator and communication channels. Apart from that, several simulations have been performed, and, compared with the current system, significant improvements in efficiency have been observed.
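A minimal sketch of the keyed-hash message authentication code used above for patient identification, using Python's standard hmac module; the key and patient identifier are placeholders.

```python
# Minimal sketch: HMAC-based identification tag and constant-time verification.
import hashlib
import hmac
import secrets

key = secrets.token_bytes(32)            # placeholder: shared secret held by the intermediate server
patient_id = b"patient-0042"             # placeholder identifier read from the NFC wristband

tag = hmac.new(key, patient_id, hashlib.sha256).hexdigest()

# Verification on the server side uses a constant-time comparison.
expected = hmac.new(key, patient_id, hashlib.sha256).hexdigest()
assert hmac.compare_digest(tag, expected)
```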
NASA Technical Reports Server (NTRS)
Baumbach, J. I.; Vonirmer, A.
1995-01-01
To assist current discussion in the field of ion mobility spectrometry, an FTP server started operation at the Institut für Spektrochemie und angewandte Spektroskopie, Dortmund, on 4 December 1994; it is available to all research groups at universities and institutes and to researchers in industry. We support the exchange, interpretation, and database search of ion mobility spectra through the JCAMP-DX data format (Joint Committee on Atomic and Molecular Physical Data), as well as literature retrieval, pre-prints, notices, and a discussion board. We describe in general terms the access conditions, local addresses, and main code words. For further details, a monthly news report will be prepared for all users. The Internet email address for subscribing is included in the document.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-22
..., Incorporated; Order Approving a Proposed Rule Change Relating to Co- Location Service Fees I. Introduction On... to co-location services and related fees. The proposed rule change was published for comment in the... of the equipment to the Exchange's servers, at no additional charge. This ``co-location service...
Navy Network Dependability: Models, Metrics, and Tools
2010-01-01
different COP servers. The COP Synchronization Tool (CST) is the preferred method of exchanging data between COP servers: A critical component of COP...ASW mission’s equipment strings. A major difference in results between the new model and the old one is that the new one is far less optimistic about...understand why perceptions about the dependability (e.g., availability) of networks from users’ (e.g., sailors) perspectives sometimes differ from the
2009-09-01
DIFFIE-HELLMAN KEY EXCHANGE...III. GHOSTNET SETUP...A. INSTALLATION OF OPENVPN FOR...3. Verifying the Secure Connection...B. RUNNING OPENVPN AS A SERVER ON WINDOWS...1. Creating...Generating Server and Client Keys...5. Keys to Transfer to the Client...6. Configuring OpenVPN to Use Certificates
CINTEX: International Interoperability Extensions to EOSDIS
NASA Technical Reports Server (NTRS)
Graves, Sara J.
1997-01-01
A large part of the research under this cooperative agreement involved working with representatives of the DLR, NASDA, EDC, and NOAA-SAA data centers to propose a set of enhancements and additions to the EOSDIS Version 0 Information Management System (V0 IMS) Client/Server Message Protocol. Helen Conover of ITSL led this effort to provide for an additional geographic search specification (WRS Path/Row), data set- and data center-specific search criteria, search by granule ID, specification of data granule subsetting requests, data set-based ordering, and the addition of URLs to result messages. The V0 IMS Server Cookbook is an evolving document, providing resources and information to data centers setting up a V0 IMS Server. Under this Cooperative Agreement, Helen Conover revised, reorganized, and expanded this document, and converted it to HTML. Ms. Conover has also worked extensively with the IRE RAS data center, CPSSI, in Russia. She served as the primary IMS contact for IRE-CPSSI and as IRE-CPSSI's liaison to other members of the IMS and Web Gateway (WG) development teams. Her documentation of IMS problems in the IRE environment (Sun servers and low network bandwidth) led to a general restructuring of the V0 IMS Client message polling system, to the benefit of all IMS participants. In addition to the IMS server software and documentation, which are generally available to CINTEX sites, Ms. Conover also provided database design documentation and consulting, order tracking software, and hands-on testing and debug assistance to IRE. In the final pre-operational phase of IRE-CPSSI development, she also supplied information on configuration management, including ideas and processes in place at the Global Hydrology Resource Center (GHRC), an EOSDIS data center operated by ITSL.
Tandon, Chanderdeep
2013-01-01
Background The increasing number of patients suffering from urolithiasis represents one of the major challenges which nephrologists face worldwide today. For enhancing therapeutic outcomes of this disease, understanding the pathogenic basis for the formation of renal stones is the need of the hour. Proteins are found as a major component of the human renal stone matrix and are considered to have a potential role in crystal–membrane interaction, crystal growth and stone formation, but their role in urolithiasis still remains obscure. Methods Proteins were isolated from the matrix of human CaOx-containing kidney stones. Proteins having MW>3 kDa were subjected to anion exchange chromatography followed by molecular-sieve chromatography. The effect of these purified proteins was tested on CaOx nucleation and growth and on oxalate-injured Madin–Darby Canine Kidney (MDCK) renal epithelial cells. Proteins were identified by matrix-assisted laser desorption/ionization-time of flight mass spectrometry (MALDI-TOF MS) followed by database search with the MASCOT server. In silico molecular interaction studies with CaOx crystals were also investigated. Results Five proteins were identified from the matrix of calcium oxalate kidney stones by MALDI-TOF MS followed by database search with the MASCOT server, with the competence to control the stone formation process. Of these, two proteins were promoters, two were inhibitors and one protein had a dual activity of both inhibition and promotion towards CaOx nucleation and growth. Further molecular modelling calculations revealed the mode of interaction of these proteins with CaOx at the molecular level. Conclusions We identified and characterized Ethanolamine-phosphate cytidylyltransferase, Ras GTPase-activating-like protein, UDP-glucose:glycoprotein glucosyltransferase 2, RIMS-binding protein 3A, and Macrophage-capping protein as novel proteins from the matrix of human calcium oxalate stones which play a critical role in kidney stone formation. Thus, these proteins, having the potential to modulate calcium oxalate crystallization, will throw light on understanding and controlling urolithiasis in humans. PMID:23894559
UAV Data Exchange Test Bed for At-Sea and Ashore Information Systems
2014-12-02
3.2 Visualization using NASA World Wind...3.3 Visualization using Quantum GIS...Data Server and the Global Positioning Warehouse...4.1 Naval Position Repository Installation...4.2...4.4 Data Exchange between CSD and NPR...5 Maritime Tactical Command and Control...5.1 Global Command
Web-Based Distributed Simulation of Aeronautical Propulsion System
NASA Technical Reports Server (NTRS)
Zheng, Desheng; Follen, Gregory J.; Pavlik, William R.; Kim, Chan M.; Liu, Xianyou; Blaser, Tammy M.; Lopez, Isaac
2001-01-01
An application was developed to allow users to run and view Numerical Propulsion System Simulation (NPSS) engine simulations from web browsers. Simulations were performed on multiple Information Power Grid (IPG) test beds. The Common Object Request Broker Architecture (CORBA) was used for brokering data exchange among machines, and IPG/Globus for job scheduling and remote process invocation. Web server scripting was performed with JavaServer Pages (JSP). This application has proven to be an effective and efficient way to couple heterogeneous distributed components.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-20
... Eliminate the 100MB Connectivity Option and Fee March 14, 2012. Pursuant to Section 19(b)(1) of the... proposes to eliminate 100MB connectivity between the Exchange and co-located servers, as well as associated... Proposed Rule Change 1. Purpose The Exchange proposes to modify Rule 7034(b) to eliminate 100MB...
Jeong, Eun Sook; Cha, Eunju; Cha, Sangwon; Kim, Sunghwan; Oh, Han Bin; Kwon, Oh-Seung; Lee, Jaeick
2017-11-21
In this study, a hydrogen/deuterium (H/D) exchange method using gas chromatography-electrospray ionization/mass spectrometry (GC-ESI/MS) was investigated for the first time as a novel tool for online H/D exchange of multitarget analytes. The GC and the ESI source were combined with a homemade heated column transfer line. GC-ESI/MS-based H/D exchange occurs in an atmospheric pressure ion source as a result of reacting the gas-phase analyte eluted from the GC with charged droplets of deuterium oxide infused as the ESI spray solvent. The consumption of the deuterated solvent at a flow rate of 2 μL min(-1) was more economical than that in online H/D exchange methods reported to date. In-ESI-source H/D exchange by GC-ESI/MS was applied to 11 stimulants with secondary amino or hydroxyl groups. After H/D exchange, the spectra of the stimulants showed unexchanged, partially exchanged, and fully exchanged ions with various degrees of exchange. The relative abundances, corrected for naturally occurring isotopes, of the fully exchanged ions of the stimulants, except for etamivan, were in the range 24.3-85.5%. Methylephedrine and cyclazodone showed low H/D exchange efficiency under acidic, neutral, and basic spray solvent conditions, and no exchange was observed for etamivan, which has an acidic phenolic OH group. The in-ESI-source H/D exchange efficiency of GC-ESI/MS was sufficient to determine the number of exchangeable hydrogens by elucidation of the fragmentation from the spectrum. Therefore, this online H/D exchange technique using GC-ESI/MS has potential as an alternative method for the simultaneous H/D exchange of multitarget analytes.
Electronic Transfer of School Records.
ERIC Educational Resources Information Center
Yeagley, Raymond
2001-01-01
Describes the electronic transfer of student records, notably the use of a Web-server named CHARLOTTE sponsored by the National Forum on Education Statistics and an Electronic Data Exchange system named SPEEDE/ExPRESS. (PKP)
PEM public key certificate cache server
NASA Astrophysics Data System (ADS)
Cheung, T.
1993-12-01
Privacy Enhanced Mail (PEM) provides privacy enhancement services to users of Internet electronic mail. Confidentiality, authentication, message integrity, and non-repudiation of origin are provided by applying cryptographic measures to messages transferred between end systems by the Message Transfer System. PEM supports both symmetric and asymmetric key distribution. However, the prevalent implementation uses a public key certificate-based strategy, modeled after the X.509 directory authentication framework. This scheme provides an infrastructure compatible with X.509. According to RFC 1422, public key certificates can be stored in directory servers, transmitted via non-secure message exchanges, or distributed via other means. Directory services provide a specialized distributed database for OSI applications. The directory contains information about objects and then provides structured mechanisms for accessing that information. Since directory services are not widely available now, a good approach is to manage certificates in a centralized certificate server. This document describes the detailed design of a centralized certificate cache server. This server manages a cache of certificates and a cache of Certificate Revocation Lists (CRLs) for PEM applications. PEM applications contact the server to obtain/store certificates and CRLs. The server software is programmed in C and ELROS. To use this server, ISODE has to be configured and installed properly. The ISODE library 'libisode.a' has to be linked together with this library because ELROS uses the transport layer functions provided by 'libisode.a.' The X.500 DAP library that is included with the ELROS distribution has to be linked in also, since the server uses the DAP library functions to communicate with directory servers.
Password-only authenticated three-party key exchange with provable security in the standard model.
Nam, Junghyun; Choo, Kim-Kwang Raymond; Kim, Junghwan; Kang, Hyun-Kyu; Kim, Jinsoo; Paik, Juryon; Won, Dongho
2014-01-01
Protocols for password-only authenticated key exchange (PAKE) in the three-party setting allow two clients registered with the same authentication server to derive a common secret key from their individual password shared with the server. Existing three-party PAKE protocols were proven secure under the assumption of the existence of random oracles or in a model that does not consider insider attacks. Therefore, these protocols may turn out to be insecure when the random oracle is instantiated with a particular hash function or an insider attack is mounted against the partner client. The contribution of this paper is to present the first three-party PAKE protocol whose security is proven without any idealized assumptions in a model that captures insider attacks. The proof model we use is a variant of the indistinguishability-based model of Bellare, Pointcheval, and Rogaway (2000), which is one of the most widely accepted models for security analysis of password-based key exchange protocols. We demonstrated that our protocol achieves not only the typical indistinguishability-based security of session keys but also the password security against undetectable online dictionary attacks.
Masson, Glenn R.; Maslen, Sarah L.
2017-01-01
Until recently, one of the major limitations of hydrogen/deuterium exchange mass spectrometry (HDX-MS) was the peptide-level resolution afforded by proteolytic digestion. This limitation can be selectively overcome through the use of electron-transfer dissociation to fragment peptides in a manner that allows the retention of the deuterium signal to produce hydrogen/deuterium exchange tandem mass spectrometry (HDX-MS/MS). Here, we describe the application of HDX-MS/MS to structurally screen inhibitors of the oncogene phosphoinositide 3-kinase catalytic p110α subunit. HDX-MS/MS analysis is able to discern a conserved mechanism of inhibition common to a range of inhibitors. Owing to the relatively minor amounts of protein required, this technique may be utilised in pharmaceutical development for screening potential therapeutics. PMID:28381646
Black Sea GIS developed in MHI
NASA Astrophysics Data System (ADS)
Zhuk, E.; Khaliulin, A.; Zodiatis, G.; Nikolaidis, A.; Isaeva, E.
2016-08-01
The work aims at creating the Black Sea geoinformation system (GIS) and complementing it with a model bank. The software for data access and visualization was developed using a client-server architecture. A map service based on MapServer and the MySQL data management system was chosen for the Black Sea GIS. PHP modules and Python scripts are used to provide data access, processing, and exchange between the client application and the server. According to the basic data types, the module structure of the GIS was developed. Each type of data is matched to a module which allows selection and visualization of the data. At present, the GIS is being complemented with a model bank (models built into the GIS) and users' models (programs launched on users' PCs but receiving and displaying data via the GIS).
Environmental Monitoring Using Sensor Networks
NASA Astrophysics Data System (ADS)
Yang, J.; Zhang, C.; Li, X.; Huang, Y.; Fu, S.; Acevedo, M. F.
2008-12-01
Environmental observatories, consisting of a variety of sensor systems, computational resources and informatics, are important for us to observe, model, predict, and ultimately help preserve the health of nature. The commoditization and proliferation of coin-to-palm sized wireless sensors will allow environmental monitoring with unprecedented fine spatial and temporal resolution. Once scattered around, these sensors can identify themselves, locate their positions, describe their functions, and self-organize into a network. They communicate through wireless channels with nearby sensors and transmit data through multi-hop protocols to a gateway, which can forward information to a remote data server. In this project, we describe an environmental observatory called Texas Environmental Observatory (TEO) that incorporates a sensor network system with intertwined wired and wireless sensors. We are enhancing and expanding the existing wired weather stations to include wireless sensor networks (WSNs) and telemetry using solar-powered cellular modems. The new WSNs will monitor soil moisture and support long-term hydrologic modeling. Hydrologic models are helpful in predicting how changes in land cover translate into changes in the stream flow regime. These models require inputs that are difficult to measure over large areas, especially variables related to storm events, such as soil moisture antecedent conditions and rainfall amount and intensity. This will also contribute to improving rainfall estimations from meteorological radar data and enhance hydrological forecasts. Sensor data are transmitted from the monitoring site to a Central Data Collection (CDC) Server. We incorporate a GPRS modem for wireless telemetry, a single-board computer (SBC) as the Remote Field Gateway (RFG) Server, and a WSN for distributed soil moisture monitoring. The RFG provides effective control, management, and coordination of two independent sensor systems, i.e., a traditional datalogger-based wired sensor system and the WSN-based wireless sensor system. The RFG also supports remote manipulation of the devices in the field such as the SBC, datalogger, and WSN. Sensor data collected from the distributed monitoring stations are stored in a database (DB) Server. The CDC Server acts as an intermediate component to hide the heterogeneity of different devices and support data validation required by the DB Server. Daemon programs running on the CDC Server pre-process the data before it is inserted into the database, and periodically perform synchronization tasks. A SWE-compliant data repository is installed to enable data exchange, accepting data from both the internal DB Server and external sources through the OGC web services. The web portal, i.e. TEO Online, serves as a user-friendly interface for data visualization, analysis, synthesis, modeling, and K-12 educational outreach activities. It also provides useful capabilities for system developers and operators to remotely monitor system status and remotely update software and system configuration, which greatly simplifies the system debugging and maintenance tasks. We also implement Sensor Observation Services (SOS) at this layer, conforming to the SWE standard to facilitate data exchange. The standard SensorML/O&M data representation makes it easy to integrate our sensor data into the existing Geographic Information Systems (GIS) web services and exchange the data with other organizations.
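The daemon-based validation step on the CDC Server can be illustrated with a minimal sketch. SQLite stands in for the project's DB Server, and the table name, field names, and validity bounds are illustrative assumptions rather than details of the TEO system.

```python
# Minimal sketch of a CDC-style pre-processing pass: validate incoming
# soil-moisture readings and insert only the valid ones into a database.
# SQLite, the table, and the field names are illustrative stand-ins.
import sqlite3

VALID_RANGE = (0.0, 100.0)  # plausible bounds for volumetric soil moisture (%)

def preprocess_and_store(readings, db_path="teo_demo.db"):
    con = sqlite3.connect(db_path)
    con.execute("""CREATE TABLE IF NOT EXISTS soil_moisture
                   (station TEXT, ts TEXT, value REAL)""")
    valid = [r for r in readings
             if r["value"] is not None and VALID_RANGE[0] <= r["value"] <= VALID_RANGE[1]]
    con.executemany("INSERT INTO soil_moisture VALUES (:station, :ts, :value)", valid)
    con.commit()
    con.close()
    return len(valid)

if __name__ == "__main__":
    sample = [{"station": "S01", "ts": "2008-10-01T12:00Z", "value": 23.4},
              {"station": "S01", "ts": "2008-10-01T12:30Z", "value": -999.0}]  # sentinel rejected
    print(preprocess_and_store(sample))  # -> 1
```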
Electronic Attack Platform Placement Optimization
2014-09-01
Front-matter excerpts (table of contents and list of figures): data processing in VBA; a client-server design using two different Excel applications; a screenshot of the VBA IDE contained within all Microsoft Office products; an application using MS Excel's Application.OnTime method; and the WINSOCK API functions needed to use TCP via VBA.
Wireless communication of real-time ultrasound data and control
NASA Astrophysics Data System (ADS)
Tobias, Richard J.
2015-03-01
The Internet of Things (IoT) is expected to grow to 26 billion connected devices by 2020, while the PC, smartphone, and tablet segment, which includes mobile health (mHealth) connected devices, is projected to account for another 7.3 billion units by 2020. This paper explores some of the real-time constraints on the data flow and control of a wirelessly connected ultrasound machine. The paper defines an ultrasound server and the capabilities necessary for real-time use of the device. The concept of an ultrasound server wirelessly (or over any network) connected to multiple lightweight clients on devices such as an iPad, iPhone, or Android-based tablet or smartphone, and on other network-attached displays (e.g., Google Glass), is explored. Latency in the ultrasound data stream is one of the key areas to measure and to keep as small as possible (<30 ms) so that the ultrasound operator can see what is at the probe at that moment, instead of where the probe was a short period earlier. With latency below 30 ms, the operator perceives the data on the wirelessly connected devices as running in real time. The second parameter is the management of bandwidth. At a minimum, 20 frames per second are needed. It is possible to achieve ultrasound in triplex mode at >20 frames per second on a properly configured wireless network. The ultrasound server needs to be designed to accept multiple ultrasound data clients and multiple control clients. The server and some of its key features are described.
Image-based electronic patient records for secured collaborative medical applications.
Zhang, Jianguo; Sun, Jianyong; Yang, Yuanyuan; Liang, Chenwen; Yao, Yihong; Cai, Weihua; Jin, Jin; Zhang, Guozhen; Sun, Kun
2005-01-01
We developed a Web-based system to interactively display image-based electronic patient records (EPR) for secured intranet and Internet collaborative medical applications. The system consists of four major components: an EPR DICOM gateway (EPR-GW), an image-based EPR repository server (EPR-Server), a Web server, and an EPR DICOM viewer (EPR-Viewer). In the EPR-GW and EPR-Viewer, the security modules of digital signature and authentication are integrated to perform security processing on the EPR data, ensuring integrity and authenticity. The privacy of EPR data during communication and exchange is provided by SSL/TLS-based secure communication. This presentation gives a new approach to creating and managing image-based EPR from actual patient records, and also presents a way to use Web technology and the DICOM standard to build an open architecture for collaborative medical applications.
NASA Astrophysics Data System (ADS)
Wibonele, Kasanda J.; Zhang, Yanqing
2002-03-01
A web data mining system using granular computing and ASP programming is proposed. This is a web-based application that allows web users to submit survey data for many different companies. The survey is a collection of questions that helps these companies develop and improve their business and customer service with their clients by analyzing the survey data. The web application allows users to submit data from anywhere. All the survey data are collected into a database for further analysis. An administrator of the web application can log in to the system and view all the data submitted. The web application resides on a web server, and the database resides on an MS SQL server.
Fermilab Science Education Office
Contact information recovered from the Fermilab Education Server web-page residue: Fermilab Office of Education & Public Outreach, Lederman Science Education Center, Fermilab MS 777, Box 500, Batavia, IL 60510, (630) 840-8258.
Patient Data Synchronization Process in a Continuity of Care Environment
Haras, Consuela; Sauquet, Dominique; Ameline, Philippe; Jaulent, Marie-Christine; Degoulet, Patrice
2005-01-01
In a distributed patient record environment, we analyze the processes needed to ensure exchange of and access to EHR data. We propose an adapted method and tools for data synchronization. Our study takes into account the issues of user rights management for data access and of decreasing the amount of data exchanged over the network. We describe an XML-based synchronization model that is portable and independent of specific medical data models. The implemented platform consists of several servers, local network clients, workstations running user interfaces, and data exchange and synchronization tools. PMID:16779049
Mazur, Sharlyn J.; Weber, Daniel P.
2018-01-01
Hydrogen-deuterium exchange mass spectrometry (HDX-MS) provides information about protein conformational mobility under native conditions. The area between exchange curves, Abec, a functional data analysis concept, was adapted to the interpretation of HDX-MS data and provides a useful measure of exchange curve dissimilarity for tests of significance. Importantly, for most globular proteins under native conditions, Abec values provide an estimate of the log ratio of exchange-competent fractions in the two states, and thus are related to differences in the free energy of microdomain unfolding. PMID:28236290
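As a simplified sketch of the quantity described above, the area between two deuterium-uptake curves can be approximated by trapezoidal integration over log time. The data below are made up, and details such as sign conventions and normalization in the published Abec definition may differ.

```python
# Minimal sketch: area between two HDX uptake curves, computed by trapezoidal
# integration over log10(exchange time). Values are illustrative placeholders;
# this is not the authors' published implementation.
import numpy as np

def abec(times_s, uptake_state_a, uptake_state_b):
    """Integrate |uptake_A - uptake_B| over log10(exchange time)."""
    log_t = np.log10(np.asarray(times_s, dtype=float))
    diff = np.abs(np.asarray(uptake_state_a) - np.asarray(uptake_state_b))
    return np.trapz(diff, log_t)

if __name__ == "__main__":
    t = [10, 100, 1000, 10000]       # exchange times (s)
    free = [0.8, 1.9, 3.1, 3.8]      # deuterons incorporated, state A (synthetic)
    bound = [0.5, 1.1, 2.0, 3.5]     # deuterons incorporated, state B (synthetic)
    print(round(abec(t, free, bound), 3))
```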
Deneyer, M; Hachimi-Idrissi, S; Michel, L; Nyssen, M; De Moor, G; Vandenplas, Y
2012-01-01
The authors propose the introduction of a pilot project, "paediatric core file exchange in emergencies" (PCF-EXEM), which enables the exchange of medical data between the attending paediatrician (AP), holder of the medical record, and on-duty medical units (e.g., general practitioners, paediatricians, surgeons, emergency physicians, ...). This project is based on two pillars: a protected server (PCF-server) containing paediatric core files (PCF) with important clinical data that should be available to the physician in order to quickly gain a clear insight into the relevant clinical medical history of the child, and secondly, the possibility to provide feedback to the attending physician about the findings recorded during the on-call duty. The permanent availability of health data on the PCF-server and the possibility to provide feedback together constitute the PCF-EXEM project. This project meets the demand of care providers to have relevant medical information permanently available in order to guarantee high-quality care in emergency situations. The fragile balance between the right to informational privacy and professional confidentiality on the one hand and the right to quality health care on the other hand has been taken into account. The technical and practical feasibility of this project is described. The objectives and vision of the PCF-EXEM project conform to Belgian legislation concerning the processing of medical data and are in line with European projects still under consideration that focus on interoperability and the development of common access control to databanks containing health data for care providers. PCF-EXEM could therefore be a model for other EU countries as well.
Advanced Pulse Oximetry System for Remote Monitoring and Management
Pak, Ju Geon; Park, Kee Hyun
2012-01-01
Pulse oximetry data such as saturation of peripheral oxygen (SpO2) and pulse rate are vital signals for early diagnosis of heart disease. Therefore, various pulse oximeters have been developed continuously. However, some of the existing pulse oximeters are not equipped with communication capabilities, and consequently, the continuous monitoring of patient health is restricted. Moreover, even though certain oximeters have been built as network models, they focus on exchanging only pulse oximetry data, and they do not provide sufficient device management functions. In this paper, we propose an advanced pulse oximetry system for remote monitoring and management. The system consists of a networked pulse oximeter and a personal monitoring server. The proposed pulse oximeter measures a patient's pulse oximetry data and transmits the data to the personal monitoring server. The personal monitoring server then analyzes the received data and displays the results to the patient. Furthermore, for device management purposes, operational errors that occur in the pulse oximeter are reported to the personal monitoring server, and the system configurations of the pulse oximeter, such as thresholds and measurement targets, are modified by the server. We verify that the proposed pulse oximetry system operates efficiently and that it is appropriate for monitoring and managing a pulse oximeter in real time. PMID:22933841
Carroll, Adam J; Badger, Murray R; Harvey Millar, A
2010-07-14
Standardization of analytical approaches and reporting methods via community-wide collaboration can work synergistically with web-tool development to result in rapid community-driven expansion of online data repositories suitable for data mining and meta-analysis. In metabolomics, the inter-laboratory reproducibility of gas-chromatography/mass-spectrometry (GC/MS) makes it an obvious target for such development. While a number of web-tools offer access to datasets and/or tools for raw data processing and statistical analysis, none of these systems are currently set up to act as a public repository by easily accepting, processing and presenting publicly submitted GC/MS metabolomics datasets for public re-analysis. Here, we present MetabolomeExpress, a new File Transfer Protocol (FTP) server and web-tool for the online storage, processing, visualisation and statistical re-analysis of publicly submitted GC/MS metabolomics datasets. Users may search a quality-controlled database of metabolite response statistics from publicly submitted datasets by a number of parameters (e.g., metabolite, species, organ/biofluid, etc.). Users may also perform meta-analysis comparisons of multiple independent experiments or re-analyse public primary datasets via user-friendly tools for t-test, principal components analysis, hierarchical cluster analysis and correlation analysis. They may interact with chromatograms, mass spectra and peak detection results via an integrated raw data viewer. Researchers who register for a free account may upload (via FTP) their own data to the server for online processing via a novel raw data processing pipeline. MetabolomeExpress (https://www.metabolome-express.org) provides a new opportunity for the general metabolomics community to transparently present online the raw and processed GC/MS data underlying their metabolomics publications. Transparent sharing of these data will allow researchers to assess data quality and draw their own insights from published metabolomics datasets.
NASA Technical Reports Server (NTRS)
Ullman, Richard; Bane, Bob; Yang, Jingli
2008-01-01
A shell script has been written as a means of automatically making HDF-EOS-formatted data sets available via the World Wide Web. ("HDF-EOS" and variants thereof are defined in the first of the two immediately preceding articles.) The shell script chains together some software tools developed by the Data Usability Group at Goddard Space Flight Center to perform the following actions: Extract metadata in Object Definition Language (ODL) from an HDF-EOS file, Convert the metadata from ODL to Extensible Markup Language (XML), Reformat the XML metadata into human-readable Hypertext Markup Language (HTML), Publish the HTML metadata and the original HDF-EOS file to a Web server and an Open-source Project for a Network Data Access Protocol (OPeN-DAP) server computer, and Reformat the XML metadata and submit the resulting file to the EOS Clearinghouse, which is a Web-based metadata clearinghouse that facilitates searching for, and exchange of, Earth-Science data.
Provably Secure Password-based Authentication in TLS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abdalla, Michel; Emmanuel, Bresson; Chevassut, Olivier
2005-12-20
In this paper, we show how to design an efficient, provably secure password-based authenticated key exchange mechanism specifically for the TLS (Transport Layer Security) protocol. The goal is to provide a technique that allows users to employ (short) passwords to securely identify themselves to servers. As our main contribution, we describe a new password-based technique for user authentication in TLS, called Simple Open Key Exchange (SOKE). Loosely speaking, the SOKE ciphersuites are unauthenticated Diffie-Hellman ciphersuites in which the client's Diffie-Hellman ephemeral public value is encrypted using a simple mask generation function. The mask is simply a constant value raised to the power of (a hash of) the password. The SOKE ciphersuites, which improve on previous password-based authentication ciphersuites for TLS, combine the following features. First, SOKE has formal security arguments; the proof of security based on the computational Diffie-Hellman assumption is in the random oracle model, and holds for concurrent executions and for arbitrarily large password dictionaries. Second, SOKE is computationally efficient; in particular, it only needs operations in a sufficiently large prime-order subgroup for its Diffie-Hellman computations (no safe primes). Third, SOKE provides good protocol flexibility because the user identity and password are only required once a SOKE ciphersuite has actually been negotiated, and after the server has sent a server identity.
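The masking idea described above can be illustrated with a toy sketch: the client's ephemeral Diffie-Hellman value is multiplied by a public constant raised to a hash of the password. The parameters below are tiny and insecure, and this shows only the arithmetic of such a mask, not a faithful or secure implementation of the SOKE ciphersuites or of TLS.

```python
# Toy sketch of the masking arithmetic: the client's ephemeral value g^x is
# multiplied by U^H(password) before transmission. Parameters are deliberately
# tiny and insecure; illustration only, not a real SOKE/TLS implementation.
import hashlib
import secrets

p = 2**127 - 1   # a Mersenne prime; far too small for real use, demo only
g = 5            # toy generator
U = 7            # public constant group element (illustrative choice)

def mask_for(password: bytes) -> int:
    h = int.from_bytes(hashlib.sha256(password).digest(), "big")
    return pow(U, h, p)

x = secrets.randbelow(p - 2) + 1                    # client ephemeral secret
client_dh = pow(g, x, p)                            # g^x
masked = (client_dh * mask_for(b"hunter2")) % p     # value sent on the wire

# A party knowing the password removes the mask before completing the exchange:
unmasked = (masked * pow(mask_for(b"hunter2"), -1, p)) % p
assert unmasked == client_dh
```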
OceanNOMADS: A New Distribution Node for Operational Ocean Model Output
NASA Astrophysics Data System (ADS)
Cross, S.; Vance, T.; Breckenridge, T.
2009-12-01
The NOAA National Operational Model Archive and Distribution System (NOMADS) is a distributed, web-services based project providing real-time and retrospective access to climate and weather model data and related datasets. OceanNOMADS is a new NOMADS node dedicated to ocean model and related data, with an initial focus on operational ocean models from NOAA and the U.S. Navy. The node offers data access through a Thematic Real-time Environmental Distributed Data Services (THREDDS) server via the commonly used OPeNDAP protocol. The primary server is operated by the National Coastal Data Development Center and hosted by the Northern Gulf Institute at Stennis Space Center, MS. In cooperation with the National Marine Fisheries Service and Mississippi State University (MSU), a duplicate server is being installed at MSU with a 1-gigabit connection to the National Lambda Rail. This setup will allow us to begin to quantify the benefit of high-speed data connections to scientists needing remote access to these large datasets. Work is also underway on the next generation of services from OceanNOMADS, including user-requested server-side data reformatting, regridding, and aggregation, as well as tools for model-data comparison.
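Because the node serves model output over OPeNDAP via THREDDS, a client can subset large datasets remotely without downloading whole files. The sketch below assumes the netCDF4 Python library built with DAP support; the URL and variable name are placeholders, not actual OceanNOMADS endpoints.

```python
# Minimal sketch of client-side OPeNDAP access to a THREDDS-served model run.
# The URL and variable name below are placeholders, not actual OceanNOMADS
# endpoints; requires the netCDF4 library with DAP support.
from netCDF4 import Dataset

OPENDAP_URL = "https://example.org/thredds/dodsC/ocean_model/latest_run.nc"  # hypothetical

ds = Dataset(OPENDAP_URL)              # opens the remote dataset lazily
print(list(ds.variables))              # inspect available variables
temp = ds.variables["water_temp"]      # hypothetical variable name
surface_slice = temp[0, 0, :, :]       # only this subset is transferred over the network
print(surface_slice.shape)
ds.close()
```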
Using a Java Web-based Graphical User Interface to access the SOHO Data Archive
NASA Astrophysics Data System (ADS)
Scholl, I.; Girard, Y.; Bykowski, A.
This paper presents the architecture of a Java web-based graphical interface dedicated to the access of the SOHO Data archive. This application allows local and remote users to search in the SOHO data catalog and retrieve the SOHO data files from the archive. It has been developed at MEDOC (Multi-Experiment Data and Operations Centre), located at the Institut d'Astrophysique Spatiale (Orsay, France), which is one of the European Archives for the SOHO data. This development is part of a joint effort between ESA, NASA and IAS in order to implement long term archive systems for the SOHO data. The software architecture is built as a client-server application using Java language and SQL above a set of components such as an HTTP server, a JDBC gateway, a RDBMS server, a data server and a Web browser. Since HTML pages and CGI scripts are not powerful enough to allow user interaction during a multi-instrument catalog search, this requirement led to the choice of Java as the main language. We also discuss performance issues, security problems and portability on different Web browsers and operating systems.
Hydrogen Exchange Mass Spectrometry
Mayne, Leland
2018-01-01
Hydrogen exchange (HX) methods can reveal much about the structure, energetics, and dynamics of proteins. The addition of mass spectrometry (MS) to an earlier fragmentation-separation HX analysis now extends HX studies to larger proteins at high structural resolution and can provide information not available before. This chapter discusses experimental aspects of HX labeling, especially with respect to the use of MS and the analysis of MS data. PMID:26791986
Investigation of Molecular Exchange Using DEXSY with Ultra-High Pulsed Field Gradients
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gratz, Marcel; Galvosas, Petrik
2008-12-05
Diffusion exchange spectroscopy has been employed for the investigation of water exchange between different regions of a cosmetic lotion as well as for the exchange of n-pentane between the inter- and intra-crystalline space in zeolite NaX. We successfully combined this two-dimensional (2D) NMR experiment with methods for the application of ultra-high pulsed field gradients of up to 35 T/m, resulting in observation times and mixing times as short as 2 ms and 2.8 ms, respectively.
A Secure Authenticated Key Exchange Protocol for Credential Services
NASA Astrophysics Data System (ADS)
Shin, Seonghan; Kobara, Kazukuni; Imai, Hideki
In this paper, we propose a leakage-resilient and proactive authenticated key exchange protocol (called LRP-AKE) for credential services which provides not only a higher level of security against leakage of stored secrets but also secrecy of the private key with respect to the involved server. We show that the LRP-AKE protocol is provably secure in the random oracle model with a reduction to the computational Diffie-Hellman problem. In addition, we discuss some possible applications of the LRP-AKE protocol.
NASA Astrophysics Data System (ADS)
Cryar, Adam; Groves, Kate; Quaglia, Milena
2017-06-01
Hydrogen-deuterium exchange mass spectrometry (HDX-MS) is an important tool for measuring and monitoring protein structure. A bottom-up approach to HDX-MS provides peptide level deuterium uptake values and a more refined localization of deuterium incorporation compared with global HDX-MS measurements. The degree of localization provided by HDX-MS is proportional to the number of peptides that can be identified and monitored across an exchange experiment. Ion mobility spectrometry (IMS) has been shown to improve MS-based peptide analysis of biological samples through increased separation capacity. The integration of IMS within HDX-MS workflows has been commercialized but presently its adoption has not been widespread. The potential benefits of IMS, therefore, have not yet been fully explored. We herein describe a comprehensive evaluation of traveling wave ion mobility integrated within an online-HDX-MS system and present the first reported example of UDMSE acquisition for HDX analysis. Instrument settings required for optimal peptide identifications are described and the effects of detector saturation due to peak compression are discussed. A model system is utilized to confirm the comparability of HDX-IM-MS and HDX-MS uptake values prior to an evaluation of the benefits of IMS at increasing sample complexity. Interestingly, MS and IM-MS acquisitions were found to identify distinct populations of peptides that were unique to the respective methods, a property that can be utilized to increase the spatial resolution of HDX-MS experiments by >60%.
NASA Astrophysics Data System (ADS)
Xie, Qi; Hu, Bin; Chen, Ke-Fei; Liu, Wen-Hao; Tan, Xiao
2015-11-01
In three-party password authenticated key exchange (AKE) protocols, two users establish a secure session key over an insecure communication channel using their individual passwords with the help of a trusted server; such protocols may therefore suffer from password guessing attacks, and the server has to maintain a password table. To eliminate these shortcomings of password-based AKE protocols, very recently Lee et al. [2015 Nonlinear Dyn. 79 2485] proposed, based on chaotic maps, the first three-party authenticated key exchange scheme without using passwords, and claimed its security by providing a well-organized BAN logic test. Unfortunately, their protocol cannot resist impersonation attacks, as demonstrated in the present paper. To overcome this security weakness, we propose a biometrics-based anonymous three-party AKE protocol using chaotic maps with the same advantages. Further, we use the pi calculus-based formal verification tool ProVerif to show that our AKE protocol achieves authentication, security and anonymity, and an acceptable efficiency. Project supported by the Natural Science Foundation of Zhejiang Province, China (Grant No. LZ12F02005), the Major State Basic Research Development Program of China (Grant No. 2013CB834205), and the National Natural Science Foundation of China (Grant No. 61070153).
NASA Astrophysics Data System (ADS)
Hamuro, Yoshitomo
2017-03-01
A new strategy to analyze amide hydrogen/deuterium exchange mass spectrometry (HDX-MS) data is proposed, utilizing a wider time window and isotope envelope analysis of each peptide. While most current scientific reports present HDX-MS data as a set of time-dependent deuteration levels of peptides, the ideal HDX-MS data presentation is a complete set of backbone amide hydrogen exchange rates. The ideal data set can provide single amide resolution, coverage of all exchange events, and the open/close ratio of each amide hydrogen in EX2 mechanism. Toward this goal, a typical HDX-MS protocol was modified in two aspects: measurement of a wider time window in HDX-MS experiments and deconvolution of isotope envelope of each peptide. Measurement of a wider time window enabled the observation of deuterium incorporation of most backbone amide hydrogens. Analysis of the isotope envelope instead of centroid value provides the deuterium distribution instead of the sum of deuteration levels in each peptide. A one-step, global-fitting algorithm optimized exchange rate and deuterium retention during the analysis of each amide hydrogen by fitting the deuterated isotope envelopes at all time points of all peptides in a region. Application of this strategy to cytochrome c yielded 97 out of 100 amide hydrogen exchange rates. A set of exchange rates determined by this approach is more appropriate for a patent or regulatory filing of a biopharmaceutical than a set of peptide deuteration levels obtained by a typical protocol. A wider time window of this method also eliminates false negatives in protein-ligand binding site identification.
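For a single amide in the EX2 regime, the idea of extracting a rate from a wide time window can be illustrated by fitting a simple exponential uptake model. The sketch below uses synthetic data and a per-site fit; the strategy described above instead performs a global fit of whole isotope envelopes across overlapping peptides.

```python
# Simplified illustration: fit D(t) = A * (1 - exp(-k t)) for one site.
# The data points are synthetic, for demonstration only.
import numpy as np
from scipy.optimize import curve_fit

def uptake(t, amplitude, k):
    return amplitude * (1.0 - np.exp(-k * t))

t = np.array([10, 30, 100, 300, 1000, 3000, 10000], dtype=float)   # seconds
d = np.array([0.09, 0.26, 0.63, 0.95, 1.00, 1.00, 1.00])           # deuterons (synthetic)

(amp, k), _ = curve_fit(uptake, t, d, p0=[1.0, 1e-3])
print(f"amplitude ~ {amp:.2f} D, exchange rate k ~ {k:.2e} s^-1")
```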
The NIST Internet time service
NASA Astrophysics Data System (ADS)
Levine, Judah
1994-05-01
We will describe the NIST Network Time Service which provides time and frequency information over the Internet. Our first time server is located in Boulder, Colorado, a second backup server is under construction there, and we plan to install a third server on the East Coast later this year. The servers are synchronized to UTC(NIST) with an uncertainty of about 0.8 ms RMS and they will respond to time requests from any client on the Internet in several different formats including the DAYTIME, TIME and NTP protocols. The DAYTIME and TIME protocols are the easiest to use and are suitable for providing time to PC's and other small computers. In addition to UTC(NIST), the DAYTIME message provides advance notice of leap seconds and of the transitions to and from Daylight Saving Time. The Daylight Saving Time notice is based on the US transition dates of the first Sunday in April and the last one in October. The NTP is a more complex protocol that is suitable for larger machines; it is normally run as a 'daemon' process in the background and can keep the time of the client to within a few milliseconds of UTC(NIST). We will describe the operating principles of various kinds of client software ranging from a simple program that queries the server once and sets the local clock to more complex 'daemon' processes (such as NTP) that continuously correct the time of the local clock based on periodic calibrations.
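A DAYTIME query of the kind described above is a one-line TCP exchange on port 13. The sketch below uses the commonly published NIST host name time.nist.gov as an example; any DAYTIME-speaking server would work, and clients should respect the service's published polling guidelines (no more than one query every few seconds).

```python
# Minimal sketch of a DAYTIME (TCP port 13) query.
# time.nist.gov is the commonly published NIST round-robin host name.
import socket

def query_daytime(host="time.nist.gov", port=13, timeout=5.0) -> str:
    with socket.create_connection((host, port), timeout=timeout) as sock:
        chunks = []
        while True:
            data = sock.recv(1024)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode("ascii", errors="replace").strip()

if __name__ == "__main__":
    # Example reply shape: "60410 24-01-15 12:34:56 00 0 0 432.1 UTC(NIST) *"
    print(query_daytime())
```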
Yi, Lin; Ouyang, Yilan; Sun, Xue; Xu, Naiyu; Linhardt, Robert J; Zhang, Zhenqing
2015-12-04
Dextran, a family of natural polysaccharides, consists of an α (1→6) linked-glucose main (backbone) chain having a number of branches. The determination of the types and the quantities of branches in dextran is important in understanding its various biological roles. In this study, a hyphenated method using high-performance anion exchange chromatography (HPAEC) in parallel with pulsed amperometric detection (PAD) and mass spectrometry (MS) was applied to qualitative and quantitative analysis of dextran branches. A rotary cation-exchange cartridge array desalter was used for removal of salt from the HPAEC eluent making it MS compatible. MS and MS/MS were used to provide structural information on the enzymatically prepared dextran oligosaccharides. PAD provides quantitative data on the ratio of enzyme-resistant, branched dextran oligosaccharides. Both the types and degree of branching found in a variety of dextrans could be simultaneously determined online using this method. Copyright © 2015 Elsevier B.V. All rights reserved.
Provisioning cooling elements for chillerless data centers
Chainer, Timothy J.; Parida, Pritish R.
2016-12-13
Systems and methods for cooling include one or more computing structure, an inter-structure liquid cooling system that includes valves configured to selectively provide liquid coolant to the one or more computing structures; a heat rejection system that includes one or more heat rejection units configured to cool liquid coolant; and one or more liquid-to-liquid heat exchangers that include valves configured to selectively transfer heat from liquid coolant in the inter-structure liquid cooling system to liquid coolant in the heat rejection system. Each computing structure further includes one or more liquid-cooled servers; and an intra-structure liquid cooling system that has valves configured to selectively provide liquid coolant to the one or more liquid-cooled servers.
NDEx - The Network Data Exchange | Informatics Technology for Cancer Research (ITCR)
NDEx is an online commons where scientists can upload, share, and publicly distribute biological networks and pathway models. The NDEx Project maintains a web-accessible public server, a documentation website, provides seamless connectivity to Cytoscape as well as programmatic access using a variety of languages including Python and Java.
Chen, Hung-Ming; Liou, Yong-Zan
2014-10-01
In a mobile health management system, mobile devices act as the application-hosting devices for personal health records (PHRs), and healthcare servers are constructed to exchange and analyze PHRs. One of the most popular PHR standards is the continuity of care record (CCR). The CCR is expressed in XML format. However, parsing is an expensive operation that can degrade XML processing performance. Hence, the objective of this study was to identify the operational and performance characteristics of different CCR parsing models, including the XML DOM parser, the SAX parser, the PULL parser, and the JSON parser applied to JSON data converted from XML-based CCRs. Thus, developers can make sensible choices for their target PHR applications to parse CCRs when using mobile devices or servers with different system resources. Furthermore, simulation experiments for four case studies were conducted to compare parsing performance on Android mobile devices and a server with large quantities of CCR data.
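The study's comparison used Android/Java parsers; a rough feel for the trade-off can nevertheless be sketched with Python's standard library, contrasting a DOM parse, a pull/stream-style parse, and JSON parsing of an equivalent converted record. The CCR-like document below is a made-up fragment, not a conformant CCR.

```python
# Toy comparison of parsing models on a tiny CCR-like record: DOM vs. a
# pull/stream-style parse (ElementTree.iterparse) vs. JSON on converted data.
import io, json, time
import xml.dom.minidom
import xml.etree.ElementTree as ET

CCR_XML = """<ContinuityOfCareRecord>
  <Body><Results><Result><Description>Heart rate</Description>
  <Value>72</Value></Result></Results></Body>
</ContinuityOfCareRecord>"""
CCR_JSON = json.dumps({"ContinuityOfCareRecord": {"Body": {"Results": [
    {"Description": "Heart rate", "Value": "72"}]}}})

def bench(label, fn, n=2000):
    start = time.perf_counter()
    for _ in range(n):
        fn()
    print(f"{label:5s} {time.perf_counter() - start:.3f} s for {n} parses")

bench("DOM",  lambda: xml.dom.minidom.parseString(CCR_XML))
bench("pull", lambda: [elem for _, elem in ET.iterparse(io.StringIO(CCR_XML))])
bench("JSON", lambda: json.loads(CCR_JSON))
```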
Application of wireless networks-peer-to-peer information sharing
NASA Astrophysics Data System (ADS)
Ellappan, Vijayan; Chaki, Suchismita; Kumar, AVN
2017-11-01
Peer-to-peer (P2P) communication and its applications have become commonplace in wired network environments, but they have not been successfully adapted to wireless environments. Unlike the traditional client-server framework, in a P2P framework each node can play the role of client as well as server simultaneously and exchange data or information with others. We aim to design an application that can adapt to wireless ad-hoc networks. Peer-to-peer communication can help people share their files (information, images, audio, video, and so on) and communicate with each other without relying on a particular network infrastructure or limited data usage. A central server provides peers with information about the other peers in the network. Indeed, even without the Internet, devices can allow users to connect and communicate through short-range wireless protocols such as Wi-Fi.
Securing a web-based teleradiology platform according to German law and "best practices".
Spitzer, Michael; Ullrich, Tobias; Ueckert, Frank
2009-01-01
The Medical Data and Picture Exchange platform (MDPE), as a teleradiology system, facilitates the exchange of digital medical imaging data among authorized users. It features extensive support of the DICOM standard including networking functions. Since MDPE is designed as a web service, security and confidentiality of data and communication pose an outstanding challenge. To comply with demands of German laws and authorities, a generic data security concept considered as "best practice" in German health telematics was adapted to the specific demands of MDPE. The concept features strict logical and physical separation of diagnostic and identity data and thus an all-encompassing pseudonymization throughout the system. Hence, data may only be merged at authorized clients. MDPE's solution of merging data from separate sources within a web browser avoids technically questionable techniques such as deliberate cross-site scripting. Instead, data is merged dynamically by JavaScriptlets running in the user's browser. These scriptlets are provided by one server, while content and method calls are generated by another server. Additionally, MDPE uses encrypted temporary IDs for communication and merging of data.
Pichler, Peter; Mazanek, Michael; Dusberger, Frederico; Weilnböck, Lisa; Huber, Christian G; Stingl, Christoph; Luider, Theo M; Straube, Werner L; Köcher, Thomas; Mechtler, Karl
2012-11-02
While the performance of liquid chromatography (LC) and mass spectrometry (MS) instrumentation continues to increase, applications such as analyses of complete or near-complete proteomes and quantitative studies require constant and optimal system performance. For this reason, research laboratories and core facilities alike are recommended to implement quality control (QC) measures as part of their routine workflows. Many laboratories perform sporadic quality control checks. However, successive and systematic longitudinal monitoring of system performance would be facilitated by dedicated automatic or semiautomatic software solutions that aid an effortless analysis and display of QC metrics over time. We present the software package SIMPATIQCO (SIMPle AuTomatIc Quality COntrol) designed for evaluation of data from LTQ Orbitrap, Q-Exactive, LTQ FT, and LTQ instruments. A centralized SIMPATIQCO server can process QC data from multiple instruments. The software calculates QC metrics supervising every step of data acquisition from LC and electrospray to MS. For each QC metric the software learns the range indicating adequate system performance from the uploaded data using robust statistics. Results are stored in a database and can be displayed in a comfortable manner from any computer in the laboratory via a web browser. QC data can be monitored for individual LC runs as well as plotted over time. SIMPATIQCO thus assists the longitudinal monitoring of important QC metrics such as peptide elution times, peak widths, intensities, total ion current (TIC) as well as sensitivity, and overall LC-MS system performance; in this way the software also helps identify potential problems. The SIMPATIQCO software package is available free of charge.
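One common way to "learn the range indicating adequate system performance" with robust statistics is a median plus/minus scaled-MAD interval; the sketch below illustrates that idea on synthetic peak-width values. The exact statistic used by SIMPATIQCO may differ.

```python
# Sketch of a robust "acceptable range" for a QC metric (median +/- k * scaled
# MAD). The values are synthetic; SIMPATIQCO's actual statistic may differ.
import numpy as np

def robust_range(values, k=3.0):
    values = np.asarray(values, dtype=float)
    med = np.median(values)
    mad = 1.4826 * np.median(np.abs(values - med))  # scaled MAD ~ sigma for normal data
    return med - k * mad, med + k * mad

peak_widths_s = [14.2, 13.8, 14.5, 14.1, 13.9, 14.3, 22.0]  # last run looks degraded
low, high = robust_range(peak_widths_s[:-1])
latest = peak_widths_s[-1]
status = "OK" if low <= latest <= high else "flagged"
print(f"acceptable: {low:.1f}-{high:.1f} s; latest run {latest} s -> {status}")
```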
Du, Fuying; Liu, Ting; Liu, Tian; Wang, Yongwei; Wan, Yakun; Xing, Jie
2011-10-30
Triptolide (TP), the primary active component of the herbal medicine Tripterygium wilfordii Hook F, has shown promising antileukemic and anti-inflammatory activity. The pharmacokinetic profile of TP indicates an extensive metabolic elimination in vivo; however, its metabolic data is rarely available partly because of the difficulty in identifying it due to the absence of appropriate ultraviolet chromophores in the structure and the presence of endogenous interferences in biological samples. In the present study, the biotransformation of TP was investigated by improved data-dependent accurate mass spectrometric analysis, using an LTQ/Orbitrap hybrid mass spectrometer in conjunction with the online hydrogen (H)/deuterium (D) exchange technique for rapid structural characterization. Accurate full-scan MS and MS/MS data were processed with multiple post-acquisition data-mining techniques, which were complementary and effective in detecting both common and uncommon metabolites from biological matrices. As a result, 38 phase I, 9 phase II and 8 N-acetylcysteine (NAC) metabolites of TP were found in rat urine. Accurate MS/MS data were used to support assignments of metabolite structures, and online H/D exchange experiments provided additional evidence for exchangeable hydrogen atoms in the structure. The results showed the main phase I metabolic pathways of TP are hydroxylation, hydrolysis and desaturation, and the resulting metabolites subsequently undergo phase II processes. The presence of NAC conjugates indicated the capability of TP to form reactive intermediate species. This study also demonstrated the effectiveness of LC/HR-MS(n) in combination with multiple post-acquisition data-mining methods and the online H/D exchange technique for the rapid identification of drug metabolites. Copyright © 2011 John Wiley & Sons, Ltd.
Acter, Thamina; Kim, Donghwi; Ahmed, Arif; Ha, Ji-Hyoung; Kim, Sunghwan
2017-08-01
Herein we report the observation of atmospheric pressure in-source hydrogen-deuterium exchange (HDX) of the thiol group for the first time. The HDX of the thiol group was optimized for positive atmospheric pressure photoionization (APPI) mass spectrometry (MS). The optimized HDX-MS was applied to 31 model compounds (thiols, thiophenes, and sulfides) to demonstrate that exchanged peaks were observed only for thiols. The optimized method has been successfully applied to the isolated fractions of sulfur-rich oil samples. The exchange of one and two thiol hydrogens with deuterium was observed in the thiol fraction; no HDX was observed in the other fractions. Thus, the results presented in this study demonstrate that the HDX-MS method using an APPI ionization source can be effective for speciation of sulfur compounds. This method has the potential to be used to assess corrosion problems caused by thiol-containing compounds.
Raptor: An Enterprise Knowledge Discovery Engine Version 2.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
2011-08-31
The Raptor Version 2.0 computer code uses a set of documents as seed documents to recommend documents of interest from a large, target set of documents. The computer code provides results that show the recommended documents with the highest similarity to the seed documents. Version 2.0 was specifically developed to work with SharePoint 2007 and MS SQL server.
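The seed-based recommendation idea can be illustrated generically by ranking a target collection against the centroid of TF-IDF vectors of the seed documents. This is a stand-in for the concept only, assuming scikit-learn; it is not Raptor's actual algorithm, which per the record runs against SharePoint 2007 and MS SQL Server.

```python
# Generic illustration of seed-document-based recommendation: rank a target set
# by cosine similarity to the centroid of the seed documents' TF-IDF vectors.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

seeds = ["liquid cooling for data center servers",
         "heat exchanger design for chillerless cooling"]
targets = ["server rack airflow and coolant distribution",
           "password authenticated key exchange protocols",
           "valve control for liquid-to-liquid heat exchangers"]

vec = TfidfVectorizer(stop_words="english")
matrix = vec.fit_transform(seeds + targets)
seed_centroid = np.asarray(matrix[:len(seeds)].mean(axis=0))   # dense 1 x n_features
scores = cosine_similarity(matrix[len(seeds):], seed_centroid).ravel()

for doc, score in sorted(zip(targets, scores), key=lambda pair: -pair[1]):
    print(f"{score:.2f}  {doc}")
```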
NASA Technical Reports Server (NTRS)
Rocker, JoAnne; Roncaglia, George J.; Heimerl, Lynn N.; Nelson, Michael L.
2002-01-01
Interoperability and data exchange are critical for the survival of government information management programs. E-government initiatives are transforming the way the government interacts with the public. More information is to be made available through web-enabled technologies. Programs such as NASA's Scientific and Technical Information (STI) Program Office are tasked to find more effective ways to disseminate information to the public. The NASA STI Program is an agency-wide program charged with gathering, organizing, storing, and disseminating NASA-produced information for research and public use. The program is investigating the use of a new protocol called the Open Archives Initiative (OAI) as a means to improve data interoperability and data collection. OAI promotes the use of the OAI harvesting protocol as a simple way for data sharing among repositories. In two separate initiatives, the STI Program is implementing OAI. In collaboration with the Air Force, Department of Energy, and Old Dominion University, the NASA STI Program has funded research on implementing the OAI to exchange data between the three organizations. The second initiative is the deployment of OAI for the NASA technical report server (TRS) environment. The NASA TRS environment comprises distributed technical report servers with a centralized search interface. This paper focuses on the implementation of OAI to promote interoperability among diverse data repositories.
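The OAI harvesting protocol mentioned above is a small set of HTTP requests returning XML. The sketch below issues a ListRecords request for Dublin Core metadata against a placeholder base URL; the endpoint is hypothetical and no specific repository is implied.

```python
# Minimal OAI-PMH harvesting sketch: issue a ListRecords request and print the
# Dublin Core titles from the response. The base URL is a placeholder.
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

BASE_URL = "https://example.org/oai"   # hypothetical OAI-PMH endpoint
NS = {"oai": "http://www.openarchives.org/OAI/2.0/",
      "dc": "http://purl.org/dc/elements/1.1/"}

def list_titles(base_url=BASE_URL, metadata_prefix="oai_dc"):
    query = urllib.parse.urlencode({"verb": "ListRecords",
                                    "metadataPrefix": metadata_prefix})
    with urllib.request.urlopen(f"{base_url}?{query}") as resp:
        root = ET.fromstring(resp.read())
    return [title.text for title in root.findall(".//dc:title", NS)]

if __name__ == "__main__":
    for title in list_titles():
        print(title)
```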
NASA Technical Reports Server (NTRS)
Boulanger, Richard P., Jr.; Kwauk, Xian-Min; Stagnaro, Mike; Kliss, Mark (Technical Monitor)
1998-01-01
The BIO-Plex control system requires real-time, flexible, and reliable data delivery. There is no simple "off-the-shelf" solution. However, several commercial packages will be evaluated using a testbed at ARC for publish-and-subscribe and client-server communication architectures. A point-to-point communication architecture is not suitable for the real-time BIO-Plex control system. A client-server architecture provides more flexible data delivery. However, it does not provide direct communication among nodes on the network. A publish-and-subscribe implementation allows direct information exchange among nodes on the net, providing the best time-critical communication. In this work, Network Data Delivery Service (NDDS) from Real-Time Innovations, Inc. will be used to implement the publish-and-subscribe architecture. It offers update guarantees and deadlines for real-time data delivery. BridgeVIEW, a data acquisition and control software package from National Instruments, will be tested for the client-server arrangement. A microwave incinerator located at ARC will be instrumented with a fieldbus network of control devices. BridgeVIEW will be used to implement an enterprise server. An enterprise network consisting of several nodes at ARC and a WAN connecting ARC and RISC will then be set up to evaluate the proposed control system architectures. Several network configurations will be evaluated for fault tolerance, quality of service, reliability and efficiency. Data acquired from these network evaluation tests will then be used to determine preliminary design criteria for the BIO-Plex distributed control system.
Guttman, Miklos; Wales, Thomas E; Whittington, Dale; Engen, John R; Brown, Jeffery M; Lee, Kelly K
2016-04-01
Hydrogen/deuterium exchange (HDX) mass spectrometry (MS) for protein structural analysis has been adopted for many purposes, including biopharmaceutical development. One of the benefits of examining amide proton exchange by mass spectrometry is that it can readily resolve different exchange regimes, as evidenced by either binomial or bimodal isotope patterns. By careful analysis of the isotope pattern during exchange, more insight can be obtained on protein behavior in solution. However, one must be sure that any observed bimodal isotope patterns are not artifacts of analysis and are reflective of the true behavior in solution. Sample carryover and certain stationary phases are known as potential sources of bimodal artifacts. Here, we describe an additional undocumented source of deuterium loss resulting in artificial bimodal patterns for certain highly charged peptides. We demonstrate that this phenomenon is predominantly due to gas-phase proton exchange between peptides and bulk solvent within the initial stages of high-transmission conjoined ion guides. Minor adjustments of the ion guide settings, as reported here, eliminate the phenomenon without sacrificing signal intensity. Such gas-phase deuterium loss should be appreciated for all HDX-MS studies using such ion optics, even for routine studies not focused on interpreting bimodal spectra.
Multi-resolution extension for transmission of geodata in a mobile context
NASA Astrophysics Data System (ADS)
Follin, Jean-Michel; Bouju, Alain; Bertrand, Frédéric; Boursier, Patrice
2005-03-01
A solution is proposed for the management of multi-resolution vector data in a mobile spatial information visualization system. The client-server architecture and the models of data and transfer of the system are presented first. The aim of this system is to reduce data exchanged between client and server by reusing data already present on the client side. Then, an extension of this system to multi-resolution data is proposed. Our solution is based on the use of increments in a multi-scale database. A database architecture where data sets for different predefined scales are precomputed and stored on the server side is adopted. In this model, each object representing the same real world entities at different levels of detail has to be linked beforehand. Increments correspond to the difference between two datasets with different levels of detail. They are transmitted in order to increase (or decrease) the detail to the client upon request. They include generalization and refinement operators allowing transitions between the different levels. Finally, a framework suited to the transfer of multi-resolution data in a mobile context is presented. This allows reuse of data locally available at different levels of detail and, in this way, reduces the amount of data transferred between client and server.
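A minimal sketch of how increments between two predefined levels of detail might be applied on the client side is given below; the operator names and data layout are assumptions for illustration, not the paper's actual transfer model.

```python
# Illustrative increment application between two levels of detail (LoD).
# The operator vocabulary ("add", "remove", "replace_geometry") is assumed here.
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class Feature:
    fid: str
    geometry: List[Tuple[float, float]]

@dataclass
class Increment:
    op: str              # "add", "remove" or "replace_geometry"
    feature: Feature

def apply_increments(client_data: Dict[str, Feature],
                     increments: List[Increment]) -> Dict[str, Feature]:
    """Refine (or generalize) the locally cached dataset instead of re-downloading it."""
    data = dict(client_data)
    for inc in increments:
        if inc.op == "add":
            data[inc.feature.fid] = inc.feature
        elif inc.op == "remove":
            data.pop(inc.feature.fid, None)
        elif inc.op == "replace_geometry":
            data[inc.feature.fid] = inc.feature
    return data
```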
Niu, Xingliang; Luo, Jun; Xu, Deran; Zou, Hongyan; Kong, Lingyi
2017-02-05
Ginkgolides, the main active constituents of Ginkgo biloba, possess significant selective inhibition of platelet-activating factor and pancreatic lipase and attract wide attention in the pharmacological research area. In our study, an effective hydrogen/deuterium (H/D) exchange method was developed by exchanging the α-Hs of lactone groups in ginkgolides with Ds, which was very useful for the elucidation of the fragmentation patterns of ginkgolides in Quadrupole Time-of-flight Mass Spectrometry (Q-TOF-MS), especially in accurately distinguishing the type and position of substituents in the framework of ginkgolides. Then, a systematic research strategy for qualitative and quantitative analysis of ginkgolides, based on H/D exchange, tandem solid-phase extraction and LC-Q-TOF-MS, was developed and successfully applied to each medicinal part of G. biloba, indicating that ginkgolide B was the most abundant ginkgolide in the seeds of G. biloba (60.6 μg/g). This research is a successful application of H/D exchange to natural products and shows that H/D exchange is a promising method for the analysis of complex TCM active constituents. Copyright © 2016 Elsevier B.V. All rights reserved.
eCX: A Secure Infrastructure for E-Course Delivery.
ERIC Educational Resources Information Center
Yau, Joe C. K; Hui, Lucas C. K.; Cheung, Bruce; Yiu, S. M.
2003-01-01
Presents a mechanism, the Secure e-Course eXchange (eCX) designed to protect learning material from unauthorized dissemination, and shows how this mechanism can be integrated in the operation model of online learning course providers. The design of eCX is flexible to fit two operating models, the Institutional Server Model and the Corporate Server…
WMT: The CSDMS Web Modeling Tool
NASA Astrophysics Data System (ADS)
Piper, M.; Hutton, E. W. H.; Overeem, I.; Syvitski, J. P.
2015-12-01
The Community Surface Dynamics Modeling System (CSDMS) has a mission to enable model use and development for research in earth surface processes. CSDMS strives to expand the use of quantitative modeling techniques, promotes best practices in coding, and advocates for the use of open-source software. To streamline and standardize access to models, CSDMS has developed the Web Modeling Tool (WMT), a RESTful web application with a client-side graphical interface and a server-side database and API that allows users to build coupled surface dynamics models in a web browser on a personal computer or a mobile device, and run them in a high-performance computing (HPC) environment. With WMT, users can design a model from a set of components, edit component parameters, save models to a web-accessible server, share saved models with the community, submit runs to an HPC system, and download simulation results. The WMT client is an Ajax application written in Java with GWT, which allows developers to employ object-oriented design principles and development tools such as Ant, Eclipse and JUnit. For deployment on the web, the GWT compiler translates Java code to optimized and obfuscated JavaScript. The WMT client is supported on Firefox, Chrome, Safari, and Internet Explorer. The WMT server, written in Python and SQLite, is a layered system, with each layer exposing a web service API: wmt-db, a database of component, model, and simulation metadata and output; wmt-api, which configures and connects components; and wmt-exe, which launches simulations on remote execution servers. The database server provides, as JSON-encoded messages, the metadata for users to couple model components, including descriptions of component exchange items, uses and provides ports, and input parameters. Execution servers are network-accessible computational resources, ranging from HPC systems to desktop computers, containing the CSDMS software stack for running a simulation. Once a simulation completes, its output, in NetCDF, is packaged and uploaded to a data server where it is stored and from which a user can download it as a single compressed archive file.
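A sketch of a client pulling JSON-encoded component metadata from a WMT-style database service follows; the endpoint path and response fields are hypothetical, chosen only to illustrate the kind of exchange the abstract describes.

```python
# Sketch of a client for a WMT-style metadata service; the URL layout and the
# "provides" field name are assumptions, not the documented wmt-db API.
import json
import urllib.request

def fetch_component(base_url: str, name: str) -> dict:
    """Return the JSON metadata describing one model component."""
    with urllib.request.urlopen(f"{base_url}/components/{name}") as resp:
        return json.load(resp)

def provided_exchange_items(component: dict) -> list:
    """List the exchange items a component advertises in its metadata."""
    return [port.get("name") for port in component.get("provides", [])]
```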
NASA Astrophysics Data System (ADS)
Kehlenbeck, Matthias; Breitner, Michael H.
Business users define calculated facts based on the dimensions and facts contained in a data warehouse. These business calculation definitions contain necessary knowledge regarding quantitative relations for deep analyses and for the production of meaningful reports. The business calculation definitions are implementation-independent and largely organization-independent, but no automated procedures exist to facilitate their exchange across organization and implementation boundaries; each organization currently has to map its own business calculations to analysis and reporting tools separately. This paper presents an innovative approach based on standard Semantic Web technologies. This approach facilitates the exchange of business calculation definitions and allows for their automatic linking to specific data warehouses through semantic reasoning. A novel standard proxy server which enables the immediate application of exchanged definitions is introduced. Benefits of the approach are shown in a comprehensive case study.
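To make the idea of an exchangeable calculation definition concrete, a hypothetical calculated fact could be serialized as Semantic Web triples roughly as follows; the ex: vocabulary and the GrossMargin example are invented for illustration and are not the paper's ontology.

```python
# Emit a hypothetical calculated-fact definition as Turtle triples.
# The ex: vocabulary and the GrossMargin example are illustrative only.
def calculation_as_turtle(name: str, expression: str, operands: list) -> str:
    lines = [
        "@prefix ex: <http://example.org/calc#> .",
        f"ex:{name} a ex:CalculatedFact ;",
        f'    ex:expression "{expression}" ;',
        "    ex:operand " + ", ".join(f"ex:{o}" for o in operands) + " .",
    ]
    return "\n".join(lines)

print(calculation_as_turtle("GrossMargin", "Revenue - CostOfGoodsSold",
                            ["Revenue", "CostOfGoodsSold"]))
```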
On the security of a simple three-party key exchange protocol without server's public keys.
Nam, Junghyun; Choo, Kim-Kwang Raymond; Park, Minkyu; Paik, Juryon; Won, Dongho
2014-01-01
Authenticated key exchange protocols are of fundamental importance in securing communications and are now extensively deployed for use in various real-world network applications. In this work, we reveal major previously unpublished security vulnerabilities in the password-based authenticated three-party key exchange protocol according to Lee and Hwang (2010): (1) the Lee-Hwang protocol is susceptible to a man-in-the-middle attack and thus fails to achieve implicit key authentication; (2) the protocol cannot protect clients' passwords against an offline dictionary attack; and (3) the indistinguishability-based security of the protocol can be easily broken even in the presence of a passive adversary. We also propose an improved password-based authenticated three-party key exchange protocol that addresses the security vulnerabilities identified in the Lee-Hwang protocol.
On the Security of a Simple Three-Party Key Exchange Protocol without Server's Public Keys
Nam, Junghyun; Choo, Kim-Kwang Raymond; Park, Minkyu; Paik, Juryon; Won, Dongho
2014-01-01
Authenticated key exchange protocols are of fundamental importance in securing communications and are now extensively deployed for use in various real-world network applications. In this work, we reveal major previously unpublished security vulnerabilities in the password-based authenticated three-party key exchange protocol according to Lee and Hwang (2010): (1) the Lee-Hwang protocol is susceptible to a man-in-the-middle attack and thus fails to achieve implicit key authentication; (2) the protocol cannot protect clients' passwords against an offline dictionary attack; and (3) the indistinguishability-based security of the protocol can be easily broken even in the presence of a passive adversary. We also propose an improved password-based authenticated three-party key exchange protocol that addresses the security vulnerabilities identified in the Lee-Hwang protocol. PMID:25258723
Schrader, T; Hufnagl, P; Schlake, W; Dietel, M
2005-01-01
In the autumn, a German screening program was started for detecting breast cancer in women aged fifty and above. For the first time in this program, quality assurance rules were established: all statements of the radiologists and pathologists have to be confirmed by a second opinion. This improvement in quality comes with a delay in time and additional expense. A new Telepathology Consultation Service was developed, based on the experience of the Telepathology Consultation Center of the UICC, to speed up the second-opinion process. The completely web-based service runs under MS Windows 2003 Server, with the Internet Information Server as the web server and SQL Server (both Microsoft) as the database. The websites, forms and control mechanisms have been coded in ASP scripts and JavaScript. A study to evaluate the effectiveness of telepathological consultation in comparison to conventional consultation has been carried out. Pathologists of the Professional Association of German Pathologists took part both as requesting pathologists and as consultants for other participants. The quality of telepathological diagnosis was comparable to conventional diagnosis. Telepathology allows a faster response by one to two days (the conventional postal delay). The time to prepare a telepathology request is about twice that of a conventional request. This ratio may be inverted by an interface between the Pathology Information System and the Telepathology Server and by the use of virtual microscopy. The Telepathology Consultation Service of the Professional Association of German Pathologists is a fast and effective German-language, internet-based service for obtaining a second opinion.
Software Modules for the Proximity-1 Space Link Interleaved Time Synchronization (PITS) Protocol
NASA Technical Reports Server (NTRS)
Woo, Simon S.; Veregge, John R.; Gao, Jay L.; Clare, Loren P.; Mills, David
2012-01-01
The Proximity-1 Space Link Interleaved Time Synchronization (PITS) protocol provides time distribution and synchronization services for space systems. A software prototype implementation of the PITS algorithm has been developed that also provides the test harness to evaluate the key functionalities of PITS with simulated data source and sink. PITS integrates time synchronization functionality into the link layer of the CCSDS Proximity-1 Space Link Protocol. The software prototype implements the network packet format, data structures, and transmit- and receive-timestamp function for a time server and a client. The software also simulates the transmit and receive-time stamp exchanges via UDP (User Datagram Protocol) socket between a time server and a time client, and produces relative time offsets and delay estimates.
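A stripped-down sketch in the spirit of the prototype's simulated timestamp exchange over UDP sockets is shown below; the two-field packet layout and the round-trip offset estimate are assumptions for illustration, not the PITS packet format.

```python
# Minimal UDP timestamp exchange sketch (packet layout is illustrative, not PITS).
# Run run_server() and run_client() in separate processes or threads.
import socket
import struct
import time

PACKET = struct.Struct("!dd")  # (client_transmit_time, server_receive_time)

def run_server(host="127.0.0.1", port=9999):
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind((host, port))
        data, addr = sock.recvfrom(PACKET.size)
        t1, _ = PACKET.unpack(data)
        t2 = time.time()                      # server receive timestamp
        sock.sendto(PACKET.pack(t1, t2), addr)

def run_client(host="127.0.0.1", port=9999):
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        t1 = time.time()                      # client transmit timestamp
        sock.sendto(PACKET.pack(t1, 0.0), (host, port))
        data, _ = sock.recvfrom(PACKET.size)
        t1_echo, t2 = PACKET.unpack(data)
        t3 = time.time()                      # client receive timestamp
        delay = t3 - t1_echo
        offset = t2 - (t1_echo + t3) / 2.0    # crude round-trip-based offset estimate
        print(f"round-trip delay {delay:.6f} s, offset estimate {offset:.6f} s")
```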
A mobile information management system used in textile enterprises
NASA Astrophysics Data System (ADS)
Huang, C.-R.; Yu, W.-D.
2008-02-01
The mobile information management system (MIMS) for textile enterprises is based on Microsoft Visual Studio .NET 2003, Microsoft SQL Server 2000, the C++ language, and wireless application protocol (WAP) and wireless markup language (WML) technology. The portable MIMS is composed of a three-layer structure, i.e. a showing layer, an operating layer, and a data-visiting layer, corresponding to the port-link module, the processing module, and the database module, respectively. By using the MIMS, not only do information exchanges become more convenient and easier, but compatibility between a large information capacity and a micro cell phone, as well as functional expansion in operation and design, can also be realized by means of built-in units. The developed MIMS is suitable for use in textile enterprises.
Development of a system for transferring images via a network: supporting a regional liaison.
Mihara, Naoki; Manabe, Shiro; Takeda, Toshihiro; Shinichirou, Kitamura; Junichi, Murakami; Kouji, Kiso; Matsumura, Yasushi
2013-01-01
We developed a system that transfers images via a network and started using it in our hospital's PACS (Picture Archiving and Communication Systems) in 2006. The system has since been re-developed and is now running with a view to supporting a regional liaison in the future. It has become possible to automatically transfer images simply by selecting a destination hospital that has been registered in advance at the relay server. The gateway of this system can send images to a multi-center relay management server, which receives the images and resends them. This system has the potential to be useful for image exchange and to serve as a regional medical liaison.
Provisioning cooling elements for chillerless data centers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chainer, Timothy J.; Parida, Pritish R.
Systems and methods for cooling include one or more computing structure, an inter-structure liquid cooling system that includes valves configured to selectively provide liquid coolant to the one or more computing structures; a heat rejection system that includes one or more heat rejection units configured to cool liquid coolant; and one or more liquid-to-liquid heat exchangers that include valves configured to selectively transfer heat from liquid coolant in the inter-structure liquid cooling system to liquid coolant in the heat rejection system. Each computing structure further includes one or more liquid-cooled servers; and an intra-structure liquid cooling system that has valves configured to selectively provide liquid coolant to the one or more liquid-cooled servers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Busbey, A.B.
A number of methods and products, both hardware and software, allow data exchange between Apple Macintosh computers and MS-DOS based systems. These include serial null modem connections, MS-DOS hardware and/or software emulation, MS-DOS disk-reading hardware, and networking.
Modular multiple sensors information management for computer-integrated surgery.
Vaccarella, Alberto; Enquobahrie, Andinet; Ferrigno, Giancarlo; Momi, Elena De
2012-09-01
In the past 20 years, technological advancements have modified the concept of modern operating rooms (ORs) with the introduction of computer-integrated surgery (CIS) systems, which promise to enhance the outcomes, safety and standardization of surgical procedures. With CIS, different types of sensor (mainly position-sensing devices, force sensors and intra-operative imaging devices) are widely used. Recently, the need for a combined use of different sensors raised issues related to synchronization and spatial consistency of data from different sources of information. In this study, we propose a centralized, multi-sensor management software architecture for a distributed CIS system, which addresses sensor information consistency in both space and time. The software was developed as a data server module in a client-server architecture, using two open-source software libraries: Image-Guided Surgery Toolkit (IGSTK) and OpenCV. The ROBOCAST project (FP7 ICT 215190), which aims at integrating robotic and navigation devices and technologies in order to improve the outcome of the surgical intervention, was used as the benchmark. An experimental protocol was designed in order to prove the feasibility of a centralized module for data acquisition and to test the application latency when dealing with optical and electromagnetic tracking systems and ultrasound (US) imaging devices. Our results show that a centralized approach is suitable for minimizing synchronization errors; latency in the client-server communication was estimated to be 2 ms (median value) for tracking systems and 40 ms (median value) for US images. The proposed centralized approach proved to be adequate for neurosurgery requirements. Latency introduced by the proposed architecture does not affect tracking system performance in terms of frame rate and limits US images frame rate at 25 fps, which is acceptable for providing visual feedback to the surgeon in the OR. Copyright © 2012 John Wiley & Sons, Ltd.
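A toy sketch of a centralized collector that stamps samples from different sensor sources on arrival, so that per-source latency can be estimated, is given below; it assumes all sources report acquisition times on the same monotonic clock and is not the IGSTK/OpenCV architecture used in the paper.

```python
# Toy centralized collector: stamps each incoming sample with a common clock so
# that per-source latency can be estimated (not the IGSTK/OpenCV data server).
import time
from collections import defaultdict

class SensorCollector:
    def __init__(self):
        # source name -> list of (receive_time, acquisition_time, payload)
        self.samples = defaultdict(list)

    def push(self, source: str, acquisition_time: float, payload) -> None:
        # Assumes sources stamp acquisition_time with the same clock as time.monotonic().
        self.samples[source].append((time.monotonic(), acquisition_time, payload))

    def median_latency(self, source: str) -> float:
        """Middle value of receive-minus-acquisition delays for one source."""
        latencies = sorted(recv - acq for recv, acq, _ in self.samples[source])
        return latencies[len(latencies) // 2]
```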
Petruk, Ariel A; Defelipe, Lucas A; Rodríguez Limardo, Ramiro G; Bucci, Hernán; Marti, Marcelo A; Turjanski, Adrian G
2013-01-08
It is now clear that proteins are flexible entities that in solution switch between conformations to achieve their function. Hydrogen/Deuterium Exchange Mass Spectrometry (HX/MS) is an invaluable tool to understand dynamic changes in proteins modulated by cofactor binding, post-translational modifications, or protein-protein interactions. ERK2 MAPK, a protein involved in highly conserved signal transduction pathways of paramount importance for normal cellular function, has been extensively studied by HX/MS. Experiments on ERK2 MAPK in the inactive and active states (in the presence or absence of bound ATP) have provided valuable information on the plasticity of the MAPK domain. However, interpretation of the HX/MS data is difficult, and changes are mostly explained in relation to available X-ray structures, precluding a complete atomic picture of protein dynamics. In the present work, we have used all-atom Molecular Dynamics (MD) simulations to provide a theoretical framework for the interpretation of HX/MS data. Our results show that detailed analysis of protein-solvent interaction along the MD simulations allows (i) prediction of the number of protons exchanged for each peptide in the HX/MS experiments, (ii) rationalization of the experimentally observed changes in exchange rates in different protein conditions at the residue level, and (iii) demonstration that, at least for ERK2 MAPK, most of the functionally observed differences in protein dynamics are related to what can be considered the native state conformational ensemble. In summary, the combination of HX/MS experiments with all-atom MD simulations emerges as a powerful approach to study protein native state dynamics with atomic resolution.
Performance evaluation of cross-flow single-phase liquid-to-gas polymer tube heat exchanger
NASA Astrophysics Data System (ADS)
Dewanjee, Sujan; Hossain, Md. Rakibul; Rahman, Md. Ashiqur
2017-06-01
Reduced core weight, lower material cost, and higher corrosion resistance are some of the major attractions of studying polymers over metals in heat exchanger applications, in spite of polymers' relatively low thermal conductivity and strength. In the present study, the performance of a polymer parallel thin-tube heat exchanger is numerically evaluated for cross-flow liquid-to-air applications over a wide range of design and operating parameters, such as tube diameter, thickness, fluid velocity and temperature, using Computational Fluid Dynamics (CFD). Among a range of available polymeric materials, those with moderate to high thermal conductivity and strength are selected for this study. A 90 cm × 1 cm single unit of polymer tubes, with the number of tubes chosen so that a gap of at least 5 mm is maintained between tubes, is used as a basic unit, and multiple combinations of this single unit in the transverse direction are simulated to measure the effect. The tube inner diameter is varied from 2 mm to 4 mm, and the pressure drop is measured to obtain a relative idea of pumping cost. For each inner diameter, the thickness is varied from 0.5 mm to 2.5 mm. The water velocity and the air velocity are varied from 0.4 m/s to 2 m/s and 1 m/s to 5 m/s, respectively. The performance of the polymer heat exchanger is compared with that of a metal heat exchanger, and an optimum design for the polymer heat exchanger is sought.
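As a rough companion to the stated parameter ranges, a first-order estimate of the water-side Reynolds number and Darcy-Weisbach pressure drop over the tube length can be sketched as follows; this is a textbook hand calculation under standard water properties, not the paper's CFD model.

```python
# First-order water-side estimates over the stated ranges (not the CFD model).
def tube_side_estimate(diameter_m, velocity_m_s, length_m=0.9,
                       rho=998.0, mu=1.0e-3):
    """Return (Reynolds number, pressure drop in Pa) for one water-carrying tube."""
    re = rho * velocity_m_s * diameter_m / mu
    # Laminar friction factor 64/Re; Blasius correlation for turbulent flow.
    f = 64.0 / re if re < 2300.0 else 0.316 * re ** -0.25
    dp = f * (length_m / diameter_m) * 0.5 * rho * velocity_m_s ** 2
    return re, dp

for d_mm in (2.0, 3.0, 4.0):
    for v in (0.4, 1.2, 2.0):
        re, dp = tube_side_estimate(d_mm / 1000.0, v)
        print(f"D={d_mm} mm, v={v} m/s -> Re={re:.0f}, dP={dp:.0f} Pa")
```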
Serving by local consensus in the public service location game.
Sun, Yi-Fan; Zhou, Hai-Jun
2016-09-02
We discuss the issue of distributed and cooperative decision-making in a network game of public service location. Each node of the network can decide to host a certain public service incurring in a construction cost and serving all the neighboring nodes and itself. A pure consumer node has to pay a tax, and the collected tax is evenly distributed to all the hosting nodes to remedy their construction costs. If all nodes make individual best-response decisions, the system gets trapped in an inefficient situation of high tax level. Here we introduce a decentralized local-consensus selection mechanism which requires nodes to recommend their neighbors of highest local impact as candidate servers, and a node may become a server only if all its non-server neighbors give their assent. We demonstrate that although this mechanism involves only information exchange among neighboring nodes, it leads to socially efficient solutions with tax level approaching the lowest possible value. Our results may help in understanding and improving collective problem-solving in various networked social and robotic systems.
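One possible toy reading of the selection mechanism is sketched below: local impact is taken as the number of currently unserved nodes a candidate would cover (its unserved neighbors, plus itself if unserved), each unserved node recommends the candidate of highest impact among itself and its neighbors, and a candidate becomes a server only when all of its unserved neighbors recommend it. The impact definition, tie-breaking, and the fallback step are illustrative assumptions, not the paper's exact dynamics.

```python
# Toy sketch of local-consensus server selection (impact and tie-breaking
# rules are assumptions; see the paper for the exact dynamics).
def local_consensus(adjacency: dict) -> set:
    servers = set()

    def served(node):
        return node in servers or any(nb in servers for nb in adjacency[node])

    def impact(node):
        return sum(1 for nb in adjacency[node] if not served(nb)) + (0 if served(node) else 1)

    while not all(served(n) for n in adjacency):
        # Each unserved node recommends itself or a neighbor of highest impact.
        recommendations = {}
        for node in adjacency:
            if served(node):
                continue
            candidates = [node] + list(adjacency[node])
            recommendations[node] = max(candidates, key=impact)
        # A candidate becomes a server only if every unserved neighbor assents.
        new_servers = set()
        for cand in set(recommendations.values()):
            voters = [n for n in adjacency[cand] if not served(n)]
            if voters and all(recommendations.get(n) == cand for n in voters):
                new_servers.add(cand)
        if not new_servers:  # fallback so the toy model cannot stall
            new_servers.add(max((n for n in adjacency if not served(n)), key=impact))
        servers |= new_servers
    return servers

ring = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
print(local_consensus(ring))  # e.g. two servers cover a 6-node ring
```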
HARMONY: a server for the assessment of protein structures
Pugalenthi, G.; Shameer, K.; Srinivasan, N.; Sowdhamini, R.
2006-01-01
Protein structure validation is an important step in computational modeling and structure determination. Stereochemical assessment of protein structures examines internal parameters such as bond lengths and Ramachandran (φ,ψ) angles. Gross structure prediction methods, such as the inverse folding procedure, and structure determination, especially at low resolution, can sometimes give rise to models that are incorrect due to assignment of misfolds or mistracing of electron density maps. Such errors are not reflected as strain in internal parameters. HARMONY is a procedure that examines the compatibility between the sequence and the structure of a protein by assigning scores to individual residues and their amino acid exchange patterns after considering their local environments. Local environments are described by the backbone conformation, solvent accessibility and hydrogen bonding patterns. We are now providing HARMONY through a web server such that users can submit their protein structure files and, if required, the alignment of homologous sequences. Scores are mapped onto the structure for subsequent examination, which is also useful for recognizing regions of possible local errors in protein structures. The HARMONY server is located at PMID:16844999
Serving by local consensus in the public service location game
Sun, Yi-Fan; Zhou, Hai-Jun
2016-01-01
We discuss the issue of distributed and cooperative decision-making in a network game of public service location. Each node of the network can decide to host a certain public service incurring in a construction cost and serving all the neighboring nodes and itself. A pure consumer node has to pay a tax, and the collected tax is evenly distributed to all the hosting nodes to remedy their construction costs. If all nodes make individual best-response decisions, the system gets trapped in an inefficient situation of high tax level. Here we introduce a decentralized local-consensus selection mechanism which requires nodes to recommend their neighbors of highest local impact as candidate servers, and a node may become a server only if all its non-server neighbors give their assent. We demonstrate that although this mechanism involves only information exchange among neighboring nodes, it leads to socially efficient solutions with tax level approaching the lowest possible value. Our results may help in understanding and improving collective problem-solving in various networked social and robotic systems. PMID:27586793
Serving by local consensus in the public service location game
NASA Astrophysics Data System (ADS)
Sun, Yi-Fan; Zhou, Hai-Jun
2016-09-01
We discuss the issue of distributed and cooperative decision-making in a network game of public service location. Each node of the network can decide to host a certain public service incurring in a construction cost and serving all the neighboring nodes and itself. A pure consumer node has to pay a tax, and the collected tax is evenly distributed to all the hosting nodes to remedy their construction costs. If all nodes make individual best-response decisions, the system gets trapped in an inefficient situation of high tax level. Here we introduce a decentralized local-consensus selection mechanism which requires nodes to recommend their neighbors of highest local impact as candidate servers, and a node may become a server only if all its non-server neighbors give their assent. We demonstrate that although this mechanism involves only information exchange among neighboring nodes, it leads to socially efficient solutions with tax level approaching the lowest possible value. Our results may help in understanding and improving collective problem-solving in various networked social and robotic systems.
2012-09-01
Services FSD Federated Services Daemon I&A Identification and Authentication IKE Internet Key Exchange KPI Key Performance Indicator LAN Local Area...spection takes place in different processes in the server architecture. Key Performance Indica- tor ( KPI )s associated with the system need to be...application and risk analysis of security controls. Thus, measurement of the KPIs is needed before an informed tradeoff between the performance penalties
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-20
... be charged on a per-Login ID basis. Firms may access C2 via either a CMI Client Application [[Page..., using different Login IDs, accessing the same CMI Client Application Server or FIX Port, allowing the firm to only pay the monthly fee once. Alternatively, a firm may use the same Login ID to access...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-14
... charges to assess a fee for each CMI Login ID. Firms may access CBOEdirect via either a CMI Client... Login IDs, accessing the same CMI Client Application Server, allowing the firm to only pay the monthly fee once. Alternatively, a firm may use the same Login ID to access different CMI Client Application...
NASA Astrophysics Data System (ADS)
Hamuro, Yoshitomo
2017-05-01
Protein backbone amide hydrogen/deuterium exchange mass spectrometry (HDX-MS) typically utilizes enzymatic digestion after the exchange reaction and before MS analysis to improve data resolution. Gas-phase fragmentation of a peptic fragment prior to MS analysis is a promising technique to further increase the resolution. The biggest technical challenge for this method is elimination of intramolecular hydrogen/deuterium exchange (scrambling) in the gas phase. The scrambling obscures the location of deuterium. Jørgensen's group pioneered a method to minimize the scrambling in gas-phase electron capture/transfer dissociation. Despite active investigation, the mechanism of hydrogen scrambling is not well-understood. The difficulty stems from the fact that the degree of hydrogen scrambling depends on instruments, various parameters of mass analysis, and peptide analyzed. In most hydrogen scrambling investigations, the hydrogen scrambling is measured by the percentage of scrambling in a whole molecule. This paper demonstrates that the degree of intramolecular hydrogen/deuterium exchange depends on the nature of exchangeable hydrogen sites. The deuterium on Tyr amide of neurotensin (9-13), Arg-Pro-Tyr-Ile-Leu, migrated significantly faster than that on Ile or Leu amides, indicating the loss of deuterium from the original sites is not mere randomization of hydrogen and deuterium but more site-specific phenomena. This more precise approach may help understand the mechanism of intramolecular hydrogen exchange and provide higher confidence for the parameter optimization to eliminate intramolecular hydrogen/deuterium exchange during gas-phase fragmentation.
Online hydrogen/deuterium exchange performed in the ion mobility cell of a hybrid mass spectrometer.
Nagy, Kornél; Redeuil, Karine; Rezzi, Serge
2009-11-15
The present paper describes the performance of online, gas-phase hydrogen/deuterium exchange implemented in the ion mobility cell of a quadrupole time-of-flight mass spectrometer. Deuterium oxide and deuterated methanol were utilized to create deuterated vapor that is introduced into the ion mobility region of the mass spectrometer. Hydrogen/deuterium exchange occurs spontaneously in the millisecond time frame without the need to switch the instrument into ion mobility mode. The exchange was studied for low molecular weight molecules and proteins. The observed number of exchanged hydrogens was equal to the number of theoretically exchangeable hydrogens for all low molecular weight compounds. This method needs only minimal instrumental modifications, is simple, cheap, environmentally friendly, compatible with ultraperformance liquid chromatography, and can be implemented on commercially available instruments. It does not compromise the choice of liquid chromatographic solvents or accurate-mass or parallel-fragmentation (MS(E)) methods. The performance of this method was compared to that of conventional alternatives where the deuterated solvent is introduced into the cone gas of the instrument. Although the degree of exchange was similar between the two methods, the "cone gas method" requires 10 times higher deuterated solvent volumes (50 μL/min) and offers reduced sensitivity in the tandem mass spectrometry (MS/MS) mode. The presented method is suggested as a standard future element of mass spectrometers to aid online structural characterization of unknowns and to study conformational changes of proteins with hydrogen/deuterium exchange.
Hamuro, Yoshitomo
2017-05-01
Protein backbone amide hydrogen/deuterium exchange mass spectrometry (HDX-MS) typically utilizes enzymatic digestion after the exchange reaction and before MS analysis to improve data resolution. Gas-phase fragmentation of a peptic fragment prior to MS analysis is a promising technique to further increase the resolution. The biggest technical challenge for this method is elimination of intramolecular hydrogen/deuterium exchange (scrambling) in the gas phase. The scrambling obscures the location of deuterium. Jørgensen's group pioneered a method to minimize the scrambling in gas-phase electron capture/transfer dissociation. Despite active investigation, the mechanism of hydrogen scrambling is not well-understood. The difficulty stems from the fact that the degree of hydrogen scrambling depends on instruments, various parameters of mass analysis, and peptide analyzed. In most hydrogen scrambling investigations, the hydrogen scrambling is measured by the percentage of scrambling in a whole molecule. This paper demonstrates that the degree of intramolecular hydrogen/deuterium exchange depends on the nature of exchangeable hydrogen sites. The deuterium on Tyr amide of neurotensin (9-13), Arg-Pro-Tyr-Ile-Leu, migrated significantly faster than that on Ile or Leu amides, indicating the loss of deuterium from the original sites is not mere randomization of hydrogen and deuterium but more site-specific phenomena. This more precise approach may help understand the mechanism of intramolecular hydrogen exchange and provide higher confidence for the parameter optimization to eliminate intramolecular hydrogen/deuterium exchange during gas-phase fragmentation. Graphical Abstract ᅟ.
[Relevance of the hemovigilance regional database for the shared medical file identity server].
Doly, A; Fressy, P; Garraud, O
2008-11-01
The French Health Products Safety Agency coordinates the national initiative of computerization of blood products traceability within regional blood banks and public and private hospitals. The Auvergne-Loire Regional French Blood Service, based in Saint-Etienne, together with a number of public hospitals, set up a transfusion data network named EDITAL. After four years of progressive implementation and experimentation, software enabling standardized data exchange has built up a regional nominative database, endorsed by the Traceability Computerization National Committee in 2004. This database now provides secured web access to a regional transfusion history, enabling biologists and all hospital and family practitioners to take charge of patient follow-up. By running independently of its partners' software, the EDITAL database provides a reference for the regional identity server.
OneSAF as an In-Stride Mission Command Asset
2014-06-01
implementation approach. While DARPA began with a funded project to complete the capability as a “ big bang ” approach the approach here is based on reuse and...Command (MC), Modeling and Simulation (M&S), Distributed Interactive Simulation (DIS) ABSTRACT: To provide greater interoperability and integration...within Mission Command (MC) Systems the One Semi-Automated Forces (OneSAF) entity level simulation is evolving from a tightly coupled client server
Analysis of Protein Conformation and Dynamics by Hydrogen/Deuterium Exchange MS
Engen, John R.
2009-01-01
Synopsis: Recent technological advances in hydrogen exchange MS have led to improvements in the technique's ability to analyze the shape and movements of proteins. John Engen of Northeastern University gives a much-needed update on the field. The cover, created by Engen, shows proteins "swimming" in an H2O/D2O solution with a sample mass spectrum in the background. PMID:19788312
An object-oriented, technology-adaptive information model
NASA Technical Reports Server (NTRS)
Anyiwo, Joshua C.
1995-01-01
The primary objective was to develop a computer information system for effectively presenting NASA's technologies to American industries, for appropriate commercialization. To this end a comprehensive information management model, applicable to a wide variety of situations, and immune to computer software/hardware technological gyrations, was developed. The model consists of four main elements: a DATA_STORE, a data PRODUCER/UPDATER_CLIENT and a data PRESENTATION_CLIENT, anchored to a central object-oriented SERVER engine. This server engine facilitates exchanges among the other model elements and safeguards the integrity of the DATA_STORE element. It is designed to support new technologies, as they become available, such as Object Linking and Embedding (OLE), on-demand audio-video data streaming with compression (such as is required for video conferencing), Worldwide Web (WWW) and other information services and browsing, fax-back data requests, presentation of information on CD-ROM, and regular in-house database management, regardless of the data model in place. The four components of this information model interact through a system of intelligent message agents which are customized to specific information exchange needs. This model is at the leading edge of modern information management models. It is independent of technological changes and can be implemented in a variety of ways to meet the specific needs of any communications situation. This summer a partial implementation of the model has been achieved. The structure of the DATA_STORE has been fully specified and successfully tested using Microsoft's FoxPro 2.6 database management system. Data PRODUCER/UPDATER and PRESENTATION architectures have been developed and also successfully implemented in FoxPro; and work has started on a full implementation of the SERVER engine. The model has also been successfully applied to a CD-ROM presentation of NASA's technologies in support of Langley Research Center's TAG efforts.
Dynamic XML-based exchange of relational data: application to the Human Brain Project.
Tang, Zhengming; Kadiyska, Yana; Li, Hao; Suciu, Dan; Brinkley, James F
2003-01-01
This paper discusses an approach to exporting relational data in XML format for data exchange over the web. We describe the first real-world application of SilkRoute, a middleware program that dynamically converts existing relational data to a user-defined XML DTD. The application, called XBrain, wraps SilkRoute in a Java Server Pages framework, thus permitting a web-based XQuery interface to a legacy relational database. The application is demonstrated as a query interface to the University of Washington Brain Project's Language Map Experiment Management System, which is used to manage data about language organization in the brain.
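A compact illustration of the general idea of publishing relational rows as XML on demand is shown below, using only Python's standard library and an in-memory table; SilkRoute itself maps an existing schema to a user-defined DTD through declarative queries, which this sketch does not attempt, and the table and column names are invented.

```python
# Toy relational-to-XML export (illustrates the idea, not SilkRoute's query mapping).
import sqlite3
import xml.etree.ElementTree as ET

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE experiment (id INTEGER, subject TEXT, site TEXT)")
db.executemany("INSERT INTO experiment VALUES (?, ?, ?)",
               [(1, "P12", "left temporal"), (2, "P13", "frontal")])

def export_xml(connection) -> bytes:
    """Serialize all rows of the experiment table as an XML document."""
    root = ET.Element("experiments")
    for row_id, subject, site in connection.execute(
            "SELECT id, subject, site FROM experiment"):
        exp = ET.SubElement(root, "experiment", id=str(row_id))
        ET.SubElement(exp, "subject").text = subject
        ET.SubElement(exp, "site").text = site
    return ET.tostring(root, encoding="utf-8")

print(export_xml(db).decode("utf-8"))
```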
Forecasting Foreign Currency Exchange Rates for Air Force Budgeting
2015-03-26
FORECASTING FOREIGN CURRENCY EXCHANGE RATES FOR AIR FORCE BUDGETING THESIS MARCH 2015...States. AFIT-ENV-MS-15-M-178 FORECASTING FOREIGN CURRENCY EXCHANGE RATES FOR AIR FORCE BUDGETING THESIS Presented to the Faculty...FORECASTING FOREIGN CURRENCY EXCHANGE RATES FOR AIR FORCE BUDGETING Nicholas R. Gardner, BS Captain, USAF Committee Membership: Lt Col Jonathan
Tian, Tze-Feng; Wang, San-Yuan; Kuo, Tien-Chueh; Tan, Cheng-En; Chen, Guan-Yuan; Kuo, Ching-Hua; Chen, Chi-Hsin Sally; Chan, Chang-Chuan; Lin, Olivia A; Tseng, Y Jane
2016-11-01
Two-dimensional gas chromatography time-of-flight mass spectrometry (GC×GC/TOF-MS) is superior for chromatographic separation and provides great sensitivity for complex biological fluid analysis in metabolomics. However, GC×GC/TOF-MS data processing is currently limited to vendor software and typically requires several preprocessing steps. In this work, we implement a web-based platform, which we call GC2MS, to facilitate the application of recent advances in GC×GC/TOF-MS, especially for metabolomics studies. The core processing workflow of GC2MS consists of blob/peak detection, baseline correction, and blob alignment. GC2MS treats GC×GC/TOF-MS data as pictures and clusters the pixels as blobs according to the brightness of each pixel to generate a blob table. GC2MS then aligns the blobs of two GC×GC/TOF-MS data sets according to their distance and similarity. The blob distance and similarity are the Euclidean distance of the first and second retention times of two blobs and the Pearson's correlation coefficient of the two mass spectra, respectively. GC2MS also directly corrects the raw data baseline. The analytical performance of GC2MS was evaluated using GC×GC/TOF-MS data sets of Angelica sinensis compounds acquired under different experimental conditions and of human plasma samples. The results show that GC2MS is an easy-to-use tool for detecting peaks and correcting baselines, and GC2MS is able to align GC×GC/TOF-MS data sets acquired under different experimental conditions. GC2MS is freely accessible at http://gc2ms.web.cmdm.tw.
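The two quantities the alignment step relies on, the Euclidean distance between first and second retention times and the Pearson correlation between two mass spectra, can be sketched directly; the greedy pairing rule and the thresholds below are illustrative choices, not GC2MS internals.

```python
# Blob distance and spectral similarity as described above; the pairing rule
# and thresholds are illustrative choices, not GC2MS internals.
import math
from statistics import correlation  # Pearson correlation, Python 3.10+

def blob_distance(blob_a, blob_b):
    """Euclidean distance in the (first RT, second RT) plane."""
    return math.hypot(blob_a["rt1"] - blob_b["rt1"], blob_a["rt2"] - blob_b["rt2"])

def spectral_similarity(spec_a, spec_b):
    """Pearson correlation of two intensity vectors on a shared m/z grid."""
    return correlation(spec_a, spec_b)

def align(blobs_a, blobs_b, max_dist=5.0, min_corr=0.8):
    """Greedily pair each blob in A with its nearest acceptable blob in B."""
    pairs = []
    for a in blobs_a:
        best = min(blobs_b, key=lambda b: blob_distance(a, b))
        if (blob_distance(a, best) <= max_dist
                and spectral_similarity(a["spectrum"], best["spectrum"]) >= min_corr):
            pairs.append((a, best))
    return pairs
```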
2001-01-01
System (GCCS) Track Database Management System (TDBM) (3) GCCS Integrated Imagery and Intelligence (3) Intelligence Shared Data Server (ISDS) General ...The CTH is a powerful model that will allow more than just message systems to exchange information. It could be used for object-oriented databases, as...of the Naval Integrated Tactical Environmental System I (NITES I) is used as a case study to demonstrate the utility of this distributed component
Kan, Zhong-Yuan; Walters, Benjamin T.; Mayne, Leland; Englander, S. Walter
2013-01-01
Hydrogen exchange technology provides a uniquely powerful instrument for measuring protein structural and biophysical properties, quantitatively and in a nonperturbing way, and determining how these properties are implemented to produce protein function. A developing hydrogen exchange–mass spectrometry method (HX MS) is able to analyze large biologically important protein systems while requiring only minuscule amounts of experimental material. The major remaining deficiency of the HX MS method is the inability to deconvolve HX results to individual amino acid residue resolution. To pursue this goal we used an iterative optimization program (HDsite) that integrates recent progress in multiple peptide acquisition together with previously unexamined isotopic envelope-shape information and a site-resolved back-exchange correction. To test this approach, residue-resolved HX rates computed from HX MS data were compared with extensive HX NMR measurements, and analogous comparisons were made in simulation trials. These tests found excellent agreement and revealed the important computational determinants. PMID:24019478
NASA Astrophysics Data System (ADS)
Hamuro, Yoshitomo; E, Sook Yen
2018-05-01
The technological goal of hydrogen/deuterium exchange-mass spectrometry (HDX-MS) is to determine backbone amide hydrogen exchange rates. The most critical challenge to achieve this goal is obtaining the deuterium incorporation in single-amide resolution, and gas-phase fragmentation may provide a universal solution. The gas-phase fragmentation may generate the daughter ions which differ by a single amino acid and the difference in deuterium incorporations in the two analogous ions can yield the deuterium incorporation at the sub-localized site. Following the pioneering works by Jørgensen and Rand, several papers utilized the electron transfer dissociation (ETD) to determine the location of deuterium in single-amide resolution. This paper demonstrates further advancement of the strategy by determining backbone amide hydrogen exchange rates, instead of just determining deuterium incorporation at a single time point, in combination with a wide time window monitoring. A method to evaluate the effects of scrambling and to determine the exchange rates from partially scrambled HDX-ETD-MS data is described. All parent ions for ETD fragmentation were regio-selectively scrambled: The deuterium in some regions of a peptide ion was scrambled while that in the other regions was not scrambled. The method determined 31 backbone amide hydrogen exchange rates of cytochrome c in the non-scrambled regions. Good fragmentation of a parent ion, a low degree of scrambling, and a low number of exchangeable hydrogens in the preceding side chain are the important factors to determine the exchange rate. The exchange rates determined by the HDX-MS are in good agreement with those determined by NMR. [Figure not available: see fulltext.
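The rate determination described here ultimately reduces to fitting single-amide deuterium uptake over a wide time window; a generic single-exponential fit of that kind, under the usual EX2-type assumption, might look like the following, where the time points and uptake values are made up for illustration.

```python
# Generic single-amide uptake fit D(t) = 1 - exp(-k t) under an EX2-type
# assumption; the time points and uptake values below are made up.
import numpy as np
from scipy.optimize import curve_fit

def uptake(t, k):
    return 1.0 - np.exp(-k * t)

times = np.array([10.0, 30.0, 100.0, 300.0, 1000.0, 3000.0])   # seconds
measured = np.array([0.05, 0.14, 0.38, 0.76, 0.97, 1.00])      # fractional deuterium

(k_fit,), cov = curve_fit(uptake, times, measured, p0=[1e-3])
print(f"fitted exchange rate: {k_fit:.2e} s^-1")
```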
Hamuro, Yoshitomo; E, Sook Yen
2018-05-01
The technological goal of hydrogen/deuterium exchange-mass spectrometry (HDX-MS) is to determine backbone amide hydrogen exchange rates. The most critical challenge to achieve this goal is obtaining the deuterium incorporation in single-amide resolution, and gas-phase fragmentation may provide a universal solution. The gas-phase fragmentation may generate the daughter ions which differ by a single amino acid and the difference in deuterium incorporations in the two analogous ions can yield the deuterium incorporation at the sub-localized site. Following the pioneering works by Jørgensen and Rand, several papers utilized the electron transfer dissociation (ETD) to determine the location of deuterium in single-amide resolution. This paper demonstrates further advancement of the strategy by determining backbone amide hydrogen exchange rates, instead of just determining deuterium incorporation at a single time point, in combination with a wide time window monitoring. A method to evaluate the effects of scrambling and to determine the exchange rates from partially scrambled HDX-ETD-MS data is described. All parent ions for ETD fragmentation were regio-selectively scrambled: The deuterium in some regions of a peptide ion was scrambled while that in the other regions was not scrambled. The method determined 31 backbone amide hydrogen exchange rates of cytochrome c in the non-scrambled regions. Good fragmentation of a parent ion, a low degree of scrambling, and a low number of exchangeable hydrogens in the preceding side chain are the important factors to determine the exchange rate. The exchange rates determined by the HDX-MS are in good agreement with those determined by NMR. Graphical Abstract ᅟ.
NASA Astrophysics Data System (ADS)
Hamuro, Yoshitomo; E, Sook Yen
2018-03-01
The technological goal of hydrogen/deuterium exchange-mass spectrometry (HDX-MS) is to determine backbone amide hydrogen exchange rates. The most critical challenge to achieve this goal is obtaining the deuterium incorporation in single-amide resolution, and gas-phase fragmentation may provide a universal solution. The gas-phase fragmentation may generate the daughter ions which differ by a single amino acid and the difference in deuterium incorporations in the two analogous ions can yield the deuterium incorporation at the sub-localized site. Following the pioneering works by Jørgensen and Rand, several papers utilized the electron transfer dissociation (ETD) to determine the location of deuterium in single-amide resolution. This paper demonstrates further advancement of the strategy by determining backbone amide hydrogen exchange rates, instead of just determining deuterium incorporation at a single time point, in combination with a wide time window monitoring. A method to evaluate the effects of scrambling and to determine the exchange rates from partially scrambled HDX-ETD-MS data is described. All parent ions for ETD fragmentation were regio-selectively scrambled: The deuterium in some regions of a peptide ion was scrambled while that in the other regions was not scrambled. The method determined 31 backbone amide hydrogen exchange rates of cytochrome c in the non-scrambled regions. Good fragmentation of a parent ion, a low degree of scrambling, and a low number of exchangeable hydrogens in the preceding side chain are the important factors to determine the exchange rate. The exchange rates determined by the HDX-MS are in good agreement with those determined by NMR. [Figure not available: see fulltext.
2018 NDIA 33rd Annual National Test and Evaluation Conference
2018-05-17
Breach IOC Delayed RDT&E Overrun MS B IOC First Flight CDR Wind Tunnel Campaign Flight Test Campaign $ Peak Burn Rate Occurs Around FF Wind Tunnel...Connectivity Team – Tier 2 network support, network characterization and analysis, walk-the- wire trouble resolution, assistance with new site Connection...File Transfer Protocol (SFTP) Server. The Test and Training Enabling Architecture (TENA) is used for over the wire simulation protocol via the DISGW
Mass spectrometry combinations for structural characterization of sulfated-steroid metabolites.
Yan, Yuetian; Rempel, Don L; Holy, Timothy E; Gross, Michael L
2014-05-01
Steroid conjugates, which often occur as metabolites, are challenging to characterize. One application is female-mouse urine, where steroid conjugates serve as important ligands for the pheromone-sensing neurons. Although the two with the highest abundance in mouse urine were previously characterized with mass spectrometry (MS) and NMR to be sulfated steroids, many more exist but remain structurally unresolved. Given that their physical and chemical properties are similar, they are likely to have a sulfated steroid ring structure. Because these compounds occur in trace amounts in mouse urine and elsewhere, their characterization by NMR will be difficult. Thus, MS methods become the primary approach for determining structure. Here, we show that a combination of MS tools is effective for determining the structures of sulfated steroids. Using 4-pregnene analogs, we explored high-resolving power MS (HR-MS) to determine chemical formulae; HD exchange MS (HDX-MS) to determine number of active, exchangeable hydrogens (e.g., OH groups); methoxyamine hydrochloride (MOX) derivatization MS, or reactive desorption electrospray ionization with hydroxylamine to determine the number of carbonyl groups; and tandem MS (MS(n)), high-resolution tandem MS (HRMS/MS), and GC-MS to obtain structural details of the steroid ring. From the fragmentation studies, we deduced three major fragmentation rules for this class of sulfated steroids. We also show that a combined MS approach is effective for determining structure of steroid metabolites, with important implications for targeted metabolomics in general and for the study of mouse social communication in particular.
Robinson, Judas; de Lusignan, Simon; Kostkova, Patty; Madge, Bruce; Marsh, A; Biniaris, C
2006-01-01
Rich Site Summary (RSS) feeds are a method for disseminating and syndicating the contents of a website using extensible mark-up language (XML). The Primary Care Electronic Library (PCEL) distributes recent additions to the site in the form of an RSS feed. When new resources are added to PCEL, they are manually assigned medical subject headings (MeSH terms), which are then automatically mapped to SNOMED-CT terms using the Unified Medical Language System (UMLS) Metathesaurus. The library is thus searchable using MeSH or SNOMED-CT. Our syndicate partner wished to have remote access to PCEL coronary heart disease (CHD) information resources based on SNOMED-CT search terms. To pilot the supply of relevant information resources in response to clinically coded requests, using RSS syndication for transmission between web servers. Our syndicate partner provided a list of CHD SNOMED-CT terms to its end-users, a list which was coded according to UMLS specifications. When the end-user requested relevant information resources, this request was relayed from our syndicate partner's web server to the PCEL web server. The relevant resources were retrieved from the PCEL MySQL database. This database is accessed using a server side scripting language (PHP), which enables the production of dynamic RSS feeds on the basis of Source Asserted Identifiers (CODEs) contained in UMLS. Retrieving resources using SNOMED-CT terms using syndication can be used to build a functioning application. The process from request to display of syndicated resources took less than one second. The results of the pilot illustrate that it is possible to exchange data between servers using RSS syndication. This method could be utilised dynamically to supply digital library resources to a clinical system with SNOMED-CT data used as the standard of reference.
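A bare-bones sketch of assembling an RSS 2.0 feed from resources matched to a clinical code, using only the standard library, is shown below; the table layout and example code value are illustrative, not the PCEL schema.

```python
# Bare-bones RSS 2.0 feed built from resources tagged with a clinical code
# (table layout and example code value are illustrative, not the PCEL schema).
import sqlite3
import xml.etree.ElementTree as ET

def build_feed(connection, concept_code: str) -> bytes:
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = f"Resources for {concept_code}"
    for title, link in connection.execute(
            "SELECT title, url FROM resources WHERE snomed_code = ?", (concept_code,)):
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = title
        ET.SubElement(item, "link").text = link
    return ET.tostring(rss, encoding="utf-8")

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE resources (title TEXT, url TEXT, snomed_code TEXT)")
db.execute("INSERT INTO resources VALUES ('CHD guideline', 'http://example.org/chd', '53741008')")
print(build_feed(db, "53741008").decode("utf-8"))
```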
A concept to standardize raw biosignal transmission for brain-computer interfaces.
Breitwieser, Christian; Neuper, Christa; Müller-Putz, Gernot R
2011-01-01
With this concept we introduce an attempt at a standardized interface, called TiA, to transmit raw biosignals. TiA is able to deal with multirate and block-oriented data transmission. Data is distinguished by different signal types (e.g., EEG, EOG, NIRS, …), whereby those signals can be acquired at the same time from different acquisition devices. TiA is built as a client-server model. Multiple clients can connect to one server. Information is exchanged via a control connection and a separate data connection. Control commands and meta information are transmitted over the control connection. Raw biosignal data is delivered over the data connection in a unidirectional way. For this purpose, a standardized handshaking protocol and raw data packet have been developed. Thus, an abstraction layer between hardware devices and data processing was developed, facilitating standardization.
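A schematic sketch of the kind of separation TiA describes, with meta information exchanged once over a control channel and fixed-layout raw data packets on a one-way data channel, is given below; the field layout is an assumption for illustration, not the published TiA packet format.

```python
# Schematic control/data separation in the spirit of TiA; the packet layout
# below is an assumption, not the published TiA format.
import json
import struct
from dataclasses import dataclass
from typing import List

@dataclass
class SignalInfo:
    signal_type: str      # e.g. "EEG", "EOG", "NIRS"
    channels: int
    sampling_rate_hz: float

def control_handshake(signals: List[SignalInfo]) -> bytes:
    """Meta information exchanged once over the control connection."""
    meta = {"version": "draft", "signals": [vars(s) for s in signals]}
    return json.dumps(meta).encode("utf-8")

def data_packet(packet_number: int, samples: List[float]) -> bytes:
    """One block of raw samples sent unidirectionally over the data connection."""
    header = struct.pack("!IH", packet_number, len(samples))
    return header + struct.pack(f"!{len(samples)}f", *samples)

print(control_handshake([SignalInfo("EEG", 32, 512.0)]))
print(len(data_packet(1, [0.1, 0.2, 0.3])))  # 6-byte header + 3 floats
```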
Mass Spectrometry Combinations for Structural Characterization of Sulfated-Steroid Metabolites
NASA Astrophysics Data System (ADS)
Yan, Yuetian; Rempel, Don L.; Holy, Timothy E.; Gross, Michael L.
2014-05-01
Steroid conjugates, which often occur as metabolites, are challenging to characterize. One application is female-mouse urine, where steroid conjugates serve as important ligands for the pheromone-sensing neurons. Although the two with the highest abundance in mouse urine were previously characterized with mass spectrometry (MS) and NMR to be sulfated steroids, many more exist but remain structurally unresolved. Given that their physical and chemical properties are similar, they are likely to have a sulfated steroid ring structure. Because these compounds occur in trace amounts in mouse urine and elsewhere, their characterization by NMR will be difficult. Thus, MS methods become the primary approach for determining structure. Here, we show that a combination of MS tools is effective for determining the structures of sulfated steroids. Using 4-pregnene analogs, we explored high-resolving power MS (HR-MS) to determine chemical formulae; HD exchange MS (HDX-MS) to determine number of active, exchangeable hydrogens (e.g., OH groups); methoxyamine hydrochloride (MOX) derivatization MS, or reactive desorption electrospray ionization with hydroxylamine to determine the number of carbonyl groups; and tandem MS (MSn), high-resolution tandem MS (HRMS/MS), and GC-MS to obtain structural details of the steroid ring. From the fragmentation studies, we deduced three major fragmentation rules for this class of sulfated steroids. We also show that a combined MS approach is effective for determining structure of steroid metabolites, with important implications for targeted metabolomics in general and for the study of mouse social communication in particular.
Mass spectrometry combinations for structural characterization of sulfated-steroid metabolites
Yan, Yuetian; Rempel, Don; Holy, Timothy E.; Gross, Michael L.
2015-01-01
Steroid conjugates, which often occur as metabolites, are challenging to characterize. One application is female-mouse urine, where steroid conjugates serve as important ligands for the pheromone-sensing neurons. Although the two with the highest abundance in mouse urine were previously characterized with mass spectrometry (MS) and NMR to be sulfated steroids, many more exist but remain structurally unresolved. Given that their physical and chemical properties are similar, they are likely to have a sulfated steroid ring structure. Because these compounds occur in trace amounts in mouse urine and elsewhere, their characterization by NMR will be difficult. Thus, MS methods become the primary approach for determining structure. Here, we show that a combination of MS tools is effective for determining the structures of sulfated steroids. Using 4-pregnene analogs, we explored high-resolving power MS (HR-MS) to determine chemical formulae; HD exchange MS (HDX-MS) to determine number of active, exchangeable hydrogens (e.g., OH groups); methoxyamine hydrochloride (MOX) derivatization MS, or reactive desorption electrospray ionization with hydroxylamine to determine the number of carbonyl groups; and tandem MS (MSn), high-resolution tandem MS (HRMS/MS), and GC-MS to obtain structural details of the steroid ring. From the fragmentation studies, we deduced three major fragmentation rules for this class of sulfated steroids. We also show that a combined MS approach is effective for determining structure of steroid metabolites, with important implications for targeted metabolomics in general and for the study of mouse social communication in particular. PMID:24658800
Ajax Architecture Implementation Techniques
NASA Astrophysics Data System (ADS)
Hussaini, Syed Asadullah; Tabassum, S. Nasira; Baig, Tabassum, M. Khader
2012-03-01
Today's rich Web applications use a mix of Java Script and asynchronous communication with the application server. This mechanism is also known as Ajax: Asynchronous JavaScript and XML. The intent of Ajax is to exchange small pieces of data between the browser and the application server, and in doing so, use partial page refresh instead of reloading the entire Web page. AJAX (Asynchronous JavaScript and XML) is a powerful Web development model for browser-based Web applications. Technologies that form the AJAX model, such as XML, JavaScript, HTTP, and XHTML, are individually widely used and well known. However, AJAX combines these technologies to let Web pages retrieve small amounts of data from the server without having to reload the entire page. This capability makes Web pages more interactive and lets them behave like local applications. Web 2.0 enabled by the Ajax architecture has given rise to a new level of user interactivity through web browsers. Many new and extremely popular Web applications have been introduced such as Google Maps, Google Docs, Flickr, and so on. Ajax Toolkits such as Dojo allow web developers to build Web 2.0 applications quickly and with little effort.
2008-07-01
also a large Internet service provider and an operator of two of the 13 root zone servers that provide the basic information for locating Internet ...routing and address information to assure continued connectivity and speed? In addition, exchange point technology needs to be improved and there are...alternative technology will come along that will make the Internet outmoded in the same way the Internet has begun to make the Public Switched Telephone
Park, In-Hee; Venable, John D; Steckler, Caitlin; Cellitti, Susan E; Lesley, Scott A; Spraggon, Glen; Brock, Ansgar
2015-09-28
Hydrogen exchange (HX) studies have provided critical insight into our understanding of protein folding, structure, and dynamics. More recently, hydrogen exchange mass spectrometry (HX-MS) has become a widely applicable tool for HX studies. The interpretation of the wealth of data generated by HX-MS experiments as well as other HX methods would greatly benefit from the availability of exchange predictions derived from structures or models for comparison with experiment. Most reported computational HX modeling studies have employed solvent-accessible-surface-area based metrics in attempts to interpret HX data on the basis of structures or models. In this study, a computational HX-MS prediction method based on classification of the amide hydrogen bonding modes mimicking the local unfolding model is demonstrated. Analysis of the NH bonding configurations from molecular dynamics (MD) simulation snapshots is used to determine partitioning over bonded and nonbonded NH states and is directly mapped into a protection factor (PF) using a logistics growth function. Predicted PFs are then used for calculating deuteration values of peptides and compared with experimental data. Hydrogen exchange MS data for fatty acid synthase thioesterase (FAS-TE) collected for a range of pHs and temperatures was used for detailed evaluation of the approach. High correlation between prediction and experiment for observable fragment peptides is observed in the FAS-TE and additional benchmarking systems that included various apo/holo proteins for which literature data were available. In addition, it is shown that HX modeling can improve experimental resolution through decomposition of in-exchange curves into rate classes, which correlate with prediction from MD. Successful rate class decompositions provide further evidence that the presented approach captures the underlying physical processes correctly at the single residue level. This assessment is further strengthened in a comparison of residue resolved protection factor predictions for staphylococcal nuclease with NMR data, which was also used to compare prediction performance with other algorithms described in the literature. The demonstrated transferable and scalable MD based HX prediction approach adds significantly to the available tools for HX-MS data interpretation based on available structures and models.
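A minimal sketch of the prediction step described above, assuming the per-residue hydrogen-bond occupancies have already been extracted from MD snapshots. The logistic mapping parameters, function names, and the EX2-style uptake calculation are illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np

def protection_factor(bonded_fraction, steepness=10.0, midpoint=0.5, log_pf_max=6.0):
    """Map the fraction of MD snapshots in which an amide NH is hydrogen-bonded
    to a protection factor through a logistic curve acting on log10(PF).
    Steepness, midpoint and maximum are hypothetical values."""
    x = np.asarray(bonded_fraction, dtype=float)
    return 10.0 ** (log_pf_max / (1.0 + np.exp(-steepness * (x - midpoint))))

def peptide_deuteration(bonded_fractions, k_intrinsic, t_label):
    """Predicted deuterium uptake of a peptide after t_label seconds of labeling,
    summing per-residue uptake with observed rate k_int / PF (EX2 limit)."""
    pf = protection_factor(bonded_fractions)
    k_obs = np.asarray(k_intrinsic, dtype=float) / pf
    return float(np.sum(1.0 - np.exp(-k_obs * t_label)))

# Toy 5-residue peptide: H-bond occupancies from MD, intrinsic rates in 1/s, 60 s labeling
print(peptide_deuteration([0.1, 0.9, 0.8, 0.3, 0.6], [1.2, 0.8, 2.5, 0.4, 1.0], 60.0))
```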
Park, In-Hee; Venable, John D.; Steckler, Caitlin; Cellitti, Susan E.; Lesley, Scott A.; Spraggon, Glen; Brock, Ansgar
2015-01-01
Hydrogen exchange (HX) studies have provided critical insight into our understanding of protein folding, structure and dynamics. More recently, Hydrogen Exchange Mass Spectrometry (HX-MS) has become a widely applicable tool for HX studies. The interpretation of the wealth of data generated by HX-MS experiments as well as other HX methods would greatly benefit from the availability of exchange predictions derived from structures or models for comparison with experiment. Most reported computational HX modeling studies have employed solvent-accessible-surface-area based metrics in attempts to interpret HX data on the basis of structures or models. In this study, a computational HX-MS prediction method based on classification of the amide hydrogen bonding modes mimicking the local unfolding model is demonstrated. Analysis of the NH bonding configurations from Molecular Dynamics (MD) simulation snapshots is used to determine partitioning over bonded and non-bonded NH states and is directly mapped into a protection factor (PF) using a logistics growth function. Predicted PFs are then used for calculating deuteration values of peptides and compared with experimental data. Hydrogen exchange MS data for Fatty acid synthase thioesterase (FAS-TE) collected for a range of pHs and temperatures was used for detailed evaluation of the approach. High correlation between prediction and experiment for observable fragment peptides is observed in the FAS-TE and additional benchmarking systems that included various apo/holo proteins for which literature data were available. In addition, it is shown that HX modeling can improve experimental resolution through decomposition of in-exchange curves into rate classes, which correlate with prediction from MD. Successful rate class decompositions provide further evidence that the presented approach captures the underlying physical processes correctly at the single residue level. This assessment is further strengthened in a comparison of residue resolved protection factor predictions for staphylococcal nuclease with NMR data, which was also used to compare prediction performance with other algorithms described in the literature. The demonstrated transferable and scalable MD based HX prediction approach adds significantly to the available tools for HX-MS data interpretation based on available structures and models. PMID:26241692
GLobal Integrated Design Environment
NASA Technical Reports Server (NTRS)
Kunkel, Matthew; McGuire, Melissa; Smith, David A.; Gefert, Leon P.
2011-01-01
The GLobal Integrated Design Environment (GLIDE) is a collaborative engineering application built to resolve the design session issues of real-time passing of data between multiple discipline experts in a collaborative environment. Utilizing Web protocols and multiple programming languages, GLIDE allows engineers to use the applications to which they are accustomed (in this case, Excel) to send and receive datasets via the Internet to a database-driven Web server. Traditionally, a collaborative design session consists of one or more engineers representing each discipline meeting together in a single location. The discipline leads exchange parameters and iterate through their respective processes to converge on an acceptable dataset. In cases in which the engineers are unable to meet, their parameters are passed via e-mail, telephone, facsimile, or even postal mail. This slow process of data exchange could stretch a design session to weeks or even months. While the iterative process remains in place, software can now exchange parameters securely and efficiently, while at the same time allowing for much more information about a design session to be made available. GLIDE is written in a combination of several programming languages, including REALbasic, PHP, and Microsoft Visual Basic. GLIDE client installers are available to download for both Microsoft Windows and Macintosh systems. The GLIDE client software is compatible with Microsoft Excel 2000 or later on Windows systems, and with Microsoft Excel X or later on Macintosh systems. GLIDE follows the Client-Server paradigm, transferring encrypted and compressed data via standard Web protocols. Currently, the engineers use Excel as a front end to the GLIDE Client, as many of their custom tools run in Excel.
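A minimal sketch of the client-server parameter exchange that GLIDE automates, assuming a hypothetical REST-style endpoint and payload. The real GLIDE client is an Excel front end written in REALbasic/Visual Basic, so this Python sketch only illustrates the push/pull pattern, not the actual interface.

```python
import requests

SERVER = "https://glide.example.org/api"   # hypothetical endpoint, not the real GLIDE server

def push_parameters(session_id, discipline, values):
    """Publish this discipline's current outputs to the shared design session."""
    r = requests.post(f"{SERVER}/sessions/{session_id}/parameters",
                      json={"discipline": discipline, "values": values}, timeout=10)
    r.raise_for_status()

def pull_parameters(session_id):
    """Fetch the latest parameter set published by all disciplines."""
    r = requests.get(f"{SERVER}/sessions/{session_id}/parameters", timeout=10)
    r.raise_for_status()
    return r.json()

# Usage against a real deployment:
# push_parameters("demo-session", "propulsion", {"thrust_N": 4500.0, "isp_s": 320.0})
# print(pull_parameters("demo-session"))
```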
SimExTargId: A comprehensive package for real-time LC-MS data acquisition and analysis.
Edmands, William M B; Hayes, Josie; Rappaport, Stephen M
2018-05-22
Liquid chromatography mass spectrometry (LC-MS) is the favored method for untargeted metabolomic analysis of small molecules in biofluids. Here we present SimExTargId, an open-source R package for autonomous analysis of metabolomic data and real-time observation of experimental runs. This simultaneous, fully automated and multi-threaded (optional) package is a wrapper for vendor-independent format conversion (ProteoWizard), xcms- and CAMERA- based peak-picking, MetMSLine-based pre-processing and covariate-based statistical analysis. Users are notified of detrimental instrument drift or errors by email. Also included are two shiny applications, targetId for real-time MS2 target identification, and peakMonitor to monitor targeted metabolites. SimExTargId is publicly available under GNU LGPL v3.0 license at https://github.com/JosieLHayes/simExTargId, which includes a vignette with example data. SimExTargId should be installed on a dedicated data-processing workstation or server that is networked to the LC-MS platform to facilitate MS1 profiling of metabolomic data. josie.hayes@berkeley.edu. Supplementary data are available at Bioinformatics online.
Video streaming technologies using ActiveX and LabVIEW
NASA Astrophysics Data System (ADS)
Panoiu, M.; Rat, C. L.; Panoiu, C.
2015-06-01
The goal of this paper is to present the possibilities of remote image processing through data exchange between two programming technologies: LabVIEW and ActiveX. ActiveX refers to the process of controlling one program from another via ActiveX component; where one program acts as the client and the other as the server. LabVIEW can be either client or server. Both programs (client and server) exist independent of each other but are able to share information. The client communicates with the ActiveX objects that the server opens to allow the sharing of information [7]. In the case of video streaming [1] [2], most ActiveX controls can only display the data, being incapable of transforming it into a data type that LabVIEW can process. This becomes problematic when the system is used for remote image processing. The LabVIEW environment itself provides little if any possibilities for video streaming, and the methods it does offer are usually not high performance, but it possesses high performance toolkits and modules specialized in image processing, making it ideal for processing the captured data. Therefore, we chose to use existing software, specialized in video streaming along with LabVIEW and to capture the data provided by them, for further use, within LabVIEW. The software we studied (the ActiveX controls of a series of media players that utilize streaming technology) provide high quality data and a very small transmission delay, ensuring the reliability of the results of the image processing.
HDX Workbench: Software for the Analysis of H/D Exchange MS Data
NASA Astrophysics Data System (ADS)
Pascal, Bruce D.; Willis, Scooter; Lauer, Janelle L.; Landgraf, Rachelle R.; West, Graham M.; Marciano, David; Novick, Scott; Goswami, Devrishi; Chalmers, Michael J.; Griffin, Patrick R.
2012-09-01
Hydrogen/deuterium exchange mass spectrometry (HDX-MS) is an established method for the interrogation of protein conformation and dynamics. While the data analysis challenge of HDX-MS has been addressed by a number of software packages, new computational tools are needed to keep pace with the improved methods and throughput of this technique. To address these needs, we report an integrated desktop program titled HDX Workbench, which facilitates automation, management, visualization, and statistical cross-comparison of large HDX data sets. Using the software, validated data analysis can be achieved at the rate of generation. The application is available at the project home page http://hdx.florida.scripps.edu.
Quantifying Uncertainty in Expert Judgment: Initial Results
2013-03-01
lines of source code were added in . ... C++ = 32%; JavaScript = 29%; XML = 15%; C = 7%; CSS = 7%; Java = 5%; Other = 5%; LOC = 927,266...much total effort in person years has been spent on this project? ... MySQL, the most popular Open Source SQL...as MySQL, Oracle, PostgreSQL, MS SQL Server, ODBC, or Interbase. Features include email reminders, iCal/vCal import/export, remote subscriptions
Hoekman, Berend; Breitling, Rainer; Suits, Frank; Bischoff, Rainer; Horvatovich, Peter
2012-01-01
Data processing forms an integral part of biomarker discovery and contributes significantly to the ultimate result. To compare and evaluate various publicly available open source label-free data processing workflows, we developed msCompare, a modular framework that allows the arbitrary combination of different feature detection/quantification and alignment/matching algorithms in conjunction with a novel scoring method to evaluate their overall performance. We used msCompare to assess the performance of workflows built from modules of publicly available data processing packages such as SuperHirn, OpenMS, and MZmine and our in-house developed modules on peptide-spiked urine and trypsin-digested cerebrospinal fluid (CSF) samples. We found that the quality of results varied greatly among workflows, and interestingly, heterogeneous combinations of algorithms often performed better than the homogenous workflows. Our scoring method showed that the union of feature matrices of different workflows outperformed the original homogenous workflows in some cases. msCompare is open source software (https://trac.nbic.nl/mscompare), and we provide a web-based data processing service for our framework by integration into the Galaxy server of the Netherlands Bioinformatics Center (http://galaxy.nbic.nl/galaxy) to allow scientists to determine which combination of modules provides the most accurate processing for their particular LC-MS data sets. PMID:22318370
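A rough sketch of the "union of feature matrices" idea, assuming features from two workflows are matched by simple m/z and retention-time tolerances. The column names and tolerances are hypothetical, and msCompare's actual matching and scoring are more involved.

```python
import pandas as pd

def union_features(a, b, mz_tol=0.01, rt_tol=30.0):
    """Union of two feature matrices (columns: mz, rt, intensity): keep all
    features of workflow `a`, then append features of workflow `b` with no
    match in `a` within the m/z and retention-time tolerances."""
    extra = [feat for _, feat in b.iterrows()
             if not (((a["mz"] - feat["mz"]).abs() < mz_tol) &
                     ((a["rt"] - feat["rt"]).abs() < rt_tol)).any()]
    return pd.concat([a, pd.DataFrame(extra)], ignore_index=True)

wf1 = pd.DataFrame({"mz": [500.25, 612.30], "rt": [310.0, 450.0], "intensity": [1e5, 3e4]})
wf2 = pd.DataFrame({"mz": [500.255, 731.40], "rt": [312.0, 520.0], "intensity": [9e4, 5e4]})
print(union_features(wf1, wf2))   # keeps both wf1 features plus the unmatched 731.40 feature
```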
Hydrogen/deuterium exchange studies of native rabbit MM-CK dynamics.
Mazon, Hortense; Marcillat, Olivier; Forest, Eric; Vial, Christian
2004-02-01
Creatine kinase (CK) isoenzymes catalyse the reversible transfer of a phosphoryl group from ATP onto creatine. This reaction plays a very important role in the regulation of intracellular ATP concentrations in excitable tissues. CK isoenzymes are highly resistant to proteases in native conditions. To appreciate localized backbone dynamics, kinetics of amide hydrogen exchange with deuterium was measured by pulse-labeling the dimeric cytosolic muscle CK isoenzyme. Upon exchange, the protein was digested with pepsin, and the deuterium content of the resulting peptides was determined by liquid chromatography coupled to mass spectrometry (MS). The deuteration kinetics of 47 peptides identified by MS/MS and covering 96% of the CK backbone were analyzed. Four deuteration patterns have been recognized: The less deuterated peptides are located in the saddle-shaped core of CK, whereas most of the highly deuterated peptides are close to the surface and located around the entrance to the active site. Their exchange kinetics are discussed by comparison with the known secondary and tertiary structures of CK with the goal to reveal the conformational dynamics of the protein. Some of the observed dynamic motions may be linked to the conformational changes associated with substrate binding and catalytic mechanism.
Wrapping SRS with CORBA: from textual data to distributed objects.
Coupaye, T
1999-04-01
Biological data come in very different shapes. Databanks are maintained and used by distinct organizations. Text is the de facto standard exchange format. The SRS system can integrate heterogeneous textual databanks but it lacked a way to structure the extracted data. This paper presents a CORBA interface to the SRS system which manages databanks in a flat file format. SRS Object Servers are CORBA wrappers for SRS. They allow client applications (visualisation tools, data mining tools, etc.) to access and query SRS servers remotely through an Object Request Broker (ORB). They provide loader objects that contain the information extracted from the databanks by SRS. Loader objects are not hard-coded but generated in a flexible way by using loader specifications which allow SRS administrators to package data coming from distinct databanks. The prototype may be available for beta-testing. Please contact the SRS group (http://srs.ebi.ac.uk).
Zhang, Zhongqi; Zhang, Aming; Xiao, Gang
2012-06-05
Protein hydrogen/deuterium exchange (HDX) followed by protease digestion and mass spectrometric (MS) analysis is accepted as a standard method for studying protein conformation and conformational dynamics. In this article, an improved HDX MS platform with fully automated data processing is described. The platform significantly reduces systematic and random errors in the measurement by introducing two types of corrections in HDX data analysis. First, a mixture of short peptides with fast HDX rates is introduced as internal standards to adjust the variations in the extent of back exchange from run to run. Second, a designed unique peptide (PPPI) with slow intrinsic HDX rate is employed as another internal standard to reflect the possible differences in protein intrinsic HDX rates when protein conformations at different solution conditions are compared. HDX data processing is achieved with a comprehensive HDX model to simulate the deuterium labeling and back exchange process. The HDX model is implemented into the in-house developed software MassAnalyzer and enables fully unattended analysis of the entire protein HDX MS data set starting from ion detection and peptide identification to final processed HDX output, typically within 1 day. The final output of the automated data processing is a set (or the average) of the most possible protection factors for each backbone amide hydrogen. The utility of the HDX MS platform is demonstrated by exploring the conformational transition of a monoclonal antibody by increasing concentrations of guanidine.
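A simplified sketch of the first correction described above, assuming the run-to-run back-exchange level is estimated from fully deuterated, fast-exchanging internal-standard peptides. The function and the numbers are illustrative assumptions, not the MassAnalyzer implementation.

```python
def adjust_for_back_exchange(observed_d, standard_observed_d, standard_full_d):
    """Scale a peptide's observed deuterium content by the recovery measured on
    fast-exchanging, fully deuterated internal-standard peptides co-injected in
    the same run. A simplified form of the run-to-run adjustment; values hypothetical."""
    recovery = standard_observed_d / standard_full_d   # e.g. 0.72 means 28% back exchange
    return observed_d / recovery

# Internal standard carries 4.0 exchangeable deuteriums; 2.9 survived this run's workup
print(adjust_for_back_exchange(observed_d=3.1, standard_observed_d=2.9, standard_full_d=4.0))
```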
Shavazi, Masoumeh Abbasi; Morowatisharifabad, Mohammad Ali; Shavazi, Mohammad Taghi Abbasi; Mirzaei, Masoud; Ardekani, Ali Mellat
2016-07-01
Currently with the emergence of the Internet, patients have an opportunity to exchange social support online. However, little attention has been devoted to different dimensions of online social support exchanged in virtual support communities for patients with multiple sclerosis (MS). To provide a rich insight, the aim of this qualitative study was to explore and categorize different dimensions of online social support in messages exchanged in a virtual support community for patients with MS. A total of 548 posted messages created during one year period were selected using purposive sampling to consider the maximum variation sampling. Prior-research-driven thematic analysis was then conducted. In this regard, we used the Cutruna and Suhr's coding system. The messages that could not be categorized with the used coding system were thematically analyzed to explore new additional social support themes. The results showed that various forms of social support including informational, emotional, network, esteem and tangible support were exchanged. Moreover, new additional social support themes including sharing personal experiences, sharing coping strategies and spiritual support emerged in this virtual support community. The wide range of online social support exchanged in the virtual support community can be regarded as a supplementary source of social support for patients with MS. Future researches can examine online social support more comprehensively considering additional social support themes emerging in the present study.
Bouyssié, David; Dubois, Marc; Nasso, Sara; Gonzalez de Peredo, Anne; Burlet-Schiltz, Odile; Aebersold, Ruedi; Monsarrat, Bernard
2015-01-01
The analysis and management of MS data, especially those generated by data independent MS acquisition, exemplified by SWATH-MS, pose significant challenges for proteomics bioinformatics. The large size and vast amount of information inherent to these data sets need to be properly structured to enable an efficient and straightforward extraction of the signals used to identify specific target peptides. Standard XML based formats are not well suited to large MS data files, for example, those generated by SWATH-MS, and compromise high-throughput data processing and storing. We developed mzDB, an efficient file format for large MS data sets. It relies on the SQLite software library and consists of a standardized and portable server-less single-file database. An optimized 3D indexing approach is adopted, where the LC-MS coordinates (retention time and m/z), along with the precursor m/z for SWATH-MS data, are used to query the database for data extraction. In comparison with XML formats, mzDB saves ∼25% of storage space and improves access times by a factor of twofold up to even 2000-fold, depending on the particular data access. Similarly, mzDB shows also slightly to significantly lower access times in comparison with other formats like mz5. Both C++ and Java implementations, converting raw or XML formats to mzDB and providing access methods, will be released under permissive license. mzDB can be easily accessed by the SQLite C library and its drivers for all major languages, and browsed with existing dedicated GUIs. The mzDB described here can boost existing mass spectrometry data analysis pipelines, offering unprecedented performance in terms of efficiency, portability, compactness, and flexibility. PMID:25505153
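A minimal sketch of querying a server-less SQLite file by LC-MS coordinates, in the spirit of mzDB's 3D indexing. The table and column names are hypothetical placeholders rather than the actual mzDB schema, which stores spectrum slices in indexed bounding boxes.

```python
import sqlite3

def query_window(db_path, mz_min, mz_max, rt_min, rt_max):
    """Pull peaks inside an (m/z, retention time) window from a server-less
    SQLite file. Table and column names here are hypothetical placeholders;
    the actual mzDB schema organizes data in indexed bounding boxes."""
    con = sqlite3.connect(db_path)
    try:
        cur = con.execute(
            "SELECT mz, rt, intensity FROM peaks "
            "WHERE mz BETWEEN ? AND ? AND rt BETWEEN ? AND ?",
            (mz_min, mz_max, rt_min, rt_max))
        return cur.fetchall()
    finally:
        con.close()

# Usage on a converted run, e.g. a precursor near m/z 654.32 eluting around 25 min:
# peaks = query_window("run01.mzdb", 654.30, 654.34, 24.5 * 60, 25.5 * 60)
```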
NASA Astrophysics Data System (ADS)
Hosseinian, A.; Meghdadi Isfahani, A. H.
2018-04-01
In this study, the heat transfer enhancement due to surface vibration of a double-pipe heat exchanger, made of PVDF, is investigated. In order to create forced vibrations (3-9 m/s2, 100 Hz) on the outer surface of the heat exchanger, electro-dynamic vibrators are used. Experiments were performed at inner Reynolds numbers ranging from 2533 to 9960. The effects of volume flow rate and temperature on heat transfer performance are evaluated. Results demonstrated that the heat transfer coefficient increases with increasing vibration level and mass flow rate. The largest increase in the heat transfer coefficient is 97%, obtained for the highest vibration level (9 m/s2) in the experimental range.
Toth, Ronald T; Mills, Brittney J; Joshi, Sangeeta B; Esfandiary, Reza; Bishop, Steven M; Middaugh, C Russell; Volkin, David B; Weis, David D
2017-09-05
A barrier to the use of hydrogen exchange-mass spectrometry (HX-MS) in many contexts, especially analytical characterization of various protein therapeutic candidates, is that differences in temperature, pH, ionic strength, buffering agent, or other additives can alter chemical exchange rates, making HX data gathered under differing solution conditions difficult to compare. Here, we present data demonstrating that HX chemical exchange rates can be substantially altered not only by the well-established variables of temperature and pH but also by additives including arginine, guanidine, methionine, and thiocyanate. To compensate for these additive effects, we have developed an empirical method to correct the hydrogen-exchange data for these differences. First, differences in chemical exchange rates are measured by use of an unstructured reporter peptide, YPI. An empirical chemical exchange correction factor, determined by use of the HX data from the reporter peptide, is then applied to the HX measurements obtained from a protein of interest under different solution conditions. We demonstrate that the correction is experimentally sound through simulation and in a proof-of-concept experiment using unstructured peptides under slow-exchange conditions (pD 4.5 at ambient temperature). To illustrate its utility, we applied the correction to HX-MS excipient screening data collected for a pharmaceutically relevant IgG4 mAb being characterized to determine the effects of different formulations on backbone dynamics.
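A rough sketch of deriving and applying such an empirical correction factor, assuming the reporter peptide's apparent chemical exchange rate is obtained by fitting a single-exponential uptake curve under each solution condition. The data points and the time-rescaling step are illustrative assumptions, not the published procedure.

```python
import numpy as np
from scipy.optimize import curve_fit

def reporter_rate(times, uptake, n_exchangeable):
    """Apparent chemical exchange rate of the unstructured reporter peptide,
    from a single-exponential fit D(t) = N * (1 - exp(-k t))."""
    model = lambda t, k: n_exchangeable * (1.0 - np.exp(-k * t))
    (k,), _ = curve_fit(model, times, uptake, p0=[0.1])
    return k

t = np.array([10.0, 30.0, 60.0, 180.0, 600.0])                       # labeling times, s
k_ref = reporter_rate(t, np.array([1.1, 2.4, 3.1, 3.8, 4.0]), 4.0)   # reference buffer
k_form = reporter_rate(t, np.array([0.7, 1.7, 2.6, 3.6, 4.0]), 4.0)  # buffer with additive

factor = k_ref / k_form
print(f"empirical chemical-exchange correction factor: {factor:.2f}")
# Protein HX timepoints measured in the additive-containing buffer can then be
# compared with the reference condition at effective times t_eff = t / factor.
```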
Biographer: web-based editing and rendering of SBGN compliant biochemical networks.
Krause, Falko; Schulz, Marvin; Ripkens, Ben; Flöttmann, Max; Krantz, Marcus; Klipp, Edda; Handorf, Thomas
2013-06-01
The rapid accumulation of knowledge in the field of Systems Biology during the past years requires advanced, but simple-to-use, methods for the visualization of information in a structured and easily comprehensible manner. We have developed biographer, a web-based renderer and editor for reaction networks, which can be integrated as a library into tools dealing with network-related information. Our software enables visualizations based on the emerging standard Systems Biology Graphical Notation. It is able to import networks encoded in various formats such as SBML, SBGN-ML and jSBGN, a custom lightweight exchange format. The core package is implemented in HTML5, CSS and JavaScript and can be used within any kind of web-based project. It features interactive graph-editing tools and automatic graph layout algorithms. In addition, we provide a standalone graph editor and a web server, which contains enhanced features like web services for the import and export of models and visualizations in different formats. The biographer tool can be used at and downloaded from the web page http://biographer.biologie.hu-berlin.de/. The different software packages, including a server-independent version as well as a web server for Windows and Linux based systems, are available at http://code.google.com/p/biographer/ under the open-source license LGPL
Method of Performance-Aware Security of Unicast Communication in Hybrid Satellite Networks
NASA Technical Reports Server (NTRS)
Baras, John S. (Inventor); Roy-Chowdhury, Ayan (Inventor)
2014-01-01
A method and apparatus utilizes Layered IPSEC (LES) protocol as an alternative to IPSEC for network-layer security including a modification to the Internet Key Exchange protocol. For application-level security of web browsing with acceptable end-to-end delay, the Dual-mode SSL protocol (DSSL) is used instead of SSL. The LES and DSSL protocols achieve desired end-to-end communication security while allowing the TCP and HTTP proxy servers to function correctly.
2013-06-01
Communication Applet) UNIGE – D.I.M.E. Using a free application as “MIT APP Inventor” Android Software Development Kit DEGRADED C2 ICCRTS 2013...operate on an Android operating system up-gradable on which will be developed a simplified ACA ( Android Communication Applet) that will call C24U...Server) IP number . . . Portable COTS Devices ACA - C24U ( Android Communication Applet) Sending/receiving SEFL (Simple Exchange
Guo, Zhongxian; Liu, Ying; Li, Shuping; Yang, Zhaoguang
2009-12-01
Identification of microbial contaminants in drinking water is a challenge to matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS) due to low levels of microorganisms in fresh water. To avoid the time-consuming culture step of obtaining enough microbial cells for subsequent MALDI-MS analysis, a combination of membrane filtration and nanoparticles- or microparticles-based magnetic separation is a fast and efficient approach. In this work, the interaction of bacteria and fluidMAG-PAA, a cation-exchange superparamagnetic nanomaterial, was investigated by MALDI-MS analysis and transmission electron microscopy. FluidMAG-PAA selectively captured cells of Salmonella, Bacillus, Enterococcus and Staphylococcus aureus. This capture was attributed to the aggregation of negatively charged nanoparticles on bacterial cell regional surfaces that bear positive charges. Three types of non-porous silica-encapsulated anion-exchange magnetic microparticles (SiMAG-Q, SiMAG-PEI, SiMAG-DEAE) were capable of concentrating a variety of bacteria, and were compared with silica-free, smaller fluidMAG particles. Salmonella, Escherichia coli, Enterococcus and other bacteria spiked in aqueous solutions, tap water and reservoir water were separated and concentrated by membrane filtration and magnetic separation based on these ion-exchange magnetic materials, and then characterized by whole cell MALDI-MS. By comparing with the mass spectra of the isolates and pure cells, bacteria in fresh water can be rapidly detected at 1 x 10(3) colony-forming units (cfu)/mL. Copyright 2009 John Wiley & Sons, Ltd.
Song, Shiming; Zhang, Cuifang; Chen, Zhaojie; He, Fengmei; Wei, Jie; Tan, Huihua; Li, Xuesheng
2018-07-06
In this study, we developed an anion exchanger-disposable pipette extraction (DPX) method to detect the residual concentrations of eight neonicotinoid insecticides (dinotefuran, acetamiprid, clothianidin, thiacloprid, imidachloprid, imidaclothiz, nitenpyram, and thiamethoxam) and eight insect growth regulators (IGRs; triflumuron, cyromazine, buprofezin, methoxyfenozide, tebufenozide, chromafenozide, fenoxycarb, and RH 5849) in Chinese honey samples collected from different floral sources and different geographical regions using liquid chromatography tandem mass spectrometry (LC-MS/MS). QAE Sephadex A-25 was used as the anion exchanger in the DPX column for the purification and cleanup of honey samples. Analytes were eluted with a mixture of acetonitrile and 0.1 M HCl, and the elution was subjected to LC analysis. This method was thoroughly validated for its reproducibility, linearity, trueness, and recovery. Satisfactory recovery of pesticides was obtained ranging from 72% to 111% with intraday RSDs (n = 5) of 1%-10%. High linearity (R 2 ≥ 0.9987) was observed for all 16 pesticides. Limits of detection and quantification for all 16 compounds ranged from 0.3 to 3 μg/kg and from 1 to 10 μg/kg, respectively. Pesticide residues (9-113 μg/kg) were found in Chinese honey samples. The anion exchanger-DPX method was effective for removing sugars and retaining target analytes. Moreover, this method was highly reliable and sensitive for detecting neonicotinoids and IGRs in different floral sources of honey and will be applicable to matrixes with high sugar content. Copyright © 2018 Elsevier B.V. All rights reserved.
Shavazi, Masoumeh Abbasi; Morowatisharifabad, Mohammad Ali; Shavazi, Mohammad Taghi Abbasi; Mirzaei, Masoud; Ardekani, Ali Mellat
2016-01-01
Background: Currently with the emergence of the Internet, patients have an opportunity to exchange social support online. However, little attention has been devoted to different dimensions of online social support exchanged in virtual support communities for patients with multiple sclerosis (MS). Methods: To provide a rich insight, the aim of this qualitative study was to explore and categorize different dimensions of online social support in messages exchanged in a virtual support community for patients with MS. A total of 548 posted messages created during one year period were selected using purposive sampling to consider the maximum variation sampling. Prior-research-driven thematic analysis was then conducted. In this regard, we used the Cutruna and Suhr’s coding system. The messages that could not be categorized with the used coding system were thematically analyzed to explore new additional social support themes. Results: The results showed that various forms of social support including informational, emotional, network, esteem and tangible support were exchanged. Moreover, new additional social support themes including sharing personal experiences, sharing coping strategies and spiritual support emerged in this virtual support community. Conclusion: The wide range of online social support exchanged in the virtual support community can be regarded as a supplementary source of social support for patients with MS. Future researches can examine online social support more comprehensively considering additional social support themes emerging in the present study. PMID:27382585
Bandu, Raju; Ahn, Hyun Soo; Lee, Joon Won; Kim, Yong Woo; Choi, Seon Hee; Kim, Hak Jin; Kim, Kwang Pyo
2015-01-01
In vivo rat kidney tissue metabolites of an anticancer drug, cisplatin (cis-diamminedichloroplatinum [II]) (CP) which is used for the treatment of testicular, ovarian, bladder, cervical, esophageal, small cell lung, head and neck cancers, have been identified and characterized by using liquid chromatography positive ion electrospray ionization tandem mass spectrometry (LC/ESI-MS/MS) in combination with on line hydrogen/deuterium exchange (HDX) experiments. To identify in vivo metabolites, kidney tissues were collected after intravenous administration of CP to adult male Sprague-Dawley rats (n = 3 per group). The tissue samples were homogenized and extracted using newly optimized metabolite extraction procedure which involves liquid extraction with phosphate buffer containing ethyl acetate and protein precipitation with mixed solvents of methanol-water-chloroform followed by solid-phase clean-up procedure on Oasis HLB 3cc cartridges and then subjected to LC/ESI-HRMS analysis. A total of thirty one unknown in vivo metabolites have been identified and the structures of metabolites were elucidated using LC-MS/MS experiments combined with accurate mass measurements. Online HDX experiments have been used to further support the structural characterization of metabolites. The results showed that CP undergoes a series of ligand exchange biotransformation reactions with water and other nucleophiles like thio groups of methionine, cysteine, acetylcysteine, glutathione and thioether. This is the first research approach focused on the structure elucidation of biotransformation products of CP in rats, and the identification of metabolites provides essential information for further pharmacological and clinical studies of CP, and may also be useful to develop various effective new anticancer agents.
NASA Astrophysics Data System (ADS)
Zhang, Qian; Noble, Kyle A.; Mao, Yuan; Young, Nicolas L.; Sathe, Shridhar K.; Roux, Kenneth H.; Marshall, Alan G.
2013-07-01
The potential epitopes of a recombinant food allergen protein, cashew Ana o 2, reactive to polyclonal antibodies, were mapped by solution-phase amide backbone H/D exchange (HDX) coupled with Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR MS). Ana o 2 polyclonal antibodies were purified in the serum from a goat immunized with cashew nut extract. Antibodies were incubated with recombinant Ana o 2 (rAna o 2) to form antigen:polyclonal antibody (Ag:pAb) complexes. Complexed and uncomplexed (free) rAna o 2 were then subjected to HDX-MS analysis. Four regions protected from H/D exchange upon pAb binding are identified as potential epitopes and mapped onto a homologous model.
Lee, Tian-Fu
2014-12-01
Telecare medicine information systems provide a communication platform for accessing remote medical resources through public networks, and help health care workers and medical personnel to rapidly make correct clinical decisions and treatments. An authentication scheme for data exchange in telecare medicine information systems enables legal users in hospitals and medical institutes to establish a secure channel and exchange electronic medical records or electronic health records securely and efficiently. This investigation develops an efficient and secure verified-based three-party authentication scheme by using extended chaotic maps for data exchange in telecare medicine information systems. The proposed scheme does not require the server's public keys and avoids time-consuming modular exponential computations and scalar multiplications on elliptic curves used in previous related approaches. Additionally, the proposed scheme is proven secure in the random oracle model, and realizes the lower bounds of messages and rounds in communications. Compared to related verified-based approaches, the proposed scheme not only possesses higher security, but also has lower computational cost and fewer transmissions. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
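A small numerical sketch of the semigroup property of Chebyshev maps, the algebraic ingredient that chaotic-map-based key agreement schemes such as this one rely on. The seed and exponents are toy values, and deployed schemes use extended Chebyshev maps over larger domains with additional protections.

```python
import math

def chebyshev(n, x):
    """Chebyshev map T_n on [-1, 1], computed as T_n(x) = cos(n * arccos(x))."""
    return math.cos(n * math.acos(x))

x = 0.53          # public seed value
a, b = 11, 7      # each party's private integer (toy sizes; real schemes use large values)

# Each party publishes T_private(x); both arrive at the same shared value because
# T_a(T_b(x)) = T_b(T_a(x)) = T_(a*b)(x), the semigroup property these schemes exploit.
shared_ab = chebyshev(a, chebyshev(b, x))
shared_ba = chebyshev(b, chebyshev(a, x))
print(shared_ab, shared_ba, chebyshev(a * b, x))   # all three agree up to rounding
```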
iELM—a web server to explore short linear motif-mediated interactions
Weatheritt, Robert J.; Jehl, Peter; Dinkel, Holger; Gibson, Toby J.
2012-01-01
The recent expansion in our knowledge of protein–protein interactions (PPIs) has allowed the annotation and prediction of hundreds of thousands of interactions. However, the function of many of these interactions remains elusive. The interactions of Eukaryotic Linear Motif (iELM) web server provides a resource for predicting the function and positional interface for a subset of interactions mediated by short linear motifs (SLiMs). The iELM prediction algorithm is based on the annotated SLiM classes from the Eukaryotic Linear Motif (ELM) resource and allows users to explore both annotated and user-generated PPI networks for SLiM-mediated interactions. By incorporating the annotated information from the ELM resource, iELM provides functional details of PPIs. This can be used in proteomic analysis, for example, to infer whether an interaction promotes complex formation or degradation. Furthermore, details of the molecular interface of the SLiM-mediated interactions are also predicted. This information is displayed in a fully searchable table, as well as graphically with the modular architecture of the participating proteins extracted from the UniProt and Phospho.ELM resources. A network figure is also presented to aid the interpretation of results. The iELM server supports single protein queries as well as large-scale proteomic submissions and is freely available at http://i.elm.eu.org. PMID:22638578
Efficiently Distributing Component-based Applications Across Wide-Area Environments
2002-01-01
a variety of sophisticated network-accessible services such as e-mail, banking, on-line shopping, entertainment, and serving as a data exchange...product database Customer Serves as a façade to Order and Account Stateful Session Beans ShoppingCart Maintains list of items to be bought by customer...Pet Store tests; and JBoss 3.0.3 with Jetty 4.1.0, for the RUBiS tests) and a single database server (Oracle 8.1.7 Enterprise Edition), each running
CSlib, a library to couple codes via Client/Server messaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Plimpton, Steve
The CSlib is a small, portable library which enables two (or more) independent simulation codes to be coupled by exchanging messages with each other. Both codes link to the library when they are built, and can then communicate with each other as they run. The messages contain data or instructions that the two codes send back-and-forth to each other. The messaging can take place via files, sockets, or MPI. The latter is a standard distributed-memory message-passing library.
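A generic sketch of the coupling pattern: two codes trading small command/data messages over a socket, simulated here in one process with a socket pair. This illustrates the message exchange idea only, not the CSlib API itself, and the message contents are hypothetical.

```python
import json
import socket

# Two coupled "codes" trading a small command/data message, simulated in one
# process with a socket pair; illustrates the pattern only, not the CSlib API.
code_a, code_b = socket.socketpair()

# Code A asks code B to advance one step with these boundary temperatures
request = {"cmd": "advance", "step": 42, "boundary_T": [300.0, 312.5, 325.0]}
code_a.sendall(json.dumps(request).encode())

# Code B receives the instruction, does its work, and replies with a heat flux
received = json.loads(code_b.recv(4096).decode())
reply = {"cmd": "result", "step": received["step"], "flux_W_m2": 1.7e3}
code_b.sendall(json.dumps(reply).encode())

print(json.loads(code_a.recv(4096).decode()))
code_a.close()
code_b.close()
```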
DOE Office of Scientific and Technical Information (OSTI.GOV)
The open source Project Haystack initiative defines metadata and communication standards related to data from buildings and intelligent devices. The Project Haystack REST API defines standard formats and operations for exchanging Haystack-tagged data over HTTP. The HaystackRuby gem wraps calls to this REST API to enable Ruby applications to easily integrate data hosted on a Project Haystack compliant server. The HaystackRuby gem was developed at the National Renewable Energy Lab to support applications related to campus energy. We hope that this tool may be useful to others.
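A sketch of the kind of call the gem wraps, shown in Python for illustration: a Haystack REST "read" op with a filter expression against a placeholder server. The endpoint, authentication, and response handling are assumptions; the HaystackRuby gem itself wraps equivalent calls in Ruby.

```python
import requests

BASE = "https://building.example.com/api/demo"   # placeholder Haystack-compliant endpoint

def read_points(filter_expr):
    """Call the Project Haystack REST 'read' op with a filter expression and
    request the JSON encoding of the result grid."""
    r = requests.get(f"{BASE}/read",
                     params={"filter": filter_expr},
                     headers={"Accept": "application/json"},
                     timeout=10)
    r.raise_for_status()
    return r.json()

# Usage against a real server, e.g. all temperature sensor points:
# print(read_points("point and temp and sensor"))
```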
Assessment & Commitment Tracking System (ACTS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bryant, Robert A.; Childs, Teresa A.; Miller, Michael A.
2004-12-20
The ACTS computer code provides a centralized tool for planning and scheduling assessments, tracking and managing actions associated with assessments or that result from an event or condition, and "mining" data for reporting and analyzing information for improving performance. The ACTS application is designed to work with the MS SQL database management system. All database interfaces are written in SQL. The following software is used to develop and support the ACTS application: Cold Fusion, HTML, JavaScript, Quest TOAD, Microsoft Visual Source Safe (VSS), HTML Mailer for sending email, Microsoft SQL, and Microsoft Internet Information Server.
Minimizing back exchange in the hydrogen exchange-mass spectrometry experiment.
Walters, Benjamin T; Ricciuti, Alec; Mayne, Leland; Englander, S Walter
2012-12-01
The addition of mass spectrometry (MS) analysis to the hydrogen exchange (HX) proteolytic fragmentation experiment extends powerful HX methodology to the study of large biologically important proteins. A persistent problem is the degradation of HX information due to back exchange of deuterium label during the fragmentation-separation process needed to prepare samples for MS measurement. This paper reports a systematic analysis of the factors that influence back exchange (solution pH, ionic strength, desolvation temperature, LC column interaction, flow rates, system volume). The many peptides exhibit a range of back exchange due to intrinsic amino acid HX rate differences. Accordingly, large back exchange leads to large variability in D-recovery from one residue to another as well as one peptide to another that cannot be corrected for by reference to any single peptide-level measurement. The usual effort to limit back exchange by limiting LC time provides little gain. Shortening the LC elution gradient by 3-fold only reduced back exchange by ~2%, while sacrificing S/N and peptide count. An unexpected dependence of back exchange on ionic strength as well as pH suggests a strategy in which solution conditions are changed during sample preparation. Higher salt should be used in the first stage of sample preparation (proteolysis and trapping) and lower salt (<20 mM) and pH in the second stage before electrospray injection. Adjustment of these and other factors together with recent advances in peptide fragment detection yields hundreds of peptide fragments with D-label recovery of 90% ± 5%.
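For context, a minimal sketch of the conventional end-point correction for deuterium recovery against undeuterated and fully deuterated reference samples; as the abstract argues, a single peptide-level reference cannot remove the residue-to-residue spread in back exchange when losses are large. The numbers are hypothetical.

```python
def corrected_deuterium(m_obs, m_0pct, m_100pct, n_exchangeable):
    """End-point normalization of deuterium uptake against undeuterated (0%) and
    fully deuterated (100%) reference samples of the same peptide. A single
    peptide-level reference like this cannot capture per-residue differences
    in back exchange."""
    return n_exchangeable * (m_obs - m_0pct) / (m_100pct - m_0pct)

# Peptide with 8 exchangeable backbone amides; centroid masses are hypothetical
print(corrected_deuterium(m_obs=1053.4, m_0pct=1050.2, m_100pct=1057.3, n_exchangeable=8))
```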
Minimizing Back Exchange in the Hydrogen Exchange-Mass Spectrometry Experiment
NASA Astrophysics Data System (ADS)
Walters, Benjamin T.; Ricciuti, Alec; Mayne, Leland; Englander, S. Walter
2012-12-01
The addition of mass spectrometry (MS) analysis to the hydrogen exchange (HX) proteolytic fragmentation experiment extends powerful HX methodology to the study of large biologically important proteins. A persistent problem is the degradation of HX information due to back exchange of deuterium label during the fragmentation-separation process needed to prepare samples for MS measurement. This paper reports a systematic analysis of the factors that influence back exchange (solution pH, ionic strength, desolvation temperature, LC column interaction, flow rates, system volume). The many peptides exhibit a range of back exchange due to intrinsic amino acid HX rate differences. Accordingly, large back exchange leads to large variability in D-recovery from one residue to another as well as one peptide to another that cannot be corrected for by reference to any single peptide-level measurement. The usual effort to limit back exchange by limiting LC time provides little gain. Shortening the LC elution gradient by 3-fold only reduced back exchange by ~2 %, while sacrificing S/N and peptide count. An unexpected dependence of back exchange on ionic strength as well as pH suggests a strategy in which solution conditions are changed during sample preparation. Higher salt should be used in the first stage of sample preparation (proteolysis and trapping) and lower salt (<20 mM) and pH in the second stage before electrospray injection. Adjustment of these and other factors together with recent advances in peptide fragment detection yields hundreds of peptide fragments with D-label recovery of 90 % ± 5 %.
Bruggink, C.; Koeleman, C.; Barreto, V.; Lui, Y.; Pohl, C.; Ingendoh, A.; Wuhrer, M.; Hokke, C.; Deelder, A.
2007-01-01
High-pH anion-exchange chromatography with pulsed amperometric detection (HPAEC-PAD) is an established technique for selective separation and analysis of underivatized carbohydrates. The miniaturization of chromatographic techniques by means of capillary columns, and on-line coupling to mass spectrometry are critical to the further development of glycan analysis methods that are compatible with the current requirements in clinical settings. A system has been developed based on the Dionex BioLC equipped with a microbore gradient pump with PEEK flow splitter, a FAMOS micro autosampler, a modified electrochemical cell for on-line capillary PAD, and a capillary column (380 μm i.d.) packed with a new type of anion-exchange resin. This system operates with sensitivity in the low femtomol range. In addition, an on-line capillary desalter has been developed to allow direct coupling to the Bruker Esquire 3000 ion-trap mass spectrometer with electrospray ionization interface (ESI-IT-MS). Both systems have been evaluated using oligosaccharide standards as well as urine samples exhibiting various lysosomal oligosaccharide storage diseases. Initial data indicate that the robust and selective anion-exchange system, in combination with ESI-IT-MS for structure confirmation and analysis, provides a powerful platform that complements existing nano/capillary LC-MS methods for analytical determination of oligosaccharides in biological matrices.
González-Díaz, Humberto; Muíño, Laura; Anadón, Ana M; Romaris, Fernanda; Prado-Prado, Francisco J; Munteanu, Cristian R; Dorado, Julián; Sierra, Alejandro Pazos; Mezo, Mercedes; González-Warleta, Marta; Gárate, Teresa; Ubeira, Florencio M
2011-06-01
Infections caused by human parasites (HPs) affect the poorest 500 million people worldwide but chemotherapy has become expensive, toxic, and/or less effective due to drug resistance. On the other hand, many 3D structures in Protein Data Bank (PDB) remain without function annotation. We need theoretical models to quickly predict biologically relevant Parasite Self Proteins (PSP), which are expressed differentially in a given parasite and are dissimilar to proteins expressed in other parasites and have a high probability to become new vaccines (unique sequence) or drug targets (unique 3D structure). We present herein a model for PSPs in eight different HPs (Ascaris, Entamoeba, Fasciola, Giardia, Leishmania, Plasmodium, Trypanosoma, and Toxoplasma) with 90% accuracy for 15 341 training and validation cases. The model combines protein residue networks, Markov Chain Models (MCM) and Artificial Neural Networks (ANN). The input parameters are the spectral moments of the Markov transition matrix for electrostatic interactions associated with the protein residue complex network calculated with the MARCH-INSIDE software. We implemented this model in a new web-server called MISS-Prot (MARCH-INSIDE Scores for Self-Proteins). MISS-Prot was programmed using PHP/HTML/Python and MARCH-INSIDE routines and is freely available at: . This server is easy to use by non-experts in Bioinformatics who can carry out automatic online upload and prediction with 3D structures deposited at PDB (mode 1). We can also study outcomes of Peptide Mass Fingerprinting (PMFs) and MS/MS for query proteins with unknown 3D structures (mode 2). We illustrated the use of MISS-Prot in experimental and/or theoretical studies of peptides from Fasciola hepatica cathepsin proteases or present on 10 Anisakis simplex allergens (Ani s 1 to Ani s 10). In doing so, we combined electrophoresis (1DE), MALDI-TOF Mass Spectroscopy, and MASCOT to seek sequences, Molecular Mechanics + Molecular Dynamics (MM/MD) to generate 3D structures and MISS-Prot to predict PSP scores. MISS-Prot also allows the prediction of PSP proteins in 16 additional species including parasite hosts, fungi pathogens, disease transmission vectors, and biotechnologically relevant organisms.
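A toy sketch of spectral-moment descriptors of a residue-level Markov transition matrix, the kind of input the MISS-Prot model uses. The weight matrix, the row-normalization, and the use of traces of matrix powers as moments are assumptions for illustration and may differ from the MARCH-INSIDE implementation.

```python
import numpy as np

def spectral_moments(weights, k_max=5):
    """Spectral moments of a residue-network Markov matrix: row-normalize an
    interaction weight matrix into a stochastic matrix, then take the traces of
    its successive powers. The weighting and moment definition are assumptions
    for illustration and may differ from the MARCH-INSIDE software."""
    w = np.asarray(weights, dtype=float)
    pi = w / w.sum(axis=1, keepdims=True)          # row-stochastic transition matrix
    moments, power = [], np.eye(len(pi))
    for _ in range(k_max):
        power = power @ pi
        moments.append(float(np.trace(power)))
    return moments

# Toy 4-residue electrostatic interaction weights (values hypothetical)
w = [[1.0, 0.3, 0.1, 0.0],
     [0.3, 1.0, 0.4, 0.1],
     [0.1, 0.4, 1.0, 0.5],
     [0.0, 0.1, 0.5, 1.0]]
print(spectral_moments(w))
```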
MyDas, an Extensible Java DAS Server
Jimenez, Rafael C.; Quinn, Antony F.; Jenkinson, Andrew M.; Mulder, Nicola; Martin, Maria; Hunter, Sarah; Hermjakob, Henning
2012-01-01
A large number of diverse, complex, and distributed data resources are currently available in the Bioinformatics domain. The pace of discovery and the diversity of information means that centralised reference databases like UniProt and Ensembl cannot integrate all potentially relevant information sources. From a user perspective however, centralised access to all relevant information concerning a specific query is essential. The Distributed Annotation System (DAS) defines a communication protocol to exchange annotations on genomic and protein sequences; this standardisation enables clients to retrieve data from a myriad of sources, thus offering centralised access to end-users. We introduce MyDas, a web server that facilitates the publishing of biological annotations according to the DAS specification. It deals with the common functionality requirements of making data available, while also providing an extension mechanism in order to implement the specifics of data store interaction. MyDas allows the user to define where the required information is located along with its structure, and is then responsible for the communication protocol details. PMID:23028496
MyDas, an extensible Java DAS server.
Salazar, Gustavo A; García, Leyla J; Jones, Philip; Jimenez, Rafael C; Quinn, Antony F; Jenkinson, Andrew M; Mulder, Nicola; Martin, Maria; Hunter, Sarah; Hermjakob, Henning
2012-01-01
A large number of diverse, complex, and distributed data resources are currently available in the Bioinformatics domain. The pace of discovery and the diversity of information means that centralised reference databases like UniProt and Ensembl cannot integrate all potentially relevant information sources. From a user perspective however, centralised access to all relevant information concerning a specific query is essential. The Distributed Annotation System (DAS) defines a communication protocol to exchange annotations on genomic and protein sequences; this standardisation enables clients to retrieve data from a myriad of sources, thus offering centralised access to end-users. We introduce MyDas, a web server that facilitates the publishing of biological annotations according to the DAS specification. It deals with the common functionality requirements of making data available, while also providing an extension mechanism in order to implement the specifics of data store interaction. MyDas allows the user to define where the required information is located along with its structure, and is then responsible for the communication protocol details.
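A minimal client-side sketch of retrieving annotations from a DAS source such as one published with MyDas, assuming the DAS 1.53 "features" command URL pattern. The server address, source name, and segment are placeholders.

```python
import urllib.request
import xml.etree.ElementTree as ET

def das_features(das_base, source, segment):
    """Fetch feature annotations for a sequence segment from a DAS source and
    return (id, label) pairs. The URL pattern follows the DAS 1.53 'features'
    command; the server, source and segment used below are placeholders."""
    url = f"{das_base}/das/{source}/features?segment={segment}"
    with urllib.request.urlopen(url, timeout=10) as response:
        tree = ET.parse(response)
    return [(f.get("id"), f.get("label")) for f in tree.iter("FEATURE")]

# Usage against a real deployment, e.g.:
# print(das_features("http://das.example.org", "my_annotations", "P12345:1,250"))
```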
DASMI: exchanging, annotating and assessing molecular interaction data.
Blankenburg, Hagen; Finn, Robert D; Prlić, Andreas; Jenkinson, Andrew M; Ramírez, Fidel; Emig, Dorothea; Schelhorn, Sven-Eric; Büch, Joachim; Lengauer, Thomas; Albrecht, Mario
2009-05-15
Ever-increasing amounts of biological interaction data are being accumulated worldwide, but they are currently not readily accessible to the biologist at a single site. New techniques are required for retrieving, sharing and presenting data spread over the Internet. We introduce the DASMI system for the dynamic exchange, annotation and assessment of molecular interaction data. DASMI is based on the widely used Distributed Annotation System (DAS) and consists of a data exchange specification, web servers for providing the interaction data and clients for data integration and visualization. The decentralized architecture of DASMI affords the online retrieval of the most recent data from distributed sources and databases. DASMI can also be extended easily by adding new data sources and clients. We describe all DASMI components and demonstrate their use for protein and domain interactions. The DASMI tools are available at http://www.dasmi.de/ and http://ipfam.sanger.ac.uk/graph. The DAS registry and the DAS 1.53E specification are found at http://www.dasregistry.org/.
Jin, Weihua; Wang, Jing; Ren, Sumei; Song, Ni; Zhang, Quanbin
2012-01-01
A fucoidan extracted from Saccharina japonica was fractionated by anion exchange chromatography. The most complex fraction F0.5 was degraded by dilute sulphuric acid and then separated by use of an activated carbon column. Fraction Y1 was fractionated by anion exchange and gel filtration chromatography while Fraction Y2 was fractionated by gel filtration chromatography. The fractions were determined by ESI-MS and analyzed by ESI-CID-MS/MS. It was concluded that F0.5 had a backbone of alternating 4-linked GlcA and 2-linked Man with the first Man residue from the nonreducing end accidentally sulfated at C6. In addition, F0.5 had a 3-linked glucuronan, in accordance with a previous report by NMR. Some other structural characteristics included GlcA 1→3 Man 1→4 GlcA, Man 1→3 GlcA 1→4 GlcA, Fuc 1→4 GlcA and Fuc 1→3 Fuc. Finally, it was shown that fucose was sulfated at C2 or C4 while galactose was sulfated at C2, C4 or C6. PMID:23170074
Proteomic analysis of Toxocara canis excretory and secretory (TES) proteins.
Sperotto, Rita Leal; Kremer, Frederico Schmitt; Aires Berne, Maria Elisabeth; Costa de Avila, Luciana F; da Silva Pinto, Luciano; Monteiro, Karina Mariante; Caumo, Karin Silva; Ferreira, Henrique Bunselmeyer; Berne, Natália; Borsuk, Sibele
2017-01-01
Toxocariasis is a neglected disease, and its main etiological agent is the nematode Toxocara canis. Serological diagnosis is performed by an enzyme-linked immunosorbent assay using T. canis excretory and secretory (TES) antigens produced by in vitro cultivation of larvae. Identification of TES proteins can be useful for the development of new diagnostic strategies since few TES components have been described so far. Herein, we report the results obtained by proteomic analysis of TES proteins using a liquid chromatography-tandem mass spectrometry (LC-MS/MS) approach. TES fractions were separated by one-dimensional SDS-PAGE and analyzed by LC-MS/MS. The MS/MS spectra were compared with a database of protein sequences deduced from the genome sequence of T. canis, and a total of 19 proteins were identified. Classification according to the signal peptide prediction using the SignalP server showed that seven of the identified proteins were extracellular, 10 had cytoplasmic or nuclear localization, while the subcellular localization of two proteins was unknown. Analysis of molecular functions by BLAST2GO showed that the majority of the gene ontology (GO) terms associated with the proteins present in the TES sample were associated with binding functions, including but not limited to protein binding (GO:0005515), inorganic ion binding (GO:0043167), and organic cyclic compound binding (GO:0097159). This study provides additional information about the exoproteome of T. canis, which can lead to the development of new strategies for diagnostics or vaccination. Copyright © 2016 Elsevier B.V. All rights reserved.
Development of mobile platform integrated with existing electronic medical records.
Kim, YoungAh; Kim, Sung Soo; Kang, Simon; Kim, Kyungduk; Kim, Jun
2014-07-01
This paper describes a mobile Electronic Medical Record (EMR) platform designed to manage and utilize the existing EMR and mobile application with optimized resources. We structured the mEMR to reuse services of retrieval and storage in mobile app environments that have already proven to have no problem working with EMRs. A new mobile architecture-based mobile solution was developed in four steps: the construction of a server and its architecture; screen layout and storyboard making; screen user interface design and development; and a pilot test and step-by-step deployment. This mobile architecture consists of two parts, the server-side area and the client-side area. In the server-side area, it performs the roles of service management for EMR and documents and for information exchange. Furthermore, it performs menu allocation depending on user permission and automatic clinical document architecture document conversion. Currently, Severance Hospital operates an iOS-compatible mobile solution based on this mobile architecture and provides stable service without additional resources, dealing with dynamic changes of EMR templates. The proposed mobile solution should go hand in hand with the existing EMR system, and it can be a cost-effective solution if a quality EMR system is operated steadily with this solution. Thus, we expect this example to be shared with hospitals that currently plan to deploy mobile solutions.
Development of Mobile Platform Integrated with Existing Electronic Medical Records
Kim, YoungAh; Kang, Simon; Kim, Kyungduk; Kim, Jun
2014-01-01
Objectives This paper describes a mobile Electronic Medical Record (EMR) platform designed to manage and utilize the existing EMR and mobile application with optimized resources. Methods We structured the mEMR to reuse services of retrieval and storage in mobile app environments that have already proven to have no problem working with EMRs. A new mobile architecture-based mobile solution was developed in four steps: the construction of a server and its architecture; screen layout and storyboard making; screen user interface design and development; and a pilot test and step-by-step deployment. This mobile architecture consists of two parts, the server-side area and the client-side area. In the server-side area, it performs the roles of service management for EMR and documents and for information exchange. Furthermore, it performs menu allocation depending on user permission and automatic clinical document architecture document conversion. Results Currently, Severance Hospital operates an iOS-compatible mobile solution based on this mobile architecture and provides stable service without additional resources, dealing with dynamic changes of EMR templates. Conclusions The proposed mobile solution should go hand in hand with the existing EMR system, and it can be a cost-effective solution if a quality EMR system is operated steadily with this solution. Thus, we expect this example to be shared with hospitals that currently plan to deploy mobile solutions. PMID:25152837
Huang, Ean-Wen; Hung, Rui-Suan; Chiou, Shwu-Fen; Liu, Fei-Ying; Liou, Der-Ming
2011-01-01
Information and communication technologies progress rapidly, and many novel applications have been developed in many domains of human life. In recent years, the demand for healthcare services has been growing because of the increase in the elderly population. Consequently, a number of healthcare institutions have focused on creating technologies to reduce extraneous work and improve the quality of service. In this study, an information platform for tele-healthcare services was implemented. The architecture of the platform included a web-based application server and a client system. The client system retrieved the blood pressure and glucose levels of a patient stored in measurement instruments through Bluetooth wireless transmission. The web application server assisted the staff and clients in analyzing the health conditions of patients. In addition, the server provided face-to-face communications and instructions through remote video devices. The platform deployed a service-oriented architecture consisting of HL7 standard messages and web service components. The platform could transfer health records into HL7 standard clinical document architecture for data exchange with other organizations. The prototype system was pretested and evaluated in the homecare department of a hospital and a community management center for chronic disease monitoring. Based on the results of this study, this system is expected to improve the quality of healthcare services.
Biographer: web-based editing and rendering of SBGN compliant biochemical networks
Krause, Falko; Schulz, Marvin; Ripkens, Ben; Flöttmann, Max; Krantz, Marcus; Klipp, Edda; Handorf, Thomas
2013-01-01
Motivation: The rapid accumulation of knowledge in the field of Systems Biology during the past years requires advanced, but simple-to-use, methods for the visualization of information in a structured and easily comprehensible manner. Results: We have developed biographer, a web-based renderer and editor for reaction networks, which can be integrated as a library into tools dealing with network-related information. Our software enables visualizations based on the emerging standard Systems Biology Graphical Notation. It is able to import networks encoded in various formats such as SBML, SBGN-ML and jSBGN, a custom lightweight exchange format. The core package is implemented in HTML5, CSS and JavaScript and can be used within any kind of web-based project. It features interactive graph-editing tools and automatic graph layout algorithms. In addition, we provide a standalone graph editor and a web server, which contains enhanced features like web services for the import and export of models and visualizations in different formats. Availability: The biographer tool can be used at and downloaded from the web page http://biographer.biologie.hu-berlin.de/. The different software packages, including a server-independent version as well as a web server for Windows- and Linux-based systems, are available at http://code.google.com/p/biographer/ under the open-source license LGPL. Contact: edda.klipp@biologie.hu-berlin.de or handorf@physik.hu-berlin.de PMID:23574737
Network oriented radiological and medical archive
NASA Astrophysics Data System (ADS)
Ferraris, M.; Frixione, P.; Squarcia, S.
2001-10-01
In this paper the basic ideas of NORMA (Network Oriented Radiological and Medical Archive) are discussed. NORMA is an original project built by a team of physicists in collaboration with radiologists in order to select the best treatment planning in radiotherapy. It allows physicians and health physicists working in different places to discuss interesting clinical cases while visualizing the same diagnostic images at the same time and highlighting zones of interest (tumors and organs at risk). NORMA has a client/server architecture in order to be platform independent. Applying World Wide Web technologies, it can be easily used by people with no specific computer knowledge, providing verbose help to guide the user through the right steps of execution. The client side is an applet while the server side is a Java application. In order to optimize execution, the project also includes a proprietary protocol, lying over the TCP/IP suite, that organizes data exchanges and control messages. Diagnostic images are retrieved from a relational database or from a standard DICOM (Digital Imaging and Communications in Medicine) PACS through the DICOM-WWW gateway, allowing connection of the usual Web browsers used by the NORMA system to DICOM applications via the HTTP protocol. Browser requests are sent to the gateway from the Web server through CGI (Common Gateway Interface). DICOM software translates the requests into DICOM messages and organizes the communication with the remote DICOM application.
NASA Astrophysics Data System (ADS)
Wu, Bo; Chu, Yan-qiu; Dai, Zhao-yun; Ding, Chuan-fan
2008-06-01
Allergic contact dermatitis is a delayed hypersensitivity reaction that results from skin exposure to low molecular weight chemicals such as haptens. To clarify the pathogenic mechanism, electrospray ionization mass spectrometry (ESI-MS) and hydrogen/deuterium (H/D) exchange, as well as UV spectroscopy, were applied to determine the interaction between the model protein cytochrome c (cyt c) and the hapten 2,4-dinitrofluorobenzene (DNFB). The ESI-MS results demonstrate that the conformation of cyt c can change from the native folded state to a partially unfolded state as the amount of DNFB increases. Equilibrium-state H/D exchange followed by ESI-MS further confirms these results. UV spectroscopy indicates that the strong-field coordination between the iron of heme (the prosthetic group) and His18 or Met80 of cyt c is not obviously affected by the hapten.
Plasma exchange therapy in steroid-unresponsive relapses in patients with multiple sclerosis.
Trebst, Corinna; Reising, Ansgar; Kielstein, Jan T; Hafer, Carsten; Stangel, Martin
2009-01-01
Plasma exchange (PE) is well established for conditions such as rapidly progressive vasculitis associated with autoantibodies against neutrophil cytoplasmic antigens (ANCA), anti-glomerular basement membrane (GBM) antibody disease, or thrombotic thrombocytopenic purpura (TTP). Several neurological disorders, such as acute worsening in myasthenia gravis, Guillain-Barré syndrome (GBS) and chronic inflammatory demyelinating polyneuropathy (CIDP), can also be treated successfully with PE. Only small case series have previously shown that PE is also effective in relapses in patients with multiple sclerosis (MS). We report our experience of PE therapy in a series of 20 patients with 21 steroid-unresponsive MS relapses. A marked-to-moderate clinical response with clear gain of function was observed in 76% of patients with uni- or bilateral optic neuritis and in 87.5% of patients with relapses other than optic neuritis. PE is an effective and well tolerated therapeutic option for steroid-unresponsive MS relapses.
Ubiquitous-Severance Hospital Project: Implementation and Results
Chang, Bung-Chul; Kim, Young-A; Kim, Jee Hea; Jung, Hae Kyung; Kang, Eun Hae; Kang, Hee Suk; Lee, Hyung Il; Kim, Yong Ook; Yoo, Sun Kook; Sunwoo, Ilnam; An, Seo Yong; Jeong, Hye Jeong
2010-01-01
Objectives The purpose of this study was to review the implementation of the u-Severance information system with a focus on electronic hospital records (EHR) and to suggest future improvements. Methods The Clinical Data Repository (CDR) of u-Severance involved implementing electronic medical records (EMR) as the basis of the EHR and the management of individual health records. The EHR was implemented with service enhancements extending to the clinical decision support system (CDSS) and expanding the knowledge base for research with a repository for clinical data and medical care information. Results The EMR system of Yonsei University Health Systems (YUHS) consists of HP Integrity Superdome servers using MS SQL as the database management system and MS Windows as the operating system. Conclusions YUHS is a high-performing medical institution with regard to efficient management and customer satisfaction; however, after 5 years of implementation of the u-Severance system, several limitations with regard to expandability and security have been identified. PMID:21818425
Recent Advances in Targeted and Untargeted Metabolomics by NMR and MS/NMR Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bingol, Kerem
Metabolomics has made significant progress on multiple fronts in the last 18 months. This minireview aims to give an overview of these advancements in light of their contribution to targeted and untargeted metabolomics. New computational approaches have emerged to overcome the manual absolute quantitation step of metabolites in 1D 1H NMR spectra. This provides more consistency in inter-laboratory comparisons. Integration of 2D NMR metabolomics databases under a unified web server has allowed very accurate identification of the metabolites that have been catalogued in these databases. For the remaining uncatalogued and unknown metabolites, new cheminformatics approaches have been developed by combining NMR and mass spectrometry. These hybrid NMR/MS approaches accelerate the identification of unknowns in untargeted studies, and now allow ever larger numbers of metabolites to be profiled in application studies.
Ubiquitous-severance hospital project: implementation and results.
Chang, Bung-Chul; Kim, Nam-Hyun; Kim, Young-A; Kim, Jee Hea; Jung, Hae Kyung; Kang, Eun Hae; Kang, Hee Suk; Lee, Hyung Il; Kim, Yong Ook; Yoo, Sun Kook; Sunwoo, Ilnam; An, Seo Yong; Jeong, Hye Jeong
2010-03-01
The purpose of this study was to review the implementation of the u-Severance information system with a focus on electronic hospital records (EHR) and to suggest future improvements. The Clinical Data Repository (CDR) of u-Severance involved implementing electronic medical records (EMR) as the basis of the EHR and the management of individual health records. The EHR was implemented with service enhancements extending to the clinical decision support system (CDSS) and expanding the knowledge base for research with a repository for clinical data and medical care information. The EMR system of Yonsei University Health Systems (YUHS) consists of HP Integrity Superdome servers using MS SQL as the database management system and MS Windows as the operating system. YUHS is a high-performing medical institution with regard to efficient management and customer satisfaction; however, after 5 years of implementation of the u-Severance system, several limitations with regard to expandability and security have been identified.
OGS improvements in the year 2011 in running the Northeastern Italy Seismic Network
NASA Astrophysics Data System (ADS)
Bragato, P. L.; Pesaresi, D.; Saraò, A.; Di Bartolomeo, P.; Durì, G.
2012-04-01
The Centro di Ricerche Sismologiche (CRS, Seismological Research Center) of the Istituto Nazionale di Oceanografia e di Geofisica Sperimentale (OGS, Italian National Institute for Oceanography and Experimental Geophysics) in Udine (Italy) started to operate the Northeastern Italy Seismic Network after the strong earthquake of magnitude M=6.4 that struck the Italian Friuli-Venezia Giulia region in 1976: it currently consists of 15 very sensitive broadband and 21 simpler short-period seismic stations, all telemetered to and acquired in real time at the OGS-CRS data center in Udine. Real-time data exchange agreements in place with other Italian, Slovenian, Austrian and Swiss seismological institutes lead to a total of about 100 seismic stations acquired in real time, which makes the OGS the reference institute for seismic monitoring of Northeastern Italy. Since 2002 OGS-CRS has been using the Antelope software suite on several workstations plus a SUN cluster as the main tool for collecting, analyzing, archiving and exchanging seismic data, initially in the framework of the EU Interreg IIIA project "Trans-national seismological networks in the South-Eastern Alps". SeisComP is also used as a real-time data exchange server tool. In order to improve the seismological monitoring of the Northeastern Italy area, at OGS-CRS we tuned existing programs and created ad hoc ones, including: a customized web server named PickServer to manually relocate earthquakes, a script for automatic moment tensor determination, scripts for web publishing of earthquake parametric data, waveforms, state-of-health parameters and shaking maps, noise characterization by means of automatic spectral analysis, and, last but not least, scripts for email/SMS/fax alerting. The OGS-CRS Real Time Seismological website (RTS, http://rts.crs.inogs.it/), in operation for several years, was initially developed in the framework of the Italian DPC-INGV S3 Project: the RTS website shows classic earthquake location parametric data plus ShakeMap and moment tensor information. At OGS-CRS we also spent a considerable amount of effort in improving the long-period performance of broadband seismic stations, either by carrying out full re-installations and/or applying thermal insulation to the seismometers: examples of PSD plots of the PRED broadband seismic station installed in the cave tunnel of Cave del Predil, using a Quanterra Q330HR high-resolution digitizer and a Streckeisen STS-2 broadband seismometer, will be illustrated. Efforts in strengthening the reliability of data links, exploring the use of redundant satellite/radio/GPRS links, will also be shown.
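The noise characterization by automatic spectral analysis mentioned above amounts to estimating power spectral densities (PSDs) of station waveforms. The following is a minimal illustrative sketch, not the OGS-CRS scripts: the sampling rate, window length, and the synthetic trace are assumptions, and a real workflow would read miniSEED data and remove the instrument response first.

```python
# Illustrative PSD estimate for a seismic trace using Welch's method.
# Assumptions: a plain NumPy array of samples at 100 Hz; real deployments
# would read miniSEED data and correct for the instrument response.
import numpy as np
from scipy.signal import welch

fs = 100.0                                   # sampling rate in Hz (assumed)
trace = np.random.randn(3600 * int(fs))      # stand-in for one hour of samples

# Welch PSD: average periodograms over 200 s windows with 50% overlap.
freqs, psd = welch(trace, fs=fs, nperseg=int(200 * fs), noverlap=int(100 * fs))

# Report the PSD in dB (relative units here).
psd_db = 10.0 * np.log10(psd)
for f, p in zip(freqs[:5], psd_db[:5]):
    print(f"{f:6.3f} Hz  {p:7.2f} dB")
```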
Moorthy, Balakrishnan S; Schultz, Steven G; Kim, Sherry G; Topp, Elizabeth M
2014-06-02
Solid state amide hydrogen/deuterium exchange with mass spectrometric analysis (ssHDX-MS) was used to assess the conformation of myoglobin (Mb) in lyophilized formulations, and the results correlated with the extent of aggregation during storage. Mb was colyophilized with sucrose (1:1 or 1:8 w/w), mannitol (1:1 w/w), or NaCl (1:1 w/w) or in the absence of excipients. Immediately after lyophilization, samples of each formulation were analyzed by ssHDX-MS and Fourier transform infrared spectroscopy (FTIR) to assess Mb conformation, and by dynamic light scattering (DLS) and size exclusion chromatography (SEC) to determine the extent of aggregation. The remaining samples were then placed on stability at 25 °C and 60% RH or 40 °C and 75% RH for up to 1 year, withdrawn at intervals, and analyzed for aggregate content by SEC and DLS. In ssHDX-MS of samples immediately after lyophilization (t = 0), Mb was less deuterated in solids containing sucrose (1:1 and 1:8 w/w) than in those containing mannitol (1:1 w/w), NaCl (1:1 w/w), or Mb alone. Deuterium uptake kinetics and peptide mass envelopes also indicated greater Mb structural perturbation in mannitol, NaCl, or Mb-alone samples at t = 0. The extent of deuterium incorporation and kinetic parameters related to rapidly and slowly exchanging amide pools (Nfast, Nslow), measured at t = 0, were highly correlated with the extent of aggregation on storage as measured by SEC. In contrast, the extent of aggregation was weakly correlated with FTIR band intensity and peak position measured at t = 0. The results support the use of ssHDX-MS as a formulation screening tool in developing lyophilized protein drug products.
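The kinetic parameters Nfast and Nslow referred to above are commonly obtained by fitting deuterium uptake curves to a biexponential pool model, D(t) = Nfast(1 - exp(-kfast*t)) + Nslow(1 - exp(-kslow*t)). A hedged sketch of such a fit follows; the model form, the time points, and the uptake values are assumptions for illustration, not the authors' exact data or procedure.

```python
# Sketch: fit ssHDX-MS deuterium uptake to an assumed biexponential pool model.
import numpy as np
from scipy.optimize import curve_fit

def uptake(t, n_fast, k_fast, n_slow, k_slow):
    # D(t) = Nfast*(1 - exp(-kfast*t)) + Nslow*(1 - exp(-kslow*t))
    return n_fast * (1 - np.exp(-k_fast * t)) + n_slow * (1 - np.exp(-k_slow * t))

# Hypothetical exchange times (h) and deuterium uptake values (Da).
t = np.array([0.5, 1, 2, 4, 8, 24, 48, 96], dtype=float)
d = np.array([3.1, 4.4, 5.9, 7.2, 8.4, 10.1, 10.9, 11.3])

p0 = [6.0, 1.0, 6.0, 0.01]                   # initial guesses
params, _ = curve_fit(uptake, t, d, p0=p0, maxfev=10000)
n_fast, k_fast, n_slow, k_slow = params
print(f"Nfast={n_fast:.1f}, kfast={k_fast:.2f}/h, Nslow={n_slow:.1f}, kslow={k_slow:.3f}/h")
```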
Li, Bo; Tang, Jing; Yang, Qingxia; Cui, Xuejiao; Li, Shuang; Chen, Sijie; Cao, Quanxing; Xue, Weiwei; Chen, Na; Zhu, Feng
2016-12-13
In untargeted metabolomics analysis, several factors (e.g., unwanted experimental & biological variations and technical errors) may hamper the identification of differential metabolic features, which requires data-driven normalization approaches before feature selection. So far, ≥16 normalization methods have been widely applied for processing LC/MS based metabolomics data. However, the performance and the sample size dependence of those methods have not yet been exhaustively compared, and no online tool for comparatively and comprehensively evaluating the performance of all 16 normalization methods has been provided. In this study, a comprehensive comparison of these methods was conducted. As a result, the 16 methods were categorized into three groups based on their normalization performances across various sample sizes. The VSN, the Log Transformation and the PQN were identified as the methods with the best normalization performance, while the Contrast method consistently underperformed across all sub-datasets of different benchmark data. Moreover, an interactive web tool comprehensively evaluating the performance of the 16 methods specifically for normalizing LC/MS based metabolomics data was constructed and hosted at http://server.idrb.cqu.edu.cn/MetaPre/. In summary, this study could serve as a useful guide to the selection of suitable normalization methods in analyzing LC/MS based metabolomics data.
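As one concrete example of the methods compared, probabilistic quotient normalization (PQN) rescales each sample by the median ratio of its features to a reference profile. The sketch below is a generic implementation of PQN, not the MetaPre web tool's code; the feature table is random stand-in data.

```python
# Probabilistic quotient normalization (PQN) for a matrix with shape
# (n_samples, n_features). Generic sketch; not MetaPre code.
import numpy as np

def pqn(intensities: np.ndarray) -> np.ndarray:
    # 1. Total-sum normalization as a rough first pass.
    x = intensities / intensities.sum(axis=1, keepdims=True)
    # 2. Reference spectrum: the median profile across samples.
    reference = np.median(x, axis=0)
    # 3. Per-sample quotients against the reference, ignoring zero features.
    with np.errstate(divide="ignore", invalid="ignore"):
        quotients = np.where(reference > 0, x / reference, np.nan)
    # 4. Divide each sample by its median quotient.
    factors = np.nanmedian(quotients, axis=1, keepdims=True)
    return x / factors

data = np.abs(np.random.randn(6, 200)) + 0.1   # hypothetical LC/MS feature table
print(pqn(data).shape)
```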
López Uriarte, Graciela Arelí; Burciaga Flores, Carlos Horacio; Torres de la Cruz, Víctor Manuel; Medina Aguado, María Magdalena; Gómez Puente, Viviana Maricela; Romero Gutiérrez, Liliana Nayeli; Martínez de Villarreal, Laura Elia
2018-06-01
Prenatal diagnosis of Down syndrome (DS) is based on the risk calculated from maternal age, biochemical and ultrasonographic markers and, more recently, cfDNA. Differences in proteomic profiles may offer an opportunity to find new biomarkers. The aim was to characterize the serum proteome of mothers carrying a DS fetus. Blood serum samples were obtained from three groups of women: (a) 10 non-pregnant, (b) 10 pregnant with a healthy fetus by ultrasound evaluation, and (c) nine pregnant with a DS fetus. Sample preparation consisted of albumin/IgG depletion, desalting, and trypsin digestion; the process was performed by nanoUPLC MS/MS. Data analysis was made with MassLynx 4.1 and ProteinLynx Global Server 3.0, with peptide and protein recognition by the MASCOT algorithm and the UNIPROT-Swissprot database. Each group showed a different protein profile. Some proteins were shared between groups. Only sera from pregnant women showed proteins related to immune and clotting pathways. Mothers with a DS fetus had 42 specific proteins. We found a different serum protein profile in mothers carrying DS fetuses that does not reflect expression of genes on the extra chromosome. Further studies will be necessary to establish the role of these proteins in aneuploid fetuses and analyze their possible use as potential biomarkers.
Li, Bo; Tang, Jing; Yang, Qingxia; Cui, Xuejiao; Li, Shuang; Chen, Sijie; Cao, Quanxing; Xue, Weiwei; Chen, Na; Zhu, Feng
2016-01-01
In untargeted metabolomics analysis, several factors (e.g., unwanted experimental & biological variations and technical errors) may hamper the identification of differential metabolic features, which requires data-driven normalization approaches before feature selection. So far, ≥16 normalization methods have been widely applied for processing LC/MS based metabolomics data. However, the performance and the sample size dependence of those methods have not yet been exhaustively compared, and no online tool for comparatively and comprehensively evaluating the performance of all 16 normalization methods has been provided. In this study, a comprehensive comparison of these methods was conducted. As a result, the 16 methods were categorized into three groups based on their normalization performances across various sample sizes. The VSN, the Log Transformation and the PQN were identified as the methods with the best normalization performance, while the Contrast method consistently underperformed across all sub-datasets of different benchmark data. Moreover, an interactive web tool comprehensively evaluating the performance of the 16 methods specifically for normalizing LC/MS based metabolomics data was constructed and hosted at http://server.idrb.cqu.edu.cn/MetaPre/. In summary, this study could serve as a useful guide to the selection of suitable normalization methods in analyzing LC/MS based metabolomics data. PMID:27958387
Vijay, Sonam
2014-01-01
Salivary gland proteins of Anopheles mosquitoes offer attractive targets for understanding interactions with sporozoites, blood feeding behavior, homeostasis, and immunological evaluation of malaria vectors and parasite interactions. To date, limited studies have been carried out to elucidate the salivary proteins of An. stephensi salivary glands. The aim of the present study was to provide a detailed analytical description of the functional salivary gland proteins of the urban malaria vector An. stephensi. A proteomic approach combining one-dimensional electrophoresis (1DE), ion trap liquid chromatography tandem mass spectrometry (LC/MS/MS), and computational bioinformatic analysis was adopted to provide the first direct insight into the identification and functional characterization of known and novel salivary proteins of An. stephensi. Computational studies with online servers, namely the MASCOT and OMSSA algorithms, identified a total of 36 known salivary proteins and 123 novel proteins analysed by LC/MS/MS. This first report describes a baseline proteomic catalogue of 159 salivary proteins belonging to various categories of signal transduction, regulation of the blood coagulation cascade, and various immune and energy pathways of the An. stephensi sialotranscriptome by mass spectrometry. Our results may serve as a basis for assigning putative functional roles to these proteins in blood feeding, biting behavior, and other aspects of vector-parasite-host interactions relevant to parasite development in anopheline mosquitoes. PMID:25126571
Vijay, Sonam; Rawat, Manmeet; Sharma, Arun
2014-01-01
Salivary gland proteins of Anopheles mosquitoes offer attractive targets for understanding interactions with sporozoites, blood feeding behavior, homeostasis, and immunological evaluation of malaria vectors and parasite interactions. To date, limited studies have been carried out to elucidate the salivary proteins of An. stephensi salivary glands. The aim of the present study was to provide a detailed analytical description of the functional salivary gland proteins of the urban malaria vector An. stephensi. A proteomic approach combining one-dimensional electrophoresis (1DE), ion trap liquid chromatography tandem mass spectrometry (LC/MS/MS), and computational bioinformatic analysis was adopted to provide the first direct insight into the identification and functional characterization of known and novel salivary proteins of An. stephensi. Computational studies with online servers, namely the MASCOT and OMSSA algorithms, identified a total of 36 known salivary proteins and 123 novel proteins analysed by LC/MS/MS. This first report describes a baseline proteomic catalogue of 159 salivary proteins belonging to various categories of signal transduction, regulation of the blood coagulation cascade, and various immune and energy pathways of the An. stephensi sialotranscriptome by mass spectrometry. Our results may serve as a basis for assigning putative functional roles to these proteins in blood feeding, biting behavior, and other aspects of vector-parasite-host interactions relevant to parasite development in anopheline mosquitoes.
Tianxiao Jiang; Siddiqui, Hasan; Ray, Shruti; Asman, Priscella; Ozturk, Musa; Ince, Nuri F
2017-07-01
This paper presents a portable platform to collect and review behavioral data simultaneously with neurophysiological signals. The system comprises four parts: a sensor data acquisition interface, a socket server for real-time data streaming, a Simulink system for real-time processing, and an offline data review and analysis toolbox. A low-cost microcontroller is used to acquire data from external sensors such as an accelerometer and a hand dynamometer. The microcontroller transfers the data either directly through USB or wirelessly through a Bluetooth module to a data server written in C++ for the MS Windows OS. The data server also interfaces with a digital glove and captures HD video from a webcam. The acquired sensor data are streamed under the User Datagram Protocol (UDP) to other applications such as Simulink/Matlab for real-time analysis and recording. Neurophysiological signals such as electroencephalography (EEG), electrocorticography (ECoG) and local field potential (LFP) recordings can be collected simultaneously in Simulink and fused with the behavioral data. In addition, we developed customized Matlab graphical user interface (GUI) software to review, annotate and analyze the data offline. The software provides a fast, user-friendly data visualization environment with a synchronized video playback feature. The software is also capable of reviewing long-term neural recordings. Other featured functions such as fast preprocessing with multithreaded filters, annotation, montage selection, power spectral density (PSD) estimation, time-frequency maps and spatial spectral maps are also implemented.
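The streaming step described above sends acquired sensor samples over UDP to real-time consumers such as Simulink. The following is a minimal Python sketch of that pattern; the port, the packet layout (timestamp plus three accelerometer axes), and the sample values are assumptions, not the authors' exact wire format.

```python
# Minimal UDP sensor-streaming sketch: one sender, one receiver on localhost.
# The packet layout (double timestamp + three float accelerometer axes) is assumed.
import socket, struct, time

ADDR = ("127.0.0.1", 5005)          # hypothetical streaming endpoint

def send_sample(sock, ax, ay, az):
    packet = struct.pack("<dfff", time.time(), ax, ay, az)
    sock.sendto(packet, ADDR)

def receive_one(sock):
    data, _ = sock.recvfrom(64)
    ts, ax, ay, az = struct.unpack("<dfff", data)
    return ts, (ax, ay, az)

rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(ADDR)
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

send_sample(tx, 0.01, -0.02, 0.98)   # one fake accelerometer reading
print(receive_one(rx))
```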
Jiang, Xiaogang; Feng, Shun; Tian, Ruijun; Han, Guanghui; Jiang, Xinning; Ye, Mingliang; Zou, Hanfa
2007-02-01
An approach was developed to automate sample introduction for nanoflow LC-MS/MS (μLC-MS/MS) analysis using a strong cation exchange (SCX) trap column. The system consisted of a 100 μm i.d. × 2 cm SCX trap column and a 75 μm i.d. × 12 cm C18 RP analytical column. During the sample loading step, the flow passing through the SCX trap column was directed to waste, allowing a large volume of sample to be loaded at high flow rate. The peptides bound on the SCX trap column were then eluted onto the RP analytical column by a high-salt buffer, followed by RP chromatographic separation of the peptides at nanoliter flow rates. Higher separation performance was observed with the system using the SCX trap column than with the system using a C18 trap column. The high proteomic coverage of this approach was demonstrated in the analysis of tryptic digests of BSA and yeast cell lysate. In addition, this system was applied to two-dimensional separation of a tryptic digest of the human hepatocellular carcinoma cell line SMMC-7721 for large-scale proteome analysis. The system was fully automated and required minimal changes to a current μLC-MS/MS system, representing a promising platform for routine proteome analysis.
Chamkasem, Narong
2017-08-30
A simple high-throughput liquid chromatography/tandem mass spectrometry (LC-MS-MS) method was developed for the determination of maleic hydrazide, glyphosate, fosetyl aluminum, and ethephon in grapes using a reversed-phase column with weak anion-exchange and cation-exchange mixed mode. A 5 g test portion was shaken with 50 mM HOAc and 10 mM Na2EDTA in 1/3 (v/v) MeOH/H2O for 10 min. After centrifugation, the extract was passed through an Oasis HLB cartridge to retain suspended particulates and nonpolar interferences. The final solution was injected and directly analyzed in 17 min by LC-MS-MS. Two MS-MS transitions were monitored in the method for each target compound to achieve true positive identification. Four isotopically labeled internal standards corresponding to each analyte were used to correct for matrix suppression effects and/or instrument signal drift. The linearity of the detector response was demonstrated in the range from 10 to 1000 ng/mL for each analyte with a coefficient of determination (R2) of ≥0.995. The average recovery for all analytes at 100, 500, and 2000 ng/g (n = 5) ranged from 87 to 111%, with a relative standard deviation of less than 17%. The estimated LOQs for maleic hydrazide, glyphosate, fosetyl-Al, and ethephon were 38, 19, 29, and 34 ng/g, respectively.
Szabo, Zoltan; Thayer, James R; Agroskin, Yury; Lin, Shanhua; Liu, Yan; Srinivasan, Kannan; Saba, Julian; Viner, Rosa; Huhmer, Andreas; Rohrer, Jeff; Reusch, Dietmar; Harfouche, Rania; Khan, Shaheer H; Pohl, Christopher
2017-05-01
Characterization of glycans present on glycoproteins has become of increasing importance due to their biological implications, such as protein folding, immunogenicity, cell-cell adhesion, clearance, receptor interactions, etc. In this study, the resolving power of high-performance anion exchange chromatography with pulsed amperometric detection (HPAE-PAD) was applied to glycan separations and coupled to mass spectrometry to characterize native glycans released from different glycoproteins. A new, rapid workflow generates glycans from 200 μg of glycoprotein supporting reliable and reproducible annotation by mass spectrometry (MS). With the relatively high flow rate of HPAE-PAD, post-column splitting diverted 60% of the flow to a novel desalter, then to the mass spectrometer. The delay between PAD and MS detectors is consistent, and salt removal after the column supports MS. HPAE resolves sialylated (charged) glycans and their linkage and positional isomers very well; separations of neutral glycans are sufficient for highly reproducible glycoprofiling. Data-dependent MS2 in negative mode provides highly informative, mostly C- and Z-type glycosidic and cross-ring fragments, making software-assisted and manual annotation reliable. Fractionation of glycans followed by exoglycosidase digestion confirms MS-based annotations. Combining the isomer resolution of HPAE with MS2 permitted thorough N-glycan annotation and led to characterization of 17 new structures from glycoproteins with challenging glycan profiles.
Zheng, Yao-Rong; Stang, Peter J.
2009-01-01
The direct observation of dynamic ligand exchange between Pt-N coordination-driven self-assembled supramolecular polygons (triangles and rectangles) has been achieved using stable isotope labeling (1H/2D) of the pyridyl donors and electrospray ionization mass spectrometry (ESI-MS) together with NMR spectroscopy. Both the thermodynamic and kinetic aspects of such exchange processes have been established based on quantitative mass spectral results. Further investigation showed that the exchange is highly dependent on experimental conditions such as temperature, solvent, and the counter anions. PMID:19243144
Pan, Shiyang; Mu, Yuan; Wang, Hong; Wang, Tong; Huang, Peijun; Ma, Jianfeng; Jiang, Li; Zhang, Jie; Gu, Bing; Yi, Lujiang
2010-04-01
To meet the need to manage medical case information and biospecimens simultaneously, we developed a novel medical case information system integrated with biospecimen management. The database, built with MS SQL Server 2000, covered basic information, clinical diagnosis, imaging diagnosis, pathological diagnosis and clinical treatment of patients; physicochemical properties, inventory management and laboratory analysis of biospecimens; and user logs and data maintenance. The client application, developed with Visual C++ 6.0 and based on a Client/Server model, was used to implement medical case and biospecimen management. The system can perform input, browsing, querying and summarization of cases and related biospecimen information, and can automatically synthesize case records from the database. It supports not only long-term follow-up of individual cases but also management of grouped cases organized according to the aims of a research project. This system can improve the efficiency and quality of clinical research in which biospecimens are used in a coordinated way. It realizes integrated and dynamic management of medical cases and biospecimens, and may be considered a new management platform.
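At its core the data model links each medical case to its biospecimens in a one-to-many relationship. The sketch below illustrates that idea with SQLite from Python; the table and column names are hypothetical, and the original system used MS SQL Server 2000 with a Visual C++ client.

```python
# Hypothetical relational sketch of the case/biospecimen link (names invented).
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE medical_case (
    case_id       INTEGER PRIMARY KEY,
    patient_name  TEXT,
    diagnosis     TEXT
);
CREATE TABLE biospecimen (
    specimen_id   INTEGER PRIMARY KEY,
    case_id       INTEGER REFERENCES medical_case(case_id),
    specimen_type TEXT,
    storage_site  TEXT
);
""")
con.execute("INSERT INTO medical_case VALUES (1, 'Patient A', 'Hepatocellular carcinoma')")
con.executemany("INSERT INTO biospecimen VALUES (?, ?, ?, ?)",
                [(10, 1, 'serum', 'Freezer-2/Rack-4'),
                 (11, 1, 'tissue', 'Freezer-1/Rack-1')])

# Join cases with their specimens, as a case-plus-biospecimen summary would.
for row in con.execute("""SELECT c.case_id, c.diagnosis, b.specimen_type, b.storage_site
                          FROM medical_case c JOIN biospecimen b USING (case_id)"""):
    print(row)
```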
NASA Astrophysics Data System (ADS)
Szpunar, Joanna; McSheehy, Shona; Połeć, Kasia; Vacchina, Véronique; Mounicou, Sandra; Rodriguez, Isaac; Łobiński, Ryszard
2000-07-01
Recent advances in the coupling of gas chromatography (GC) and high performance liquid chromatography (HPLC) with inductively coupled plasma mass spectrometry (ICP MS) and their role in trace element speciation analysis of environmental materials are presented. The discussion is illustrated with three research examples concerning the following topics: (i) development and coupling of multicapillary microcolumn GC with ICP MS for speciation of organotin in sediment and biological tissue samples; (ii) speciation of arsenic in marine algae by size-exclusion-anion-exchange HPLC-ICP MS; and (iii) speciation of cadmium in plant cell cultures by size-exclusion HPLC-ICP MS. Particular attention is paid to the problem of signal identification in ICP MS chromatograms; the potential of electrospray MS/MS for this purpose is highlighted.
Acter, Thamina; Kim, Donghwi; Ahmed, Arif; Jin, Jang Mi; Yim, Un Hyuk; Shim, Won Joon; Kim, Young Hwan; Kim, Sunghwan
2016-05-01
This paper presents a detailed investigation of the feasibility of optimized positive and negative atmospheric pressure chemical ionization (APCI) mass spectrometry (MS) and atmospheric pressure photoionization (APPI) MS coupled to hydrogen-deuterium exchange (HDX) for structural assignment of diverse oxygen-containing compounds. The important parameters for optimization of HDX MS were characterized. The optimized techniques employed in the positive and negative modes showed satisfactory HDX product ions for the model compounds when dichloromethane and toluene were employed as a co-solvent in APCI- and APPI-HDX, respectively. The evaluation of the mass spectra obtained from 38 oxygen-containing compounds demonstrated that the extent of the HDX of the ions was structure-dependent. The combination of information provided by different ionization techniques could be used for better speciation of oxygen-containing compounds. For example, (+) APPI-HDX is sensitive to compounds with alcohol, ketone, or aldehyde substituents, while (-) APPI-HDX is sensitive to compounds with carboxylic functional groups. In addition, the compounds with alcohol can be distinguished from other compounds by the presence of exchanged peaks. The combined information was applied to study chemical compositions of degraded oils. The HDX pattern, double bond equivalent (DBE) distribution, and previously reported oxidation products were combined to predict structures of the compounds produced from oxidation of oil. Overall, this study shows that APCI- and APPI-HDX MS are useful experimental techniques that can be applied for the structural analysis of oxygen-containing compounds.
Implementation of a World Wide Web server for the oil and gas industry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blaylock, R.E.; Martin, F.D.; Emery, R.
1995-12-31
The Gas and Oil Technology Exchange and Communication Highway (GO-TECH) provides an electronic information system for the petroleum community for the purpose of exchanging ideas, data, and technology. The personal computer-based system fosters communication and discussion by linking oil and gas producers with resource centers, government agencies, consulting firms, service companies, national laboratories, academic research groups, and universities throughout the world. The oil and gas producers are provided access to the GO-TECH World Wide Web home page via modem links, as well as Internet. The future GO-TECH applications will include the establishment of "virtual corporations" consisting of consortiums of small companies, consultants, and service companies linked by electronic information systems. These virtual corporations will have the resources and expertise previously found only in major corporations.
Implementation of a World Wide Web server for the oil and gas industry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blaylock, R.E.; Martin, F.D.; Emery, R.
1996-10-01
The Gas and Oil Technology Exchange and Communication Highway (GO-TECH) provides an electronic information system for the petroleum community for exchanging ideas, data, and technology. The PC-based system fosters communication and discussion by linking the oil and gas producers with resource centers, government agencies, consulting firms, service companies, national laboratories, academic research groups, and universities throughout the world. The oil and gas producers can access the GO-TECH World Wide Web (WWW) home page through modem links, as well as through the Internet. Future GO-TECH applications will include the establishment of virtual corporations consisting of consortia of small companies, consultants, and service companies linked by electronic information systems. These virtual corporations will have the resources and expertise previously found only in major corporations.
The covert channel over HTTP protocol
NASA Astrophysics Data System (ADS)
Graniszewski, Waldemar; Krupski, Jacek; Szczypiorski, Krzysztof
2016-09-01
The paper presents a new steganographic method: the covert channel is created over the HTTP protocol header, i.e., the trailer field. HTTP is one of the most frequently used protocols on the Internet. The popularity of Web servers, and the volume of network traffic to and from them, is one of the prerequisites for undetectable message exchange. To study this kind of information hiding technique, an application was written in JavaScript based on the Node.js framework. The results of the experiment performed to send a message over the covert channel are also presented.
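For illustration, a chunked HTTP/1.1 request may carry a trailer field after the final chunk, which is where such a covert payload could be placed. The raw-socket sketch below shows the wire format only; the target host, path, and trailer name are hypothetical, and the original tool was written in JavaScript on Node.js rather than Python.

```python
# Sketch of a chunked HTTP/1.1 request whose trailer field carries a hidden value.
# Host, path, and the "X-Checksum" trailer name are hypothetical.
import socket

HOST, PORT = "example.org", 80
secret = "SGVsbG8="                 # covert payload to hide in the trailer

request = (
    "POST /upload HTTP/1.1\r\n"
    f"Host: {HOST}\r\n"
    "Transfer-Encoding: chunked\r\n"
    "Trailer: X-Checksum\r\n"       # announce the trailer field
    "Connection: close\r\n"
    "\r\n"
)
body = "5\r\nhello\r\n"             # one ordinary 5-byte chunk
# Terminating zero-length chunk followed by the trailer carrying the payload.
tail = f"0\r\nX-Checksum: {secret}\r\n\r\n"

with socket.create_connection((HOST, PORT)) as s:
    s.sendall((request + body + tail).encode("ascii"))
    print(s.recv(4096).decode(errors="replace"))
```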
Bumm, Klaus; Zheng, Mingzhong; Bailey, Clyde; Zhan, Fenghuang; Chiriva-Internati, M; Eddlemon, Paul; Terry, Julian; Barlogie, Bart; Shaughnessy, John D
2002-02-01
Clinical GeneOrganizer (CGO) is a novel Windows-based archiving, organization and data mining software package for the integration of gene expression profiling in clinical medicine. The program implements various user-friendly tools and extracts data for further statistical analysis. The software was written for Affymetrix GeneChip *.txt files, but can also be used for any other microarray-derived data. The MS-SQL Server version acts as a data mart and links microarray data with the clinical parameters of any other existing database, and therefore represents a valuable tool for combining gene expression analysis and clinical disease characteristics.
NASA Astrophysics Data System (ADS)
Huang, Richard Y.-C.; Iacob, Roxana E.; Krystek, Stanley R.; Jin, Mi; Wei, Hui; Tao, Li; Das, Tapan K.; Tymiak, Adrienne A.; Engen, John R.; Chen, Guodong
2017-05-01
Aggregation of protein therapeutics has long been a concern across different stages of manufacturing processes in the biopharmaceutical industry. It is often indicative of aberrant protein therapeutic higher-order structure. In this study, the aggregation propensity of a human Fc-fusion protein therapeutic was characterized. Hydrogen/deuterium exchange mass spectrometry (HDX-MS) was applied to examine the conformational dynamics of dimers collected from a bioreactor. HDX-MS data combined with spatial aggregation propensity calculations revealed a potential aggregation interface in the Fc domain. This study provides a general strategy for the characterization of the aggregation propensity of Fc-fusion proteins at the molecular level.
Risk Assessment of the Naval Postgraduate School Gigabit Network
2004-09-01
Management Server (1) • Ras Server (1) • Remedy Server (1) • Samba Server(2) • SQL Servers (3) • Web Servers (3) • WINS Server (1) • Library...Server Bob Sharp INCA Windows 2000 Advanced Server NPGS Landesk SQL 2000 Alan Pires eagle Microsoft Windows 2000 Advanced Server EWS NPGS Landesk...Advanced Server Special Projects NPGS SQL Alan Pires MC01BDB Microsoft Windows 2000 Advanced Server Special Projects NPGS SQL 2000 Alan Pires
The diverse and expanding role of mass spectrometry in structural and molecular biology.
Lössl, Philip; van de Waterbeemd, Michiel; Heck, Albert Jr
2016-12-15
The emergence of proteomics has led to major technological advances in mass spectrometry (MS). These advancements not only benefitted MS-based high-throughput proteomics but also increased the impact of mass spectrometry on the field of structural and molecular biology. Here, we review how state-of-the-art MS methods, including native MS, top-down protein sequencing, cross-linking-MS, and hydrogen-deuterium exchange-MS, nowadays enable the characterization of biomolecular structures, functions, and interactions. In particular, we focus on the role of mass spectrometry in integrated structural and molecular biology investigations of biological macromolecular complexes and cellular machineries, highlighting work on CRISPR-Cas systems and eukaryotic transcription complexes. © 2016 The Authors. Published under the terms of the CC BY NC ND 4.0 license.
Löck, Steffen; Roth, Klaus; Skripcak, Tomas; Worbs, Mario; Helmbrecht, Stephan; Jakobi, Annika; Just, Uwe; Krause, Mechthild; Baumann, Michael; Enghardt, Wolfgang; Lühr, Armin
2015-09-01
To guarantee equal access to optimal radiotherapy, a concept of patient assignment to photon or particle radiotherapy using remote treatment plan exchange and comparison - ReCompare - was proposed. We demonstrate the implementation of this concept and present its clinical applicability. The ReCompare concept was implemented using a client-server based software solution. A clinical workflow for the remote treatment plan exchange and comparison was defined. The steps required by the user and performed by the software for a complete plan transfer were described and an additional module for dose-response modeling was added. The ReCompare software was successfully tested in cooperation with three external partner clinics and worked meeting all required specifications. It was compatible with several standard treatment planning systems, ensured patient data protection, and integrated in the clinical workflow. The ReCompare software can be applied to support non-particle radiotherapy institutions with the patient-specific treatment decision on the optimal irradiation modality by remote treatment plan exchange and comparison. Copyright © 2015. Published by Elsevier GmbH.
X-Windows Information Sharing Protocol Widget Class
NASA Technical Reports Server (NTRS)
Barry, Matthew R.
2006-01-01
The X-Windows Information Sharing Protocol (ISP) Widget Class ("Class" is used here in the object-oriented-programming sense of the word) was devised to simplify the task of implementing ISP graphical-user-interface (GUI) computer programs. ISP programming tasks require many method calls to identify, query, and interpret the connections and messages exchanged between a client and an ISP server. Most X-Windows GUI programs use widget sets or toolkits to facilitate management of complex objects. The widget standards facilitate construction of toolkits and application programs. The X-Windows ISP Widget Class encapsulates the client side of the ISP programming libraries within the framework of an X-Windows widget. Using the widget framework, X-Windows GUI programs can interact with ISP services in an abstract way, in the same manner as with other graphical widgets, making it easier to write ISP GUI client programs. Wrapping ISP client services inside a widget framework enables a programmer to treat an ISP server interface as though it were a GUI. Moreover, an alternate subclass could implement another communication protocol in the same sort of widget.
Telemedicine with integrated data security in ATM-based networks
NASA Astrophysics Data System (ADS)
Thiel, Andreas; Bernarding, Johannes; Kurth, Ralf; Wenzel, Rudiger; Villringer, Arno; Tolxdorff, Thomas
1997-05-01
Telemedical services rely on the digital transfer of large amounts of data in a short time. The acceptance of these services therefore requires new hardware and software concepts. The fast exchange of data is well supported within a high-speed ATM-based network. Fast access to the data from different platforms poses more difficult problems, which may be divided into those relating to standardized data formats and those relating to different levels of data security across nations. For standardized access to the image data, a DICOM 3.0 server was implemented. Images were converted into the DICOM 3.0 standard if necessary. Access to the server is provided by an implementation of DICOM in JAVA, allowing access to the data from different platforms. Data protection measures to ensure the secure transfer of sensitive patient data are not yet covered within the DICOM concept. We investigated different schemes to protect data using the DICOM/JAVA modality with as little impact on data transfer speed as possible.
A Proposal of TLS Implementation for Cross Certification Model
NASA Astrophysics Data System (ADS)
Kaji, Tadashi; Fujishiro, Takahiro; Tezuka, Satoru
Today, TLS is widely used for achieving secure communication systems, and TLS uses a PKI for server authentication and/or client authentication. However, this PKI environment, which is called a “multiple trust anchors environment,” causes the problem that the verifier has to maintain a huge number of CA certificates in the ubiquitous network, because the increase in terminals connected to the network brings an increase in CAs. Most terminals in the ubiquitous network, however, will not have enough memory to hold such a huge number of CA certificates. Therefore, another PKI environment, the “cross certification environment,” is useful for the ubiquitous network. But because current TLS is designed for the multiple trust anchors model, TLS cannot work efficiently with the cross-certification model. This paper proposes a TLS implementation method to support the cross certification model efficiently. Our proposal reduces the size of the messages exchanged between the TLS client and the TLS server during the handshake process. Therefore, our proposal is suitable for implementing TLS in terminals that do not have enough computing power and memory in the ubiquitous network.
Feasibility of interactive biking exercise system for telemanagement in elderly.
Finkelstein, Joseph; Jeong, In Cheol
2013-01-01
Inexpensive cycling equipment is widely available for home exercise; however, its use is hampered by a lack of tools supporting real-time monitoring of cycling exercise in the elderly and coordination with a clinical care team. To address these barriers, we developed a low-cost mobile system aimed at facilitating safe and effective home-based cycling exercise. The system used a miniature wireless 3-axis accelerometer that transmitted the cycling acceleration data to a tablet PC integrated with a multi-component disease management system. An exercise dashboard was presented to the patient, allowing real-time graphical visualization of exercise progress. The system was programmed to alert patients when exercise intensity exceeded the levels recommended by their care providers and to exchange information with a central server. The feasibility of the system was assessed by testing the accuracy of cycling speed monitoring and the reliability of alerts generated by the system. Our results demonstrated high validity of the system both for upper and lower extremity exercise monitoring as well as reliable data transmission between the home unit and the central server.
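The alerting logic described above reduces to comparing the monitored cycling intensity against provider-set limits. A trivial sketch of that check follows; the threshold value, units, and sample data are hypothetical and not taken from the study.

```python
# Hypothetical intensity-alert check for home cycling telemanagement.
RECOMMENDED_MAX_RPM = 75          # provider-set cadence limit (assumed)

def check_cadence(samples_rpm):
    """Yield an alert message for every sample that exceeds the limit."""
    for i, rpm in enumerate(samples_rpm):
        if rpm > RECOMMENDED_MAX_RPM:
            yield f"sample {i}: cadence {rpm} rpm exceeds limit {RECOMMENDED_MAX_RPM} rpm"

for alert in check_cadence([60, 68, 74, 82, 79, 70]):
    print(alert)
```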
NASA's EOSDIS: options for data providers
NASA Astrophysics Data System (ADS)
Khalsa, Siri J.; Ujhazy, John E.
1995-12-01
EOSDIS, the data and information system being developed by NASA to support interdisciplinary earth science research into the 21st century, will do more than manage and distribute data from EOS-era satellites. It will also promote the exchange of data, tools, and research results across disciplinary, agency, and national boundaries. This paper describes the options that data providers will have for interacting with the EOSDIS Core System (ECS), the infrastructure of EOSDIS. The options include: using the ECS advertising service to announce the availability of data at the provider's site; submitting a candidate data set to one of the Distributed Active Archive Centers (DAACs); establishing a data server that will make the data accessible via ECS; and establishing a Local Information Manager (LIM), which would make the data available for multi-site searches. One additional option is custom gateway interfaces, which would provide access to existing data archives. The gateway, data server, and LIM options require the implementation of ECS code at the provider site to ensure proper protocols. The advertisement and ingest options require no part of the ECS design to reside at the provider site.
Seven perspectives on GPCR H/D-exchange proteomics methods
Zhang, Xi
2017-01-01
Recent research shows surging interest to visualize human G protein-coupled receptor (GPCR) dynamic structures using the bottom-up H/D-exchange (HDX) proteomics technology. This opinion article clarifies critical technical nuances and logical thinking behind the GPCR HDX proteomics method, to help scientists overcome cross-discipline pitfalls, and understand and reproduce the protocol at high quality. The 2010 89% HDX structural coverage of GPCR was achieved with both structural and analytical rigor. This article emphasizes systematically considering membrane protein structure stability and compatibility with chromatography and mass spectrometry (MS) throughout the pipeline, including the effects of metal ions, zero-detergent shock, and freeze-thaws on HDX result rigor. This article proposes to view bottom-up HDX as two steps to guide choices of detergent buffers and chromatography settings: (I) protein HDX labeling in native buffers, and (II) peptide-centric analysis of HDX labels, which applies (a) bottom-up MS/MS to construct peptide matrix and (b) HDX MS to locate and quantify H/D labels. The detergent-low-TCEP digestion method demystified the challenge of HDX-grade GPCR digestion. GPCR HDX proteomics is a structural approach, thus its choice of experimental conditions should let structure lead and digestion follow, not the opposite. PMID:28529698
Isotopic Exchange HPLC-HRMS/MS Applied to Cyclic Proanthocyanidins in Wine and Cranberries
NASA Astrophysics Data System (ADS)
Longo, Edoardo; Rossetti, Fabrizio; Scampicchio, Matteo; Boselli, Emanuele
2018-01-01
Cyclic B-type proanthocyanidins in red wines and grapes have been discovered recently. However, proanthocyanidins of a different chemical structure (non-cyclic A-type proanthocyanidins), already known to be present in cranberries and wine, possess an identical theoretical mass. In fact, the retention times and the MS/MS fragmentations found for the proposed novel cyclic B-type tetrameric proanthocyanidin in red wine and the known tetrameric proanthocyanidin in a cranberry extract are shown here to be identical. Thus, hydrogen/deuterium (H/D) exchange was applied to HPLC-HRMS/MS to confirm the actual chemical structure of the new oligomeric proanthocyanidins. The comparison of the results in water and deuterium oxide and between wine and cranberry extract indicates that the cyclic B-type tetrameric proanthocyanidin is the actual constituent of the recently proposed novel tetrameric species ([C60H49O24]+, m/z 1153.2608). Surprisingly, the same compound was also identified as the main tetrameric proanthocyanidin in cranberries. Finally, a totally new cyclic B-type hexameric proanthocyanidin ([C90H73O36]+, m/z 1729.3876) belonging to this novel class was identified for the first time in red wine.
Key Management Scheme Based on Route Planning of Mobile Sink in Wireless Sensor Networks.
Zhang, Ying; Liang, Jixing; Zheng, Bingxin; Jiang, Shengming; Chen, Wei
2016-01-29
In many wireless sensor network application scenarios, the key management scheme with a Mobile Sink (MS) should be fully investigated. This paper proposes a key management scheme based on dynamic clustering and optimal route choice for the MS. The concept of the Traveling Salesman Problem with Neighbor areas (TSPN) is applied to dynamic clustering for data exchange, and selection probability is used in MS route planning. The proposed scheme extends static key management to dynamic key management by considering the dynamic clustering and mobility of MSs, which can effectively balance the total energy consumption of network activities. Considering the different resources available to the member nodes and the sink node, the session key between a cluster head and the MS is established by a modified ECC encryption with Diffie-Hellman key exchange (ECDH) algorithm, and the session key between a member node and its cluster head is built with a binary symmetric polynomial. By analyzing the security of data storage, data transfer and the mechanism of dynamic key management, the proposed scheme is shown to improve the resilience of the network's key management system while satisfying higher connectivity and storage efficiency.
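The cluster-head-to-MS session key in this scheme rests on an ECDH exchange. Below is a minimal sketch using the Python cryptography package; the curve choice and HKDF parameters are assumptions, and the paper's full scheme additionally layers ECC encryption, authentication, and the polynomial-based member-node keys on top of this exchange.

```python
# Minimal ECDH session-key sketch (not the paper's full scheme).
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each party generates an ephemeral EC key pair (curve choice is assumed).
cluster_head_priv = ec.generate_private_key(ec.SECP256R1())
mobile_sink_priv = ec.generate_private_key(ec.SECP256R1())

# Each side combines its own private key with the peer's public key.
shared_ch = cluster_head_priv.exchange(ec.ECDH(), mobile_sink_priv.public_key())
shared_ms = mobile_sink_priv.exchange(ec.ECDH(), cluster_head_priv.public_key())
assert shared_ch == shared_ms          # both sides derive the same secret

# Derive a fixed-length session key from the shared secret.
session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                   info=b"ch-ms session").derive(shared_ch)
print(session_key.hex())
```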
phpMs: A PHP-Based Mass Spectrometry Utilities Library.
Collins, Andrew; Jones, Andrew R
2018-03-02
The recent establishment of cloud computing, high-throughput networking, and more versatile web standards and browsers has led to a renewed interest in web-based applications. While big data has traditionally been the domain of optimized desktop and server applications, it is now possible to store vast amounts of data and perform the necessary calculations offsite with cloud storage and computing providers, with the results visualized in a high-quality cross-platform interface via a web browser. There are a number of emerging platforms for cloud-based mass spectrometry data analysis; however, there is limited pre-existing code accessible to web developers, especially for those constrained to a shared hosting environment where Java and C applications are often forbidden by the hosting provider. To remedy this, we provide an open-source mass spectrometry library for one of the most commonly used web development languages, PHP. Our new library, phpMs, provides objects for storing and manipulating spectra and identification data as well as utilities for file reading, file writing, calculations, peptide fragmentation, and protein digestion, together with a software interface for controlling search engines. We provide a working demonstration of some of the capabilities at http://pgb.liv.ac.uk/phpMs .
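To illustrate the kind of utilities such a library provides (protein digestion and mass calculations), here is a generic Python sketch of tryptic digestion and monoisotopic [M+H]+ calculation. It is not the phpMs API, whose class and method names differ, and the example protein sequence is arbitrary.

```python
# Generic peptide utilities sketch (illustrative only; not the phpMs API).
import re

# Monoisotopic residue masses (Da); I and L share a mass.
RESIDUE = {'G': 57.02146, 'A': 71.03711, 'S': 87.03203, 'P': 97.05276,
           'V': 99.06841, 'T': 101.04768, 'C': 103.00919, 'L': 113.08406,
           'I': 113.08406, 'N': 114.04293, 'D': 115.02694, 'Q': 128.05858,
           'K': 128.09496, 'E': 129.04259, 'M': 131.04049, 'H': 137.05891,
           'F': 147.06841, 'R': 156.10111, 'Y': 163.06333, 'W': 186.07931}
WATER, PROTON = 18.010565, 1.007276

def tryptic_digest(sequence: str):
    """Cleave C-terminal to K/R, except when followed by P (no missed cleavages)."""
    return re.findall(r'.*?[KR](?!P)|.+?$', sequence)

def mz(peptide: str, charge: int = 1) -> float:
    """Monoisotopic m/z of a peptide for a given positive charge state."""
    mass = sum(RESIDUE[aa] for aa in peptide) + WATER
    return (mass + charge * PROTON) / charge

for pep in tryptic_digest("MKWVTFISLLLLFSSAYSRGVFRR"):
    print(f"{pep:>20s}  [M+H]+ = {mz(pep):10.4f}")
```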
Bishop, Michael Jason; Crow, Brian S; Kovalcik, Kasey D; George, Joe; Bralley, James A
2007-04-01
A rapid and accurate quantitative method was developed and validated for the analysis of four urinary organic acids with nitrogen-containing functional groups, formiminoglutamic acid (FIGLU), pyroglutamic acid (PYRGLU), 5-hydroxyindoleacetic acid (5-HIAA), and 2-methylhippuric acid (2-METHIP), by liquid chromatography tandem mass spectrometry (LC/MS/MS). The chromatography was developed using a weak anion-exchange amino column that provided mixed-mode retention of the analytes. The elution gradient relied on changes in mobile phase pH over a concave gradient, without the use of counter-ions or concentrated salt buffers. A simple sample preparation was used, requiring only the dilution of urine prior to instrumental analysis. The method was validated based on linearity (r2 ≥ 0.995), accuracy (85-115%), precision (C.V. < 12%), sample preparation stability (
Integrating sequence and structural biology with DAS
Prlić, Andreas; Down, Thomas A; Kulesha, Eugene; Finn, Robert D; Kähäri, Andreas; Hubbard, Tim JP
2007-01-01
Background The Distributed Annotation System (DAS) is a network protocol for exchanging biological data. It is frequently used to share annotations of genomes and protein sequences. Results Here we present several extensions to the current DAS 1.5 protocol. These provide new commands to share alignments and three-dimensional molecular structure data, add the possibility of registration and discovery of DAS servers, and provide a convention for supplying different types of data plots. We present examples of web sites and applications that use the new extensions. We operate a public registry of DAS sources, which now includes entries for more than 250 distinct sources. Conclusion Our DAS extensions are essential for the management of the growing number of services and the exchange of diverse biological data sets. In addition, the extensions allow new types of applications to be developed and scientific questions to be addressed. The registry of DAS sources is available at PMID:17850653
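In practice a DAS client retrieves annotations through simple HTTP GET commands that return XML documents. The sketch below is a hedged illustration of fetching a features document; the server URL and data-source name are invented, and the exact request and response layout is defined by the DAS 1.5 specification and the extensions described above rather than by this sketch.

```python
# Hedged sketch of a DAS 1.5-style "features" request (URL and source invented).
import urllib.request
import xml.etree.ElementTree as ET

BASE = "http://das.example.org/das"          # hypothetical DAS server
SOURCE = "example_proteins"                  # hypothetical data-source name

url = f"{BASE}/{SOURCE}/features?segment=P12345:1,200"
with urllib.request.urlopen(url) as resp:    # expected to return a DASGFF XML document
    doc = ET.parse(resp)

# List annotation features and their coordinates from the XML response.
for feature in doc.iter("FEATURE"):
    start = feature.findtext("START")
    end = feature.findtext("END")
    print(feature.get("label"), start, end)
```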
NASA Astrophysics Data System (ADS)
Marshall, Stuart; Thaler, Jon; Schalk, Terry; Huffer, Michael
2006-06-01
The LSST Camera Control System (CCS) will manage the activities of the various camera subsystems and coordinate those activities with the LSST Observatory Control System (OCS). The CCS comprises a set of modules (nominally implemented in software) which are each responsible for managing one camera subsystem. Generally, a control module will be a long lived "server" process running on an embedded computer in the subsystem. Multiple control modules may run on a single computer or a module may be implemented in "firmware" on a subsystem. In any case control modules must exchange messages and status data with a master control module (MCM). The main features of this approach are: (1) control is distributed to the local subsystem level; (2) the systems follow a "Master/Slave" strategy; (3) coordination will be achieved by the exchange of messages through the interfaces between the CCS and its subsystems. The interface between the camera data acquisition system and its downstream clients is also presented.
Three-party authenticated key agreements for optimal communication
Lee, Tian-Fu; Hwang, Tzonelih
2017-01-01
Authenticated key agreements enable users to determine session keys, and to securely communicate with others over an insecure channel via the session keys. This study investigates the lower bounds on communications for three-party authenticated key agreements and considers whether or not the sub-keys for generating a session key can be revealed in the channel. Since two clients do not share any common secret key, they require the help of the server to authenticate their identities and exchange confidential and authenticated information over insecure networks. However, if the session key security is based on asymmetric cryptosystems, then revealing the sub-keys cannot compromise the session key. The clients can directly exchange the sub-keys and reduce the transmissions. In addition, authenticated key agreements were developed by using the derived results of the lower bounds on communications. Compared with related approaches, the proposed protocols had fewer transmissions and realized the lower bounds on communications. PMID:28355253
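The point that sub-keys may safely be revealed when the session key rests on an asymmetric primitive can be illustrated with a deliberately simplified Diffie-Hellman exchange; this toy (with insecure, made-up parameters) is not the authors' protocol, and the server's authentication role is omitted.

    import secrets

    # Small demonstration parameters only; real deployments use vetted groups.
    P = 0xFFFFFFFB   # a 32-bit prime modulus (placeholder)
    G = 5            # generator (placeholder)

    a = secrets.randbelow(P - 2) + 1      # client A's private value
    b = secrets.randbelow(P - 2) + 1      # client B's private value
    sub_key_a = pow(G, a, P)              # public sub-key, revealed on the channel
    sub_key_b = pow(G, b, P)              # public sub-key, revealed on the channel

    # Each client combines the other's public sub-key with its own private value,
    # so eavesdropping on the sub-keys alone does not yield the session key.
    session_key_a = pow(sub_key_b, a, P)
    session_key_b = pow(sub_key_a, b, P)
    assert session_key_a == session_key_b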
Brusniak, Mi-Youn; Bodenmiller, Bernd; Campbell, David; Cooke, Kelly; Eddes, James; Garbutt, Andrew; Lau, Hollis; Letarte, Simon; Mueller, Lukas N; Sharma, Vagisha; Vitek, Olga; Zhang, Ning; Aebersold, Ruedi; Watts, Julian D
2008-01-01
Background Quantitative proteomics holds great promise for identifying proteins that are differentially abundant between populations representing different physiological or disease states. A range of computational tools is now available for both isotopically labeled and label-free liquid chromatography mass spectrometry (LC-MS) based quantitative proteomics. However, they are generally not comparable to each other in terms of functionality, user interfaces, information input/output, and do not readily facilitate appropriate statistical data analysis. These limitations, along with the array of choices, present a daunting prospect for biologists, and other researchers not trained in bioinformatics, who wish to use LC-MS-based quantitative proteomics. Results We have developed Corra, a computational framework and tools for discovery-based LC-MS proteomics. Corra extends and adapts existing algorithms used for LC-MS-based proteomics, and statistical algorithms, originally developed for microarray data analyses, appropriate for LC-MS data analysis. Corra also adapts software engineering technologies (e.g. Google Web Toolkit, distributed processing) so that computationally intense data processing and statistical analyses can run on a remote server, while the user controls and manages the process from their own computer via a simple web interface. Corra also allows the user to output significantly differentially abundant LC-MS-detected peptide features in a form compatible with subsequent sequence identification via tandem mass spectrometry (MS/MS). We present two case studies to illustrate the application of Corra to commonly performed LC-MS-based biological workflows: a pilot biomarker discovery study of glycoproteins isolated from human plasma samples relevant to type 2 diabetes, and a study in yeast to identify in vivo targets of the protein kinase Ark1 via phosphopeptide profiling. Conclusion The Corra computational framework leverages computational innovation to enable biologists or other researchers to process, analyze and visualize LC-MS data with what would otherwise be a complex and not user-friendly suite of tools. Corra enables appropriate statistical analyses, with controlled false-discovery rates, ultimately to inform subsequent targeted identification of differentially abundant peptides by MS/MS. For the user not trained in bioinformatics, Corra represents a complete, customizable, free and open source computational platform enabling LC-MS-based proteomic workflows, and as such, addresses an unmet need in the LC-MS proteomics field. PMID:19087345
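Corra's statistics are implemented in R on the server side; purely as an illustration of the kind of false-discovery-rate control it applies to per-feature p-values, here is a generic Benjamini-Hochberg procedure in Python (not Corra's code).

    def benjamini_hochberg(pvalues, alpha=0.05):
        """Return indices of LC-MS features called significant at FDR <= alpha."""
        m = len(pvalues)
        order = sorted(range(m), key=lambda i: pvalues[i])
        last_passing_rank = 0
        for rank, idx in enumerate(order, start=1):
            if pvalues[idx] <= alpha * rank / m:
                last_passing_rank = rank
        return sorted(order[:last_passing_rank])

    p = [0.001, 0.009, 0.04, 0.20, 0.65]
    print(benjamini_hochberg(p))   # indices of features passing the FDR threshold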
Sinz, Andrea
2018-05-28
Structural mass spectrometry (MS) is gaining increasing importance for deriving valuable three-dimensional structural information on proteins and protein complexes, and it complements existing techniques, such as NMR spectroscopy and X-ray crystallography. Structural MS unites different MS-based techniques, such as hydrogen/deuterium exchange, native MS, ion-mobility MS, protein footprinting, and chemical cross-linking/MS, and it allows fundamental questions in structural biology to be addressed. In this Minireview, I will focus on the cross-linking/MS strategy. This method not only delivers tertiary structural information on proteins, but is also increasingly being used to decipher protein interaction networks, both in vitro and in vivo. Cross-linking/MS is currently one of the most promising MS-based approaches to derive structural information on very large and transient protein assemblies and intrinsically disordered proteins. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Exchange stiffness in thin film Co alloys
NASA Astrophysics Data System (ADS)
Eyrich, C.; Huttema, W.; Arora, M.; Montoya, E.; Rashidi, F.; Burrowes, C.; Kardasz, B.; Girt, E.; Heinrich, B.; Mryasov, O. N.; From, M.; Karis, O.
2012-04-01
The exchange stiffness (Aex) is one of the key parameters controlling magnetization reversal in magnetic materials. We used a method based on the spin spiral formation in two ferromagnetic films antiferromagnetically coupled across a non-magnetic spacer layer, together with Brillouin scattering, to measure Aex for a series of Co1-δXδ (X = Cr, Ni, Ru, Pd, Pt) thin film alloys. The results show that Aex of Co alloys does not necessarily scale with Ms; Aex decreases at approximately 1.1%, 1.5%, 2.1%, 3.5%, and 5.6%, while Ms decreases at approximately 1.1%, 0.5%, 1.1%, 3.7%, and 2.5%, per addition of 1 at % of Pt, Ni, Pd, Cr, and Ru, respectively.
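As a worked illustration of these rates (assuming, as a simplification, that the quoted per-at.% reductions extrapolate linearly over small dilutions δ, expressed in at.%):

    A_{\mathrm{ex}}(\delta) \approx A_{\mathrm{ex}}^{\mathrm{Co}}\,(1 - r_{A}\,\delta),
    \qquad
    M_{s}(\delta) \approx M_{s}^{\mathrm{Co}}\,(1 - r_{M}\,\delta)

so a hypothetical Co95Ru5 film would be expected to retain roughly 1 - 0.056 × 5 ≈ 72% of the exchange stiffness of pure Co but about 1 - 0.025 × 5 ≈ 87% of its saturation magnetization.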
NASA Astrophysics Data System (ADS)
Valeja, Santosh G.; Emmett, Mark R.; Marshall, Alan G.
2012-04-01
Hydrogen/deuterium exchange monitored by mass spectrometry is an important non-perturbing tool to study protein structure and protein-protein interactions. However, water in the reversed-phase liquid chromatography mobile phase leads to back-exchange of D for H during chromatographic separation of proteolytic peptides following H/D exchange, resulting in incorrect identification of fast-exchanging hydrogens as unexchanged hydrogens. Previously, fast high-performance liquid chromatography (HPLC) and supercritical fluid chromatography have been shown to decrease back-exchange. Here, we show that replacement of up to 40% of the water in the LC mobile phase by the modifiers, dimethylformamide (DMF) and N-methylpyrrolidone (NMP) (i.e., polar organic modifiers that lack rapid exchanging hydrogens), significantly reduces back-exchange. On-line LC micro-ESI FT-ICR MS resolves overlapped proteolytic peptide isotopic distributions, allowing for quantitative determination of the extent of back-exchange. The DMF modified solvent composition also improves chromatographic separation while reducing back-exchange relative to conventional solvent.
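For reference, the back-exchange-corrected deuterium level that such measurements yield is commonly computed as follows (standard HDX-MS practice; the exact correction used by the authors is not stated in the abstract):

    D_{\mathrm{corr}} = N \cdot \frac{m_{t} - m_{0\%}}{m_{100\%} - m_{0\%}}

where m_t is the measured centroid mass of a peptide after exchange time t, m_0% and m_100% are the centroids of the undeuterated and fully deuterated controls, and N is the number of exchangeable backbone amides. The back-exchange itself is then estimated from the fully deuterated control as 1 - (m_100% - m_0%) / (N · 1.00628 Da) for labeling in essentially pure D2O.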
Family Child Care Peer-to-Peer Exchange: A Policy Briefing, December 10-12, 1998.
ERIC Educational Resources Information Center
Haack, Peggy
In December 1998, the Ms. Foundation for Women brought together four grantee organizations for an exchange about their work, struggles, accomplishments, and visions for a stronger U.S. family child care system for the 21st century. Four interrelated areas were addressed: (1) family child care, welfare reform, and the rise of public funding for…
ERIC Educational Resources Information Center
van der Meer, Larah; Sutherland, Dean; O'Reilly, Mark F.; Lancioni, Giulio E.; Sigafoos, Jeff
2012-01-01
We compared acquisition of, and preference for, manual signing (MS), picture exchange (PE), and speech-generating devices (SGDs) in four children with autism spectrum disorders (ASD). Intervention was introduced across participants in a non-concurrent multiple-baseline design and acquisition of the three communication modes was compared in an…
ERIC Educational Resources Information Center
Heinson, C. D.; Williams, J. M.; Tinnerman, W. N.; Malloy, T. B.
2005-01-01
The role of ethanol O-d in nullifying the deuterolysis may be demonstrated by determining that transesterification of methyl acetoacetate of the ethyl ester occurs as well as deuterium exchange of the five acetoacetate hydrogens. The significant acidity of the methylene protons in the acetoacetate group, the efficacy of base catalysis, the role of…
Boles, Tammy H; Wells, Martha J M
2016-12-01
Amphetamine and methamphetamine are emerging contaminants-those for which no regulations currently require monitoring or public reporting of their presence in our water supply. In this research, a protocol for weak cation-exchange (WCX) SPE coupled with LC-MS/MS was developed for determination of emerging contaminants amphetamine and methamphetamine in a complex wastewater matrix. Gradient LC parameters were adjusted to yield baseline separation of methamphetamine from other contaminants. Methamphetamine-D5 was used as the internal standard (IS) to compensate for sample loss during SPE and for signal loss during MS (matrix effects). Recoveries were 102.1 ± 7.9% and 99.4 ± 4.0% for amphetamine and methamphetamine, respectively, using WCX sorbent. Notably, methamphetamine was determined to be present in wastewater influent at each sampling date tested. Amphetamine was present in wastewater influent on two of four sampling dates. Amphetamine concentrations ranged from undetectable to 86.4 ng/L in influent, but it was undetectable in wastewater effluent. Methamphetamine was detected in influent at concentrations ranging from 27.0-60.3 ng/L. Methamphetamine concentration was reduced but incompletely removed at this facility. Although absent in one post-UV effluent sample, concentrations of methamphetamine ranged from 10.8-14.8 ng/L. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Partial cooperative unfolding in proteins as observed by hydrogen exchange mass spectrometry
Engen, John R.; Wales, Thomas E.; Chen, Shugui; Marzluff, Elaine M.; Hassell, Kerry M.; Weis, David D.; Smithgall, Thomas E.
2013-01-01
Many proteins do not exist in a single rigid conformation. Protein motions, or dynamics, exist and in many cases are important for protein function. The analysis of protein dynamics relies on biophysical techniques that can distinguish simultaneously existing populations of molecules and their rates of interconversion. Hydrogen exchange (HX) detected by mass spectrometry (MS) is contributing to our understanding of protein motions by revealing unfolding and dynamics on a wide timescale, ranging from seconds to hours to days. In this review we discuss HX MS-based analyses of protein dynamics, using our studies of multi-domain kinases as examples. Using HX MS, we have successfully probed protein dynamics and unfolding in the isolated SH3, SH2 and kinase domains of the c-Src and Abl kinase families, as well as the role of inter- and intra-molecular interactions in the global control of kinase function. Coupled with high-resolution structural information, HX MS has proved to be a powerful and versatile tool for the analysis of the conformational dynamics in these kinase systems, and has provided fresh insight regarding the regulatory control of these important signaling proteins. HX MS studies of dynamics are applicable not only to the proteins we illustrate here, but to a very wide range of proteins and protein systems, and should play a role in both classification of and greater understanding of the prevalence of protein motion. PMID:23682200
Agarande, M; Benzoubir, S; Bouisset, P; Calmet, D
2001-08-01
Trace levels (pg kg(-1)) of 241Am in sediments were determined by isotope dilution high resolution inductively coupled plasma mass spectrometry (ID HR ICP-MS) using a microconcentric nebulizer. 241Am was isolated from major elements such as Ca and Fe by different selective precipitations. In further steps, Am was first separated from other transuranic elements and then purified by anion exchange and extraction chromatography prior to the mass spectrometric measurements. The ID HR ICP-MS results are compared with isotope dilution alpha spectrometry.
A new database sub-system for grain-size analysis
NASA Astrophysics Data System (ADS)
Suckow, Axel
2013-04-01
Detailed grain-size analyses of large depth profiles for palaeoclimate studies create large amounts of data. For instance, Novothny et al. (2011) presented a depth profile of grain-size analyses with 2 cm resolution and a total depth of more than 15 m, where each sample was measured with 5 repetitions on a Beckman Coulter LS13320 with 116 channels. This adds up to a total of more than four million numbers. Such amounts of data are not easily post-processed by spreadsheets or standard software; MS Access databases would also face serious performance problems. The poster describes a database sub-system dedicated to grain-size analyses. It expands the LabData database and laboratory management system published by Suckow and Dumke (2001). Compatibility with this very flexible database system makes it easy to import the grain-size data and provides the overall infrastructure for storing geographic context and organizing content, for example by grouping several samples into one set or project. It also allows easy export and direct plot generation of final data in MS Excel. The sub-system allows automated import of raw data from the Beckman Coulter LS13320 Laser Diffraction Particle Size Analyzer. During post-processing, MS Excel is used as a data display, but no number crunching is implemented in Excel. Raw grain-size spectra can be exported and checked as number-, surface- and volume-fractions, and single spectra can be locked for further post-processing. From the spectra, the usual statistical values (e.g. mean, median) can be computed, as well as fractions larger than a grain size, fractions smaller than a grain size, fractions between any two grain sizes, or any ratio of such values. These deduced values can be easily exported into Excel for one or more depth profiles. Such reprocessing of large amounts of data also opens up new display possibilities: depth profiles of grain-size data are normally displayed only with summarized parameters such as clay content or sand content, which show only part of the available information at each depth, or alternatively as full spectra at a single depth. The new software can display the whole grain-size spectrum at every depth in a three-dimensional plot. LabData and the grain-size sub-system are based on MS Access as the front-end and MS SQL Server as the back-end database system. The SQL code for the data model, the SQL Server procedures and triggers, and the MS Access Basic code for the front end are published under the GNU GPL license agreement and are available free of charge. References: Novothny, Á., Frechen, M., Horváth, E., Wacha, L., Rolf, C., 2011. Investigating the penultimate and last glacial cycles of the Süttő loess section (Hungary) using luminescence dating, high-resolution grain size, and magnetic susceptibility data. Quaternary International 234, 75-85. Suckow, A., Dumke, I., 2001. A database system for geochemical, isotope hydrological and geochronological laboratories. Radiocarbon 43, 325-337.
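As a small sketch of the "deduced values" described above, the following computes a fraction finer than a grain-size limit, a ratio of two fractions, and a volume-weighted geometric mean from one binned volume-percent spectrum; the bin edges and percentages are made-up numbers, not LS13320 output, and the helper names are ours.

    import math

    bin_edges_um = [0.04, 2.0, 63.0, 2000.0]   # clay / silt / sand boundaries (µm)
    volume_pct   = [12.0, 55.0, 33.0]          # volume % in each bin

    def fraction_finer_than(limit_um):
        """Volume % finer than a limit that coincides with a bin edge."""
        total = 0.0
        for (lo, hi), pct in zip(zip(bin_edges_um, bin_edges_um[1:]), volume_pct):
            if hi <= limit_um:
                total += pct
        return total

    clay = fraction_finer_than(2.0)
    silt_plus_clay = fraction_finer_than(63.0)
    clay_silt_ratio = clay / (silt_plus_clay - clay)

    # Volume-weighted geometric mean grain size from the bin mid-points.
    mids = [math.sqrt(lo * hi) for lo, hi in zip(bin_edges_um, bin_edges_um[1:])]
    mean_um = math.exp(sum(p * math.log(m) for p, m in zip(volume_pct, mids))
                       / sum(volume_pct))
    print(clay, silt_plus_clay, round(clay_silt_ratio, 2), round(mean_um, 1))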
Workflow based framework for life science informatics.
Tiwari, Abhishek; Sekhar, Arvind K T
2007-10-01
Workflow technology is a generic mechanism to integrate diverse types of available resources (databases, servers, software applications and different services) and thereby facilitate knowledge exchange within traditionally divergent fields such as molecular biology, clinical research, computational science, physics, chemistry and statistics. Researchers can easily incorporate and access diverse, distributed tools and data to develop their own research protocols for scientific analysis. Application of workflow technology has been reported in areas such as drug discovery, genomics, large-scale gene expression analysis, proteomics, and systems biology. In this article, we discuss the existing workflow systems and the trends in applications of workflow-based systems.
2015-01-01
Solid state amide hydrogen/deuterium exchange with mass spectrometric analysis (ssHDX-MS) was used to assess the conformation of myoglobin (Mb) in lyophilized formulations, and the results correlated with the extent of aggregation during storage. Mb was colyophilized with sucrose (1:1 or 1:8 w/w), mannitol (1:1 w/w), or NaCl (1:1 w/w) or in the absence of excipients. Immediately after lyophilization, samples of each formulation were analyzed by ssHDX-MS and Fourier transform infrared spectroscopy (FTIR) to assess Mb conformation, and by dynamic light scattering (DLS) and size exclusion chromatography (SEC) to determine the extent of aggregation. The remaining samples were then placed on stability at 25 °C and 60% RH or 40 °C and 75% RH for up to 1 year, withdrawn at intervals, and analyzed for aggregate content by SEC and DLS. In ssHDX-MS of samples immediately after lyophilization (t = 0), Mb was less deuterated in solids containing sucrose (1:1 and 1:8 w/w) than in those containing mannitol (1:1 w/w), NaCl (1:1 w/w), or Mb alone. Deuterium uptake kinetics and peptide mass envelopes also indicated greater Mb structural perturbation in mannitol, NaCl, or Mb-alone samples at t = 0. The extent of deuterium incorporation and kinetic parameters related to rapidly and slowly exchanging amide pools (Nfast, Nslow), measured at t = 0, were highly correlated with the extent of aggregation on storage as measured by SEC. In contrast, the extent of aggregation was weakly correlated with FTIR band intensity and peak position measured at t = 0. The results support the use of ssHDX-MS as a formulation screening tool in developing lyophilized protein drug products. PMID:24816133
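The deuterium-uptake kinetics and the Nfast/Nslow pools mentioned above are typically obtained by fitting the uptake curve to a biexponential model of the following form (a common parameterization in ssHDX-MS work; the exact model used in this study is assumed here):

    D(t) = N_{\mathrm{fast}}\left(1 - e^{-k_{\mathrm{fast}} t}\right)
         + N_{\mathrm{slow}}\left(1 - e^{-k_{\mathrm{slow}} t}\right)

where Nfast and Nslow are the sizes of the rapidly and slowly exchanging amide pools and kfast, kslow are their apparent exchange rate constants.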
An integrated gateway for various PHDs in U-healthcare environments.
Park, KeeHyun; Pak, JuGeon
2012-01-01
We propose an integrated gateway for various personal health devices (PHDs). This gateway receives measurements from various PHDs and conveys them to a remote monitoring server (MS). It provides two kinds of transmission modes: immediate transmission and integrated transmission. The former mode operates if a measurement exceeds a predetermined threshold or in the case of an emergency. In the latter mode, the gateway retains the measurements instead of forwarding them. When the reporting time comes, the gateway extracts all the stored measurements, integrates them into one message, and transmits the integrated message to the MS. Through this mechanism, the transmission overhead can be reduced. On the basis of the proposed gateway, we construct a u-healthcare system comprising an activity monitor, a medication dispenser, and a pulse oximeter. The evaluation results show that the size of separate messages from various PHDs is reduced through the integration process, and the process does not require much time; the integration time is negligible.
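A simplified sketch of the gateway's two transmission modes is given below; the class, field names and JSON message layout are hypothetical, chosen only to make the immediate-versus-integrated logic concrete.

    import json

    class PHDGateway:
        def __init__(self, thresholds, send):
            self.thresholds = thresholds   # e.g. {"pulse_rate": 120}
            self.send = send               # callable that forwards a message to the MS
            self.buffer = []

        def on_measurement(self, device, value, emergency=False):
            limit = self.thresholds.get(device)
            if emergency or (limit is not None and value > limit):
                # Immediate transmission: threshold exceeded or emergency flagged.
                self.send(json.dumps({"mode": "immediate", "device": device, "value": value}))
            else:
                # Otherwise retain the measurement until the reporting time.
                self.buffer.append({"device": device, "value": value})

        def on_reporting_time(self):
            if self.buffer:
                # Integrated transmission: all retained readings in one message.
                self.send(json.dumps({"mode": "integrated", "measurements": self.buffer}))
                self.buffer = []

    gw = PHDGateway({"pulse_rate": 120}, send=print)
    gw.on_measurement("activity_monitor", 4200)   # buffered
    gw.on_measurement("pulse_rate", 135)          # exceeds threshold, sent at once
    gw.on_reporting_time()                        # buffered readings sent as one message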
MassTRIX: mass translator into pathways.
Suhre, Karsten; Schmitt-Kopplin, Philippe
2008-07-01
Recent technical advances in mass spectrometry (MS) have brought the field of metabolomics to a point where large numbers of metabolites from numerous prokaryotic and eukaryotic organisms can now be easily and precisely detected. The challenge today lies in the correct annotation of these metabolites on the basis of their accurate measured masses. Assignment of bulk chemical formula is generally possible, but without consideration of the biological and genomic context, concrete metabolite annotations remain difficult and uncertain. MassTRIX responds to this challenge by providing a hypothesis-driven approach to high precision MS data annotation. It presents the identified chemical compounds in their genomic context as differentially colored objects on KEGG pathway maps. Information on gene transcription or differences in the gene complement (e.g. samples from different bacterial strains) can be easily added. The user can thus interpret the metabolic state of the organism in the context of its potential and, in the case of submitted transcriptomics data, real enzymatic capacities. The MassTRIX web server is freely accessible at http://masstrix.org.
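The core annotation step described above, matching an accurately measured mass to candidate metabolites within a ppm tolerance, can be illustrated as follows; the three-compound list and the tolerance are placeholders, not the MassTRIX/KEGG database.

    CANDIDATES = {
        "glucose (C6H12O6)":   180.06339,   # monoisotopic neutral masses (Da)
        "citrate (C6H8O7)":    192.02701,
        "glutamate (C5H9NO4)": 147.05316,
    }

    def annotate(measured_mass, tolerance_ppm=3.0):
        hits = []
        for name, exact_mass in CANDIDATES.items():
            ppm_error = (measured_mass - exact_mass) / exact_mass * 1e6
            if abs(ppm_error) <= tolerance_ppm:
                hits.append((name, round(ppm_error, 2)))
        return hits

    print(annotate(180.06361))   # matches glucose within ~1.2 ppm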
Digging into the low molecular weight peptidome with the OligoNet web server.
Liu, Youzhong; Forcisi, Sara; Lucio, Marianna; Harir, Mourad; Bahut, Florian; Deleris-Bou, Magali; Krieger-Weber, Sibylle; Gougeon, Régis D; Alexandre, Hervé; Schmitt-Kopplin, Philippe
2017-09-15
Bioactive peptides play critical roles in regulating many biological processes. Recently, natural short peptide biomarkers have drawn significant attention and are considered a "hidden treasure" of drug candidates. The high resolution and high mass accuracy provided by mass spectrometry (MS)-based untargeted metabolomics enable the rapid detection and wide coverage of the low-molecular-weight peptidome. However, translating unknown masses (<1500 Da) into putative peptides is often limited by the lack of automatic data-processing tools and by the limited coverage of peptide databases. The web server OligoNet responds to this challenge by attempting to decompose each individual mass in a metabolomics dataset into a combination of amino acids. It provides an additional network-based data interpretation named the "Peptide degradation network" (PDN), which unravels interesting relations between annotated peptides and generates potential functional patterns. The ab initio PDN built from yeast metabolic profiling data shows great similarity to well-known metabolic networks and could aid biological interpretation. OligoNet also allows easy evaluation and interpretation of annotated peptides in systems biology, and is freely accessible at https://daniellyz200608105.shinyapps.io/OligoNet/.
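The decomposition idea behind OligoNet, expressing a measured neutral mass as a combination of amino-acid residue masses plus water, can be sketched as below; the abbreviated residue table, tolerance and recursion limit are illustrative choices, not the server's actual implementation.

    RESIDUES = {"G": 57.02146, "A": 71.03711, "S": 87.03203, "V": 99.06841,
                "L": 113.08406, "K": 128.09496, "E": 129.04259}
    WATER = 18.010565

    def decompose(target_mass, tolerance=0.005, max_residues=6):
        """Return residue compositions (sorted strings) whose mass matches the target."""
        results = []

        def search(remaining, composition):
            if abs(remaining) <= tolerance and composition:
                results.append("".join(sorted(composition)))
                return
            if len(composition) >= max_residues:
                return
            for aa, mass in RESIDUES.items():
                if remaining - mass > -tolerance:
                    search(remaining - mass, composition + [aa])

        search(target_mass - WATER, [])
        return sorted(set(results))

    print(decompose(259.1896))   # e.g. ['KL'], the composition of a Leu-Lys dipeptide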
Wang, Zhangjie; Zhang, Tianji; Xie, Shaoshuai; Liu, Xinyue; Li, Hongmei; Linhardt, Robert J; Chi, Lianli
2018-03-01
Low molecular weight heparins (LMWHs) are widely used anticoagulant drugs. The composition and sequence of LMWH oligosaccharides determine their safety and efficacy. The short oligosaccharide pool in LMWHs undergoes more depolymerization reactions than the longer chains and is the most sensitive indicator of the manufacturing process. Electrospray ionization tandem mass spectrometry (ESI-MS/MS) has been demonstrated as a powerful tool to sequence synthetic heparin oligosaccharide but never been applied to analyze complicated mixture like LMWHs. We established an offline strong anion exchange (SAX)-high performance liquid chromatography (HPLC) and ESI-MS/MS approach to sequence the short oligosaccharides of dalteparin sodium. With the help of in-house developed MS/MS interpretation software, the sequences of 18 representative species ranging from tetrasaccharide to octasaccharide were obtained. Interestingly, we found a novel 2,3-disulfated hexauronic acid structure and reconfirmed it by complementary heparinase digestion and LC-MS/MS analysis. This approach provides straightforward and in-depth insight to the structure of LMWHs and the reaction mechanism of heparin depolymerization. Copyright © 2017 Elsevier Ltd. All rights reserved.
Cho, Yunju; Choi, Man-Ho; Kim, Byungjoo; Kim, Sunghwan
2016-04-29
An experimental setup for the speciation of compounds by hydrogen/deuterium exchange (HDX) with atmospheric pressure ionization while performing chromatographic separation is presented. The proposed experimental setup combines the high performance supercritical fluid chromatography (SFC) system that can be readily used as an inlet for mass spectrometry (MS) and atmospheric pressure photo ionization (APPI) or atmospheric pressure chemical ionization (APCI) HDX. This combination overcomes the limitation of an approach using conventional liquid chromatography (LC) by minimizing the amount of deuterium solvents used for separation. In the SFC separation, supercritical CO2 was used as a major component of the mobile phase, and methanol was used as a minor co-solvent. By using deuterated methanol (CH3OD), AP HDX was achieved during SFC separation. To prove the concept, thirty one nitrogen- and/or oxygen-containing standard compounds were analyzed by SFC-AP HDX MS. The compounds were successfully speciated from the obtained SFC-MS spectra. The exchange ions were observed with as low as 1% of CH3OD in the mobile phase, and separation could be performed within approximately 20min using approximately 0.24 mL of CH3OD. The results showed that SFC separation and APPI/APCI HDX could be successfully performed using the suggested method. Copyright © 2016 Elsevier B.V. All rights reserved.
Hu, Wenbing; Liu, Jianan; Luo, Qun; Han, Yumiao; Wu, Kui; Lv, Shuang; Xiong, Shaoxiang; Wang, Fuyi
2011-05-30
Hydrogen/deuterium exchange mass spectrometry (H/DX MS) has become a powerful tool to investigate protein-protein and protein-ligand interactions, but it is still challenging to localize the interaction regions/sites of ligands with pepsin-resistant proteins such as lipocalins. β-Lactoglobulin (BLG), a member of the lipocalin family, can bind a variety of small hydrophobic molecules including retinols, retinoic acids, and long linear fatty acids. However, whether the binding site of linear molecules locates in the external groove or internal cavity of BLG is controversial. In this study we used H/DX MS combined with docking simulation to localize the interaction sites of a tested ligand, sodium dodecyl sulfate (SDS), binding to BLG. H/DX MS results indicated that SDS can bind to both the external and the internal sites in BLG. However, neither of the sites is saturated with SDS, allowing a dynamic ligand exchange to occur between the sites at equilibrium state. Docking studies revealed that SDS forms H-bonds with Lys69 in the internal site and Lys138 and Lys141 in the external site in BLG via the sulfate group, and interacts with the hydrophobic residues valine, leucine, isoleucine and methionine within both of the sites via its hydrocarbon tail, stabilizing the BLG-SDS complex. Copyright © 2011 John Wiley & Sons, Ltd.
Muneeruddin, K; Bobst, C E; Frenkel, R; Houde, D; Turyan, I; Sosic, Z; Kaltashov, I A
2017-01-16
Detailed profiling of both enzymatic (e.g., glycosylation) and non-enzymatic (e.g., oxidation and deamidation) post-translational modifications (PTMs) is frequently required for the quality assessment of protein-based drugs. Challenging as it is, this task is further complicated for the so-called second-generation biopharmaceuticals, which also contain "designer PTMs" introduced to either enhance their pharmacokinetic profiles (e.g., PEGylated proteins) or endow them with therapeutic activity (e.g., protein-drug conjugates). Such modifications of protein covalent structure can dramatically increase structural heterogeneity, making the very notion of "molecular mass" meaningless, as ions representing different glycoforms of a PEGylated protein may have nearly identical distributions of ionic current as a function of m/z, making their contributions to the mass spectrum impossible to distinguish. In this work we demonstrate that a combination of ion exchange chromatography (IXC) with on-line detection by electrospray ionization mass spectrometry (ESI MS) and methods of ion manipulation in the gas phase (limited charge reduction and collision-induced dissociation) allows meaningful structural information to be obtained on a structurally heterogeneous sample of PEGylated interferon β-1a. IXC profiling of the protein sample gives rise to a convoluted chromatogram with several partially resolved peaks which can represent both deamidation and different glycosylation patterns within the protein, as well as varying extent of PEGylation. Thus, profiling the protein with on-line IXC/ESI/MS/MS allows it to be characterized by providing information on three different types of PTMs (designer, enzymatic and non-enzymatic) within a single protein therapeutic.
Needham, Shane R; Ye, Binying; Smith, J Richard; Korte, William D
2003-11-05
An HPLC/MS/MS method was validated for the low-level analysis of pyridostigmine bromide (PB) from guinea pig plasma. An advantage of this strong cation-exchange HPLC/MS/MS method was the enhancement of the ESI-MS signal by providing good retention and good peak shape of PB with a mobile phase of 70% acetonitrile. In addition, the use of 70% acetonitrile in the mobile phase allowed the direct injection of the supernatant from the protein-precipitated sample extract. The assay was linear over the range of 0.1 to 50 ng/ml using only 25 microl of sample. The precision and accuracy of the assay were better than 9.1 and 113%, respectively.
Hardware Assisted Stealthy Diversity (CHECKMATE)
2013-09-01
applicable across multiple architectures. Figure 29 shows an example of an attack against an interpreted environment with a Java executable. CHECKMATE can...Architectures: ARM, PPC, x86, Java VM...a user executes "/usr/bin/wget...Server 1 - Administration; Server 2 - Database (MySQL); Server 3 - Web server (Mongoose); Server 4 - File server (SSH); Server 5 - Email server
Möller, Kristina; Crescenzi, Carlo; Nilsson, Ulrika
2004-01-01
Diphenyl phosphate is a hydrolysis product and possible metabolite of the flame retardant and plasticiser additive triphenyl phosphate. A molecularly imprinted polymer solid-phase extraction (MISPE) method for extracting diphenyl phosphate from aqueous solutions has been developed and compared with SPE using a commercially available mixed-mode anion exchanger. The imprinted polymer was prepared using 2-vinylpyridine (2-Vpy) as the functional monomer, ethylene glycol dimethacrylate (EGDMA) as the cross-linker, and a structural analogue of the analyte as the template molecule. The imprinted polymer was evaluated for use as a SPE sorbent, in tests with both aqueous standards and spiked urine samples, by comparing recovery and breakthrough data obtained using the imprinted form of the polymer and a non-imprinted form (NIP). Extraction from aqueous solutions resulted in more than 80% recovery. Adsorption by the molecularly imprinted polymer (MIP) was non-selective, but selectivity was achieved by selective desorption in the wash steps. Diphenyl phosphate could also be selectively extracted from urine samples, although the urine matrix reduced the capacity of the MISPE cartridges. Recoveries from urine extraction were higher than 70%. It was important to control pH during sample loading. The MISPE method was found to yield a less complex LC-ESI-MS chromatogram of the urine extracts compared with the mixed-mode anion-exchanger method. An LC-ESI-MS method using a Hypercarb LC column with a graphitised carbon stationary phase was also evaluated for organophosphate diesters. LC-ESI-MS using negative-ion detection in selected ion monitoring (SIM) mode was shown to be linear for diphenyl phosphate in the range 0.08-20 ng microL(-1).
Eckels, Josh; Nathe, Cory; Nelson, Elizabeth K; Shoemaker, Sara G; Nostrand, Elizabeth Van; Yates, Nicole L; Ashley, Vicki C; Harris, Linda J; Bollenbeck, Mark; Fong, Youyi; Tomaras, Georgia D; Piehler, Britt
2013-04-30
Immunoassays that employ multiplexed bead arrays produce high information content per sample. Such assays are now frequently used to evaluate humoral responses in clinical trials. Integrated software is needed for the analysis, quality control, and secure sharing of the high volume of data produced by such multiplexed assays. Software that facilitates data exchange and provides flexibility to perform customized analyses (including multiple curve fits and visualizations of assay performance over time) could increase scientists' capacity to use these immunoassays to evaluate human clinical trials. The HIV Vaccine Trials Network and the Statistical Center for HIV/AIDS Research and Prevention collaborated with LabKey Software to enhance the open source LabKey Server platform to facilitate workflows for multiplexed bead assays. This system now supports the management, analysis, quality control, and secure sharing of data from multiplexed immunoassays that leverage Luminex xMAP® technology. These assays may be custom or kit-based. Newly added features enable labs to: (i) import run data from spreadsheets output by Bio-Plex Manager™ software; (ii) customize data processing, curve fits, and algorithms through scripts written in common languages, such as R; (iii) select script-defined calculation options through a graphical user interface; (iv) collect custom metadata for each titration, analyte, run and batch of runs; (v) calculate dose-response curves for titrations; (vi) interpolate unknown concentrations from curves for titrated standards; (vii) flag run data for exclusion from analysis; (viii) track quality control metrics across runs using Levey-Jennings plots; and (ix) automatically flag outliers based on expected values. Existing system features allow researchers to analyze, integrate, visualize, export and securely share their data, as well as to construct custom user interfaces and workflows. Unlike other tools tailored for Luminex immunoassays, LabKey Server allows labs to customize their Luminex analyses using scripting while still presenting users with a single, graphical interface for processing and analyzing data. The LabKey Server system also stands out among Luminex tools for enabling smooth, secure transfer of data, quality control information, and analyses between collaborators. LabKey Server and its Luminex features are freely available as open source software at http://www.labkey.com under the Apache 2.0 license.
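LabKey performs these curve fits through user-supplied R scripts; the sketch below shows the same idea in Python, fitting a four-parameter logistic (4PL) curve to a titrated standard and inverting it to interpolate an unknown's concentration. The titration values are hypothetical.

    import numpy as np
    from scipy.optimize import curve_fit

    def four_pl(x, bottom, top, ec50, hill):
        """Four-parameter logistic: response rises from bottom to top with concentration."""
        return top + (bottom - top) / (1.0 + (x / ec50) ** hill)

    conc = np.array([2.0, 8.0, 32.0, 128.0, 512.0, 2048.0])     # standard, pg/mL
    mfi  = np.array([38.0, 95.0, 310.0, 900.0, 1900.0, 2600.0]) # median fluorescence

    params, _ = curve_fit(four_pl, conc, mfi, p0=[30.0, 2800.0, 150.0, 1.0], maxfev=10000)
    bottom, top, ec50, hill = params

    def interpolate(observed_mfi):
        """Invert the fitted 4PL to recover an unknown sample's concentration."""
        return ec50 * ((bottom - top) / (observed_mfi - top) - 1.0) ** (1.0 / hill)

    print(interpolate(1200.0))   # estimated concentration for an unknown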
SSL - THE SIMPLE SOCKETS LIBRARY
NASA Technical Reports Server (NTRS)
Campbell, C. E.
1994-01-01
The Simple Sockets Library (SSL) allows C programmers to develop systems of cooperating programs using Berkeley streaming Sockets running under the TCP/IP protocol over Ethernet. The SSL provides a simple way to move information between programs running on the same or different machines and does so with little overhead. The SSL can create three types of Sockets: namely a server, a client, and an accept Socket. The SSL's Sockets are designed to be used in a fashion reminiscent of the use of FILE pointers so that a C programmer who is familiar with reading and writing files will immediately feel comfortable with reading and writing with Sockets. The SSL consists of three parts: the library, PortMaster, and utilities. The user of the SSL accesses it by linking programs to the SSL library. The PortMaster initializes connections between clients and servers. The PortMaster also supports a "firewall" facility to keep out socket requests from unapproved machines. The "firewall" is a file which contains Internet addresses for all approved machines. There are three utilities provided with the SSL. SKTDBG can be used to debug programs that make use of the SSL. SPMTABLE lists the servers and port numbers on requested machine(s). SRMSRVR tells the PortMaster to forcibly remove a server name from its list. The package also includes two example programs: multiskt.c, which makes multiple accepts on one server, and sktpoll.c, which repeatedly attempts to connect a client to some server at one second intervals. SSL is a machine independent library written in the C-language for computers connected via Ethernet using the TCP/IP protocol. It has been successfully compiled and implemented on a variety of platforms, including Sun series computers running SunOS, DEC VAX series computers running VMS, SGI computers running IRIX, DECstations running ULTRIX, DEC alpha AXPs running OSF/1, IBM RS/6000 computers running AIX, IBM PC and compatibles running BSD/386 UNIX and HP Apollo 3000/4000/9000/400T computers running HP-UX. SSL requires 45K of RAM to run under SunOS and 80K of RAM to run under VMS. For use on IBM PC series computers and compatibles running DOS, SSL requires Microsoft C 6.0 and the Wollongong TCP/IP package. Source code for sample programs and debugging tools are provided. The documentation is available on the distribution medium in TeX and PostScript formats. The standard distribution medium for SSL is a .25 inch streaming magnetic tape cartridge (QIC-24) in UNIX tar format. It is also available on a 3.5 inch diskette in UNIX tar format and a 5.25 inch 360K MS-DOS format diskette. The SSL was developed in 1992 and was updated in 1993.
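The SSL itself is a C library; as a rough Python analogue of its FILE-pointer style of socket use, the sketch below has a server accept one client and the two sides read and write lines much as they would read and write files. Host, port and messages are placeholders, and there is no equivalent here of the PortMaster or firewall file.

    import socket
    import threading

    HOST, PORT = "127.0.0.1", 5050            # placeholders; SSL brokers ports via its PortMaster

    srv = socket.create_server((HOST, PORT))  # the "server Socket"

    def serve_one():
        conn, _ = srv.accept()                # the "accept Socket"
        with conn, conn.makefile("rw") as stream:
            stream.write("echo: " + stream.readline())
            stream.flush()

    threading.Thread(target=serve_one, daemon=True).start()

    # the "client Socket": connect, send a line, read the reply
    with socket.create_connection((HOST, PORT)) as cli, cli.makefile("rw") as stream:
        stream.write("hello over TCP/IP\n")
        stream.flush()
        print(stream.readline().strip())
    srv.close()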
NASA Astrophysics Data System (ADS)
Huang, Richard Y.-C.; O'Neil, Steven R.; Lipovšek, Daša; Chen, Guodong
2018-05-01
Higher-order structure (HOS) characterization of therapeutic protein-drug conjugates for comprehensive assessment of conjugation-induced protein conformational changes is an important consideration in the biopharmaceutical industry to ensure proper behavior of protein therapeutics. In this study, the conformational dynamics of a small therapeutic protein, adnectin 1, together with its drug conjugate were characterized by hydrogen/deuterium exchange mass spectrometry (HDX-MS) with different spatial resolutions. Top-down HDX allows detailed assessment of the residue-level deuterium content in the payload conjugation region. The HDX-MS dataset revealed the ability of the peptide-based payload/linker to retain deuterium in HDX experiments. Combined results from intact, top-down, and bottom-up HDX indicated no significant conformational changes of adnectin 1 upon payload conjugation.
Hydrogen exchange mass spectrometry of functional membrane-bound chemotaxis receptor complexes.
Koshy, Seena S; Eyles, Stephen J; Weis, Robert M; Thompson, Lynmarie K
2013-12-10
The transmembrane signaling mechanism of bacterial chemotaxis receptors is thought to involve changes in receptor conformation and dynamics. The receptors function in ternary complexes with two other proteins, CheA and CheW, that form extended membrane-bound arrays. Previous studies have shown that attractant binding induces a small (∼2 Å) piston displacement of one helix of the periplasmic and transmembrane domains toward the cytoplasm, but it is not clear how this signal propagates through the cytoplasmic domain to control the kinase activity of the CheA bound at the membrane-distal tip, nearly 200 Å away. The cytoplasmic domain has been shown to be highly dynamic, which raises the question of how a small piston motion could propagate through a dynamic domain to control CheA kinase activity. To address this, we have developed a method for measuring dynamics of the receptor cytoplasmic fragment (CF) in functional complexes with CheA and CheW. Hydrogen-deuterium exchange mass spectrometry (HDX-MS) measurements of global exchange of the CF demonstrate that the CF exhibits significantly slower exchange in functional complexes than in solution. Because the exchange rates in functional complexes are comparable to those of other proteins with similar structures, the CF appears to be a well-structured protein within these complexes, which is compatible with its role in propagating a signal that appears to be a tiny conformational change in the periplasmic and transmembrane domains of the receptor. We also demonstrate the feasibility of this protocol for local exchange measurements by incorporating a pepsin digest step to produce peptides with 87% sequence coverage and only 20% back exchange. This method extends HDX-MS to membrane-bound functional complexes without detergents that may perturb the stability or structure of the system.
Modelling magnetic anisotropy of single-chain magnets in |d/J| ≥ 1 regime
NASA Astrophysics Data System (ADS)
Haldar, Sumit; Raghunathan, Rajamani; Sutter, Jean-Pascal; Ramasesha, S.
2017-11-01
Single-molecule magnets (SMMs) with single-ion anisotropies comparable to the exchange interactions J between spins have recently been synthesised. Here, we provide theoretical insights into the magnetism of such systems. We study spin chains with site spins s = 1, 3/2 and 2 and on-site anisotropy strengths comparable to the exchange constants between the spins. We find that large on-site anisotropies lead to crossing of states with different MS values within the spin manifold to which they belong in the absence of anisotropy. When the on-site anisotropy is increased further, we also find that the MS states of higher-energy spin states descend below the MS states of the ground spin manifold. The giant spin in this limit is no longer conserved, and describing the axial and rhombic anisotropies of the molecule, DM and EM, respectively, is not possible. However, the giant spin of the low-lying large-MS states is very nearly an integer and, using this spin value, it is possible to construct an effective spin-Hamiltonian and compute the molecular magnetic anisotropy constants DM and EM. We report the effect of finite size, rotations of site anisotropies and chain dimerisation on the effective anisotropy of the spin chains.
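For context, the effective spin Hamiltonian referred to above is conventionally written in the giant-spin form (the precise convention used by the authors is assumed here):

    \mathcal{H}_{\mathrm{eff}} = D_{M}\left[\hat{S}_{z}^{2} - \tfrac{1}{3}S(S+1)\right]
                               + E_{M}\left(\hat{S}_{x}^{2} - \hat{S}_{y}^{2}\right)

where S is the near-integer giant spin extracted from the low-lying large-MS states, and DM and EM are obtained by mapping the computed MS level spacings onto this model.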
Zhang, Zhenbin; Sun, Liangliang; Zhu, Guijie; Cox, Olivia F; Huber, Paul W; Dovichi, Norman J
2016-01-05
A sulfonate-silica hybrid strong cation exchange monolith microreactor was synthesized and coupled to a linear polyacrylamide coated capillary for online sample preparation and capillary zone electrophoresis-tandem mass spectrometry (CZE-MS/MS) bottom-up proteomic analysis. The protein sample was loaded onto the microreactor in an acidic buffer. After online reduction, alkylation, and digestion with trypsin, the digests were eluted with 200 mM ammonium bicarbonate at pH 8.2 for CZE-MS/MS analysis using 1 M acetic acid as the background electrolyte. This combination of basic elution and acidic background electrolytes results in both sample stacking and formation of a dynamic pH junction. 369 protein groups and 1274 peptides were identified from 50 ng of Xenopus laevis zygote homogenate, which is comparable with an offline sample preparation method, but the time required for sample preparation was decreased from over 24 h to less than 40 min. Dramatically improved performance was produced by coupling the reactor to a longer separation capillary (∼100 cm) and a Q Exactive HF mass spectrometer. 975 protein groups and 3749 peptides were identified from 50 ng of Xenopus protein using the online sample preparation method.
Moussa, Ehab M; Wilson, Nathan E; Zhou, Qi Tony; Singh, Satish K; Nema, Sandeep; Topp, Elizabeth M
2018-01-03
Lyophilization and spray drying are widely used to manufacture solid forms of therapeutic proteins. Lyophilization is used to stabilize proteins vulnerable to degradation in solution, whereas spray drying is mainly used to prepare inhalation powders or as an alternative to freezing for storing bulk drug substance. Both processes impose stresses that may adversely affect protein structure, stability and bioactivity. Here, we compared lyophilization with and without controlled ice nucleation, and spray drying for their effects on the solid-state conformation and matrix interactions of a model IgG1 monoclonal antibody (mAb). Solid-state conformation and matrix interactions of the mAb were probed using solid-state hydrogen-deuterium exchange with mass spectrometric analysis (ssHDX-MS), and solid-state Fourier transform infrared (ssFTIR) and solid-state fluorescence spectroscopies. mAb conformation and/or matrix interactions were most perturbed in mannitol-containing samples and the distribution of states was more heterogeneous in sucrose and trehalose samples that were spray dried. The findings demonstrate the sensitivity of ssHDX-MS to changes weakly indicated by spectroscopic methods, and support the broader use of ssHDX-MS to probe formulation and process effects on proteins in solid samples.
Demonstrating NaradaBrokering as a Middleware Fabric for Grid-based Remote Visualization Services
NASA Astrophysics Data System (ADS)
Pallickara, S.; Erlebacher, G.; Yuen, D.; Fox, G.; Pierce, M.
2003-12-01
Remote Visualization Services (RVS) have tended to rely on approaches based on the client server paradigm. Here we demonstrate our approach - based on a distributed brokering infrastructure, NaradaBrokering [1] - that relies on distributed, asynchronous and loosely coupled interactions to meet the requirements and constraints of RVS. In our approach to RVS, services advertise their capabilities to the broker network that manages these service advertisements. Among the services considered within our system are those that perform graphic transformations, mediate access to specialized datasets and finally those that manage the execution of specified tasks. There could be multiple instances of each of these services and the system ensures that load for a given service is distributed efficiently over these service instances. We will demonstrate implementation of concepts that we outlined in the oral presentation. This would involve two or more visualization servers interacting asynchronously with multiple clients through NaradaBrokering. The communicating entities may exchange SOAP [2] (Simple Object Access Protocol) messages. SOAP is a lightweight protocol for exchange of information in a decentralized, distributed environment. It is an XML based protocol that consists of three parts: an envelope that describes what is in a message and how to process it, rules for expressing instances of application-defined data types, and a convention for representing remote invocation related operations. Furthermore, we will also demonstrate how clients can retrieve their results after prolonged disconnects or after any failures that might have taken place. The entities, services and clients alike, are not limited by the geographical distances that separate them. We are planning to test this system in the context of trans-Atlantic links separating interacting entities. {[1]} The NaradaBrokering Project: http://www.naradabrokering.org {[2]} Newcomer, E., 2002, Understanding web services: XML, WSDL, SOAP, and UDDI, Addison Wesley Professional.
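A minimal sketch of the kind of SOAP 1.1 message such entities might exchange is shown below; the SOAP envelope namespace is the standard one, while the payload namespace, element names and broker endpoint are placeholders.

    import xml.etree.ElementTree as ET

    SOAP_ENV = "http://schemas.xmlsoap.org/soap/envelope/"
    ET.register_namespace("soapenv", SOAP_ENV)

    envelope = ET.Element(f"{{{SOAP_ENV}}}Envelope")
    ET.SubElement(envelope, f"{{{SOAP_ENV}}}Header")
    body = ET.SubElement(envelope, f"{{{SOAP_ENV}}}Body")

    # Hypothetical remote-visualization request carried in the SOAP body.
    request = ET.SubElement(body, "{urn:example:rvs}RenderRequest")
    ET.SubElement(request, "dataset").text = "seismic-volume-42"
    ET.SubElement(request, "transform").text = "isosurface"

    message = ET.tostring(envelope, encoding="unicode")
    print(message)   # hand this string to the broker/transport layer of choice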
Heat extraction from salinity-gradient solar ponds using heat pipe heat exchangers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tundee, Sura; Terdtoon, Pradit; Sakulchangsatjatai, Phrut
This paper presents the results of experimental and theoretical analysis of the heat extraction process from a solar pond using a heat pipe heat exchanger. In order to conduct the research work, a small-scale experimental solar pond with an area of 7.0 m² and a depth of 1.5 m was built at Khon Kaen in north-eastern Thailand (16°27'N, 102°E). Heat was successfully extracted from the lower convective zone (LCZ) of the solar pond by using a heat pipe heat exchanger made from 60 copper tubes with 21 mm inside diameter and 22 mm outside diameter. The length of the evaporator and condenser sections was 800 mm and 200 mm, respectively. R134a was used as the heat transfer fluid in the experiment. The theoretical model was formulated for the solar pond heat extraction on the basis of the energy conservation equations and by using the solar radiation data for the above location. Numerical methods were used to solve the modeling equations. In the analysis, the performance of the heat exchanger is investigated by varying the velocity of the inlet air used to extract heat from the condenser end of the heat pipe heat exchanger (HPHE). Air velocity was found to have a significant influence on the effectiveness of the heat pipe heat exchanger. In the present investigation, there was an increase in effectiveness of 43% as the air velocity was decreased from 5 m/s to 1 m/s. The results obtained from the theoretical model showed good agreement with the experimental data.
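For reference, with the LCZ acting as a nearly isothermal heat source on the evaporator side, a common definition of the heat-exchanger effectiveness reported above is (the authors' exact formulation is not given in the abstract):

    \varepsilon = \frac{T_{a,\mathrm{out}} - T_{a,\mathrm{in}}}{T_{\mathrm{LCZ}} - T_{a,\mathrm{in}}}

Lowering the air velocity lengthens each air parcel's residence time at the condenser and raises T_a,out, which is consistent with the reported gain in effectiveness at 1 m/s.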
Jin, Weihua; Liu, Bing; Li, Shuai; Chen, Jing; Tang, Hong; Jiang, Di; Zhang, Quanbin; Zhong, Weihong
2018-03-01
Polysaccharide (ST) was prepared from Sargassum thunbergii using hot water. Two fractions (ST-1 and ST-2) were prepared using anion exchange chromatography. One desulfated polysaccharide (ST-1-DS) was also prepared. Electrospray ionization mass spectrometry (ESI-MS) performed on ST-1-DS showed that the desulfated polysaccharides contained methyl glycosides of mono-sulfated and di-sulfated galacto-fucooligosaccharides. This result suggested that ST-1 might contain sulfated galactofucan, which consists of a backbone of alternating (Gal)n and (Fuc)n and sulfated randomly on Gal and mainly on C-2 in Fuc. In addition, ST-1 was degraded in 1 M sulfuric acid. The solution was centrifuged, and the supernatant was concentrated and precipitated in ethanol to obtain the precipitate (ST-1-P). ST-1-P was then separated using gel chromatography and anion exchange chromatography to obtain the oligomers. ESI-MS spectra of oligomers indicated that ST-1 mostly contained sulfated glucuronomannan and fucoglucuronan. ESI-MS with collision-induced dissociation tandem mass spectrometry (ESI-CID-MS/MS) suggested that glucuronomannan contained alternating 2-linked Man and 4-linked GlcA, while fucoglucuronan contained 4-linked glucuronan with branched Fuc at C-3. Finally, the neuroprotective activities of ST, ST-1, ST-2 and MIX (a mixture of ST-1 and ST-2) were determined. ST showed the most neuroprotective activity, which indicated that ST might be a good candidate for curing neurodegenerative diseases. Copyright © 2017 Elsevier B.V. All rights reserved.
Acter, Thamina; Lee, Seulgidaun; Cho, Eunji; Jung, Maeng-Joon; Kim, Sunghwan
2018-01-01
In this study, continuous in-source hydrogen/deuterium exchange (HDX) atmospheric pressure photoionization (APPI) mass spectrometry (MS) with continuous feeding of D2O was developed and validated. D2O was continuously fed using a capillary line placed on the center of a metal plate positioned between the UV lamp and nebulizer. The proposed system overcomes the limitations of previously reported APPI HDX-MS approaches where deuterated solvents were premixed with sample solutions before ionization. This is particularly important for APPI because solvent composition can greatly influence ionization efficiency as well as the solubility of analytes. The experimental parameters for APPI HDX-MS with continuous feeding of D2O were optimized, and the optimized conditions were applied for the analysis of nitrogen-, oxygen-, and sulfur-containing compounds. The developed method was also applied for the analysis of the polar fraction of a petroleum sample. Thus, the data presented in this study clearly show that the proposed HDX approach can serve as an effective analytical tool for the structural analysis of complex mixtures.
Protein Folding—How and Why: By Hydrogen Exchange, Fragment Separation, and Mass Spectrometry
Englander, S. Walter; Mayne, Leland; Kan, Zhong-Yuan; Hu, Wenbing
2017-01-01
Advanced hydrogen exchange (HX) methodology can now determine the structure of protein folding intermediates and their progression in folding pathways. Key developments over time include the HX pulse labeling method with nuclear magnetic resonance analysis, development of the fragment separation method, the addition to it of mass spectrometric (MS) analysis, and recent improvements in the HX MS technique and data analysis. Also, the discovery of protein foldons and their role supplies an essential interpretive link. Recent work using HX pulse labeling with HX MS analysis finds that a number of proteins fold by stepping through a reproducible sequence of native-like intermediates in an ordered pathway. The stepwise nature of the pathway is dictated by the cooperative foldon unit construction of the protein. The pathway order is determined by a sequential stabilization principle; prior native-like structure guides the formation of adjacent native-like structure. This view does not match the funneled energy landscape paradigm of a very large number of folding tracks, which was framed before foldons were known. PMID:27145881
Network characteristics for server selection in online games
NASA Astrophysics Data System (ADS)
Claypool, Mark
2008-01-01
Online gameplay is impacted by the network characteristics of players connected to the same server. Unfortunately, the network characteristics of online game servers are not well understood, particularly for groups that wish to play together on the same server. As a step towards a remedy, this paper presents analysis of an extensive set of measurements of game servers on the Internet. Over the course of many months, actual Internet game servers were queried simultaneously by twenty-five emulated game clients, with both servers and clients spread out on the Internet. The data provides statistics on the uptime and populations of game servers over a month-long period and an in-depth look at the suitability of game servers for multi-player server selection, concentrating on characteristics critical to playability--latency and fairness. Analysis finds most game servers have latencies suitable for third-person and omnipresent games, such as real-time strategy, sports and role-playing games, providing numerous server choices for game players. However, far fewer game servers have the low latencies required for first-person games, such as shooters or race games. In all cases, groups that wish to play together have a greatly reduced set of servers from which to choose because of inherent unfairness in server latencies, and server selection is particularly limited as the group size increases. These results hold across different game types and even across different generations of games. The data should be useful for game developers and network researchers who seek to improve game server selection, whether for single or multiple players.
Monoterpene biosynthesis potential of plant subcellular compartments.
Dong, Lemeng; Jongedijk, Esmer; Bouwmeester, Harro; Van Der Krol, Alexander
2016-01-01
Subcellular monoterpene biosynthesis capacity based on local geranyl diphosphate (GDP) availability or locally boosted GDP production was determined for plastids, cytosol and mitochondria. A geraniol synthase (GES) was targeted to plastids, cytosol, or mitochondria. Transient expression in Nicotiana benthamiana indicated local GDP availability for each compartment but resulted in different product levels. A GDP synthase from Picea abies (PaGDPS1) was shown to boost GDP production. PaGDPS1 was also targeted to plastids, cytosol or mitochondria and PaGDPS1 and GES were coexpressed in all possible combinations. Geraniol and geraniol-derived products were analyzed by GC-MS and LC-MS, respectively. GES product levels were highest for plastid-targeted GES, followed by mitochondrial- and then cytosolic-targeted GES. For each compartment, local boosting of GDP biosynthesis increased GES product levels. GDP exchange between compartments is not equal: no GDP is exchanged from the cytosol to the plastids, 100% of the GDP in mitochondria can be exchanged to plastids, whereas only 7% of the GDP from plastids is available for mitochondria. This suggests a direct exchange mechanism for GDP between plastids and mitochondria. Cytosolic PaGDPS1 competes with plastidial GES activity, suggesting an effective drain of isopentenyl diphosphate from the plastids to the cytosol. © 2015 The Authors. New Phytologist © 2015 New Phytologist Trust.
Pan, Jingxi; Han, Jun; Borchers, Christoph H; Konermann, Lars
2009-09-09
Amide H/D exchange (HDX) mass spectrometry (MS) is widely used for protein structural studies. Traditionally, this technique involves protein labeling in D(2)O, followed by acid quenching, proteolytic digestion, and analysis of peptide deuteration levels by HPLC/MS. There is great interest in the development of alternative HDX approaches involving the top-down fragmentation of electrosprayed protein ions, instead of relying on enzymatic cleavage and solution-phase separations. A number of recent studies have demonstrated that electron capture dissociation (ECD) results in fragmentation of gaseous protein ions with little or no H/D scrambling. However, the successful application of this approach for in-depth protein conformational studies has not yet been demonstrated. The current work uses horse myoglobin as a model system for assessing the suitability of HDX-MS with top-down ECD for experiments of this kind. It is found that ECD can pinpoint the locations of protected amides with an average resolution of less than two residues for this 17 kDa protein. Native holo-myoglobin (hMb) shows considerable protection from exchange in all of its helices, whereas loops are extensively deuterated. Fraying is observable at some helix termini. Removal of the prosthetic heme group from hMb produces apo-myoglobin (aMb). Both hMb and aMb share virtually the same HDX protection pattern in helices A-E, whereas helix F is unfolded in aMb. In addition, destabilization is evident for some residues close to the beginning of helix G, the end of helix H, and the C-terminus of the protein. The structural changes reported herein are largely consistent with earlier NMR data for sperm whale myoglobin, although small differences between the two systems are evident. Our findings demonstrate that the level of structural information obtainable with top-down ECD for small to medium-sized proteins considerably surpasses that of traditional HDX-MS experiments, while at the same time greatly reducing undesired amide back exchange.
Evidence for the Role of B Cells and Immunoglobulins in the Pathogenesis of Multiple Sclerosis
Wootla, Bharath; Denic, Aleksandar; Keegan, B. Mark; Winters, Jeffrey L.; Astapenko, David; Warrington, Arthur E.; Bieber, Allan J.; Rodriguez, Moses
2011-01-01
The pathogenesis of multiple sclerosis (MS) remains elusive. Recent reports advocate greater involvement of B cells and immunoglobulins in the initiation and propagation of MS lesions at different stages of their ontogeny. The key role of B cells and immunoglobulins in pathogenesis was initially identified by studies in which patients whose fulminant attacks of demyelination did not respond to steroids experienced remarkable functional improvement following plasma exchange. The positive response to Rituximab in Phase II clinical trials of relapsing-remitting MS confirms the role of B cells. The critical question is how B cells contribute to MS. In this paper, we discuss both the deleterious and the beneficial roles of B cells and immunoglobulins in MS lesions. We provide alternative hypotheses to explain both damaging and protective antibody responses. PMID:21961063
A Study of the Efficiency of Spatial Indexing Methods Applied to Large Astronomical Databases
NASA Astrophysics Data System (ADS)
Donaldson, Tom; Berriman, G. Bruce; Good, John; Shiao, Bernie
2018-01-01
Spatial indexing of astronomical databases generally uses quadrature methods, which partition the sky into cells used to create an index (usually a B-tree) written as a database column. We report the results of a study to compare the performance of two common indexing methods, HTM and HEALPix, on Solaris and Windows database servers installed with a PostgreSQL database, and a Windows Server installed with MS SQL Server. The indexing was applied to the 2MASS All-Sky Catalog and to the Hubble Source catalog. On each server, the study compared indexing performance by submitting 1 million queries at each index level with random sky positions and random cone search radius, which was computed on a logarithmic scale between 1 arcsec and 1 degree, and measuring the time to complete the query and write the output. These simulated queries, intended to model realistic use patterns, were run in a uniform way on many combinations of indexing method and indexing level. The query times in all simulations are strongly I/O-bound and are linear with the number of records returned for large numbers of sources. There are, however, considerable differences between simulations, which reveal that hardware I/O throughput is a more important factor in managing the performance of a DBMS than the choice of indexing scheme. The choice of index itself is relatively unimportant: for comparable index levels, the performance is consistent within the scatter of the timings. At small index levels (large cells; e.g. level 4, cell size 3.7 deg), there is large scatter in the timings because of wide variations in the number of sources found in the cells. At larger index levels, performance improves and scatter decreases, but the improvement at level 8 (cell size 14 arcmin) and higher is masked to some extent in the timing scatter caused by the range of query sizes. At very high levels (20; 0.0004 arcsec), the granularity of the cells becomes so high that a large number of extraneous empty cells begin to degrade performance. Thus, for the use patterns studied here the database performance is not critically dependent on the exact choices of index or level.
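The cone-search workload described above can be pictured with a short sketch: a HEALPix cell number stored as an indexed database column turns a spherical cone query into a lookup over a small set of candidate cells, followed by an exact angular-distance cut. The sketch below is illustrative only; it assumes the healpy package and a SQLite table named sources with columns ra, dec and hpx (holding RING-ordered nside=256, i.e. level-8, cell numbers), none of which come from the paper.

```python
import math
import sqlite3
import healpy as hp

def cone_search(conn, ra_deg, dec_deg, radius_deg, nside=256):
    """Cone search using a HEALPix index column plus an exact distance cut."""
    # Candidate cells overlapping the cone; inclusive=True may over-select,
    # which is harmless because of the exact filter applied afterwards.
    vec = hp.ang2vec(ra_deg, dec_deg, lonlat=True)
    cells = hp.query_disc(nside, vec, math.radians(radius_deg), inclusive=True)

    # Indexed lookup on the hpx column restricts I/O to the candidate cells.
    placeholders = ",".join("?" * len(cells))
    rows = conn.execute(
        f"SELECT ra, dec FROM sources WHERE hpx IN ({placeholders})",
        [int(c) for c in cells],
    ).fetchall()

    def sep_deg(ra1, dec1, ra2, dec2):
        """Great-circle separation in degrees."""
        ra1, dec1, ra2, dec2 = map(math.radians, (ra1, dec1, ra2, dec2))
        c = (math.sin(dec1) * math.sin(dec2)
             + math.cos(dec1) * math.cos(dec2) * math.cos(ra1 - ra2))
        return math.degrees(math.acos(max(-1.0, min(1.0, c))))

    return [(ra, dec) for ra, dec in rows
            if sep_deg(ra, dec, ra_deg, dec_deg) <= radius_deg]

# Usage (assumes a 'sources(ra, dec, hpx)' table with an index on hpx):
# conn = sqlite3.connect("catalog.db")
# matches = cone_search(conn, ra_deg=150.1, dec_deg=2.2, radius_deg=0.05)
```

The deliberate over-selection of cells mirrors the trade-off discussed above: coarse cells return many extraneous sources, while very fine cells return many extraneous empty cells.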
Zhao, Bo; Hansen, Alexandar L; Zhang, Qi
2014-01-08
Quantitative characterization of dynamic exchange between various conformational states provides essential insights into the molecular basis of many regulatory RNA functions. Here, we present an application of nucleic-acid-optimized carbon chemical exchange saturation transfer (CEST) and low spin-lock field R(1ρ) relaxation dispersion (RD) NMR experiments in characterizing slow chemical exchange in nucleic acids that is otherwise difficult if not impossible to be quantified by the ZZ-exchange NMR experiment. We demonstrated the application on a 47-nucleotide fluoride riboswitch in the ligand-free state, for which CEST and R(1ρ) RD profiles of base and sugar carbons revealed slow exchange dynamics involving a sparsely populated (p ~ 10%) and shortly lived (τ ~ 10 ms) NMR "invisible" state. The utility of CEST and low spin-lock field R(1ρ) RD experiments in studying slow exchange was further validated in characterizing an exchange as slow as ~60 s(-1).
Zhao, Bo; Hansen, Alexandar L.; Zhang, Qi
2016-01-01
Quantitative characterization of dynamic exchange between various conformational states provides essential insights into the molecular basis of many regulatory RNA functions. Here, we present an application of nucleic-acid-optimized carbon chemical exchange saturation transfer (CEST) and low spin-lock field R1ρ relaxation dispersion (RD) NMR experiments in characterizing slow chemical exchange in nucleic acids that is otherwise difficult if not impossible to be quantified by the ZZ-exchange NMR experiment. We demonstrated the application on a 47-nucleotide fluoride riboswitch in the ligand-free state, for which CEST and R1ρ RD profiles of base and sugar carbons revealed slow exchange dynamics involving a sparsely populated (p ~ 10%) and shortly lived (τ ~ 10 ms) NMR “invisible” state. The utility of CEST and low spin-lock field R1ρ RD experiments in studying slow exchange was further validated in characterizing an exchange as slow as ~60 s−1. PMID:24299272
Application of GC/MS Soft Ionization for Isomeric Biological Compound Analysis.
Furuhashi, Takeshi; Okuda, Koji
2017-09-03
Isomers are compounds with the same molecular formula. Many different types of isomers are ubiquitous and play important roles in living organisms. Despite their early discovery, the actual analysis of isomers has been tricky and has confounded researchers. Using mass spectrometry (MS) to distinguish or identify isomers is an emergent topic and challenge for analytical chemists. We review some techniques for analyzing isomers with emphasis on MS, e.g., the roles of ion reaction, hydrogen-deuterium exchange, ion mobility mass spectrometry, ion spectroscopy, and energy change in producing isomer-specific fragments. In particular, soft ionization for gas chromatography-mass spectrometry (GC-MS) is a focus in this review. Awareness of the advantages and technical problems of these techniques would inspire innovation in future approaches.
NASA Astrophysics Data System (ADS)
Duc, Nguyen Minh; Du, Yang; Thorsen, Thor S.; Lee, Su Youn; Zhang, Cheng; Kato, Hideaki; Kobilka, Brian K.; Chung, Ka Young
2015-05-01
G protein-coupled receptors (GPCRs) have important roles in physiology and pathology, and 40% of drugs currently on the market target GPCRs for the treatment of various diseases. Because of their therapeutic importance, the structural mechanism of GPCR signaling is of great interest in the field of drug discovery. Hydrogen/deuterium exchange mass spectrometry (HDX-MS) is a useful tool for analyzing ligand binding sites, the protein-protein interaction interface, and conformational changes of proteins. However, its application to GPCRs has been limited for various reasons, including the hydrophobic nature of GPCRs and the use of detergents in their preparation. In the present study, we tested the application of bicelles as a means of solubilizing GPCRs for HDX-MS studies. GPCRs (e.g., β2-adrenergic receptor [β2AR], μ-opioid receptor, and protease-activated receptor 1) solubilized in bicelles produced better sequence coverage (greater than 90%) than GPCRs solubilized in n-dodecyl-β-D-maltopyranoside (DDM), suggesting that bicelles are a more effective method of solubilization for HDX-MS studies. The HDX-MS profile of β2AR in bicelles showed that transmembrane domains (TMs) undergo lower deuterium uptake than intracellular or extracellular regions, which is consistent with the fact that the TMs are highly ordered and embedded in bicelles. The overall HDX-MS profiles of β2AR solubilized in bicelles and in DDM were similar except for intracellular loop 3. Interestingly, we detected EX1 kinetics, an important phenomenon in protein dynamics, at the C-terminus of TM6 in β2AR. In conclusion, we suggest the application of bicelles as a useful method for solubilizing GPCRs for conformational analysis by HDX-MS.
Wang, Guanbo; Kaltashov, Igor A
2014-08-05
Top-down hydrogen/deuterium exchange (HDX) with mass spectrometric (MS) detection has recently matured to become a potent biophysical tool capable of providing valuable information on higher order structure and conformational dynamics of proteins at an unprecedented level of structural detail. However, the scope of the proteins amenable to the analysis by top-down HDX MS still remains limited, with the protein size and the presence of disulfide bonds being the two most important limiting factors. While the limitations imposed by the physical size of the proteins gradually become more relaxed as the sensitivity, resolution and dynamic range of modern MS instrumentation continue to improve at an ever accelerating pace, the presence of the disulfide linkages remains a much less forgiving limitation even for the proteins of relatively modest size. To circumvent this problem, we introduce an online chemical reduction step following completion and quenching of the HDX reactions and prior to the top-down MS measurements of deuterium occupancy of individual backbone amides. Application of the new methodology to the top-down HDX MS characterization of a small (99 residue long) disulfide-containing protein β2-microglobulin allowed the backbone amide protection to be probed with nearly a single-residue resolution across the entire sequence. The high-resolution backbone protection pattern deduced from the top-down HDX MS measurements carried out under native conditions is in excellent agreement with the crystal structure of the protein and high-resolution NMR data, suggesting that introduction of the chemical reduction step to the top-down routine does not trigger hydrogen scrambling either during the electrospray ionization process or in the gas phase prior to the protein ion dissociation.
Zhao, Miao; Wu, Xiao-Jie; Fan, Ya-Xin; Guo, Bei-Ning; Zhang, Jing
2016-05-30
A rapid ultra high-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) assay method was developed for determination of CMS and formed colistin in human plasma and urine. After extraction on a 96-well SPE Supra-Clean Weak Cation Exchange (WCX) plate, the eluents were mixed and injected into the UHPLC-MS/MS system directly. A Phenomenex Kinetex XB-C18 analytical column was employed with a mobile phase consisting of solution "A" (acetonitrile:methanol, 1:1, v/v) and solution "B" (0.1% formic acid in water, v/v). The flow rate was 0.4 mL/min with gradient elution over 3.5 min. Ions were detected in ESI positive ion mode and the precursor-product ion pairs were m/z 390.7/101.3 for colistin A, m/z 386.0/101.2 for colistin B, and m/z 402.3/101.2 for polymyxin B1 (IS), respectively. The lower limit of quantification (LLOQ) was 0.0130 and 0.0251 mg/L for colistin A and colistin B in both plasma and urine with accuracy (relative error, %) <± 12.6% and precision (relative standard deviation, %) <± 10.8%. Stability of CMS was demonstrated in biological samples before and during sample treatment, and in the extract. This new analytical method provides high-throughput treatment and optimized quantification of CMS and colistin, which offers a highly efficient tool for the analysis of a large number of clinical samples as well as routine therapeutic drug monitoring. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Cieplak-Rotowska, Maja K.; Tarnowski, Krzysztof; Rubin, Marcin; Fabian, Marc R.; Sonenberg, Nahum; Dadlez, Michal; Niedzwiecka, Anna
2018-01-01
The human GW182 protein plays an essential role in micro(mi)RNA-dependent gene silencing. miRNA silencing is mediated, in part, by a GW182 C-terminal region called the silencing domain, which interacts with the poly(A) binding protein and the CCR4-NOT deadenylase complex to repress protein synthesis. Structural studies of this GW182 fragment are challenging due to its predicted intrinsically disordered character, except for its RRM domain. However, detailed insights into the properties of proteins containing disordered regions can be provided by hydrogen-deuterium exchange mass spectrometry (HDX/MS). In this work, we applied HDX/MS to define the structural state of the GW182 silencing domain. HDX/MS analysis revealed that this domain is clearly divided into a natively unstructured part, including the CCR4-NOT interacting motif 1, and a distinct RRM domain. The GW182 RRM has a very dynamic structure, since water molecules can penetrate the whole domain in 2 h. This finding of high structural dynamics sheds new light on the RRM structure. Though this domain is one of the most frequently occurring canonical protein domains in eukaryotes, these results are - to our knowledge - the first HDX/MS characterization of an RRM. The HDX/MS studies also show that the α2 helix of the RRM can display EX1 behavior after a freeze-thaw cycle. This means that the RRM structure is sensitive to environmental conditions and can change its conformation, which suggests that the conformational uniformity of RRM-containing proteins should be checked by HDX/MS.
NASA Astrophysics Data System (ADS)
Zhang, Jingjing; Kitova, Elena N.; Li, Jun; Eugenio, Luiz; Ng, Kenneth; Klassen, John S.
2016-01-01
The application of hydrogen/deuterium exchange mass spectrometry (HDX-MS) to localize ligand binding sites in carbohydrate-binding proteins is described. Proteins from three bacterial toxins, the B subunit homopentamers of Cholera toxin and Shiga toxin type 1 and a fragment of Clostridium difficile toxin A, and their interactions with native carbohydrate receptors, GM1 pentasaccharides (β-Gal-(1→3)-β-GalNAc-(1→4)[α-Neu5Ac-(2→3)]-β-Gal-(1→4)-Glc), Pk trisaccharide (α-Gal-(1→4)-β-Gal-(1→4)-Glc) and CD-grease (α-Gal-(1→3)-β-Gal-(1→4)-β-GlcNAcO(CH2)8CO2CH3), respectively, served as model systems for this study. Comparison of the differences in deuterium uptake for peptic peptides produced in the absence and presence of ligand revealed regions of the proteins that are protected against deuterium exchange upon ligand binding. Notably, protected regions generally coincide with the carbohydrate binding sites identified by X-ray crystallography. However, ligand binding can also result in increased deuterium exchange in other parts of the protein, presumably through allosteric effects. Overall, the results of this study suggest that HDX-MS can serve as a useful tool for localizing the ligand binding sites in carbohydrate-binding proteins. However, a detailed interpretation of the changes in deuterium exchange upon ligand binding can be challenging because of the presence of ligand-induced changes in protein structure and dynamics.
Measurement of Energy Performances for General-Structured Servers
NASA Astrophysics Data System (ADS)
Liu, Ren; Chen, Lili; Li, Pengcheng; Liu, Meng; Chen, Haihong
2017-11-01
Energy consumption of servers in data centers is increasing rapidly with the widespread use of the Internet and connected devices. To improve the energy efficiency of servers, voluntary or mandatory energy efficiency programs for servers, including voluntary labeling programs and mandatory energy performance standards, have been adopted or are being prepared in the US, EU and China. However, the energy performance of servers and the corresponding testing methods are not well defined. This paper presents metrics to measure the energy performance of general-structured servers. The impacts of various server components on energy performance are also analyzed. Based on a set of normalized workloads, the authors propose a standard method for testing the energy efficiency of servers. Pilot tests are conducted to assess the energy performance testing methods of servers. The findings of the tests are discussed in the paper.
NASA Astrophysics Data System (ADS)
Adamczewski-Musch, Joern; Linev, Sergey
2015-12-01
The new THttpServer class in ROOT implements an HTTP server for arbitrary ROOT applications. It is based on the embeddable Civetweb HTTP server and provides direct access to all objects registered with the server. Object data can be provided in different formats: binary, XML, GIF/PNG, and JSON. A generic user interface for THttpServer has been implemented in HTML/JavaScript based on the JavaScript ROOT development. With any modern web browser one can list, display, and monitor objects available on the server. THttpServer is used in the Go4 framework to provide an HTTP interface to the online analysis.
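As a rough illustration of the access pattern just described, the sketch below registers a histogram with a THttpServer from PyROOT so that a web browser can list and monitor it while it updates. It is a minimal sketch assuming a ROOT installation with PyROOT and the HTTP module available; the port, folder name and histogram are arbitrary choices and not taken from the Go4 setup.

```python
import time
import ROOT

# Start an embedded HTTP server on port 8080 (Civetweb engine).
serv = ROOT.THttpServer("http:8080")

# Any registered object becomes browsable as binary, XML, PNG, or JSON.
h = ROOT.TH1F("gauss", "Monitoring example;x;entries", 100, -5, 5)
serv.Register("/monitor", h)

# Keep filling the histogram; a browser pointed at http://localhost:8080
# can list and display the object while it is being updated.
rng = ROOT.TRandom3(0)
while True:
    h.Fill(rng.Gaus(0, 1))
    ROOT.gSystem.ProcessEvents()   # let the server handle pending requests
    time.sleep(0.001)
```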
NASA Astrophysics Data System (ADS)
Shah, Bhavana; Jiang, Xinzhao Grace; Chen, Louise; Zhang, Zhongqi
2014-06-01
Protein N-Glycan analysis is traditionally performed by high pH anion exchange chromatography (HPAEC), reversed phase liquid chromatography (RPLC), or hydrophilic interaction liquid chromatography (HILIC) on fluorescence-labeled glycans enzymatically released from the glycoprotein. These methods require time-consuming sample preparations and do not provide site-specific glycosylation information. Liquid chromatography-tandem mass spectrometry (LC-MS/MS) peptide mapping is frequently used for protein structural characterization and, as a bonus, can potentially provide glycan profile on each individual glycosylation site. In this work, a recently developed glycopeptide fragmentation model was used for automated identification, based on their MS/MS, of N-glycopeptides from proteolytic digestion of monoclonal antibodies (mAbs). Experimental conditions were optimized to achieve accurate profiling of glycoforms. Glycan profiles obtained from LC-MS/MS peptide mapping were compared with those obtained from HPAEC, RPLC, and HILIC analyses of released glycans for several mAb molecules. Accuracy, reproducibility, and linearity of the LC-MS/MS peptide mapping method for glycan profiling were evaluated. The LC-MS/MS peptide mapping method with fully automated data analysis requires less sample preparation, provides site-specific information, and may serve as an alternative method for routine profiling of N-glycans on immunoglobulins as well as other glycoproteins with simple N-glycans.
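Once glycopeptides have been identified from their MS/MS spectra, the site-specific glycan profile described above reduces to normalizing extracted-ion peak areas per glycosylation site. The sketch below illustrates only that final normalization step with invented glycan names and areas; it is not the fragmentation model or the automated software used in the study.

```python
def glycan_profile(peak_areas):
    """Relative glycoform abundance (%) from glycopeptide XIC peak areas."""
    total = sum(peak_areas.values())
    return {glycan: 100.0 * area / total for glycan, area in peak_areas.items()}

# Hypothetical peak areas for one glycosylation site of a mAb digest.
areas = {"G0F": 4.2e7, "G1F": 3.1e7, "G2F": 0.9e7, "Man5": 0.3e7}
print(glycan_profile(areas))
```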
A Privacy-Preserving Platform for User-Centric Quantitative Benchmarking
NASA Astrophysics Data System (ADS)
Herrmann, Dominik; Scheuer, Florian; Feustel, Philipp; Nowey, Thomas; Federrath, Hannes
We propose a centralised platform for quantitative benchmarking of key performance indicators (KPI) among mutually distrustful organisations. Our platform offers users the opportunity to request an ad-hoc benchmarking for a specific KPI within a peer group of their choice. Architecture and protocol are designed to provide anonymity to its users and to hide the sensitive KPI values from other clients and the central server. To this end, we integrate user-centric peer group formation, exchangeable secure multi-party computation protocols, short-lived ephemeral key pairs as pseudonyms, and attribute certificates. We show by empirical evaluation of a prototype that the performance is acceptable for reasonably sized peer groups.
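The exchangeable secure multi-party computation protocols mentioned above can be illustrated with the simplest such building block, an additive-secret-sharing sum: each participant splits its confidential KPI value into random shares, peers only ever see shares, and only the aggregate (here the peer-group mean) is reconstructed. This is a toy sketch of the general idea, not the protocol actually used by the platform.

```python
import random

PRIME = 2**61 - 1  # arithmetic is done modulo a large prime

def make_shares(value, n_peers):
    """Split a non-negative integer KPI value into n additive shares mod PRIME."""
    shares = [random.randrange(PRIME) for _ in range(n_peers - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def benchmark_mean(kpi_values):
    """Each peer contributes shares; only the group mean is reconstructed."""
    n = len(kpi_values)
    all_shares = [make_shares(v, n) for v in kpi_values]
    # Peer i sums the i-th share of every participant; no peer sees a raw value.
    partial_sums = [sum(all_shares[p][i] for p in range(n)) % PRIME
                    for i in range(n)]
    total = sum(partial_sums) % PRIME
    return total / n

print(benchmark_mean([120, 95, 143, 110]))  # mean KPI of a 4-member peer group
```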
WebGLORE: a web service for Grid LOgistic REgression.
Jiang, Wenchao; Li, Pinghao; Wang, Shuang; Wu, Yuan; Xue, Meng; Ohno-Machado, Lucila; Jiang, Xiaoqian
2013-12-15
WebGLORE is a free web service that enables privacy-preserving construction of a global logistic regression model from sensitive distributed datasets. It transfers only aggregated local statistics (from participants) through Hypertext Transfer Protocol Secure to a trusted server, where the global model is synthesized. WebGLORE seamlessly integrates AJAX, Java Applet/Servlet and PHP technologies to provide an easy-to-use web service for biomedical researchers to break down policy barriers during information exchange. http://dbmi-engine.ucsd.edu/webglore3/. WebGLORE can be used under the terms of the GNU General Public License as published by the Free Software Foundation.
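The aggregation idea behind grid logistic regression (sites share only summary statistics, never patient-level records) can be sketched as a distributed Newton-Raphson fit in which each site returns its local gradient and Hessian contributions and the server combines them. The NumPy sketch below is a simplified illustration of that scheme on invented data, with no secure transport; it is not the WebGLORE implementation itself.

```python
import numpy as np

def local_stats(X, y, beta):
    """One site's contribution: gradient and Hessian of the logistic log-likelihood."""
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    grad = X.T @ (y - p)
    hess = -(X * (p * (1 - p))[:, None]).T @ X
    return grad, hess

def grid_logistic_regression(sites, n_features, n_iter=10):
    """Server-side Newton-Raphson using only aggregated site statistics."""
    beta = np.zeros(n_features)
    for _ in range(n_iter):
        grads, hessians = zip(*(local_stats(X, y, beta) for X, y in sites))
        grad = np.sum(grads, axis=0)        # aggregate local gradients
        hess = np.sum(hessians, axis=0)     # aggregate local Hessians
        beta = beta - np.linalg.solve(hess, grad)   # Newton step
    return beta

# Two hypothetical sites with locally held data (intercept column included).
rng = np.random.default_rng(0)
def fake_site(n):
    X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
    y = (rng.random(n) < 1 / (1 + np.exp(-(0.5 + X[:, 1] - 2 * X[:, 2])))).astype(float)
    return X, y

print(grid_logistic_regression([fake_site(200), fake_site(300)], n_features=3))
```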
NASA Astrophysics Data System (ADS)
Hara, Yotamu Stephen Rainford
2014-01-01
Mineral sulphide (MS)-lime (CaO) ion exchange reactions (MS + CaO = MO + CaS) and the effect of CaO/C mole ratio during carbothermic reduction (MS + CaO + C = M + CaS + CO(g)) were investigated for complex froth flotation mineral sulphide concentrates. Phases in the partially and fully reacted samples were characterised by X-ray diffraction (XRD) and scanning electron microscopy (SEM). The primary phases during mineral sulphide-lime ion exchange reactions are Fe3O4, CaSO4 Cu2S, and CaS. A complex liquid phase of Ca2CuFeO3S forms during mineral sulphide-lime exchange reactions above 1173 K. The formation mechanisms of Ca2CuFeO3S liquid phase are determined by characterising the partially reacted samples. The reduction rate and extent of mineral sulphides in the presence of CaO and C increase with the increase in CaO/C ratio. The metallic phases are surrounded by the CaS rich phase at CaO/C > 1, but the metallic phases and CaS are found as separate phases at CaO/C < 1. Experimental results show that the stoichiometric ratio of carbon should be slightly higher than that of CaO. The reactions between CaO and gangue minerals (SiO2 and Al2O3) are only observed at CaO/C > 1 and the reacted samples are excessively sintered.
Debaene, François; Wagner-Rousset, Elsa; Colas, Olivier; Ayoub, Daniel; Corvaïa, Nathalie; Van Dorsselaer, Alain; Beck, Alain; Cianférani, Sarah
2013-10-15
Monoclonal antibodies (mAbs) and derivatives such as antibody-drug conjugates (ADC) and bispecific antibodies (bsAb), are the fastest growing class of human therapeutics. Most of the therapeutic antibodies currently on the market and in clinical trials are chimeric, humanized, and human immunoglobulin G1 (IgG1). An increasing number of IgG2s and IgG4s that have distinct structural and functional properties are also investigated to develop products that lack or have diminished antibody effector functions compared to IgG1. Importantly, wild type IgG4 has been shown to form half molecules (one heavy chain and one light chain) that lack interheavy chain disulfide bonds and form intrachain disulfide bonds. Moreover, IgG4 undergoes a process of Fab-arm exchange (FAE) in which the heavy chains of antibodies of different specificities can dissociate and recombine in bispecific antibodies both in vitro and in vivo. Here, native mass spectrometry (MS) and time-resolved traveling wave ion mobility MS (TWIM-MS) were used for the first time for online monitoring of FAE and bsAb formation using Hz6F4-2v3 and natalizumab, two humanized IgG4s which bind to human Junctional Adhesion Molecule-A (JAM-A) and alpha4 integrin, respectively. In addition, native MS analysis of bsAb/JAM-A immune complexes revealed that bsAb can bind up to two antigen molecules, confirming that the Hz6F4 family preferentially binds dimeric JAM-A. Our results illustrate how IM-MS can rapidly assess bsAb structural heterogeneity and be easily implemented into MS workflows for bsAb production follow up and bsAb/antigen complex characterization. Altogether, these results provide new MS-based methodologies for in-depth FAE and bsAb formation monitoring. Native MS and IM-MS will play an increasing role in next generation biopharmaceutical product characterization like bsAbs, antibody mixtures, and antibody-drug conjugates (ADC) as well as for biosimilar and biobetter antibodies.
Characteristics and Energy Use of Volume Servers in the United States
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fuchs, H.; Shehabi, A.; Ganeshalingam, M.
Servers’ field energy use remains poorly understood, given heterogeneous computing loads, configurable hardware and software, and operation over a wide range of management practices. This paper explores various characteristics of 1- and 2-socket volume servers that affect energy consumption, and quantifies the difference in power demand between higher-performing SPEC and ENERGY STAR servers and our best understanding of a typical server operating today. We first establish general characteristics of the U.S. installed base of volume servers from existing IDC data and the literature, before presenting information on server hardware configurations from data collection events at a major online retail website. We then compare cumulative distribution functions of server idle power across three separate datasets and explain the differences between them via examination of the hardware characteristics to which power draw is most sensitive. We find that idle server power demand is significantly higher than ENERGY STAR benchmarks and the industry-released energy use documented in SPEC, and that SPEC server configurations—and likely the associated power-scaling trends—are atypical of volume servers. Next, we examine recent trends in server power draw among high-performing servers across their full load range to consider how representative these trends are of all volume servers before inputting weighted average idle power load values into a recently published model of national server energy use. Finally, we present results from two surveys of IT managers (n=216) and IT vendors (n=178) that illustrate the prevalence of more-efficient equipment and operational practices in server rooms and closets; these findings highlight opportunities to improve the energy efficiency of the U.S. server stock.
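The step of feeding weighted average idle power values into a stock model can be illustrated with a small calculation: weight each observed idle-power value by the share of the installed base it represents, then scale by stock size and hours of operation. All numbers and category names below are invented placeholders, not values from the paper.

```python
# Hypothetical idle-power observations (watts) and installed-base shares.
idle_power_w = {"1-socket": 45.0, "2-socket": 115.0}
stock_share  = {"1-socket": 0.35, "2-socket": 0.65}

weighted_idle_w = sum(idle_power_w[k] * stock_share[k] for k in idle_power_w)

# Annual idle energy for a hypothetical stock of 10 million volume servers
# that sit idle 60% of the time.
stock, idle_fraction, hours = 10e6, 0.60, 8760
annual_twh = weighted_idle_w * stock * idle_fraction * hours / 1e12  # Wh -> TWh

print(f"weighted idle power: {weighted_idle_w:.1f} W, idle energy: {annual_twh:.1f} TWh/yr")
```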
Scaria, Joy; Sreedharan, Aswathy; Chang, Yung-Fu
2008-01-01
Background Microarrays are becoming a very popular tool for microbial detection and diagnostics. Although these diagnostic arrays are much simpler when compared to the traditional transcriptome arrays, due to the high throughput nature of the arrays, the data analysis requirements still form a bottle neck for the widespread use of these diagnostic arrays. Hence we developed a new online data sharing and analysis environment customised for diagnostic arrays. Methods Microbial Diagnostic Array Workstation (MDAW) is a database driven application designed in MS Access and front end designed in ASP.NET. Conclusion MDAW is a new resource that is customised for the data analysis requirements for microbial diagnostic arrays. PMID:18811969
Scaria, Joy; Sreedharan, Aswathy; Chang, Yung-Fu
2008-09-23
Microarrays are becoming a very popular tool for microbial detection and diagnostics. Although these diagnostic arrays are much simpler when compared to the traditional transcriptome arrays, due to the high throughput nature of the arrays, the data analysis requirements still form a bottle neck for the widespread use of these diagnostic arrays. Hence we developed a new online data sharing and analysis environment customised for diagnostic arrays. Microbial Diagnostic Array Workstation (MDAW) is a database driven application designed in MS Access and front end designed in ASP.NET. MDAW is a new resource that is customised for the data analysis requirements for microbial diagnostic arrays.
New method for assessing risks of email
NASA Astrophysics Data System (ADS)
Raja, Seyyed H.; Afrooz, Farzad
2013-03-01
E-mail technology has become one of the essential tools of daily life for correspondence between individuals. Given this, it is important that e-mail messages, servers, clients and the correspondence exchanged between different people have acceptable security, so that people can use this technology with confidence. In the information age, many financial and non-financial transactions are carried out electronically and data exchange takes place via the Internet, so theft and manipulation of data can impose exorbitant costs in terms of integrity as well as financial, political, economic and cultural damage. E-mail correspondence is no exception and is therefore very important. Our review found that no existing method focuses specifically on risk assessment for e-mail systems. We examine risk assessment approaches for other systems and their strengths and weaknesses, and then apply Convery's method, originally developed for assessing network risks, to the assessment of e-mail risks. At the end of the paper we offer a dedicated table for e-mail risk assessment.
eHealth Networking Information Systems - The New Quality of Information Exchange.
Messer-Misak, Karin; Reiter, Christoph
2017-01-01
The development and introduction of platforms that enable interdisciplinary exchange on current developments and projects in the area of eHealth have been stimulated by different authorities. The aim of this project was to develop a repository of eHealth projects that will make the wealth of eHealth projects visible and enable mutual learning through the sharing of experiences and good practice. The content of the database and the search criteria as well as their categories were determined in close coordination and cooperation with stakeholders from the specialist areas. Technically, we used Java Server Faces (JSF) for the implementation of the frontend of the web application. Access to structured information on projects can support stakeholders in combining skills and knowledge residing in different places to create new solutions and approaches within a network of evolving competencies and opportunities. A regional database is the beginning of a structured collection and presentation of projects, which can then be incorporated into a broader context. The next step will be to unify this information transparently.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kubachka, Kevin M.; Kohan, Michael C.; Herbin-Davis, Karen
Although metabolism of arsenicals to form methylated oxoarsenical species has been extensively studied, less is known about the formation of thiolated arsenical species that have recently been detected as urinary metabolites. Indeed, their presence suggests that the metabolism of ingested arsenic is more complex than previously thought. Recent reports have shown that thiolated arsenicals can be produced by the anaerobic microflora of the mouse cecum, suggesting that metabolism prior to systemic absorption may be a significant determinant of the pattern and extent of exposure to various arsenic-containing species. Here, we examined the metabolism of (34)S-labeled dimethylthioarsinic acid ((34)S-DMTA(V)) by the anaerobic microflora of the mouse cecum using HPLC-ICP-MS and HPLC-ESI-MS/MS to monitor for the presence of various oxo- and thioarsenicals. The use of isotopically enriched (34)S-DMTA(V) made it possible to differentiate among potential metabolic pathways for production of trimethylarsine sulfide (TMAS(V)). Upon in vitro incubation in an assay containing anaerobic microflora of mouse cecum, (34)S-DMTA(V) underwent several transformations. Labile (34)S was exchanged with more abundant (32)S to produce (32)S-DMTA(V), a thiol group was added to yield DMDTA(V), and a methyl group was added to yield (34)S-TMAS(V). Because incubation of (34)S-DMTA(V) resulted in the formation of (34)S-TMAS(V), the pathway for its formation must preserve the arsenic-sulfur bond. The alternative metabolic pathway postulated for formation of TMAS(V) from dimethylarsinic acid (DMA(V)) would proceed via a dimethylarsinous acid (DMA(III)) intermediate and would necessitate the loss of the (34)S label. Structural confirmation of the metabolic product was achieved using HPLC-ESI-MS/MS. The data presented support the direct methylation of DMTA(V) to TMAS(V). Additionally, the detection of isotopically pure (34)S-TMAS(V) raises questions about the sulfur exchange properties of TMAS(V) in the cecum material. Therefore, (34)S-TMAS(V) was incubated and the exchange was monitored with respect to time. The data suggest that the As-S bond associated with TMAS(V) is less labile than the As-S bond associated with DMTA(V).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chai, X; Liu, L; Xing, L
Purpose: Visualization and processing of medical images and radiation treatment plan evaluation have traditionally been constrained to local workstations with limited computation power and limited ability for data sharing and software updates. We present a web-based image processing and planning evaluation platform (WIPPEP) for radiotherapy applications with high efficiency, ubiquitous web access, and real-time data sharing. Methods: This software platform consists of three parts: web server, image server and computation server. Each independent server communicates with the others through HTTP requests. The web server is the key component that provides visualizations and the user interface through front-end web browsers and relays information to the backend to process user requests. The image server serves as a PACS system. The computation server performs the actual image processing and dose calculation. The web server backend is developed using Java Servlets and the frontend is developed using HTML5, JavaScript, and jQuery. The image server is based on the open source DCM4CHEE PACS system. The computation server can be written in any programming language as long as it can send/receive HTTP requests. Our computation server was implemented in Delphi, Python and PHP, which can process data directly or via a C++ program DLL. Results: This software platform is running on a 32-core CPU server virtually hosting the web server, image server, and computation servers separately. Users can visit our internal website with the Chrome browser, select a specific patient, visualize images and RT structures belonging to this patient, and perform image segmentation on the Delphi computation server and Monte Carlo dose calculation on the Python or PHP computation server. Conclusion: We have developed a web-based image processing and plan evaluation platform prototype for radiotherapy. This system has clearly demonstrated the feasibility of performing image processing and plan evaluation through a web browser and exhibited potential for future cloud-based radiotherapy.
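The three-server split described above (a web server relaying HTTP requests to an image server and to computation servers) can be sketched with a minimal computation endpoint that accepts a job and returns a JSON result. The sketch uses only the Python standard library; the route name, payload fields and the stand-in calculation are invented for illustration and are not part of the platform described.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class ComputationHandler(BaseHTTPRequestHandler):
    """Minimal computation-server endpoint reached by the web-server backend."""

    def do_POST(self):
        if self.path != "/compute/dose":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        task = json.loads(self.rfile.read(length))   # e.g. {"patient_id": ..., "beams": [...]}

        # Stand-in for the real work (Monte Carlo dose calculation, segmentation, ...).
        result = {"patient_id": task.get("patient_id"), "status": "done",
                  "mean_dose_gy": 1.8 * len(task.get("beams", []))}

        body = json.dumps(result).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8081), ComputationHandler).serve_forever()
```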
OPeNDAP Server4: Building a High-Performance Server for the DAP by Leveraging Existing Software
NASA Astrophysics Data System (ADS)
Potter, N.; West, P.; Gallagher, J.; Garcia, J.; Fox, P.
2006-12-01
OPeNDAP has been working in conjunction with NCAR/ESSL/HAO to develop a modular, high performance data server that will be the successor to the current OPeNDAP data server. The new server, called Server4, is really two servers: a 'Back-End' data server which reads information from various types of data sources and packages the results in DAP objects; and a 'Front-End' which receives client DAP requests and then decides how to use features of the Back-End data server to build the correct responses. This architecture can be configured in several interesting ways: the Front- and Back-End components can be run on either the same or different machines, depending on security and performance needs; new Front-End software can be written to support other network data access protocols; and local applications can interact directly with the Back-End data server. This new server's Back-End component will use the server infrastructure developed by HAO for use with the Earth System Grid II project. Extensions needed to use it as part of the new OPeNDAP server were minimal. The HAO server was modified so that it loads 'data handlers' at run-time. Each data handler module only needs to satisfy a simple interface, which both enables the existing data handlers written for the old OPeNDAP server to be used directly and simplifies writing new handlers from scratch. The Back-End server leverages high-performance features developed for the ESG II project, so applications that can interact with it directly can read large volumes of data efficiently. The Front-End module of Server4 uses the Java Servlet system in place of the Common Gateway Interface (CGI) used in the past. New front-end modules can be written to support different network data access protocols, so that the same server will ultimately be able to support more than the DAP/2.0 protocol. As an example, we will discuss a SOAP interface that's currently in development. In addition to support for DAP/2.0 and prototypical support for a SOAP interface, the new server includes support for the THREDDS cataloging protocol. THREDDS is tightly integrated into the Front-End of Server4. The Server4 Front-End can make full use of advanced THREDDS features such as attribute specification and inheritance, and custom catalogs which segue into automatically generated catalogs, as well as providing a default behavior which requires almost no catalog configuration.
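The simple interface that Back-End data handlers must satisfy can be pictured as a small plug-in contract loaded at run time. The sketch below, in Python rather than the server's own implementation language, is only a conceptual stand-in for that idea; the class and method names are invented.

```python
import importlib

class DataHandler:
    """Conceptual contract each data handler must satisfy (names are invented)."""
    extensions = ()                       # file types the handler claims to serve

    def read(self, path, constraint=None):
        """Return data read from `path`, optionally subset by a constraint."""
        raise NotImplementedError

class CsvHandler(DataHandler):
    extensions = (".csv",)

    def read(self, path, constraint=None):
        with open(path) as f:
            rows = [line.rstrip("\n").split(",") for line in f]
        # A toy 'constraint': keep only the first N rows.
        return rows if constraint is None else rows[: int(constraint)]

def load_handler(module_name, class_name):
    """Load a handler class at run time, mirroring plug-in style loading."""
    return getattr(importlib.import_module(module_name), class_name)()
```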
Array Processing in the Cloud: the rasdaman Approach
NASA Astrophysics Data System (ADS)
Merticariu, Vlad; Dumitru, Alex
2015-04-01
The multi-dimensional array data model is gaining more and more attention when dealing with Big Data challenges in a variety of domains such as climate simulations, geographic information systems, medical imaging or astronomical observations. Solutions provided by classical Big Data tools such as Key-Value Stores and MapReduce, as well as traditional relational databases, proved to be limited in domains associated with multi-dimensional data. This problem has been addressed by the field of array databases, in which systems provide database services for raster data without imposing limitations on the number of dimensions that a dataset can have. Examples of datasets commonly handled by array databases include 1-dimensional sensor data, 2-D satellite imagery, 3-D x/y/t image time series as well as x/y/z geophysical voxel data, and 4-D x/y/z/t weather data. In astrophysics, this can grow as large as simulations of the whole universe. rasdaman is a well established array database, which implements many optimizations for dealing with large data volumes and operation complexity. Among those, the latest one is intra-query parallelization support: a network of machines collaborates in answering a single array database query by dividing it into independent sub-queries sent to different servers. This enables massive processing speed-ups, which promise solutions to research challenges on multi-Petabyte data cubes. Several correlated factors influence the speedup that intra-query parallelization brings: the number of servers, the capabilities of each server, the quality of the network, the availability of the data to the server that needs it in order to compute the result, and more. In the effort to adapt the engine to cloud processing patterns, two main components have been identified: one that handles communication and gathers information about the arrays sitting on every server, and a processing unit responsible for dividing work among available nodes and executing operations on local data. The federation daemon collects and stores statistics from the other network nodes and provides real-time updates about local changes. Information exchanged includes available datasets, CPU load and memory usage per host. The processing component is represented by the rasdaman server. Using information from the federation daemon, it breaks queries into subqueries to be executed on peer nodes, ships them, and assembles the intermediate results. Thus, we define a rasdaman network node as a pair of a federation daemon and a rasdaman server. Any node can receive a query and will subsequently act as this query's dispatcher, so all peers are at the same level and there is no single point of failure. Should a node become inaccessible, the peers will recognize this and will no longer consider it for distribution. Conversely, a peer can join the network at any time. To assess the feasibility of our approach, we deployed a rasdaman network in the Amazon Elastic Cloud environment on 1001 nodes, and observed that this feature can greatly increase the performance and scalability of the system, offering a large throughput of processed data.
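The intra-query parallelization described above (one node acting as dispatcher, splitting an array query into independent sub-queries over parts of the spatial domain and assembling the partial results) can be sketched schematically as follows, using NumPy arrays and threads in place of real rasdaman nodes. The function names and the even tiling along one axis are assumptions made for illustration.

```python
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def run_subquery(tile):
    """Stand-in for a peer node evaluating one sub-query on its local tile."""
    return tile.mean()            # e.g. an aggregation pushed down to the data

def dispatch_query(array, n_nodes):
    """Dispatcher node: split the domain, ship sub-queries, assemble the result."""
    tiles = np.array_split(array, n_nodes, axis=0)      # divide the spatial domain
    with ThreadPoolExecutor(max_workers=n_nodes) as pool:
        partials = list(pool.map(run_subquery, tiles))
    # Assemble: a global mean from per-tile means weighted by tile size.
    weights = [t.shape[0] for t in tiles]
    return float(np.average(partials, weights=weights))

cube = np.random.rand(1000, 512, 512)    # a toy x/y/t datacube
print(dispatch_query(cube, n_nodes=4))
```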
Mistarz, Ulrik H; Brown, Jeffery M; Haselmann, Kim F; Rand, Kasper D
2014-12-02
Gas-phase hydrogen/deuterium exchange (HDX) is a fast and sensitive, yet unharnessed analytical approach for providing information on the structural properties of biomolecules, in a complementary manner to mass analysis. Here, we describe a simple setup for ND3-mediated millisecond gas-phase HDX inside a mass spectrometer immediately after ESI (gas-phase HDX-MS) and show utility for studying the primary and higher-order structure of peptides and proteins. HDX was achieved by passing N2-gas through a container filled with aqueous deuterated ammonia reagent (ND3/D2O) and admitting the saturated gas immediately upstream or downstream of the primary skimmer cone. The approach was implemented on three commercially available mass spectrometers and required no or minor fully reversible reconfiguration of gas-inlets of the ion source. Results from gas-phase HDX-MS of peptides using the aqueous ND3/D2O as HDX reagent indicate that labeling is facilitated exclusively through gaseous ND3, yielding similar results to the infusion of purified ND3-gas, while circumventing the complications associated with the use of hazardous purified gases. Comparison of the solution-phase- and gas-phase deuterium uptake of Leu-Enkephalin and Glu-Fibrinopeptide B, confirmed that this gas-phase HDX-MS approach allows for labeling of sites (heteroatom-bound non-amide hydrogens located on side-chains, N-terminus and C-terminus) not accessed by classical solution-phase HDX-MS. The simple setup is compatible with liquid chromatography and a chip-based automated nanoESI interface, allowing for online gas-phase HDX-MS analysis of peptides and proteins separated on a liquid chromatographic time scale at increased throughput. Furthermore, online gas-phase HDX-MS could be performed in tandem with ion mobility separation or electron transfer dissociation, thus enabling multiple orthogonal analyses of the structural properties of peptides and proteins in a single automated LC-MS workflow.
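The comparison of solution-phase and gas-phase deuterium uptake mentioned above comes down to a centroid-mass difference between labeled and unlabeled spectra. The sketch below shows that bookkeeping for a single peptide with invented m/z and intensity values; it is not the instrument or vendor software used in the study.

```python
def centroid_mass(peaks, charge):
    """Intensity-weighted neutral mass from (m/z, intensity) peaks."""
    proton = 1.007276
    total = sum(i for _, i in peaks)
    mz = sum(mz * i for mz, i in peaks) / total
    return (mz - proton) * charge

def deuterium_uptake(unlabeled_peaks, labeled_peaks, charge):
    """Deuterium uptake (Da) as the centroid neutral-mass shift on labeling."""
    return centroid_mass(labeled_peaks, charge) - centroid_mass(unlabeled_peaks, charge)

# Hypothetical isotope envelopes for a 2+ peptide before and after ND3 labeling.
unlabeled = [(556.28, 100.0), (556.78, 55.0), (557.28, 18.0)]
labeled   = [(557.30, 40.0), (557.80, 100.0), (558.30, 70.0), (558.80, 25.0)]
print(f"uptake: {deuterium_uptake(unlabeled, labeled, charge=2):.2f} Da")
```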
2008-08-01
Administration NDBA N-nitrosodi-n-butylamine NDEA N-nitrosodiethylamine NDMA N-nitrosodimethylamine NDPA N-nitrosodi-n-propylamine v ACRONYMS...spectrometry (IC-MS/MS). Nitrosamines were analyzed using EPA Method 521. N-nitrosodimethylamine (NDMA) was 2.6 parts per trillion (ppt) with a detection...and metals (Ca, Cu, Fe, Mg, Mn, K, Na, and Zn). Specific methods are listed in Table 5. ** N-nitrosodimethylamine (NDMA), N-nitrosodiethylamine
The Role of Memorable Messages in the Process of Organizational Socialization.
ERIC Educational Resources Information Center
Stohl, Cynthia
1986-01-01
Examines the structure, form, and nature of the content and context of memorable messages exchanged within an organization. Discusses how these features enhance the socializing and memorable nature of such messages. (MS)
West, Danielle M; Mu, Ruipu; Gamagedara, Sanjeewa; Ma, Yinfa; Adams, Craig; Eichholz, Todd; Burken, Joel G; Shi, Honglan
2015-06-01
Perchlorate and bromate occurrence in drinking water causes health concerns due to their effects on thyroid function and carcinogenicity, respectively. The purpose of this study was threefold: (1) to advance a sensitive method for simultaneous rapid detection of perchlorate and bromate in drinking water systems, (2) to systematically study the occurrence of these two contaminants in Missouri drinking water treatment systems, and (3) to examine effective sorbents for minimizing perchlorate in drinking water. A rapid high-performance ion exchange chromatography-tandem mass spectrometry (HPIC-MS/MS) method was advanced for simultaneous detection of perchlorate and bromate in drinking water. The HPIC-MS/MS method was rapid, required no preconcentration of the water samples, and had detection limits for perchlorate and bromate of 0.04 and 0.01 μg/L, respectively. The method was applied to determine perchlorate and bromate concentrations in a total of 23 selected Missouri drinking water treatment systems during differing seasons. The water systems selected include different source waters: groundwater, lake water, river water, and groundwater influenced by surface water. The concentrations of perchlorate and bromate were lower than or near the method detection limits in most of the drinking water samples monitored. The removal of perchlorate by various adsorbents was studied. A cationic organoclay (TC-99) exhibited effective removal of perchlorate from drinking water matrices.
An extensible and lightweight architecture for adaptive server applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gorton, Ian; Liu, Yan; Trivedi, Nihar
2008-07-10
Server applications augmented with behavioral adaptation logic can react to environmental changes, creating self-managing server applications with improved quality of service at runtime. However, developing adaptive server applications is challenging due to the complexity of the underlying server technologies and highly dynamic application environments. This paper presents an architecture framework, the Adaptive Server Framework (ASF), to facilitate the development of adaptive behavior for legacy server applications. ASF provides a clear separation between the implementation of adaptive behavior and the business logic of the server application. This means a server application can be extended with programmable adaptive features through the definition and implementation of control components defined in ASF. Furthermore, ASF is a lightweight architecture in that it incurs low CPU overhead and memory usage. We demonstrate the effectiveness of ASF through a case study, in which a server application dynamically determines the resolution and quality to scale an image based on the load of the server and network connection speed. The experimental evaluation demonstrates the performance gains possible by adaptive behavior and the low overhead introduced by ASF.
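The case study above (choosing an image resolution and quality from current server load and connection speed) amounts to a small decision rule; the thresholds and tiers in the sketch below are invented for illustration and are not the control components defined by ASF.

```python
def choose_image_settings(cpu_load, bandwidth_mbps):
    """Pick (scale factor, JPEG quality) from server load and connection speed."""
    # Degrade gracefully as the server gets busier or the link gets slower.
    if cpu_load > 0.85 or bandwidth_mbps < 1.0:
        return 0.25, 40      # heavy load / slow link: small, strongly compressed
    if cpu_load > 0.60 or bandwidth_mbps < 5.0:
        return 0.50, 60
    return 1.00, 85          # light load and fast link: full resolution

print(choose_image_settings(cpu_load=0.72, bandwidth_mbps=12.0))  # -> (0.5, 60)
```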
Mass Spec Studio for Integrative Structural Biology
Rey, Martial; Sarpe, Vladimir; Burns, Kyle; Buse, Joshua; Baker, Charles A.H.; van Dijk, Marc; Wordeman, Linda; Bonvin, Alexandre M.J.J.; Schriemer, David C.
2015-01-01
The integration of biophysical data from multiple sources is critical for developing accurate structural models of large multiprotein systems and their regulators. Mass spectrometry (MS) can be used to measure the insertion location for a wide range of topographically sensitive chemical probes, and such insertion data provide a rich but disparate set of modeling restraints. We have developed a software platform that integrates the analysis of label-based MS data with protein modeling activities (Mass Spec Studio). Analysis packages can mine any labeling data from any mass spectrometer in a proteomics-grade manner, and link labeling methods with data-directed protein interaction modeling using HADDOCK. Support is provided for hydrogen/deuterium exchange (HX) and covalent labeling chemistries, including novel acquisition strategies such as targeted HX-tandem MS (MS2) and data-independent HX-MS2. The latter permits the modeling of highly complex systems, which we demonstrate by the analysis of microtubule interactions. PMID:25242457
Huang, Zongyun; Francis, Robert; Zha, Yan; Ruan, Joan
2015-01-01
A simple, sensitive and robust method using HILIC-ESI-MS has been developed for the determination of methane sulfonic acid (MSA) at low ppm levels, in order to verify the effectiveness of controlling the formation of genotoxic sulfonate esters in the downstream synthetic step that produces the active pharmaceutical ingredient (API). Stationary phases with positively charged functional groups, such as triazole and amino phases, were evaluated for the retention of alkyl sulfonic acids. MSA was quantitated at 1-10 ppm relative to the API using a Cosmosil column (triazole stationary phase) in HILIC mode, and the control of MSA can be monitored effectively using the HILIC-ESI-MS methodology. In addition, to provide general guidance for HILIC-ESI-MS method development, the retention behavior of propanesulfonic acid (PSA) in HILIC mode was investigated using a Unison UK-Amino column to better understand the HILIC separation mechanism. The results provided reasonable evidence that the combined effect of surface adsorption and ion exchange played a dominant role for sulfonic acids when using a mobile phase within the typical HILIC operating range (0.05-0.20 aqueous volume fraction), while the ion-exchange effect becomes increasingly important in mobile phases with higher water content. The advantage of using ESI-MS detection in HILIC mode was also demonstrated by the observation that the sensitivity of PSA increased substantially with increasing acetonitrile fraction in the mobile phase from 0.80 to 0.95. Copyright © 2014 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Kerley, Dan; Smith, Malcolm; Dunn, Jennifer; Herriot, Glen; Véran, Jean-Pierre; Boyer, Corinne; Ellerbroek, Brent; Gilles, Luc; Wang, Lianqi
2016-08-01
The Narrow Field Infrared Adaptive Optics System (NFIRAOS) is the first light Adaptive Optics (AO) system for the Thirty Meter Telescope (TMT). A critical component of NFIRAOS is the Real-Time Controller (RTC) subsystem which provides real-time wavefront correction by processing wavefront information to compute Deformable Mirror (DM) and Tip/Tilt Stage (TTS) commands. The National Research Council of Canada - Herzberg (NRC-H), in conjunction with TMT, has developed a preliminary design for the NFIRAOS RTC. The preliminary architecture for the RTC is comprised of several Linux-based servers. These servers are assigned various roles including: the High-Order Processing (HOP) servers, the Wavefront Corrector Controller (WCC) server, the Telemetry Engineering Display (TED) server, the Persistent Telemetry Storage (PTS) server, and additional testing and spare servers. There are up to six HOP servers that accept high-order wavefront pixels, and perform parallelized pixel processing and wavefront reconstruction to produce wavefront corrector error vectors. The WCC server performs low-order mode processing, and synchronizes and aggregates the high-order wavefront corrector error vectors from the HOP servers to generate wavefront corrector commands. The Telemetry Engineering Display (TED) server is the RTC interface to TMT and other subsystems. The TED server receives all external commands and dispatches them to the rest of the RTC servers and is responsible for aggregating several offloading and telemetry values that are reported to other subsystems within NFIRAOS and TMT. The TED server also provides the engineering GUIs and real-time displays. The Persistent Telemetry Storage (PTS) server contains fault tolerant data storage that receives and stores telemetry data, including data for Point-Spread Function Reconstruction (PSFR).
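The WCC role described above (synchronizing and aggregating per-HOP-server high-order error vectors into wavefront corrector commands) can be pictured with a toy integrator-style update. The array sizes, gain and partitioning below are invented for illustration and do not reflect the real NFIRAOS pipeline or its latency requirements.

```python
import numpy as np

N_ACT = 4000                 # hypothetical number of DM actuators
N_HOP = 6                    # high-order processing servers

def aggregate_errors(partial_errors):
    """WCC step: sum the per-HOP partial wavefront-corrector error vectors."""
    return np.sum(partial_errors, axis=0)

def update_commands(prev_cmd, error, gain=0.4):
    """Simple leaky-integrator control law producing the next DM command vector."""
    return 0.99 * prev_cmd - gain * error

# One toy frame: each HOP server contributes a partial error vector.
rng = np.random.default_rng(1)
partials = [rng.normal(scale=1e-8, size=N_ACT) for _ in range(N_HOP)]
cmd = np.zeros(N_ACT)
cmd = update_commands(cmd, aggregate_errors(partials))
print(cmd[:5])
```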
Quantification of Soluble Sugars and Sugar Alcohols by LC-MS/MS.
Feil, Regina; Lunn, John Edward
2018-01-01
Sugars are simple carbohydrates composed primarily of carbon, hydrogen, and oxygen. They play a central role in metabolism as sources of energy and as building blocks for synthesis of structural and nonstructural polymers. Many different techniques have been used to measure sugars, including refractometry, colorimetric and enzymatic assays, gas chromatography, high-performance liquid chromatography, and nuclear magnetic resonance spectroscopy. In this chapter we describe a method that combines an initial separation of sugars by high-performance anion-exchange chromatography (HPAEC) with detection and quantification by tandem mass spectrometry (MS/MS). This combination of techniques provides exquisite specificity, allowing measurement of a diverse range of high- and low-abundance sugars in biological samples. This method can also be used for isotopomer analysis in stable-isotope labeling experiments to measure metabolic fluxes.
Energy Efficiency in Small Server Rooms: Field Surveys and Findings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheung, Iris; Greenberg, Steve; Mahdavi, Roozbeh
Fifty-seven percent of US servers are housed in server closets, server rooms, and localized data centers, in what are commonly referred to as small server rooms, which comprise 99 percent of all server spaces in the US. While many mid-tier and enterprise-class data centers are owned by large corporations that consider energy efficiency a goal to minimize business operating costs, small server rooms typically are not similarly motivated. They are characterized by decentralized ownership and management and come in many configurations, which creates a unique set of efficiency challenges. To develop energy efficiency strategies for these spaces, we surveyed 30 small server rooms across eight institutions, and selected four of them for detailed assessments. The four rooms had Power Usage Effectiveness (PUE) values ranging from 1.5 to 2.1. Energy saving opportunities ranged from no- to low-cost measures such as raising cooling set points and better airflow management, to more involved but cost-effective measures including server consolidation and virtualization, and dedicated cooling with economizers. We found that inefficiencies mainly resulted from organizational rather than technical issues. Because of the inherent space and resource limitations, the most effective measure is to operate servers through energy-efficient cloud-based services or well-managed larger data centers, rather than server rooms. Backup power requirements, and IT and cooling efficiency, should be evaluated to minimize energy waste in the server space. Utility programs are instrumental in raising awareness and spreading technical knowledge on server operation, and the implementation of energy efficiency measures in small server rooms.
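For reference, PUE compares total facility energy to the energy delivered to IT equipment. The sketch below is a minimal illustration of that ratio; the numbers are illustrative only and are not taken from the surveyed rooms.

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy divided by IT energy.

    A PUE of 1.0 means all energy goes to IT equipment; higher values reflect
    overhead such as cooling, power distribution, and lighting.
    """
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Illustrative values: a room drawing 42 MWh/yr in total with 20 MWh/yr of IT load.
print(round(pue(42_000, 20_000), 2))  # 2.1, the upper end of the range reported above
```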
Asymmetric Preorganization of Inverted Pair Residues in the Sodium-Calcium Exchanger
Giladi, Moshe; Almagor, Lior; van Dijk, Liat; Hiller, Reuben; Man, Petr; Forest, Eric; Khananshvili, Daniel
2016-01-01
In analogy with many other proteins, Na+/Ca2+ exchangers (NCX) adopt an inverted twofold symmetry of repeated structural elements, while exhibiting a functional asymmetry by stabilizing an outward-facing conformation. Here, structure-based mutant analyses of the Methanococcus jannaschii Na+/Ca2+ exchanger (NCX_Mj) were performed in conjunction with HDX-MS (hydrogen/deuterium exchange mass spectrometry) to identify the structure-dynamic determinants of functional asymmetry. HDX-MS identified hallmark differences in backbone dynamics at ion-coordinating residues of apo-NCX_Mj, whereas Na+ or Ca2+ binding to the respective sites induced relatively small, but specific, changes in backbone dynamics. Mutant analysis identified ion-coordinating residues affecting the catalytic capacity (kcat/Km), but not the stability of the outward-facing conformation. In contrast, distinct “noncatalytic” residues (adjacent to the ion-coordinating residues) control the stability of the outward-facing conformation, but not the catalytic capacity. The helix-breaking signature sequences (GTSLPE) on the α1 and α2 repeats (at the ion-binding core) differ in their folding/unfolding dynamics, while providing asymmetric contributions to transport activities. The present data strongly support the idea that asymmetric preorganization of the ligand-free ion pocket predefines catalytic reorganization of ion-bound residues, where secondary interactions with adjacent residues couple the alternating access. These findings provide a structure-dynamic basis for ion-coupled alternating access in NCX and similar proteins. PMID:26876271
SU-G-JeP3-08: Robotic System for Ultrasound Tracking in Radiation Therapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuhlemann, I; Graduate School for Computing in Medicine and Life Sciences, University of Luebeck; Jauer, P
Purpose: For safe and accurate real-time tracking of tumors for IGRT using 4D ultrasound, it is necessary to make use of novel, high-end force-sensitive lightweight robots designed for human-machine interaction. Such a robot will be integrated into an existing robotized ultrasound system for non-invasive 4D live tracking, using a newly developed real-time control and communication framework. Methods: The new KUKA LBR iiwa robot is used for robotized ultrasound real-time tumor tracking. Besides more precise probe contact pressure detection, this robot provides an additional 7th link, enhancing the dexterity of the kinematics and the mounted transducer. Several integrated, certified safety features create a safe environment for the patients during treatment. However, to remotely control the robot for the ultrasound application, a real-time control and communication framework has to be developed. Based on a client/server concept, client-side control commands are received and processed by a central server unit and are implemented by a client module running directly on the robot’s controller. Several special functionalities for robotized ultrasound applications are integrated, and the robot can now be used for real-time control of the image quality by adjusting the transducer position and contact pressure. The framework was evaluated looking at overall real-time capability for communication and processing of three different standard commands. Results: Due to inherent, certified safety modules, the new robot ensures a safe environment for patients during tumor tracking. Furthermore, the developed framework shows overall real-time capability with a maximum average latency of 3.6 ms (minimum 2.5 ms; 5000 trials). Conclusion: The novel KUKA LBR iiwa robot will advance the current robotized ultrasound tracking system with important features. With the developed framework, it is now possible to remotely control this robot and use it for robotized ultrasound tracking applications, including image quality control and target tracking.
NASA Astrophysics Data System (ADS)
Stepanov, Sergey
2013-03-01
X-Ray Server (x-server.gmca.aps.anl.gov) is a WWW-based computational server for modeling of X-ray diffraction, reflection and scattering data. The modeling software operates directly on the server and can be accessed remotely either from web browsers or from user software. In the latter case the server can be deployed as a software library or a data fitting engine. As the server recently surpassed the milestones of 15 years online and 1.5 million calculations, it has accumulated a number of technical solutions that are discussed in this paper. The developed approaches to detecting physical model limits and user calculation failures, solutions to spam and firewall problems, ways to involve the community in replenishing databases, and methods to teach users automated access to the server programs may be helpful for X-ray researchers interested in using the server or sharing their own software online.
Effect of video server topology on contingency capacity requirements
NASA Astrophysics Data System (ADS)
Kienzle, Martin G.; Dan, Asit; Sitaram, Dinkar; Tetzlaff, William H.
1996-03-01
Video servers need to assign a fixed set of resources to each video stream in order to guarantee on-time delivery of the video data. If a server has insufficient resources to guarantee the delivery, it must reject the stream request rather than slowing down all existing streams. Large scale video servers are being built as clusters of smaller components, so as to be economical, scalable, and highly available. This paper uses a blocking model developed for telephone systems to evaluate video server cluster topologies. The goal is to achieve high utilization of the components and low per-stream cost combined with low blocking probability and high user satisfaction. The analysis shows substantial economies of scale achieved by larger server images. Simple distributed server architectures can result in partitioning of resources with low achievable resource utilization. By comparing achievable resource utilization of partitioned and monolithic servers, we quantify the cost of partitioning. Next, we present an architecture for a distributed server system that avoids resource partitioning and results in highly efficient server clusters. Finally, we show how, in these server clusters, further optimizations can be achieved through caching and batching of video streams.
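The blocking model borrowed from telephony is the classic Erlang-B analysis. The sketch below is a minimal, hedged illustration of how blocking probability can be computed for a given stream capacity and offered load using the standard recurrence (the load and capacity figures are illustrative, not taken from the paper):

```python
def erlang_b(offered_load: float, capacity: int) -> float:
    """Erlang-B blocking probability for `capacity` concurrent streams and
    `offered_load` in Erlangs, via the recurrence
    B(0) = 1, B(m) = a*B(m-1) / (m + a*B(m-1))."""
    b = 1.0
    for m in range(1, capacity + 1):
        b = offered_load * b / (m + offered_load * b)
    return b

# A monolithic server with 1000 stream slots vs. one partition of a split server
# with 500 slots carrying half the load: smaller partitions block more often,
# which is the economy-of-scale effect discussed above.
load = 950.0
print(erlang_b(load, 1000))      # monolithic server
print(erlang_b(load / 2, 500))   # one partition of a partitioned server
```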
Kumar, Sunil; Rai, Manoj K; Singh, Narender; Mangal, Manisha
2010-12-01
Shoot tips excised from in vitro proliferated shoots derived from nodal explants of jojoba [Simmondsia chinensis (Link) Schneider] were encapsulated in calcium alginate beads for germplasm exchange and distribution. A gelling matrix of 3 % sodium alginate and 100 mM calcium chloride was found most suitable for formation of ideal calcium alginate beads. Best response for shoot sprouting from encapsulated shoot tips was recorded on 0.8 % agar-solidified full-strength MS medium. Rooting was induced upon transfer of sprouted shoots to 0.8 % agar-solidified MS medium containing 1 mg l(-1) IBA. About 70 % of encapsulated shoot tips were rooted and converted into plantlets. Plants regenerated from encapsulated shoot tips were acclimatized successfully. The present encapsulation approach could also be applied as an alternative method of propagation of desirable elite genotype of jojoba.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, Jia; Harrison, Rane A.; Li, Lianbo
KRAS G12C, the most common RAS mutation found in non-small-cell lung cancer, has been the subject of multiple recent covalent small-molecule inhibitor campaigns including efforts directed at the guanine nucleotide pocket and separate work focused on an inducible pocket adjacent to the switch motifs. Multiple conformations of switch II have been observed, suggesting that switch II pocket (SIIP) binders may be capable of engaging a range of KRAS conformations. Here we report the use of hydrogen/deuterium-exchange mass spectrometry (HDX MS) to discriminate between conformations of switch II induced by two chemical classes of SIIP binders. We investigated the structural basis for differences in HDX MS using X-ray crystallography and discovered a new SIIP configuration in response to binding of a quinazoline chemotype. These results have implications for structure-guided drug design targeting the RAS SIIP.
NASA Astrophysics Data System (ADS)
Acter, Thamina; Lee, Seulgidaun; Cho, Eunji; Jung, Maeng-Joon; Kim, Sunghwan
2018-01-01
In this study, continuous in-source hydrogen/deuterium exchange (HDX) atmospheric pressure photoionization (APPI) mass spectrometry (MS) with continuous feeding of D2O was developed and validated. D2O was continuously fed using a capillary line placed on the center of a metal plate positioned between the UV lamp and nebulizer. The proposed system overcomes the limitations of previously reported APPI HDX-MS approaches where deuterated solvents were premixed with sample solutions before ionization. This is particularly important for APPI because solvent composition can greatly influence ionization efficiency as well as the solubility of analytes. The experimental parameters for APPI HDX-MS with continuous feeding of D2O were optimized, and the optimized conditions were applied for the analysis of nitrogen-, oxygen-, and sulfur-containing compounds. The developed method was also applied for the analysis of the polar fraction of a petroleum sample. Thus, the data presented in this study clearly show that the proposed HDX approach can serve as an effective analytical tool for the structural analysis of complex mixtures.
A new route of oxygen isotope exchange in the solid phase: demonstration in CuSO4.5H2O.
Danon, Albert; Saig, Avraham; Finkelstein, Yacov; Koresh, Jacob E
2005-11-10
Temperature-programmed desorption mass spectrometry (TPD-MS) measurements on [(18)O]water-enriched copper sulfate pentahydrate (CuSO(4).5H(2)(18)O) reveal an unambiguous occurrence of efficient oxygen isotope exchange between the water of crystallization and the sulfate in its CuSO(4) solid phase. To the best of our knowledge, the occurrence of such an exchange was never observed in a solid phase. The exchange process was observed during the stepwise dehydration (50-300 degrees C) of the compound. Specifically, the exchange promptly occurs somewhere between 160 and 250 degrees C; however, the exact temperature could not be resolved conclusively. It is shown that only the fifth, sulfate-associated, anionic H(2)O molecule participates in the exchange process and that the exchange seems to occur in a preferable fashion with, at the most, one oxygen atom in SO(4). Such an exchange, occurring below 250 degrees C, questions the common conviction of unfeasible oxygen exchange under geothermic conditions. This new oxygen exchange phenomenon is not exclusive to copper sulfate but is unambiguously observed also in other sulfate- and nitrate-containing minerals.
Lin, Ping-Chang
2015-06-01
A number of NMR methods possess the capability of probing chemical exchange dynamics in solution. However, certain drawbacks limit the applications of these NMR approaches, particularly, to a complex system. Here, we propose a procedure that integrates the regularized nonnegative least squares (NNLS) analysis of multiexponential T2 relaxation into Carr-Purcell-Meiboom-Gill (CPMG) relaxation dispersion experiments to probe chemical exchange in a multicompartmental system. The proposed procedure was validated through analysis of (19)F T2 relaxation data of 6-fluoro-DL-tryptophan in a two-compartment solution with and without bovine serum albumin. Given the regularized NNLS analysis of a T2 relaxation curve acquired, for example, at the CPMG frequency υ CPMG = 125, the nature of two distinct peaks in the associated T2 distribution spectrum indicated 6-fluoro-DL-tryptophan either retaining the free state, with geometric mean */multiplicative standard deviation (MSD) = 1851.2 ms */1.51, or undergoing free/albumin-bound interconversion, with geometric mean */MSD = 236.8 ms */1.54, in the two-compartment system. Quantities of the individual tryptophan species were accurately reflected by the associated T2 peak areas, with an interconversion state-to-free state ratio of 0.45 ± 0.11. Furthermore, the CPMG relaxation dispersion analysis estimated the exchange rate between the free and albumin-bound states in this fluorinated tryptophan analog and the corresponding dissociation constant of the fluorinated tryptophan-albumin complex in the chemical-exchanging, two-compartment system.
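As a rough illustration of the regularized NNLS step described above (a sketch under simplifying assumptions, not the authors' implementation), a multiexponential T2 decay can be decomposed over a log-spaced grid of candidate T2 values, with Tikhonov regularization applied by augmenting the design matrix; the synthetic T2 values and amplitudes below are illustrative:

```python
import numpy as np
from scipy.optimize import nnls

# Synthetic two-compartment decay loosely mimicking the geometric means above
# (~1850 ms free and ~240 ms exchanging); values are illustrative only.
te = np.arange(10, 4000, 10.0)                   # echo times, ms
signal = 0.7 * np.exp(-te / 1850) + 0.3 * np.exp(-te / 240)
signal += np.random.default_rng(0).normal(0, 0.002, te.size)

t2_grid = np.logspace(1, 4, 120)                 # candidate T2 values, ms
A = np.exp(-te[:, None] / t2_grid[None, :])      # dictionary of monoexponentials

lam = 0.1                                        # regularization weight (tunable)
A_reg = np.vstack([A, lam * np.eye(t2_grid.size)])
y_reg = np.concatenate([signal, np.zeros(t2_grid.size)])

amplitudes, _ = nnls(A_reg, y_reg)               # nonnegative T2 spectrum
peaks = t2_grid[amplitudes > 0.01]
print("T2 components (ms):", np.round(peaks, 1))
```

The two clusters of nonzero amplitudes recovered by the fit play the role of the two peaks in the T2 distribution spectrum discussed in the abstract.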
Kaese, Sven; Bögeholz, Nils; Pauls, Paul; Dechering, Dirk; Olligs, Jan; Kölker, Katharina; Badawi, Sascha; Frommeyer, Gerrit; Pott, Christian; Eckardt, Lars
2017-08-01
The cardiac sodium/calcium (Na+/Ca2+) exchanger (NCX) contributes to diastolic depolarization in cardiac pacemaker cells. Increased NCX activity has been found in heart failure and atrial fibrillation. The influence of increased NCX activity on resting heart rate, beta-adrenergic-mediated increase in heart rate, and cardiac conduction properties is unknown. The purpose of this study was to investigate the influence of NCX overexpression in a homozygous transgenic whole-heart mouse model (NCX-OE) on sinus and AV nodal function. Langendorff-perfused, beating whole hearts of NCX-OE and the corresponding wild-type (WT) were studied ± isoproterenol (ISO; 0.2 μM). Epicardial ECG, AV nodal Wenckebach cycle length (AVN-WCL), and retrograde AVN-WCL were obtained. At baseline, basal heart rate was unaltered between NCX-OE and WT (WT: cycle length [CL] 177.6 ± 40.0 ms, no. of hearts [n] = 20; NCX-OE: CL 185.9 ± 30.5 ms, n = 18; P = .21). In the presence of ISO, NCX-OE exhibited a significantly higher heart rate compared to WT (WT: CL 133.4 ± 13.4 ms, n = 20; NCX-OE: CL 117.7 ± 14.2 ms, n = 18; P <.001). ISO led to a significant shortening of the anterograde and retrograde AVN-WCL without differences between NCX-OE and WT. This study is the first to demonstrate that increased NCX activity enhances the beta-adrenergic increase of heart rate. Mechanistically, increased NCX inward-mode activity may promote acceleration of diastolic depolarization in sinus nodal pacemaker cells, thus enhancing chronotropy in NCX-OE. These findings suggest a novel potential therapeutic target for heart rate control in the presence of increased NCX activity, such as heart failure. Copyright © 2017 Heart Rhythm Society. Published by Elsevier Inc. All rights reserved.
Awan, Muaaz Gul; Saeed, Fahad
2016-05-15
Modern proteomics studies utilize high-throughput mass spectrometers which can produce data at an astonishing rate. These big mass spectrometry (MS) datasets can easily reach peta-scale level, creating storage and analytic problems for large-scale systems biology studies. Each spectrum consists of thousands of peaks which have to be processed to deduce the peptide. However, only a small percentage of peaks in a spectrum are useful for peptide deduction, as most of the peaks are either noise or not useful for a given spectrum. This redundant processing of non-useful peaks is a bottleneck for streaming high-throughput processing of big MS data. One way to reduce the amount of computation required in a high-throughput environment is to eliminate non-useful peaks. Existing noise-removal algorithms are limited in their data-reduction capability and are compute intensive, making them unsuitable for big data and high-throughput environments. In this paper we introduce a novel low-complexity technique based on classification, quantization and sampling of MS peaks. We present a novel data-reductive strategy for analysis of big MS data. Our algorithm, called MS-REDUCE, is capable of eliminating noisy peaks as well as peaks that do not contribute to peptide deduction before any peptide deduction is attempted. Our experiments have shown up to 100× speedup over existing state-of-the-art noise elimination algorithms while maintaining comparable high-quality matches. Using our approach we were able to process a million spectra in just under an hour on a moderate server. The developed tool and strategy have been made available to the wider proteomics and parallel computing community; the code can be found at https://github.com/pcdslab/MSREDUCE. Contact: fahad.saeed@wmich.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
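The classification/quantization/sampling idea can be illustrated with a simplified sketch (not the published MS-REDUCE code): peak intensities are quantized into classes and each class is subsampled, so low-information peaks are dropped before any peptide deduction is attempted. Class count and sampling fractions below are arbitrary placeholders.

```python
import random

def reduce_spectrum(peaks, n_classes=4, keep_fraction=(0.05, 0.1, 0.3, 1.0)):
    """Quantize peaks into intensity classes and sample each class.

    `peaks` is a list of (m/z, intensity) tuples; lower-intensity classes are
    sampled more aggressively than the most intense class. This is a simplified
    illustration of the general idea, not the published algorithm.
    """
    if not peaks:
        return []
    ranked = sorted(peaks, key=lambda p: p[1])            # ascending intensity
    class_size = max(1, len(ranked) // n_classes)
    reduced = []
    for c in range(n_classes):
        chunk = ranked[c * class_size:] if c == n_classes - 1 \
            else ranked[c * class_size:(c + 1) * class_size]
        if not chunk:
            continue
        k = max(1, int(len(chunk) * keep_fraction[c]))
        reduced.extend(random.sample(chunk, k))
    return sorted(reduced)                                 # back to m/z order

spectrum = [(100 + i * 0.7, random.expovariate(1.0)) for i in range(2000)]
print(len(reduce_spectrum(spectrum)))                      # far fewer peaks remain
```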
Demonstration of Regenerable, Large-scale Ion Exchange System Using WBA Resin in Rialto, CA
2008-03-01
Saturation Index MCL – Maximum Contaminant Level NaOH – Sodium hydroxide NDBA – N-nitrosodi-n-butylamine NDEA – N-nitrosodiethylamine NDMA ...analyzed using EPA Method 521. NDMA was 2.6 ppt with a detection limit of 2 ppt. All other nitrosamines analyzed (including NDEA, NDBA, NDPA, NMEA...using IC/MS/MS. Nitrosamines were analyzed using EPA Method 521. NDMA was 2.6 ppt with a detection limit of 2 ppt. All other nitrosamines
Demonstration of Regenerable, Large-Scale Ion Exchange System Using WBA Resin in Rialto, CA
2008-03-05
NDMA – N-nitrosodimethylamine NDPA – N-nitrosodi-n-propylamine NMEA – N-nitrosomethylethylamine NMOR – N-nitrosomorpholine NPDES – National Pollutant...were analyzed using EPA Method 521. NDMA was 2.6 ppt with a detection limit of 2 ppt. All other nitrosamines analyzed (including NDEA, NDBA, NDPA...ppb) using IC/MS/MS. Nitrosamines were analyzed using EPA Method 521. NDMA was 2.6 ppt with a detection limit of 2 ppt. All other nitrosamines
Xue, Lu; Lin, Lin; Zhou, Wenbin; Chen, Wendong; Tang, Jun; Sun, Xiujie; Huang, Peiwu; Tian, Ruijun
2018-06-09
Plasma proteome profiling by LC-MS based proteomics has drawn great attention recently for biomarker discovery from blood liquid biopsy. Because standard multi-step sample preparation can potentially cause plasma protein degradation and analysis variation, integrated proteomics sample preparation technologies have become a promising solution. Here, we developed a fully integrated proteomics sample preparation technology for both fast and deep plasma proteome profiling under its native pH. All the sample preparation steps, including protein digestion and two-dimensional fractionation by both mixed-mode ion exchange and high-pH reversed-phase mechanisms, were integrated into one spintip device for the first time. The mixed-mode ion-exchange bead design achieved sample loading at neutral pH and protein digestion within 30 min. Potential sample loss and protein degradation caused by pH changes could thereby be avoided. 1 μL of plasma sample, with depletion of high-abundance proteins, was processed by the developed technology into 12 equally distributed fractions and analyzed with 12 h of LC-MS gradient time, resulting in the identification of 862 proteins. The combination of the Mixed-mode-SISPROT and a data-independent MS method achieved fast plasma proteome profiling in 2 h with high identification overlap and quantification precision in a proof-of-concept study of plasma samples from 5 healthy donors. We expect that the Mixed-mode-SISPROT will become a generally applicable sample preparation technology for clinically oriented plasma proteome profiling. Copyright © 2018 Elsevier B.V. All rights reserved.
Design and implementation of streaming media server cluster based on FFMpeg.
Zhao, Hong; Zhou, Chun-long; Jin, Bao-zhao
2015-01-01
Poor performance and network congestion are commonly observed in the streaming media single server system. This paper proposes a scheme to construct a streaming media server cluster system based on FFMpeg. In this scheme, different users are distributed to different servers according to their locations and the balance among servers is maintained by the dynamic load-balancing algorithm based on active feedback. Furthermore, a service redirection algorithm is proposed to improve the transmission efficiency of streaming media data. The experiment results show that the server cluster system has significantly alleviated the network congestion and improved the performance in comparison with the single server system.
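A minimal sketch of the dynamic load-balancing idea is given below (illustrative only; the paper's algorithm also includes service redirection and its own feedback metric). Each server periodically reports a load value, and new sessions are directed to the least-loaded server in the client's region; names and regions are placeholders.

```python
from dataclasses import dataclass

@dataclass
class StreamingServer:
    name: str
    region: str
    load: float = 0.0          # updated by periodic feedback reports (0..1)

def report_load(server: StreamingServer, load: float) -> None:
    """Active feedback: the server pushes its current load to the balancer."""
    server.load = load

def pick_server(servers, client_region: str) -> StreamingServer:
    """Prefer servers in the client's region, then choose the least loaded."""
    local = [s for s in servers if s.region == client_region] or servers
    return min(local, key=lambda s: s.load)

cluster = [StreamingServer("edge-1", "eu"), StreamingServer("edge-2", "eu"),
           StreamingServer("edge-3", "us")]
report_load(cluster[0], 0.82)
report_load(cluster[1], 0.35)
print(pick_server(cluster, "eu").name)   # edge-2
```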
Rellán-Alvarez, Rubén; Abadía, Javier; Alvarez-Fernández, Ana
2008-05-01
Nicotianamine (NA) is considered as a key element in plant metal homeostasis. This non-proteinogenic amino acid has an optimal structure for chelation of metal ions, with six functional groups that allow octahedral coordination. The ability to chelate metals by NA is largely dependent on the pK of the resulting complex and the pH of the solution, with most metals being chelated at neutral or basic pH values. In silico calculations using pKa and pK values have predicted the occurrence of metal-NA complexes in plant fluids, but the use of soft ionization techniques (e.g. electrospray), together with high-resolution mass spectrometers (e.g. time-of-flight mass detector), can offer direct and metal-specific information on the speciation of NA in solution. We have used direct infusion electrospray ionization mass spectrometry (time-of-flight) ESI-MS(TOF) to study the complexation of Mn, Fe(II), Fe(III), Ni, Cu by NA. The pH dependence of the metal-NA complexes in ESI-MS was compared to that predicted in silico. Possible exchange reactions that may occur between Fe-NA and other metal micronutrients as Zn and Cu, as well as between Fe-NA and citrate, another possible Fe ligand candidate in plants, were studied at pH 5.5 and 7.5, values typical of the plant xylem and phloem saps. Metal-NA complexes were generally observed in the ESI-MS experiments at a pH value approximately 1-2 units lower than that predicted in silico, and this difference could be only partially explained by the estimated error, approximately 0.3 pH units, associated with measuring pH in organic solvent-containing solutions. Iron-NA complexes are less likely to participate in ligand- and metal-exchange reactions at pH 7.5 than at pH 5.5. Results support that NA may be the ligand chelating Fe at pH values usually found in phloem sap, whereas in the xylem sap NA is not likely to be involved in Fe transport, conversely to what occurs with other metals such as Cu and Ni. Some considerations that need to be addressed when studying metal complexes in plant compartments by ESI-MS are also discussed.
The MaxQuant computational platform for mass spectrometry-based shotgun proteomics.
Tyanova, Stefka; Temu, Tikira; Cox, Juergen
2016-12-01
MaxQuant is one of the most frequently used platforms for mass-spectrometry (MS)-based proteomics data analysis. Since its first release in 2008, it has grown substantially in functionality and can be used in conjunction with more MS platforms. Here we present an updated protocol covering the most important basic computational workflows, including those designed for quantitative label-free proteomics, MS1-level labeling and isobaric labeling techniques. This protocol presents a complete description of the parameters used in MaxQuant, as well as of the configuration options of its integrated search engine, Andromeda. This protocol update describes an adaptation of an existing protocol that substantially modifies the technique. Important concepts of shotgun proteomics and their implementation in MaxQuant are briefly reviewed, including different quantification strategies and the control of false-discovery rates (FDRs), as well as the analysis of post-translational modifications (PTMs). The MaxQuant output tables, which contain information about quantification of proteins and PTMs, are explained in detail. Furthermore, we provide a short version of the workflow that is applicable to data sets with simple and standard experimental designs. The MaxQuant algorithms are efficiently parallelized on multiple processors and scale well from desktop computers to servers with many cores. The software is written in C# and is freely available at http://www.maxquant.org.
Analysis of data throughput in communication between PLCs and HMI/SCADA systems
NASA Astrophysics Data System (ADS)
Mikolajek, Martin; Koziorek, Jiri
2016-09-01
This paper is focused on the analysis of data throughput in communication between PLCs and HMI/SCADA systems. The first part of the paper discusses the basic problems of communication between PLC and HMI systems. The next part deals with specific types of PLC-HMI communication requests; for these cases the paper discusses response times and data throughput [1-3]. A subsequent section contains a practical part with various data exchanges between Siemens PLCs and HMIs. The communication options described in this article focus on using an OPC server for visualization software, a custom HMI system, and an own application created using .NET technology.
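As an illustration of one of the data paths mentioned (PLC values consumed through an OPC server), the sketch below reads a single tag with the open-source python-opcua client. The endpoint URL, node identifier, and the library choice itself are assumptions for illustration, not the setup used in the paper.

```python
from opcua import Client  # python-opcua, assumed available

client = Client("opc.tcp://192.168.0.10:4840")   # hypothetical OPC UA endpoint
client.connect()
try:
    # Hypothetical node identifier exposed by the OPC server for a PLC tag.
    node = client.get_node("ns=2;s=Machine.Temperature")
    print("current value:", node.get_value())
finally:
    client.disconnect()
```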
WebGLORE: a Web service for Grid LOgistic REgression
Jiang, Wenchao; Li, Pinghao; Wang, Shuang; Wu, Yuan; Xue, Meng; Ohno-Machado, Lucila; Jiang, Xiaoqian
2013-01-01
WebGLORE is a free web service that enables privacy-preserving construction of a global logistic regression model from distributed datasets that are sensitive. It only transfers aggregated local statistics (from participants) through Hypertext Transfer Protocol Secure to a trusted server, where the global model is synthesized. WebGLORE seamlessly integrates AJAX, JAVA Applet/Servlet and PHP technologies to provide an easy-to-use web service for biomedical researchers to break down policy barriers during information exchange. Availability and implementation: http://dbmi-engine.ucsd.edu/webglore3/. WebGLORE can be used under the terms of GNU general public license as published by the Free Software Foundation. Contact: x1jiang@ucsd.edu PMID:24072732
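To illustrate the Grid LOgistic REgression idea that WebGLORE wraps (a sketch under simplifying assumptions, not the service's actual code), each site shares only its local gradient and Hessian contributions for the current coefficient vector, and the server combines them in a Newton-Raphson update. All data, site counts, and coefficients below are synthetic.

```python
import numpy as np

def local_stats(X, y, beta):
    """Aggregated statistics a site would share: gradient and Hessian of the
    logistic log-likelihood (no raw records leave the site)."""
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    grad = X.T @ (y - p)
    hess = -(X.T * (p * (1 - p))) @ X
    return grad, hess

def federated_fit(sites, n_features, iters=25):
    """Server-side Newton-Raphson over summed site contributions."""
    beta = np.zeros(n_features)
    for _ in range(iters):
        grads, hessians = zip(*(local_stats(X, y, beta) for X, y in sites))
        beta = beta - np.linalg.solve(sum(hessians), sum(grads))
    return beta

rng = np.random.default_rng(1)
def make_site(n):                      # synthetic site data, intercept included
    X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
    p = 1.0 / (1.0 + np.exp(-(0.5 + X[:, 1] - 0.8 * X[:, 2])))
    return X, (rng.random(n) < p).astype(float)

print(federated_fit([make_site(400), make_site(300), make_site(500)], 3))
```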
Hanneman, Andrew J S; Strand, James; Huang, Chi-Ting
2014-02-01
Glycosylation is a critical parameter used to evaluate protein quality and consistency. N-linked glycan profiling is fundamental to the support of biotherapeutic protein manufacturing from early stage process development through drug product commercialization. Sialylated glycans impact the serum half-life of receptor-Fc fusion proteins (RFPs), making their quality and consistency a concern during the production of fusion proteins. Here, we describe an analytical approach providing both quantitative profiling and in-depth mass spectrometry (MS)-based structural characterization of sialylated RFP N-glycans. Aiming to efficiently link routine comparability studies with detailed structural characterization, an integrated workflow was implemented employing fluorescence detection, online positive and negative ion tandem mass spectrometry (MS/MS), and offline static nanospray ionization-sequential mass spectrometry (NSI-MS(n)). For routine use, high-performance liquid chromatography profiling employs established fluorescence detection of 2-aminobenzoic acid derivatives (2AA) and hydrophilic interaction anion-exchange chromatography (HIAX) charge class separation. Further characterization of HIAX peak fractions is achieved by online (-) ion orbitrap MS/MS, offering the advantages of high mass accuracy and data-dependent MS/MS. As required, additional characterization uses porous graphitized carbon in the second chromatographic dimension to provide orthogonal (+) ion MS/MS spectra and buffer-free liquid chromatography peak eluants that are optimum for offline (+)/(-) NSI-MS(n) investigations to characterize low-abundance species and specific moieties including O-acetylation and sulfation. © 2013 Wiley Periodicals, Inc. and the American Pharmacists Association.
ERIC Educational Resources Information Center
de Miranda, John
The field of alcohol server awareness and training has grown dramatically in the past several years and the idea of training servers to reduce alcohol problems has become a central fixture in the current alcohol policy debate. The San Mateo County, California Server Information Program (SIP) is a community-based prevention strategy designed to…
The use of geospatial web services for exchanging utilities data
NASA Astrophysics Data System (ADS)
Kuczyńska, Joanna
2013-04-01
Geographic information technologies and related geo-information systems currently play an important role in the management of public administration in Poland. One of these tasks is to maintain and update the Geodetic Evidence of Public Utilities (GESUT), part of the National Geodetic and Cartographic Resource, which contains information on technical infrastructure that is important for many institutions. It requires an active exchange of data between the Geodesy and Cartography Documentation Centers and the institutions which administer transmission lines. The administrator of public utilities is legally obliged to provide information about utilities to GESUT. The aim of the research work was to develop a universal data exchange methodology which can be implemented on a variety of hardware and software platforms. This methodology uses the Unified Modeling Language (UML), the eXtensible Markup Language (XML), and the Geography Markup Language (GML). The proposed methodology is based on two different strategies: Model Driven Architecture (MDA) and Service Oriented Architecture (SOA). The solutions used are consistent with the INSPIRE Directive and the ISO 19100 series of standards for geographic information. On the basis of an analysis of the input data structures, conceptual models were built for both databases. The models were written in the universal modeling language UML. A combined model that defines a common data structure was also built. This model was transformed into the GML standard developed for the exchange of geographic information. The structure of the document describing the data that may be exchanged is defined in an .xsd file. Network services were selected and implemented in the system designed for data exchange based on open source tools. The methodology was implemented and tested. Data in the agreed data structure, together with metadata, were set up on the server. Data access was provided by geospatial network services: data searching by Catalog Service for the Web (CSW) and data collection by Web Feature Service (WFS). WFS also provides operations for modifying data, for example updates by the utility administrator. The proposed solution significantly increases the efficiency of data exchange and facilitates maintenance of the National Geodetic and Cartographic Resource.
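To make the service layer concrete, the sketch below shows how a client might pull utility features over WFS using standard OGC key-value parameters; the endpoint URL, feature type name, and output format are hypothetical placeholders, not the system described in the paper.

```python
import requests  # assumed available

params = {
    "service": "WFS",
    "version": "2.0.0",
    "request": "GetFeature",
    "typeNames": "gesut:utility_line",                 # placeholder feature type
    "outputFormat": "application/gml+xml; version=3.2",  # placeholder GML format
    "count": 100,
}
response = requests.get("https://example.org/geoserver/wfs", params=params, timeout=30)
response.raise_for_status()
print(response.text[:500])   # beginning of the GML payload describing utility features
```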
Analysis of Translocation-Competent Secretory Proteins by HDX-MS.
Tsirigotaki, A; Papanastasiou, M; Trelle, M B; Jørgensen, T J D; Economou, A
2017-01-01
Protein folding is an intricate and precise process in living cells. Most exported proteins evade cytoplasmic folding, become targeted to the membrane, and then trafficked into/across membranes. Their targeting and translocation-competent states are nonnatively folded. However, once they reach the appropriate cellular compartment, they can fold to their native states. The nonnative states of preproteins remain structurally poorly characterized since increased disorder, protein sizes, aggregation propensity, and the observation timescale are often limiting factors for typical structural approaches such as X-ray crystallography and NMR. Here, we present an alternative approach for the in vitro analysis of nonfolded translocation-competent protein states and their comparison with their native states. We make use of hydrogen/deuterium exchange coupled with mass spectrometry (HDX-MS), a method based on differentiated isotope exchange rates in structured vs unstructured protein states/regions, and highly dynamic vs more rigid regions. We present a complete structural characterization pipeline, starting from the preparation of the polypeptides to data analysis and interpretation. Proteolysis and mass spectrometric conditions for the analysis of the labeled proteins are discussed, followed by the analysis and interpretation of HDX-MS data. We highlight the suitability of HDX-MS for identifying short structured regions within otherwise highly flexible protein states, as illustrated by an exported protein example, experimentally tested in our lab. Finally, we discuss statistical analysis in comparative HDX-MS. The protocol is applicable to any protein and protein size, exhibiting slow or fast loss of translocation competence. It could be easily adapted to more complex assemblies, such as the interaction of chaperones with nonnative protein states. © 2017 Elsevier Inc. All rights reserved.
Su, Chong; Sun, Hui; Yang, Hong; Yin, Lei; Zhang, Jiwen; Fawcett, John Paul; Gu, Jingkai
2017-11-01
Porcine relaxin is a 6 kDa peptide hormone of pregnancy with important physiological and pharmacological effects. It contains a number of analogs of which porcine relaxin B29 is one of the most important. To support the development of porcine relaxin B29 as a new drug, we established an UPLC-MS/MS method for its quantitation in dog plasma. Sample preparation by protein precipitation and ion exchange solid phase extraction was followed by UPLC on an XBridge™ BEH300 C18 column at 40 °C in a run time of only 5.5 min. Detection was performed on a Qtrap 6500 mass spectrometer using ESI in the positive ion mode with MRM of the transitions at m/z 831.7 [M+7H]7+ → 505.4 and m/z 1162.4 [M+5H]5+ → 226 for pRLX B29 and the internal standard (recombinant human insulin), respectively. The method was linear over the concentration range 30-2000 ng/mL with no matrix effects. Intra- and inter-day precisions were < 15% with accuracies in the range 98.8-100.6%. The method was successfully applied to a pharmacokinetic study in beagle dogs after administration of a 0.15 mg/kg intravenous dose. Graphical abstract: sample preparation and detection procedure.
Brumfield, Brian E.; Taubman, Matthew S.; Phillips, Mark C.
2016-05-23
A rapidly-swept external-cavity quantum cascade laser with an open-path Herriott cell is used to quantify gas-phase chemical mixtures of D2O and HDO at a rate of 40 Hz (25-ms measurement time). The chemical mixtures were generated by evaporating D2O liquid near the open-path Herriott cell, allowing the H/D exchange reaction with ambient H2O to produce HDO. Fluctuations in the ratio of D2O and HDO on timescales of <1 s due to the combined effects of plume transport and the H/D exchange chemical reaction are observed. Noise-equivalent concentrations (1σ) (NEC) of 147.0 ppbv and 151.6 ppbv in a 25-ms measurement time are determined for D2O and HDO, respectively, with a 127-m optical path. These NECs are improved to 23.0 and 24.0 ppbv with a 1-s averaging time for D2O and HDO, respectively. NECs <200 ppbv are also estimated for N2O, 1,1,1,2-tetrafluoroethane (F134A), CH4, acetone and SO2 for a 25-ms measurement time. Finally, the isotopic precision for measurement of the [D2O]/[HDO] concentration ratio of 33‰ and 5‰ is calculated for the current experimental conditions for measurement times of 25 ms and 1 s, respectively.
Gershon, P D
2010-12-15
Two tools are described for integrating LC elution position with mass-based data in hydrogen-deuterium exchange (HDX) experiments by nano-liquid chromatography/matrix-assisted laser desorption/ionization mass spectrometry (nanoLC/MALDI-MS, a novel approach to HDX-MS). The first of these, 'TOF2H-Z Comparator', highlights peptides in HDX experiments that are potentially misidentified on the basis of mass alone. The program first calculates normalized values for the organic solvent concentration responsible for the elution of ions in nanoLC/MALDI HDX experiments. It then allows the solvent gradients for the multiple experiments contributing to an MS/MS-confirmed peptic peptide library to be brought into mutual alignment by iteratively re-modeling variables among LC parameters such as gradient shape, solvent species, fraction duration and LC dead time. Finally, using the program, high-probability chromatographic outliers can be flagged within HDX experimental data. The role of the second tool, 'TOF2H-XIC Comparator', is to normalize the LC chromatograms corresponding to all deuteration timepoints of all HDX experiments of a project, to a common reference. Accurate normalization facilitates the verification of chromatographic consistency between all ions whose spectral segments contribute to particular deuterium uptake plots. Gradient normalization in this manner revealed chromatographic inconsistencies between ions whose masses were either indistinguishable or separated by precise isotopic increments. Copyright © 2010 John Wiley & Sons, Ltd.
Garcia-Sartal, Cristina; Taebunpakul, Sutthinun; Stokes, Emma; Barciela-Alonso, María del Carmen; Bermejo-Barrera, Pilar; Goenaga-Infante, Heidi
2012-04-01
Edible seaweed consumption is a route of exposure to arsenic. However, little attention has been paid to estimating the bioaccessibility and/or bioavailability of arsenosugars in edible seaweed and their possible degradation products during gastrointestinal digestion. This work presents the first use of inductively coupled plasma mass spectrometry (ICP-MS) combined with electrospray ionization tandem mass spectrometry (ESI-MS/MS) and two-dimensional HPLC (size exclusion followed by anion exchange) to compare the qualitative and quantitative arsenosugar speciation of different edible seaweeds with that of their bioavailable fraction as obtained using an in vitro gastrointestinal digestion procedure. Optimal extraction conditions for As species from four seaweeds, namely kombu, wakame, nori and sea lettuce, were selected as a compromise between As extraction efficiency and preservation of compound identity. For most investigated samples, the use of ammonium acetate buffer as extractant and 1 h sonication in a water bath followed by HPLC-ICP-MS resulted in 40-61% of the total As being found in the buffered aqueous extract, of which 86-110% was present as arsenosugars (glycerol sugar, phosphate sugar and sulfonate sugar for wakame and kombu, and glycerol sugar and phosphate sugar for nori). The exception was sea lettuce, for which the arsenosugar fraction (glycerol sugar, phosphate sugar) comprised only 44% of the total extracted As. Interestingly, the ratio of arsenobetaine and dimethylarsinic acid to arsenosugars in sea lettuce extracts seemed higher than that for the rest of the investigated samples. After in vitro gastrointestinal digestion, approximately 11-16% of the total As in the solid sample was found in the dialyzates, with arsenosugars comprising 93-120% of the dialyzable As fraction for kombu, wakame and nori, and 41% for sea lettuce. Moreover, the relative As species distribution in seaweed-buffered extracts and dialyzates was found to be very similar. Collection of specific fractions from the size-exclusion column to be analysed using anion-exchange HPLC-ESI-MS/MS enabled improved chromatographic selectivity, particularly for the less retained arsenosugar (glycerol sugar), facilitating confirmation of the presence of arsenosugars in seaweed extracts and dialyzates. Using this approach, the presence of arsenobetaine in sea lettuce samples was also confirmed.
Analysis of practical backoff protocols for contention resolution with multiple servers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldberg, L.A.; MacKenzie, P.D.
Backoff protocols are probably the most widely used protocols for contention resolution in multiple access channels. In this paper, we analyze the stochastic behavior of backoff protocols for contention resolution among a set of clients and servers, each server being a multiple access channel that deals with contention like an Ethernet channel. We use the standard model in which each client generates requests for a given server according to a Bernoulli distribution with a specified mean. The client-server request rate of a system is the maximum over all client-server pairs (i, j) of the sum of all request rates associated with either client i or server j. Our main result is that any superlinear polynomial backoff protocol is stable for any multiple-server system with a sub-unit client-server request rate. We confirm the practical relevance of our result by demonstrating experimentally that the average waiting time of requests is very small when such a system is run with reasonably few clients and reasonably small request rates such as those that occur in actual Ethernets. Our result is the first proof of stability for any backoff protocol for contention resolution with multiple servers. Our result is also the first proof that any weakly acknowledgment based protocol is stable for contention resolution with multiple servers and such high request rates. Two special cases of our result are of interest. Hastad, Leighton and Rogoff have shown that for a single-server system with a sub-unit client-server request rate any modified superlinear polynomial backoff protocol is stable. These modified backoff protocols are similar to standard backoff protocols but require more random bits to implement. The special case of our result in which there is only one server extends the result of Hastad, Leighton and Rogoff to standard (practical) backoff protocols. Finally, our result applies to dynamic routing in optical networks.
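A minimal sketch of the protocol family analyzed (superlinear polynomial backoff), under illustrative parameters: after its i-th consecutive collision, a client waits a slot chosen uniformly from a window that grows polynomially in i, with exponent greater than one.

```python
import random

def polynomial_backoff_delay(collisions: int, exponent: float = 1.5) -> int:
    """Pick a random slot in a window that grows as (collisions + 1)**exponent.
    An exponent > 1 gives the superlinear polynomial class analyzed above;
    the exponent value here is an illustrative choice."""
    window = max(1, int(round((collisions + 1) ** exponent)))
    return random.randint(0, window - 1)

# Expected window growth after successive collisions (exponent 1.5):
for c in range(1, 6):
    print(c, "collisions -> window size", max(1, int(round((c + 1) ** 1.5))))
```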
Battle Management Language Transformations
2006-10-01
Simulation (M&S) systems. Battlefield Management Language (BML) is being developed as a common representation of military mission suitable for automated ... processing . Within NATO the task group MSG-048 Coalition BML is defining a BML using the Joint Command, Control and Consultation Information Exchange
Naver: a PC-cluster-based VR system
NASA Astrophysics Data System (ADS)
Park, ChangHoon; Ko, HeeDong; Kim, TaiYun
2003-04-01
In this paper, we present a new framework, NAVER, for virtual reality applications. NAVER is based on a cluster of low-cost personal computers. The goal of NAVER is to provide a flexible, extensible, scalable and re-configurable framework for virtual environments, defined as the integration of a 3D virtual space with external modules. External modules are various input or output devices and applications on remote hosts. From the system's point of view, the personal computers are divided into three servers according to their specific functions: Render Server, Device Server and Control Server. While the Device Server contains external modules requiring event-based communication for the integration, the Control Server contains external modules requiring synchronous communication every frame. The Render Server consists of five managers: Scenario Manager, Event Manager, Command Manager, Interaction Manager and Sync Manager. These managers support the declaration and operation of the virtual environment and the integration with external modules on remote servers.
NASA Astrophysics Data System (ADS)
Antony, Joby; Mathuria, D. S.; Chaudhary, Anup; Datta, T. S.; Maity, T.
2017-02-01
Cryogenic networks for linear accelerator operations demand a large number of cryogenic sensors, associated instruments and other control instrumentation to measure, monitor and control different cryogenic parameters remotely. Here we describe an alternative approach of six types of newly designed integrated intelligent cryogenic instruments, called device-servers, each of which combines the sensor-specific front-end analog instrumentation with a common digital back-end HTTP server, giving a crateless, PLC-free model of controls and data acquisition. These instruments, each sensor-specific, viz. LHe server, LN2 server, control-output server, pressure server, vacuum server and temperature server, are fully deployed over a LAN for the cryogenic operations of the IUAC linac (Inter University Accelerator Centre linear accelerator), New Delhi. This indigenous design offers salient features such as global connectivity, low cost due to the crateless model, easy signal processing due to the integrated design, less cabling and direct device interconnectivity.
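A minimal sketch of the device-server idea, i.e. an instrument exposing its readings through an embedded HTTP server, is given below; the sensor read-out, URL path, and port are placeholders and not the IUAC implementation.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def read_sensor():
    """Placeholder for the instrument's front-end read-out (e.g. an LHe level)."""
    return {"channel": "LHe-level", "value": 72.4, "unit": "%"}

class DeviceServerHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/reading":                     # hypothetical endpoint
            payload = json.dumps(read_sensor()).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(payload)
        else:
            self.send_error(404)

if __name__ == "__main__":
    # Any LAN client (browser, logger, control GUI) can poll http://<host>:8080/reading
    HTTPServer(("0.0.0.0", 8080), DeviceServerHandler).serve_forever()
```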
Twin-tailed fail-over for fileservers maintaining full performance in the presence of a failure
Coteus, Paul W.; Gara, Alan G.; Giampapa, Mark E.; Heidelberger, Philip; Steinmacher-Burow, Burkhard D.
2008-02-12
A method for maintaining full performance of a file system in the presence of a failure is provided. The file system has N storage devices, where N is an integer greater than zero, and N primary file servers, each operatively connected to a corresponding storage device for accessing files therein. The file system further has a secondary file server operatively connected to at least one of the N storage devices. The method includes: switching the connection of one of the N storage devices to the secondary file server upon a failure of one of the N primary file servers; and switching the connections of one or more of the remaining storage devices to a primary file server other than the failed file server as necessary so as to prevent a loss in performance and to provide each storage device with an operating file server.
Experimental parametric study of servers cooling management in data centers buildings
NASA Astrophysics Data System (ADS)
Nada, S. A.; Elfeky, K. E.; Attia, Ali M. A.; Alshaer, W. G.
2017-06-01
A parametric study of air flow and cooling management of data center servers is experimentally conducted for different design conditions. A physical scale model of a data center accommodating one rack of four servers was designed and constructed for testing purposes. Front and rear rack and server temperature distributions and the supply/return heat indices (SHI/RHI) are used to evaluate data center thermal performance. Experiments were conducted to parametrically study the effects of the perforated tile opening ratio, server power load variation and rack power density. The results showed that (1) a perforated tile with a 25% opening ratio provides the best results among the opening ratios tested, (2) the optimum benefit of cold air in server cooling is obtained with uniform power loading of the servers, and (3) increasing power density decreases air re-circulation but increases air bypass and server temperatures. The present results are compared with previous experimental and CFD results and fair agreement was found.
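For context, one common temperature-based form of the supply and return heat indices is sketched below; both the definition used and the sample temperatures are illustrative assumptions, not the paper's measured data or its exact formulation.

```python
def supply_heat_index(t_rack_in: float, t_rack_out: float, t_supply: float) -> float:
    """One common temperature-based form: SHI = (Tin - Tsup) / (Tout - Tsup).
    Values near 0 suggest little hot-air recirculation into the cold aisle."""
    return (t_rack_in - t_supply) / (t_rack_out - t_supply)

def return_heat_index(t_rack_in: float, t_rack_out: float, t_supply: float) -> float:
    """RHI = 1 - SHI; values near 1 suggest little bypass/recirculation mixing."""
    return 1.0 - supply_heat_index(t_rack_in, t_rack_out, t_supply)

# Illustrative temperatures (deg C): supply air 18, rack inlet 22, rack outlet 35.
print(round(supply_heat_index(22, 35, 18), 2))  # 0.24
print(round(return_heat_index(22, 35, 18), 2))  # 0.76
```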
Experience with Adaptive Security Policies.
1998-03-01
3.1 Introduction; 3.2 Logical groupings of audited permission checks; 3.3 Auditing of system servers via microkernel snooping; 3.4...performed by servers other than the microkernel. Since altering each server to audit events would complicate the integration of new servers, a...modification to the microkernel was implemented to allow the microkernel to audit the requests made of other servers. Both methods for enhancing audit
Xian, Yanping; Wu, Yuluan; Dong, Hao; Guo, Xindong; Wang, Bin; Wang, Li
2017-09-29
The present work presents a novel and rapid analytical method for the simultaneous analysis of bisphenol A (BPA), bisphenol B (BPB), bisphenol F (BPF) and bisphenol S (BPS) in edible oil, based for the first time on dispersive micro solid phase extraction (DMSPE) followed by isotope dilution-ultra high performance liquid chromatography tandem mass spectrometry (UPLC-MS/MS). The edible oil sample was dispersed in n-hexane and extracted with an ammoniated methanol-water solution. The target analytes were then dispersedly adsorbed using a polymer anion exchange (PAX) sorbent and eluted with acidic methanol. After that, the four bisphenols were separated on a C18 column by gradient elution with methanol and 0.05% ammonium hydroxide in water as the mobile phase, detected by MS/MS under multiple reaction monitoring (MRM) mode, and quantified by the internal standard method. The PAX amount, adsorption time, concentration of formic acid in the elution solvent and volume of elution solvent for the DMSPE technique were optimized. The limits of detection and quantitation (LOD and LOQ), matrix effect, recovery and precision of the developed method were investigated. Results indicated that BPS and the other three bisphenols displayed excellent linearity in the concentration ranges of 0.1-50 μg/L and 0.5-250 μg/L, respectively, with correlation coefficients (R2) all larger than 0.998. The achieved MLODs (S/N=3) varied between 0.1-0.4 μg/kg for all bisphenols. The mean recoveries at three spiked levels in edible oil were in the range of 87.3-108%. Intra-day precision (n=6) and inter-day precision (n=5) were <9% and <11%, respectively. This method features rapid and simple pretreatment, is accurate and sensitive, and is suitable for the simultaneous determination of bisphenols in edible oil. Copyright © 2017. Published by Elsevier B.V.
Opportunities for the Mashup of Heterogenous Data Server via Semantic Web Technology
NASA Astrophysics Data System (ADS)
Ritschel, Bernd; Seelus, Christoph; Neher, Günther; Iyemori, Toshihiko; Koyama, Yukinobu; Yatagai, Akiyo; Murayama, Yasuhiro; King, Todd; Hughes, John; Fung, Shing; Galkin, Ivan; Hapgood, Michael; Belehaki, Anna
2015-04-01
European Union ESPAS, Japanese IUGONET and GFZ ISDC data servers have been developed for the ingestion, archiving and distribution of geoscience and space science domain data. Most of the data managed by these servers are related to near-Earth space and geomagnetic field data. A smart mashup of the data servers would allow seamless browsing of and access to data and related context information. However, achieving a high level of interoperability is a challenge because the data servers are based on different data models and software frameworks. This paper focuses on the latest experiments and results for the mashup of the data servers using the Semantic Web approach. Besides the mashup of domain and terminological ontologies, the options for connecting data managed by relational databases using D2R server and SPARQL technology will especially be addressed. A successful realization of the data server mashup will have a positive impact not only on the data users of the specific scientific domain but also on related projects, such as the development of a new interoperable version of NASA's Planetary Data System (PDS) or ICSU's World Data System alliance. ESPAS data server: https://www.espas-fp7.eu/portal/ IUGONET data server: http://search.iugonet.org/iugonet/ GFZ ISDC data server (semantic Web based prototype): http://rz-vm30.gfz-potsdam.de/drupal-7.9/ NASA PDS: http://pds.nasa.gov ICSU-WDS: https://www.icsu-wds.org
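As an illustration of the D2R/SPARQL access path described above (a sketch with a hypothetical endpoint and vocabulary, not the projects' actual schemas), a client could query dataset metadata through the SPARQLWrapper library like this:

```python
from SPARQLWrapper import SPARQLWrapper, JSON  # assumed available

# Hypothetical D2R endpoint and Dublin Core-style vocabulary; the real
# ESPAS/IUGONET/ISDC graphs use their own models and URIs.
endpoint = SPARQLWrapper("http://example.org/d2r/sparql")
endpoint.setQuery("""
    PREFIX dct: <http://purl.org/dc/terms/>
    SELECT ?dataset ?title WHERE {
        ?dataset dct:title ?title .
        FILTER(CONTAINS(LCASE(?title), "geomagnetic"))
    } LIMIT 10
""")
endpoint.setReturnFormat(JSON)
for row in endpoint.query().convert()["results"]["bindings"]:
    print(row["dataset"]["value"], "-", row["title"]["value"])
```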
Triple-server blind quantum computation using entanglement swapping
NASA Astrophysics Data System (ADS)
Li, Qin; Chan, Wai Hong; Wu, Chunhui; Wen, Zhonghua
2014-04-01
Blind quantum computation allows a client who does not have enough quantum resources or technologies to achieve quantum computation on a remote quantum server such that the client's input, output, and algorithm remain unknown to the server. Up to now, single- and double-server blind quantum computation have been considered. In this work, we propose a triple-server blind computation protocol where the client can delegate quantum computation to three quantum servers by the use of entanglement swapping. Furthermore, the three quantum servers can communicate with each other and the client is almost classical since one does not require any quantum computational power, quantum memory, and the ability to prepare any quantum states and only needs to be capable of getting access to quantum channels.
Code of Federal Regulations, 2011 CFR
2011-10-01
... ENVIRONMENTAL, CONSERVATION, OCCUPATIONAL SAFETY, AND DRUG-FREE WORKPLACE Energy-Efficient Computer Equipment 1523.7002 Waivers. (a) There are several types of computer equipment which technically fall under the... types of equipment: (1) LAN servers, including file servers; application servers; communication servers...
Code of Federal Regulations, 2012 CFR
2012-10-01
... ENVIRONMENTAL, CONSERVATION, OCCUPATIONAL SAFETY, AND DRUG-FREE WORKPLACE Energy-Efficient Computer Equipment 1523.7002 Waivers. (a) There are several types of computer equipment which technically fall under the... types of equipment: (1) LAN servers, including file servers; application servers; communication servers...
Code of Federal Regulations, 2010 CFR
2010-10-01
... ENVIRONMENTAL, CONSERVATION, OCCUPATIONAL SAFETY, AND DRUG-FREE WORKPLACE Energy-Efficient Computer Equipment 1523.7002 Waivers. (a) There are several types of computer equipment which technically fall under the... types of equipment: (1) LAN servers, including file servers; application servers; communication servers...
Urinary Amino Acid Analysis: A Comparison of iTRAQ®-LC-MS/MS, GC-MS, and Amino Acid Analyzer
Kaspar, Hannelore; Dettmer, Katja; Chan, Queenie; Daniels, Scott; Nimkar, Subodh; Daviglus, Martha L.; Stamler, Jeremiah; Elliott, Paul; Oefner, Peter J.
2009-01-01
Urinary amino acid analysis is typically done by cation-exchange chromatography followed by post-column derivatization with ninhydrin and UV detection. This method lacks throughput and specificity. Two recently introduced stable isotope ratio mass spectrometric methods promise to overcome those shortcomings. Using two blinded sets of urine replicates and a certified amino acid standard, we compared the precision and accuracy of gas chromatography/mass spectrometry (GC-MS) and liquid chromatography-tandem mass spectrometry (LC-MS/MS) of propyl chloroformate and iTRAQ® derivatized amino acids, respectively, to conventional amino acid analysis. The GC-MS method builds on the direct derivatization of amino acids in diluted urine with propyl chloroformate, GC separation and mass spectrometric quantification of derivatives using stable isotope labeled standards. The LC-MS/MS method requires prior urinary protein precipitation followed by labeling of urinary and standard amino acids with iTRAQ® tags containing different cleavable reporter ions distinguishable by MS/MS fragmentation. Means and standard deviations of percent technical error (%TE) computed for 20 amino acids determined by amino acid analyzer, GC-MS, and iTRAQ®-LC-MS/MS analyses of 33 duplicate and triplicate urine specimens were 7.27±5.22, 21.18±10.94, and 18.34±14.67, respectively. Corresponding values for 13 amino acids determined in a second batch of 144 urine specimens measured in duplicate or triplicate were 8.39±5.35, 6.23±3.84, and 35.37±29.42. Both GC-MS and iTRAQ®-LC-MS/MS are suited for high-throughput amino acid analysis, with the former offering at present higher reproducibility and completely automated sample pretreatment, while the latter covers more amino acids and related amines. PMID:19481989
DOE Office of Scientific and Technical Information (OSTI.GOV)
Manes, Nathan P.; Estep, Ryan D.; Mottaz, Heather M.
2008-03-07
Orthopoxviruses are the largest and most complex of the animal viruses. In response to the recent emergence of monkeypox in Africa and the threat of smallpox bioterrorism, virulent (monkeypox virus) and benign (vaccinia virus) orthopoxviruses were proteomically compared with the goal of identifying proteins required for pathogenesis. Orthopoxviruses were grown in HeLa cells to two different viral forms (intracellular mature virus and extracellular enveloped virus), purified by sucrose gradient ultracentrifugation, denatured using RapiGest™ surfactant, and digested with trypsin. Unfractionated samples and strong cation exchange HPLC fractions were analyzed by reversed-phase LC-MS/MS, and analyses of the MS/MS spectra using SEQUEST® and X! Tandem resulted in the identification of hundreds of monkeypox, vaccinia, and copurified host proteins. The unfractionated samples were additionally analyzed by LC-MS on an LTQ-Orbitrap™, and the accurate mass and elution time tag approach was used to perform quantitative comparisons. Possible pathophysiological roles of differentially expressed orthopoxvirus genes are discussed.
Li, Hua; Jiang, Xiaoyu; Xie, Jingping; Gore, John C; Xu, Junzhong
2017-06-01
To investigate the influence of transcytolemmal water exchange on estimates of tissue microstructural parameters derived from diffusion MRI using conventional PGSE and IMPULSED methods. Computer simulations were performed to incorporate a broad range of intracellular water lifetimes τ_in (50-∞ ms), cell diameters d (5-15 μm), and intrinsic diffusion coefficients D_in (0.6-2 μm^2/ms) for different values of signal-to-noise ratio (SNR) (10 to 50). For experiments, murine erythroleukemia (MEL) cancer cells were cultured and treated with saponin to selectively change cell membrane permeability. All fitted microstructural parameters from simulations and experiments in vitro were compared with ground-truth values. Simulations showed that, for both PGSE and IMPULSED methods, cell diameter d can be reliably fit with sufficient SNR (≥ 50), whereas the intracellular volume fraction f_in is intrinsically underestimated due to transcytolemmal water exchange. D_in can be reliably fit only with sufficient SNR and using the IMPULSED method with short diffusion times. These results were confirmed by those obtained in the cell culture experiments in vitro. For the sequences and models considered in this study, transcytolemmal water exchange has minor effects on the fittings of d and D_in with physiologically relevant membrane permeabilities if the SNR is sufficient (> 50), but f_in is intrinsically underestimated. Magn Reson Med 77:2239-2249, 2017. © 2016 International Society for Magnetic Resonance in Medicine.
Khakinejad, Mahdiar; Kondalaji, Samaneh Ghassabi; Tafreshian, Amirmahdi; Valentine, Stephen J
2015-07-01
The per-residue, gas-phase hydrogen deuterium exchange (HDX) kinetics for individual amino acid residues on selected ion conformer types of the model peptide KKDDDDDIIKIIK have been examined using ion mobility spectrometry (IMS) and HDX-tandem mass spectrometry (MS/MS) techniques. The [M + 4H](4+) ions exhibit two major conformer types with collision cross sections of 418 Å(2) and 446 Å(2); the [M + 3H](3+) ions also yield two different conformer types having collision cross sections of 340 Å(2) and 367 Å(2). Kinetics plots of HDX for individual amino acid residues reveal fast- and slow-exchanging hydrogens. The contributions of each amino acid residue to the overall conformer type rate constant have been estimated. For this peptide, N- and C-terminal K residues exhibit the greatest contributions for all ion conformer types. Interior D and I residues show decreased contributions. Several charge state trends are observed. On average, the D residues of the [M + 3H](3+) ions show faster HDX rate contributions compared with [M + 4H](4+) ions. In contrast, the interior I8 and I9 residues show increased accessibility to exchange for the more elongated [M + 4H](4+) ion conformer type. The contribution of each residue to the overall uptake rate showed a good correlation with a residue hydrogen accessibility score model calculated using a distance from charge site and initial incorporation site for nominal structures obtained from molecular dynamics simulations (MDS).
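As a rough illustration of how kinetics plots with fast- and slow-exchanging hydrogens can be summarized, the sketch below fits a two-population uptake model to hypothetical deuterium uptake data; the model form, time points, and uptake values are assumptions, not the authors' kinetic treatment.

```python
# Minimal sketch: fitting deuterium uptake to a two-population (fast/slow)
# exchange model D(t) = A_f*(1-exp(-k_f t)) + A_s*(1-exp(-k_s t)).
# Times and uptake values below are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def uptake(t, a_fast, k_fast, a_slow, k_slow):
    return a_fast * (1 - np.exp(-k_fast * t)) + a_slow * (1 - np.exp(-k_slow * t))

t_obs = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])      # s, hypothetical
d_obs = np.array([0.18, 0.35, 0.52, 0.61, 0.72, 0.85, 0.97])   # deuterons, hypothetical

p0 = (0.5, 5.0, 0.5, 0.05)  # initial guesses: amplitudes and rate constants
params, _ = curve_fit(uptake, t_obs, d_obs, p0=p0, maxfev=10000)
a_f, k_f, a_s, k_s = params
print(f"fast: A={a_f:.2f}, k={k_f:.2f} s^-1; slow: A={a_s:.2f}, k={k_s:.3f} s^-1")
```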
Maitra, Tanmoy; Giri, Debasis
2014-12-01
The medical organizations have introduced the Telecare Medical Information System (TMIS) to provide a reliable facility by which a patient who is unable to go to a doctor in a critical or urgent period can communicate with a doctor through a medical server via the internet from home. An authentication mechanism is needed in TMIS to hide the secret information of both parties, namely a server and a patient. Recent research combines a patient's biometric information with a password to design remote user authentication schemes that enhance the security level. In a single-server environment, one server is responsible for providing services to all the authorized remote patients. However, a problem arises if a patient wishes to access several branch servers: he/she needs to register with each branch server individually. In 2014, Chuang and Chen proposed a remote user authentication scheme for multi-server environments. In this paper, we show that in their scheme a non-registered adversary can successfully log in to the system as a valid patient. To overcome these weaknesses, we propose an authentication scheme for TMIS in a multi-server environment in which a patient registers only once with a root telecare server, called the registration center (RC), to obtain services from all the telecare branch servers through the registered smart card. Security analysis and comparison show that our proposed scheme provides better security with low computational and communication cost.
Secure Dynamic access control scheme of PHR in cloud computing.
Chen, Tzer-Shyong; Liu, Chia-Hui; Chen, Tzer-Long; Chen, Chin-Sheng; Bau, Jian-Guo; Lin, Tzu-Ching
2012-12-01
With the development of information technology and medical technology, medical information has evolved from traditional paper records into electronic medical records, which are now widely used. A new style of medical information exchange, the personal health record (PHR), is gradually being developed. A PHR is a health record maintained and recorded by the individual. An ideal personal health record integrates personal medical information from different sources and provides a complete and correct personal health and medical summary through the Internet or portable media, under the requirements of security and privacy. Many personal health records are already in use. The patient-centered PHR information exchange system allows the public to autonomously maintain and manage personal health records, which is convenient for storing, accessing, and sharing personal medical records. With the emergence of Cloud computing, PHR services have moved to storing data on Cloud servers, so that resources can be used flexibly and operation costs can be reduced. Nevertheless, patients face privacy problems when storing PHR data in the Cloud, and a secure protection scheme is required to encrypt the medical records of each patient stored on a Cloud server. In the encryption process, it is a challenge to achieve accurate access to medical records while maintaining flexibility and efficiency. A new PHR access control scheme for Cloud computing environments is proposed in this study. Using a Lagrange interpolation polynomial to establish a secure and effective PHR information access scheme, it allows accurate and secure access to PHRs and is suitable for very large numbers of users. Moreover, the scheme dynamically supports multiple users in Cloud computing environments while preserving personal privacy, and offers legal authorization for access to PHRs. Security and effectiveness analyses show that the proposed PHR access scheme for Cloud computing environments is flexible and secure and can effectively support real-time granting and revocation of user access authorization as well as appending and revising PHR records.
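The scheme builds on Lagrange interpolation polynomials; as a minimal illustration of that primitive (not the authors' full PHR access-control protocol), the sketch below reconstructs a secret key from shares by interpolating over a prime field. The prime, the polynomial, and the share points are hypothetical.

```python
# Minimal sketch of the core primitive the scheme builds on: recovering a
# secret key from shares by Lagrange interpolation over a prime field.
# This is an illustrative Shamir-style reconstruction, not the authors'
# full PHR access-control scheme; the prime and shares are hypothetical.
PRIME = 2**61 - 1  # a large prime modulus (hypothetical choice)

def lagrange_at_zero(shares, p=PRIME):
    """Interpolate the polynomial through (x_i, y_i) shares and evaluate at x=0."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i == j:
                continue
            num = (num * (-xj)) % p
            den = (den * (xi - xj)) % p
        secret = (secret + yi * num * pow(den, -1, p)) % p
    return secret

# Hypothetical shares of a degree-2 polynomial f(x) = 1234 + 7x + 3x^2 (mod PRIME)
f = lambda x: (1234 + 7 * x + 3 * x * x) % PRIME
shares = [(x, f(x)) for x in (2, 5, 11)]
print(lagrange_at_zero(shares))  # prints 1234, the recovered secret
```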
RSA-Based Password-Authenticated Key Exchange, Revisited
NASA Astrophysics Data System (ADS)
Shin, Seonghan; Kobara, Kazukuni; Imai, Hideki
The RSA-based Password-Authenticated Key Exchange (PAKE) protocols have been proposed to realize both mutual authentication and generation of secure session keys where a client shares his/her password only with a server and the latter should generate its RSA public/private key pair (e, n), (d, n) every time due to the lack of PKI (Public-Key Infrastructures). One of the ways to avoid a special kind of off-line (so-called e-residue) attacks in the RSA-based PAKE protocols is to deploy a challenge/response method by which a client verifies the relative primality of e and φ(n) interactively with a server. However, this kind of RSA-based PAKE protocol did not give any proof of the underlying challenge/response method and therefore could not specify the exact complexity of the protocols, since another security parameter is needed in the challenge/response method. In this paper, we first present an RSA-based PAKE (RSA-PAKE) protocol that can deploy two different challenge/response methods (denoted by Challenge/Response Method1 and Challenge/Response Method2). The main contributions of this work include: (1) based on number theory, we prove that Challenge/Response Method1 and Challenge/Response Method2 are secure against e-residue attacks for any odd prime e; (2) with the security parameter for the on-line attacks, we show that the RSA-PAKE protocol is provably secure in the random oracle model where all of the off-line attacks are not more efficient than on-line dictionary attacks; and (3) by considering the Hamming weight of e and its complexity in the RSA-PAKE protocol, we search for primes to be recommended for practical use. We also compare the RSA-PAKE protocol with the previous ones mainly in terms of computation and communication complexities.
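The sketch below illustrates the number-theoretic idea behind an e-residue challenge/response check: an honest server can return valid e-th roots of random challenges only when gcd(e, φ(n)) = 1. It is a toy illustration with small primes, not the specific Challenge/Response Method1 or Method2 analyzed in the paper.

```python
# Minimal sketch of the idea behind an e-residue challenge/response check:
# the client sends random challenges and accepts only if the server returns
# valid e-th roots, which an honest server can do only when gcd(e, phi(n)) = 1.
# This illustrates the principle, not the exact methods proved in the paper.
import math
import random

p, q = 1009, 1013                 # small primes for illustration only
n, phi = p * q, (p - 1) * (q - 1)

def server_eth_root(z, e):
    """Honest server: return an e-th root of z mod n, if e is invertible mod phi(n)."""
    if math.gcd(e, phi) != 1:
        return None               # no well-defined inverse exponent exists
    d = pow(e, -1, phi)
    return pow(z, d, n)

def client_accepts(e, rounds=20):
    for _ in range(rounds):
        z = random.randrange(2, n - 1)
        y = server_eth_root(z, e)
        if y is None or pow(y, e, n) != z:
            return False
    return True

print(client_accepts(65537))      # True: gcd(65537, phi) = 1
print(client_accepts(6))          # False: gcd(6, phi) > 1, no inverse exponent exists
```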
Operations Technology Exchange: part of the Automotive Research Center's activities relevant to both the Army and industry, creating research opportunities for Ph.D. and M.S. students.
Liyanage, Rohana; Devarapalli, Nagarjuna; Pyland, Derek B; Puckett, Latisha M; Phan, N H; Starch, Joel A; Okimoto, Mark R; Gidden, Jennifer; Stites, Wesley E; Lay, Jackson O
2012-12-15
Protein equilibrium snapshot by hydrogen/deuterium exchange electrospray ionization mass spectrometry (PEPS-HDX-ESI-MS or PEPS) is a method recently introduced for estimating protein folding energies and rates. Herein we describe the basis for this method using both theory and new experiments. Benchmark experiments were conducted using ubiquitin because of the availability of reference data for folding and unfolding rates from NMR studies. A second set of experiments was also conducted to illustrate the surprising resilience of the PEPS to changes in HDX time, using staphylococcal nuclease and time frames ranging from a few seconds to several minutes. Theory suggests that PEPS experiments should be conducted at relatively high denaturant concentrations, where the protein folding/unfolding rates are slow with respect to HDX and the life times of both the closed and open states are long enough to be sampled experimentally. Upon deliberate denaturation, changes in folding/unfolding are correlated with associated changes in the ESI-MS signal upon fast HDX. When experiments are done quickly, typically within a few seconds, ESI-MS signals, corresponding to the equilibrium population of the native (closed) and denatured (open) states can both be detected. The interior of folded proteins remains largely un-exchanged. Amongst MS methods, the simultaneous detection of both states in the spectrum is unique to PEPS and provides a "snapshot" of these populations. The associated ion intensities are used to estimate the protein folding equilibrium constant (or the free energy change, ΔG). Linear extrapolation method (LEM) plots of derived ΔG values for each denaturant concentration can then be used to calculate ΔG in the absence of denaturant, ΔG(H(2)O). In accordance with the requirement for detection of signals for both the folded and unfolded states, this theoretical framework predicts that PEPS experiments work best at the middle of the denaturation curve where natured and denatured protein molecules are equilibrated at easily detectable ratios, namely 1:1. It also requires that closed and open states have lifetimes measurable in the time frame of the HDX experiment. Because both conditions are met by PEPS, these measurements can provide an accurate assessment of closed/open state populations and thus protein folding energies/rates.
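As an illustration of how the reported quantities relate, the sketch below converts hypothetical folded (closed) and unfolded (open) ESI-MS intensities into unfolding free energies and fits them with the linear extrapolation method (LEM) to estimate ΔG(H2O); all intensities, denaturant concentrations, and the temperature are assumed values, not data from the paper.

```python
# Minimal sketch: estimating the unfolding free energy from the relative ESI-MS
# intensities of the folded (closed) and unfolded (open) populations, followed
# by a linear extrapolation method (LEM) fit to recover dG(H2O).
# Intensities and denaturant concentrations below are hypothetical.
import numpy as np

R = 8.314e-3        # kJ mol^-1 K^-1
T = 298.0           # K

# hypothetical (denaturant_M, I_open, I_closed) triples near the transition midpoint
data = [(1.5, 20.0, 80.0), (2.0, 45.0, 55.0), (2.5, 70.0, 30.0), (3.0, 88.0, 12.0)]

conc = np.array([d for d, _, _ in data])
dG_unf = np.array([-R * T * np.log(i_open / i_closed) for _, i_open, i_closed in data])

# LEM: dG_unf = dG(H2O) - m * [denaturant]; fit a straight line
slope, intercept = np.polyfit(conc, dG_unf, 1)
print(f"m-value ~ {-slope:.2f} kJ mol^-1 M^-1, dG(H2O) ~ {intercept:.2f} kJ mol^-1")
```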
Exploring the Circulation Dynamics of Mississippi Sound and Bight Using the CONCORDE Synthesis Model
NASA Astrophysics Data System (ADS)
Pan, C.; Dinniman, M. S.; Fitzpatrick, P. J.; Lau, Y.; Cambazoglu, M. K.; Parra, S. M.; Hofmann, E. E.; Dzwonkowski, B.; Warner, S. J.; O'Brien, S. J.; Dykstra, S. L.; Wiggert, J. D.
2017-12-01
As part of the modeling effort of the GOMRI (Gulf of Mexico Research Initiative)-funded CONCORDE consortium, a high resolution (~400 m) regional ocean model is implemented for the Mississippi (MS) Sound and Bight. The model is based on the Coupled Ocean Atmosphere Wave Sediment Transport Modeling System (COAWST), with initial and lateral boundary conditions drawn from data-assimilative 3-day forecasts of the 1 km-resolution Gulf of Mexico Navy Coastal Ocean Model (GOM-NCOM). The model is initialized on 01/01/2014 and run for 3 years. The model results are validated with available remote sensing data and with CONCORDE's moored and ship-based in-situ observations. Results from the three-year simulation (2014-2016) show that ocean circulation and water properties of the MS Sound and Bight are sensitive to meteorological forcing. A low resolution surface forcing, drawn from the North America Regional Reanalysis (NARR), and a high resolution forcing, called the CONCORDE Meteorological Analysis (CMA), which resolves the diurnal sea breeze, are used to drive the model to examine the sensitivity of the circulation to surface forcing. The model responses to the low resolution NARR forcing and to the high resolution CMA are compared in detail for the CONCORDE Fall and Spring field campaigns, when contemporaneous in situ data are available, with a focus on how simulated exchanges between MS Sound and MS Bight are impacted. In most cases, the model shows higher simulation skill when it is driven by CMA. Freshwater plumes of the MS River, MS Sound and Mobile Bay influence the shelf waters of the MS Bight in terms of material budget and dynamics. Drifter and dye experiments near Mobile Bay demonstrate that material exchanges between Mobile Bay and the Sound, and between the Sound and Bight, are sensitive to the wind strength and direction. A model-data comparison targeting the Mobile Bay plume suggests that under both northerly and southerly wind conditions the model is capable of simulating the variation of the plume in terms of velocity, plume extent, and heat and salt budgets.
Effect of substrate temperature on magnetic properties of MnFe2O4 thin films
NASA Astrophysics Data System (ADS)
Rajagiri, Prabhu; Sahu, B. N.; Venkataramani, N.; Prasad, Shiva; Krishnan, R.
2018-05-01
MnFe2O4 thin films were pulsed laser deposited onto quartz substrates at substrate temperatures (T_S) from room temperature (RT) to 650 °C in a pure argon environment. The temperature dependence of the spontaneous magnetization (4πM_S) was measured on these films from 10 K to 350 K using a vibrating sample magnetometer. A ferromagnetic resonance (FMR) study was also carried out at 300 K. The exchange stiffness constant (D) values were obtained by fitting the 4πM_S data to Bloch's equation. The D values of the films thus found decrease, while the 4πM_S value increases, though non-monotonically, with the increase in T_S, and both tend to reach bulk values at T_S = 650 °C. The variation in D and 4πM_S values of the films is explained based on the degree of inversion and the oxidation state of cations in the thin films.
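As an illustration of the fitting step described above, the sketch below fits hypothetical 4πM_S(T) data to Bloch's T^(3/2) law; the exchange stiffness D can then be obtained from the fitted Bloch coefficient through the standard spin-wave relation. Temperatures and magnetization values are assumed, not the measured film data.

```python
# Minimal sketch: fitting 4piM_S(T) data to Bloch's T^(3/2) law,
# M(T) = M(0) * (1 - B * T^(3/2)), from which the exchange stiffness D can be
# obtained through the standard spin-wave relation between B and D.
# Temperatures and magnetization values below are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def bloch(T, M0, B):
    return M0 * (1.0 - B * T**1.5)

T = np.array([10, 50, 100, 150, 200, 250, 300, 350], dtype=float)        # K
M = np.array([4150, 4120, 4050, 3960, 3850, 3720, 3580, 3420], dtype=float)  # G (hypothetical)

(M0, B), _ = curve_fit(bloch, T, M, p0=(4200.0, 1e-5))
print(f"M(0) ~ {M0:.0f} G, Bloch coefficient B ~ {B:.2e} K^-3/2")
```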
NASA Astrophysics Data System (ADS)
Ouadah, Amina; Xu, Hulin; Luo, Tianwei; Gao, Shuitao; Wang, Xing; Fang, Zhou; Jing, Chaojun; Zhu, Changjin
2017-12-01
A new series of ionic liquid functionalized copolymers for anion exchange membranes (AEM) is prepared. Poly(butylvinylimidazolium) (b-VIB) is copolymerized with para-methylstyrene (p-MS) by radical polymerization to form block copolymers b-VIB/p-MS, which are crosslinked with poly(diphenylether bibenzimidazole) (DPEBI) to provide the desired materials b-VIB/p-MS/DPEBI. Structures are characterized via 1H NMR, FTIR spectra and elemental analysis. The b-VIB blocks offer the anion conduction function while the DPEBI moieties contribute to enhancing other properties. The prepared membranes display chloride conductivity as high as 19.5 mS/cm at 25 °C and 69.2 mS/cm at 100 °C, higher than that of the commercial membrane Tokuyama A201. Their hydroxide conductivity reaches 35.7 mS/cm at 25 °C and 73.1 mS/cm at 100 °C. The membranes show a linear Arrhenius behavior in anion conduction, low activation energies, and distinct nanophase separation of hydrophilic/hydrophobic regions in transmission electron microscopy (TEM) studies. Thermal investigations using TGA and DSC confirm that the membranes are stable up to 250 °C. In particular, the membranes show excellent alkaline stability, with no decrease in hydroxide conductivity after 168 h of treatment with 2 M KOH.
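The linear Arrhenius behavior mentioned above implies that ln(σ) varies linearly with 1/T; the sketch below extracts an apparent activation energy from such a fit using hypothetical conductivity values, not the reported membrane data.

```python
# Minimal sketch: extracting an apparent activation energy from the linear
# Arrhenius behavior of ionic conductivity, ln(sigma) = ln(sigma0) - Ea/(R*T).
# Conductivity values below are hypothetical, not the membrane data in the paper.
import numpy as np

R = 8.314  # J mol^-1 K^-1

T_C = np.array([25.0, 40.0, 60.0, 80.0, 100.0])            # degrees C
sigma = np.array([19.5, 26.0, 37.0, 51.0, 69.0])           # mS/cm (hypothetical trend)

T_K = T_C + 273.15
slope, intercept = np.polyfit(1.0 / T_K, np.log(sigma), 1)
Ea_kJ = -slope * R / 1000.0
print(f"apparent activation energy ~ {Ea_kJ:.1f} kJ/mol")
```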
Aureli, Federica; Ouerdane, Laurent; Bierla, Katarzyna; Szpunar, Joanna; Prakash, Nagaraja Tejo; Cubadda, Francesco
2012-08-01
Several novel selenium containing compounds were characterized in staple crops (wheat, rice and maize) grown on soils naturally rich in selenium. A dedicated method based on the coupling of liquid chromatography with multiplexed detection (ICP-MS, ESI-Orbitrap MS(/MS)) was developed for the speciation of low-molecular weight (<5 kDa) selenium metabolites. Nine species present in different proportions as a function of the crop type were identified by cation-exchange HPLC-ESI-Orbitrap MS on the basis of the accurate molecular mass and MS/MS spectra. The natural origin of these species was then validated by varying extraction conditions and by using hydrophilic interaction LC (HILIC)-ESI-Orbitrap MS(/MS). Among the identified compounds, Se-containing monosaccharides (hexose moiety, m/z 317 and m/z 358) or Se-containing disaccharides (hexose-pentose moiety, m/z 407 and m/z 408) were the first selenosugars reported in edible plants. It is also the first report of the presence of 2,3-dihydroxypropionyl-selenolanthionine (m/z 345) in rice. Because these crops can be an important source of selenium in animal and human nutrition, the understanding of the origin and the fate of these species during metabolic processes will be of great interest.
Synthesis of new oligothiophene derivatives and their intercalation compounds: Orientation effects
Ibrahim, M.A.; Lee, B.-G.; Park, N.-G.; Pugh, J.R.; Eberl, D.D.; Frank, A.J.
1999-01-01
The orientation dependence of intercalated oligothiophene derivatives in vermiculite and metal disulfides MS2 (M = Mo, Ti and Zr) on the pendant group on the thiophene ring and on the host material was studied by X-ray diffraction (XRD) and solid-state nuclear magnetic resonance spectroscopy. Amino and nitro derivatives of bi-, ter- and quaterthiophenes were synthesized for the first time. The amino-oligothiophenes were intercalated into vermiculite by an exchange reaction with previously intercalated octadecylammonium vermiculite and into MS2 by the intercalation-exfoliation technique. Analysis of the XRD data indicates that a monolayer of amino-oligothiophene orients perpendicularly to the silicate surface in vermiculite and lies flat in the van der Waals gap of MS2.
NASA Astrophysics Data System (ADS)
Suckow, A. O.
2013-12-01
Measurements need post-processing to obtain results that are comparable between laboratories. Raw data may need to be corrected for blank, memory, drift (change of reference values with time) and linearity (dependence of reference on signal height), and normalized to international reference materials. Post-processing parameters need to be stored for traceability of results. State-of-the-art stable isotope correction schemes are available based on MS Excel (Geldern and Barth, 2012; Gröning, 2011) or MS Access (Coplen, 1998). These are specialized to stable isotope measurements only, often only to the post-processing of a specific run. Embedding of such algorithms into a multipurpose database system was missing. This is necessary to combine results of different tracers (3H, 3He, 2H, 18O, CFCs, SF6...) or geochronological tools (sediment dating, e.g. with 210Pb, 137Cs), to relate them to attribute data (submitter, batch, project, geographical origin, depth in core, well information etc.) and to support further interpretation tools (e.g. lumped parameter modelling). Database sub-systems to the LabData laboratory management system (Suckow and Dumke, 2001) are presented for stable isotopes and for gas chromatographic CFC and SF6 measurements. The sub-system for stable isotopes allows the following post-processing: (1) automated import from measurement software (Isodat, Picarro, LGR); (2) correction for sample-to-sample memory, linearity and drift, and renormalization of the raw data. The sub-system for gas chromatography covers: (1) storage of all raw data; (2) storage of peak integration parameters; (3) correction for blank, efficiency and linearity. The user interface allows interactive and graphical control of the post-processing and of all corrections, via export to and plotting in MS Excel, and is a valuable tool for quality control. The sub-databases are integrated into LabData, a multi-user client-server architecture using MS SQL Server as back-end and an MS Access front-end, installed in four laboratories to date. Attribute data storage (unique ID for each subsample, origin, project context etc.) and laboratory management features are included. Export routines to Excel (depth profiles, time series, all possible tracer-versus-tracer plots...) and modelling capabilities are add-ons. The source code is public domain and available under the GNU General Public License (GNU-GPL). References: Coplen, T.B., 1998. A manual for a laboratory information management system (LIMS) for light stable isotopes, Version 7.0. USGS Open File Report 98-284. Geldern, R. v., Barth, J.A.C., 2012. Optimization of instrument setup and post-run corrections for oxygen and hydrogen stable isotope measurements of water by isotope ratio infrared spectroscopy (IRIS). Limnology and Oceanography: Methods 10, 1024-1036. Gröning, M., 2011. Improved water δ2H and δ18O calibration and calculation of measurement uncertainty using a simple software tool. Rapid Communications in Mass Spectrometry 25, 2711-2720. Suckow, A., Dumke, I., 2001. A database system for geochemical, isotope hydrological and geochronological laboratories. Radiocarbon 43, 325-337.
NASA Astrophysics Data System (ADS)
Sasikala, S.; Indhira, K.; Chandrasekaran, V. M.
2017-11-01
In this paper, we consider an M^X/(a,b)/1 queueing system with server breakdown without interruption, multiple vacations, setup times and N-policy. After a batch of service, if the size of the queue is ξ (< a), then the server immediately takes a vacation. Upon returning from a vacation, if the queue length is less than N, then the server takes another vacation. This process continues until the server finds at least N customers in the queue. After a vacation, if the server finds at least N customers waiting for service, then the server needs a setup time to start the service. After a batch of service, if the number of waiting customers in the queue is ξ (≥ a), then the server serves a batch of min(ξ, b) customers, where b ≥ a. We derive the probability generating function of the queue length at an arbitrary time epoch. Further, we obtain some important performance measures.
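A simplified discrete-event simulation of the service discipline described above (batch arrivals, (a,b) bulk service, multiple vacations until at least N customers wait, then a setup time) is sketched below; server breakdowns are omitted, and all rates, the batch-size distribution, and the thresholds are hypothetical.

```python
# Simplified discrete-event sketch of the service discipline described above:
# compound Poisson (batch) arrivals, (a,b) bulk service, multiple vacations taken
# while fewer than N customers wait, and a setup time before service resumes.
# Server breakdowns are omitted and all parameters are hypothetical.
import random

random.seed(1)
LAM, MU, NU = 1.0, 0.5, 0.8      # arrival, service, vacation rates
SETUP, A, B, N = 0.5, 3, 8, 5    # mean setup time, batch limits a<=b, N-policy level

def sim(horizon=100000.0):
    t, queue, area = 0.0, 0, 0.0
    next_arrival = random.expovariate(LAM)

    def advance(duration):
        # add arrivals that occur during 'duration' and accumulate queue-length area
        nonlocal t, queue, area, next_arrival
        end = t + duration
        while next_arrival < end:
            area += queue * (next_arrival - t)
            t = next_arrival
            queue += random.randint(1, 3)          # random batch size 1..3
            next_arrival = t + random.expovariate(LAM)
        area += queue * (end - t)
        t = end

    while t < horizon:
        if queue >= A:
            served = min(queue, B)
            advance(random.expovariate(MU))        # one batch service
            queue -= served
        else:
            advance(random.expovariate(NU))        # a vacation
            while queue < N:                       # repeat vacations until >= N wait
                advance(random.expovariate(NU))
            advance(random.expovariate(1.0 / SETUP))  # setup before resuming service
    return area / t

print(f"time-average number in system ~ {sim():.2f}")
```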
Secure entanglement distillation for double-server blind quantum computation.
Morimae, Tomoyuki; Fujii, Keisuke
2013-07-12
Blind quantum computation is a new secure quantum computing protocol where a client, who does not have enough quantum technologies at her disposal, can delegate her quantum computation to a server, who has a fully fledged quantum computer, in such a way that the server cannot learn anything about the client's input, output, and program. If the client interacts with only a single server, the client has to have some minimum quantum power, such as the ability of emitting randomly rotated single-qubit states or the ability of measuring states. If the client interacts with two servers who share Bell pairs but cannot communicate with each other, the client can be completely classical. For such a double-server scheme, two servers have to share clean Bell pairs, and therefore the entanglement distillation is necessary in a realistic noisy environment. In this Letter, we show that it is possible to perform entanglement distillation in the double-server scheme without degrading the security of blind quantum computing.
Rahmati, Mitra; Davarynejad, Gholam Hossein; Génard, Michel; Bannayan, Mohammad; Azizi, Majid; Vercambre, Gilles
2015-01-01
In this study the sensitivity of peach tree (Prunus persica L.) to three water stress levels from mid-pit hardening until harvest was assessed. Seasonal patterns of shoot and fruit growth, gas exchange (leaf photosynthesis, stomatal conductance and transpiration) as well as carbon (C) storage/mobilization were evaluated in relation to plant water status. A simple C balance model was also developed to investigate the sink-source relationship in relation to plant water status at the tree level. The C source was estimated through the leaf area dynamics and leaf photosynthesis rate along the season. The C sink was estimated for maintenance respiration and growth of shoots and fruits. Water stress significantly reduced gas exchange, fruit growth, and shoot growth, but increased fruit dry matter concentration. Growth was more affected by water deficit than photosynthesis, and shoot growth was more sensitive to water deficit than fruit growth. Reduction of shoot growth was associated with a decrease of shoot elongation and emergence, and high shoot mortality. Water scarcity affected tree C assimilation due to two interacting factors: (i) reduction in leaf photosynthesis (-23% and -50% under moderate (MS) and severe (SS) water stress compared to low (LS) stress during the growing season) and (ii) reduction in total leaf area (-57% and -79% under MS and SS compared to LS at harvest). Our field data analysis suggested a Ψstem threshold of -1.5 MPa below which daily net C gain became negative, i.e. C assimilation became lower than the C needed for respiration and growth. The negative C balance under MS and SS, associated with a decline of trunk carbohydrate reserves, may have led to drought-induced vegetative mortality.
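The tree-level carbon balance described above can be caricatured as assimilation minus maintenance respiration minus growth cost; the sketch below illustrates this with hypothetical, greatly simplified numbers and is not the authors' model.

```python
# Conceptual sketch of a daily tree-level carbon balance:
# net C gain = leaf assimilation (photosynthesis x leaf area) minus maintenance
# respiration and growth costs. Coefficients and inputs are hypothetical and
# greatly simplified relative to the authors' model.
def daily_net_carbon(leaf_area_m2, photo_gC_m2_day, maint_resp_gC_day, growth_gC_day):
    assimilation = leaf_area_m2 * photo_gC_m2_day
    return assimilation - maint_resp_gC_day - growth_gC_day

# Hypothetical comparison: low-stress vs severe-stress tree on the same day
print("LS:", daily_net_carbon(leaf_area_m2=12.0, photo_gC_m2_day=6.0,
                              maint_resp_gC_day=25.0, growth_gC_day=30.0), "g C/day")
print("SS:", daily_net_carbon(leaf_area_m2=2.5, photo_gC_m2_day=3.0,
                              maint_resp_gC_day=15.0, growth_gC_day=5.0), "g C/day")
```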
Krystkowiak, Izabella; Manguy, Jean; Davey, Norman E
2018-06-05
There is a pressing need for in silico tools that can aid in the identification of the complete repertoire of protein binding (SLiMs, MoRFs, miniMotifs) and modification (moiety attachment/removal, isomerization, cleavage) motifs. We have created PSSMSearch, an interactive web-based tool for rapid statistical modeling, visualization, discovery and annotation of protein motif specificity determinants to discover novel motifs in a proteome-wide manner. PSSMSearch analyses proteomes for regions with significant similarity to a motif specificity determinant model built from a set of aligned motif-containing peptides. Multiple scoring methods are available to build a position-specific scoring matrix (PSSM) describing the motif specificity determinant model. This model can then be modified by a user to add prior knowledge of specificity determinants through an interactive PSSM heatmap. PSSMSearch includes a statistical framework to calculate the significance of specificity determinant model matches against a proteome of interest. PSSMSearch also includes the SLiMSearch framework's annotation, motif functional analysis and filtering tools to highlight relevant discriminatory information. Additional tools to annotate statistically significant shared keywords and GO terms, or experimental evidence of interaction with a motif-recognizing protein have been added. Finally, PSSM-based conservation metrics have been created for taxonomic range analyses. The PSSMSearch web server is available at http://slim.ucd.ie/pssmsearch/.
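As an illustration of the core operation such a server performs, the sketch below builds a simple log-odds PSSM from aligned peptides and scores every window of a query sequence; the scoring scheme, pseudocounts, peptides, and sequence are illustrative assumptions, not PSSMSearch's statistical framework.

```python
# Minimal sketch: scoring every window of a protein sequence against a
# position-specific scoring matrix (PSSM) built from aligned motif-containing
# peptides, using simple log-odds scores with a uniform background.
import math
from collections import Counter

AA = "ACDEFGHIKLMNPQRSTVWY"

def build_pssm(peptides, pseudocount=1.0):
    """Column-wise log-odds matrix from equal-length aligned peptides."""
    length = len(peptides[0])
    background = 1.0 / len(AA)
    pssm = []
    for pos in range(length):
        counts = Counter(p[pos] for p in peptides)
        total = len(peptides) + pseudocount * len(AA)
        col = {aa: math.log(((counts[aa] + pseudocount) / total) / background)
               for aa in AA}
        pssm.append(col)
    return pssm

def scan(sequence, pssm):
    """Return (start, score) for every window of the sequence."""
    w = len(pssm)
    return [(i, sum(pssm[j][sequence[i + j]] for j in range(w)))
            for i in range(len(sequence) - w + 1)]

# Hypothetical aligned peptides sharing an S/T-P-x-R-like pattern
peptides = ["SPARK", "TPGRI", "SPQRL", "TPERV"]
pssm = build_pssm(peptides)
hits = scan("MKTPSPARKLLSTPERVAAG", pssm)
best = max(hits, key=lambda h: h[1])
print(f"best window starts at {best[0]} with score {best[1]:.2f}")
```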
SciServer Compute brings Analysis to Big Data in the Cloud
NASA Astrophysics Data System (ADS)
Raddick, Jordan; Medvedev, Dmitry; Lemson, Gerard; Souter, Barbara
2016-06-01
SciServer Compute uses Jupyter Notebooks running within server-side Docker containers attached to big data collections to bring advanced analysis to big data "in the cloud." SciServer Compute is a component in the SciServer Big-Data ecosystem under development at JHU, which will provide a stable, reproducible, sharable virtual research environment. SciServer builds on the popular CasJobs and SkyServer systems that made the Sloan Digital Sky Survey (SDSS) archive one of the most-used astronomical instruments. SciServer extends those systems with server-side computational capabilities and very large scratch storage space, and further extends their functions to a range of other scientific disciplines. Although big datasets like SDSS have revolutionized astronomy research, for further analysis, users are still restricted to downloading the selected data sets locally - but increasing data sizes make this local approach impractical. Instead, researchers need online tools that are co-located with data in a virtual research environment, enabling them to bring their analysis to the data. SciServer supports this using the popular Jupyter notebooks, which allow users to write their own Python and R scripts and execute them on the server with the data (extensions to Matlab and other languages are planned). We have written special-purpose libraries that enable querying the databases and other persistent datasets. Intermediate results can be stored in large scratch space (hundreds of TBs) and analyzed directly from within Python or R with state-of-the-art visualization and machine learning libraries. Users can store science-ready results in their permanent allocation on SciDrive, a Dropbox-like system for sharing and publishing files. Communication between the various components of the SciServer system is managed through SciServer's new Single Sign-on Portal. We have created a number of demos to illustrate the capabilities of SciServer Compute, including Python and R scripts accessing a range of datasets and showing the data flow between storage and compute components. Demos, documentation, and more information can be found at www.sciserver.org. SciServer is funded by the National Science Foundation Award ACI-1261715.
Providing Internet Access to High-Resolution Lunar Images
NASA Technical Reports Server (NTRS)
Plesea, Lucian
2008-01-01
The OnMoon server is a computer program that provides Internet access to high-resolution Lunar images, maps, and elevation data, all suitable for use in geographical information system (GIS) software for generating images, maps, and computational models of the Moon. The OnMoon server implements the Open Geospatial Consortium (OGC) Web Map Service (WMS) server protocol and supports Moon-specific extensions. Unlike other Internet map servers that provide Lunar data using an Earth coordinate system, the OnMoon server supports encoding of data in Moon-specific coordinate systems. The OnMoon server offers access to most of the available high-resolution Lunar image and elevation data. This server can generate image and map files in the tagged image file format (TIFF) or the Joint Photographic Experts Group (JPEG), 8- or 16-bit Portable Network Graphics (PNG), or Keyhole Markup Language (KML) format. Image control is provided by use of the OGC Style Layer Descriptor (SLD) protocol. Full-precision spectral arithmetic processing is also available, by use of a custom SLD extension. This server can dynamically add shaded relief based on the Lunar elevation to any image layer. This server also implements tiled WMS protocol and super-overlay KML for high-performance client application programs.
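A WMS client retrieves maps with GetMap requests; the sketch below assembles such a request for a hypothetical Lunar layer. The base URL, layer name, and Moon CRS code are placeholders, not the OnMoon server's actual configuration.

```python
# Minimal sketch of how a client might request a Lunar map tile from a WMS
# endpoint such as the OnMoon server. The base URL, layer name, and CRS string
# below are placeholders, not the actual service configuration.
from urllib.parse import urlencode

BASE_URL = "https://example.nasa.gov/onmoon/wms"   # placeholder endpoint

params = {
    "SERVICE": "WMS",
    "VERSION": "1.1.1",
    "REQUEST": "GetMap",
    "LAYERS": "lunar_elevation_shaded",            # hypothetical layer name
    "SRS": "MOON:49900",                           # hypothetical Moon CRS code
    "BBOX": "-180,-90,180,90",
    "WIDTH": "1024",
    "HEIGHT": "512",
    "FORMAT": "image/jpeg",
}

print(f"{BASE_URL}?{urlencode(params)}")
# The resulting URL could be fetched with any HTTP client and the bytes saved
# as a JPEG, or used as a tile source in GIS software.
```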
Wide Area Information Servers: An Executive Information System for Unstructured Files.
ERIC Educational Resources Information Center
Kahle, Brewster; And Others
1992-01-01
Describes the Wide Area Information Servers (WAIS) system, an integrated information retrieval system for corporate end users. Discussion covers general characteristics of the system, search techniques, protocol development, user interfaces, servers, selective dissemination of information, nontextual data, access to other servers, and description…
Parallel Computing Using Web Servers and "Servlets".
ERIC Educational Resources Information Center
Lo, Alfred; Bloor, Chris; Choi, Y. K.
2000-01-01
Describes parallel computing and presents inexpensive ways to implement a virtual parallel computer with multiple Web servers. Highlights include performance measurement of parallel systems; models for using Java and intranet technology including single server, multiple clients and multiple servers, single client; and a comparison of CGI (common…
Use of proteomics for validation of the isolation process of clotting factor IX from human plasma.
Clifton, James; Huang, Feilei; Gaso-Sokac, Dajana; Brilliant, Kate; Hixson, Douglas; Josic, Djuro
2010-01-03
The use of proteomic techniques in the monitoring of different production steps of plasma-derived clotting factor IX (pd F IX) was demonstrated. The first step, solid-phase extraction with a weak anion-exchange resin, fractionates the bulk of human serum albumin (HSA), immunoglobulin G, and other non-binding proteins from F IX. The proteins that strongly bind to the anion-exchange resin are eluted by higher salt concentrations. In the second step, anion-exchange chromatography, residual HSA, some proteases and other contaminating proteins are separated. In the last chromatographic step, affinity chromatography with immobilized heparin, the majority of the residual impurities are removed. However, some contaminating proteins still remain in the eluate from the affinity column. The next step in the production process, virus filtration, is also an efficient step for the removal of residual impurities, mainly high molecular weight proteins, such as vitronectin and inter-alpha inhibitor proteins. In each production step, the active component, pd F IX and contaminating proteins are monitored by biochemical and immunochemical methods and by LC-MS/MS and their removal documented. Our methodology is very helpful for further process optimization, rapid identification of target proteins with relatively low abundance, and for the design of subsequent steps for their removal or purification.
Quantitative and Qualitative Analysis of Biomarkers in Fusarium verticillioides
USDA-ARS?s Scientific Manuscript database
In this study, a combination HPLC-DART-TOF-MS system was utilized to identify and quantitatively analyze carbohydrates in wild type and mutant strains of Fusarium verticillioides. Carbohydrate fractions were isolated from F. verticillioides cellular extracts by HPLC using a cation-exchange size-excl...
Heat transfer and pressure drop measurements in an air/molten salt direct-contact heat exchanger
NASA Astrophysics Data System (ADS)
Bohn, Mark S.
1988-11-01
This paper presents a comparison of experimental data with a recently published model of heat exchange in irrigated packed beds. Heat transfer and pressure drop were measured in a 150 mm (ID) column with a 610 mm bed of metal Pall rings. Molten nitrate salt and preheated air were the working fluids with a salt inlet temperature of approximately 440 C and air inlet temperatures of approximately 230 C. A comparison between the experimental data and the heat transfer model is made on the basis of heat transfer from the salt. For the range of air and salt flow rates tested, 0.3 to 1.2 kg/sq m/s air flow and 6 to 18 kg/sq m/s salt flow, the data agree with the model within 22 percent standard deviation. In addition, a model for the column pressure drop was validated, agreeing with the experimental data within 18 percent standard deviation over the range of column pressure drop from 40 to 1250 Pa/m.
Ruggenthaler, M; Grass, J; Schuh, W; Huber, C G; Reischl, R J
2017-09-05
For the first time, a comprehensive investigation of the impurity profile of the synthetic thyroid API (active pharmaceutical ingredient) liothyronine sodium (LT3Na) was performed by using reversed-phase HPLC and advanced structural elucidation techniques including high resolution tandem mass spectrometry (HRMS/MS) and on-line hydrogen-deuterium (H/D) exchange. Overall, 39 compounds were characterized and 25 of these related substances were previously unknown in the literature. The impurity classification system recently developed for the closely related API levothyroxine sodium (LT4Na) could be applied to the newly characterized liothyronine sodium impurities, resulting in a holistic thyroid API impurity classification system. Furthermore, the mass-spectrometric CID fragmentation of specific related substances was discussed and rationalized by detailed fragmentation pathways. Moreover, the UV/Vis absorption characteristics of the API and selected impurities were investigated to corroborate chemical structure assignments derived from MS data. Copyright © 2017 Elsevier B.V. All rights reserved.
Wei, Rongfei; Guo, Qingjun; Wen, Hanjie; Peters, Marc; Yang, Junxing; Tian, Liyan; Han, Xiaokun
2017-01-01
In this study, key factors affecting the chromatographic separation of Cd from plants, such as the resin column and the digestion and purification procedures, were experimentally investigated. A technique for separating Cd from plant samples based on single ion-exchange chromatography has been developed, which is suitable for high-precision analysis of Cd isotopes by multiple-collector inductively coupled plasma mass spectrometry (MC-ICP-MS). The robustness of the technique was assessed by replicate analyses of Cd standard solutions and plant samples. The Cd yields of the whole separation process were higher than 95%, and the δ114/110Cd values of three secondary Cd standard solutions (Münster Cd, Spex Cd, Spex-1 Cd) relative to NIST SRM 3108 were measured accurately, enabling comparison with Cd isotope results obtained in other laboratories. Hence, stable Cd isotope analyses represent a powerful tool for fingerprinting specific Cd sources and/or examining biogeochemical reactions in ecological and environmental systems.
Azzam, Sausan; Broadwater, Laurie; Li, Shuo; Freeman, Ernest J; McDonough, Jennifer; Gregory, Roger B
2013-05-01
Experimental autoimmune encephalomyelitis (EAE) is an autoimmune, inflammatory disease of the central nervous system that is widely used as a model of multiple sclerosis (MS). Mitochondrial dysfunction appears to play a role in the development of neuropathology in MS and may also play a role in disease pathology in EAE. Here, surface enhanced laser desorption ionization mass spectrometry (SELDI-MS) has been employed to obtain protein expression profiles from mitochondrially enriched fractions derived from EAE and control mouse brain. To gain insight into experimental variation, the reproducibility of sub-cellular fractionation, anion exchange fractionation as well as spot-to-spot and chip-to-chip variation using pooled samples from brain tissue was examined. Variability of SELDI mass spectral peak intensities indicates a coefficient of variation (CV) of 15.6% and 17.6% between spots on a given chip and between different chips, respectively. Thinly slicing tissue prior to homogenization with a rotor homogenizer showed better reproducibility (CV = 17.0%) than homogenization of blocks of brain tissue with a Teflon® pestle (CV = 27.0%). Fractionation of proteins with anion exchange beads prior to SELDI-MS analysis gave overall CV values from 16.1% to 18.6%. SELDI mass spectra of mitochondrial fractions obtained from brain tissue from EAE mice and controls displayed 39 differentially expressed proteins (p ≤ 0.05) out of a total of 241 protein peaks observed in anion exchange fractions. Hierarchical clustering analysis showed that protein fractions from EAE animals with severe disability clearly segregated from controls. Several components of electron transport chain complexes (cytochrome c oxidase subunit 6b1, subunit 6C, and subunit 4; NADH dehydrogenase flavoprotein 3, alpha subcomplex subunit 2, Fe-S protein 4, and Fe-S protein 6; and ATP synthase subunit e) were identified as possible differentially expressed proteins. Myelin Basic Protein isoform 8 (MBP8) (14.2 kDa) levels were lower in EAE samples with advanced disease relative to controls, while an MBP fragment (12.4 kDa), likely due to calpain digestion, was increased in EAE relative to controls. The appearance of MBP in mitochondrially enriched fractions is due to tissue freezing and storage, as MBP was not found associated with mitochondria obtained from fresh tissue. SELDI mass spectrometry can be employed to explore the proteome of a complex tissue (brain) and obtain protein profiles of differentially expressed proteins from protein fractions. Appropriate homogenization protocols and protein fractionation using anion exchange beads can be employed to reduce sample complexity without introducing significant additional variation into the SELDI mass spectra beyond that inherent in the SELDI-MS method itself. SELDI-MS coupled with principal component analysis and hierarchical cluster analysis provides protein patterns that can clearly distinguish the disease state from controls. However, identification of individual differentially expressed proteins requires a separate purification of the proteins of interest by polyacrylamide electrophoresis prior to trypsin digestion and peptide mass fingerprint analysis, and unambiguous identification of differentially expressed proteins can be difficult if protein bands consist of several proteins with similar molecular weights.
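As a quick illustration of the variability measure quoted above, the sketch below computes a coefficient of variation for a single peak across replicate spots; the intensity values are hypothetical.

```python
# Minimal sketch: coefficient of variation (CV) of SELDI peak intensities,
# the spot-to-spot / chip-to-chip variability measure quoted above.
# Intensity values are hypothetical.
import statistics

def cv_percent(values):
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

spot_intensities = [1020.0, 950.0, 1110.0, 870.0, 1005.0]   # one peak, replicate spots
print(f"CV = {cv_percent(spot_intensities):.1f}%")
```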
Neck Strength Imbalance Correlates With Increased Head Acceleration in Soccer Heading
Dezman, Zachary D.W.; Ledet, Eric H.; Kerr, Hamish A.
2013-01-01
Background: Soccer heading is the use of the head to directly contact the ball, often to advance the ball down the field or to score. It is a skill fundamental to the game, yet it has come under scrutiny. Repeated subclinical effects of heading may compound over time, resulting in neurologic deficits. Greater head accelerations are linked to brain injury. Developing an understanding of how the neck muscles help stabilize and reduce head acceleration during impact may help prevent brain injury. Hypothesis: Neck strength imbalance correlates with increased head acceleration during impact while heading a soccer ball. Study Design: Observational laboratory investigation. Methods: Sixteen Division I and II collegiate soccer players headed a ball in a controlled indoor laboratory setting while player motions were recorded by a 14-camera Vicon MX motion capture system. Neck flexor and extensor strength of each player was measured using a spring-type clinical dynamometer. Results: Players were served soccer balls by hand at a mean velocity of 4.29 m/s (±0.74 m/s). Players returned the ball to the server using a heading maneuver at a mean velocity of 5.48 m/s (±1.18 m/s). Mean neck strength difference was positively correlated with angular head acceleration (rho = 0.497; P = 0.05), with a trend toward significance for linear head acceleration (rho = 0.485; P = 0.057). Conclusion: This study suggests that symmetrical strength in neck flexors and extensors reduces head acceleration experienced during low-velocity heading in experienced collegiate players. Clinical Relevance: Balanced neck strength may reduce head acceleration and cumulative subclinical injury. Since neck strength is measurable and amenable to strength training intervention, it may represent a modifiable intrinsic risk factor for injury. PMID:24459547
GRAMM-X public web server for protein–protein docking
Tovchigrechko, Andrey; Vakser, Ilya A.
2006-01-01
Protein docking software GRAMM-X and its web interface extend the original GRAMM Fast Fourier Transformation methodology by employing smoothed potentials, a refinement stage, and knowledge-based scoring. The web server frees users from complex installation of database-dependent parallel software and from maintaining the large hardware resources needed for protein docking simulations. Docking problems submitted to the GRAMM-X server are processed by a 320-processor Linux cluster. The server was extensively tested by benchmarking, several months of public use, and participation in the CAPRI server track. PMID:16845016
2016-06-08
server environment. While the college's two Cisco blade servers are located in separate buildings, these units now work as one unit. Critical databases and software packages are...
Scaling NS-3 DCE Experiments on Multi-Core Servers
2016-06-15
that work well together. 3.2 Simulation Server Details We ran the simulations on a Dell® PowerEdge M520 blade server[8] running Ubuntu Linux 14.04...To minimize the amount of time needed to complete all of the simulations, we planned to run multiple simulations at the same time on a blade server...MacBook was running the simulation inside a virtual machine (Ubuntu 14.04), while the blade server was running the same operating system directly on
Reliability Information Analysis Center 1st Quarter 2007, Technical Area Task (TAT) Report
2007-02-05
* Created new SQL Server database for "PC Configuration" web application. Added roles for security, closed 4235, and posted application to production. * Wrote...and ran SQL Server scripts to migrate production databases to new server. * Created backup jobs for new SQL Server databases. * Continued...second phase of the TENA demo. Extensive tasking was established and assigned. A TENA interface to EW Server was reaffirmed after some uncertainty about
Lawrence, Daphne
2009-03-01
Blade servers and virtualization can reduce infrastructure, maintenance, heating, electric, cooling and equipment costs. Blade server technology is evolving and some elements may become obsolete. There is very little interoperability between blades. Hospitals can virtualize 40 to 60 percent of their servers, and old servers can be reused for testing. Not all applications lend themselves to virtualization, especially those with high memory requirements. CIOs should engage their vendors in virtualization discussions.
A cloud based brokering framework to support hydrology at global scale
NASA Astrophysics Data System (ADS)
Boldrini, E.; Pecora, S.; Bordini, F.; Nativi, S.
2016-12-01
This work presents the hydrology broker designed and deployed in the context of a collaboration between the Regional Agency for Environmental Protection in the Italian region Emilia-Romagna (ARPA-ER) and CNR-IIA (National Research Council of Italy). The hydrology brokering platform eases the task of discovering and accessing hydrological observation data, usually acquired and made available by national agencies by means of a set of heterogeneous services (e.g. CUAHSI HIS servers, OGC services, FTP servers) and formats (e.g. WaterML, O&M, ...). The hydrology broker makes all the already published data available according to one or more of the desired and well-known discovery protocols, access protocols, and formats. As a result, the user is able to search and access the available hydrological data through their preferred client (e.g. CUAHSI HydroDesktop, 52North SWE client). It is also easy to build a hydrological web portal on top of the broker, using the user-friendly JavaScript API. The hydrology broker has been deployed on the Amazon cloud to ensure scalability and tested in the context of the work of the WMO Commission for Hydrology on three different scenarios: the La Plata river basin, the Sava river basin and the Arctic-HYCOS project. In each scenario the hydrology broker discovered and accessed heterogeneous data formats (e.g. WaterML 1.0/2.0, proprietary CSV documents) from the heterogeneous services (e.g. CUAHSI HIS servers, FTP services and agency proprietary services) managed by several national agencies and international commissions. The hydrology broker made it possible to present all the available data uniformly through the user-desired service type and format (e.g. an HIS server publishing WaterML 2.0), greatly improving both system interoperability and data exchange. Interoperability tests were also successfully conducted with WMO Information System (WIS) nodes, making it possible for a specific Global Information System Centre (GISC) to gather the available hydrological records as ISO 19115:2007 metadata documents through the OAI-PMH interface exposed by the broker. The framework's flexibility also makes it easy to add other sources, as well as additional published interfaces, in order to cope with future standard requirements of the hydrological community.
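As an illustration of the WIS interoperability path mentioned above, the sketch below harvests record identifiers from an OAI-PMH interface using the standard ListIdentifiers verb; the endpoint URL and metadataPrefix value are placeholders, not the broker's actual deployment.

```python
# Minimal sketch of harvesting metadata record identifiers from a broker's
# OAI-PMH interface, as in the WIS/GISC interoperability test described above.
# The endpoint URL and metadataPrefix value are placeholders.
import urllib.request
import xml.etree.ElementTree as ET

ENDPOINT = "https://example.org/broker/oaipmh"      # placeholder OAI-PMH endpoint
OAI_NS = "{http://www.openarchives.org/OAI/2.0/}"

def list_identifiers(endpoint, metadata_prefix="iso19115"):
    """Return record identifiers from a single ListIdentifiers response page."""
    url = f"{endpoint}?verb=ListIdentifiers&metadataPrefix={metadata_prefix}"
    with urllib.request.urlopen(url, timeout=30) as resp:
        tree = ET.parse(resp)
    return [h.findtext(f"{OAI_NS}identifier")
            for h in tree.iter(f"{OAI_NS}header")]

if __name__ == "__main__":
    for identifier in list_identifiers(ENDPOINT):
        print(identifier)
```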
A Scalability Model for ECS's Data Server
NASA Technical Reports Server (NTRS)
Menasce, Daniel A.; Singhal, Mukesh
1998-01-01
This report presents in four chapters a model for the scalability analysis of the Data Server subsystem of the Earth Observing System Data and Information System (EOSDIS) Core System (ECS). The model analyzes whether the planned architecture of the Data Server will support an increase in the workload with the possible upgrade and/or addition of processors, storage subsystems, and networks. The report includes a summary of the architecture of ECS's Data Server as well as a high-level description of the Ingest and Retrieval operations as they relate to it. This description forms the basis for the development of the scalability model of the Data Server and the methodology used to solve it.
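The abstract does not reproduce the report's model; as a hedged, generic illustration of the kind of relation such a scalability analysis rests on, the multi-server utilization law bounds the workload a configuration can absorb. Here λ is the request arrival rate, D_i the service demand per request at resource i (processor, storage subsystem, or network), and m_i the number of parallel units of that resource; none of this notation is taken from the report itself.

```latex
% Generic utilization and bottleneck bound (illustration only, not the report's model)
\rho_i = \frac{\lambda\, D_i}{m_i}, \qquad X \;\le\; \min_i \frac{m_i}{D_i}
% The configuration scales to workload \lambda only while \rho_i < 1 at every
% resource; the resource with the largest D_i / m_i caps achievable throughput,
% and upgrades act by reducing D_i or increasing m_i.
```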
NASA Astrophysics Data System (ADS)
Anderson, J.; Bauer, K.; Borga, A.; Boterenbrood, H.; Chen, H.; Chen, K.; Drake, G.; Dönszelmann, M.; Francis, D.; Guest, D.; Gorini, B.; Joos, M.; Lanni, F.; Lehmann Miotto, G.; Levinson, L.; Narevicius, J.; Panduro Vazquez, W.; Roich, A.; Ryu, S.; Schreuder, F.; Schumacher, J.; Vandelli, W.; Vermeulen, J.; Whiteson, D.; Wu, W.; Zhang, J.
2016-12-01
The ATLAS Phase-I upgrade (2019) requires a Trigger and Data Acquisition (TDAQ) system able to trigger and record data from up to three times the nominal LHC instantaneous luminosity. The Front-End LInk eXchange (FELIX) system provides an infrastructure to achieve this in a scalable, detector agnostic and easily upgradeable way. It is a PC-based gateway, interfacing custom radiation tolerant optical links from front-end electronics, via PCIe Gen3 cards, to a commodity switched Ethernet or InfiniBand network. FELIX enables reducing custom electronics in favour of software running on commercial servers. The FELIX system, the design of the PCIe prototype card and the integration test results are presented in this paper.
Load Balancing in Distributed Web Caching: A Novel Clustering Approach
NASA Astrophysics Data System (ADS)
Tiwari, R.; Kumar, K.; Khan, G.
2010-11-01
The World Wide Web suffers from scaling and reliability problems due to overloaded and congested proxy servers. Caching at local proxy servers helps, but cannot satisfy more than a third to a half of requests; the remainder are still sent to the remote origin servers. In this paper we develop an algorithm for a Distributed Web Cache that incorporates cooperation among the proxy servers of a cluster. The algorithm combines Distributed Web Cache concepts with a static hierarchy of geographically based clusters of level-one proxy servers and a dynamic mechanism for selecting a proxy server when a cluster becomes congested. Congestion and scalability problems are addressed by the clustering concept used in our approach. This results in a higher cache hit ratio and lower latency for requested pages. The algorithm also guarantees data consistency between the origin server objects and the proxy cache objects.
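The paper's algorithm is not given in the abstract; the sketch below only illustrates the general idea of hashing a request onto a proxy within its geographic cluster and spilling over when that proxy is congested. The cluster names, proxy names, and overload feed are all hypothetical.

```python
import hashlib

# Minimal sketch (not the paper's algorithm): map a requested URL onto a proxy
# inside a geographic cluster, and fall back to a neighbouring cluster when the
# chosen proxy reports congestion.
CLUSTERS = {
    "eu": ["proxy-eu-1", "proxy-eu-2", "proxy-eu-3"],
    "us": ["proxy-us-1", "proxy-us-2"],
}
OVERLOADED = {"proxy-eu-2"}          # hypothetical congestion-status feed

def pick_proxy(url: str, home_cluster: str) -> str:
    """Deterministically hash the URL onto one proxy of the home cluster."""
    digest = int(hashlib.sha1(url.encode()).hexdigest(), 16)
    proxies = CLUSTERS[home_cluster]
    candidate = proxies[digest % len(proxies)]
    if candidate not in OVERLOADED:
        return candidate
    # Congested: retry within the cluster, then spill over to another cluster.
    for proxy in proxies:
        if proxy not in OVERLOADED:
            return proxy
    other = next(c for c in CLUSTERS if c != home_cluster)
    return CLUSTERS[other][digest % len(CLUSTERS[other])]

print(pick_proxy("http://example.org/page.html", "eu"))
```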
On the optimal use of a slow server in two-stage queueing systems
NASA Astrophysics Data System (ADS)
Papachristos, Ioannis; Pandelis, Dimitrios G.
2017-07-01
We consider two-stage tandem queueing systems with a dedicated server in each queue and a slower flexible server that can attend both queues. We assume Poisson arrivals and exponential service times, and linear holding costs for jobs present in the system. We study the optimal dynamic assignment of servers to jobs, assuming that two servers cannot collaborate on the same job and that preemptions are not allowed. We formulate the problem as a Markov decision process and derive properties of the optimal allocation for the dedicated (fast) servers. Specifically, we show that the downstream dedicated server should not idle, and that the same is true for the upstream one when holding costs are larger upstream. The optimal allocation of the slow server is investigated through extensive numerical experiments that lead to conjectures on the structure of the optimal policy.
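As a hedged sketch of the objective implied by the abstract (linear holding costs under Poisson arrivals and exponential services), the long-run average-cost criterion over server-assignment policies π can be written as below, where N_i(t) is the number of jobs at stage i and h_i the holding cost rate; this is the generic form, not necessarily the authors' exact notation.

```latex
% Long-run average holding cost minimized over server-assignment policies \pi
g^{\pi} = \limsup_{T \to \infty} \frac{1}{T}\,
          \mathbb{E}^{\pi}\!\left[ \int_{0}^{T} \bigl( h_1 N_1(t) + h_2 N_2(t) \bigr)\, dt \right]
```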
Process evaluation distributed system
NASA Technical Reports Server (NTRS)
Moffatt, Christopher L. (Inventor)
2006-01-01
The distributed system includes a database server, an administration module, a process evaluation module, and a data display module. The administration module is in communication with the database server for providing observation criteria information to the database server. The process evaluation module is in communication with the database server for obtaining the observation criteria information from the database server and collecting process data based on the observation criteria information. The process evaluation module utilizes a personal digital assistant (PDA). A data display module, in communication with the database server, includes a website for viewing collected process data in a desired metrics form; the data display module also provides for editing and modification of the collected process data. The connectivity established by the database server to the administration module, the process evaluation module, and the data display module minimizes the requirement for manual input of the collected process data.
Nakrani, Sunil; Tovey, Craig
2007-12-01
An Internet hosting center hosts services on its server ensemble. The center must allocate servers dynamically amongst services to maximize revenue earned from hosting fees. The finite server ensemble, unpredictable request arrival behavior and server reallocation cost make server allocation optimization difficult. Server allocation closely resembles honeybee forager allocation amongst flower patches to optimize nectar influx. The resemblance inspires a honeybee biomimetic algorithm. This paper describes details of the honeybee self-organizing model in terms of information flow and feedback, analyzes the homology between the two problems and derives the resulting biomimetic algorithm for hosting centers. The algorithm is assessed for effectiveness and adaptiveness by comparative testing against benchmark and conventional algorithms. Computational results indicate that the new algorithm is highly adaptive to widely varying external environments and quite competitive against benchmark assessment algorithms. Other swarm intelligence applications are briefly surveyed, and some general speculations are offered regarding their various degrees of success.
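The authors' biomimetic algorithm is not detailed in the abstract; the toy sketch below only illustrates the honeybee-inspired idea of servers occasionally abandoning their current service and re-enlisting with probability proportional to recently observed revenue, the way foragers follow waggle dances advertising profitable patches. Service names, rates, and the switching probability are all made up.

```python
import random

# Toy honeybee-inspired allocator (illustration only, not the authors' algorithm).
def reallocate(allocation: dict, revenue_rate: dict, switch_prob: float = 0.1) -> dict:
    """allocation: server -> service; revenue_rate: service -> recent revenue per request."""
    services = list(revenue_rate)
    total = sum(revenue_rate.values()) or 1.0
    weights = [revenue_rate[s] / total for s in services]
    new_allocation = {}
    for server, service in allocation.items():
        if random.random() < switch_prob:                    # forager abandons its patch
            service = random.choices(services, weights)[0]   # follows a "waggle dance"
        new_allocation[server] = service
    return new_allocation

alloc = {f"srv{i}": "service-A" for i in range(10)}
rates = {"service-A": 0.2, "service-B": 0.8}
for _ in range(50):
    alloc = reallocate(alloc, rates)
print(alloc)  # most servers drift toward the more profitable service
```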
Smith, Nicholas; Witham, Shawn; Sarkar, Subhra; Zhang, Jie; Li, Lin; Li, Chuan; Alexov, Emil
2012-06-15
A new edition of the DelPhi web server, DelPhi web server v2, is released to include atomic presentation of geometrical figures. These geometrical objects can be used to model nano-size objects together with real biological macromolecules. The position and size of the object can be manipulated by the user in real time until desired results are achieved. The server fixes structural defects, adds hydrogen atoms and calculates electrostatic energies and the corresponding electrostatic potential and ionic distributions. The web server follows a client-server architecture built on PHP and HTML and utilizes DelPhi software. The computation is carried out on supercomputer cluster and results are given back to the user via http protocol, including the ability to visualize the structure and corresponding electrostatic potential via Jmol implementation. The DelPhi web server is available from http://compbio.clemson.edu/delphi_webserver.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-12
... Commercial and Industrial Equipment: Proposed Determination of Computer Servers as a Covered Consumer Product... comments on the proposed determination that computer servers (servers) qualify as a covered product. DATES: The comment period for the proposed determination relating to servers published on July 12, 2013 (78...
ASPEN--A Web-Based Application for Managing Student Server Accounts
ERIC Educational Resources Information Center
Sandvig, J. Christopher
2004-01-01
The growth of the Internet has greatly increased the demand for server-side programming courses at colleges and universities. Students enrolled in such courses must be provided with server-based accounts that support the technologies that they are learning. The process of creating, managing and removing large numbers of student server accounts is…
A broadcast-based key agreement scheme using set reconciliation for wireless body area networks.
Ali, Aftab; Khan, Farrukh Aslam
2014-05-01
Information and communication technologies have thrived over the last few years. Healthcare systems have also benefited from this progression. A wireless body area network (WBAN) consists of small, low-power sensors used to monitor human physiological values remotely, which enables physicians to remotely monitor the health of patients. Communication security in WBANs is essential because it involves human physiological data. Key agreement and authentication are the primary issues in the security of WBANs. To agree upon a common key, the nodes exchange information with each other using wireless communication. This information exchange must either be secure or be minimized to a level such that, if an information leak occurs, it does not affect the overall system. Most of the existing solutions for this problem exchange too much information for the sake of key agreement; getting this information is sufficient for an attacker to reproduce the key. Set reconciliation is a technique used to reconcile two similar sets held by two different hosts with minimal communication complexity. This paper presents a broadcast-based key agreement scheme using set reconciliation for secure communication in WBANs. The proposed scheme allows the neighboring nodes to agree upon a common key with the personal server (PS), generated from the electrocardiogram (EKG) feature set of the host body. Minimal information is exchanged in a broadcast manner, and even if every node is missing a different subset, by reconciling these feature sets, the whole network will still agree upon a single common key. Because of the limited information exchange, if an attacker gets the information in any way, he/she will not be able to reproduce the key. The proposed scheme mitigates replay, selective forwarding, and denial of service attacks using a challenge-response authentication mechanism. The simulation results show that the proposed scheme has a great deal of adoptability in terms of security, communication overhead, and running time complexity, as compared to the existing EKG-based key agreement scheme.
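The paper's protocol relies on EKG features and genuine set reconciliation; the toy sketch below only illustrates the end goal under simplified assumptions: two parties holding similar feature sets converge on one reconciled set and hash it into a shared key. The naive union-based "reconcile" step and the hashing choices are stand-ins, not the authors' construction (real set reconciliation needs communication proportional only to the set difference).

```python
import hashlib

def _h(x) -> str:
    return hashlib.sha256(str(x).encode()).hexdigest()

# Toy stand-in for set reconciliation (illustration only, not the paper's protocol).
def reconcile(set_a: set, set_b: set) -> set:
    return set_a | set_b          # naive reconciliation: converge on the union

def derive_key(features: set) -> str:
    return hashlib.sha256(",".join(map(_h, sorted(features))).encode()).hexdigest()

ps_features   = {101, 205, 307, 409, 512}    # personal server's EKG-derived features
node_features = {101, 205, 307, 512, 618}    # sensor missing 409, holding extra 618

common = reconcile(ps_features, node_features)
assert derive_key(common) == derive_key(reconcile(node_features, ps_features))
print(derive_key(common)[:16], "... shared key established")
```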
How to securely replicate services
NASA Technical Reports Server (NTRS)
Reiter, Michael; Birman, Kenneth
1992-01-01
A method is presented for constructing replicated services that retain their availability and integrity despite several servers and clients being corrupted by an intruder, in addition to others failing benignly. More precisely, a service is replicated by n servers in such a way that a correct client will accept a correct server's response if, for some prespecified parameter k, at least k servers are correct and fewer than k servers are corrupt. The issue of maintaining causality among client requests is also addressed. A security breach resulting from an intruder's ability to effect a violation of causality in the sequence of requests processed by the service is illustrated. An approach to counter this problem is proposed that requires fewer than k servers to be corrupt and that is live if at least k+b servers are correct, where b is the assumed maximum total number of corrupt servers in any system run. An important and novel feature of these schemes is that the client need not be able to identify or authenticate even a single server. Instead, the client is required only to possess at most two public keys for the service. The practicality of these schemes is illustrated through a discussion of several issues pertinent to their implementation and use, and their intended role in a secure version of the Isis system is also described.
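The scheme itself uses service-level keys so the client never authenticates individual servers; the sketch below illustrates only the acceptance rule in its plainest voting form, under the assumption that a response backed by at least k servers is trustworthy when at least k servers are correct and fewer than k are corrupt. It is not the paper's cryptographic construction.

```python
from collections import Counter

# Plain-voting illustration of the "accept if at least k servers agree" rule.
def accept_response(responses: list, k: int):
    """responses: values returned by the n replicas (None for no reply yet)."""
    votes = Counter(r for r in responses if r is not None)
    for value, count in votes.most_common():
        if count >= k:
            return value
    return None  # not enough agreement yet; keep waiting or retry

replies = ["balance=42", "balance=42", "balance=41", "balance=42", None]
print(accept_response(replies, k=3))   # -> "balance=42"
```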
Optimal Self-Tuning PID Controller Based on Low Power Consumption for a Server Fan Cooling System.
Lee, Chengming; Chen, Rongshun
2015-05-20
Recently, saving the cooling power in servers by controlling the fan speed has attracted considerable attention because of the increasing demand for high-density servers. This paper presents an optimal self-tuning proportional-integral-derivative (PID) controller, combining a PID neural network (PIDNN) with fan-power-based optimization in the transient-state temperature response in the time domain, for a server fan cooling system. Because the thermal model of the cooling system is nonlinear and complex, a server mockup system simulating a 1U rack server was constructed and a fan power model was created using a third-order nonlinear curve fit to determine the cooling power consumption by the fan speed control. PIDNN with a time domain criterion is used to tune all online and optimized PID gains. The proposed controller was validated through experiments of step response when the server operated from the low to high power state. The results show that up to 14% of a server's fan cooling power can be saved if the fan control permits a slight temperature response overshoot in the electronic components, which may provide a time-saving strategy for tuning the PID controller to control the server fan speed during low fan power consumption.
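The paper tunes the PID gains online with a PID neural network and a fan-power cost; that tuning is not reproduced here. The sketch below shows only the underlying discrete PID loop mapping a temperature error to a fan duty cycle, with made-up gains, setpoint, and temperature samples.

```python
# Minimal discrete PID fan-speed loop (sketch only; the paper's PIDNN-based
# online gain tuning is not modelled here).
class FanPID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint_c: float, measured_c: float) -> float:
        error = measured_c - setpoint_c          # positive when too hot
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        duty = self.kp * error + self.ki * self.integral + self.kd * derivative
        return max(0.0, min(100.0, duty))        # clamp to 0-100 % fan duty

pid = FanPID(kp=8.0, ki=0.5, kd=1.0, dt=1.0)     # illustrative gains
for temp in (70, 72, 75, 74, 71):                # hypothetical component temps, degrees C
    print(f"{temp} C -> fan duty {pid.update(65.0, temp):.1f} %")
```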
Informatics in radiology (infoRAD): A complete continuous-availability PACS archive server.
Liu, Brent J; Huang, H K; Cao, Fei; Zhou, Michael Z; Zhang, Jianguo; Mogel, Greg
2004-01-01
The operational reliability of the picture archiving and communication system (PACS) server in a filmless hospital environment is always a major concern because server failure could cripple the entire PACS operation. A simple, low-cost, continuous-availability (CA) PACS archive server was designed and developed. The server makes use of a triple modular redundancy (TMR) system with a simple majority voting logic that automatically identifies a faulty module and removes it from service. The remaining two modules continue normal operation with no adverse effects on data flow or system performance. In addition, the server is integrated with two external mass storage devices for short- and long-term storage. Evaluation and testing of the server were conducted with laboratory experiments in which hardware failures were simulated to observe recovery time and the resumption of normal data flow. The server provides maximum uptime (99.999%) for end users while ensuring the transactional integrity of all clinical PACS data. Hardware failure has only minimal impact on performance, with no interruption of clinical data flow or loss of data. As hospital PACS become more widespread, the need for CA PACS solutions will increase. A TMR CA PACS archive server can reliably help achieve CA in this setting. Copyright RSNA, 2004
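The majority-voting logic described above can be illustrated with a minimal three-module voter; this is a sketch of the TMR principle, not the product's implementation, and the example module outputs are invented.

```python
# Minimal triple-modular-redundancy voter: the majority output wins and any
# dissenting module is flagged as faulty so it can be removed from service.
def tmr_vote(outputs: list):
    assert len(outputs) == 3, "TMR needs exactly three module outputs"
    for i in range(3):
        j, k = (i + 1) % 3, (i + 2) % 3
        if outputs[j] == outputs[k]:
            majority = outputs[j]
            faulty = [i] if outputs[i] != majority else []
            return majority, faulty
    return None, [0, 1, 2]          # no two modules agree: whole unit suspect

value, faulty_modules = tmr_vote(["study-123:stored", "study-123:stored", "study-123:error"])
print(value, "| faulty module indices:", faulty_modules)
```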
Performance of a distributed superscalar storage server
NASA Technical Reports Server (NTRS)
Finestead, Arlan; Yeager, Nancy
1993-01-01
The RS/6000 performed well in our test environment. The potential exists for the RS/6000 to act as a departmental server for a small number of users, rather than as a high speed archival server. Multiple UniTree Disk Servers utilizing one UniTree Name Server could be developed, which would allow for a cost-effective archival system. Our performance tests were clearly limited by the network bandwidth. The performance gathered by the LibUnix testing shows that UniTree is capable of exceeding ethernet speeds on an RS/6000 Model 550. The performance of FTP might be significantly faster across a higher bandwidth network. The UniTree Name Server also showed signs of being a potential bottleneck. UniTree sites that would require a high ratio of file creations and deletions to reads and writes would run into this bottleneck. It is possible to improve the UniTree Name Server performance by bypassing the UniTree LibUnix Library altogether, communicating directly with the UniTree Name Server and optimizing creations. Although testing was performed in a less than ideal environment, hopefully the performance statistics stated in this paper will give end-users a realistic idea as to what performance they can expect in this type of setup.
NASA Astrophysics Data System (ADS)
Roach, Colin; Carlsson, Johan; Cary, John R.; Alexander, David A.
2002-11-01
The National Transport Code Collaboration (NTCC) has developed an array of software, including a data client/server. The data server, which is written in C++, serves local data (in the ITER Profile Database format) as well as remote data (by accessing one or several MDS+ servers). The client, a web-invocable Java applet, provides a uniform, intuitive, user-friendly, graphical interface to the data server. The uniformity of the interface relieves the user from the trouble of mastering the differences between different data formats and lets him/her focus on the essentials: plotting and viewing the data. The user runs the client by visiting a web page using any Java capable Web browser. The client is automatically downloaded and run by the browser. A reference to the data server is then retrieved via the standard Web protocol (HTTP). The communication between the client and the server is then handled by the mature, industry-standard CORBA middleware. CORBA has bindings for all common languages and many high-quality implementations are available (both Open Source and commercial). The NTCC data server has been installed at the ITPA International Multi-tokamak Confinement Profile Database, which is hosted by the UKAEA at Culham Science Centre. The installation of the data server is protected by an Internet firewall. To make it accessible to clients outside the firewall some modifications of the server were required. The working version of the ITPA confinement profile database is not open to the public. Authentication of legitimate users is done utilizing built-in Java security features to demand a password to download the client. We present an overview of the NTCC data client/server and some details of how the CORBA firewall-traversal issues were resolved and how the user authentication is implemented.
LiveBench-1: continuous benchmarking of protein structure prediction servers.
Bujnicki, J M; Elofsson, A; Fischer, D; Rychlewski, L
2001-02-01
We present a novel, continuous approach aimed at the large-scale assessment of the performance of available fold-recognition servers. Six popular servers were investigated: PDB-Blast, FFAS, T98-lib, GenTHREADER, 3D-PSSM, and INBGU. The assessment was conducted using as prediction targets a large number of selected protein structures released from October 1999 to April 2000. A target was selected if its sequence showed no significant similarity to any of the proteins previously available in the structural database. Overall, the servers were able to produce structurally similar models for one-half of the targets, but significantly accurate sequence-structure alignments were produced for only one-third of the targets. We further classified the targets into two sets: easy and hard. We found that all servers were able to find the correct answer for the vast majority of the easy targets if a structurally similar fold was present in the server's fold libraries. However, among the hard targets--where standard methods such as PSI-BLAST fail--the most sensitive fold-recognition servers were able to produce similar models for only 40% of the cases, half of which had a significantly accurate sequence-structure alignment. Among the hard targets, the presence of updated libraries appeared to be less critical for the ranking. An "ideally combined consensus" prediction, where the results of all servers are considered, would increase the percentage of correct assignments by 50%. Each server had a number of cases with a correct assignment, where the assignments of all the other servers were wrong. This emphasizes the benefits of considering more than one server in difficult prediction tasks. The LiveBench program (http://BioInfo.PL/LiveBench) is being continued, and all interested developers are cordially invited to join.
The HydroServer Platform for Sharing Hydrologic Data
NASA Astrophysics Data System (ADS)
Tarboton, D. G.; Horsburgh, J. S.; Schreuders, K.; Maidment, D. R.; Zaslavsky, I.; Valentine, D. W.
2010-12-01
The CUAHSI Hydrologic Information System (HIS) is an internet based system that supports sharing of hydrologic data. HIS consists of databases connected using the Internet through Web services, as well as software for data discovery, access, and publication. The HIS system architecture is comprised of servers for publishing and sharing data, a centralized catalog to support cross server data discovery and a desktop client to access and analyze data. This paper focuses on HydroServer, the component developed for sharing and publishing space-time hydrologic datasets. A HydroServer is a computer server that contains a collection of databases, web services, tools, and software applications that allow data producers to store, publish, and manage the data from an experimental watershed or project site. HydroServer is designed to permit publication of data as part of a distributed national/international system, while still locally managing access to the data. We describe the HydroServer architecture and software stack, including tools for managing and publishing time series data for fixed point monitoring sites as well as spatially distributed, GIS datasets that describe a particular study area, watershed, or region. HydroServer adopts a standards based approach to data publication, relying on accepted and emerging standards for data storage and transfer. CUAHSI developed HydroServer code is free with community code development managed through the codeplex open source code repository and development system. There is some reliance on widely used commercial software for general purpose and standard data publication capability. The sharing of data in a common format is one way to stimulate interdisciplinary research and collaboration. It is anticipated that the growing, distributed network of HydroServers will facilitate cross-site comparisons and large scale studies that synthesize information from diverse settings, making the network as a whole greater than the sum of its parts in advancing hydrologic research. Details of the CUAHSI HIS can be found at http://his.cuahsi.org, and HydroServer codeplex site http://hydroserver.codeplex.com.
NASA Astrophysics Data System (ADS)
Kim, M.; Yang, M. X.; Blomquist, B.; Huebert, B. J.; Bertram, T. H.
2014-12-01
Biogenic Volatile Organic Compounds (BVOCs) are reactive trace gases that impact both chemistry and climate by regulating oxidant loadings, determining secondary organic aerosol production rates as well as altering particle hygroscopicity. While continental BVOC exchange rates are well studied, global marine flux estimates are poorly constrained. In Fall 2013, a chemical-ionization time-of-flight mass spectrometer (CI-ToF-MS) utilizing benzene cations was deployed as part of the High Wind Gas Exchange Study (HiWinGs) to quantify monoterpenes, isoprene and dimethylsulfide fluxes in the remote North Atlantic. Dimethylsulfide measurements are in strong agreement with those determined by the University of Hawaii's atmospheric pressure ionization mass-spectrometer. In the remote marine boundary layer, positive monoterpene fluxes (i.e. emissions) were observed while isoprene levels rarely exceeded the detection limit.
Spatial and temporal variability in forest-atmosphere CO2 exchange
D.Y. Hollinger; J. Aber; B. Dail; E.A. Davidson; S.M. Goltz; et al.
2004-01-01
Seven years of carbon dioxide flux measurements indicate that a ~90-year-old spruce dominated forest in Maine, USA, has been sequestering 174 ± 46 g C m^-2 yr^-1 (mean ± 1 standard deviation, nocturnal friction velocity (u*) threshold > 0.25 m s^-1...
Nickel-catalyzed proton-deuterium exchange (HDX) for linkage analysis of complex carbohydrates
USDA-ARS?s Scientific Manuscript database
The structural assignment of complex carbohydrates typically requires the analysis of at least three parameters: 1. composition; 2. linkage; and 3. substituents. These are often assigned on a small scale by gas chromatography/mass spectrometry (GC/MS). Linkage positions are determined by permethylat...
Operations Technology Exchange Initiating Partnerships University Partners Government Partners Industry, boehman@umich.edu Government Leader Dr. David Gorsich Chief Scientist US Army RDECOM-TARDEC RDTA-S, Building 200A, MS 204 Warren, MI 48397-5000 586-282-7413, david.j.gorsich.civ@mail.mil Government Leader Dr
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crowder, M.; Pierce, R.
2012-08-22
H-Canyon and HB-Line are tasked with the production of PuO{sub 2} from a feed of plutonium metal. The PuO{sub 2} will provide feed material for the MOX Fuel Fabrication Facility. After dissolution of the Pu metal in H-Canyon, the solution will be transferred to HB-Line for purification by anion exchange. Subsequent unit operations include Pu(IV) oxalate precipitation, filtration and calcination to form PuO{sub 2}. This report details the results from SRNL anion exchange, precipitation, filtration, calcination, and characterization tests, as requested by HB-Line and described in the task plan. This study involved an 80-g batch of Pu and employed test conditions prototypical of HB-Line conditions, wherever feasible. In addition, this study integrated lessons learned from earlier anion exchange and precipitation and calcination studies. H-Area Engineering selected direct strike Pu(IV) oxalate precipitation to produce a more dense PuO{sub 2} product than expected from Pu(III) oxalate precipitation. One benefit of the Pu(IV) approach is that it eliminates the need for reduction by ascorbic acid. The proposed HB-Line precipitation process involves a digestion time of 5 minutes after the time (44 min) required for oxalic acid addition. These were the conditions during HB-Line production of neptunium oxide (NpO{sub 2}). In addition, a series of small Pu(IV) oxalate precipitation tests with different digestion times were conducted to better understand the effect of digestion time on particle size, filtration efficiency and other factors. To test the recommended process conditions, researchers performed two nearly-identical larger-scale precipitation and calcination tests. The calcined batches of PuO{sub 2} were characterized for density, specific surface area (SSA), particle size, moisture content, and impurities. Because the 3013 Standard requires that the calcination (or stabilization) process eliminate organics, characterization of PuO{sub 2} batches monitored the presence of oxalate by thermogravimetric analysis-mass spectrometry (TGA-MS). To use the TGA-MS for carbon or oxalate content, some method development will be required. However, the TGA-MS is already used for moisture measurements. Therefore, SRNL initiated method development for the TGA-MS to allow quantification of oxalate or total carbon. That work continues at this time and is not yet ready for use in this study. However, the collected test data can be reviewed later as those analysis tools are available.
Group-oriented coordination models for distributed client-server computing
NASA Technical Reports Server (NTRS)
Adler, Richard M.; Hughes, Craig S.
1994-01-01
This paper describes group-oriented control models for distributed client-server interactions. These models transparently coordinate requests for services that involve multiple servers, such as queries across distributed databases. Specific capabilities include: decomposing and replicating client requests; dispatching request subtasks or copies to independent, networked servers; and combining server results into a single response for the client. The control models were implemented by combining request broker and process group technologies with an object-oriented communication middleware tool. The models are illustrated in the context of a distributed operations support application for space-based systems.
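As a minimal sketch of the scatter-gather pattern the abstract describes (decompose a request, dispatch sub-requests to independent servers, combine the results), the snippet below uses a thread pool and stand-in query functions; the server names and the partitioning scheme are hypothetical, and the paper's actual broker/process-group middleware is not reproduced.

```python
from concurrent.futures import ThreadPoolExecutor

def query_server(server: str, subquery: str) -> list:
    # Stand-in for a real remote call to a networked database server.
    return [f"{server}:{subquery}:row{i}" for i in range(2)]

def distributed_query(servers: list, request: str) -> list:
    subqueries = [f"{request} PARTITION {s}" for s in servers]   # decompose
    with ThreadPoolExecutor(max_workers=len(servers)) as pool:
        partials = pool.map(query_server, servers, subqueries)   # dispatch
    return [row for part in partials for row in part]            # combine

print(distributed_query(["db-a", "db-b", "db-c"], "SELECT * FROM telemetry"))
```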
National Medical Terminology Server in Korea
NASA Astrophysics Data System (ADS)
Lee, Sungin; Song, Seung-Jae; Koh, Soonjeong; Lee, Soo Kyoung; Kim, Hong-Gee
Interoperable EHR (Electronic Health Record) necessitates at least the use of standardized medical terminologies. This paper describes a medical terminology server, LexCare Suite, which houses terminology management applications, such as a terminology editor, and a terminology repository populated with international standard terminology systems such as the Systematized Nomenclature of Medicine (SNOMED). The server is intended to satisfy the need for quality terminology systems in local primary through tertiary hospitals. Our partner general hospitals have used the server to test its applicability. This paper describes the server and the results of the applicability test.
Ellingson, David J; Shippar, Jeffrey J; Gilmore, Justin M
2016-01-01
Analytical methods for the analysis of both L-carnitine and choline are needed for reliable and accurate determination in infant formula and adult/pediatric nutritional formula. These compounds are different in how they are utilized by the human body, but are structurally similar. L-carnitine and choline are quaternary ammonium compounds, enabling both to be retained under acidic conditions with strong cation exchange (SCX) chromatography. This method analyzes both compounds simultaneously as either the free forms or as a total amount that includes bound sources such as phosphatidylcholine or acetylcarnitine. The free analysis consists of water extraction and analysis by LC/MS/MS, while the total analysis consists of extraction by acid assisted microwave hydrolysis and analysis by LC/MS/MS. Calibration standards used for calculations are extracted with all samples in the batch. A single laboratory validation (SLV) was performed following the guidelines of the AOAC Stakeholder Panel on Infant Formula and Adult Nutritionals (SPIFAN) utilizing the kit of materials provided. The results achieved meet the requirements of SMPR 2012.010 and 2012.013 for L-carnitine and total choline, respectively.
CIVET: Continuous Integration, Verification, Enhancement, and Testing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alger, Brian; Gaston, Derek R.; Permann, Cody J
A Git server (GitHub, GitLab, BitBucket) sends event notifications to the Civet server. These are either a "Pull Request" or a "Push" notification. Civet then checks the database to determine what tests need to be run and marks them as ready to run. Civet clients, running on dedicated machines, query the server for available jobs that are ready to run. When a client gets a job, it executes the scripts attached to the job and reports the output and exit status back to the server. When the client updates the server, the server will also update the Git server with the result of the job, as well as updating the main web page.
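The client polling loop described above could look roughly like the sketch below; the endpoint paths, payload fields, and token header are hypothetical placeholders, since CIVET's real client/server API is not given in the abstract.

```python
import subprocess
import time

import requests   # third-party HTTP client, assumed available

SERVER = "https://civet.example.org"          # hypothetical Civet server
TOKEN = {"Authorization": "token CLIENT-TOKEN"}

def run_once() -> None:
    # Ask the server for the next job that is marked ready to run (hypothetical endpoint).
    job = requests.get(f"{SERVER}/client/next_job", headers=TOKEN, timeout=30).json()
    if not job:
        return                                # nothing ready to run
    proc = subprocess.run(job["script"], shell=True, capture_output=True, text=True)
    # Report output and exit status back; the server then updates the Git server status.
    requests.post(f"{SERVER}/client/job_result/{job['id']}", headers=TOKEN,
                  json={"exit_status": proc.returncode,
                        "output": proc.stdout + proc.stderr},
                  timeout=30)

while True:
    run_once()
    time.sleep(10)                            # poll interval
```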
Determination of benzoylurea insecticides in food by pressurized liquid extraction and LC-MS.
Brutti, Monia; Blasco, Cristina; Picó, Yolanda
2010-01-01
A method based on pressurized liquid extraction and LC-MS/MS has been developed for determining nine benzoylureas (BUs) in fruit, vegetables, cereals, and animal products. Samples (5 g) were homogenized with diatomaceous earth and extracted in a 22 mL cell with 22 mL of ethyl acetate at 80 degrees C and 1500 psi. After solvent concentration and exchange to methanol, BUs were analyzed by LC-MS/MS using an IT mass analyzer, which achieved several transitions of the precursor ions, increasing selectivity and providing identification. LOQs were between 0.002 and 0.01 mg/kg, which are equal to or lower than the maximum residue limits established by the Codex Alimentarius. Excellent linearity was achieved over a range of concentrations from 0.01 to 1 mg/kg with correlation coefficients of 0.995-0.999 (n=7). Validation of the total method was performed by analyzing in quintuplicate seven different commodities (milk, eggs, meat, rice, lettuce, avocado, and lemon) at three concentration levels (0.01, 0.1, and 1 mg/kg). The recoveries ranged from 58 to 97% and the RSDs from 5 to 19%, depending on the compound and the commodity. The combination of pressurized liquid extraction with LC-MS/MS provides a sensitive and selective method for the determination of BUs in food.
Study of rat hypothalamic proteome by HPLC/ESI ion trap and HPLC/ESI-Q-TOF MS.
Iqbal, Javed; Li, Wang; Ullah, Kaleem; Hasan, Murtaza; Linna, Guo; Awan, Umer; Zhang, Yongqian; Batool, Sajida; Qing, Hong; Deng, Yulin
2013-08-01
The proteomic profile of hypothalamus, a key organ of CNS, is explored here by employing two widely used MS techniques, i.e. HPLC/ESI-ion trap and HPLC/ESI-quadrupole-TOF MS. Strong cation exchange is used for the fractionation of peptides and protein search engine MASCOT is employed for data query. One hundred and thirty six proteins with 10 973 peptides were identified by HPLC/ESI-ion trap MS, while 140 proteins with 32 183 peptides were characterized by HPLC/ESI-quadrupole-TOF MS. Among the total 198 proteins identified in both experiments, 78 proteins were common in both sets of conditions. The rest of the 120 proteins were identified distinctly in both MS strategies, i.e. 58 unique proteins were found using the quadrupole-TOF while 62 were found with the HPLC/ESI-ion trap. Moreover, these proteins were classified into groups based on their functions performed in the body. Results presented here identified some important signal and cellular defense proteins inevitable for survival in stressed conditions. Additionally, it is also shown that any single MS strategy is not reliable for good results due to loss of data depending on sensitivity of the instrument used. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
He, Huimin; Liu, Fengman; Li, Baoxia; Xue, Haiyun; Wang, Haidong; Qiu, Delong; Zhou, Yunyan; Cao, Liqiang
2016-11-01
With the development of the multicore processor, the bandwidth and capacity of the memory, rather than the memory area, are the key factors in server performance. At present, however, the new architectures, such as fully buffered DIMM (FBDIMM), hybrid memory cube (HMC), and high bandwidth memory (HBM), cannot be commercially applied in the server. Therefore, a new architecture for the server is proposed. CPU and memory are separated onto different boards, and optical interconnection is used for the communication between them. Each optical module corresponds to one dual inline memory module (DIMM) with 64 channels. Compared to the previous technology, not only can the architecture realize high-capacity and wide-bandwidth memory, it also can reduce power consumption and cost, and be compatible with the existing dynamic random access memory (DRAM). In this article, the proposed module with system-in-package (SiP) integration is demonstrated. The optical module includes a silicon photonic chip, which is a promising technology for next-generation data exchange centers. Due to the bandwidth-distance performance of the optical interconnection, SerDes chips are introduced to convert the 64-bit data at 800 Mbps from/to 4-channel data at 12.8 Gbps after/before they are transmitted through optical fiber. All the devices are packaged on cheap organic substrates. To ensure the performance of the whole system, several optimization efforts have been performed on the two modules. High-speed interconnection traces have been designed and simulated with electromagnetic simulation software. Steady-state thermal characteristics of the transceiver module have been evaluated by ANSYS APDL based on finite-element methodology (FEM). Heat sinks are placed at the hotspot area to ensure the reliability of all working chips. Finally, this transceiver system based on silicon photonics is measured, and the eye diagrams of data and clock signals are verified.
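The SerDes rate conversion quoted in the abstract is self-consistent, as a quick check of the aggregate bandwidth on each side shows:

```python
# Quick consistency check of the rate conversion stated in the abstract:
# 64 parallel channels at 800 Mbps versus 4 serial lanes at 12.8 Gbps.
parallel_gbps = 64 * 800e-3      # 64 channels x 0.8 Gbps = 51.2 Gbps
serial_gbps   = 4 * 12.8         # 4 lanes x 12.8 Gbps    = 51.2 Gbps
assert abs(parallel_gbps - serial_gbps) < 1e-9
print(parallel_gbps, "Gbps aggregate on both sides of the SerDes")
```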
NASA Technical Reports Server (NTRS)
Anyiwo, Joshua C.
2000-01-01
Vixen is a collection of enabling technologies for uninhibited distributed object computing. In the Spring of 1995 when Vixen was proposed, it was an innovative idea very much ahead of its time. But today the technologies proposed in Vixen have become standard technologies for Enterprise Computing. Sun Microsystems J2EE/EJB specifications, among others, are independently proposed technologies of the Vixen type. I have brought Vixen completely under the J2EE standard in order to maximize interoperability and compatibility with other computing industry efforts. Vixen and the Enterprise JavaBean (EJB) Server technologies are now practically identical; OIL, another Vixen technology, and the Java Messaging System (JMS) are practically identical; and so on. There is no longer anything novel or patentable in the Vixen work performed under this grant. The above discussion, notwithstanding, my independent development of Vixen has significantly helped me, my university, my students and the local community. The undergraduate students who worked with me in developing Vixen have enhanced their expertise in what has become the cutting edge technology of their industry and are therefore well positioned for lucrative employment opportunities in the industry. My academic department has gained a new course: "Multi-media System Development", which provides a highly desirable expertise to our students for employment in any enterprise today. The many Outreach Programs that I conducted during this grant period have exposed local Middle School students to the contributions that NASA is making in our society as well as awakened desires in many such students for careers in Science and Technology. I have applied Vixen to the development of two software packages: (a) JAS: Joshua Application Server - which allows a user to configure an EJB Server to serve a J2EE compliant application over the world wide web; (b) PCM: Professor Course Manager: a J2EE compliant application for configuring a course for distance learning. These types of applications are, however, generally available in the industry today.
Wasslen, Karl V; Tan, Le Hoa; Manthorpe, Jeffrey M; Smith, Jeffrey C
2014-04-01
Defining cellular processes relies heavily on elucidating the temporal dynamics of proteins. To this end, mass spectrometry (MS) is an extremely valuable tool; different MS-based quantitative proteomics strategies have emerged to map protein dynamics over the course of stimuli. Herein, we disclose our novel MS-based quantitative proteomics strategy with unique analytical characteristics. By passing ethereal diazomethane over peptides on strong cation exchange resin within a microfluidic device, peptides react to contain fixed, permanent positive charges. Modified peptides display improved ionization characteristics and dissociate via tandem mass spectrometry (MS(2)) to form strong a2 fragment ion peaks. Process optimization and determination of reactive functional groups enabled a priori prediction of MS(2) fragmentation patterns for modified peptides. The strategy was tested on digested bovine serum albumin (BSA) and successfully quantified a peptide that was not observable prior to modification. Our method ionizes peptides regardless of proton affinity, thus decreasing ion suppression and permitting predictable multiple reaction monitoring (MRM)-based quantitation with improved sensitivity.
Iyer, Lavanya K.; Sacha, Gregory A.; Moorthy, Balakrishnan S.; Nail, Steven L.; Topp, Elizabeth M.
2016-01-01
Myoglobin (Mb) was lyophilized in the absence (Mb-A) and presence (Mb-B) of sucrose in a pilot-scale lyophilizer with or without controlled ice nucleation. Cake morphology was characterized using scanning electron microscopy (SEM) and changes in protein structure were monitored using solid-state Fourier-transform infrared spectroscopy (ssFTIR), solid-state hydrogen-deuterium exchange-mass spectrometry (ssHDX-MS) and solid-state photolytic labeling-mass spectrometry (ssPL-MS). The results showed greater variability in nucleation temperature and irregular cake structure for formulations lyophilized without controlled nucleation. Controlled nucleation resulted in nucleation at ~ −5 °C and uniform cake structure. Formulations containing sucrose showed better retention of protein structure by all measures than formulations without sucrose. Samples lyophilized with and without controlled nucleation were similar by most measures of protein structure. However, ssPL-MS showed the greatest pLeu incorporation and more labeled regions for Mb-B lyophilized with controlled nucleation. The data support the use of ssHDX-MS and ssPL-MS to study formulation and process-induced conformational changes in lyophilized proteins. PMID:27044943
Bandu, Raju; Lee, Hyun Jeong; Lee, Hyeong Min; Ha, Tae Hyon; Lee, Heon-Jeong; Kim, Se Joo; Ha, Kyooseob; Kim, Kwang Pyo
2018-05-01
Liquid chromatography-mass spectrometry (LC-MS) method revealed the plasma metabolite profiles in major depressive disorder patients treated with escitalopram (ECTP) (n = 7). Depression severity was assessed according to the 17-item Hamilton Depression Rating Scale. Metabolic profiles were derived from major depressive disorder subject blood samples collected after ECTP treatment. Blood plasma was separated and processed in order to effectively extract metabolites, which were then analyzed using LC-MS. We identified 19 metabolites and elucidated their structures using LC-tandem MS (LC-MS/MS) combined with elemental compositions derived from accurate mass measurements. We further used online H/D exchange experiments to verify the structural elucidations of each metabolite. Identifying molecular metabolites may provide critical insights into the pharmacological and clinical effects of ECTP treatment and may also provide useful information informing the development of new antidepressant treatments. These detailed plasma metabolite analyses may also be used to identify optimal dose concentrations in psychopharmacotherapeutic treatment through drug monitoring, as well as forming the basis for response predictions in depressed subjects. Copyright © 2018 John Wiley & Sons, Ltd.
Analysis of coffee for the presence of acrylamide by LC-MS/MS.
Andrzejewski, Denis; Roach, John A G; Gay, Martha L; Musser, Steven M
2004-04-07
A variety of popular instant, ground, and brewed coffees were analyzed using a modified liquid chromatography-tandem mass spectrometry (LC-MS/MS) method specifically developed for the determination of acrylamide in foods. Coffee test portions were spiked with 13C3-labeled acrylamide as an internal standard prior to their extraction and cleanup. Ground coffees (1 g) and instant coffees (0.5 g) were extracted by shaking with 9 mL of water for 20 min. Brewed coffee test portions (9 mL) were taken through the cleanup procedure without further dilution with extraction solvent. Coffee test portions were cleaned up by passing 1.5 mL first through an Oasis HLB (hydrophilic/lipophilic copolymer sorbent) solid phase extraction (SPE) cartridge and then a Bond Elut-Accucat (cation and anion exchange sorbent) SPE cartridge. The cleaned up extracts were analyzed by positive ion electrospray LC-MS/MS. The MS/MS data was used to detect, confirm, and quantitate acrylamide. The limit of quantitation of the method was 10 ng/g for ground and instant coffees and 1.0 ng/mL for brewed coffee. The levels of acrylamide ranged from 45 to 374 ng/g in unbrewed coffee grounds, from 172 to 539 ng/g in instant coffee crystals, and from 6 to 16 ng/mL in brewed coffee.
ICM: a web server for integrated clustering of multi-dimensional biomedical data.
He, Song; He, Haochen; Xu, Wenjian; Huang, Xin; Jiang, Shuai; Li, Fei; He, Fuchu; Bo, Xiaochen
2016-07-08
Large-scale efforts for parallel acquisition of multi-omics profiling continue to generate extensive amounts of multi-dimensional biomedical data. Thus, integrated clustering of multiple types of omics data is essential for developing individual-based treatments and precision medicine. However, while rapid progress has been made, methods for integrated clustering lack an intuitive web interface that supports biomedical researchers without sufficient programming skills. Here, we present a web tool, named Integrated Clustering of Multi-dimensional biomedical data (ICM), that provides an interface from which to fuse, cluster and visualize multi-dimensional biomedical data and knowledge. With ICM, users can explore the heterogeneity of a disease or a biological process by identifying subgroups of patients. The results obtained can then be interactively modified by using an intuitive user interface. Researchers can also exchange the results from ICM with collaborators via a web link containing a Project ID number that will directly pull up the analysis results being shared. ICM also supports incremental clustering, which allows users to add new sample data to the data of a previous study to obtain a clustering result. Currently, the ICM web server is available with no login requirement and at no cost at http://biotech.bmi.ac.cn/icm/. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
Virtual Project Rooms for Education in Engineering
ERIC Educational Resources Information Center
van Vliet, Rudolf G.; Roeling, Monika M.; de Graaff, Rick; Pilot, Albert
2004-01-01
Virtual project rooms (VPRs) may support collaborative project-based learning groups by facilitating project management, documentation and communication. In this study a set of experiments was carried out at Eindhoven University of Technology using the MS Outlook/Exchange software as a groupware platform for design-oriented group projects. The…
SPECIATION AND PRESERVATION OF INORGANIC ARSENIC IN DRINKING WATER SUPPLIES WITH IC-ICP-MS
The speciation of inorganic arsenic in drinking water supplies is an essential part of devising an appropriate treatment process. Arsenate, because of its anion characteristics at drinking water pHs, is effectively removed by anion exchange treatment while arsenite remains in the...
NASA Technical Reports Server (NTRS)
Plesea, Lucian; Wood, James F.
2012-01-01
This software is a simple, yet flexible server of raster map products, compliant with the Open Geospatial Consortium (OGC) Web Map Service (WMS) 1.1.1 protocol. The server is a full implementation of the OGC WMS 1.1.1 as a fastCGI client and using Geospatial Data Abstraction Library (GDAL) for data access. The server can operate in a proxy mode, where all or part of the WMS requests are done on a back server. The server has explicit support for a colocated tiled WMS, including rapid response of black (no-data) requests. It generates JPEG and PNG images, including 16-bit PNG. The GDAL back-end support allows great flexibility on the data access. The server is a port to a Linux/GDAL platform from the original IRIX/IL platform. It is simpler to configure and use, and depending on the storage format used, it has better performance than other available implementations. The WMS server 2.0 is a high-performance WMS implementation due to the fastCGI architecture. The use of GDAL data back end allows for great flexibility. The configuration is relatively simple, based on a single XML file. It provides scaling and cropping, as well as blending of multiple layers based on layer transparency.
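The kind of request such an OGC WMS 1.1.1 server answers can be illustrated by constructing a GetMap URL with the standard WMS 1.1.1 parameters; the host, layer name, and bounding box below are placeholders, not values from this server's configuration.

```python
from urllib.parse import urlencode

# Example OGC WMS 1.1.1 GetMap request (host, layer, and extent are illustrative).
params = {
    "SERVICE": "WMS",
    "VERSION": "1.1.1",
    "REQUEST": "GetMap",
    "LAYERS": "global_mosaic",
    "STYLES": "",
    "SRS": "EPSG:4326",
    "BBOX": "-180,-90,180,90",       # minx,miny,maxx,maxy
    "WIDTH": "1024",
    "HEIGHT": "512",
    "FORMAT": "image/jpeg",
}
print("http://wms.example.org/wms?" + urlencode(params))
```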
Virtual network computing: cross-platform remote display and collaboration software.
Konerding, D E
1999-04-01
VNC (Virtual Network Computing) is a computer program written to address the problem of cross-platform remote desktop/application display. VNC uses a client/server model in which an image of the desktop of the server is transmitted to the client and displayed. The client collects mouse and keyboard input from the user and transmits them back to the server. The VNC client and server can run on Windows 95/98/NT, MacOS, and Unix (including Linux) operating systems. VNC is multi-user on Unix machines (any number of servers can be run, and they are unrelated to the primary display of the computer), while it is effectively single-user on Macintosh and Windows machines (only one server can be run, displaying the contents of the primary display of the server). The VNC servers can be configured to allow more than one client to connect at one time, effectively allowing collaboration through the shared desktop. I describe the function of VNC, provide details of installation, describe how it achieves its goal, and evaluate the use of VNC for molecular modelling. VNC is an extremely useful tool for collaboration, instruction, software development, and debugging of graphical programs with remote users.
How to securely replicate services (preliminary version)
NASA Technical Reports Server (NTRS)
Reiter, Michael; Birman, Kenneth
1992-01-01
A method is presented for constructing replicated services that retain their availability and integrity despite several servers and clients being corrupted by an intruder, in addition to others failing benignly. More precisely, a service is replicated by 'n' servers in such a way that a correct client will accept a correct server's response if, for some prespecified parameter, k, at least k servers are correct and fewer than k servers are corrupt. The issue of maintaining causality among client requests is also addressed. A security breach resulting from an intruder's ability to effect a violation of causality in the sequence of requests processed by the service is illustrated. An approach to counter this problem is proposed that requires that fewer than k servers are corrupt and, to ensure liveness, that k is less than or equal to n - 2t, where t is the assumed maximum total number of both corruptions and benign failures suffered by servers in any system run. An important and novel feature of these schemes is that the client need not be able to identify or authenticate even a single server. Instead, the client is required only to possess at most two public keys for the service.
NASA Astrophysics Data System (ADS)
Faden, J.; Vandegriff, J. D.; Weigel, R. S.
2016-12-01
Autoplot was introduced in 2008 as an easy-to-use plotting tool for the space physics community. It reads data from a variety of file resources, such as CDF and HDF files, and from a number of specialized data servers, such as the PDS/PPI's DIT-DOS, CDAWeb, and the University of Iowa's RPWG Das2Server. Each of these servers has optimized methods for transmitting data to display in Autoplot, but requires coordination and specialized software to work, limiting Autoplot's ability to access new servers and datasets. Likewise, groups who would like software to access their APIs must either write their own clients or publish a specification document in hopes that people will write clients. The HAPI specification was written so that a simple, standard API could be used by both Autoplot and server implementations, to remove these barriers to the free flow of time series data. Autoplot's software for communicating with HAPI servers is presented, showing the user interface scientists will use, and how data servers might implement the HAPI specification to provide access to their data. This will also include instructions on how Autoplot is installed on desktop computers and used to view data from the RBSP, Juno, and other missions.
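A client-side request against such a server might look like the sketch below, assuming HAPI 2.x-style endpoints (/hapi/catalog, /hapi/info, /hapi/data) and parameter names; the server URL and the choice of dataset are placeholders, not a real mission archive.

```python
import requests   # third-party HTTP client, assumed available

SERVER = "https://hapi.example.org/hapi"    # hypothetical HAPI server base URL

# List the datasets the server offers, then ask for one dataset's parameters.
catalog = requests.get(f"{SERVER}/catalog", timeout=30).json()
dataset = catalog["catalog"][0]["id"]
info = requests.get(f"{SERVER}/info", params={"id": dataset}, timeout=30).json()
print([p["name"] for p in info["parameters"]])

# Fetch one day of data as CSV over the standard /data endpoint.
csv_data = requests.get(f"{SERVER}/data",
                        params={"id": dataset,
                                "time.min": "2016-01-01T00:00:00Z",
                                "time.max": "2016-01-02T00:00:00Z",
                                "format": "csv"},
                        timeout=60).text
```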
Providing Internet Access to High-Resolution Mars Images
NASA Technical Reports Server (NTRS)
Plesea, Lucian
2008-01-01
The OnMars server is a computer program that provides Internet access to high-resolution Mars images, maps, and elevation data, all suitable for use in geographical information system (GIS) software for generating images, maps, and computational models of Mars. The OnMars server is an implementation of the Open Geospatial Consortium (OGC) Web Map Service (WMS) server. Unlike other Mars Internet map servers that provide Martian data using an Earth coordinate system, the OnMars WMS server supports encoding of data in Mars-specific coordinate systems. The OnMars server offers access to most of the available high-resolution Martian image and elevation data, including an 8-meter-per-pixel uncontrolled mosaic of most of the Mars Global Surveyor (MGS) Mars Observer Camera Narrow Angle (MOCNA) image collection, which is not available elsewhere. This server can generate image and map files in the tagged image file format (TIFF), Joint Photographic Experts Group (JPEG), 8- or 16-bit Portable Network Graphics (PNG), or Keyhole Markup Language (KML) format. Image control is provided by use of the OGC Style Layer Descriptor (SLD) protocol. The OnMars server also implements tiled WMS protocol and super-overlay KML for high-performance client application programs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bent, John M.; Faibish, Sorin; Pedone, Jr., James M.
A cluster file system is provided having a plurality of distributed metadata servers with shared access to one or more shared low latency persistent key-value metadata stores. A metadata server comprises an abstract storage interface comprising a software interface module that communicates with at least one shared persistent key-value metadata store providing a key-value interface for persistent storage of key-value metadata. The software interface module provides the key-value metadata to the at least one shared persistent key-value metadata store in a key-value format. The shared persistent key-value metadata store is accessed by a plurality of metadata servers. A metadata request can be processed by a given metadata server independently of other metadata servers in the cluster file system. A distributed metadata storage environment is also disclosed that comprises a plurality of metadata servers having an abstract storage interface to at least one shared persistent key-value metadata store.
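A minimal sketch of the abstract-storage-interface idea follows; the class and method names are hypothetical, and a dict-backed store stands in for the shared low-latency persistent key-value store the patent describes.

```python
from abc import ABC, abstractmethod

# Sketch only: metadata servers talk to whatever shared key-value store backs
# them through one small key-value interface.
class KeyValueMetadataStore(ABC):
    @abstractmethod
    def put(self, key: str, value: bytes) -> None: ...
    @abstractmethod
    def get(self, key: str) -> bytes: ...

class InMemoryStore(KeyValueMetadataStore):
    """Dict-backed stand-in for a shared low-latency persistent store."""
    def __init__(self):
        self._data = {}
    def put(self, key: str, value: bytes) -> None:
        self._data[key] = value
    def get(self, key: str) -> bytes:
        return self._data[key]

class MetadataServer:
    def __init__(self, store: KeyValueMetadataStore):
        self.store = store                       # abstract storage interface
    def create_inode(self, path: str, attrs: bytes) -> None:
        self.store.put(f"inode:{path}", attrs)   # metadata kept as key-value pairs

shared = InMemoryStore()
MetadataServer(shared).create_inode("/data/run42", b"mode=0644,size=0")
print(shared.get("inode:/data/run42"))
```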
An assessment of burn prevention knowledge in a high burn-risk environment: restaurants.
Piazza-Waggoner, Carrie; Adams, C D; Goldfarb, I W; Slater, H
2002-01-01
Our facility has seen an increase in the number of cases of children burned in restaurants. Fieldwork has revealed many unsafe serving practices in restaurants in our tristate area. The current research targets what appears to be an underexamined burn-risk environment, restaurants, to examine server knowledge about burn prevention and burn care with customers. Participants included 71 local restaurant servers and 53 servers from various restaurants who were recruited from undergraduate courses. All participants completed a brief demographic form as well as a Burn Knowledge Questionnaire. It was found that server knowledge was low (ie, less than 50% accuracy). Yet, most servers reported that they felt customer burn safety was important enough to change the way that they serve. Additionally, it was found that length of time employed as a server was a significant predictor of servers' burn knowledge (ie, more years serving associated with higher knowledge). Finally, individual items were examined to identify potential targets for developing prevention programs.
Horton, John J.
2006-04-11
A system and method of maintaining communication between a computer and a server, the server being in communication with the computer via xDSL service or dial-up modem service, with xDSL service being the default mode of communication, the method including sending a request to the server via xDSL service to which the server should respond and determining if a response has been received. If no response has been received, displaying on the computer a message (i) indicating that xDSL service has failed and (ii) offering to establish communication between the computer and the server via the dial-up modem, and thereafter changing the default mode of communication between the computer and the server to dial-up modem service. In a preferred embodiment, an xDSL service provider monitors dial-up modem communications and determines if the computer dialing in normally establishes communication with the server via xDSL service. The xDSL service provider can thus quickly and easily detect xDSL failures.
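The failover decision described above can be sketched as a simple probe-and-switch routine; the hostname, port, and the absence of an actual dial-up hook are placeholders, and the provider-side monitoring described in the preferred embodiment is not modelled.

```python
import socket

DEFAULT_MODE = "xdsl"    # xDSL is the default mode of communication

def probe_server(host: str, port: int = 80, timeout: float = 5.0) -> bool:
    """Send a request the server should respond to; report whether it did."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def check_link(host: str = "server.example.net") -> str:
    global DEFAULT_MODE
    if DEFAULT_MODE == "xdsl" and not probe_server(host):
        print("xDSL service appears to have failed; "
              "offering to connect via dial-up modem instead.")
        DEFAULT_MODE = "dialup"          # dial-up becomes the new default mode
    return DEFAULT_MODE

print(check_link())
```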
Rclick: a web server for comparison of RNA 3D structures.
Nguyen, Minh N; Verma, Chandra
2015-03-15
RNA molecules play important roles in key biological processes in the cell and are becoming attractive for developing therapeutic applications. Since the function of RNA depends on its structure and dynamics, comparing and classifying the RNA 3D structures is of crucial importance to molecular biology. In this study, we have developed Rclick, a web server that is capable of superimposing RNA 3D structures by using clique matching and 3D least-squares fitting. Our server Rclick has been benchmarked and compared with other popular servers and methods for RNA structural alignments. In most cases, Rclick alignments were better in terms of structure overlap. Our server also recognizes conformational changes between structures. For this purpose, the server produces complementary alignments to maximize the extent of detectable similarity. Various examples showcase the utility of our web server for comparison of RNA, RNA-protein complexes and RNA-ligand structures. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
A General Purpose Connections type CTI Server Based on SIP Protocol and Its Implementation
NASA Astrophysics Data System (ADS)
Watanabe, Toru; Koizumi, Hisao
In this paper, we propose a general-purpose connections-type CTI (Computer Telephony Integration) server that provides various CTI services such as voice logging. The CTI server communicates with an IP-PBX using SIP (Session Initiation Protocol) and accumulates the voice packets of external-line telephone calls flowing between an extension IP telephone and a VoIP gateway connected to outside line networks. The CTI server realizes CTI services such as voice logging, telephone conferencing, or IVR (interactive voice response) by accumulating and processing the sampled voice packets. Furthermore, the CTI server incorporates a web server function which can provide various CTI services such as a Web telephone directory via a Web browser to PCs, cellular telephones or smart-phones in mobile environments.
The Ophidia Stack: Toward Large Scale, Big Data Analytics Experiments for Climate Change
NASA Astrophysics Data System (ADS)
Fiore, S.; Williams, D. N.; D'Anca, A.; Nassisi, P.; Aloisio, G.
2015-12-01
The Ophidia project is a research effort on big data analytics facing scientific data analysis challenges in multiple domains (e.g. climate change). It provides a "datacube-oriented" framework responsible for atomically processing and manipulating scientific datasets, by providing a common way to run distributive tasks on large set of data fragments (chunks). Ophidia provides declarative, server-side, and parallel data analysis, jointly with an internal storage model able to efficiently deal with multidimensional data and a hierarchical data organization to manage large data volumes. The project relies on a strong background on high performance database management and On-Line Analytical Processing (OLAP) systems to manage large scientific datasets. The Ophidia analytics platform provides several data operators to manipulate datacubes (about 50), and array-based primitives (more than 100) to perform data analysis on large scientific data arrays. To address interoperability, Ophidia provides multiple server interfaces (e.g. OGC-WPS). From a client standpoint, a Python interface enables the exploitation of the framework into Python-based eco-systems/applications (e.g. IPython) and the straightforward adoption of a strong set of related libraries (e.g. SciPy, NumPy). The talk will highlight a key feature of the Ophidia framework stack: the "Analytics Workflow Management System" (AWfMS). The Ophidia AWfMS coordinates, orchestrates, optimises and monitors the execution of multiple scientific data analytics and visualization tasks, thus supporting "complex analytics experiments". Some real use cases related to the CMIP5 experiment will be discussed. In particular, with regard to the "Climate models intercomparison data analysis" case study proposed in the EU H2020 INDIGO-DataCloud project, workflows related to (i) anomalies, (ii) trend, and (iii) climate change signal analysis will be presented. Such workflows will be distributed across multiple sites - according to the datasets distribution - and will include intercomparison, ensemble, and outlier analysis. The two-level workflow solution envisioned in INDIGO (coarse grain for distributed tasks orchestration, and fine grain, at the level of a single data analytics cluster instance) will be presented and discussed.
Implementing TCP/IP and a socket interface as a server in a message-passing operating system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hipp, E.; Wiltzius, D.
1990-03-01
The UNICOS 4.3BSD network code and socket transport interface are the basis of an explicit network server for NLTSS, a message passing operating system on the Cray YMP. A BSD socket user library provides access to the network server using an RPC mechanism. The advantages of this server methodology are its modularity and extensibility to migrate to future protocol suites (e.g. OSI) and transport interfaces. In addition, the network server is implemented in an explicit multi-tasking environment to take advantage of the Cray YMP multi-processor platform. 19 refs., 5 figs.
Single-server blind quantum computation with quantum circuit model
NASA Astrophysics Data System (ADS)
Zhang, Xiaoqian; Weng, Jian; Li, Xiaochun; Luo, Weiqi; Tan, Xiaoqing; Song, Tingting
2018-06-01
Blind quantum computation (BQC) enables a client, who has few quantum technologies, to delegate her quantum computation to a server, who has strong quantum computational capabilities but learns nothing about the client's quantum inputs, outputs and algorithms. In this article, we propose a single-server BQC protocol with the quantum circuit model by replacing any quantum gate with a combination of rotation operators. Trap quantum circuits are introduced, together with the combination of rotation operators, so that the server learns nothing about the quantum algorithms. The client only needs to perform operations X and Z, while the server honestly performs rotation operators.
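As background for the gate-replacement step, any single-qubit unitary can be written as a product of rotation operators; one standard Z-Y-Z decomposition is shown below. This particular parameterization is included for illustration and is not necessarily the one used in the protocol.

```latex
% Standard Z-Y-Z Euler decomposition of an arbitrary single-qubit unitary;
% background only, not necessarily the parameterization used in the paper.
U = e^{i\alpha}\, R_z(\beta)\, R_y(\gamma)\, R_z(\delta),
\qquad
R_z(\theta) = e^{-i\theta Z/2}, \quad
R_y(\theta) = e^{-i\theta Y/2}.
```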
An Evaluation of Alternative Designs for a Grid Information Service
NASA Technical Reports Server (NTRS)
Smith, Warren; Waheed, Abdul; Meyers, David; Yan, Jerry; Kwak, Dochan (Technical Monitor)
2001-01-01
The Globus information service was not working well: the many data updates from Globus daemons saturated the single server, and users could not retrieve information. We created a second server for NASA and Alliance. Performance was good on that server but remained slow on the other one. We needed to know exactly how the information service was being used, and what the best servers and configurations were. This viewgraph presentation gives an overview of the evaluation of alternative designs for a Grid Information Service. Details are given on the workload characterization, methodology used, and the performance evaluation.
Measurement of in-vehicle volatile organic compounds under static conditions.
You, Ke-wei; Ge, Yun-shan; Hu, Bin; Ning, Zhan-wu; Zhao, Shou-tang; Zhang, Yan-ni; Xie, Peng
2007-01-01
The types and quantities of volatile organic compounds (VOCs) inside vehicles have been determined in one new vehicle and two old vehicles under static conditions using the Thermodesorber-Gas Chromatograph/Mass Spectrometer (TD-GC/MS). Air sampling and analysis was conducted under the requirement of USEPA Method TO-17. A room-size, environment test chamber was utilized to provide stable and accurate control of the required environmental conditions (temperature, humidity, horizontal and vertical airflow velocity, and background VOCs concentration). Static vehicle testing demonstrated that although the amount of total volatile organic compounds (TVOC) detected within each vehicle was relatively distinct (4940 microg/m3 in the new vehicle A, 1240 microg/m3 in used vehicle B, and 132 microg/m3 in used vehicle C), toluene, xylene, some aromatic compounds, and various C7-C12 alkanes were among the predominant VOC species in all three vehicles tested. In addition, tetramethyl succinonitrile, possibly derived from foam cushions was detected in vehicle B. The types and quantities of VOCs varied considerably according to various kinds of factors, such as, vehicle age, vehicle model, temperature, air exchange rate, and environment airflow velocity. For example, if the airflow velocity increases from 0.1 m/s to 0.7 m/s, the vehicle's air exchange rate increases from 0.15 h(-1) to 0.67 h(-1), and in-vehicle TVOC concentration decreases from 1780 to 1201 microg/m3.
Setup Instructions for the Applied Anomaly Detection Tool (AADT) Web Server
2016-09-01
ARL-TR-7798 ● SEP 2016 US Army Research Laboratory Setup Instructions for the Applied Anomaly Detection Tool (AADT) Web Server...for the Applied Anomaly Detection Tool (AADT) Web Server by Christian D Schlesiger Computational and Information Sciences Directorate, ARL...SUBTITLE Setup Instructions for the Applied Anomaly Detection Tool (AADT) Web Server 5a. CONTRACT NUMBER 5b. GRANT NUMBER 5c. PROGRAM ELEMENT
PREDICT: Privacy and Security Enhancing Dynamic Information Monitoring
2015-08-03
consisting of global server-side probabilistic assignment by an untrusted server using cloaked locations, followed by feedback-loop guided local...12], consisting of global server-side probabilistic assignment by an untrusted server using cloaked locations, followed by feedback-loop guided...these methods achieve high sensing coverage with low cost using cloaked locations [3]. In follow-on work, the issue of mobility is addressed. Task
Performance Modeling of the ADA Rendezvous
1991-10-01
queueing network of figure 2, SERVERTASK can complete only one rendezvous at a time. Thus, the rate that the rendezvous requests are processed at the... Network 1, SERVERTASK competes with the traffic tasks of Server Processor. Each time SERVERTASK gains access to the processor, SERVERTASK completes...Client Processor Server Processor Software Server Nek Netork2 Figure 10. A conceptualization of the algorithm. The SERVERTASK software server of Network 2
Remote Adaptive Communication System
2001-10-25
manage several different devices using the software tool A. Client /Server Architecture The architecture we are proposing is based on the Client ...Server model (see figure 3). We want both client and server to be accessible from anywhere via internet. The computer, acting as a server, is in...the other hand, each of the client applications will act as sender or receiver, depending on the associated interface: user interface or device
Database architectures for Space Telescope Science Institute
NASA Astrophysics Data System (ADS)
Lubow, Stephen
1993-08-01
At STScI nearly all large applications require database support. A general purpose architecture has been developed and is in use that relies upon an extended client-server paradigm. Processing is in general distributed across three processes, each of which generally resides on its own processor. Database queries are evaluated on one such process, called the DBMS server. The DBMS server software is provided by a database vendor. The application issues database queries and is called the application client. This client uses a set of generic DBMS application programming calls through our STDB/NET programming interface. Intermediate between the application client and the DBMS server is the STDB/NET server. This server accepts generic query requests from the application and converts them into the specific requirements of the DBMS server. In addition, it accepts query results from the DBMS server and passes them back to the application. Typically the STDB/NET server is local to the DBMS server, while the application client may be remote. The STDB/NET server provides additional capabilities such as database deadlock restart and performance monitoring. This architecture is currently in use for some major STScI applications, including the ground support system. We are currently investigating means of providing ad hoc query support to users through the above architecture. Such support is critical for providing flexible user interface capabilities. The Universal Relation advocated by Ullman, Kernighan, and others appears to be promising. In this approach, the user sees the entire database as a single table, thereby freeing the user from needing to understand the detailed schema. A software layer provides the translation between the user and detailed schema views of the database. However, many subtle issues arise in making this transformation. We are currently exploring this scheme for use in the Hubble Space Telescope user interface to the data archive system (DADS).
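A generic sketch of the three-process pattern described above (application client, intermediate STDB/NET-style server, vendor DBMS server) is given below in Python. The class names and the toy query translation are hypothetical and do not reflect the actual STDB/NET interface.

```python
# Generic sketch of the three-tier pattern: an application client sends
# generic queries to an intermediate server, which translates them for the
# vendor DBMS. All class names are hypothetical.
class DBMSServer:
    """Stands in for the vendor database engine."""
    def execute(self, vendor_sql: str) -> list:
        return [("row1",), ("row2",)]  # placeholder result set


class IntermediateServer:
    """Converts generic requests into vendor-specific ones and relays results."""
    def __init__(self, dbms: DBMSServer):
        self.dbms = dbms

    def query(self, generic_request: str) -> list:
        vendor_sql = generic_request.replace("FETCH", "SELECT")  # toy translation
        return self.dbms.execute(vendor_sql)


class ApplicationClient:
    """Issues generic queries through a DBMS-agnostic programming interface."""
    def __init__(self, server: IntermediateServer):
        self.server = server

    def run(self, request: str) -> list:
        return self.server.query(request)


client = ApplicationClient(IntermediateServer(DBMSServer()))
print(client.run("FETCH name FROM targets"))
```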
Karakawa, Sachise; Shimbo, Kazutaka; Yamada, Naoyuki; Mizukoshi, Toshimi; Miyano, Hiroshi; Mita, Masashi; Lindner, Wolfgang; Hamase, Kenji
2015-11-10
A highly sensitive and selective chiral LC-MS/MS method for D-alanine, D-aspartic acid and D-serine has been developed using the precolumn derivatization reagents 6-aminoquinolyl-N-hydroxysuccinimidyl carbamate (AccQ-Tag) or p-N,N,N-trimethylammonioanilyl N'-hydroxysuccinimidyl carbamate iodide (TAHS). The N-tagged enantiomers of the derivatized amino acids were well separated within 20 min using the cinchona alkaloid-based zwitterionic ion-exchange type enantioselective column, Chiralpak ZWIX(+). Selected reaction monitoring was applied for detecting the target D-amino acids in biological matrices. Using the present chiral LC-MS/MS method, the three D-amino acids and their L-forms could be simultaneously determined in the range of 0.1-500 nmol/mL. Finally, the technique was successfully applied to rat plasma and tissue samples. Copyright © 2015 Elsevier B.V. All rights reserved.
Rajabi, Khadijeh
2015-01-01
A pulsed hydrogen/deuterium exchange (HDX) method has been developed for rapid monitoring of the exchange kinetics of protein ions with D2O a few milliseconds after electrospray ionization (ESI). The stepwise gradual evolution of HDX of multiply charged protein ions was monitored using the pulsed HDX mass spectrometry technique. Upon introducing a very short pulse of D2O (in the μs to ms time scale) into the linear ion trap (LIT) of a time-of-flight (TOF) mass spectrometer, bimodal distributions were detected for the ions of cytochrome c and ubiquitin. Mechanistic details of HDX reactions for ubiquitin and cytochrome c in the gas phase were uncovered and the structural transitions were followed by analyzing the kinetics of HDX.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gallimore, David L.
2012-06-13
The measurement uncertainty estimation associated with trace element analysis of impurities in U and Pu was evaluated using the Guide to the Expression of Uncertainty in Measurement (GUM). In this evaluation the uncertainty sources were identified and the standard uncertainties for the components were categorized as either Type A or B. The combined standard uncertainty was calculated and a coverage factor k = 2 was applied to obtain the expanded uncertainty, U. The ICP-AES and ICP-MS methods used were developed for the multi-element analysis of U and Pu samples. A typical analytical run consists of standards, process blanks, samples, matrix spiked samples, post digestion spiked samples and independent calibration verification standards. The uncertainty estimation was performed on U and Pu samples that have been analyzed previously as part of the U and Pu Sample Exchange Programs. Control chart results and data from the U and Pu metal exchange programs were combined with the GUM into a concentration dependent estimate of the expanded uncertainty. Trace element uncertainties obtained using this model were compared to those obtained for trace element results as part of the Exchange programs. This process was completed for all trace elements that were determined to be above the detection limit for the U and Pu samples.
Pérez-Méndez, A; Chandler, J C; Bisha, B; Goodridge, L D
2014-08-01
Enteric viral contaminants in water represent a public health concern, thus methods for detecting these viruses or their indicator microorganisms are needed. Because enteric viruses and their viral indicators are often found at low concentrations in water, their detection requires upfront concentration methods. In this study, a strong basic anion exchange resin was evaluated as an adsorbent material for the concentration of F-RNA coliphages (MS2, Qβ, GA, and HB-P22). These coliphages are recognized as enteric virus surrogates and fecal indicator organisms. Following adsorption of the coliphages from 50ml water samples, direct RNA isolation and real time RT-PCR detection were performed. In water samples containing 10(5)pfu/ml of the F-RNA coliphages, the anion exchange resin (IRA-900) adsorbed over 96.7% of the coliphages present, improving real time RT-PCR detection by 5-7 cycles compared to direct testing. F-RNA coliphage RNA recovery using the integrated method ranged from 12.6% to 77.1%. Resin-based concentration of samples with low levels of the F-RNA coliphages allowed for 10(0)pfu/ml (MS2 and Qβ) and 10(-1)pfu/ml (GA and HB-P22) to be detected. The resin-based method offers considerable advantages in cost, speed, simplicity and field adaptability. Copyright © 2014 Elsevier B.V. All rights reserved.
Enhanced networked server management with random remote backups
NASA Astrophysics Data System (ADS)
Kim, Song-Kyoo
2003-08-01
In this paper, the model is focused on available server management in network environments. The (remote) backup servers are hooked up by VPN (Virtual Private Network) and replace broken main servers immediately. A virtual private network (VPN) is a way to use a public network infrastructure and hooks up long-distance servers within a single network infrastructure. The servers can be represented as "machines", and the system then deals with an unreliable main machine and random auxiliary spare (remote backup) machines. When the system performs mandatory routine maintenance, auxiliary machines are used for backups during idle periods. Unlike other existing models, the availability of auxiliary machines changes for each activation in this enhanced model. Analytically tractable results are obtained by using several mathematical techniques and the results are demonstrated in the framework of optimized networked server allocation problems.
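The availability gain from random remote backups can be illustrated with a toy Monte Carlo estimate, sketched below in Python under assumed up-probabilities; this is only an illustration of the idea, not the paper's analytical model.

```python
# Toy Monte Carlo sketch of a main server backed by remote spares over VPN;
# the probabilities are assumptions, not values from the paper.
import random


def availability(p_main_up=0.95, p_backup_up=0.80, n_backups=2, trials=100_000):
    served = 0
    for _ in range(trials):
        if random.random() < p_main_up:
            served += 1                       # main server handles the load
        elif any(random.random() < p_backup_up for _ in range(n_backups)):
            served += 1                       # a remote backup takes over
    return served / trials


print(f"estimated availability: {availability():.4f}")
```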
Stachowicz, Aneta; Siudut, Jakub; Suski, Maciej; Olszanecki, Rafał; Korbut, Ryszard; Undas, Anetta; Wiśniewski, Jacek R
2017-01-01
It is well known that fibrin network binds a large variety of proteins, including inhibitors and activators of fibrinolysis, which may affect clot properties, such as stability and susceptibility to fibrinolysis. Specific plasma clot composition differs between individuals and may change in disease states. However, the plasma clot proteome has not yet been in-depth analyzed, mainly due to technical difficulty related to the presence of a highly abundant protein-fibrinogen and fibrin that forms a plasma clot. The aim of our study was to optimize quantitative proteomic analysis of fibrin clots prepared ex vivo from citrated plasma of the peripheral blood drawn from patients with prior venous thromboembolism (VTE). We used a multiple enzyme digestion filter aided sample preparation, a multienzyme digestion (MED) FASP method combined with LC-MS/MS analysis performed on a Proxeon Easy-nLC System coupled to the Q Exactive HF mass spectrometer. We also evaluated the impact of peptide fractionation with pipet-tip strong anion exchange (SAX) method on the obtained results. Our proteomic approach revealed 476 proteins repeatedly identified in the plasma fibrin clots from patients with VTE including extracellular vesicle-derived proteins, lipoproteins, fibrinolysis inhibitors, and proteins involved in immune responses. The MED FASP method using three different enzymes: LysC, trypsin and chymotrypsin increased the number of identified peptides and proteins and their sequence coverage as compared to a single step digestion. Peptide fractionation with a pipet-tip strong anion exchange (SAX) protocol increased the depth of proteomic analyses, but also extended the time needed for sample analysis with LC-MS/MS. The MED FASP method combined with a label-free quantification is an excellent proteomic approach for the analysis of fibrin clots prepared ex vivo from citrated plasma of patients with prior VTE.
Ganranoo, Lucksagoon; Mishra, Santosh K; Azad, Abul K; Shigihara, Ado; Dasgupta, Purnendu K; Breitbach, Zachary S; Armstrong, Daniel W; Grudpan, Kate; Rappenglueck, Bernhard
2010-07-01
We report a novel system to analyze atmospheric nitrophenols (NPs). Rain or air sample extracts (1 mL) are preconcentrated on a narrow bore (2 mm) aliphatic anion exchanger. In the absence of strong retention of NPs exhibited by aromatic ion exchangers, retained NPs are eluted as a plug by injection of 100 microL of 0.1 M Na(2)SO(4) on to a short (2 x 50 mm) reverse phase C-18 column packed with 2.2 mum particles. The salt plug passes through the C-18 column unretained while the NPs are separated by an ammonium acetate buffered methanol-water eluent, compatible with mass spectrometry (MS). The eluted NPs are measured with a long path Teflon AF-based liquid core waveguide (0.15 x 1420 mm) illuminated by a 403 nm light emitting diode and detected by a monolithic photodiode-operational amplifier. The waveguide is rendered chemically active by suspending it over concentrated ammonia that permeates into the lumen. The NPs ionize to the yellow anion form (lambda(max) approximately 400 nm). The separation of 4-nitrophenol, 2,4-dinitrophenol, 2-methyl-4-nitrophenol, 3-methyl-4-nitrophenol, and 2-nitrophenol (these are the dominant NPs, typically in that order, in both rain and air of Houston and Arlington, TX, confirmed by tandem MS) takes just over 5 min with respective S/N = 3 limits of detection (LODs) of 60, 12, 30, 67, and 23 pg/mL compared to MS/MS LODs of 20, 49, 11, 20, and 210 pg/mL. Illustrative air and rain data are presented.
Britton, David; Zen, Yoh; Quaglia, Alberto; Selzer, Stefan; Mitra, Vikram; Löβner, Christopher; Jung, Stephan; Böhm, Gitte; Schmid, Peter; Prefot, Petra; Hoehle, Claudia; Koncarevic, Sasa; Gee, Julia; Nicholson, Robert; Ward, Malcolm; Castellano, Leandro; Stebbing, Justin; Zucht, Hans Dieter; Sarker, Debashis; Heaton, Nigel; Pike, Ian
2014-01-01
LC-MS/MS phospho-proteomics is an essential technology to help unravel the complex molecular events that lead to and propagate cancer. We have developed a global phospho-proteomic workflow to determine activity of signaling pathways and drug targets in pancreatic cancer tissue for clinical application. Peptides resulting from tryptic digestion of proteins extracted from frozen tissue of pancreatic ductal adenocarcinoma and background pancreas (n = 12), were labelled with tandem mass tags (TMT 8-plex), separated by strong cation exchange chromatography, then were analysed by LC-MS/MS directly or first enriched for phosphopeptides using IMAC and TiO2, prior to analysis. In-house, commercial and freeware bioinformatic platforms were used to identify relevant biological events from the complex dataset. Of 2,101 proteins identified, 152 demonstrated significant difference in abundance between tumor and non-tumor tissue. They included proteins that are known to be up-regulated in pancreatic cancer (e.g. Mucin-1), but the majority were new candidate markers such as HIPK1 & MLCK. Of the 6,543 unique phosphopeptides identified (6,284 unique phosphorylation sites), 635 showed significant regulation, particularly those from proteins involved in cell migration (Rho guanine nucleotide exchange factors & MRCKα) and formation of focal adhesions. Activator phosphorylation sites on FYN, AKT1, ERK2, HDAC1 and other drug targets were found to be highly modulated (≥2 fold) in different cases highlighting their predictive power. Here we provided critical information enabling us to identify the common and unique molecular events likely contributing to cancer in each case. Such information may be used to help predict more bespoke therapy suitable for an individual case.
Ahmad, Mahtab; Lee, Sang Soo; Lim, Jung Eun; Lee, Sung-Eun; Cho, Ju Sik; Moon, Deok Hyun; Hashimoto, Yohey; Ok, Yong Sik
2014-01-01
Mussel shell (MS), cow bone (CB) and biochar (BC) were selected to immobilize metals in an army firing range soil. Amendments were applied at 5% (wt) and their efficacies were determined after 175 d. For metal phytoavailability test, maize (Zea mays L.) plants were cultivated for 3weeks. Results showed that all amendments decreased the exchangeable Pb by up to 99% in planted/unplanted soils. Contrarily, exchangeable Sb were increased in the MS- and CB-amended soils. The rise in soil pH (~1 unit) by the amendments affected Pb and Sb mobility in soils. Bioavailability of Pb to maize was reduced by up to 71% in the amended soils. The Sb uptake to maize was decreased by up to 53.44% in the BC-amended soil. Sequential chemical extractions showed the transformation of easily available Pb to stable residual form with the amendment treatments. Scanning electron microscopic elemental dot mapping revealed the Pb association with Al and Si in the MS-amended soil and that with P in the CB- and BC-amended soils. Additionally, the extended X-ray absorption fine structure spectroscopic analysis indicated the transformation of organic bound Pb in unamended control soil to relatively more stable Pb-hydroxide (Ksp=10(-17.1)), chloropyromorphite (Ksp=10(-84.4)) and Pb-phosphate (Ksp=10(-23.8)) in soils amended with MS, CB and BC, respectively. Application of BC was the best in decreasing the phytoavailability of Pb and Sb in the studied army firing range soil. Copyright © 2013 Elsevier Ltd. All rights reserved.
Fang, Zhi; He, Chen; Li, Yongyong; Chung, Keng H; Xu, Chunming; Shi, Quan
2017-01-01
Although the progress of high resolution mass spectrometry in the past decade has enabled the molecular characterization of dissolved organic matter (DOM) in water as a whole, fractionation of DOM is necessary for a comprehensive characterization due to its super-complex nature. Here we propose a method for the fractionation of DOM in a wastewater based on solubility and acidic-basic properties. Solid phase extraction (SPE) cartridges with reversed phase retention and ion-exchange adsorption capacities, namely MAX and MCX, were used in succession to fractionate a petroleum refinery wastewater into four fractions: hydrophobic acid (HOA), hydrophobic neutral (HON), hydrophobic base (HOB), and hydrophilic substance (HIS) fractions. According to the total organic carbon (TOC) analysis, 72.6% (in terms of TOC) of the DOM was extracted in hydrophobic fractions, of which HON was the most abundant. Hydrophobic extracts were characterized by negative and positive ion electrospray (ESI) Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR MS), respectively. Compounds with multiple oxygen atoms were predominant in the HOA and responded strongly in the negative ion ESI MS. Nitrogen containing compounds were the major species detected by positive ion ESI in all hydrophobic fractions. The molecular composition of the DOM was discussed based on the FT-ICR MS results. The fractionation provided salt-free samples, which enables the direct analysis of the fractions by ESI and a deep insight into the molecular composition of DOM in the wastewater. The method has potential for routine evaluation of DOM in industrial wastewaters, as well as environmental water samples. Copyright © 2016. Published by Elsevier B.V.
Analysis of carbohydrates by anion exchange chromatography and mass spectrometry.
Bruggink, Cees; Maurer, Rolf; Herrmann, Heiko; Cavalli, Silvano; Hoefler, Frank
2005-08-26
A versatile liquid chromatographic platform has been developed for analysing underivatized carbohydrates using high performance anion exchange chromatography (HPAEC) followed by an inert PEEK splitter that splits the effluent to the integrated pulsed amperometric detector (IPAD) and to an on-line single quadrupole mass spectrometer (MS). Common eluents for HPAEC such as sodium hydroxide and sodium acetate are beneficial for the amperometric detection but not compatible with electrospray ionisation (ESI). Therefore a membrane-desalting device was installed after the splitter and prior to the ESI interface, converting sodium hydroxide into water and sodium acetate into acetic acid. To enhance the sensitivity of the MS detection, 0.5 mmol/l lithium chloride was added after the membrane desalter to form lithium adducts of the carbohydrates. To compare the sensitivity of IPAD and MS detection, glucose, fructose, and sucrose were used as analytes. A calibration with external standards from 2.5 to 1000 pmole was performed, showing a linear range over three orders of magnitude. Minimum detection limits (MDLs) with IPAD, determined at the 5 pmole level, were 0.12 pmole for glucose, 0.22 pmole for fructose and 0.11 pmole for sucrose. With MS detection in the selected ion mode (SIM) the lithium adducts of the carbohydrates were detected, obtaining MDLs for glucose of 1.49 pmole, fructose 1.19 pmole, and sucrose 0.36 pmole, showing that under these conditions IPAD is 3-10 times more sensitive for those carbohydrates. The applicability of the method was demonstrated by analysing carbohydrates in real world samples such as chicory inulin, where polyfructans up to a molecular mass of 7000 g/mol were detected as quadruply charged lithium adducts. Furthermore mono-, di-, tri-, and oligosaccharides were detected in chicory coffee, honey and beer samples.
Assessing Server Fault Tolerance and Disaster Recovery Implementation in Thin Client Architectures
2007-09-01
server • Windows 2003 server Processor AMD Geode GX Memory 512MB Flash/256MB DDR RAM I/O/Peripheral Support • VGA-type video output (DB-15...2000 Advanced Server Processor AMD Geode NX 1500 Memory • 256MB or 512MB or 1GB DDR SDRAM • 1GB or 512MB Flash I/O/Peripheral Support • SiS741 GX
Accountable Information Flow for Java-Based Web Applications
2010-01-01
runtime library Swift server runtime Java servlet framework HTTP Web server Web browser Figure 2: The Swift architecture introduced an open-ended...On the server, the Java application code links against Swift’s server-side run-time library, which in turn sits on top of the standard Java servlet ...AFRL-RI-RS-TR-2010-9 Final Technical Report January 2010 ACCOUNTABLE INFORMATION FLOW FOR JAVA -BASED WEB APPLICATIONS
NASA Astrophysics Data System (ADS)
Niranjan, S. P.; Chandrasekaran, V. M.; Indhira, K.
2018-04-01
This paper examines a bulk arrival and batch service queueing system with functioning server failure and multiple vacations. Customers arrive into the system in bulk according to a Poisson process with rate λ. Arriving customers are served in batches of minimum 'a' and maximum 'b' customers according to the general bulk service rule. At a service completion epoch, if the queue length is less than 'a', the server leaves for a vacation (secondary job) of random length. After a vacation completion, if the queue length is still less than 'a', the server leaves for another vacation. The server keeps going on vacation until the queue length reaches the value 'a'. The server is not reliable at all times: it may fail while serving customers. Even if the server fails, the service process is not interrupted; it continues for the current batch of customers at a lower service rate than the regular service rate. The server is repaired after the service completion at the lower rate. The probability generating function of the queue size at an arbitrary time epoch is obtained for the modelled queueing system by using the supplementary variable technique. Moreover, various performance characteristics are derived with suitable numerical illustrations.
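A minimal simulation sketch of the general bulk service (a, b) rule with multiple vacations is given below; the rates and bulk sizes are assumptions, and the failure mode with a degraded service rate is omitted for brevity, so this is only a rough companion to the analytical model above.

```python
# Rough discrete-event sketch of the general bulk-service (a, b) rule with
# multiple vacations; rates, bulk sizes and the omission of server failures
# are simplifying assumptions, not the paper's exact model.
import random

LAM, MU, VAC_RATE = 1.0, 1.0, 0.8   # arrival, service and vacation rates (assumed)
A, B = 3, 5                          # general bulk service rule thresholds


def arrivals_during(duration: float) -> int:
    """Customers arriving in a window: Poisson epochs, bulks of 1-4 each."""
    n, t = 0, 0.0
    while True:
        t += random.expovariate(LAM)
        if t > duration:
            return n
        n += random.randint(1, 4)


def simulate(horizon: float = 10_000.0) -> float:
    queue, served, t = 0, 0, 0.0
    while t < horizon:
        if queue < A:                          # not enough work: take a vacation
            v = random.expovariate(VAC_RATE)
            queue += arrivals_during(v)
            t += v
        else:                                  # serve a batch of size in [A, B]
            batch = min(queue, B)
            s = random.expovariate(MU)
            queue += arrivals_during(s) - batch
            served += batch
            t += s
    return served / horizon                    # rough throughput estimate


print(f"throughput estimate: {simulate():.3f}")
```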
RNAiFold: a web server for RNA inverse folding and molecular design.
Garcia-Martin, Juan Antonio; Clote, Peter; Dotu, Ivan
2013-07-01
Synthetic biology and nanotechnology are poised to make revolutionary contributions to the 21st century. In this article, we describe a new web server to support in silico RNA molecular design. Given an input target RNA secondary structure, together with optional constraints, such as requiring GC-content to lie within a certain range, requiring the number of strong (GC), weak (AU) and wobble (GU) base pairs to lie in a certain range, the RNAiFold web server determines one or more RNA sequences, whose minimum free-energy secondary structure is the target structure. RNAiFold provides access to two servers: RNA-CPdesign, which applies constraint programming, and RNA-LNSdesign, which applies the large neighborhood search heuristic; hence, it is suitable for larger input structures. Both servers can also solve the RNA inverse hybridization problem, i.e. given a representation of the desired hybridization structure, RNAiFold returns two sequences, whose minimum free-energy hybridization is the input target structure. The web server is publicly accessible at http://bioinformatics.bc.edu/clotelab/RNAiFold, which provides access to two specialized servers: RNA-CPdesign and RNA-LNSdesign. Source code for the underlying algorithms, implemented in COMET and supported on linux, can be downloaded at the server website.
STELAR: An experiment in the electronic distribution of astronomical literature
NASA Technical Reports Server (NTRS)
Warnock, A.; Vansteenburg, M. E.; Brotzman, L. E.; Gass, J.; Kovalsky, D.
1992-01-01
STELAR (Study of Electronic Literature for Astronomical Research) is a Goddard-based project designed to test methods of delivering technical literature in machine readable form. To that end, we have scanned a five year span of the ApJ, ApJ Supp, AJ and PASP, and have obtained abstracts for eight leading academic journals from NASA/STI CASI, which also makes these abstracts available through the NASA RECON system. We have also obtained machine readable versions of some journal volumes from the publishers, although in many instances, the final typeset versions are no longer available. The fundamental data object for the STELAR database is the article, a collection of items associated with a scientific paper - abstract, scanned pages (in a variety of formats), figures, OCR extractions, forward and backward references, errata and versions of the paper in various formats (e.g., TEX, SGML, PostScript, DVI). Articles are uniquely referenced in the database by journal name, volume number and page number. The selection and delivery of articles is accomplished through the WAIS (Wide Area Information Server) client/server model, requiring only an Internet connection. Modest modifications to the server code have made it capable of delivering the multiple data types required by STELAR. WAIS is a platform independent and fully open multi-disciplinary delivery system, originally developed by Thinking Machines Corp. and made available free of charge. It is based on the ISO Z39.50 standard communications protocol. WAIS servers run under both UNIX and VMS. WAIS clients run on a wide variety of machines, from UNIX-based X Windows systems to MS-DOS and Macintosh microcomputers. The WAIS system includes full-text indexing and searching of documents, a network interface and easy access to a variety of document viewers. ASCII versions of the CASI abstracts have been formatted for display and the full text of the abstracts has been indexed. The entire WAIS database of abstracts is now available for use by the astronomical community. Enhancements of the search and retrieval system are under investigation to include specialized searches (by reference, author or keyword, as opposed to full-text searches), improved handling of word stems, improvements in relevancy criteria and other retrieval techniques, such as factor spaces. The STELAR project has been assisted by the full cooperation of the AAS, the ASP, the publishers of the academic journals, librarians from GSFC, NRAO and STScI, the Library of Congress, and the University of North Carolina at Chapel Hill.
Telerobotic control of a mobile coordinated robotic server. M.S. Thesis Annual Technical Report
NASA Technical Reports Server (NTRS)
Lee, Gordon
1993-01-01
The annual report on telerobotic control of a mobile coordinated robotic server is presented. The goal of this effort is to develop advanced control methods for flexible space manipulator systems. As such, an adaptive fuzzy logic controller was developed in which neither a model structure nor parameter constraints are required for compensation. The work builds upon previous work on fuzzy logic controllers. Fuzzy logic controllers have been growing in importance in the field of automatic feedback control. Hardware controllers using fuzzy logic have become available as an alternative to the traditional PID controllers. Software has also been introduced to aid in the development of fuzzy logic rule-bases. The advantages of using fuzzy logic controllers include the ability to merge the experience and intuition of expert operators into the rule-base and the fact that a model of the system is not required to construct the controller. A drawback of the classical fuzzy logic controller, however, is the many parameters that need to be tuned off-line prior to application in the closed loop. In this report, an adaptive fuzzy logic controller is developed that requires no system model or model structure. The rule-base is defined to approximate a state-feedback controller, while a second fuzzy logic algorithm varies, on-line, the parameters of the defining controller. Results indicate the approach is viable for on-line adaptive control of systems when the model is too complex or uncertain for application of other more classical control techniques.
Setting an Upper Limit on Gas Exchange Through Sea-Spray
NASA Astrophysics Data System (ADS)
Vlahos, P.; Monahan, E. C.; Andreas, E. L.
2016-02-01
Air-sea gas exchange parameterization is critical to understanding both climate forcing and feedbacks and is key in biogeochemical cycles. Models based on wind speed have provided useful empirical estimates of gas exchange, though it is likely that at high wind speeds of over 10 m/s there are important contributors to gas exchange, including bubbles and sea spray, that have not been well constrained. Here we address the sea-spray component of gas exchange at these high wind speeds to set an upper boundary condition for the gas exchange of six model gases: the noble gases helium, neon and argon; the diatomic gases nitrogen and oxygen; and, finally, the more complex gas carbon dioxide. Estimates are based on the spray generation function of Andreas and Monahan, and the gases are tested under three scenarios: (1) 100 percent saturation and complete droplet evaporation; (2) 100 percent saturation; and (3) a more realistic scenario in which a fraction of droplets evaporate completely, a fraction evaporate to some degree, and a fraction returns to the water side without significant evaporation. Finally, the latter scenario is applied to representative undersaturated concentrations of the gases.
Konc, Janez; Janezic, Dusanka
2012-07-01
The ProBiS web server is a web server for detection of structurally similar binding sites in the PDB and for local pairwise alignment of protein structures. In this article, we present a new version of the ProBiS web server that is 10 times faster than earlier versions, due to the efficient parallelization of the ProBiS algorithm, which now allows significantly faster comparison of a protein query against the PDB and reduces the calculation time for scanning the entire PDB from hours to minutes. It also features new web services, and an improved user interface. In addition, the new web server is united with the ProBiS-Database and thus provides instant access to pre-calculated protein similarity profiles for over 29 000 non-redundant protein structures. The ProBiS web server is particularly adept at detection of secondary binding sites in proteins. It is freely available at http://probis.cmm.ki.si/old-version, and the new ProBiS web server is at http://probis.cmm.ki.si.
An Application Server for Scientific Collaboration
NASA Astrophysics Data System (ADS)
Cary, John R.; Luetkemeyer, Kelly G.
1998-11-01
Tech-X Corporation has developed SciChat, an application server for scientific collaboration. Connections are made to the server through a Java client, that can either be an application or an applet served in a web page. Once connected, the client may choose to start or join a session. A session includes not only other clients, but also an application. Any client can send a command to the application. This command is executed on the server and echoed to all clients. The results of the command, whether numerical or graphical, are then distributed to all of the clients; thus, multiple clients can interact collaboratively with a single application. The client is developed in Java, the server in C++, and the middleware is the Common Object Request Broker Architecture. In this system, the Graphical User Interface processing is on the client machine, so one does not have the disadvantages of insufficient bandwidth as occurs when running X over the internet. Because the server, client, and middleware are object oriented, new types of servers and clients specialized to particular scientific applications are more easily developed.
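The collaboration pattern described above (any client submits a command, the server executes it once, and the result is echoed to every client in the session) can be mocked in a few lines of Python, as below. The real system uses Java clients, a C++ server and CORBA; the names here are illustrative only.

```python
# Tiny in-process mock of the session model: one shared application, commands
# executed once on the server, results echoed to all connected clients.
class Session:
    def __init__(self, application):
        self.application = application        # shared scientific application
        self.clients = []

    def join(self, client):
        self.clients.append(client)

    def submit(self, sender, command):
        result = self.application(command)    # executed once, on the server
        for c in self.clients:                # echoed to every participant
            c.receive(sender, command, result)


class Client:
    def __init__(self, name):
        self.name = name

    def receive(self, sender, command, result):
        print(f"[{self.name}] {sender} ran {command!r} -> {result}")


session = Session(application=lambda cmd: f"output of {cmd}")
alice, bob = Client("alice"), Client("bob")
session.join(alice)
session.join(bob)
session.submit("alice", "plot density")
```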
Web Service Distributed Management Framework for Autonomic Server Virtualization
NASA Astrophysics Data System (ADS)
Solomon, Bogdan; Ionescu, Dan; Litoiu, Marin; Mihaescu, Mircea
Virtualization for the x86 platform has recently established itself as a technology that can improve the usage of machines in data centers and decrease the cost and energy of running a large number of servers. Similarly to virtualization, autonomic computing, and more specifically self-optimization, aims to improve server farm usage through provisioning and deprovisioning of instances as needed by the system. Autonomic systems are able to determine the optimal number of server machines - real or virtual - to use at a given time, and add or remove servers from a cluster in order to achieve optimal usage. While provisioning and deprovisioning of servers is very important, the way the autonomic system is built is also very important, as a robust and open framework is needed. One such management framework is the Web Service Distributed Management (WSDM) system, which is an open standard of the Organization for the Advancement of Structured Information Standards (OASIS). This paper presents an open framework built on top of the WSDM specification, which aims to provide self-optimization for application servers residing on virtual machines.
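A minimal control-loop sketch of the self-optimization idea (provision or deprovision virtual server instances based on measured utilization) follows; the thresholds and the function itself are assumptions, not part of the WSDM specification or the framework described above.

```python
# Minimal sketch of an autonomic scaling decision: measure utilization, then
# provision or deprovision virtual server instances. Thresholds are assumed.
def autoscale(current_instances: int, avg_utilization: float,
              low: float = 0.3, high: float = 0.8,
              min_instances: int = 1, max_instances: int = 16) -> int:
    """Return the new target number of (virtual) server instances."""
    if avg_utilization > high and current_instances < max_instances:
        return current_instances + 1      # provision one more instance
    if avg_utilization < low and current_instances > min_instances:
        return current_instances - 1      # deprovision an idle instance
    return current_instances              # already near optimal usage


print(autoscale(current_instances=4, avg_utilization=0.9))  # -> 5
print(autoscale(current_instances=4, avg_utilization=0.2))  # -> 3
```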
Hybrid Rendering with Scheduling under Uncertainty
Tamm, Georg; Krüger, Jens
2014-01-01
As scientific data of increasing size is generated by today’s simulations and measurements, utilizing dedicated server resources to process the visualization pipeline becomes necessary. In a purely server-based approach, requirements on the client-side are minimal as the client only displays results received from the server. However, the client may have a considerable amount of hardware available, which is left idle. Further, the visualization is put at the whim of possibly unreliable server and network conditions. Server load, bandwidth and latency may substantially affect the response time on the client. In this paper, we describe a hybrid method, where visualization workload is assigned to server and client. A capable client can produce images independently. The goal is to determine a workload schedule that enables a synergy between the two sides to provide rendering results to the user as fast as possible. The schedule is determined based on processing and transfer timings obtained at runtime. Our probabilistic scheduler adapts to changing conditions by shifting workload between server and client, and accounts for the performance variability in the dynamic system. PMID:25309115
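The scheduling decision at the core of the hybrid approach can be sketched as below: keep running estimates of server render, transfer and client render times, and place the next frame wherever the expected response time is lower. The paper's scheduler is probabilistic; this greedy, exponentially smoothed version is a simplified assumption.

```python
# Simplified placement decision for hybrid rendering: compare estimated
# server-side cost (render + transfer) with local rendering cost, using
# runtime timings. The paper's actual scheduler is probabilistic.
class HybridScheduler:
    def __init__(self, alpha=0.3):
        self.alpha = alpha
        self.est = {"server_render": 30.0, "transfer": 20.0, "client_render": 60.0}

    def observe(self, key: str, ms: float) -> None:
        # Exponentially smoothed estimates adapt to changing conditions.
        self.est[key] = (1 - self.alpha) * self.est[key] + self.alpha * ms

    def place_next_frame(self) -> str:
        server_cost = self.est["server_render"] + self.est["transfer"]
        return "server" if server_cost < self.est["client_render"] else "client"


sched = HybridScheduler()
sched.observe("transfer", 80.0)          # network got slower
print(sched.place_next_frame())          # likely "client" now
```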
Boulos, Maged N Kamel; Honda, Kiyoshi
2006-01-01
Open Source Web GIS software systems have reached a stage of maturity, sophistication, robustness and stability, and usability and user friendliness rivalling that of commercial, proprietary GIS and Web GIS server products. The Open Source Web GIS community is also actively embracing OGC (Open Geospatial Consortium) standards, including WMS (Web Map Service). WMS enables the creation of Web maps that have layers coming from multiple different remote servers/sources. In this article we present one easy to implement Web GIS server solution that is based on the Open Source University of Minnesota (UMN) MapServer. By following the accompanying step-by-step tutorial instructions, interested readers running mainstream Microsoft® Windows machines and with no prior technical experience in Web GIS or Internet map servers will be able to publish their own health maps on the Web and add to those maps additional layers retrieved from remote WMS servers. The 'digital Asia' and 2004 Indian Ocean tsunami experiences in using free Open Source Web GIS software are also briefly described. PMID:16420699
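For readers unfamiliar with WMS, the sketch below builds a standard OGC WMS GetMap request of the kind a UMN MapServer instance serves; the host, map file and layer name are hypothetical placeholders.

```python
# Builds a standard OGC WMS 1.1.1 GetMap request URL; host, map file and
# layer name are hypothetical placeholders, not from the article.
from urllib.parse import urlencode

params = {
    "SERVICE": "WMS",
    "VERSION": "1.1.1",
    "REQUEST": "GetMap",
    "LAYERS": "health_facilities",        # hypothetical layer name
    "SRS": "EPSG:4326",
    "BBOX": "95.0,-10.0,105.0,10.0",      # minx,miny,maxx,maxy
    "WIDTH": "800",
    "HEIGHT": "600",
    "FORMAT": "image/png",
}
url = "http://example.org/cgi-bin/mapserv?map=/maps/health.map&" + urlencode(params)
print(url)  # paste into a browser, or add this layer to another WMS client
```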
Konc, Janez; Janežič, Dušanka
2012-01-01
The ProBiS web server is a web server for detection of structurally similar binding sites in the PDB and for local pairwise alignment of protein structures. In this article, we present a new version of the ProBiS web server that is 10 times faster than earlier versions, due to the efficient parallelization of the ProBiS algorithm, which now allows significantly faster comparison of a protein query against the PDB and reduces the calculation time for scanning the entire PDB from hours to minutes. It also features new web services, and an improved user interface. In addition, the new web server is united with the ProBiS-Database and thus provides instant access to pre-calculated protein similarity profiles for over 29 000 non-redundant protein structures. The ProBiS web server is particularly adept at detection of secondary binding sites in proteins. It is freely available at http://probis.cmm.ki.si/old-version, and the new ProBiS web server is at http://probis.cmm.ki.si. PMID:22600737
R3D Align web server for global nucleotide to nucleotide alignments of RNA 3D structures.
Rahrig, Ryan R; Petrov, Anton I; Leontis, Neocles B; Zirbel, Craig L
2013-07-01
The R3D Align web server provides online access to 'RNA 3D Align' (R3D Align), a method for producing accurate nucleotide-level structural alignments of RNA 3D structures. The web server provides a streamlined and intuitive interface, input data validation and output that is more extensive and easier to read and interpret than related servers. The R3D Align web server offers a unique Gallery of Featured Alignments, providing immediate access to pre-computed alignments of large RNA 3D structures, including all ribosomal RNAs, as well as guidance on effective use of the server and interpretation of the output. By accessing the non-redundant lists of RNA 3D structures provided by the Bowling Green State University RNA group, R3D Align connects users to structure files in the same equivalence class and the best-modeled representative structure from each group. The R3D Align web server is freely accessible at http://rna.bgsu.edu/r3dalign/.
Innovative methods in soil phosphorus research: A review
Kruse, Jens; Abraham, Marion; Amelung, Wulf; Baum, Christel; Bol, Roland; Kühn, Oliver; Lewandowski, Hans; Niederberger, Jörg; Oelmann, Yvonne; Rüger, Christopher; Santner, Jakob; Siebers, Meike; Siebers, Nina; Spohn, Marie; Vestergren, Johan; Vogts, Angela; Leinweber, Peter
2015-01-01
Phosphorus (P) is an indispensable element for all life on Earth and, during the past decade, concerns about the future of its global supply have stimulated much research on soil P and method development. This review provides an overview of advanced state-of-the-art methods currently used in soil P research. These involve bulk and spatially resolved spectroscopic and spectrometric P speciation methods (1 and 2D NMR, IR, Raman, Q-TOF MS/MS, high resolution-MS, NanoSIMS, XRF, XPS, (µ)XAS) as well as methods for assessing soil P reactions (sorption isotherms, quantum-chemical modeling, microbial biomass P, enzymes activity, DGT, 33P isotopic exchange, 18O isotope ratios). Required experimental set-ups and the potentials and limitations of individual methods present a guide for the selection of most suitable methods or combinations. PMID:26167132
Susskind, Alex M; Kacmar, K Michele; Borchgrevink, Carl P
2003-02-01
The authors proposed and tested a model describing the relationship between customer service providers' perceptions and attitudes toward their service-related duties and their customers' perceptions of satisfaction with their service experiences. Results indicated that the perception of having standards for service delivery in an organization is strongly related to line-level employees' perceptions of support from coworkers and supervisors. Perceived support from coworkers was significantly related to service providers' customer orientation, whereas perceived support from supervisors showed a weaker relationship to a customer orientation. Ultimately, service providers' customer orientation was strongly related to customers' satisfaction with service. Finally, a set of post hoc analyses indicated that coworker and supervisory support explained a greater proportion of incremental variance in the model than did perceived organizational support alone.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, J.; Bauer, K.; Borga, A.
The ATLAS Phase-I upgrade (2019) requires a Trigger and Data Acquisition (TDAQ) system able to trigger and record data from up to three times the nominal LHC instantaneous luminosity. Furthermore, the Front-End LInk eXchange (FELIX) system provides an infrastructure to achieve this in a scalable, detector agnostic and easily upgradeable way. It is a PC-based gateway, interfacing custom radiation tolerant optical links from front-end electronics, via PCIe Gen3 cards, to a commodity switched Ethernet or InfiniBand network. FELIX enables reducing custom electronics in favour of software running on commercial servers. Here, the FELIX system, the design of the PCIe prototype card and the integration test results are presented.
Anderson, J.; Bauer, K.; Borga, A.; ...
2016-12-13
The ATLAS Phase-I upgrade (2019) requires a Trigger and Data Acquisition (TDAQ) system able to trigger and record data from up to three times the nominal LHC instantaneous luminosity. Furthermore, the Front-End LInk eXchange (FELIX) system provides an infrastructure to achieve this in a scalable, detector agnostic and easily upgradeable way. It is a PC-based gateway, interfacing custom radiation tolerant optical links from front-end electronics, via PCIe Gen3 cards, to a commodity switched Ethernet or InfiniBand network. FELIX enables reducing custom electronics in favour of software running on commercial servers. Here, the FELIX system, the design of the PCIe prototype card and the integration test results are presented.
Wang, Shuang; Jiang, Xiaoqian; Wu, Yuan; Cui, Lijuan; Cheng, Samuel; Ohno-Machado, Lucila
2013-01-01
We developed an EXpectation Propagation LOgistic REgRession (EXPLORER) model for distributed privacy-preserving online learning. The proposed framework provides a high level guarantee for protecting sensitive information, since the information exchanged between the server and the client is the encrypted posterior distribution of coefficients. Through experimental results, EXPLORER shows the same performance (e.g., discrimination, calibration, feature selection etc.) as the traditional frequentist Logistic Regression model, but provides more flexibility in model updating. That is, EXPLORER can be updated one point at a time rather than having to retrain the entire data set when new observations are recorded. The proposed EXPLORER supports asynchronized communication, which relieves the participants from coordinating with one another, and prevents service breakdown from the absence of participants or interrupted communications. PMID:23562651
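The one-point-at-a-time updating highlighted above can be illustrated with a simple Bayesian online logistic regression sketch; the diagonal-Gaussian, Laplace-style update below is a simplification for illustration and is not the EXPLORER algorithm, which uses expectation propagation over encrypted posteriors.

```python
# Illustration of one-point-at-a-time Bayesian updating of logistic-regression
# coefficients; this diagonal-Gaussian, Laplace-style update is a simplified
# stand-in, not the EXPLORER expectation-propagation algorithm.
import numpy as np


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


def online_update(mean, var, x, y, steps=25, lr=0.1):
    """Refine a diagonal Gaussian posterior (mean, var) with one labeled point."""
    m = mean.copy()
    for _ in range(steps):
        p = sigmoid(x @ m)
        grad = (y - p) * x - (m - mean) / var   # likelihood term + prior pull
        m += lr * grad
    p = sigmoid(x @ m)
    new_var = 1.0 / (1.0 / var + p * (1 - p) * x ** 2)  # diagonal curvature update
    return m, new_var


mean, var = np.zeros(3), np.ones(3)
x, y = np.array([1.0, 0.5, -1.2]), 1
mean, var = online_update(mean, var, x, y)      # no retraining on past data needed
print(mean, var)
```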
SAIDE: A Semi-Automated Interface for Hydrogen/Deuterium Exchange Mass Spectrometry.
Villar, Maria T; Miller, Danny E; Fenton, Aron W; Artigues, Antonio
2010-01-01
Deuterium/hydrogen exchange in combination with mass spectrometry (DH MS) is a sensitive technique for detection of changes in protein conformation and dynamics. Since temperature, pH and timing control are the key elements for reliable and efficient measurement of hydrogen/deuterium content in proteins and peptides, we have developed a small, semiautomatic interface for deuterium exchange that interfaces the HPLC pumps with a mass spectrometer. This interface is relatively inexpensive to build, and provides efficient temperature and timing control in all stages of enzyme digestion, HPLC separation and mass analysis of the resulting peptides. We have tested this system with a series of standard tryptic peptides reconstituted in a solvent containing increasing concentration of deuterium. Our results demonstrate the use of this interface results in minimal loss of deuterium due to back exchange during HPLC desalting and separation. For peptides reconstituted in a buffer containing 100% deuterium, and assuming that all amide linkages have exchanged hydrogen with deuterium, the maximum loss of deuterium content is only 17% of the label, indicating the loss of only one deuterium molecule per peptide.
SAIDE: A Semi-Automated Interface for Hydrogen/Deuterium Exchange Mass Spectrometry
Villar, Maria T.; Miller, Danny E.; Fenton, Aron W.; Artigues, Antonio
2011-01-01
Deuterium/hydrogen exchange in combination with mass spectrometry (DH MS) is a sensitive technique for detection of changes in protein conformation and dynamics. Since temperature, pH and timing control are the key elements for reliable and efficient measurement of hydrogen/deuterium content in proteins and peptides, we have developed a small, semiautomatic interface for deuterium exchange that interfaces the HPLC pumps with a mass spectrometer. This interface is relatively inexpensive to build, and provides efficient temperature and timing control in all stages of enzyme digestion, HPLC separation and mass analysis of the resulting peptides. We have tested this system with a series of standard tryptic peptides reconstituted in a solvent containing increasing concentration of deuterium. Our results demonstrate the use of this interface results in minimal loss of deuterium due to back exchange during HPLC desalting and separation. For peptides reconstituted in a buffer containing 100% deuterium, and assuming that all amide linkages have exchanged hydrogen with deuterium, the maximum loss of deuterium content is only 17% of the label, indicating the loss of only one deuterium molecule per peptide. PMID:25309638
NASA Astrophysics Data System (ADS)
Chursin, Alexei A.; Jacquinet-Husson, N.; Lefevre, G.; Scott, Noelle A.; Chedin, Alain
2000-01-01
This paper presents the recently developed facilities for distributing the information content of the databank, e.g. the GEISA WWW server and the MS-DOS, Windows 95/NT, and UNIX software packages, associated with the 1997 version of the GEISA (Gestion et Etude des Informations Spectroscopiques Atmospheriques; in translation: Management and Study of Atmospheric Spectroscopic Information) infrared spectroscopic databank developed at LMD (Laboratoire de Meteorologie Dynamique, France). The GEISA-97 individual lines file covers 42 molecules (96 isotopic species) and contains 1,346,266 entries between 0 and 22,656 cm-1. GEISA-97 also has a catalog of cross-sections at different temperatures and pressures for species (such as chlorofluorocarbons) with complex spectra. The current version of the GEISA-97 cross-section databank contains 4,716,743 entries related to 23 molecules between 555 and 1700 cm-1.
3DScapeCS: application of three dimensional, parallel, dynamic network visualization in Cytoscape
2013-01-01
Background The exponential growth of gigantic biological data from various sources, such as protein-protein interaction (PPI), genome sequences scaffolding, Mass spectrometry (MS) molecular networking and metabolic flux, demands an efficient way for better visualization and interpretation beyond the conventional, two-dimensional visualization tools. Results We developed a 3D Cytoscape Client/Server (3DScapeCS) plugin, which adopted Cytoscape in interpreting different types of data, and UbiGraph for three-dimensional visualization. The extra dimension is useful in accommodating, visualizing, and distinguishing large-scale networks with multiple crossed connections in five case studies. Conclusions Evaluation on several experimental data using 3DScapeCS and its special features, including multilevel graph layout, time-course data animation, and parallel visualization has proven its usefulness in visualizing complex data and help to make insightful conclusions. PMID:24225050
Emerging New Strategies for Successful Metabolite Identification in Metabolomics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bingol, Ahmet K.; Bruschweiler-Li, Lei; Li, Dawei
2016-02-26
NMR is a very powerful tool for the identification of known and unknown (or unnamed) metabolites in complex mixtures as encountered in metabolomics. Known compounds can be reliably identified using 2D NMR methods, such as 13C-1H HSQC, for which powerful web servers with databases are available for semi-automated analysis. For the identification of unknown compounds, new combinations of NMR with MS have been developed recently that make synergistic use of the mutual strengths of the two techniques. The use of chemical additives in the NMR tube, such as reactive agents, paramagnetic ions, or charged silica nanoparticles, permits the identification of metabolites with specific physical chemical properties. In the following sections, we give an overview of some of the recent advances in metabolite identification and discuss remaining challenges.
David, Frank; Tienpont, Bart; Devos, Christophe; Lerch, Oliver; Sandra, Pat
2013-10-25
Laboratories focusing on residue analysis in food are continuously seeking to increase sample throughput by minimizing sample preparation. Generic sample extraction methods such as QuEChERS lack selectivity, and consequently extracts are not free from non-volatile material that contaminates the analytical system. Co-extracted matrix constituents interfere with target analytes, even if highly sensitive and selective GC-MS/MS is used. A number of GC approaches are described that can be used to increase laboratory productivity. These techniques include automated inlet liner exchange and column backflushing for preservation of the performance of the analytical system, and heart-cutting two-dimensional GC for increasing sensitivity and selectivity. The application of these tools is illustrated by the analysis of pesticides in vegetables and fruits, PCBs in milk powder and coplanar PCBs in fish. It is demonstrated that a considerable increase in productivity can be achieved by decreasing instrument down-time, while analytical performance is equal to or better than that of conventional trace contaminant analysis. Copyright © 2013 Elsevier B.V. All rights reserved.
da Silva, Letícia Flores; Guerra, Celito Crivellaro; Klein, Diandra; Bergold, Ana Maria
2017-07-15
Bioactive phenols (BPs) are often targets in red wine analysis. However, other compounds interfere in the liquid chromatography methods used for this analysis. Here, purification procedures were tested to eliminate anthocyanin interference during the determination of 19 red-wine BPs. Liquid chromatography, coupled to a diode array detector (HPLC-DAD) and a mass spectrometer (UPLC-MS), was used to compare the direct injection of the samples with solid-phase extractions: reversed-phase (C18) and strong cation-exchange (SCX). The HPLC-DAD method revealed that, out of 13 BPs, only six are selectively analyzed with or without C18 treatment, whereas SCX enabled the detection of all BPs. The recovery with SCX was above 86.6% for eight BPs. Moreover, UPLC-MS demonstrated the potential of SCX sample preparation for the determination of 19 BPs. The developed procedure may be extended to the analysis of other red wine molecules or to other analytical methods where anthocyanins may interfere. Copyright © 2017 Elsevier Ltd. All rights reserved.
Probing Conformational Dynamics of Tau Protein by Hydrogen/Deuterium Exchange Mass Spectrometry
NASA Astrophysics Data System (ADS)
Huang, Richard Y.-C.; Iacob, Roxana E.; Sankaranarayanan, Sethu; Yang, Ling; Ahlijanian, Michael; Tao, Li; Tymiak, Adrienne A.; Chen, Guodong
2018-01-01
Fibrillization of the microtubule-associated protein tau has been recognized as one of the signature pathologies of the nervous system in Alzheimer's disease, progressive supranuclear palsy, and other tauopathies. The conformational transition of tau in the fibrillization process, from tau monomer to soluble aggregates to fibrils in particular, remains unclear. Here we report on the use of hydrogen/deuterium exchange mass spectrometry (HDX-MS) in combination with other biochemical approaches, including Thioflavin S fluorescence measurements, enzyme-linked immunosorbent assay (ELISA), and Western blotting, to understand the heparin-induced fibrillization of tau. HDX-MS studies, including anti-tau antibody epitope mapping experiments, provided molecular-level details of full-length tau's conformational dynamics and its regional solvent accessibility upon formation of soluble aggregates. The results demonstrate that the R3 region in full-length tau's microtubule-binding repeat region (MTBR) is stabilized in the aggregation process, leaving both N- and C-terminal regions solvent exposed in the soluble aggregates and fibrils. The findings also illustrate the practical utility of orthogonal analytical methodologies for the characterization of protein higher order structure.
Mitigating Security Issues: The University of Memphis Case.
ERIC Educational Resources Information Center
Jackson, Robert; Frolick, Mark N.
2003-01-01
Studied a server security breach at the University of Memphis, Tennessee, to highlight personnel roles, detection of the compromised server, policy enforcement, forensics, and the proactive search for other servers threatened in the same way. (SLD)
NASA Technical Reports Server (NTRS)
Lyle, Stacey D.
2009-01-01
A software package has been developed that uses GPS signal structures to authenticate mobile devices into a network wirelessly and in real time, granting access to critical geospatial information only when the rover(s) is/are within a set of boundaries or a specific area. The advantage is that the system admits to the server only devices located within the designated geospatial boundaries or areas. The Geospatial Authentication software has two parts: Server and Client. The server software is a virtual private network (VPN) developed on the Linux operating system using the Perl programming language. The server can be a stand-alone VPN server or can be combined with other applications and services. The client software is GUI-based Windows CE software (Mobile Graphical Software) that allows users to authenticate into a network. The purpose of the client software is to pass the needed satellite information to the server for authentication.
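A minimal sketch of the geofence check implied by this description follows: the client reports a GPS fix and the server grants access only if the fix lies inside an authorized boundary polygon. The polygon, coordinates, and function names are assumptions for illustration, not part of the original Perl/VPN implementation.

```python
# Minimal server-side geofence test: admit a client only if its reported GPS fix
# falls inside an authorized boundary polygon. Coordinates are assumed examples.

def inside_boundary(lat, lon, polygon):
    """Ray-casting point-in-polygon test; polygon is a list of (lat, lon) vertices."""
    inside = False
    n = len(polygon)
    for i in range(n):
        lat1, lon1 = polygon[i]
        lat2, lon2 = polygon[(i + 1) % n]
        # Does this edge straddle the line of constant longitude through the fix?
        if (lon1 > lon) != (lon2 > lon):
            lat_cross = lat1 + (lon - lon1) * (lat2 - lat1) / (lon2 - lon1)
            if lat < lat_cross:
                inside = not inside
    return inside

# Example: a rectangular authorized test area and two reported fixes.
test_area = [(27.70, -97.35), (27.70, -97.30), (27.75, -97.30), (27.75, -97.35)]
print(inside_boundary(27.72, -97.32, test_area))   # True  -> grant access
print(inside_boundary(27.80, -97.32, test_area))   # False -> deny access
```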
Mfold web server for nucleic acid folding and hybridization prediction.
Zuker, Michael
2003-07-01
The abbreviated name, 'mfold web server', describes a number of closely related software applications available on the World Wide Web (WWW) for the prediction of the secondary structure of single-stranded nucleic acids. The objective of this web server is to provide easy access to RNA and DNA folding and hybridization software to the scientific community at large. By making use of universally available web GUIs (Graphical User Interfaces), the server circumvents the problem of portability of this software. Detailed output, in the form of structure plots with or without reliability information, single-strand frequency plots and 'energy dot plots', is available for the folding of single sequences. A variety of 'bulk' servers give less information, but in a shorter time and for up to hundreds of sequences at once. The portal for the mfold web server is http://www.bioinfo.rpi.edu/applications/mfold. This URL will be referred to as 'MFOLDROOT'.
The Development of a Remote Patient Monitoring System using Java-enabled Mobile Phones.
Kogure, Y; Matsuoka, H; Kinouchi, Y; Akutagawa, M
2005-01-01
A remote patient monitoring system is described. This system monitors information on multiple patients in the ICU/CCU via 3G mobile phones. Conventionally, various patient information, such as vital signs, is collected and stored on patient information systems. In the proposed system, the patient information is collected by a remote information server and transported to mobile phones. The server works as a gateway between the hospital intranet and public networks. The information provided by the server consists of graphs and text data. Doctors can browse patient information on their mobile phones via the server. Custom Java application software is used to browse these data. In this study, the information server and Java application are developed, and communication between the server and a mobile phone in a model environment is confirmed. Applying this system to practical patient information system products is future work.
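The sketch below illustrates, under stated assumptions, the gateway idea described above: a small HTTP service inside the hospital network returns the latest vital signs for a patient as plain text that a mobile client could render. The in-memory data store, URL scheme, and port are placeholders; the original system used a custom Java client and produced graphs as well as text.

```python
# Minimal gateway sketch: serve the latest vitals for a patient as plain text.
# The data store, patient IDs, and port are assumed placeholders.
from http.server import BaseHTTPRequestHandler, HTTPServer

# Placeholder in-memory store standing in for the hospital patient information system.
VITALS = {
    "icu-001": {"heart_rate": 82, "spo2": 97, "resp_rate": 16},
    "icu-002": {"heart_rate": 110, "spo2": 91, "resp_rate": 24},
}

class GatewayHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        patient_id = self.path.strip("/")          # e.g. GET /icu-001
        record = VITALS.get(patient_id)
        if record is None:
            self.send_error(404, "unknown patient")
            return
        body = "\n".join(f"{k}: {v}" for k, v in record.items()).encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Blocks and serves requests until interrupted.
    HTTPServer(("0.0.0.0", 8080), GatewayHandler).serve_forever()
```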
Data Processing Center of Radioastron Project: 3 years of operation.
NASA Astrophysics Data System (ADS)
Shatskaya, Marina
The ASC Data Processing Center (DPC) of the Radioastron Project is a fail-safe, centralized complex of interconnected software and hardware components along with organizational procedures. The tasks facing the scientific data processing center are the organization of service information exchange, the collection of scientific data, the storage of all scientific data, and science-oriented data processing. The DPC takes part in information exchange with two tracking stations in Pushchino (Russia) and Green Bank (USA), about 30 ground telescopes, the ballistic center, tracking headquarters and the session scheduling center. Enormous flows of information go to the Astro Space Center. To handle these enormous data volumes we developed specialized network infrastructure, Internet channels and storage. The computer complex has been designed at the Astro Space Center (ASC) of the Lebedev Physical Institute and includes: - 800 TB on-line storage, - 2000 TB hard drive archive, - backup system on magnetic tapes (2000 TB), - 24 TB redundant storage at Pushchino Radio Astronomy Observatory, - Web and FTP servers, - DPC management and data transmission networks. The structure and functions of the ASC Data Processing Center are fully adequate to the data processing requirements of the Radioastron Mission, as has been successfully confirmed during the Fringe Search, the Early Science Program and the first year of the Key Science Program.
Fréchette-Viens, Laurie; Hadioui, Madjid; Wilkinson, Kevin J
2017-01-15
The applicability of single particle ICP-MS (SP-ICP-MS) for the analysis of nanoparticle size distributions and the determination of particle numbers was evaluated using the rare earth oxide La2O3 as a model particle. The composition of the storage containers, as well as the ICP-MS sample introduction system, were found to significantly impact SP-ICP-MS analysis. While La2O3 nanoparticles (La2O3 NP) did not appear to interact strongly with sample containers, adsorptive losses of La3+ (over 24 h) were substantial (>72%) for fluorinated ethylene propylene bottles as opposed to polypropylene (<10%). Furthermore, each part of the sample introduction system (nebulizers made of perfluoroalkoxy alkane (PFA) or glass, PFA capillary tubing, and polyvinyl chloride (PVC) peristaltic pump tubing) contributed to La3+ adsorptive losses. On the other hand, the presence of natural organic matter in the nanoparticle suspensions led to a decreased adsorptive loss in both the sample containers and the introduction system, suggesting that SP-ICP-MS may nonetheless be appropriate for NP analysis in environmental matrices. Coupling of an ion-exchange resin to the SP-ICP-MS led to more accurate determinations of the La2O3 NP size distributions. Copyright © 2016 Elsevier B.V. All rights reserved.
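For context, the sketch below shows the standard conversion from a single-particle pulse intensity to an equivalent spherical La2O3 diameter; the sensitivity value (already folded together with transport efficiency) and the example pulse are assumptions, not calibration data from this work.

```python
# Convert one SP-ICP-MS pulse to an equivalent spherical La2O3 particle diameter.
# Sensitivity and pulse values are assumed illustrations, not measured calibrations.
import math

RHO_LA2O3 = 6.51e-21        # g/nm^3  (6.51 g/cm^3)
LA_MASS_FRACTION = 0.853    # mass fraction of La in La2O3

def particle_diameter_nm(pulse_counts, counts_per_fg_la):
    """Equivalent spherical diameter (nm) of one La2O3 particle from one pulse.

    counts_per_fg_la -- calibrated response per fg of La reaching the plasma
                        (derived from dissolved standards and the measured
                        transport efficiency; treated here as one assumed number).
    """
    la_mass_fg = pulse_counts / counts_per_fg_la
    particle_mass_g = (la_mass_fg / LA_MASS_FRACTION) * 1e-15   # fg -> g
    volume_nm3 = particle_mass_g / RHO_LA2O3
    return (6.0 * volume_nm3 / math.pi) ** (1.0 / 3.0)

# Example: a 250-count pulse with an assumed sensitivity of 60 counts per fg La.
print(f"{particle_diameter_nm(250, 60.0):.0f} nm")
```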
Jimmerson, Leah C.; Ray, Michelle L.; Bushman, Lane R.; Anderson, Peter L.; Klein, Brandon; Rower, Joseph E.; Zheng, Jia-Hua; Kiser, Jennifer J.
2014-01-01
Ribavirin (RBV) is a nucleoside analog used to treat a variety of DNA and RNA viruses. RBV undergoes intracellular phosphorylation to a mono- (MP), di- (DP), and triphosphate (TP). The phosphorylated forms have been associated with the mechanisms of antiviral effect observed in vitro, but the intracellular pharmacology of the drug has not been well characterized in vivo. A highly sensitive LC-MS/MS method was developed and validated for the determination of intracellular RBV MP, DP, and TP in multiple cell matrix types. For this method, the individual MP, DP, and TP fractions were isolated from lysed intracellular matrix using strong anion-exchange solid phase extraction, dephosphorylated to parent RBV, desalted, concentrated, and quantified using LC-MS/MS. The method utilized a stable-isotope-labeled internal standard (RBV-13C5), which facilitated accuracy (% deviation within ±15%) and precision (coefficient of variation of ≤15%). The quantifiable linear range for the assay was 0.50 to 200 pmol/sample. The method was applied to the measurement of RBV MP, DP, and TP in human peripheral blood mononuclear cells (PBMC), red blood cells (RBC), and dried blood spot (DBS) samples obtained from patients taking RBV for the treatment of chronic Hepatitis C virus infection. PMID:25555148
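A small sketch of the acceptance arithmetic behind the validation criteria quoted above (accuracy as percent deviation from nominal, precision as coefficient of variation, both within 15%) follows; the QC replicate values and nominal concentration are invented for illustration.

```python
# Accuracy/precision acceptance check for a QC level; replicate values are made up.
import statistics

def qc_passes(measured, nominal, limit_pct=15.0):
    """Return (pass/fail, % deviation from nominal, % CV) for one QC level."""
    mean = statistics.mean(measured)
    deviation_pct = 100.0 * (mean - nominal) / nominal          # accuracy
    cv_pct = 100.0 * statistics.stdev(measured) / mean          # precision
    return abs(deviation_pct) <= limit_pct and cv_pct <= limit_pct, deviation_pct, cv_pct

ok, dev, cv = qc_passes([48.2, 51.0, 49.5, 52.3, 47.8], nominal=50.0)
print(f"pass={ok}, %dev={dev:.1f}, %CV={cv:.1f}")
```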
2016 ASMS Workshop Review: Next Generation LC/MS: Critical Insights and Future Perspectives
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gao, Hongying; Makarov, Alexander; Smith, Richard D.
The pilot workshop on 'Next Generation LC/MS: Critical Insights and Future Perspectives' was held on the evening of June 6, 2016 at the 64th ASMS Conference on Mass Spectrometry and Allied Topics in San Antonio, TX. The workshop, chaired by Hongying Gao (Pfizer), consisted of stimulating talks from distinguished speakers and open discussion among the audience and invited presenters. The objectives of this workshop were to better understand the advances and limitations of current technologies; to exchange perspectives on the next generation of LC/MS; and to discuss and debate the features of next-generation LC/MS, focusing on the following three questions: (1) What would the next generation LC/MS look like? (2) How would it change the way we do analysis? and (3) What fundamental issues need to be resolved? A real-world case in the biopharmaceutical industry was presented by Hongying Gao on the needs of industry for LC/MS innovation and technology advancements. The primary invited speakers were Alexander Makarov (Thermo Fisher Scientific) and Richard (Dick) Smith (Pacific Northwest National Laboratory). The open discussions started with Q&A and comments for Alexander Makarov and Dick Smith, followed by insights and perspectives from members of the audience and other invited presenters who shared their thoughts addressing the above questions.
Comparison of approaches for mobile document image analysis using server supported smartphones
NASA Astrophysics Data System (ADS)
Ozarslan, Suleyman; Eren, P. Erhan
2014-03-01
With the recent advances in mobile technologies, new capabilities are emerging, such as mobile document image analysis. However, mobile phones are still less powerful than servers, and they have some resource limitations. One approach to overcoming these limitations is performing the resource-intensive processes of the application on remote servers. In mobile document image analysis, the most resource-consuming process is Optical Character Recognition (OCR), which is used to extract text from images captured by mobile phones. In this study, our goal is to compare the in-phone and remote-server processing approaches for mobile document image analysis in order to explore their trade-offs. For the in-phone approach, all processes required for mobile document image analysis run on the mobile phone. On the other hand, in the remote-server approach, the core OCR process runs on the remote server and the other processes run on the mobile phone. Results of the experiments show that the remote-server approach is considerably faster than the in-phone approach in terms of OCR time, but adds extra delays such as network delay. Since compression and downscaling of images significantly reduce file sizes and extra delays, the remote-server approach overall outperforms the in-phone approach in terms of the selected speed and correct-recognition metrics, provided the gain in OCR time compensates for the extra delays. According to the results of the experiments, using the most preferable settings, the remote-server approach performs better than the in-phone approach in terms of speed while maintaining acceptable correct-recognition rates.
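The trade-off described above can be summarized as a simple timing comparison: the remote-server route wins only when its OCR speed-up outweighs the added compression, upload, and download delays. The sketch below illustrates this with assumed timings that are not measurements from the study.

```python
# Compare end-to-end time for in-phone vs. remote-server OCR. All timings are
# assumed illustrations, not experimental results.

def total_time_in_phone(ocr_s, other_s):
    return ocr_s + other_s

def total_time_remote(ocr_s, other_s, compress_s, upload_s, download_s):
    return ocr_s + other_s + compress_s + upload_s + download_s

in_phone = total_time_in_phone(ocr_s=14.0, other_s=2.0)
remote = total_time_remote(ocr_s=1.5, other_s=2.0, compress_s=0.8, upload_s=3.5, download_s=0.4)
print(f"in-phone: {in_phone:.1f} s, remote: {remote:.1f} s, remote wins: {remote < in_phone}")
```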
Research on Heat Exchange Process in Aircraft Air Conditioning System
NASA Astrophysics Data System (ADS)
Chichindaev, A. V.
2017-11-01
The use of a heat-exchanger-condenser in the air conditioning system of aircraft such as the Tu-204 (and the Boeing, Airbus, Superjet 100, MS-21, etc.), in which compressed air is cooled by cold air exiting the turbine at negative temperature, results in a number of operational problems. The main one is frosting of the heat exchange surface, which narrows the live cross-section of the channels, increases resistance and decreases airflow in the system. The purpose of this work is to analyse the known methods for fighting freeze-up of the heat-exchanger-condenser, describe the features of anti-icing protection and offer solutions to this problem. For the problem of optimizing the design of heat exchangers, this work uses a generalized criterion that describes the ratio of the thermal resistances of the cold and hot sections, which includes: the ratio of the initial flow states of the heat transfer agents; the finning coefficients of the heat exchange surfaces; and factors that describe the ratio of operating parameters and finning area. By controlling the ratio of the thermal resistances, the desired temperature of the heat exchange surface can be obtained, which prevents freezing. The work presents the results of a numerical study of the effect of different combinations of regime and geometrical factors on the reduction of the frozen area of the heat-exchanger-condenser surface, including the use of a variable ratio of thermal resistances.
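The criterion described above can be illustrated with a simple steady-state relation: neglecting the wall's own resistance, the heat-exchange surface temperature is a resistance-weighted average of the hot- and cold-stream temperatures, so raising the cold-side to hot-side resistance ratio raises the wall temperature. The sketch below uses assumed temperatures and resistance ratios, not the paper's numerical model.

```python
# Wall temperature between two streams as a function of the thermal-resistance ratio.
# Temperatures and resistance values are assumed examples.

def wall_temperature(t_hot, t_cold, r_hot, r_cold):
    """Surface temperature between two streams separated by resistances r_hot and r_cold."""
    return (t_hot * r_cold + t_cold * r_hot) / (r_hot + r_cold)

t_hot, t_cold = 90.0, -30.0            # degrees C: compressed air vs. turbine outlet air
for ratio in (0.2, 0.5, 1.0, 2.0):     # r_cold / r_hot
    t_wall = wall_temperature(t_hot, t_cold, r_hot=1.0, r_cold=ratio)
    status = "no frosting" if t_wall > 0 else "frosting risk"
    print(f"r_cold/r_hot = {ratio:0.1f}: wall at {t_wall:6.1f} C ({status})")
```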
MODEL FOR INSTANTANEOUS RESIDENTIAL WATER DEMANDS
Residential water use is visualized as a customer-server interaction often encountered in queueing theory. Individual customers are assumed to arrive according to a nonhomogeneous Poisson process, then engage water servers for random lengths of time. Busy servers are assumed t...
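As a rough illustration of the queueing formulation described above, the sketch below simulates customer arrivals from a nonhomogeneous Poisson process (via thinning) and assigns each a random service time; the diurnal rate function and service-time law are assumptions, not the paper's fitted model.

```python
# Simulate water-use "customers" arriving by a nonhomogeneous Poisson process
# (thinning method) with random service times. Rate and service law are assumed.
import math
import random

def arrival_rate(t_hours):
    """Assumed diurnal demand rate (arrivals per hour): morning and evening peaks."""
    return 4.0 + 3.0 * math.exp(-((t_hours - 7) ** 2) / 2) + 3.0 * math.exp(-((t_hours - 19) ** 2) / 2)

def simulate_day(seed=1):
    random.seed(seed)
    lam_max = 10.0                      # safe upper bound on arrival_rate over [0, 24]
    t, busy_intervals = 0.0, []
    while True:
        t += random.expovariate(lam_max)                      # candidate arrival at rate lam_max
        if t >= 24.0:
            break
        if random.random() <= arrival_rate(t) / lam_max:      # thinning: keep with prob lambda(t)/lam_max
            service = random.expovariate(1 / 0.05)            # mean 3-minute (0.05 h) water use
            busy_intervals.append((t, t + service))
    return busy_intervals

uses = simulate_day()
print(f"{len(uses)} water uses simulated; first starts at t = {uses[0][0]:.2f} h")
```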
Report #11-P-0597, September 9, 2011. Vulnerability testing of EPA’s directory service system authentication and authorization servers conducted in March 2011 identified authentication and authorization servers with numerous vulnerabilities.
Comparing Server Energy Use and Efficiency Using Small Sample Sizes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coles, Henry C.; Qin, Yong; Price, Phillip N.
This report documents a demonstration that compared the energy consumption and efficiency of a limited sample size of server-type IT equipment from different manufacturers by measuring power at the server power supply power cords. The results are specific to the equipment and methods used. However, it is hoped that those responsible for IT equipment selection can use the methods described to choose models that optimize energy use efficiency. The demonstration was conducted in a data center at Lawrence Berkeley National Laboratory in Berkeley, California. It was performed with five servers of similar mechanical and electronic specifications: three from Intel and one each from Dell and Supermicro. Server IT equipment is constructed using commodity components, server manufacturer-designed assemblies, and control systems. Server compute efficiency is constrained by the commodity component specifications and integration requirements. The design freedom, outside of the commodity component constraints, provides room for the manufacturer to offer a product with competitive efficiency that meets market needs at a compelling price. A goal of the demonstration was to compare and quantify the server efficiency for three different brands. The efficiency is defined as the average compute rate (computations per unit of time) divided by the average energy consumption rate. The research team used an industry-standard benchmark software package to provide a repeatable software load to obtain the compute rate and provide a variety of power consumption levels. Energy use when the servers were in an idle state (not providing computing work) was also measured. At high server compute loads, all brands, using the same key components (processors and memory), had similar results; therefore, from these results it could not be concluded that one brand is more efficient than the others. The test results show that the power consumption variability caused by the key components as a group is similar to that of all other components as a group. However, some differences were observed. The Supermicro server used 27 percent more power at idle compared to the other brands. The Intel server had a power supply control feature called cold redundancy, and the data suggest that cold redundancy can provide energy savings at low power levels. Test and evaluation methods that might be used by others having limited resources for IT equipment evaluation are explained in the report.
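The efficiency metric used in the report, average compute rate divided by average power draw, can be computed as in the sketch below; the benchmark scores and power figures are invented placeholders rather than the demonstration's measurements.

```python
# Compute server efficiency as operations per joule from a benchmark run.
# Benchmark scores and power numbers are invented placeholders.

def efficiency(operations_completed, run_seconds, avg_power_watts):
    """Operations per joule = (ops/s) / W."""
    compute_rate = operations_completed / run_seconds
    return compute_rate / avg_power_watts

servers = {
    "server_A": efficiency(operations_completed=9.0e9, run_seconds=600, avg_power_watts=310),
    "server_B": efficiency(operations_completed=8.6e9, run_seconds=600, avg_power_watts=275),
}
for name, ops_per_joule in sorted(servers.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {ops_per_joule:,.0f} operations per joule")
```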
2013-01-01
Background Subunit vaccines based on recombinant proteins have been effective in preventing infectious diseases and are expected to meet the demands of future vaccine development. Computational approaches, especially the reverse vaccinology (RV) method, have enormous potential for identification of protein vaccine candidates (PVCs) from a proteome. The existing protective antigen prediction software and web servers have low prediction accuracy, leading to limited application for vaccine development. Besides machine learning techniques, those software packages and web servers have considered only a protein's adhesin-likeness as the criterion for identification of PVCs. Several non-adhesin functional classes of proteins involved in host-pathogen interactions and pathogenesis are known to provide protection against bacterial infections. Therefore, knowledge of bacterial pathogenesis has the potential to identify PVCs. Results A web server, Jenner-Predict, has been developed for prediction of PVCs from the proteomes of bacterial pathogens. The web server targets host-pathogen interactions and pathogenesis by considering known functional domains from protein classes such as adhesin, virulence, invasin, porin, flagellin, colonization, toxin, choline-binding, penicillin-binding, transferrin-binding, fibronectin-binding and solute-binding. It predicts non-cytosolic proteins containing the above domains as PVCs. It also provides the vaccine potential of PVCs in terms of their possible immunogenicity by comparison with experimentally known IEDB epitopes, absence of autoimmunity and conservation in different strains. Predicted PVCs are prioritized so that only a few prospective PVCs need be validated experimentally. The performance of the web server was evaluated against known protective antigens from diverse classes of bacteria reported in the Protegen database and datasets used for VaxiJen server development. The web server efficiently predicted known vaccine candidates reported from the Streptococcus pneumoniae and Escherichia coli proteomes. The Jenner-Predict server outperformed the NERVE, Vaxign and VaxiJen methods. It has a sensitivity of 0.774 and 0.711 for the Protegen and VaxiJen datasets, respectively, while a specificity of 0.940 was obtained for the latter dataset. Conclusions The better prediction accuracy of the Jenner-Predict web server signifies that domains involved in host-pathogen interactions and pathogenesis are better criteria for prediction of PVCs. The web server has successfully predicted the maximum number of known PVCs belonging to different functional classes. Jenner-Predict is freely accessible at http://117.211.115.67/vaccine/home.html PMID:23815072
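For reference, the sensitivity and specificity figures quoted above are confusion-matrix ratios; the sketch below computes them from invented counts chosen only to illustrate the arithmetic, not the actual evaluation data.

```python
# Sensitivity and specificity from a confusion matrix; counts are invented examples.

def sensitivity_specificity(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)   # fraction of known protective antigens predicted as PVCs
    specificity = tn / (tn + fp)   # fraction of non-antigens correctly rejected
    return sensitivity, specificity

sens, spec = sensitivity_specificity(tp=32, fn=13, tn=141, fp=9)
print(f"sensitivity = {sens:.3f}, specificity = {spec:.3f}")
```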
Client - server programs analysis in the EPOCA environment
NASA Astrophysics Data System (ADS)
Donatelli, Susanna; Mazzocca, Nicola; Russo, Stefano
1996-09-01
Client-server processing is a popular paradigm for distributed computing. In the development of client-server programs, the designer first has to ensure that the implementation behaves correctly, in particular that it is deadlock free. Second, he has to guarantee that the program meets predefined performance requirements. This paper addresses these issues in the analysis of client-server programs in EPOCA. EPOCA is a computer-aided software engineering (CASE) support system that allows the automated construction and analysis of generalized stochastic Petri net (GSPN) models of concurrent applications. The paper describes, on the basis of a realistic case study, how client-server systems are modelled in EPOCA, and the kind of qualitative and quantitative analysis supported by its tools.
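A toy sketch of the kind of qualitative check mentioned above (deadlock freedom) is given below: it exhaustively explores the reachable markings of a small Petri net and reports any marking that enables no transition. The two-client/one-server net is a made-up example, not an EPOCA or GSPN model; GSPNs add timing and probabilities on top of this reachability analysis.

```python
# Toy reachability-based deadlock check for a small place/transition net.
# The net below (two clients, one server) is an assumed example, not an EPOCA model.
from collections import deque

# Each transition maps a name to (tokens consumed per place, tokens produced per place).
TRANSITIONS = {
    "send_request": ({"client_idle": 1}, {"request_pending": 1}),
    "serve":        ({"request_pending": 1, "server_free": 1}, {"response_ready": 1}),
    "receive":      ({"response_ready": 1}, {"client_idle": 1, "server_free": 1}),
}

def enabled(marking, consumed):
    return all(marking.get(p, 0) >= n for p, n in consumed.items())

def fire(marking, consumed, produced):
    new = dict(marking)
    for p, n in consumed.items():
        new[p] = new.get(p, 0) - n
    for p, n in produced.items():
        new[p] = new.get(p, 0) + n
    return new

def find_deadlocks(initial):
    """Breadth-first search of reachable markings; return those enabling no transition."""
    seen, frontier, deadlocks = set(), deque([initial]), []
    while frontier:
        marking = frontier.popleft()
        key = tuple(sorted((p, n) for p, n in marking.items() if n))
        if key in seen:
            continue
        seen.add(key)
        fireable = [t for t in TRANSITIONS.values() if enabled(marking, t[0])]
        if not fireable:
            deadlocks.append(dict(key))
        for consumed, produced in fireable:
            frontier.append(fire(marking, consumed, produced))
    return deadlocks

# This particular net is deadlock free, so an empty list is printed.
print(find_deadlocks({"client_idle": 2, "server_free": 1}))
```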
Stockburger, D W
1999-05-01
Active server pages permit a software developer to customize the Web experience for users by inserting server-side script and database access into Web pages. This paper describes applications of these techniques and provides a primer on the use of these methods. Applications include a system that generates and grades individualized homework assignments and tests for statistics students. The student accesses the system as a Web page, prints out the assignment, does the assignment, and enters the answers on the Web page. The server, running on NT Server 4.0, grades the assignment, updates the grade book (on a database), and returns the answer key to the student.
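A minimal sketch of the grade-and-record flow described above follows, written in Python with SQLite rather than the classic ASP/VBScript and database setup the paper used; the table layout, grading rule, and tolerance are assumptions.

```python
# Grade a submitted assignment, record the score in a grade-book table, and return the key.
# Table and column names, tolerance, and example answers are assumed illustrations.
import sqlite3

def grade_submission(db, student_id, assignment_id, submitted, answer_key, tolerance=0.01):
    correct = sum(1 for s, k in zip(submitted, answer_key) if abs(s - k) <= tolerance)
    score = 100.0 * correct / len(answer_key)
    db.execute(
        "INSERT INTO gradebook (student_id, assignment_id, score) VALUES (?, ?, ?)",
        (student_id, assignment_id, score),
    )
    db.commit()
    return score, answer_key          # the server returns the key after grading

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE gradebook (student_id TEXT, assignment_id TEXT, score REAL)")
score, key = grade_submission(db, "s123", "hw04",
                              submitted=[2.5, 0.31, 7.0], answer_key=[2.5, 0.30, 7.1])
print(f"score = {score:.0f}%")
```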
2015-10-01
with fMRI and CEST acquisitions. Analysis hurdles were noted in the qMT, which we discuss here. Recruitment continues in the MS cohort (all healthy...Saturation Transfer (CEST) • Magnetization Transfer (MT) • Brain • Cortical Gray Matter (cGM) • Multiple Sclerosis (MS) • Functional MRI (fMRI) • Pool Size...MPRAGE Anatomical – 2:12 • fMRI Resting State – 8:34 • fMRI N-Back task – 8:30 • fMRI Trailmaking task – 4:14. The current scan time for all scans is
The NAIMS cooperative pilot project: Design, implementation and future directions.
Oh, Jiwon; Bakshi, Rohit; Calabresi, Peter A; Crainiceanu, Ciprian; Henry, Roland G; Nair, Govind; Papinutto, Nico; Constable, R Todd; Reich, Daniel S; Pelletier, Daniel; Rooney, William; Schwartz, Daniel; Tagge, Ian; Shinohara, Russell T; Simon, Jack H; Sicotte, Nancy L
2017-10-01
The North American Imaging in Multiple Sclerosis (NAIMS) Cooperative represents a network of 27 academic centers focused on accelerating the pace of magnetic resonance imaging (MRI) research in multiple sclerosis (MS) through idea exchange and collaboration. Recently, NAIMS completed its first project evaluating the feasibility of implementation and reproducibility of quantitative MRI measures derived from scanning a single MS patient using a high-resolution 3T protocol at seven sites. The results showed the feasibility of utilizing advanced quantitative MRI measures in multicenter studies and demonstrated the importance of careful standardization of scanning protocols, central image processing, and strategies to account for inter-site variability.
2002-06-01
Student memo for personnel MCLLS... Migrate data to SQL Server... The Web Server is on the same server as the SWORD database in the current version... SQL... still be supported by Access. SQL Server would be a more viable tool for a fully developed application based on the number of potential users and
Understanding Customer Dissatisfaction with Underutilized Distributed File Servers
NASA Technical Reports Server (NTRS)
Riedel, Erik; Gibson, Garth
1996-01-01
An important trend in the design of storage subsystems is a move toward direct network attachment. Network-attached storage offers the opportunity to off-load distributed file system functionality from dedicated file server machines and execute many requests directly at the storage devices. For this strategy to lead to better performance, as perceived by users, the response time of distributed operations must improve. In this paper we analyze measurements of an Andrew File System (AFS) server that we recently upgraded in an effort to improve client performance in our laboratory. While the original server's overall utilization was only about 3%, we show how burst loads were sufficiently intense to lead to periods of poor response time significant enough to trigger customer dissatisfaction. In particular, we show how, after adjusting for network load and traffic to non-project servers, 50% of the variation in client response time was explained by variation in server central processing unit (CPU) use. That is, clients saw long response times in large part because the server was often over-utilized when it was used at all. Using these measures, we see that off-loading file server work in a network-attached storage architecture has the potential to benefit user response time. Computational power in such a system scales directly with storage capacity, so the slowdown during burst periods should be reduced.
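The "variance explained" statistic quoted above is the R-squared of a simple regression of client response time on server CPU utilization; the sketch below computes it from invented sample points rather than the paper's AFS measurements.

```python
# R^2 of a one-variable least-squares fit of response time on CPU utilization.
# Sample points are invented placeholders, not the paper's measurements.

def r_squared(x, y):
    """Coefficient of determination for a simple linear regression of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return (sxy * sxy) / (sxx * syy)

cpu_util = [0.02, 0.05, 0.10, 0.22, 0.35, 0.50, 0.70, 0.85]     # fraction busy
resp_ms  = [12,   14,   15,   22,   30,   45,   80,   140]      # client-observed latency
print(f"R^2 = {r_squared(cpu_util, resp_ms):.2f}")
```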