DOE Office of Scientific and Technical Information (OSTI.GOV)
Bacon, Charles; Bell, Greg; Canon, Shane
The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the U.S. Department of Energy (DOE) Office of Science (SC), the single largest supporter of basic research in the physical sciences in the United States. In support of SC programs, ESnet regularly updates and refreshes its understanding of the networking requirements of the instruments, facilities, scientists, and science programs that it serves. This focus has helped ESnet to be a highly successful enabler of scientific discovery for over 25 years. In October 2012, ESnet and the Office of Advanced Scientific Computing Research (ASCR) of the DOE SC organized a review to characterize the networking requirements of the programs funded by the ASCR program office. The requirements identified at the review are summarized in the Findings section, and are described in more detail in the body of the report.
Fusion Energy Sciences Network Requirements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dart, Eli; Tierney, Brian
2012-09-26
The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the U.S. Department of Energy Office of Science, the single largest supporter of basic research in the physical sciences in the United States. In support of the Office of Science programs, ESnet regularly updates and refreshes its understanding of the networking requirements of the instruments, facilities, scientists, and science programs that it serves. This focus has helped ESnet to be a highly successful enabler of scientific discovery for over 25 years. In December 2011, ESnet and the Office of Fusion Energy Sciences (FES), of the DOE Office of Science (SC), organized a workshop to characterize the networking requirements of the programs funded by FES. The requirements identified at the workshop are summarized in the Findings section, and are described in more detail in the body of the report.
Biological and Environmental Research Network Requirements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balaji, V.; Boden, Tom; Cowley, Dave
2013-09-01
The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the U.S. Department of Energy (DOE) Office of Science (SC), the single largest supporter of basic research in the physical sciences in the United States. In support of SC programs, ESnet regularly updates and refreshes its understanding of the networking requirements of the instruments, facilities, scientists, and science programs that it serves. This focus has helped ESnet be a highly successful enabler of scientific discovery for over 25 years. In November 2012, ESnet and the Office of Biological and Environmental Research (BER) of the DOE SC organized a review to characterize the networking requirements of the programs funded by the BER program office. Several key findings resulted from the review. Among them: 1) The scale of data sets available to science collaborations continues to increase exponentially. This has broad impact, both on the network and on the computational and storage systems connected to the network. 2) Many science collaborations require assistance to cope with the systems and network engineering challenges inherent in managing the rapid growth in data scale. 3) Several science domains operate distributed facilities that rely on high-performance networking for success. Key examples illustrated in this report include the Earth System Grid Federation (ESGF) and the Systems Biology Knowledgebase (KBase). This report expands on these points, and addresses others as well. The report contains a findings section as well as the text of the case studies discussed at the review.
77 FR 62231 - DOE/Advanced Scientific Computing Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-12
.... Facilities update. ESnet-5. Early Career technical talks. Co-design. Innovative and Novel Computational Impact on Theory and Experiment (INCITE). Public Comment (10-minute rule). Public Participation: The...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tierney, Brian; Dart, Eli
The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the U.S. Department of Energy Office of Science, the single largest supporter of basic research in the physical sciences in the United States of America. In support of the Office of Science programs, ESnet regularly updates and refreshes its understanding of the networking requirements of the instruments, facilities, scientists, and science programs that it serves. This focus has helped ESnet to be a highly successful enabler of scientific discovery for over 20 years. In March 2008, ESnet and the Fusion Energy Sciences (FES) Program Office of the DOE Office of Science organized a workshop to characterize the networking requirements of the science programs funded by the FES Program Office. Most sites that conduct data-intensive activities (the Tokamaks at GA and MIT, the supercomputer centers at NERSC and ORNL) show a need for on the order of 10 Gbps of network bandwidth for FES-related work within 5 years. PPPL reported a need for 8 times that (80 Gbps) in that time frame. Estimates for the 5-10 year time period are up to 160 Mbps for large simulations. Bandwidth requirements for ITER range from 10 to 80 Gbps. In terms of science process and collaboration structure, it is clear that the proposed Fusion Simulation Project (FSP) has the potential to significantly impact the data movement patterns and therefore the network requirements for U.S. fusion science. As the FSP is defined over the next two years, these changes will become clearer. Also, there is a clear and present unmet need for better network connectivity between U.S. FES sites and two Asian fusion experiments: the EAST Tokamak in China and the KSTAR Tokamak in South Korea. In addition to achieving its goal of collecting and characterizing the network requirements of the science endeavors funded by the FES Program Office, the workshop emphasized that there is a need for research into better ways of conducting remote collaboration with the control room of a Tokamak running an experiment. This is especially important since the current plans for ITER assume that this problem will be solved.
High Energy Physics and Nuclear Physics Network Requirements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dart, Eli; Bauerdick, Lothar; Bell, Greg
The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the U.S. Department of Energy (DOE) Office of Science (SC), the single largest supporter of basic research in the physical sciences in the United States. In support of SC programs, ESnet regularly updates and refreshes its understanding of the networking requirements of the instruments, facilities, scientists, and science programs that it serves. This focus has helped ESnet to be a highly successful enabler of scientific discovery for over 25 years. In August 2013, ESnet and the DOE SC Offices of High Energy Physics (HEP) and Nuclear Physics (NP) organized a review to characterize the networking requirements of the programs funded by the HEP and NP program offices. Several key findings resulted from the review. Among them: 1. The Large Hadron Collider's ATLAS (A Toroidal LHC Apparatus) and CMS (Compact Muon Solenoid) experiments are adopting remote input/output (I/O) as a core component of their data analysis infrastructure. This will significantly increase their demands on the network from both a reliability perspective and a performance perspective. 2. The Large Hadron Collider (LHC) experiments (particularly ATLAS and CMS) are working to integrate network awareness into the workflow systems that manage the large number of daily analysis jobs (1 million analysis jobs per day for ATLAS), which are an integral part of the experiments. Collaboration with networking organizations such as ESnet, and the consumption of performance data (e.g., from perfSONAR [PERformance Service Oriented Network monitoring Architecture]) are critical to the success of these efforts. 3. The international aspects of HEP and NP collaborations continue to expand. This includes the LHC experiments, the Relativistic Heavy Ion Collider (RHIC) experiments, the Belle II Collaboration, the Large Synoptic Survey Telescope (LSST), and others. The international nature of these collaborations makes them heavily reliant on transoceanic connectivity, which is subject to longer term service disruptions than terrestrial connectivity. The network engineering aspects of undersea connectivity will continue to be a significant part of the planning, deployment, and operation of the data analysis infrastructure for HEP and NP experiments for the foreseeable future. Given their critical dependency on networking services, the experiments have expressed the need for tight integration (both technically and operationally) of the domestic and the transoceanic parts of the network infrastructure that supports the experiments. 4. The datasets associated with simulations continue to increase in size, and the need to move these datasets between analysis centers is placing ever-increasing demands on networks and on data management systems at the supercomputing centers. In addition, there is a need to harmonize cybersecurity practice with the data transfer performance requirements of the science. This report expands on these points, and addresses others as well. The report contains a findings section in addition to the text of the case studies discussed during the review.
Nuclear Physics Science Network Requirements Workshop, May 2008 - Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tierney, Brian L. (Ed.); Dart, Eli (Ed.); Carlson, Rich
2008-11-10
The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the US Department of Energy Office of Science, the single largest supporter of basic research in the physical sciences in the United States of America. In support of the Office of Science programs, ESnet regularly updates and refreshes its understanding of the networking requirements of the instruments, facilities, scientists, and science programs that it serves. This focus has helped ESnet to be a highly successful enabler of scientific discovery for over 20 years. In May 2008, ESnet and the Nuclear Physics (NP) Program Office of the DOE Office of Science organized a workshop to characterize the networking requirements of the science programs funded by the NP Program Office. Most of the key DOE sites for NP related work will require significant increases in network bandwidth in the 5 year time frame. This includes roughly 40 Gbps for BNL, and 20 Gbps for NERSC. Total transatlantic requirements are on the order of 40 Gbps, and transpacific requirements are on the order of 30 Gbps. Other key sites are Vanderbilt University and MIT, which will need on the order of 20 Gbps bandwidth to support data transfers for the CMS Heavy Ion program. In addition to bandwidth requirements, the workshop emphasized several points in regard to science process and collaboration. One key point is the heavy reliance on Grid tools and infrastructure (both PKI and tools such as GridFTP) by the NP community. The reliance on Grid software is expected to increase in the future. Therefore, continued development and support of Grid software is very important to the NP science community. Another key finding is that scientific productivity is greatly enhanced by easy researcher-local access to instrument data. This is driving the creation of distributed repositories for instrument data at collaborating institutions, along with a corresponding increase in demand for network-based data transfers and the tools to manage those transfers effectively. Network reliability is also becoming more important as there is often a narrow window between data collection and data archiving when transfer and analysis can be done. The instruments do not stop producing data, so extended network outages can result in data loss due to analysis pipeline stalls. Finally, as the scope of collaboration continues to increase, collaboration tools such as audio and video conferencing are becoming ever more critical to the productivity of scientific collaborations.
ESnet: Large-Scale Science and Data Management (LBNL Summer Lecture Series)
Johnston, Bill
2017-12-09
Summer Lecture Series 2004: Bill Johnston of Berkeley Lab's Computing Sciences is a distinguished networking and computing researcher. He managed the Energy Sciences Network (ESnet), a leading-edge, high-bandwidth network funded by DOE's Office of Science. Used for everything from videoconferencing to climate modeling, and flexible enough to accommodate a wide variety of data-intensive applications and services, ESnet's traffic volume is doubling every year and currently surpasses 200 terabytes per month.
Final report and recommendations of the ESnet Authentication Pilot Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, G.R.; Moore, J.P.; Athey, C.L.
1995-01-01
To conduct their work, U.S. Department of Energy (DOE) researchers require access to a wide range of computing systems and information resources outside of their respective laboratories. Electronically communicating with peers using the global Internet has become a necessity to effective collaboration with university, industrial, and other government partners. DOE's Energy Sciences Network (ESnet) needs to be engineered to facilitate this "collaboratory" while ensuring the protection of government computing resources from unauthorized use. Sensitive information and intellectual properties must be protected from unauthorized disclosure, modification, or destruction. In August 1993, DOE funded four ESnet sites (Argonne National Laboratory, Lawrence Livermore National Laboratory, the National Energy Research Supercomputer Center, and Pacific Northwest Laboratory) to begin implementing and evaluating authenticated ESnet services using the advanced Kerberos Version 5. The purpose of this project was to identify, understand, and resolve the technical, procedural, cultural, and policy issues surrounding peer-to-peer authentication in an inter-organization internet. The investigators have concluded that, with certain conditions, Kerberos Version 5 is a suitable technology to enable ESnet users to freely share resources and information without compromising the integrity of their systems and data. The pilot project has demonstrated that Kerberos Version 5 is capable of supporting trusted third-party authentication across an inter-organization internet and that Kerberos Version 5 would be practical to implement across the ESnet community within the U.S. The investigators made several modifications to the Kerberos Version 5 system that are necessary for operation in the current Internet environment and have documented other technical shortcomings that must be addressed before large-scale deployment is attempted.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Veeraraghavan, Malathi
This report describes our accomplishments and activities for the project titled Terabit-Scale Hybrid Networking. The key accomplishment is that we developed, tested, and deployed an Alpha Flow Characterization System (AFCS) in ESnet. It has been running in production mode since Sept. 2015. Also, a new QoS class was added to ESnet5 to support alpha flows.
ESnet authentication services and trust federations
NASA Astrophysics Data System (ADS)
Muruganantham, Dhivakaran; Helm, Mike; Genovese, Tony
2005-01-01
ESnet provides authentication services and trust federation support for SciDAC projects, collaboratories, and other distributed computing applications. The ESnet ATF team operates the DOEGrids Certificate Authority, available to all DOE Office of Science programs, plus several custom CAs, including one for the National Fusion Collaboratory and one for NERSC. The secure hardware and software environment developed to support CAs is suitable for supporting additional custom authentication and authorization applications that your program might require. Seamless, secure interoperation across organizational and international boundaries is vital to collaborative science. We are fostering the development of international PKI federations by founding the TAGPMA, the American regional PMA, and the worldwide IGTF Policy Management Authority (PMA), as well as participating in European and Asian regional PMAs. We are investigating and prototyping distributed authentication technology that will allow us to support the "roaming scientist" (distributed wireless via eduroam), as well as more secure authentication methods (one-time password tokens).
Big Data over a 100G network at Fermilab
Garzoglio, Gabriele; Mhashilkar, Parag; Kim, Hyunwoo; ...
2014-06-11
As the need for Big Data in science becomes ever more relevant, networks around the world are upgrading their infrastructure to support high-speed interconnections. To support its mission, the high-energy physics community as a pioneer in Big Data has always been relying on the Fermi National Accelerator Laboratory to be at the forefront of storage and data movement. This need was reiterated in recent years with the data-taking rate of the major LHC experiments reaching tens of petabytes per year. At Fermilab, this resulted regularly in peaks of data movement on the wide area network (WAN) in and out of the laboratory of about 30 Gbit/s, and on the local area network (LAN) between storage and computational farms of 160 Gbit/s. To address these ever increasing needs, as of this year Fermilab is connected to the Energy Sciences Network (ESnet) through a 100 Gb/s link. To understand the optimal system- and application-level configuration to interface computational systems with the new high-speed interconnect, Fermilab has deployed a Network Research & Development facility connected to the ESnet 100G Testbed. For the past two years, the High Throughput Data Program (HTDP) has been using the Testbed to identify gaps in data movement middleware [5] when transferring data at these high speeds. The program has published evaluations of technologies typically used in High Energy Physics, such as GridFTP [4], XrootD [9], and Squid [8]. Furthermore, this work presents the new R&D facility and the continuation of the evaluation program.
NASA/SPAN and DOE/ESnet-DECnet transition strategy for DECnet OSI/phase 5
NASA Technical Reports Server (NTRS)
Porter, Linda; Demar, Phil
1991-01-01
The technical issues involved in the transition of very large DECnet networks from DECnet Phase IV protocols to DECnet OSI/Phase V protocols are examined. The networks involved are NASA's Science Internet (NSI-DECnet) and DOE's Energy Sciences Network (ESnet-DECnet). These networks, along with the many universities and research institutions connected to them, combine to form a single DECnet network containing more than 20,000 nodes and crossing numerous organizational boundaries. Transition planning is discussed, including decisions about Phase V naming, addressing, and routing. Also discussed are transition issues related to the use of non-DEC routers in the network.
An authentication infrastructure for today and tomorrow
DOE Office of Scientific and Technical Information (OSTI.GOV)
Engert, D.E.
1996-06-01
The Open Software Foundation's Distributed Computing Environment (OSF/DCE) was originally designed to provide a secure environment for distributed applications. By combining it with Kerberos Version 5 from MIT, it can be extended to provide network security as well. This combination can be used to build both an inter- and intra-organizational infrastructure while providing single sign-on for the user with overall improved security. The ESnet community of the Department of Energy is building just such an infrastructure. ESnet has modified these systems to improve their interoperability, while encouraging the developers to incorporate these changes and work more closely together to continue to improve the interoperability. The success of this infrastructure depends on its flexibility to meet the needs of many applications and network security requirements. The open nature of Kerberos, combined with the vendor support of OSF/DCE, provides the infrastructure for today and tomorrow.
mdtmFTP and its evaluation on ESNET SDN testbed
Zhang, Liang; Wu, Wenji; DeMar, Phil; ...
2017-04-21
In this paper, to address the high-performance challenges of data transfer in the big data era, we are developing and implementing mdtmFTP: a high-performance data transfer tool for big data. mdtmFTP has four salient features. First, it adopts an I/O centric architecture to execute data transfer tasks. Second, it more efficiently utilizes the underlying multicore platform through optimized thread scheduling. Third, it implements a large virtual file mechanism to address the lots-of-small-files (LOSF) problem. Fourth, mdtmFTP integrates multiple optimization mechanisms, including zero copy, asynchronous I/O, pipelining, batch processing, and pre-allocated buffer pools, to enhance performance. mdtmFTP has been extensively tested and evaluated within the ESnet 100G testbed. Evaluations show that mdtmFTP can achieve higher performance than existing data transfer tools, such as GridFTP, FDT, and BBCP.
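To illustrate the lots-of-small-files (LOSF) issue mentioned above, the sketch below aggregates many small files into a single stream so that per-file overhead is paid once for the aggregate rather than once per file. This is only a toy Python illustration of the aggregation idea behind a "large virtual file"; mdtmFTP's actual mechanism lives inside the transfer tool itself and is not shown here, and the function and directory names are made up for the example.

```python
import io
import tarfile
from pathlib import Path

def pack_small_files(src_dir: str) -> bytes:
    """Aggregate many small files into a single in-memory stream.

    Illustrates the general idea behind a 'large virtual file': per-file
    transfer overhead is paid once for the aggregate instead of once per
    file. (This is a toy sketch of the concept, not mdtmFTP's mechanism.)
    """
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w|") as tar:  # streaming tar, no compression
        for path in sorted(Path(src_dir).rglob("*")):
            if path.is_file():
                tar.add(path, arcname=str(path.relative_to(src_dir)))
    return buf.getvalue()
```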
Science DMZ Design Could Help Transfer, Protect Medical Research Data: As medicine becomes more data-intensive, a Medical Science DMZ design proposed by ESnet's Sean Peisert and Eli Dart could provide a secure solution for medical science data transfers.
Kerberos authentication: The security answer for unsecured networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Engert, D.E.
1995-06-01
Traditional authentication schemes do not properly address the problems encountered with today's unsecured networks. Kerberos, developed by MIT, on the other hand, is designed to operate in an open, unsecured network, yet provide good authentication and security, including encrypted session traffic. Basic Kerberos principles, as well as experiences of the ESnet Authentication Pilot Project with cross-realm authentication between four national laboratories, are also described.
Cooperative Institute for Research in the Atmosphere (CIRA) Requirements Review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zurawski, Jason W.; Mace, Kathryn P.
2016-08-11
In August 2016, the Energy Sciences Network (ESnet) and Colorado State University (CSU) organized a review to characterize the networking requirements of the Cooperative Institute for Research in the Atmosphere (CIRA), located on the CSU campus. The review produced several key findings that point to ways of improving the overall scientific process for CIRA and CSU.
A Class of Manifold Regularized Multiplicative Update Algorithms for Image Clustering.
Yang, Shangming; Yi, Zhang; He, Xiaofei; Li, Xuelong
2015-12-01
Multiplicative update algorithms are important tools for information retrieval, image processing, and pattern recognition. However, when the graph regularization is added to the cost function, different classes of sample data may be mapped to the same subspace, which leads to the increase of data clustering error rate. In this paper, an improved nonnegative matrix factorization (NMF) cost function is introduced. Based on the cost function, a class of novel graph regularized NMF algorithms is developed, which results in a class of extended multiplicative update algorithms with manifold structure regularization. Analysis shows that, during learning, the proposed algorithms can efficiently minimize the rank of the data representation matrix. Theoretical results presented in this paper are confirmed by simulations. For different initializations and data sets, variation curves of cost functions and decomposition data are presented to show the convergence features of the proposed update rules. Basis images, reconstructed images, and clustering results are utilized to present the efficiency of the new algorithms. Finally, the clustering accuracies of different algorithms are also investigated, which shows that the proposed algorithms can achieve state-of-the-art performance in applications of image clustering.
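As a concrete point of reference for the class of algorithms described above, the Python sketch below implements a standard graph-regularized NMF multiplicative update (minimizing ||X - U V^T||_F^2 + lam * tr(V^T L V) with graph Laplacian L = D - W). It is not the authors' improved cost function or their specific update rules; the affinity construction, variable names, and parameter values are illustrative assumptions only.

```python
import numpy as np

def gnmf(X, W, k, lam=0.1, n_iter=200, eps=1e-9, seed=0):
    """Graph-regularized NMF via multiplicative updates.

    Minimizes ||X - U V^T||_F^2 + lam * tr(V^T L V), with L = D - W the
    graph Laplacian of the sample affinity matrix W (n x n).
    X is (m features) x (n samples); returns U (m x k) and V (n x k).
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    U = rng.random((m, k))
    V = rng.random((n, k))
    D = np.diag(W.sum(axis=1))
    for _ in range(n_iter):
        # Multiplicative update for the basis matrix U
        U *= (X @ V) / (U @ (V.T @ V) + eps)
        # Multiplicative update for the graph-regularized coefficients V
        V *= (X.T @ U + lam * (W @ V)) / (V @ (U.T @ U) + lam * (D @ V) + eps)
    return U, V

# Toy usage: 100 nonnegative samples of 50 features, 5-nearest-neighbor affinity graph
X = np.abs(np.random.default_rng(1).normal(size=(50, 100)))
W = np.zeros((100, 100))
for i in range(100):
    d = np.linalg.norm(X.T - X.T[i], axis=1)
    for j in np.argsort(d)[1:6]:
        W[i, j] = W[j, i] = 1.0
U, V = gnmf(X, W, k=10)
```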
Updated atomic weights: Time to review our table
Coplen, Tyler B.; Meyers, Fabienne; Holden, Norman E.
2016-01-01
Despite common belief, atomic weights are not necessarily constants of nature. Scientists’ ability to measure these values is regularly improving, so one would expect that the accuracy of these values should be improving with time. It is the task of the IUPAC (International Union of Pure and Applied Chemistry) Commission on Isotopic Abundances and Atomic Weights (CIAAW) to regularly review atomic-weight determinations and release updated values. According to an evaluation published in Pure and Applied Chemistry [1], even the most simplified table abridged to four significant digits needs to be updated for the elements selenium and molybdenum. According to the most recent 2015 release of "Atomic Weights of the Elements" [2], another update is needed for ytterbium.
IMMUNISATION TRAINING NEEDS IN MALAWI.
Tsega, A Y; Hausi, H T; Steinglass, R; Chirwa, G Z
2014-09-01
The Malawi Ministry of Health (MOH) and its immunisation partners conducted a training needs assessment in May 2013 to assess the current status of immunisation training programmes in health training institutions, to identify unmet training needs, and to recommend possible solutions for training of health workers on a regular basis. A cross-sectional, descriptive study. Health training institutions in Malawi, a developing country that does not regularly update its curricula to include new vaccines and management tools, or train healthcare workers on a regular basis. Researchers interviewed Malawi's central immunisation manager, three zonal immunisation officers, six district officers, 12 health facility immunisation coordinators, and eight principals of training institutions. All health training institutions in Malawi include immunisation in their preservice training curricula. However, the curriculum is not regularly updated; thus, the graduates are not well equipped to provide quality services. In addition, the duration of the training curriculum is inadequate, and in-service training sessions for managers and service providers are conducted only on an ad hoc basis. Immunisation training needs are not sufficiently met at any level of Malawi's health system, and the health training institutions teach their students with outdated materials. It is recommended that the training institutions update their training curricula regularly and that service providers be trained on a regular basis.
ERIC Educational Resources Information Center
Mock, Karen R.
1998-01-01
Updates cases and issues previously discussed in this regular column on human rights in Canada, including racism and anti-Semitism, laws on hate crimes, hate sites on the World Wide Web, the use of the "free speech" defense by hate groups, and legal challenges to antiracist groups by individuals criticized by them. (DSK)
RAEGE Project Update: Yebes Observatory Broadband Receiver Ready for VGOS
NASA Astrophysics Data System (ADS)
IGN Yebes Observatory staff
2016-12-01
An update on the deployment and activities of the Spanish/Portuguese RAEGE project ("Atlantic Network of Geodynamical and Space Stations") is presented. While regular observations with the Yebes radio telescope are ongoing, technological development of receivers for VGOS is progressing at the Yebes laboratories.
Robust Object Tracking with a Hierarchical Ensemble Framework
2016-10-09
... 4) update the top layer; 5) re-extract the sub-patches and update their weights in the middle layer; 6) update the parameters of weak classifiers ... approaches [4], [5], which represent the target with a limited number of non-overlapping or regular local regions, so they may not cope well with large ... significantly reduce the feature dimensions so that our approach can handle colorful images without suffering from exponential memory explosion ...
Jungé, Justin A; Scholl, Brian J; Chun, Marvin M
2007-01-01
Over repeated exposure to particular visual search displays, subjects are able to implicitly extract regularities that then make search more efficient, a phenomenon known as contextual cueing. Here we explore how the learning involved in contextual cueing is formed, maintained, and updated over experience. During an initial training phase, a group of signal first subjects searched through a series of predictive displays (where distractor locations were perfectly correlated with the target location), followed with no overt break by a series of unpredictive displays (where repeated contexts were uncorrelated with target locations). A second noise first group of subjects encountered the unpredictive displays followed by the predictive displays. Despite the fact that both groups had the same overall exposure to signal and noise, only the signal first group demonstrated subsequent contextual cueing. This primacy effect indicates that initial experience can result in hypotheses about regularities in displays (or the lack thereof), which then become resistant to updating. The absence of regularities in early stages of training even blocked observers from learning predictive regularities later on.
Next Generation Integrated Environment for Collaborative Work Across Internets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harvey B. Newman
2009-02-24
We are now well-advanced in our development, prototyping and deployment of a high performance next generation Integrated Environment for Collaborative Work. The system, aimed at using the capability of ESnet and Internet2 for rapid data exchange, is based on the Virtual Room Videoconferencing System (VRVS) developed by Caltech. The VRVS system has been chosen by the Internet2 Digital Video (I2-DV) Initiative as a preferred foundation for the development of advanced video, audio and multimedia collaborative applications by the Internet2 community. Today, the system supports high-end, broadcast-quality interactivity, while enabling a wide variety of clients (Mbone, H.323) to participate in the same conference by running different standard protocols in different contexts with different bandwidth connection limitations, has a fully Web-integrated user interface, developers and administrative APIs, a widely scalable video network topology based on both multicast domains and unicast tunnels, and demonstrated multiplatform support. This has led to its rapidly expanding production use for national and international scientific collaborations in more than 60 countries. We are also in the process of creating a 'testbed video network' and developing the necessary middleware to support a set of new and essential requirements for rapid data exchange, and a high level of interactivity in large-scale scientific collaborations. These include a set of tunable, scalable differentiated network services adapted to each of the data streams associated with a large number of collaborative sessions, policy-based and network state-based resource scheduling, authentication, and optional encryption to maintain confidentiality of inter-personal communications. High performance testbed video networks will be established in ESnet and Internet2 to test and tune the implementation, using a few target application-sets.
The Practical Obstacles of Data Transfer: Why researchers still love scp
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nam, Hai Ah; Hill, Jason J; Parete-Koon, Suzanne T
The importance of computing facilities is heralded every six months with the announcement of the new Top500 list, showcasing the world's fastest supercomputers. Unfortunately, with great computing capability does not come great long-term data storage capacity, which often means users must move their data to their local site archive, to remote sites where they may be doing future computation or analysis, or back to their home institution, or else face the dreaded data purge that most HPC centers employ to keep utilization of large parallel filesystems low to manage performance and capacity. At HPC centers, data transfer is crucial to the scientific workflow and will increase in importance as computing systems grow in size. The Energy Sciences Network (ESnet) recently launched its fifth generation network, a 100 Gbps high-performance, unclassified national network connecting more than 40 DOE research sites to support scientific research and collaboration. Despite the tenfold increase in bandwidth to DOE research sites amenable to multiple data transfer streams and high throughput, in practice, researchers often under-utilize the network and resort to painfully slow single stream transfer methods such as scp to avoid the complexity of using multiple stream tools such as GridFTP and bbcp, and contend with frustration from the lack of consistency of available tools between sites. In this study we survey and assess the data transfer methods provided at several DOE supported computing facilities, including both leadership-computing facilities, connected through ESnet. We present observed transfer rates, suggested optimizations, and discuss the obstacles the tools must overcome to receive widespread adoption over scp.
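The gap described above is easy to quantify with back-of-the-envelope arithmetic. The small Python helper below converts a dataset size and an effective line rate into a transfer time; the dataset size and efficiency factors are illustrative assumptions, not figures from the study.

```python
def transfer_hours(dataset_tb: float, rate_gbps: float, efficiency: float = 1.0) -> float:
    """Hours needed to move `dataset_tb` terabytes at `rate_gbps` (gigabits/s),
    scaled by an end-to-end efficiency factor (protocol/stream limitations)."""
    bits = dataset_tb * 1e12 * 8          # terabytes -> bits (decimal units)
    seconds = bits / (rate_gbps * 1e9 * efficiency)
    return seconds / 3600.0

# Illustrative (not measured) numbers: a 100 TB dataset over a 100 Gbps path
print(transfer_hours(100, 100, efficiency=0.8))   # ~2.8 hours with tuned parallel streams
print(transfer_hours(100, 100, efficiency=0.01))  # ~222 hours if scp sustains only ~1 Gbps
```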
Designing Adult Learning Strategies: The Case of South Eastern Europe
ERIC Educational Resources Information Center
Gunny, Madeleine; Viertel, Evelyn
2006-01-01
The importance of lifelong learning is generally well understood and few people today would query the need for adults to regularly update their skills in line with labour market needs, and for governments and social partners to provide an environment that supports skills acquisition and updating. However, it is clear when we look at data from the…
High throughput profile-profile based fold recognition for the entire human proteome.
McGuffin, Liam J; Smith, Richard T; Bryson, Kevin; Sørensen, Søren-Aksel; Jones, David T
2006-06-07
In order to maintain the most comprehensive structural annotation databases we must carry out regular updates for each proteome using the latest profile-profile fold recognition methods. The ability to carry out these updates on demand is necessary to keep pace with the regular updates of sequence and structure databases. Providing the highest quality structural models requires the most intensive profile-profile fold recognition methods running with the very latest available sequence databases and fold libraries. However, running these methods on such a regular basis for every sequenced proteome requires large amounts of processing power. In this paper we describe and benchmark the JYDE (Job Yield Distribution Environment) system, which is a meta-scheduler designed to work above cluster schedulers, such as Sun Grid Engine (SGE) or Condor. We demonstrate the ability of JYDE to distribute the load of genomic-scale fold recognition across multiple independent Grid domains. We use the most recent profile-profile version of our mGenTHREADER software in order to annotate the latest version of the Human proteome against the latest sequence and structure databases in as short a time as possible. We show that our JYDE system is able to scale to large numbers of intensive fold recognition jobs running across several independent computer clusters. Using our JYDE system we have been able to annotate 99.9% of the protein sequences within the Human proteome in less than 24 hours, by harnessing over 500 CPUs from 3 independent Grid domains. This study clearly demonstrates the feasibility of carrying out on demand high quality structural annotations for the proteomes of major eukaryotic organisms. Specifically, we have shown that it is now possible to provide complete regular updates of profile-profile based fold recognition models for entire eukaryotic proteomes, through the use of Grid middleware such as JYDE.
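For readers unfamiliar with the "scheduler above schedulers" idea, the toy Python sketch below simply spreads a list of independent jobs across several cluster back ends in round-robin fashion. It is emphatically not JYDE, which handles job yield, failures, and heterogeneous Grid domains; the cluster and job names here are hypothetical.

```python
from itertools import cycle

def dispatch(jobs, clusters):
    """Toy meta-scheduling illustration: spread independent jobs across several
    cluster back ends (e.g., SGE or Condor pools) in round-robin order. This only
    shows the idea of farming work out to whichever Grid domains are available."""
    assignment = {c: [] for c in clusters}
    for job, cluster in zip(jobs, cycle(clusters)):
        assignment[cluster].append(job)
    return assignment

jobs = [f"fold_recognition_target_{i:05d}" for i in range(30000)]  # hypothetical job names
print({c: len(j) for c, j in dispatch(jobs, ["clusterA", "clusterB", "clusterC"]).items()})
```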
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-01
... will be updated on a regular basis and announced to the public via NOAA Weather Radio, Fishery Bulletin... area as needed (on a daily or weekly basis) and announce the revised closed area via NOAA Weather Radio... obtain the updated boundary coordinates for the fishery closed area by listening to NOAA Weather Radio...
Biswas, Samir Kumar; Kanhirodan, Rajan; Vasu, Ram Mohan; Roy, Debasish
2011-08-01
We explore a pseudodynamic form of the quadratic parameter update equation for diffuse optical tomographic reconstruction from noisy data. A few explicit and implicit strategies for obtaining the parameter updates via a semianalytical integration of the pseudodynamic equations are proposed. Despite the ill-posedness of the inverse problem associated with diffuse optical tomography, adoption of the quadratic update scheme combined with the pseudotime integration appears to yield not only faster convergence but also a muted sensitivity to the regularization parameters, which include the pseudotime step size for integration. These observations are validated through reconstructions with both numerically generated and experimentally acquired data.
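As a rough orientation (not the authors' exact scheme), one generic explicit pseudo-time discretization of a Gauss-Newton-type quadratic update for a parameter vector p, with forward model G, Jacobian J_k, and data y, can be written as

\[
p_{k+1} = p_k + \Delta t \,\Big[ J_k^{\top} J_k + \tfrac{1}{\Delta t} I \Big]^{-1} J_k^{\top}\,\big( y - G(p_k) \big),
\]

where the pseudo-time step \(\Delta t\) plays the role of a Levenberg-Marquardt-style damping/regularization parameter; the paper's semianalytical integration strategies and its precise quadratic form differ in detail.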
Seghouane, Abd-Krim; Iqbal, Asif
2017-09-01
Sequential dictionary learning algorithms have been successfully applied to functional magnetic resonance imaging (fMRI) data analysis. fMRI data sets are, however, structured data matrices with the notions of temporal smoothness in the column direction. This prior information, which can be converted into a constraint of smoothness on the learned dictionary atoms, has seldom been included in classical dictionary learning algorithms when applied to fMRI data analysis. In this paper, we tackle this problem by proposing two new sequential dictionary learning algorithms dedicated to fMRI data analysis by accounting for this prior information. These algorithms differ from the existing ones in their dictionary update stage. The steps of this stage are derived as a variant of the power method for computing the SVD. The proposed algorithms generate regularized dictionary atoms via the solution of a left regularized rank-one matrix approximation problem where temporal smoothness is enforced via regularization through basis expansion and sparse basis expansion in the dictionary update stage. Applications on synthetic data experiments and real fMRI data sets illustrating the performance of the proposed algorithms are provided.
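The following Python sketch shows the flavor of a smoothness-regularized rank-one dictionary update: a power-method-style rank-one fit of the residual in which the temporal atom is projected onto a small cosine basis, a crude stand-in for the basis-expansion regularization described above. The basis choice, the absence of sparsity handling, and all names and parameters are illustrative assumptions; the authors' exact update differs.

```python
import numpy as np

def smooth_rank_one_update(E, n_basis=10, n_iter=20):
    """One dictionary-update step as a power-method-style rank-one fit of the
    residual E (time x voxels), with the temporal atom d regularized by
    projection onto a small cosine basis (a stand-in for basis expansion)."""
    T, N = E.shape
    t = np.arange(T)
    # Low-frequency cosine basis enforcing temporal smoothness
    B = np.column_stack([np.cos(np.pi * k * (t + 0.5) / T) for k in range(n_basis)])
    B, _ = np.linalg.qr(B)                     # orthonormalize the basis
    x = np.random.default_rng(0).normal(size=N)
    for _ in range(n_iter):
        d = E @ x                              # power-method step: temporal atom
        d = B @ (B.T @ d)                      # enforce smoothness via the basis
        d /= np.linalg.norm(d) + 1e-12
        x = E.T @ d                            # corresponding spatial code
    return d, x

d, x = smooth_rank_one_update(np.random.default_rng(1).normal(size=(200, 500)))
```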
RBOOST: RIEMANNIAN DISTANCE BASED REGULARIZED BOOSTING
Liu, Meizhu; Vemuri, Baba C.
2011-01-01
Boosting is a versatile machine learning technique that has numerous applications including but not limited to image processing, computer vision, data mining, etc. It is based on the premise that the classification performance of a set of weak learners can be boosted by some weighted combination of them. There have been a number of boosting methods proposed in the literature, such as the AdaBoost, LPBoost, SoftBoost and their variations. However, the learning update strategies used in these methods usually lead to overfitting and instabilities in the classification accuracy. Improved boosting methods via regularization can overcome such difficulties. In this paper, we propose a Riemannian distance regularized LPBoost, dubbed RBoost. RBoost uses the Riemannian distance between two square-root densities (in closed form), used to represent the distribution over the training data and the classification error respectively, to regularize the error distribution in an iterative update formula. Since this distance is in closed form, RBoost requires much less computational cost compared to other regularized Boosting algorithms. We present several experimental results depicting the performance of our algorithm in comparison to recently published methods, LPBoost and CAVIAR, on a variety of datasets including the publicly available OASIS database, a home-grown Epilepsy database and the well known UCI repository. Results show that the RBoost algorithm performs better than the competing methods in terms of accuracy and efficiency. PMID:21927643
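For context, the closed-form geodesic distance between two discrete densities p and q represented by their square roots (points on the unit sphere) is, up to a convention-dependent constant factor,

\[
d(p, q) = \arccos\!\Big( \sum_i \sqrt{p_i\, q_i} \Big),
\]

i.e., the arc length on the unit hypersphere between \(\sqrt{p}\) and \(\sqrt{q}\); RBoost uses this kind of quantity to regularize its iterative weight update.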
Evolutionary graph theory: breaking the symmetry between interaction and replacement
Ohtsuki, Hisashi; Pacheco, Jorge M.; Nowak, Martin A.
2008-01-01
We study evolutionary dynamics in a population whose structure is given by two graphs: the interaction graph determines who plays with whom in an evolutionary game; the replacement graph specifies the geometry of evolutionary competition and updating. First, we calculate the fixation probabilities of frequency dependent selection between two strategies or phenotypes. We consider three different update mechanisms: birth-death, death-birth and imitation. Then, as a particular example, we explore the evolution of cooperation. Suppose the interaction graph is a regular graph of degree h, the replacement graph is a regular graph of degree g and the overlap between the two graphs is a regular graph of degree l. We show that cooperation is favored by natural selection if b/c > hg/l. Here, b and c denote the benefit and cost of the altruistic act. This result holds for death-birth updating, weak selection and large population size. Note that the optimum population structure for cooperators is given by maximum overlap between the interaction and the replacement graph (g = h = l), which means that the two graphs are identical. We also prove that a modified replicator equation can describe how the expected values of the frequencies of an arbitrary number of strategies change on replacement and interaction graphs: the two graphs induce a transformation of the payoff matrix. PMID:17350049
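Written out, the cooperation condition stated above is

\[
\frac{b}{c} > \frac{h\,g}{l},
\]

which, when the interaction and replacement graphs coincide (g = h = l = k), reduces to the simple b/c > k rule for death-birth updating under weak selection in large populations.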
Rücker, Viktoria; Keil, Ulrich; Fitzgerald, Anthony P; Malzahn, Uwe; Prugger, Christof; Ertl, Georg; Heuschmann, Peter U; Neuhauser, Hannelore
2016-01-01
Estimation of absolute risk of cardiovascular disease (CVD), preferably with population-specific risk charts, has become a cornerstone of CVD primary prevention. Regular recalibration of risk charts may be necessary due to decreasing CVD rates and CVD risk factor levels. The SCORE risk charts for fatal CVD risk assessment were first calibrated for Germany with 1998 risk factor level data and 1999 mortality statistics. We present an update of these risk charts based on the SCORE methodology including estimates of relative risks from SCORE, risk factor levels from the German Health Interview and Examination Survey for Adults 2008–11 (DEGS1) and official mortality statistics from 2012. Competing risks methods were applied and estimates were independently validated. Updated risk charts were calculated based on cholesterol, smoking, systolic blood pressure risk factor levels, sex and 5-year age-groups. The absolute 10-year risk estimates of fatal CVD were lower according to the updated risk charts compared to the first calibration for Germany. In a nationwide sample of 3062 adults aged 40–65 years free of major CVD from DEGS1, the mean 10-year risk of fatal CVD estimated by the updated charts was lower by 29% and the estimated proportion of high risk people (10-year risk ≥ 5%) by 50% compared to the older risk charts. This recalibration shows a need for regular updates of risk charts according to changes in mortality and risk factor levels in order to sustain the identification of people with a high CVD risk. PMID:27612145
India: Chronology of Recent Events
2007-02-13
Order Code RS21589, Updated February 13, 2007. India: Chronology of Recent Events. K. Alan Kronstadt, Specialist in Asian Affairs, Foreign Affairs, Defense, and Trade Division. Summary: This report provides a reverse chronology of recent events involving India and India-U.S. relations. Sources include ... India-U.S. Relations. This report will be updated regularly. 02/13/07: Commerce Secretary Gutierrez began a two-day visit to New Delhi, where he
(Update) Wellness Challenge: How Are You Doing with Your New Year’s Resolutions? | Poster
Editor’s note: This article has been updated since its original post on May 29 to include information on the quick link from the Poster home page. Remember those fitness resolutions you made at the beginning of the year? Were you going to lose weight, quit smoking, reduce alcohol intake, or establish a regular workout routine? If you have neglected some of these resolutions
Benefits of regular aerobic exercise for executive functioning in healthy populations.
Guiney, Hayley; Machado, Liana
2013-02-01
Research suggests that regular aerobic exercise has the potential to improve executive functioning, even in healthy populations. The purpose of this review is to elucidate which components of executive functioning benefit from such exercise in healthy populations. In light of the developmental time course of executive functions, we consider separately children, young adults, and older adults. Data to date from studies of aging provide strong evidence of exercise-linked benefits related to task switching, selective attention, inhibition of prepotent responses, and working memory capacity; furthermore, cross-sectional fitness data suggest that working memory updating could potentially benefit as well. In young adults, working memory updating is the main executive function shown to benefit from regular exercise, but cross-sectional data further suggest that task-switching and post error performance may also benefit. In children, working memory capacity has been shown to benefit, and cross-sectional data suggest potential benefits for selective attention and inhibitory control. Although more research investigating exercise-related benefits for specific components of executive functioning is clearly needed in young adults and children, when considered across the age groups, ample evidence indicates that regular engagement in aerobic exercise can provide a simple means for healthy people to optimize a range of executive functions.
NASA Astrophysics Data System (ADS)
Schneider, Barry I.; Segura, Javier; Gil, Amparo; Guan, Xiaoxu; Bartschat, Klaus
2018-04-01
This is a revised and updated version of a modern Fortran 90 code to compute the regular P_l^m(x) and irregular Q_l^m(x) associated Legendre functions for all x ∈ (-1, +1) (on the cut) and |x| > 1, and for integer degree (l) and order (m). The necessity to revise the code comes as a consequence of some comments of Prof. James Bremer of the UC Davis Mathematics Department, who discovered that there were errors in the code for large integer degree and order for the normalized regular Legendre functions on the cut.
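The Fortran 90 code itself is not reproduced in this record; as an independent cross-check on the cut, the regular functions P_l^m(x) for -1 < x < 1 can be evaluated with SciPy as sketched below. Note that SciPy's lpmv covers neither |x| > 1 nor the irregular Q_l^m, which is where the code described above is needed.

```python
import numpy as np
from scipy.special import lpmv

# Regular associated Legendre functions P_l^m(x) on the cut -1 < x < 1,
# evaluated with SciPy purely as a cross-check against other implementations.
x = np.linspace(-0.99, 0.99, 5)
for l, m in [(2, 0), (2, 1), (5, 3)]:
    print(l, m, lpmv(m, l, x))
```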
Architectures and Design for Next-Generation Hybrid Circuit/Packet Networks
NASA Astrophysics Data System (ADS)
Vadrevu, Sree Krishna Chaitanya
Internet traffic is increasing rapidly at an annual growth rate of 35%, with aggregate traffic exceeding several exabytes per month. The traffic is also becoming heterogeneous in bandwidth and quality-of-service (QoS) requirements with growing popularity of cloud computing, video-on-demand (VoD), e-science, etc. Hybrid circuit/packet networks, which can jointly support circuit and packet services, along with the adoption of high-bit-rate transmission systems, form an attractive solution to address the traffic growth. 10 Gbps and 40 Gbps transmission systems are widely deployed in telecom backbone networks such as Comcast, AT&T, etc., and network operators are considering migration to 100 Gbps and beyond. This dissertation proposes robust architectures, capacity migration strategies, and novel service frameworks for next-generation hybrid circuit/packet architectures. In this dissertation, we study two types of hybrid circuit/packet networks: a) IP-over-WDM networks, in which the packet (IP) network is overlaid on top of the circuit (optical WDM) network, and b) hybrid networks in which the circuit and packet networks are deployed side by side, such as US DoE's ESnet. We investigate techniques to dynamically migrate capacity between the circuit and packet sections by exploiting traffic variations over a day, and our methods show that significant bandwidth savings can be obtained with improved reliability of services. Specifically, we investigate how idle backup circuit capacity can be used to support packet services in IP-over-WDM networks, and similarly, excess capacity in the packet network to support circuit services in ESnet. Control schemes that enable our mechanisms are also discussed. In IP-over-WDM networks, with upcoming 100 Gbps and beyond, dedicated protection will induce significant under-utilization of backup resources. We investigate design strategies to loan idle circuit backup capacity to support IP/packet services. However, failure of backup circuits will preempt IP services routed over them, and thus it is important to ensure IP topology survivability to successfully re-route preempted IP services. Integer-linear-program (ILP) and heuristic solutions have been developed and network cost reduction up to 60% has been observed. In ESnet, we study loaning packet links to support circuit services. Mixed-line-rate (MLR) networks supporting 10/40/100 Gbps on the same fiber are becoming increasingly popular. Services that accept degradation in bandwidth, latency, jitter, etc. under failure scenarios for lower cost are known as degraded services. We study degradation in bandwidth for lower cost under failure scenarios, a concept called partial protection, in the context of MLR networks. We notice partial protection enables significant cost savings compared to full protection. To cope with traffic growth, network operators need to deploy equipment at periodic time intervals, and this is known as the multi-period planning and upgrade problem. We study three important multi-period planning approaches, namely incremental planning, all-period planning, and two-period planning with mixed line rates. Our approaches predict the network equipment that needs to be deployed optimally at which nodes and at which time periods in the network to meet QoS requirements.
Selection of regularization parameter in total variation image restoration.
Liao, Haiyong; Li, Fang; Ng, Michael K
2009-11-01
We consider and study total variation (TV) image restoration. In the literature there are several regularization parameter selection methods for Tikhonov regularization problems (e.g., the discrepancy principle and the generalized cross-validation method). However, to our knowledge, these selection methods have not been applied to TV regularization problems. The main aim of this paper is to develop a fast TV image restoration method with an automatic selection of the regularization parameter scheme to restore blurred and noisy images. The method exploits the generalized cross-validation (GCV) technique to determine inexpensively how much regularization to use in each restoration step. By updating the regularization parameter in each iteration, the restored image can be obtained. Our experimental results for testing different kinds of noise show that the visual quality and SNRs of images restored by the proposed method are promising. We also demonstrate that the method is efficient, as it can restore images of size 256 x 256 in approximately 20 s in the MATLAB computing environment.
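The paper selects the parameter with GCV inside its own iterative TV restoration; as a much simpler stand-in that only illustrates automating the choice, the Python sketch below sweeps the TV weight of scikit-image's Chambolle denoiser and keeps the value whose residual power best matches an assumed known noise variance (a discrepancy-principle heuristic, not the paper's method). The test image, noise level, and weight grid are illustrative assumptions.

```python
import numpy as np
from skimage import data, util
from skimage.restoration import denoise_tv_chambolle

# Sweep the TV weight and keep the one whose residual power best matches the
# (assumed known) noise variance: a discrepancy-principle heuristic, used here
# only to illustrate automatic parameter selection for TV restoration.
sigma = 0.1
clean = util.img_as_float(data.camera())
noisy = clean + np.random.default_rng(0).normal(scale=sigma, size=clean.shape)

best_w, best_gap = None, np.inf
for w in np.logspace(-3, 0, 12):
    restored = denoise_tv_chambolle(noisy, weight=w)
    gap = abs(np.mean((restored - noisy) ** 2) - sigma**2)
    if gap < best_gap:
        best_w, best_gap = w, gap
print("selected TV weight:", best_w)
```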
Physical Activity Improves Quality of Life
... Physical activity improves quality of life. Updated: Mar 2, 2015. ... proven to improve both mental and physical health. Physical activity boosts mental wellness. Regular physical activity can relieve ...
Global Social Media Directory: A Resource Guide
DOE Office of Scientific and Technical Information (OSTI.GOV)
Noonan, Christine F.; Piatt, Andrew W.
The Global Social Media Directory is a resource guide providing information on social networking services around the globe. This information changes rapidly, therefore, this document will be updated on a regular basis and as funding permits.
76 FR 12356 - Farm Credit Administration Board; Sunshine Act; Regular Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-07
... concludes its business. FOR FURTHER INFORMATION CONTACT: Dale L. Aultman, Secretary to the Farm Credit... on Borrowers Rights--Part II. Update on Dodd-Frank Rulemaking Projects. Dated: March 2, 2011. Dale L...
LHCNet: Wide Area Networking and Collaborative Systems for HEP
DOE Office of Scientific and Technical Information (OSTI.GOV)
Newman, H. B.
2007-08-20
This proposal presents the status and progress in 2006-7, and the technical and financial plans for 2008-2010 for the US LHCNet transatlantic network supporting U.S. participation in the LHC physics program. US LHCNet provides transatlantic connections of the Tier1 computing facilities at Fermilab and Brookhaven with the Tier0 and Tier1 facilities at CERN as well as Tier1s elsewhere in Europe and Asia. Together with ESnet, Internet2, the GEANT pan-European network, and NSF's UltraLight project, US LHCNet also supports connections between the Tier2 centers (where most of the analysis of the data will take place, starting this year) and the Tier1s as needed.
Implementation of aerial LiDAR technology to update highway feature inventory.
DOT National Transportation Integrated Search
2016-12-01
Highway assets, including traffic signs, traffic signals, light poles, and guardrails, are important components of : transportation networks. They guide, warn and protect drivers, and regulate traffic. To manage and maintain the : regular operation o...
Updating ARI Educational Benefits Usage Data Bases for Army Regular, Reserve, and Guard: 2005 - 2006
2007-09-01
[Flattened tabular data from the report, not reproduced here: MGIB Regular Army data as of September 2005, including Table 3 (percent of users by MGIB 2-year/3-year/4-year enrollment category) and Table 5 (time-related usage counts by year).]
Visual tracking based on the sparse representation of the PCA subspace
NASA Astrophysics Data System (ADS)
Chen, Dian-bing; Zhu, Ming; Wang, Hui-li
2017-09-01
We construct a collaborative model of the sparse representation and the subspace representation. First, we represent the tracking target in the principal component analysis (PCA) subspace, and then we employ an L1 regularization to restrict the sparsity of the residual term, an L2 regularization term to restrict the sparsity of the representation coefficients, and an L2 norm to restrict the distance between the reconstruction and the target. Then we implement the algorithm in the particle filter framework. Furthermore, an iterative method is presented to find the global minimum of the residual and the coefficients. Finally, an alternative template update scheme is adopted to avoid the tracking drift caused by inaccurate updates. In the experiments, we test the algorithm on 9 sequences and compare the results with 5 state-of-the-art methods. According to the results, we can conclude that our algorithm is more robust than the other methods.
The Replicator Equation on Graphs
Ohtsuki, Hisashi; Nowak, Martin A.
2008-01-01
We study evolutionary games on graphs. Each player is represented by a vertex of the graph. The edges denote who meets whom. A player can use any one of n strategies. Players obtain a payoff from interaction with all their immediate neighbors. We consider three different update rules, called ‘birth-death’, ‘death-birth’ and ‘imitation’. A fourth update rule, ‘pairwise comparison’, is shown to be equivalent to birth-death updating in our model. We use pair-approximation to describe the evolutionary game dynamics on regular graphs of degree k. In the limit of weak selection, we can derive a differential equation which describes how the average frequency of each strategy on the graph changes over time. Remarkably, this equation is a replicator equation with a transformed payoff matrix. Therefore, moving a game from a well-mixed population (the complete graph) onto a regular graph simply results in a transformation of the payoff matrix. The new payoff matrix is the sum of the original payoff matrix plus another matrix, which describes the local competition of strategies. We discuss the application of our theory to four particular examples, the Prisoner’s Dilemma, the Snow-Drift game, a coordination game and the Rock-Scissors-Paper game. PMID:16860343
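The transformed-payoff result lends itself to a very small numerical check: integrate the ordinary replicator equation with A replaced by A + B. In the sketch below, B is only a placeholder for the local-competition matrix (the paper derives its exact entries for each update rule and degree k), and the payoff values are an arbitrary two-strategy example.

```python
# Numerically integrate a replicator equation with a transformed payoff matrix
# A' = A + B. B stands in for the paper's local-competition correction; its exact
# form for a given update rule and degree k is not reproduced here.
import numpy as np
from scipy.integrate import solve_ivp

A = np.array([[3.0, 0.0],          # example two-strategy payoffs (C, D)
              [5.0, 1.0]])
B = np.array([[0.0, -1.0],         # placeholder local-competition correction
              [1.0,  0.0]])
A_prime = A + B

def replicator(t, x):
    fitness = A_prime @ x
    avg = x @ fitness
    return x * (fitness - avg)

sol = solve_ivp(replicator, (0.0, 50.0), [0.6, 0.4], dense_output=True)
print("final strategy frequencies:", sol.y[:, -1])
```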
United States housing, first quarter 2013
Delton Alderman
2014-01-01
Provides current and historical information on the housing market in the United States. Information includes trends for housing permits and starts, housing under construction, and housing completions for single and multifamily units, as well as sales and construction. This report will be updated regularly.
NASA Astrophysics Data System (ADS)
Xia, Cheng-Yi; Wang, Lei; Wang, Juan; Wang, Jin-Song
2012-09-01
We combine the Fermi and Moran update rules in the spatial prisoner's dilemma and snowdrift games to investigate the behavior of collective cooperation among agents on a regular lattice. Large-scale simulations indicate that, compared to the model with only one update rule, the cooperation behavior exhibits richer phenomena, and more attention should be paid to the role of update dynamics in evolutionary game theory. Meanwhile, we also observe that the introduction of the Moran rule, which needs to consider all neighbors' information, can markedly promote the aggregate cooperation level; that is, randomly selecting a neighbor to imitate with probability proportional to its payoff facilitates cooperation among agents. These results contribute to a further understanding of cooperation dynamics and evolutionary behaviors within many biological, economic and social systems.
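A minimal sketch of the two update rules being combined, under assumed parameter values (the noise level K and the payoff shift used to keep Moran weights non-negative are illustrative choices, not taken from the paper):

```python
# Minimal sketches of the two imitation rules combined in the model.
import numpy as np

rng = np.random.default_rng(1)

def fermi_update(payoff_self, payoff_neighbor, K=0.1):
    """Adopt the neighbor's strategy with the Fermi probability
    1 / (1 + exp(-(P_neighbor - P_self) / K))."""
    p = 1.0 / (1.0 + np.exp(-(payoff_neighbor - payoff_self) / K))
    return rng.random() < p

def moran_update(neighbor_payoffs):
    """Pick which neighbor to imitate with probability proportional to payoff
    (payoffs shifted to be non-negative)."""
    w = np.asarray(neighbor_payoffs, dtype=float)
    w = w - w.min() + 1e-9
    return rng.choice(len(w), p=w / w.sum())

print(fermi_update(1.0, 1.5), moran_update([0.5, 2.0, 1.0]))
```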
Bibliography. College and University Business Administration.
ERIC Educational Resources Information Center
National Association of College and University Business Officers, Washington, DC.
This regularly-updated bibliography is organized by chapter and is generally limited to publications that have specific application to colleges and universities. The chapters include: business administration in higher education; institutional planning; management information systems and data processing; risk management and insurance;…
77 FR 37446 - Advisory Committee on the Medical Uses of Isotopes: Meeting Notice
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-21
...; and (6) update on domestic production of molybdenum-99. The regular meeting agenda is subject to..., Advisory Committee Management Officer. [FR Doc. 2012-15173 Filed 6-20-12; 8:45 am] BILLING CODE 7590-01-P ...
The measures needed for the protection of the Earth's ozone layer are decided regularly by the Parties to the Montreal Protocol. This progress report is the 2004 update by the Environmental Effects Assessment Panel.
76 FR 71861 - America Recycles Day, 2011
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-18
... families have advanced the common good of our Nation by recycling regularly and promoting conservation... then, we have bolstered recycling programs through individual action, community engagement, and... today, we must update and expand existing recycling programs and dedicate ourselves to devising new...
2004-12-01
'Blogging' is too new a term to be included in the latest Oxford English Dictionary but it is rapidly becoming an influential way of having your say. Blogs are weblogs, or regularly updated webpages in diary form, often with commentaries on, and links to, other websites.
Fixed-head star tracker attitude updates on the Hubble Space Telescope
NASA Technical Reports Server (NTRS)
Nadelman, Matthew S.; Karl, Jeffrey B.; Hallock, Lou
1994-01-01
The Hubble Space Telescope (HST) was launched in April 1990 to begin observing celestial space to the edge of the universe. National Aeronautics and Space Administration (NASA) standard fixed-head star trackers (FHST's) are used operationally onboard the HST to regularly adjust ('update') the spacecraft attitude before the acquisition of guide stars for science observations. During the first 3 months of the mission, the FHST's updated the spacecraft attitude successfully only 85 percent of the time. During the other periods, the trackers were unable to find the selected stars -- either they failed to find any star, or worse, they selected incorrect stars and produced erroneous attitude updates. In July 1990, the HST project office at Goddard Space Flight Center (GSFC) requested that Computer Sciences Corporation (CSC) form an investigative 'tiger' team to examine these FHST update failures. This paper discusses the work of the FHST tiger team, describes the investigations that led the team to identify the sources of the errors, and defines the solutions that were subsequently developed, which ultimately increased the success rate of FHST updates to approximately 98 percent.
Aspiration dynamics in structured population acts as if in a well-mixed one.
Du, Jinming; Wu, Bin; Wang, Long
2015-01-26
Understanding the evolution of human interactive behaviors is important. Recent experimental results suggest that human cooperation in spatially structured populations is not enhanced as predicted in previous works when payoff-dependent imitation updating rules are used. This constraint opens up an avenue to shed light on how humans update their strategies in real life. Studies via simulations show that, instead of comparison rules, self-evaluation driven updating rules may explain why spatial structure does not alter the evolutionary outcome. Though inspiring, there is a lack of theoretical results showing the existence of such evolutionary updating rules. Here we study the aspiration dynamics, and show that it does not alter the evolutionary outcome in various population structures. Under weak selection, by analytical approximation, we find that the favored strategy in regular graphs is invariant. Further, we show that this is because the criterion under which a strategy is favored is the same as that of a well-mixed population. By simulation, we show that this also holds for random networks. Although how humans update their strategies remains an open question, our results provide a theoretical foundation for updating rules that may capture how humans actually update their strategies.
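For contrast with the imitation rules above, a minimal sketch of an aspiration-driven (self-evaluation) update is given below; the Fermi-shaped switching function and the aspiration and noise values are illustrative assumptions, not the paper's exact formulation.

```python
# Sketch of an aspiration-driven (self-evaluation) strategy update: a player
# compares its own payoff with an aspiration level and switches strategy with a
# Fermi-like probability; no neighbor comparison is involved.
import numpy as np

rng = np.random.default_rng(2)

def aspiration_switch(payoff, aspiration=1.0, noise=0.5):
    """Probability of switching to the other strategy grows as the payoff
    falls below the aspiration level."""
    p = 1.0 / (1.0 + np.exp(-(aspiration - payoff) / noise))
    return rng.random() < p

print(aspiration_switch(0.3), aspiration_switch(2.0))
```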
(Update) Wellness Challenge: How Are You Doing with Your New Year’s Resolutions? | Poster
Editor’s note: This article has been updated since its original post on May 29 to include information on the quick link from the Poster home page. Remember those fitness resolutions you made at the beginning of the year? Were you going to lose weight, quit smoking, reduce alcohol intake, or establish a regular workout routine? If you have neglected some of these resolutions over the last few months, think about why—was it lack of time, lack of motivation, lack of direction, or some combination of these?
Gaitanis, Anastasios; Kastis, George A; Vlastou, Elena; Bouziotis, Penelope; Verginis, Panayotis; Anagnostopoulos, Constantinos D
2017-08-01
The Tera-Tomo 3D image reconstruction algorithm (a version of OSEM), provided with the Mediso nanoScan® PC (PET8/2) small-animal positron emission tomograph (PET)/x-ray computed tomography (CT) scanner, has various parameter options such as total level of regularization, subsets, and iterations. Also, the acquisition time in PET plays an important role. This study aims to assess the performance of this new small-animal PET/CT scanner for different acquisition times and reconstruction parameters, for 2-deoxy-2-[18F]fluoro-D-glucose ([18F]FDG) and Ga-68, under the NEMA NU 4-2008 standards. Various image quality metrics were calculated for different realizations of [18F]FDG and Ga-68 filled image quality (IQ) phantoms. [18F]FDG imaging produced improved images over Ga-68. The best compromise for the optimization of all image quality factors is achieved for at least 30 min acquisition and image reconstruction with 52 iteration updates combined with a high regularization level. A high regularization level at 52 iteration updates and 30 min acquisition time were found to optimize most of the figures of merit investigated.
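The Tera-Tomo algorithm itself is proprietary, but the flavor of an iterative update with a regularization level can be conveyed with a generic MLEM-style sketch on a toy system; the smoothing step standing in for the "regularization level" and all dimensions below are assumptions for illustration only.

```python
# Generic MLEM-style multiplicative update on a toy 1-D system (NOT Tera-Tomo's
# proprietary algorithm); the smoothing pull stands in for a regularization level.
import numpy as np

rng = np.random.default_rng(3)
A = rng.random((40, 20))                   # toy system matrix (detector bins x voxels)
x_true = rng.random(20)
y = rng.poisson(A @ x_true * 50) / 50.0    # noisy "measured" projections

x = np.ones(20)
beta = 0.05                                # regularization strength (illustrative)
sens = A.sum(axis=0)                       # sensitivity image
for it in range(52):                       # 52 iteration updates, as in the abstract
    ratio = y / np.maximum(A @ x, 1e-12)
    x *= (A.T @ ratio) / np.maximum(sens, 1e-12)
    # crude smoothing step standing in for the regularization level
    x = (1 - beta) * x + beta * np.convolve(x, np.ones(3) / 3, mode="same")

print("reconstruction error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```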
Blended Learning: An Evolving Praxis
ERIC Educational Resources Information Center
Fogal, Gary G.; Graham, Floyd H., III.; Lavigne, Anthony G.
2014-01-01
TED (Technology Entertainment Design), a collection of regularly updated talks, offers a web-based platform that is easily accessible. This platform affords language learners across multiple proficiency levels an opportunity to develop autonomy and critical thinking skills alongside their second language (L2) development. With an international…
ERIC Educational Resources Information Center
CEC Today, 1994
1994-01-01
This document consists of the first 40 issues of a newsletter published "exclusively for members of the Council for Exceptional Children" (CEC). Issues typically include items such as: a message from the executive director, a legislative update, meeting announcements, suggestions to regular and student chapters of the organization,…
[NRC/GT: Six Year One Research Studies.
ERIC Educational Resources Information Center
Gubbins, E. Jean, Ed.
1992-01-01
This newsletter focuses on six Year 1 research projects associated with the National Research Center on the Gifted and Talented (NRC/GT). The updates address: "Regular Classroom Practices With Gifted Students: Findings from the Classroom Practices Survey" (Francis X. Archambault, Jr. and others); "The Classroom Practices Study:…
24 CFR 242.44 - Construction standards.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 24 Housing and Urban Development 2 2012-04-01 2012-04-01 false Construction standards. 242.44... MORTGAGE INSURANCE FOR HOSPITALS Construction § 242.44 Construction standards. Work designed and performed... “Guidelines for Construction and Equipment of Hospital and Medical Facilities,” which is regularly updated and...
24 CFR 242.44 - Construction standards.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 24 Housing and Urban Development 2 2011-04-01 2011-04-01 false Construction standards. 242.44... MORTGAGE INSURANCE FOR HOSPITALS Construction § 242.44 Construction standards. Work designed and performed... “Guidelines for Construction and Equipment of Hospital and Medical Facilities,” which is regularly updated and...
24 CFR 242.44 - Construction standards.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 24 Housing and Urban Development 2 2014-04-01 2014-04-01 false Construction standards. 242.44... MORTGAGE INSURANCE FOR HOSPITALS Construction § 242.44 Construction standards. Work designed and performed... “Guidelines for Construction and Equipment of Hospital and Medical Facilities,” which is regularly updated and...
Writing a curriculum vitae, resume or data sheet.
Saltman, D
1995-02-01
This paper outlines a method for the preparation of a curriculum vitae, resume or data sheet, which is an essential document for professional people seeking employment or promotion. However, it needs to be accurate and relevant to the circumstances of the position, and requires regular updating.
Argonne HEP Lunch Seminar Schedule. The ANL HEP Lunchtime Seminar is held regularly on Tuesdays at ... Phenomena in Astrophysics and Cosmology ... November 15, 2005: Harry Lipkin, Update on Pentaquark theory and ...
How Colleges Are Coping, 1995.
ERIC Educational Resources Information Center
Huggett, Kim
1995-01-01
Based on news accounts, correspondence, conference presentations, and interviews, this collection of quarterly reports provides regular updates on actions taken by California's colleges to cope with difficult economic times. These four reports were produced in January, March, May, and September of 1995 and review the effects of and responses to…
Cutanda, Diana; Correa, Ángel; Sanabria, Daniel
2015-06-01
The present study investigated whether participants can develop temporal preparation driven by auditory isochronous rhythms when concurrently performing an auditory working memory (WM) task. In Experiment 1, participants had to respond to an auditory target presented after a regular or an irregular sequence of auditory stimuli while concurrently performing a Sternberg-type WM task. Results showed that participants responded faster after regular compared with irregular rhythms and that this effect was not affected by WM load; however, the lack of a significant main effect of WM load made it difficult to draw any conclusion regarding the influence of the dual-task manipulation in Experiment 1. In order to enhance dual-task interference, Experiment 2 combined the auditory rhythm procedure with an auditory N-Back task, which required WM updating (monitoring and coding of the information) and was presumably more demanding than the mere rehearsal of the WM task used in Experiment 1. Results now clearly showed dual-task interference effects (slower reaction times [RTs] in the high- vs. the low-load condition). However, such interference did not affect temporal preparation induced by rhythms, with faster RTs after regular than after irregular sequences in the high-load and low-load conditions. These results revealed that secondary tasks demanding memory updating, relative to tasks just demanding rehearsal, produced larger interference effects on overall RTs in the auditory rhythm task. Nevertheless, rhythm regularity exerted a strong temporal preparation effect that survived the interference of the WM task even when both tasks competed for processing resources within the auditory modality. (c) 2015 APA, all rights reserved).
Code of Federal Regulations, 2014 CFR
2014-10-01
... function in a vehicle that records the vehicle's dynamic time-series data during the time period just prior... updated at regular time intervals. Delta-V, lateral means the cumulative change in velocity, as recorded by the EDR of the vehicle, along the lateral axis, starting from crash time zero and ending at 0.25...
Code of Federal Regulations, 2012 CFR
2012-10-01
... function in a vehicle that records the vehicle's dynamic time-series data during the time period just prior... updated at regular time intervals. Delta-V, lateral means the cumulative change in velocity, as recorded by the EDR of the vehicle, along the lateral axis, starting from crash time zero and ending at 0.25...
ERIC Educational Resources Information Center
Getz, Richard E., Comp.
Compiled to provide a central reference point for all legislative information pertaining to libraries in the State of Texas, this publication includes all pertinent legislation as amended through the 71st Legislature, 1989, Regular Session. This update of the 1980 edition has been expanded to include statutes pertaining to the school and academic…
Trial by Fire (and Tornado) Taught Us to Plan for Crises.
ERIC Educational Resources Information Center
Caylor, Mary Jane
1991-01-01
Based on Huntsville (Alabama) schools' experience with a devastating fire, the superintendent later ensured adequate fire insurance coverage, promoted regular fire drills, and developed an emergency response plan that delineated staff responsibilities, communication modes, and training and updating procedures. The plan served the district well…
Supporting the Health and Wellness of Individuals with Psychiatric Disabilities
ERIC Educational Resources Information Center
Swarbrick, Margaret; Nemec, Patricia B.
2016-01-01
Purpose: Psychiatric rehabilitation is recognized as a field with specialized knowledge and skills required for practice. The certified psychiatric rehabilitation practitioner (CPRP) credential, an exam-based certification process, is based on a regularly updated job task analysis that, in its most recent iteration, identified the new core…
Long-range Perspectives in Environmental Education: Producing Practical Problem-solvers.
ERIC Educational Resources Information Center
Barratt, Rod
1997-01-01
Addresses postgraduate environmental education by supported distance learning as offered by the Open University in Great Britain. Refers to techniques for regularly updating material in rapidly developing areas as well as integrating teaching and research. Also refers to the modular course Integrated Safety, Health and Environmental Management.…
ERIC Educational Resources Information Center
Curran, Vernon; Fleet, Lisa; Greene, Melanie
2012-01-01
Introduction: Resuscitation and life support skills training comprises a significant proportion of continuing education programming for health professionals. The purpose of this study was to explore the perceptions and attitudes of certified resuscitation providers toward the retention of resuscitation skills, regular skills updating, and methods…
Code of Federal Regulations, 2013 CFR
2013-10-01
... function in a vehicle that records the vehicle's dynamic time-series data during the time period just prior... updated at regular time intervals. Delta-V, lateral means the cumulative change in velocity, as recorded by the EDR of the vehicle, along the lateral axis, starting from crash time zero and ending at 0.25...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-25
... boundary determinations, coastal engineering, storm warnings and hazard mitigation, emergency management... shoreline depiction may need to be updated on the next regularly scheduled chart edition. Although... Web site ( http://www.tidesandcurrents.noaa.gov ) or contact the Center for Operational Oceanographic...
Vernooij, Robin W. M.; Alonso-Coello, Pablo; Brouwers, Melissa
2017-01-01
Background: Scientific knowledge is in constant development. Consequently, regular review to assure the trustworthiness of clinical guidelines is required. However, there is still a lack of preferred reporting items of the updating process in updated clinical guidelines. The present article describes the development process of the Checklist for the Reporting of Updated Guidelines (CheckUp). Methods and Findings: We developed an initial list of items based on an overview of research evidence on clinical guideline updating, the Appraisal of Guidelines for Research and Evaluation (AGREE) II Instrument, and the advice of the CheckUp panel (n = 33 professionals). A multistep process was used to refine this list, including an assessment of ten existing updated clinical guidelines, interviews with key informants (response rate: 54.2%; 13/24), a three-round Delphi consensus survey with the CheckUp panel (33 participants), and an external review with clinical guideline methodologists (response rate: 90%; 53/59) and users (response rate: 55.6%; 10/18). CheckUp includes 16 items that address (1) the presentation of an updated guideline, (2) editorial independence, and (3) the methodology of the updating process. In this article, we present the methodology to develop CheckUp and include as a supplementary file an explanation and elaboration document. Conclusions: CheckUp can be used to evaluate the completeness of reporting in updated guidelines and as a tool to inform guideline developers about reporting requirements. Editors may request its completion from guideline authors when submitting updated guidelines for publication. Adherence to CheckUp will likely enhance the comprehensiveness and transparency of clinical guideline updating for the benefit of patients and the public, health care professionals, and other relevant stakeholders. PMID:28072838
Code of Federal Regulations, 2011 CFR
2011-10-01
... dynamic time-series data during the time period just prior to a crash event (e.g., vehicle speed vs. time... EDR data in a temporary, volatile storage medium where it is continuously updated at regular time..., along the lateral axis, starting from crash time zero and ending at 0.25 seconds, recorded every 0.01...
Code of Federal Regulations, 2010 CFR
2010-10-01
... vehicle's dynamic time-series data during the time period just prior to a crash event (e.g., vehicle speed... updated at regular time intervals. Delta-V, lateral means the cumulative change in velocity, as recorded by the EDR of the vehicle, along the lateral axis, starting from crash time zero and ending at 0.25...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-03
... the Commission is to consult with the Secretary of the Interior, or his designee, with respect to... provisions of sections 4 and 5 of the Act establishing the Seashore. The regular business meeting is being.../Cell Towers; Shorebird Management; Highlands Center Update; Alternate Transportation funding; Other...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-12
... the Commission is to consult with the Secretary of the Interior, or his designee, with respect to... provisions of sections 4 and 5 of the Act establishing the Seashore. The regular business meeting is being... Flexible Shorebird Management Highlands Center Update Alternate Transportation funding Other construction...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-18
.... The purpose of the Commission is to consult with the Secretary of the Interior, or his designee, with... out the provisions of sections 4 and 5 of the Act establishing the Seashore. The regular business... Wind Turbines/Cell Towers Flexible Shorebird Management Highlands Center Update Alternate...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-19
.... The purpose of the Commission is to consult with the Secretary of the Interior, or his designee, with... regular business meeting is being held to discuss the following: 1. Adoption of Agenda. 2. Approval of... Wind Turbines/Cell Towers Shorebird Management Highlands Center Update Alternate Transportation funding...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-29
... Law 105-280. The purpose of the Commission is to consult with the Secretary of the Interior, or his... regular business meeting is being held to discuss the following: 1. Adoption of Agenda 2. Approval of... Wind Turbines/Cell Towers Shorebird Management Planning Highlands Center Update Alternate...
Sen. Gillibrand, Kirsten E. [D-NY
2010-03-16
Senate - 03/16/2010: Read twice and referred to the Committee on Agriculture, Nutrition, and Forestry. Status of legislation: Introduced.
75 FR 44226 - Mid-Atlantic Fishery Management Council (MAFMC); Public Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-28
... the Interagency Ocean Policy Task Force. 10:45 a.m. to 1:30 p.m. - The Council will convene to conduct... the Ricks E Savage Award criteria and nomination process, excessive share project update, Scientific... recommendations of the Interagency Ocean Policy Task Force. The Council will hold its regular Business Session to...
Subscribe to the Energy Systems Integration Newsletter | Energy Systems Integration Facility | NREL
Subscribe to receive regular updates on what's happening at the Energy Systems Integration Facility and in energy systems integration research at NREL and around ...
Report from the European Prison Education Association, June 2006
ERIC Educational Resources Information Center
Behan, Cormac
2006-01-01
It has just been announced that the 11th European Prison Education Association (EPEA) International Conference will take place in Dublin, Ireland from the 13th to 17th June 2007. Further details and an application form will be available in September 2006. Regular updates will be available at www.epea.org.
Compendium of National Data Sources on Higher Education.
ERIC Educational Resources Information Center
Rodriguez, Esther M., Ed.; Lenth, Charles S., Ed.
This compendium provides a guide to data collections in higher education, focusing on sources that are national in scope and updated and made available on a regular or periodic basis, including surveys, data bases, reports, and statistical digests. These sources are divided into nine broad categories; each category contains separate entries for…
2017-01-01
An assessment of the various factors that may influence oil prices - physical market factors as well as those related to trading and financial markets. The analysis describes seven key factors that could influence oil markets and explores possible linkages between each factor and oil prices. Regularly updated graphs are included to illustrate aspects of those relationships.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gao, H
Purpose: This work is to develop a general framework, namely the filtered iterative reconstruction (FIR) method, to incorporate analytical reconstruction (AR) methods into iterative reconstruction (IR) methods for enhanced CT image quality. Methods: FIR is formulated as a combination of filtered data fidelity and sparsity regularization, and then solved by the proximal forward-backward splitting (PFBS) algorithm. As a result, the image reconstruction decouples data fidelity and image regularization with a two-step iterative scheme, during which an AR-projection step updates the filtered data fidelity term, while a denoising solver updates the sparsity regularization term. During the AR-projection step, the image is projected to the data domain to form the data residual, and then reconstructed by a certain AR method to a residual image, which is in turn weighted together with the previous image iterate to form the next image iterate. Since the eigenvalues of the AR-projection operator are close to unity, PFBS-based FIR has fast convergence. Results: The proposed FIR method is validated in the setting of circular cone-beam CT with AR being FDK and total-variation sparsity regularization, and has improved image quality over both AR and IR. For example, AIR has improved visual assessment and quantitative measurement in terms of both contrast and resolution, and reduced axial and half-fan artifacts. Conclusion: FIR is proposed to incorporate AR into IR, with an efficient image reconstruction algorithm based on PFBS. The CBCT results suggest that FIR synergizes AR and IR with improved image quality and reduced axial and half-fan artifacts. The author was partially supported by the NSFC (#11405105), the 973 Program (#2015CB856000), and the Shanghai Pujiang Talent Program (#14PJ1404500).
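The two-step structure described in the Methods (a data-fidelity update followed by a denoising/proximal update) can be illustrated with a generic proximal forward-backward splitting sketch; here the AR/FDK projection step is replaced by a plain gradient step and the TV regularizer by l1 soft-thresholding, so this is only the skeleton of PFBS, not the FIR algorithm itself.

```python
# Generic proximal forward-backward splitting (PFBS) sketch: a data-fidelity
# gradient step followed by a proximal (denoising) step, here l1 soft-thresholding.
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((80, 120))
x_true = np.zeros(120)
x_true[rng.choice(120, 10, replace=False)] = 1.0
y = A @ x_true + 0.01 * rng.standard_normal(80)

lam = 0.05
tau = 1.0 / np.linalg.norm(A, 2) ** 2          # step size from the Lipschitz constant

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x = np.zeros(120)
for _ in range(300):
    grad = A.T @ (A @ x - y)                   # data-fidelity step
    x = soft_threshold(x - tau * grad, tau * lam)   # denoising/prox step

print("estimated support (|x| > 0.1):", np.flatnonzero(np.abs(x) > 0.1))
```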
Physician use of updated anti-virus software in a tertiary Nigerian hospital.
Laabes, E P; Nyango, D D; Ayedima, M M; Ladep, N G
2010-01-01
While physicians are becoming increasingly dependent on computers and the internet, highly lethal malware continues to be loaded into cyberspace. We sought to assess the proportion of physicians with updated anti-virus software in Jos University Teaching Hospital, Nigeria, and to determine perceived barriers to getting updates. We used a pre-tested semi-structured self-administered questionnaire to conduct a cross-sectional survey among 118 physicians. The mean age (+/- SD) of subjects was 34 (+/- 4) years, with 94 male and 24 female physicians. Forty-two (36.5%) of 115 physicians with anti-virus software used an updated program (95% CI: 27, 45). The top three anti-virus programs were: McAfee, 40 (33.9%); AVG, 37 (31.4%); and Norton, 17 (14.4%). Common infections were: Trojan horse, 22 (29.7%); Brontok worm, 8 (10.8%); and Ravmonlog.exe, 5 (6.8%). Internet browsing with a firewall was an independent determinant of the use of updated anti-virus software [OR 4.3, 95% CI 1.86, 10.02; P < 0.001]. A busy schedule (40; 33.9%) and lack of a credit card (39; 33.1%) were perceived barriers to updating anti-virus software. The use of regularly updated anti-virus software is sub-optimal among physicians, implying vulnerability to computer viruses. Physicians should be careful with flash drives and should avoid being victims of the raging arms race between malware producers and anti-virus software developers.
NASA Astrophysics Data System (ADS)
Kordy, M.; Wannamaker, P.; Maris, V.; Cherkaev, E.; Hill, G.
2016-01-01
Following the creation described in Part I of a deformable edge finite-element simulator for 3-D magnetotelluric (MT) responses using direct solvers, in Part II we develop an algorithm named HexMT for 3-D regularized inversion of MT data including topography. Direct solvers parallelized on large-RAM, symmetric multiprocessor (SMP) workstations are used also for the Gauss-Newton model update. By exploiting the data-space approach, the computational cost of the model update becomes much less in both time and computer memory than the cost of the forward simulation. In order to regularize using the second norm of the gradient, we factor the matrix related to the regularization term and apply its inverse to the Jacobian, which is done using the MKL PARDISO library. For dense matrix multiplication and factorization related to the model update, we use the PLASMA library which shows very good scalability across processor cores. A synthetic test inversion using a simple hill model shows that including topography can be important; in this case depression of the electric field by the hill can cause false conductors at depth or mask the presence of resistive structure. With a simple model of two buried bricks, a uniform spatial weighting for the norm of model smoothing recovered more accurate locations for the tomographic images compared to weightings which were a function of parameter Jacobians. We implement joint inversion for static distortion matrices tested using the Dublin secret model 2, for which we are able to reduce nRMS to ˜1.1 while avoiding oscillatory convergence. Finally we test the code on field data by inverting full impedance and tipper MT responses collected around Mount St Helens in the Cascade volcanic chain. Among several prominent structures, the north-south trending, eruption-controlling shear zone is clearly imaged in the inversion.
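The data-space trick mentioned above (inverting an n_data x n_data system instead of an n_model x n_model one) can be sketched in a few lines of linear algebra; the covariance and damping terms below are generic stand-ins and are not necessarily the exact operators used in HexMT.

```python
# Hedged sketch of a data-space Gauss-Newton model update: the matrix to factor
# is (n_data x n_data) rather than (n_model x n_model). Cm, Cd and lam are
# illustrative stand-ins for the model covariance, data covariance, and
# regularization trade-off.
import numpy as np

rng = np.random.default_rng(5)
n_data, n_model = 50, 500
J = rng.standard_normal((n_data, n_model))     # Jacobian (sensitivities)
r = rng.standard_normal(n_data)                # data residual d_obs - F(m)
Cm = np.eye(n_model)                           # model covariance (smoothing) stand-in
Cd = np.eye(n_data)                            # data covariance stand-in
lam = 1.0                                      # regularization trade-off

# Data-space system: (J Cm J^T + lam Cd) alpha = r, then dm = Cm J^T alpha
alpha = np.linalg.solve(J @ Cm @ J.T + lam * Cd, r)
dm = Cm @ J.T @ alpha
print("model update norm:", np.linalg.norm(dm))
```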
Martin-Collado, D; Byrne, T J; Visser, B; Amer, P R
2016-12-01
This study used simulation to evaluate the performance of alternative selection index configurations in the context of a breeding programme where a trait with a non-linear economic value is approaching an economic optimum. The simulation used a simple population structure that approximately mimics selection in dual purpose sheep flocks in New Zealand (NZ). In the NZ dual purpose sheep population, number of lambs born is a genetic trait that is approaching an economic optimum, while genetically correlated growth traits have linear economic values and are not approaching any optimum. The predominant view among theoretical livestock geneticists is that the optimal approach to select for nonlinear profit traits is to use a linear selection index and to update it regularly. However, there are some nonlinear index approaches that have not been evaluated. This study assessed the efficiency of the following four alternative selection index approaches in terms of genetic progress relative to each other: (i) a linear index, (ii) a linear index updated regularly, (iii) a nonlinear (quadratic) index, and (iv) a NLF index (nonlinear index below the optimum and then flat). The NLF approach does not reward or penalize animals for additional genetic merit beyond the trait optimum. It was found to be at least comparable in efficiency to the approach of regularly updating the linear index with short (15 year) and long (30 year) time frames. The relative efficiency of this approach was slightly reduced when the current average value of the nonlinear trait was close to the optimum. Finally, practical issues of industry application of indexes are considered and some potential practical benefits of efficient deployment of a NLF index in highly heterogeneous industries (breeds, flocks and production environments) such as in the NZ dual purpose sheep population are discussed. © 2016 Blackwell Verlag GmbH.
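A minimal sketch of the NLF idea, with made-up economic values rather than New Zealand industry figures: the trait approaching its optimum contributes a concave value up to the optimum and a flat value beyond it, so extra merit past the optimum is neither rewarded nor penalised.

```python
# Toy NLF (nonlinear-below-the-optimum, then flat) index contribution.
def nlf_value(x, optimum=2.0, scale=10.0):
    """Economic value of trait merit x: concave quadratic peaking at the optimum,
    then flat (no reward or penalty) beyond it. Numbers are illustrative."""
    x_eff = min(x, optimum)
    return scale * (optimum * x_eff - 0.5 * x_eff**2)

def index(nlb_ebv, growth_ebv, growth_weight=3.0):
    """NLF value of the optimum-approaching trait plus a linear value for a
    correlated growth trait (weights are invented for the example)."""
    return nlf_value(nlb_ebv) + growth_weight * growth_ebv

print(index(2.0, 0.8), index(2.5, 0.8))   # identical NLF contribution at and beyond the optimum
```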
A combined reconstruction-classification method for diffuse optical tomography.
Hiltunen, P; Prince, S J D; Arridge, S
2009-11-07
We present a combined classification and reconstruction algorithm for diffuse optical tomography (DOT). DOT is a nonlinear ill-posed inverse problem. Therefore, some regularization is needed. We present a mixture of Gaussians prior, which regularizes the DOT reconstruction step. During each iteration, the parameters of a mixture model are estimated. These associate each reconstructed pixel with one of several classes based on the current estimate of the optical parameters. This classification is exploited to form a new prior distribution to regularize the reconstruction step and update the optical parameters. The algorithm can be described as an iteration between an optimization scheme with zeroth-order variable mean and variance Tikhonov regularization and an expectation-maximization scheme for estimation of the model parameters. We describe the algorithm in a general Bayesian framework. Results from simulated test cases and phantom measurements show that the algorithm enhances the contrast of the reconstructed images with good spatial accuracy. The probabilistic classifications of each image contain only a few misclassified pixels.
STAR Data Reconstruction at NERSC/Cori, an adaptable Docker container approach for HPC
NASA Astrophysics Data System (ADS)
Mustafa, Mustafa; Balewski, Jan; Lauret, Jérôme; Porter, Jefferson; Canon, Shane; Gerhardt, Lisa; Hajdu, Levente; Lukascsyk, Mark
2017-10-01
As HPC facilities grow their resources, adaptation of classic HEP/NP workflows becomes a necessity. Linux containers may very well offer a way to lower the bar to exploiting such resources and, at the same time, help collaborations reach vast elastic resources on such facilities and address their massive current and future data processing challenges. In this proceeding, we showcase the STAR data reconstruction workflow at the Cori HPC system at NERSC. STAR software is packaged in a Docker image and runs at Cori in Shifter containers. We highlight two of the typical end-to-end optimization challenges for such pipelines: 1) data transfer rates, which were carried over ESnet after optimizing end points, and 2) scalable deployment of a conditions database in an HPC environment. Our tests demonstrate equally efficient data processing workflows on Cori/HPC, comparable to standard Linux clusters.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramamurthy, Byravamurthy
2014-05-05
In this project, we developed scheduling frameworks for dynamic bandwidth demands for large-scale science applications. Apart from theoretical approaches such as Integer Linear Programming, Tabu Search, and Genetic Algorithm heuristics, we have utilized practical data from the ESnet OSCARS project (from our DOE lab partners) to conduct realistic simulations of our approaches. We have disseminated our work through conference paper presentations, journal papers, and a book chapter. In this project we addressed the problem of scheduling of lightpaths over optical wavelength division multiplexed (WDM) networks and published several conference papers and journal papers on this topic. We also addressed the problems of joint allocation of computing, storage, and networking resources in Grid/Cloud networks and proposed energy-efficient mechanisms for operating optical WDM networks.
Poppenga, Sandra K.; Gesch, Dean B.; Worstell, Bruce B.
2013-01-01
The 1:24,000-scale high-resolution National Hydrography Dataset (NHD) mapped hydrography flow lines require regular updating because land surface conditions that affect surface channel drainage change over time. Historically, NHD flow lines were created by digitizing surface water information from aerial photography and paper maps. Using these same methods to update nationwide NHD flow lines is costly and inefficient; furthermore, these methods result in hydrography that lacks the horizontal and vertical accuracy needed for fully integrated datasets useful for mapping and scientific investigations. Effective methods for improving mapped hydrography employ change detection analysis of surface channels derived from light detection and ranging (LiDAR) digital elevation models (DEMs) and NHD flow lines. In this article, we describe the usefulness of surface channels derived from LiDAR DEMs for hydrography change detection to derive spatially accurate and time-relevant mapped hydrography. The methods employ analyses of horizontal and vertical differences between LiDAR-derived surface channels and NHD flow lines to define candidate locations of hydrography change. These methods alleviate the need to analyze and update the nationwide NHD for time relevant hydrography, and provide an avenue for updating the dataset where change has occurred.
Indicators of School Crime and Safety: 2012. NCES 2013-036/NCJ 241446
ERIC Educational Resources Information Center
Robers, Simone; Kemp, Jana; Truman, Jennifer
2013-01-01
Establishing reliable indicators of the current state of school crime and safety across the nation and regularly updating and monitoring these indicators is important in ensuring the safety of our nation's students. This is the aim of "Indicators of School Crime and Safety." This report is the fifteenth in a series of annual publications…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-25
... amended by Public Law 105-280. The purpose of the Commission is to consult with the Secretary of the... Seashore. The regular business meeting is being held to discuss the following: 1. Adoption of Agenda. 2... Wetland Restoration; Wind Turbines/Cell Towers; Flexible Shorebird Management; Highlands Center Update...
Blogging as Public Pedagogy: Creating Alternative Educational Futures
ERIC Educational Resources Information Center
Dennis, Carol Azumah
2015-01-01
In this study, I explore "blogging", the use of a regularly updated website or web page, authored and curated by an individual or small group, written in a conversational style, as a form of public pedagogy. I analyse blogs as pre-figurative spaces where people go to learn with/in a public sphere, through collaboration with interested…
ERIC Educational Resources Information Center
Read, Nicholas
2017-01-01
This article reviews the accuracy and relevance of the national monitoring mechanisms currently used to establish national learning and teaching material (LTM) availability indicators. In many countries, only very basic LTM monitoring requirements are provided. These are not updated regularly and are usually not designed specifically to support…
75 FR 3238 - Draft Guidance for Industry and Food and Drug Administration Staff; Heart Valves...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-20
...-847-8149 to receive a hard copy. Please use the document number (1607). CDRH maintains an entry on the... personal computer with Internet access. Updated on a regular basis, the CDRH home page includes device... capability for all CDRH guidance documents is available at http://www.fda.gov/medicaldevices...
Guide to Special Information in Scientific and Engineering Journals.
ERIC Educational Resources Information Center
Harris, Mary Elizabeth
This update of a 1983 annotated bibliography lists 298 special features or special issues of science and technology periodicals with emphasis on compilations of information that appear in periodicals on a regular basis. In addition to the 203 entries listed in the original edition, 95 new entries are included. Subjects covered in the guide include…
Critical Race Ethnography in Education: Narrative, Inequality and the Problem of Epistemology
ERIC Educational Resources Information Center
Duncan, Garrett Albert
2005-01-01
Data presented in a previously reported ethnographic research project indicated that an urban elementary school regularly subjects its students to dated curricular materials and supplies. As reported, this occurred even though the school had at its disposal updated and even state-of-the-art resources, such as computers, visual aids, curriculum and…
Indicators of School Crime and Safety: 2010. NCES 2011-002/NCJ 230812
ERIC Educational Resources Information Center
Robers, Simone; Zhang, Jijun; Truman, Jennifer
2010-01-01
Ensuring safer schools requires establishing good indicators of the current state of school crime and safety across the nation and regularly updating and monitoring these indicators. This is the aim of this report. This report is the thirteenth in a series of annual publications produced jointly by the National Center for Education Statistics…
Wu, Junfeng; Dai, Fang; Hu, Gang; Mou, Xuanqin
2018-04-18
Excessive radiation exposure in computed tomography (CT) scans increases the chance of developing cancer and has become a major clinical concern. Recently, statistical iterative reconstruction (SIR) with l0-norm dictionary learning regularization has been developed to reconstruct CT images from low-dose and few-view datasets in order to reduce radiation dose. Nonetheless, the sparse regularization term adopted in this approach is the l0-norm, which cannot guarantee the global convergence of the proposed algorithm. To address this problem, in this study we introduced the l1-norm dictionary learning penalty into the SIR framework for low-dose CT image reconstruction, and developed an alternating minimization algorithm to minimize the associated objective function, which transforms the CT image reconstruction problem into a sparse coding subproblem and an image updating subproblem. During the image updating process, an efficient model function approach based on the balancing principle is applied to choose the regularization parameters. The proposed alternating minimization algorithm was evaluated first using real projection data of a sheep lung CT perfusion and then using numerical simulation based on sheep lung CT images and chest images. Both visual assessment and quantitative comparison using root mean square error (RMSE) and the structural similarity (SSIM) index demonstrated that the new image reconstruction algorithm yielded performance similar to the l0-norm dictionary learning penalty and outperformed the conventional filtered backprojection (FBP) and total variation (TV) minimization algorithms.
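The alternation between a sparse-coding subproblem and an image-updating subproblem can be sketched on a toy linear system; the stand-in "projection" operator, random dictionary, ISTA inner solver, and fixed parameters below are illustrative assumptions, and the balancing-principle parameter selection and real CT geometry are omitted.

```python
# Toy alternating minimization: (1) l1 sparse coding against a fixed dictionary
# (a few ISTA iterations), (2) a gradient step on the data-fidelity term as the
# image update. All operators are small random stand-ins, not a CT system model.
import numpy as np

rng = np.random.default_rng(6)
A = rng.standard_normal((60, 100))      # stand-in "projection" operator
D = rng.standard_normal((100, 200))     # stand-in patch dictionary
D /= np.linalg.norm(D, axis=0)
x_true = D @ (np.abs(rng.standard_normal(200)) * (rng.random(200) < 0.05))
y = A @ x_true + 0.01 * rng.standard_normal(60)

def soft(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x = np.zeros(100)
lam, mu = 0.02, 0.5                     # sparsity and coupling weights (illustrative)
step_a = 1.0 / np.linalg.norm(D, 2) ** 2
for outer in range(30):
    # (1) sparse coding: alpha = argmin 0.5||x - D alpha||^2 + lam ||alpha||_1
    alpha = np.zeros(200)
    for _ in range(20):
        alpha = soft(alpha - step_a * D.T @ (D @ alpha - x), step_a * lam)
    # (2) image update: gradient step on 0.5||A x - y||^2 + 0.5 mu ||x - D alpha||^2
    grad = A.T @ (A @ x - y) + mu * (x - D @ alpha)
    x -= grad / (np.linalg.norm(A, 2) ** 2 + mu)

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```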
Regularized Dual Averaging Image Reconstruction for Full-Wave Ultrasound Computed Tomography.
Matthews, Thomas P; Wang, Kun; Li, Cuiping; Duric, Neb; Anastasio, Mark A
2017-05-01
Ultrasound computed tomography (USCT) holds great promise for breast cancer screening. Waveform inversion-based image reconstruction methods account for higher order diffraction effects and can produce high-resolution USCT images, but are computationally demanding. Recently, a source encoding technique has been combined with stochastic gradient descent (SGD) to greatly reduce image reconstruction times. However, this method bundles the stochastic data fidelity term with the deterministic regularization term. This limitation can be overcome by replacing SGD with a structured optimization method, such as the regularized dual averaging method, that exploits knowledge of the composition of the cost function. In this paper, the dual averaging method is combined with source encoding techniques to improve the effectiveness of regularization while maintaining the reduced reconstruction times afforded by source encoding. It is demonstrated that each iteration can be decomposed into a gradient descent step based on the data fidelity term and a proximal update step corresponding to the regularization term. Furthermore, the regularization term is never explicitly differentiated, allowing nonsmooth regularization penalties to be naturally incorporated. The wave equation is solved by the use of a time-domain method. The effectiveness of this approach is demonstrated through computer simulation and experimental studies. The results suggest that the dual averaging method can produce images with less noise and comparable resolution to those obtained by the use of SGD.
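To show the gradient-step/proximal-step decomposition in a dual-averaging setting, here is an l1-regularized dual averaging sketch on a toy least-squares problem; the wave-equation forward model, source encoding, and the paper's actual regularizers are not reproduced, and lam and gamma are arbitrary choices.

```python
# Hedged sketch of l1-regularized dual averaging (RDA) on a toy problem: each
# iteration folds the current data-fidelity gradient into a running average and
# applies a closed-form shrinkage (proximal) update for the l1 regularizer.
import numpy as np

rng = np.random.default_rng(7)
A = rng.standard_normal((80, 120))
x_true = np.zeros(120)
x_true[rng.choice(120, 8, replace=False)] = 1.0
y = A @ x_true + 0.01 * rng.standard_normal(80)

lam, gamma = 0.1, 50.0
x = np.zeros(120)
g_avg = np.zeros(120)
for t in range(1, 501):
    grad = A.T @ (A @ x - y) / len(y)            # gradient of the data term
    g_avg += (grad - g_avg) / t                  # running average of gradients
    beta = gamma * np.sqrt(t)
    x = -(t / beta) * np.sign(g_avg) * np.maximum(np.abs(g_avg) - lam, 0.0)

print("nonzero coefficients:", int(np.count_nonzero(np.abs(x) > 1e-3)),
      "residual norm:", float(np.linalg.norm(A @ x - y)))
```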
Southern California Daily Energy Report
2016-01-01
EIA has updated its Southern California Daily Energy Report to provide additional information on key energy market indicators for the winter season. The dashboard includes information that EIA regularly compiles about energy operations and the management of natural gas and electricity systems in Southern California in the aftermath of a leak at the Aliso Canyon natural gas storage facility outside of Los Angeles
Tunable Laser Development for In-flight Fiber Optic Based Structural Health Monitoring Systems
NASA Technical Reports Server (NTRS)
Richards, Lance; Parker, Allen; Chan, Patrick
2014-01-01
The objective of this task is to investigate, develop, and demonstrate a low-cost swept lasing light source for NASA DFRC's fiber optics sensing system (FOSS) to perform structural health monitoring on current and future aerospace vehicles. This is the regular update of the Tunable Laser Development for In-flight Fiber Optic Based Structural Health Monitoring Systems website.
ERIC Educational Resources Information Center
Leatherman, Carrie C.; Eckel, Edward J.
2012-01-01
Nearly every commercial database that covers natural sciences and engineering offers some type of current awareness (CA) service that provides regular updates to users on current literature in a selected field of interest. Current awareness services include e-mail alerts, tables of contents, and RSS feeds. This study was designed to find out what…
Updated Atomic Weights: Time to Review Our Table
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tyler B. Coplen; Holden, Norman E.; Meyers, Fabienne
2016-04-05
Many readers might wonder what can be new about atomic weights and why such a subject deserves even a short paper in Chemistry Views magazine. However, despite common belief, atomic weights are not constants of nature. Scientists' ability to measure these values is regularly improving, so one would expect that the accuracy of these values should be improving with time.
Mapping PDB chains to UniProtKB entries.
Martin, Andrew C R
2005-12-01
UniProtKB/SwissProt is the main resource for detailed annotations of protein sequences. This database provides a jumping-off point to many other resources through the links it provides. Among others, these include other primary databases, secondary databases, the Gene Ontology and OMIM. While a large number of links are provided to Protein Data Bank (PDB) files, obtaining a regularly updated mapping between UniProtKB entries and PDB entries at the chain or residue level is not straightforward. In particular, there is no regularly updated resource which allows a UniProtKB/SwissProt entry to be identified for a given residue of a PDB file. We have created a completely automatically maintained database which maps PDB residues to residues in UniProtKB/SwissProt and UniProtKB/trEMBL entries. The protocol uses links from PDB to UniProtKB, from UniProtKB to PDB and a brute-force sequence scan to resolve PDB chains for which no annotated link is available. Finally the sequences from PDB and UniProtKB are aligned to obtain a residue-level mapping. The resource may be queried interactively or downloaded from http://www.bioinf.org.uk/pdbsws/.
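The final residue-level step, aligning a PDB chain sequence against a UniProtKB sequence and reading off residue correspondences, might look roughly like the sketch below using Biopython's pairwise aligner; the sequences are invented, the scoring parameters are arbitrary, and the link-resolution and brute-force scanning stages of the actual protocol are not shown.

```python
# Hedged sketch of residue-level mapping: globally align a PDB chain sequence to
# a UniProtKB sequence and emit residue-index pairs for the aligned blocks.
from Bio import Align

pdb_seq = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"      # hypothetical PDB chain sequence
unp_seq = "MSTNPKPQRKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQAPILSRVGDG"  # hypothetical UniProt sequence

aligner = Align.PairwiseAligner()
aligner.mode = "global"
aligner.match_score = 2
aligner.mismatch_score = -1
aligner.open_gap_score = -2
aligner.extend_gap_score = -0.5

aln = aligner.align(pdb_seq, unp_seq)[0]
pdb_blocks, unp_blocks = aln.aligned            # aligned segment boundaries
for (ps, pe), (us, ue) in zip(pdb_blocks, unp_blocks):
    for offset in range(pe - ps):
        # 1-based residue numbers in each sequence
        print(f"PDB residue {ps + offset + 1} -> UniProt residue {us + offset + 1}")
```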
Heudorf, Ursel; Grünewald, Miriam; Otto, Ulla
2016-01-01
The Commission for Hospital Hygiene and Infection Prevention (KRINKO) updated the recommendations for the prevention of catheter-associated urinary tract infections in 2015. This article will describe the implementation of these recommendations in Frankfurt's hospitals in autumn, 2015. In two non-ICU wards of each of Frankfurt's 17 hospitals, inspections were performed using a checklist based on the new KRINKO recommendations. In one large hospital, a total of 5 wards were inspected. The inspections covered the structure and process quality (operating instructions, training, indication, the placement and maintenance of catheters) and the demonstration of the preparation for insertion of a catheter using an empty bed and an imaginary patient, or insertion in a model. Operating instructions were available in all hospital wards; approximately half of the wards regularly performed training sessions. The indications were largely in line with the recommendations of the KRINKO. Alternatives to urinary tract catheters were available and were used more often than the urinary tract catheters themselves (15.9% vs. 13.5%). In accordance with the recommendations, catheters were placed without antibiotic prophylaxis or the instillation of antiseptic or antimicrobial substances or catheter flushing solutions. The demonstration of catheter placement was conscientiously performed. Need for improvement was seen in the daily documentation and the regular verification of continuing indication for a urinary catheter, as well as the omission of regular catheter change. Overall, the recommendations of the KRINKO on the prevention of catheter-associated urinary tract infections were adequately implemented. However, it cannot be ruled out that in situations with time pressure and staff shortage, the handling of urinary tract catheters may be of lower quality than that observed during the inspections, when catheter insertion was done by two nurses. Against this background, a sufficient number of qualified staff and regular ward rounds by the hygiene staff appear recommendable.
1979-05-01
Aircraft emergencies; emergency training; resource management; behavioral decision theory; instructional systems; decision making ... (accidents) should be fed to ISO personnel to update training regularly; (10) special attention should be paid to teaching difficult component skills ... Need to Make Emergency Decisions? (Ward Edwards, p. 14); Comment (Paul Slovic, p. 20); Resource Management in Present and Future Aircraft Operations (John Lauber).
Deterministic Compressed Sensing
2011-11-01
of the algorithm can be derived by using the Bregman divergence based on the Kullback-Leibler function, and an additive update...regularized goodness-of-fit objective function. In contrast to many CS approaches, however, we measure the fit of an estimate to the data using the...sensing is information-theoretically possible using any (2k, )-RIP sensing matrix. The following celebrated results of Candès, Romberg and Tao
Exploiting Inhibitory Siglecs to Combat Food Allergies
2017-10-01
such as ELISAs. Shiteng and Kevin Worrell participated in group meetings, regularly presenting their research updates. Dr. Macauley worked closely...transferred the cells into naïve animals and then immunized with Ara h 2. She also contributed by running some ELISA experiments. Funding Support...procedures and peanut challenges. She also assisted with cellular studies, flow cytometry, and ELISA. Kelly performed the human CD33 basophil assays as
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alekhin, S.I.; Ezhela, V.V.; Filimonov, B.B.
We present an indexed guide to the literature of experimental particle physics for the years 1988-1992. About 4,000 papers are indexed by Beam/Target/Momentum, Reaction Momentum (including the final state), Final State Particle, and Accelerator/Detector/Experiment. All indices are cross-referenced to the paper's title and reference in the ID/Reference/Title Index. The information in this guide is also publicly available from a regularly updated computer database.
Duan, Jizhong; Liu, Yu; Jing, Peiguang
2018-02-01
Self-consistent parallel imaging (SPIRiT) is an auto-calibrating model for the reconstruction of parallel magnetic resonance imaging, which can be formulated as a regularized SPIRiT problem. The Projection Onto Convex Sets (POCS) method has been used to solve the formulated regularized SPIRiT problem; however, the quality of the reconstructed image still needs to be improved. Though methods such as Nonlinear Conjugate Gradients (NLCG) can achieve higher spatial resolution, they demand very complex computation and converge slowly. In this paper, we propose a new algorithm to solve the formulated Cartesian SPIRiT problem with the JTV and JL1 regularization terms. The proposed algorithm uses the operator splitting (OS) technique to decompose the problem into a gradient problem and a denoising problem with two regularization terms, which is solved by our proposed split Bregman based denoising algorithm, and adopts the Barzilai-Borwein method to update the step size. Simulation experiments on two in vivo data sets demonstrate that the proposed algorithm is 1.3 times faster than ADMM for datasets with 8 channels. In particular, our proposal is 2 times faster than ADMM for the dataset with 32 channels. Copyright © 2017 Elsevier Inc. All rights reserved.
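The Barzilai-Borwein step-size rule referred to above is simple enough to show in isolation; the sketch applies it to a toy least-squares objective rather than to the SPIRiT calibration operators or the JTV/JL1 proximal steps.

```python
# Sketch of the Barzilai-Borwein (BB) step-size rule on a toy quadratic problem.
import numpy as np

rng = np.random.default_rng(8)
A = rng.standard_normal((60, 40))
b = rng.standard_normal(60)

def grad(x):
    return A.T @ (A @ x - b)

x_prev = np.zeros(40)
g_prev = grad(x_prev)
x = x_prev - 1e-3 * g_prev            # one small initial step
for _ in range(100):
    g = grad(x)
    s, yv = x - x_prev, g - g_prev
    # BB1 step: alpha = (s^T s) / (s^T y)
    alpha = (s @ s) / max(s @ yv, 1e-12)
    x_prev, g_prev = x, g
    x = x - alpha * g

print("final gradient norm:", np.linalg.norm(grad(x)))
```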
Ko, Seung-Hyun; Hur, Kyu Yeon; Rhee, Sang Youl; Kim, Nan-Hee; Moon, Min Kyong; Park, Seok-O; Lee, Byung-Wan; Kim, Hyun Jin; Choi, Kyung Mook; Kim, Jin Hwa
2017-11-01
In 2017, the Korean Diabetes Association (KDA) published a position statement on the use of antihyperglycemic agents for patients with type 2 diabetes mellitus (T2DM). The KDA regularly updates its Clinical Practice Guidelines, but since the last update in 2015, many results from clinical trials have been introduced, and domestic data from studies performed in Korean patients with T2DM have been published. Recently, evidence from large clinical studies assessing cardiovascular outcomes following the use of sodium-glucose cotransporter 2 inhibitors and glucagon-like peptide 1 receptor agonists in patients with T2DM were incorporated into the recommendations. Additionally, new data from clinical trials using dipeptidyl peptidase 4 inhibitors and thiazolidinediones in Korean patients with T2DM were added. Following a systematic review and assessment of recent evidence, the KDA updated and modified its clinical practice recommendations regarding the use of antihyperglycemic agents and revised the treatment algorithm for Korean adult patients with T2DM.
Collaborative Information Technologies
NASA Astrophysics Data System (ADS)
Meyer, William; Casper, Thomas
1999-11-01
Significant effort has been expended to provide infrastructure and to facilitate remote collaborations within and beyond the fusion community. Through the Office of Fusion Energy Science Information Technology Initiative, communication technologies utilized by the fusion community are being improved. The initial thrust of the initiative has been collaborative seminars and meetings. Under the initiative, 23 sites, both laboratory and university, were provided with the hardware required to remotely view, or project, documents being presented. The hardware is capable of delivering documents to a web browser, or to compatible hardware, over ESNET in an access-controlled manner. The ability also exists for documents to originate from virtually any of the collaborating sites. In addition, RealNetworks servers are being tested to provide audio and/or video in a non-interactive environment, with MBONE providing two-way interaction where needed. Additional effort is directed at remote distributed computing, file systems, security, and standard data storage and retrieval methods. This work was supported by DOE contract No. W-7405-ENG-48.
Instituto Geografico Nacional of Spain
NASA Technical Reports Server (NTRS)
Colomer, Francisco; Garcia-Espada, Susana; Gomez-Gonzalez, Jesus; Lopez-Fernandez, Jose Antonio; Santamaria-Gomez, Alvaro; De Vicente, Pablo
2013-01-01
This report updates the description of the space geodesy facilities of the Spanish National Geographic Institute (IGN). The current 40-meter radio telescope at Yebes, a network station for IVS, has performed geodetic VLBI observations regularly since September 2008. In addition to this, the project to establish an Atlantic Network of Geodynamical and Space Stations (RAEGE) is progressing with the construction of the first antenna, which is being erected at Yebes.
Reprint Filing: A Profile-Based Solution
Gass, David A.; Putnam, R. Wayne
1983-01-01
A reprint filing system based on practice profiles can give family physicians easy access to relevant medical information. The use of the ICHPPC classification and some supplemental categories provides a more practical coding mechanism than organ systems, textbook chapter titles or even Index Medicus subject headings. The system can be simply maintained, updated and improved, but users must regularly weed out unused information, and read widely to keep the reprints current. PMID:21283301
Graphical Acoustic Liner Design and Analysis Tool
NASA Technical Reports Server (NTRS)
Howerton, Brian M. (Inventor); Jones, Michael G. (Inventor)
2016-01-01
An interactive liner design and impedance modeling tool comprises software utilized to design acoustic liners for use in constrained spaces, both regularly and irregularly shaped. A graphical user interface allows the acoustic channel geometry to be drawn in a liner volume while the surface impedance calculations are updated and displayed in real-time. A one-dimensional transmission line model may be used as the basis for the impedance calculations.
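The one-dimensional transmission line model cited above is a standard way to relate a channel's geometry to its input impedance. Below is a hedged sketch of the generic lossless relation such a tool might build on; the normalized impedances and example geometry are illustrative assumptions, not the tool's actual code.

```python
import numpy as np

def input_impedance(Z0, ZL, k, L):
    """Input impedance of a lossless 1-D transmission line of length L,
    characteristic impedance Z0, terminated by impedance ZL."""
    t = np.tan(k * L)
    return Z0 * (ZL + 1j * Z0 * t) / (Z0 + 1j * ZL * t)

c, f = 343.0, 1000.0                     # speed of sound (m/s), frequency (Hz)
k = 2.0 * np.pi * f / c                  # acoustic wavenumber
# Example: a 5 cm channel with an idealized pressure-release termination (ZL = 0).
print(input_impedance(Z0=1.0, ZL=0.0, k=k, L=0.05))
```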
NASA Astrophysics Data System (ADS)
Shu, Feng; Liu, Xingwen; Li, Min
2018-05-01
Memory is an important factor in the evolution of cooperation in spatially structured populations. For evolutionary biologists, the problem is often how cooperative acts can emerge in an evolving system. In the case of the snowdrift game, it has been found that memory can boost the cooperation level for large cost-to-benefit ratios r, while inhibiting cooperation for small r. How to enlarge the range of r over which cooperation is enhanced has therefore recently become a topical issue. This paper presents a new memory-based approach whose core is the following: each agent applies a given rule to compare its own historical payoffs within a certain memory size and takes the maximal one as its virtual payoff. To improve its strategy, each agent then randomly selects one of its neighbours and compares their virtual payoffs. Both constant-size memory and size-varying memory are investigated by means of an asynchronous updating algorithm on regular lattices of different sizes. Simulation results show that this approach effectively enhances the cooperation level in spatial structure and makes a high cooperation level emerge for both small and large r. Moreover, it is discovered that population sizes have a significant influence on the effects of cooperation.
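A rough simulation sketch of the virtual-payoff rule described above, under stated assumptions: a small periodic lattice, a standard snowdrift payoff matrix, constant-size memory, and imitation whenever the neighbour's virtual payoff is larger. Parameter values are illustrative, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)
L, M, steps, r = 20, 3, 5000, 0.6        # lattice size, memory length, updates, cost-to-benefit ratio

def payoff(s1, s2):
    """Snowdrift payoff for the focal agent: 1 = cooperate, 0 = defect."""
    if s1 == 1 and s2 == 1: return 1 - r / 2
    if s1 == 1 and s2 == 0: return 1 - r
    if s1 == 0 and s2 == 1: return 1
    return 0

strat = rng.integers(0, 2, size=(L, L))
memory = [[[] for _ in range(L)] for _ in range(L)]      # per-agent payoff history

def neighbours(i, j):
    return [((i - 1) % L, j), ((i + 1) % L, j), (i, (j - 1) % L), (i, (j + 1) % L)]

def round_payoff(i, j):
    return sum(payoff(strat[i, j], strat[x, y]) for x, y in neighbours(i, j))

for _ in range(steps):
    i, j = rng.integers(0, L, size=2)                    # asynchronous update of one agent
    memory[i][j] = (memory[i][j] + [round_payoff(i, j)])[-M:]
    x, y = neighbours(i, j)[rng.integers(0, 4)]          # random neighbour
    memory[x][y] = (memory[x][y] + [round_payoff(x, y)])[-M:]
    # virtual payoff = best remembered payoff within the memory window
    if max(memory[x][y]) > max(memory[i][j]):
        strat[i, j] = strat[x, y]

print("cooperation level:", strat.mean())
```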
Using lean methodology to improve efficiency of electronic order set maintenance in the hospital.
Idemoto, Lori; Williams, Barbara; Blackmore, Craig
2016-01-01
Order sets, a series of orders focused around a diagnosis, condition, or treatment, can reinforce best practice, help eliminate outdated practice, and provide clinical guidance. However, order sets require regular updates as evidence and care processes change. We undertook a quality improvement intervention applying lean methodology to create a systematic process for order set review and maintenance. Root cause analysis revealed challenges with unclear prioritization of requests, lack of coordination between teams, and lack of communication between producers and requestors of order sets. In March of 2014, we implemented a systematic, cyclical order set review process, with a set schedule, defined responsibilities for various stakeholders, formal meetings and communication between stakeholders, and transparency of the process. We first identified and deactivated 89 order sets which were infrequently used. Between March and August 2014, 142 order sets went through the new review process. Mean build time for order sets decreased from 79.6 to 43.2 days (p<.001, CI=22.1, 50.7). Applying lean production principles to the order set review process resulted in significant improvement in processing time and increased quality of orders. As use of order sets and other forms of clinical decision support increases, regular evidence and process updates become more critical.
Consent-based access to core EHR information. Collaborative approaches in Norway.
Heimly, Vigdis; Berntsen, Kirsti E
2009-01-01
Lack of access to updated drug information is a challenge for healthcare providers in Norway. Drug charts are updated in separate EHR systems but exchange of drug information between them is lacking. In order to provide ready access to updated medication information, a project for consent-based access to a core EHR has been established. End users have developed requirements for additions to the medication modules in the EHR systems in cooperation with vendors, researchers and standardization workers. The modules are then implemented by the vendors, tested in the usability lab, and finally tested by the national testing and approval service before implementation. An ethnographic study, with focus on future users and their interaction with other actors regarding medicines and medication, has included semi-/unstructured interviews with the involved organizational units. The core EHR uses the EHR kept by the patient's regular GP as the main source of information. A server-based solution has been chosen in order to keep the core EHR accessible outside the GP's regular work hours. The core EHR is being tested, and the EHR-vendors are implementing additions to their systems in order to facilitate communication with the core EHR. All major EHR-system vendors in Norway participate in the project. The core EHR provides a generic basis that may be used as a pilot for a national patient summary. Examples of a wider use of the core EHR can be: shared individual plans to support continuity of care, summary of the patient's contacts with health providers in different organizations, and core EHR information such as important diagnoses, allergies and contact information. Extensive electronic cooperation and communication requires that all partners adjust their documentation practices to fit with other actors' needs. The implementation effects on future work practices will be followed by researchers.
Cerebellum, temporal predictability and the updating of a mental model.
Kotz, Sonja A; Stockert, Anika; Schwartze, Michael
2014-12-19
We live in a dynamic and changing environment, which necessitates that we adapt to and efficiently respond to changes of stimulus form ('what') and stimulus occurrence ('when'). Consequently, behaviour is optimal when we can anticipate both the 'what' and 'when' dimensions of a stimulus. For example, to perceive a temporally expected stimulus, a listener needs to establish a fairly precise internal representation of its external temporal structure, a function ascribed to classical sensorimotor areas such as the cerebellum. Here we investigated how patients with cerebellar lesions and healthy matched controls exploit temporal regularity during auditory deviance processing. We expected modulations of the N2b and P3b components of the event-related potential in response to deviant tones, and also a stronger P3b response when deviant tones are embedded in temporally regular compared to irregular tone sequences. We further tested to what degree structural damage to the cerebellar temporal processing system affects the N2b and P3b responses associated with voluntary attention to change detection and the predictive adaptation of a mental model of the environment, respectively. Results revealed that healthy controls and cerebellar patients display an increased N2b response to deviant tones independent of temporal context. However, while healthy controls showed the expected enhanced P3b response to deviant tones in temporally regular sequences, the P3b response in cerebellar patients was significantly smaller in these sequences. The current data provide evidence that structural damage to the cerebellum affects the predictive adaptation to the temporal structure of events and the updating of a mental model of the environment under voluntary attention. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
Cerebellum, temporal predictability and the updating of a mental model
Kotz, Sonja A.; Stockert, Anika; Schwartze, Michael
2014-01-01
We live in a dynamic and changing environment, which necessitates that we adapt to and efficiently respond to changes of stimulus form (‘what’) and stimulus occurrence (‘when’). Consequently, behaviour is optimal when we can anticipate both the ‘what’ and ‘when’ dimensions of a stimulus. For example, to perceive a temporally expected stimulus, a listener needs to establish a fairly precise internal representation of its external temporal structure, a function ascribed to classical sensorimotor areas such as the cerebellum. Here we investigated how patients with cerebellar lesions and healthy matched controls exploit temporal regularity during auditory deviance processing. We expected modulations of the N2b and P3b components of the event-related potential in response to deviant tones, and also a stronger P3b response when deviant tones are embedded in temporally regular compared to irregular tone sequences. We further tested to what degree structural damage to the cerebellar temporal processing system affects the N2b and P3b responses associated with voluntary attention to change detection and the predictive adaptation of a mental model of the environment, respectively. Results revealed that healthy controls and cerebellar patients display an increased N2b response to deviant tones independent of temporal context. However, while healthy controls showed the expected enhanced P3b response to deviant tones in temporally regular sequences, the P3b response in cerebellar patients was significantly smaller in these sequences. The current data provide evidence that structural damage to the cerebellum affects the predictive adaptation to the temporal structure of events and the updating of a mental model of the environment under voluntary attention. PMID:25385781
Development and operations of the astrophysics data system
NASA Technical Reports Server (NTRS)
Murray, Stephen S.; Oliversen, Ronald (Technical Monitor)
2005-01-01
Abstract service: continued regular updates of abstracts in the databases, both at SAO and at all mirror sites; modified loading scripts to accommodate changes in data format (PhyS); discussed data deliveries with providers to clear up problems with format or other errors (EGU); continued inclusion of large numbers of historical literature volumes and physics conference volumes xeroxed from the library; performed systematic fixes on some data sets in the database to account for changes in article numbering (AGU journals); implemented linking of ADS bibliographic records with multimedia files; debugged and fixed obscure connection problems with the ADS Korean mirror site which were preventing successful updates of the data holdings; wrote a procedure to parse citation data and characterize an ADS record based on its citation ratios within each database.
Couvin, David; Bernheim, Aude; Toffano-Nioche, Claire; Touchon, Marie; Michalik, Juraj; Néron, Bertrand; C Rocha, Eduardo P; Vergnaud, Gilles; Gautheret, Daniel; Pourcel, Christine
2018-05-22
CRISPR (clustered regularly interspaced short palindromic repeats) arrays and their associated Cas proteins confer adaptive immunity on bacteria and archaea against exogenous mobile genetic elements, such as phages or plasmids. CRISPRCasFinder allows the identification of both CRISPR arrays and Cas proteins. The program includes: (i) an improved CRISPR array detection tool facilitating expert validation based on a rating system, (ii) prediction of CRISPR orientation and (iii) a Cas protein detection and typing tool updated to match the latest classification scheme of these systems. CRISPRCasFinder can either be used online or as a standalone tool compatible with the Linux operating system. All third-party software packages employed by the program are freely available. CRISPRCasFinder is available at https://crisprcas.i2bc.paris-saclay.fr.
Image super-resolution via adaptive filtering and regularization
NASA Astrophysics Data System (ADS)
Ren, Jingbo; Wu, Hao; Dong, Weisheng; Shi, Guangming
2014-11-01
Image super-resolution (SR) is widely used in civil and military fields, especially for low-resolution remote sensing images limited by the sensor. Single-image SR refers to the task of restoring a high-resolution (HR) image from a low-resolution image, coupled with some prior knowledge as a regularization term. Classic methods regularize the image by total variation (TV) and/or a wavelet or other transform, which can introduce artifacts. To overcome these shortcomings, a new framework for single-image SR is proposed that applies an adaptive filter before regularization. The key idea of our model is that the adaptive filter is used first to remove the spatial correlation among pixels, and only the high-frequency (HF) part, which is sparser in the TV and transform domains, is then used in the regularization term. Concretely, by transforming the original model, the SR problem can be solved by alternating between two sub-problems. Before each iteration, the adaptive filter is updated to estimate the initial HF. A high-quality HF part and HR image are obtained by solving the first and second sub-problems, respectively. In the experimental part, a set of remote sensing images captured by Landsat satellites is used to demonstrate the effectiveness of the proposed framework. Experimental results show the outstanding performance of the proposed method in quantitative evaluation and visual fidelity compared with state-of-the-art methods.
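As a toy illustration of the "filter first, regularize only the high-frequency part" idea described above, the sketch below uses a fixed Gaussian low-pass as a stand-in for the adaptive filter and simple soft-thresholding as the regularization step; it is not the paper's solver, and the image is a random placeholder.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def split_hf(img, sigma=2.0):
    """Split an image into a low-pass part and its (sparser) high-frequency residual."""
    low = gaussian_filter(img, sigma)
    return low, img - low

def soft_threshold(x, lam):
    """Elementwise shrinkage used here as a simple sparsity-promoting step."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

img = np.random.default_rng(0).random((64, 64))   # placeholder image
low, hf = split_hf(img)
hf_reg = soft_threshold(hf, 0.05)                 # regularize only the HF part
recon = low + hf_reg                              # recombine into the estimate
print("change from input:", np.linalg.norm(recon - img))
```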
CBO’s 2011 Long-Term Budget Outlook
2011-06-01
asteroid strike. Other possibilities include an epidemic (whether on the scale of the 1918 pandemic flu, which killed roughly one out of every 150...AMT and the regular income tax and then pay the higher amount. The parameters that determine the amount owed under the AMT are not indexed for...Compare with deficit. sustainable growth rate (SGR): The formula that determines updates to payment rates for physicians under the Medicare program
Technical Basis and Implementation Guidelines for a Technique for Human Event Analysis (ATHEANA)
2000-05-01
posted at NRC’s Web site address www.nrc.gov/NRC/NUREGS/indexnum.html are updated regularly and may differ from the last printed version. Non-NRC...distinctly different in that it provides structured search schemes for finding such EFCs, by using and integrating knowledge and experience in...Learned from Serious Accidents The record of significant incidents in nuclear power plant (NPP) operations shows a substantially different picture of
The Fermi LAT Very Important Project (VIP) List of Active Galactic Nuclei
NASA Astrophysics Data System (ADS)
Thompson, David J.; Fermi Large Area Telescope Collaboration
2018-01-01
Using nine years of Fermi Gamma-ray Space Telescope Large Area Telescope (LAT) observations, we have identified 30 projects for Active Galactic Nuclei (AGN) that appear to provide strong prospects for significant scientific advances. This Very Important Project (VIP) AGN list includes AGNs that have good multiwavelength coverage, are regularly detected by the Fermi LAT, and offer scientifically interesting timing or spectral properties. Each project has one or more LAT scientists identified who are actively monitoring the source. They will be regularly updating the LAT results for these VIP AGNs, working together with multiwavelength observers and theorists to maximize the scientific return during the coming years of the Fermi mission. See https://confluence.slac.stanford.edu/display/GLAMCOG/VIP+List+of+AGNs+for+Continued+Study
Kwag, Koren Hyogene; González-Lorenzo, Marien; Banzi, Rita; Bonovas, Stefanos
2016-01-01
Background: The complexity of modern practice requires health professionals to be active information-seekers. Objective: Our aim was to review the quality and progress of point-of-care information summaries—Web-based medical compendia that are specifically designed to deliver pre-digested, rapidly accessible, comprehensive, and periodically updated information to health care providers. We aimed to evaluate product claims of being evidence-based. Methods: We updated our previous evaluations by searching Medline, Google, librarian association websites, and conference proceedings from August 2012 to December 2014. We included Web-based, regularly updated point-of-care information summaries with claims of being evidence-based. We extracted data on the general characteristics and content presentation of products, and we quantitatively assessed their breadth of disease coverage, editorial quality, and evidence-based methodology. We assessed potential relationships between these dimensions and compared them with our 2008 assessment. Results: We screened 58 products; 26 met our inclusion criteria. Nearly a quarter (6/26, 23%) were newly identified in 2014. We accessed and analyzed 23 products for content presentation and quantitative dimensions. Most summaries were developed by major publishers in the United States and the United Kingdom; no products derived from low- and middle-income countries. The main target audience remained physicians, although nurses and physiotherapists were increasingly represented. Best Practice, Dynamed, and UptoDate scored the highest across all dimensions. The majority of products did not excel across all dimensions: we found only a moderate positive correlation between editorial quality and evidence-based methodology (r=.41, P=.0496). However, all dimensions improved from 2008: editorial quality (P=.01), evidence-based methodology (P=.015), and volume of diseases and medical conditions (P<.001). Conclusions: Medical and scientific publishers are investing substantial resources towards the development and maintenance of point-of-care summaries. The number of these products has increased since 2008 along with their quality. Best Practice, Dynamed, and UptoDate scored the highest across all dimensions, while others that were marketed as evidence-based were less reliable. Individuals and institutions should regularly assess the value of point-of-care summaries as their quality changes rapidly over time. PMID:26786976
Kwag, Koren Hyogene; González-Lorenzo, Marien; Banzi, Rita; Bonovas, Stefanos; Moja, Lorenzo
2016-01-19
The complexity of modern practice requires health professionals to be active information-seekers. Our aim was to review the quality and progress of point-of-care information summaries-Web-based medical compendia that are specifically designed to deliver pre-digested, rapidly accessible, comprehensive, and periodically updated information to health care providers. We aimed to evaluate product claims of being evidence-based. We updated our previous evaluations by searching Medline, Google, librarian association websites, and conference proceedings from August 2012 to December 2014. We included Web-based, regularly updated point-of-care information summaries with claims of being evidence-based. We extracted data on the general characteristics and content presentation of products, and we quantitatively assessed their breadth of disease coverage, editorial quality, and evidence-based methodology. We assessed potential relationships between these dimensions and compared them with our 2008 assessment. We screened 58 products; 26 met our inclusion criteria. Nearly a quarter (6/26, 23%) were newly identified in 2014. We accessed and analyzed 23 products for content presentation and quantitative dimensions. Most summaries were developed by major publishers in the United States and the United Kingdom; no products derived from low- and middle-income countries. The main target audience remained physicians, although nurses and physiotherapists were increasingly represented. Best Practice, Dynamed, and UptoDate scored the highest across all dimensions. The majority of products did not excel across all dimensions: we found only a moderate positive correlation between editorial quality and evidence-based methodology (r=.41, P=.0496). However, all dimensions improved from 2008: editorial quality (P=.01), evidence-based methodology (P=.015), and volume of diseases and medical conditions (P<.001). Medical and scientific publishers are investing substantial resources towards the development and maintenance of point-of-care summaries. The number of these products has increased since 2008 along with their quality. Best Practice, Dynamed, and UptoDate scored the highest across all dimensions, while others that were marketed as evidence-based were less reliable. Individuals and institutions should regularly assess the value of point-of-care summaries as their quality changes rapidly over time.
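The reported association between editorial quality and evidence-based methodology is an ordinary Pearson correlation; a minimal sketch of how such an r and P value are computed, using made-up product scores rather than the study's data:

```python
from scipy.stats import pearsonr

# Hypothetical per-product scores on two quality dimensions (illustrative only).
editorial_quality = [7.5, 6.0, 8.2, 5.1, 9.0, 6.8, 7.0, 4.9]
ebm_methodology   = [6.8, 5.5, 7.0, 5.9, 8.1, 5.2, 7.4, 5.0]

r, p = pearsonr(editorial_quality, ebm_methodology)
print(f"Pearson r = {r:.2f}, P = {p:.4f}")
```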
FunCoup 3.0: database of genome-wide functional coupling networks
Schmitt, Thomas; Ogris, Christoph; Sonnhammer, Erik L. L.
2014-01-01
We present an update of the FunCoup database (http://FunCoup.sbc.su.se) of functional couplings, or functional associations, between genes and gene products. Identifying these functional couplings is an important step in the understanding of higher level mechanisms performed by complex cellular processes. FunCoup distinguishes between four classes of couplings: participation in the same signaling cascade, participation in the same metabolic process, co-membership in a protein complex and physical interaction. For each of these four classes, several types of experimental and statistical evidence are combined by Bayesian integration to predict genome-wide functional coupling networks. The FunCoup framework has been completely re-implemented to allow for more frequent future updates. It contains many improvements, such as a regularization procedure to automatically downweight redundant evidences and a novel method to incorporate phylogenetic profile similarity. Several datasets have been updated and new data have been added in FunCoup 3.0. Furthermore, we have developed a new Web site, which provides powerful tools to explore the predicted networks and to retrieve detailed information about the data underlying each prediction. PMID:24185702
Bouaud, Jacques; Séroussi, Brigitte; Brizon, Ambre; Culty, Thibault; Mentré, France; Ravery, Vincent
2007-01-01
Guideline-based clinical decision support systems (CDSSs) can be effective in increasing physician compliance with recommendations. However, the ever-growing pace at which medical knowledge is produced requires that clinical practice guidelines (CPGs) be updated regularly. It is therefore mandatory that CDSSs be revised accordingly. The French Association for Urology publishes CPGs on bladder cancer management every 2 years. We studied the impact of the 2004 revision of these guidelines, with respect to the 2002 version, on a CDSS, UroDoc. We propose a typology of knowledge base modifications resulting from the update of CPGs, distinguishing refinements of practices, clinical conditions, and recommendations from new practices and new recommendations. The number of formalized recommendations increased from 577 in 2002 to 1,081 in 2004. We evaluated the two versions of UroDoc on a randomized sample of patient records. A single new practice that modifies a decision taken in 49% of all recorded decisions led to a fall in the compliance rate of decisions from 67% to 46%.
Ko, Seung Hyun; Hur, Kyu Yeon; Rhee, Sang Youl; Kim, Nan Hee; Moon, Min Kyong; Park, Seok O; Lee, Byung Wan; Kim, Hyun Jin; Choi, Kyung Mook; Kim, Jin Hwa
2017-10-01
In 2017, the Korean Diabetes Association (KDA) published a position statement on the use of antihyperglycemic agents for patients with type 2 diabetes mellitus (T2DM). The KDA regularly updates its Clinical Practice Guidelines, but since the last update in 2015, many results from clinical trials have been introduced, and domestic data from studies performed in Korean patients with T2DM have been published. Recently, evidence from large clinical studies assessing cardiovascular outcomes following the use of sodium-glucose cotransporter 2 inhibitors and glucagon-like peptide 1 receptor agonists in patients with T2DM were incorporated into the recommendations. Additionally, new data from clinical trials using dipeptidyl peptidase 4 inhibitors and thiazolidinediones in Korean patients with T2DM were added. Following a systematic review and assessment of recent evidence, the KDA updated and modified its clinical practice recommendations regarding the use of antihyperglycemic agents and revised the treatment algorithm for Korean adult patients with T2DM. Copyright © 2017 Korean Diabetes Association.
FunCoup 3.0: database of genome-wide functional coupling networks.
Schmitt, Thomas; Ogris, Christoph; Sonnhammer, Erik L L
2014-01-01
We present an update of the FunCoup database (http://FunCoup.sbc.su.se) of functional couplings, or functional associations, between genes and gene products. Identifying these functional couplings is an important step in the understanding of higher level mechanisms performed by complex cellular processes. FunCoup distinguishes between four classes of couplings: participation in the same signaling cascade, participation in the same metabolic process, co-membership in a protein complex and physical interaction. For each of these four classes, several types of experimental and statistical evidence are combined by Bayesian integration to predict genome-wide functional coupling networks. The FunCoup framework has been completely re-implemented to allow for more frequent future updates. It contains many improvements, such as a regularization procedure to automatically downweight redundant evidences and a novel method to incorporate phylogenetic profile similarity. Several datasets have been updated and new data have been added in FunCoup 3.0. Furthermore, we have developed a new Web site, which provides powerful tools to explore the predicted networks and to retrieve detailed information about the data underlying each prediction.
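A toy sketch of the kind of Bayesian evidence integration described above: per-evidence log-likelihood ratios are summed under a naive independence assumption and converted to a posterior coupling probability. The evidence names, values, and prior odds are illustrative, not FunCoup's actual data or weights.

```python
import math

def combined_llr(evidence_llrs):
    """Sum independent per-evidence log-likelihood ratios for a candidate gene pair."""
    return sum(evidence_llrs.values())

def coupling_probability(llr, prior_odds=1e-3):
    """Convert a combined LLR and prior odds into a posterior probability of coupling."""
    posterior_odds = prior_odds * math.exp(llr)
    return posterior_odds / (1.0 + posterior_odds)

evidence = {"coexpression": 2.1, "ppi": 1.4, "phylogenetic_profile": 0.6}
llr = combined_llr(evidence)
print(f"combined LLR = {llr:.2f}, P(coupled) = {coupling_probability(llr):.3f}")
```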
Satellite image based methods for fuels maps updating
NASA Astrophysics Data System (ADS)
Alonso-Benito, Alfonso; Hernandez-Leal, Pedro A.; Arbelo, Manuel; Gonzalez-Calvo, Alejandro; Moreno-Ruiz, Jose A.; Garcia-Lazaro, Jose R.
2016-10-01
Regular updating of fuels maps is important for forest fire management. Nevertheless, complex and time-consuming field work is usually necessary for this purpose, which prevents more frequent updates. That is why assessing the usefulness of satellite data and developing remote sensing techniques that enable the automatic updating of these maps is of vital interest. In this work, we have tested the use of the spectral bands of the OLI (Operational Land Imager) sensor on board the Landsat 8 satellite for updating the fuels map of El Hierro Island (Spain). From a previously digitized map, a set of 200 reference plots for different fuel types was created. Half of the plots were randomly used as a training set and the rest were considered for validation. Six supervised and 2 unsupervised classification methods were applied, considering two levels of detail: a first level with only 5 classes (Meadow, Brushwood, Undergrowth canopy cover >50%, Undergrowth canopy cover <15%, and Xeric formations), and a second one containing 19 fuel types. The level 1 classification methods yielded an overall accuracy ranging from 44% for Parallelepiped to 84% for Maximum Likelihood. Meanwhile, level 2 results showed, at best, an unacceptable overall accuracy of 34%, which prevents the use of this data for such a detailed characterization. Nevertheless, it has been demonstrated that, under some conditions, images of medium spatial resolution, like Landsat 8-OLI, can be a valid tool for the automatic updating of fuels maps, minimizing costs and complementing traditional methodologies.
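A minimal sketch of the supervised level-1 classification step described above, with quadratic discriminant analysis standing in for a Gaussian maximum-likelihood classifier; the reflectance array, label vector, and 50/50 split are placeholder assumptions, not the study's data.

```python
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(1)
bands = rng.random((200, 7))                 # placeholder OLI reflectances for 200 plots
fuel_type = rng.integers(0, 5, size=200)     # 5 level-1 fuel classes (placeholder labels)

X_tr, X_te, y_tr, y_te = train_test_split(bands, fuel_type, test_size=0.5, random_state=0)
clf = QuadraticDiscriminantAnalysis().fit(X_tr, y_tr)   # Gaussian class models, ML decision rule
print("overall accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```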
Lorenzetti, Valentina; Solowij, Nadia; Fornito, Alex; Lubman, Dan Ian; Yucel, Murat
2014-01-01
Cannabis is the most widely used illicit drug worldwide, though it is unclear whether its regular use is associated with persistent alterations in brain morphology. This review examines evidence from human structural neuroimaging investigations of regular cannabis users and focuses on achieving three main objectives. These include examining whether the literature to date provides evidence that alteration of brain morphology in regular cannabis users: i) is apparent, compared to non-cannabis using controls; ii) is associated with patterns of cannabis use; and with iii) measures of psychopathology and neurocognitive performance. The published findings indicate that regular cannabis use is associated with alterations in medial temporal, frontal and cerebellar brain regions. Greater brain morphological alterations were evident among samples that used at higher doses for longer periods. However, the evidence for an association between brain morphology and cannabis use parameters was mixed. Further, there is poor evidence for an association between measures of brain morphology and of psychopathology symptoms/neurocognitive performance. Overall, numerous methodological issues characterize the literature to date. These include investigation of small sample sizes, heterogeneity across studies in sample characteristics (e.g., sex, comorbidity) and in employed imaging techniques, as well as the examination of only a limited number of brain regions. These factors make it difficult to draw firm conclusions from the existing findings. Nevertheless, this review supports the notion that regular cannabis use is associated with alterations of brain morphology, and highlights the need to consider particular methodological issues when planning future cannabis research.
Amiodarone-Induced Thyroid Dysfunction: A Clinical Update.
Elnaggar, Mohamed Nabil; Jbeili, Kahtan; Nik-Hussin, Nik; Kozhippally, Mohandas; Pappachan, Joseph M
2018-06-01
Amiodarone is one of the most commonly prescribed antiarrhythmic agents in clinical practice owing to its efficacy, despite its high toxicity profile. The high iodine content and the prolonged biological half-life of the drug can result in thyroid dysfunction in a high proportion of treated patients, even after cessation of amiodarone. Both hypothyroidism and hyperthyroidism are common side effects that mandate regular monitoring of patients with thyroid function tests. Amiodarone-induced hypothyroidism (AIH) is diagnosed and managed in the same way as a usual case of hypothyroidism. However, differential diagnosis and clinical management of amiodarone-induced thyrotoxicosis (AIT) subtypes can be challenging. With the aid of a case snippet, we update the current evidence for the diagnostic work-up and management of patients with amiodarone-induced thyroid dysfunction in this article. © Georg Thieme Verlag KG Stuttgart · New York.
Lee, Byung-Wan; Kim, Jin Hwa; Ko, Seung-Hyun; Hur, Kyu Yeon; Kim, Nan-Hee; Rhee, Sang Youl; Kim, Hyun Jin; Moon, Min Kyong; Park, Seok-O; Choi, Kyung Mook
2017-11-01
The Korean Diabetes Association (KDA) has regularly updated its Clinical Practice Guidelines. In 2017, the KDA published a position statement on the use of antihyperglycemic agents for patients with type 2 diabetes mellitus (T2DM). Growing evidence from new multinational clinical trials using novel and traditional insulin analogues has also been accumulated. Following global trends, many results of clinical trials, especially concerning the clinical efficacy and safety of insulin therapy, have been published about Korean patients with T2DM. After a systematic search of recent evidence, the KDA updated and modified its clinical practice recommendations regarding the initiation, choice, and intensification of insulin and created an insulin treatment algorithm for the first time to guide physicians caring for adult Korean patients with T2DM.
FY2014 Appropriations Lapse and the Department of Homeland Security: Impact and Legislation
2013-10-24
Fugate posted on FEMA’s website a memorandum to FEMA employees that noted, “Beginning shortly, we will be recalling some employees who were furloughed...19 Memorandum for FEMA Employees from Administrator Craig Fugate, “Shutdown Update and Potential Staff Recall - October 3, 2013,” posted October 2...same as the annualized discretionary budget authority that would have been provided to DHS through H.J.Res. 59. The sections in blue are regular
Bhowmik, D. M.; Dinda, A. K.; Mahanta, P.; Agarwal, S. K.
2010-01-01
Until the early 1990s there was no standardized international classification of renal allograft biopsies, resulting in considerable heterogeneity in reporting among the various centers. A group of dedicated renal pathologists, nephrologists, and transplant surgeons developed a schema in Banff, Canada in 1991. Subsequently, there have been updates at regular intervals. The following review presents the evolution of the Banff classification and its utility for clinicians. PMID:20535263
Ghooi, Ravindra B.
2011-01-01
The Nuremberg Code drafted at the end of the Doctor’s trial in Nuremberg 1947 has been hailed as a landmark document in medical and research ethics. Close examination of this code reveals that it was based on the Guidelines for Human Experimentation of 1931. The resemblance between these documents is uncanny. It is unfortunate that the authors of the Nuremberg Code passed it off as their original work. There is evidence that the defendants at the trial did request that their actions be judged on the basis of the 1931 Guidelines, in force in Germany. The prosecutors, however, ignored the request and tried the defendants for crimes against humanity, and the judges included the Nuremberg Code as a part of the judgment. Six of the ten principles in the Nuremberg Code are derived from the 1931 Guidelines, and two of four newly inserted principles are open to misinterpretation. There is little doubt that the Code was prepared after studying the Guidelines, but no reference was made to the Guidelines, for reasons that are not known. Using the Guidelines as a base document without giving due credit is plagiarism; as per our understanding of ethics today, this would be considered unethical. The Nuremberg Code has fallen by the wayside since, unlike the Declaration of Helsinki, it is not regularly reviewed and updated. The regular updating of some ethics codes is evidence of the evolving nature of human ethics. PMID:21731859
The Nuremberg Code-A critique.
Ghooi, Ravindra B
2011-04-01
The Nuremberg Code drafted at the end of the Doctor's trial in Nuremberg 1947 has been hailed as a landmark document in medical and research ethics. Close examination of this code reveals that it was based on the Guidelines for Human Experimentation of 1931. The resemblance between these documents is uncanny. It is unfortunate that the authors of the Nuremberg Code passed it off as their original work. There is evidence that the defendants at the trial did request that their actions be judged on the basis of the 1931 Guidelines, in force in Germany. The prosecutors, however, ignored the request and tried the defendants for crimes against humanity, and the judges included the Nuremberg Code as a part of the judgment. Six of the ten principles in the Nuremberg Code are derived from the 1931 Guidelines, and two of four newly inserted principles are open to misinterpretation. There is little doubt that the Code was prepared after studying the Guidelines, but no reference was made to the Guidelines, for reasons that are not known. Using the Guidelines as a base document without giving due credit is plagiarism; as per our understanding of ethics today, this would be considered unethical. The Nuremberg Code has fallen by the wayside since, unlike the Declaration of Helsinki, it is not regularly reviewed and updated. The regular updating of some ethics codes is evidence of the evolving nature of human ethics.
Sajjad, Muhammad; Mehmood, Irfan; Baik, Sung Wook
2015-01-01
Image super-resolution (SR) plays a vital role in medical imaging that allows a more efficient and effective diagnosis process. Usually, diagnosing is difficult and inaccurate from low-resolution (LR) and noisy images. Resolution enhancement through conventional interpolation methods strongly affects the precision of consequent processing steps, such as segmentation and registration. Therefore, we propose an efficient sparse coded image SR reconstruction technique using a trained dictionary. We apply a simple and efficient regularized version of orthogonal matching pursuit (ROMP) to seek the coefficients of sparse representation. ROMP has the transparency and greediness of OMP and the robustness of L1-minimization, which enhance the dictionary learning process to capture feature descriptors such as oriented edges and contours from complex images like brain MRIs. The sparse coding part of the K-SVD dictionary training procedure is modified by substituting OMP with ROMP. The dictionary update stage allows simultaneously updating an arbitrary number of atoms and vectors of sparse coefficients. In SR reconstruction, ROMP is used to determine the vector of sparse coefficients for the underlying patch. The recovered representations are then applied to the trained dictionary, and finally, an optimization leads to a high-quality, high-resolution output. Experimental results demonstrate that the super-resolution reconstruction quality of the proposed scheme is comparatively better than other state-of-the-art schemes.
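A simplified sketch of sparse coding a patch against a dictionary, using scikit-learn's plain OMP as a stand-in for the ROMP step described above; the random dictionary and synthetic patch are illustrative, not a trained K-SVD dictionary.

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)
D = rng.standard_normal((64, 256))
D /= np.linalg.norm(D, axis=0)                           # unit-norm dictionary atoms
patch = D[:, [3, 57, 200]] @ np.array([1.5, -0.8, 0.4])  # synthetic 3-sparse patch

omp = OrthogonalMatchingPursuit(n_nonzero_coefs=3, fit_intercept=False).fit(D, patch)
coeffs = omp.coef_                                       # sparse representation of the patch
recon = D @ coeffs                                       # reconstruction from the dictionary
print("reconstruction error:", np.linalg.norm(recon - patch))
```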
Group-sparse representation with dictionary learning for medical image denoising and fusion.
Li, Shutao; Yin, Haitao; Fang, Leyuan
2012-12-01
Recently, sparse representation has attracted a lot of interest in various areas. However, the standard sparse representation does not consider the intrinsic structure, i.e., the nonzero elements occur in clusters, called group sparsity. Furthermore, there is no dictionary learning method for group sparse representation considering the geometrical structure of space spanned by atoms. In this paper, we propose a novel dictionary learning method, called Dictionary Learning with Group Sparsity and Graph Regularization (DL-GSGR). First, the geometrical structure of atoms is modeled as the graph regularization. Then, combining group sparsity and graph regularization, the DL-GSGR is presented, which is solved by alternating the group sparse coding and dictionary updating. In this way, the group coherence of the learned dictionary can be enforced small enough such that any signal can be group sparse coded effectively. Finally, group sparse representation with DL-GSGR is applied to 3-D medical image denoising and image fusion. Specifically, in 3-D medical image denoising, a 3-D processing mechanism (using the similarity among nearby slices) and temporal regularization (to preserve the correlations across nearby slices) are exploited. The experimental results on 3-D image denoising and image fusion demonstrate the superiority of our proposed denoising and fusion approaches.
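Alternating schemes of this kind typically rely on a group soft-thresholding (group-lasso proximal) step during the group sparse coding; a minimal sketch follows, with an illustrative group layout rather than the paper's actual formulation.

```python
import numpy as np

def group_soft_threshold(x, groups, lam):
    """Shrink each coefficient group toward zero by its l2 norm (group-lasso prox)."""
    out = np.zeros_like(x)
    for g in groups:
        norm = np.linalg.norm(x[g])
        if norm > lam:
            out[g] = (1.0 - lam / norm) * x[g]   # group survives, shrunk as a whole
        # otherwise the whole group is set to zero
    return out

x = np.array([0.9, -0.8, 0.05, 0.02, 1.5, 0.1])
groups = [np.array([0, 1]), np.array([2, 3]), np.array([4, 5])]
print(group_soft_threshold(x, groups, lam=0.3))
```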
Gambito, Ephraim D V; Gonzalez-Suarez, Consuelo B; Grimmer, Karen A; Valdecañas, Carolina M; Dizon, Janine Margarita R; Beredo, Ma Eulalia J; Zamora, Marcelle Theresa G
2015-11-04
Clinical practice guidelines need to be regularly updated with current literature in order to remain relevant. This paper reports on the approach taken by the Philippine Academy of Rehabilitation Medicine (PARM). This dovetails with its writing guide, which underpinned its foundational work in contextualizing guidelines for stroke and low back pain (LBP) in 2011. Working groups of Filipino rehabilitation physicians and allied health practitioners met to reconsider and modify, where indicated, the 'typical' Filipino patient care pathways established in the foundation guidelines. New clinical guidelines on stroke and low back pain which had been published internationally in the last 3 years were identified using a search of electronic databases. The methodological quality of each guideline was assessed using the iCAHE Guideline Quality Checklist, and only those guidelines which provided full text references, evidence hierarchy and quality appraisal of the included literature, were included in the PARM update. Each of the PARM-endorsed recommendations was then reviewed, in light of new literature presented in the included clinical guidelines. A novel standard updating approach was developed based on the criteria reported by Johnston et al. (Int J Technol Assess Health Care 19(4):646-655, 2003) and then modified to incorporate wording from the foundational PARM writing guide. The new updating tool was debated, pilot-tested and agreed upon by the PARM working groups, before being applied to the guideline updating process. Ten new guidelines on stroke and eleven for low back pain were identified. Guideline quality scores were moderate to good, however not all guidelines comprehensively linked the evidence body underpinning recommendations with the literature. Consequently only five stroke and four low back pain guidelines were included. The modified PARM updating guide was applied by all working groups to ensure standardization of the wording of updated recommendations and the underpinning evidence bases. The updating tool provides a simple, standard and novel approach that incorporates evidence hierarchy and quality, and wordings of recommendations. It could be used efficiently by other guideline updaters particularly in developing countries, where resources for guideline development and updates are limited. When many people are involved in guideline writing, there is always the possibility of 'slippage' in use of wording and interpretation of evidence. The PARM updating tool provides a mechanism for maintaining a standard process for guideline updating processes that can be followed by clinicians with basic training in evidence-based practice principles.
Structured sparse linear graph embedding.
Wang, Haixian
2012-03-01
Subspace learning is a core issue in pattern recognition and machine learning. Linear graph embedding (LGE) is a general framework for subspace learning. In this paper, we propose a structured sparse extension to LGE (SSLGE) by introducing a structured sparsity-inducing norm into LGE. Specifically, SSLGE casts the projection bases learning into a regression-type optimization problem, and then the structured sparsity regularization is applied to the regression coefficients. The regularization selects a subset of features and meanwhile encodes high-order information reflecting a priori structure information of the data. The SSLGE technique provides a unified framework for discovering structured sparse subspace. Computationally, by using a variational equality and the Procrustes transformation, SSLGE is efficiently solved with closed-form updates. Experimental results on face images show the effectiveness of the proposed method. Copyright © 2011 Elsevier Ltd. All rights reserved.
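The closed-form update via the Procrustes transformation mentioned above reduces to the classical orthogonal Procrustes problem; a minimal sketch with illustrative matrices rather than the paper's variables:

```python
import numpy as np
from scipy.linalg import orthogonal_procrustes

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 5))
R_true, _ = np.linalg.qr(rng.standard_normal((5, 5)))   # a random orthogonal matrix
B = A @ R_true                                          # target built from A and R_true

# Closed-form solution via the SVD of A.T @ B: the orthogonal R minimizing ||A R - B||_F.
R, scale = orthogonal_procrustes(A, B)
print("recovery error:", np.linalg.norm(R - R_true))
```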
Exercise training in children and adolescents with cystic fibrosis: theory into practice.
Williams, Craig A; Benden, Christian; Stevens, Daniel; Radtke, Thomas
2010-01-01
Physical activity and exercise training play an important role in the clinical management of patients with cystic fibrosis (CF). Exercise training is more common and recognized as an essential part of rehabilitation programmes and overall CF care. Regular exercise training is associated with improved aerobic and anaerobic capacity, higher pulmonary function, and enhanced airway mucus clearance. Furthermore, patients with higher aerobic fitness have an improved survival. Aerobic and anaerobic training may have different effects, while the combination of both have been reported to be beneficial in CF. However, exercise training remains underutilised and not always incorporated into routine CF management. We provide an update on aerobic and anaerobic responses to exercise and general training recommendations in children and adolescents with CF. We propose that an active lifestyle and exercise training are an efficacious part of regular CF patient management.
Hoebel, Jens; Finger, Jonas D; Kuntz, Benjamin; Lampert, Thomas
2016-02-01
Regular physical activity has positive effects on health at all ages. This study aims to investigate how far physical activity and regular sports engagement, as a more specific type of physical activity, are associated with socioeconomic factors in the middle-aged working population. Data were obtained from 21,699 working men and women aged between 30 and 64 years who participated in the 2009 and 2010 population-based national German Health Update (GEDA) surveys conducted by the Robert Koch Institute. Besides a multi-dimensional index of socioeconomic status (SES), three single dimensions of SES (education, occupation, and income) were used to analyse socioeconomic differences in total physical activity and regular sports engagement. While the prevalence of total physical activity increased with lower SES, the proportion of people with regular sports engagement decreased with lower SES. These associations remained after adjusting for age in men and women. After mutual adjustment of the three single socioeconomic dimensions, physical activity was independently associated with lower education and lower occupational status. Regular sports engagement was observed to be independently associated with higher education, higher occupational status, as well as higher income after mutual adjustment. This study demonstrates significant socioeconomic differences in physical and sports activity in the middle-aged working population. Education, occupation, and income show varying independent associations with physical activity behaviour. Such differences need to be considered when identifying target groups for health-enhancing physical activity interventions.
Advance reservation access control using software-defined networking and tokens
Chung, Joaquin; Jung, Eun-Sung; Kettimuthu, Rajkumar; ...
2017-03-09
Advance reservation systems allow users to reserve dedicated bandwidth connection resources from advanced high-speed networks. A common use case for such systems is data transfers in distributed science environments in which a user wants exclusive access to the reservation. However, current advance network reservation methods cannot ensure exclusive access of a network reservation to the specific flow for which the user made the reservation. We present in this paper a novel network architecture that addresses this limitation and ensures that a reservation is used only by the intended flow. We achieve this by leveraging software-defined networking (SDN) and token-based authorization. We use SDN to orchestrate and automate the reservation of networking resources, end-to-end and across multiple administrative domains, and tokens to create a strong binding between the user or application that requested the reservation and the flows provisioned by SDN. Finally, we conducted experiments on the ESNet 100G SDN testbed, and demonstrated that our system effectively protects authorized flows from competing traffic in the network.
Advance reservation access control using software-defined networking and tokens
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chung, Joaquin; Jung, Eun-Sung; Kettimuthu, Rajkumar
Advance reservation systems allow users to reserve dedicated bandwidth connection resources from advanced high-speed networks. A common use case for such systems is data transfers in distributed science environments in which a user wants exclusive access to the reservation. However, current advance network reservation methods cannot ensure exclusive access of a network reservation to the specific flow for which the user made the reservation. We present here a novel network architecture that addresses this limitation and ensures that a reservation is used only by the intended flow. We achieve this by leveraging software-defined networking (SDN) and token-based authorization. We use SDN to orchestrate and automate the reservation of networking resources, end-to-end and across multiple administrative domains, and tokens to create a strong binding between the user or application that requested the reservation and the flows provisioned by SDN. We conducted experiments on the ESNet 100G SDN testbed, and demonstrated that our system effectively protects authorized flows from competing traffic in the network. (C) 2017 Elsevier B.V. All rights reserved.
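As a rough illustration of the token-based binding idea, the sketch below signs a reservation identifier together with a flow 5-tuple and verifies both the signature and the flow match; the field names, HMAC construction, and shared secret are assumptions for illustration, not the paper's or ESnet's actual protocol.

```python
import hmac, hashlib, json

SECRET = b"controller-shared-secret"   # assumed to be shared with the SDN controller

def issue_token(reservation_id, flow_5tuple):
    """Bind a reservation to one flow by signing both together."""
    claims = {"rid": reservation_id, "flow": list(flow_5tuple)}
    payload = json.dumps(claims, sort_keys=True).encode()
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return payload, sig

def verify_token(payload, sig, observed_flow):
    """Accept only untampered tokens whose bound flow matches the observed flow."""
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False
    return json.loads(payload)["flow"] == list(observed_flow)

payload, sig = issue_token("resv-42", ("10.0.0.1", "10.0.0.2", 5001, 5002, "tcp"))
print(verify_token(payload, sig, ("10.0.0.1", "10.0.0.2", 5001, 5002, "tcp")))  # matches the binding
print(verify_token(payload, sig, ("10.0.0.9", "10.0.0.2", 5001, 5002, "tcp")))  # different flow, rejected
```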
A guide to experimental particle physics literature, 1991-1996
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ezhela, V.V.; Filimonov, B.B.; Lugovsky, S.B.
1996-10-01
We present an indexed guide to experimental particle physics literature for the years 1991 - 1996. Approximately 4200 papers are indexed by (1) Beam/Target/Momentum (2) Reaction/Momentum/Data-Descriptor (including the final state) (3) Particle/Decay (4) Accelerator/Experiment/Detector. All indices are cross-referenced to the paper's title and references in the ID/Reference/Title index. The information presented in this guide is also publicly available on a regularly-updated DATAGUIDE database from the World Wide Web.
The Smart Mine Simulator User’s Guide and Algorithm Description
1993-12-01
[Table fragment: example simulator parameters include control kill ranges (tank 2 meters, APC 1.5 meters, other ground 1 meter), munition burst types (105APDS and TOW projectiles, WAM), M739/155 mm detonators, a sensor detection range of 50 meters, and a control firing angle of -55 degrees.]
[Preventive vaccinations in dentistry].
Rostetter, Claudio; Lübbers, Heinz-Theo; Kruse, Astrid L; Metzler, Philipp
2015-01-01
The purpose of this paper is to give a simple update and overview of vaccinations for dental health care workers, considering the new guidelines published in February 2014 by the Swiss Federal Office of Public Health. It is recommended to have at least valid protection against hepatitis B, measles, mumps, rubella, influenza, varicella, diphtheria, tetanus, poliomyelitis, and pertussis. Dental health care workers are highly exposed and are high-risk carriers for inoculable diseases; therefore, regular refreshment of vaccinations is necessary for public health and their own health.
2010-05-07
important for deep modular systems is that taking a series of small update steps and stopping before convergence, so-called early stopping, is a form of regularization around the initial parameters of the system. For example, the stochastic gradient descent...Aside from the overall speed of the classifier, no quantitative performance analysis was given, and the role played by the features in the larger system
Nunn, June; Foster, Martin; Master, Selina; Greening, Sue
2008-11-01
This policy document was prepared by J Nunn, M Foster, S Master and S Greening on behalf of the British Society of Paediatric Dentistry (BSPD). Policy documents produced by the BSPD represent a majority view, based on a consideration of currently available evidence. They are produced to provide guidance with the intention that the policy be regularly reviewed and updated to take account of changing views and developments.
Khatcheressian, James L; Hurley, Patricia; Bantug, Elissa; Esserman, Laura J; Grunfeld, Eva; Halberg, Francine; Hantel, Alexander; Henry, N Lynn; Muss, Hyman B; Smith, Thomas J; Vogel, Victor G; Wolff, Antonio C; Somerfield, Mark R; Davidson, Nancy E
2013-03-01
To provide recommendations on the follow-up and management of patients with breast cancer who have completed primary therapy with curative intent. To update the 2006 guideline of the American Society of Clinical Oncology (ASCO), a systematic review of the literature published from March 2006 through March 2012 was completed using MEDLINE and the Cochrane Collaboration Library. An Update Committee reviewed the evidence to determine whether the recommendations were in need of updating. There were 14 new publications that met inclusion criteria: nine systematic reviews (three included meta-analyses) and five randomized controlled trials. After its review and analysis of the evidence, the Update Committee concluded that no revisions to the existing ASCO recommendations were warranted. Regular history, physical examination, and mammography are recommended for breast cancer follow-up. Physical examinations should be performed every 3 to 6 months for the first 3 years, every 6 to 12 months for years 4 and 5, and annually thereafter. For women who have undergone breast-conserving surgery, a post-treatment mammogram should be obtained 1 year after the initial mammogram and at least 6 months after completion of radiation therapy. Thereafter, unless otherwise indicated, a yearly mammographic evaluation should be performed. The use of complete blood counts, chemistry panels, bone scans, chest radiographs, liver ultrasounds, pelvic ultrasounds, computed tomography scans, [(18)F]fluorodeoxyglucose-positron emission tomography scans, magnetic resonance imaging, and/or tumor markers (carcinoembryonic antigen, CA 15-3, and CA 27.29) is not recommended for routine follow-up in an otherwise asymptomatic patient with no specific findings on clinical examination.
Audibert, Francois; De Bie, Isabelle; Johnson, Jo-Ann; Okun, Nanette; Wilson, R Douglas; Armour, Christine; Chitayat, David; Kim, Raymond
2017-09-01
To review the available prenatal screening options in light of the recent technical advances and to provide an update of previous guidelines in the field of prenatal screening. Health care providers involved in prenatal screening, including general practitioners, obstetricians, midwives, maternal fetal medicine specialists, geneticists, and radiologists. All pregnant women receiving counselling and providing informed consent for prenatal screening. Published literature was retrieved through searches of Medline, PubMed, and the Cochrane Library in or prior to March 2016 using an appropriate controlled vocabulary (prenatal diagnosis, amniocentesis, chorionic villi sampling, non-invasive prenatal screening) and key words (prenatal screening, prenatal genetic counselling). Results were restricted to systematic reviews, randomized controlled trials/controlled clinical trials, and observational studies written in English and published from January 1985 to May 2016. Searches were updated on a regular basis and incorporated into the guideline. Grey (unpublished) literature was identified through searching the websites of health technology assessment and health technology-related agencies, clinical practice guideline collections, clinical trial registries, and national and international medical speciality societies. Evidence will be reviewed 5 years after publication to determine whether all or part of the guideline should be updated. However, if important new evidence is published prior to the 5-year cycle, the review process may be accelerated for a more rapid update of some recommendations. Copyright © 2017 The Society of Obstetricians and Gynaecologists of Canada/La Société des obstétriciens et gynécologues du Canada. Published by Elsevier Inc. All rights reserved.
Lee, Byung Wan; Kim, Jin Hwa; Ko, Seung Hyun; Hur, Kyu Yeon; Kim, Nan Hee; Rhee, Sang Youl; Kim, Hyun Jin; Moon, Min Kyong; Park, Seok O; Choi, Kyung Mook
2017-10-01
The Korean Diabetes Association (KDA) has regularly updated its Clinical Practice Guidelines. In 2017, the KDA published a position statement on the use of antihyperglycemic agents for patients with type 2 diabetes mellitus (T2DM). Growing evidence from new multinational clinical trials using novel and traditional insulin analogues has also been accumulated. Following global trends, many results of clinical trials, especially concerning the clinical efficacy and safety of insulin therapy, have been published about Korean patients with T2DM. After a systematic search of recent evidence, the KDA updated and modified its clinical practice recommendations regarding the initiation, choice, and intensification of insulin and created an insulin treatment algorithm for the first time to guide physicians caring for adult Korean patients with T2DM. Copyright © 2017 Korean Diabetes Association.
Unfolding large-scale online collaborative human dynamics
Zha, Yilong; Zhou, Tao; Zhou, Changsong
2016-01-01
Large-scale interacting human activities underlie all social and economic phenomena, but quantitative understanding of their regular patterns and mechanisms is very challenging and still rare. Self-organized online collaborative activities with a precise record of event timing provide an unprecedented opportunity. Our empirical analysis of the history of millions of updates in Wikipedia shows a universal double power-law distribution of time intervals between consecutive updates of an article. We then propose a generic model to unfold collaborative human activities into three modules: (i) individual behavior characterized by Poissonian initiation of an action, (ii) human interaction captured by a cascading response to previous actions with a power-law waiting time, and (iii) population growth due to the increasing number of interacting individuals. This unfolding allows us to obtain an analytical formula that is fully supported by the universal patterns in empirical data. Our modeling approach reveals “simplicity” beyond complex interacting human activities. PMID:27911766
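As an illustration of the three-module decomposition described in the abstract above, the following sketch simulates update times with Poissonian initiation, power-law cascading responses, and a growing editor population. The rate constants, response probability, and power-law exponent are illustrative assumptions, not values fitted in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_updates(t_max=1000.0, lam0=0.05, growth=0.002,
                     p_respond=0.5, alpha=2.5, t_min=1.0):
    """Toy simulation of collaborative update times:
    (i)   active editors initiate edits as a Poisson process,
    (ii)  every edit may trigger follow-up edits after power-law waiting times,
    (iii) the number of active editors grows slowly in time.
    All parameter values are illustrative.
    """
    events, t = [], 0.0
    while True:
        n_editors = 1.0 + growth * t                         # module (iii): population growth
        t += rng.exponential(1.0 / (lam0 * n_editors))       # module (i): Poissonian initiation
        if t >= t_max:
            break
        events.append(t)
        t_cascade = t
        while rng.random() < p_respond:                      # module (ii): cascading response
            t_cascade += t_min * (1.0 - rng.random()) ** (-1.0 / (alpha - 1.0))
            if t_cascade >= t_max:
                break
            events.append(t_cascade)
    return np.sort(np.array(events))

intervals = np.diff(simulate_updates())   # inter-update intervals; their histogram can be
print(intervals.size, intervals.mean())   # compared against the double power law reported above
```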
Novel ways of improving communication with members of health professional associations.
Chaudhary, Pushpa; Tuladhar, Heera
2014-10-01
The International Federation of Gynecology and Obstetrics (FIGO) supported the Nepal Society of Obstetricians and Gynaecologists (NESOG) to help influence national health policy and practice through FIGO's Leadership in Obstetrics and Gynecology for Impact and Change (LOGIC) Initiative in Maternal and Newborn Health. An Organizational Capacity Improvement Framework, developed by the Society of Obstetricians and Gynaecologists of Canada (SOGC), was used to evaluate NESOG's initial baseline organizational capacity in 2010. Communication among NESOG members was rated as moderate (39%). Several initiatives, such as the use of high-speed internet access, group SMS texts and emails for information sharing, member profile updates, use of social media, and regular updates to the NESOG website were examples of interventions that resulted in improved participation of members in NESOG's activities. Members were impressively active in reciprocating via Facebook, and via participation in online voting in the NESOG elections (84%). Copyright © 2014. Published by Elsevier Ireland Ltd.
U.S. Selected Practice Recommendations for Contraceptive Use, 2016.
Curtis, Kathryn M; Jatlaoui, Tara C; Tepper, Naomi K; Zapata, Lauren B; Horton, Leah G; Jamieson, Denise J; Whiteman, Maura K
2016-07-29
The 2016 U.S. Selected Practice Recommendations for Contraceptive Use (U.S. SPR) addresses a select group of common, yet sometimes controversial or complex, issues regarding initiation and use of specific contraceptive methods. These recommendations for health care providers were updated by CDC after review of the scientific evidence and consultation with national experts who met in Atlanta, Georgia, during August 26-28, 2015. The information in this report updates the 2013 U.S. SPR (CDC. U.S. selected practice recommendations for contraceptive use, 2013. MMWR 2013;62[No. RR-5]). Major updates include 1) revised recommendations for starting regular contraception after the use of emergency contraceptive pills and 2) new recommendations for the use of medications to ease insertion of intrauterine devices. The recommendations in this report are intended to serve as a source of clinical guidance for health care providers and provide evidence-based guidance to reduce medical barriers to contraception access and use. Health care providers should always consider the individual clinical circumstances of each person seeking family planning services. This report is not intended to be a substitute for professional medical advice for individual patients. Persons should seek advice from their health care providers when considering family planning options.
Personalised physical exercise regime for chronic patients through a wearable ICT platform.
Angelidis, Pantelis A
2010-01-01
Today's state of the art in exercise physiology, professional athletics and sports practice in general clearly shows that the best results depend on the personalisation and continuous updating of the recommendations provided to an athlete in training, a sports enthusiast or a person whose medical condition demands regular physical exercise. The vital signs information gathered in telemonitoring systems can be better evaluated and exploited if processed along with data from the subject's electronic health records, training history and performance statistics. In this context, the current paper intends to exploit modern smart miniaturised systems and advanced information systems towards the development of an infrastructure for continuous, non-invasive acquisition and advanced processing of vital signs information. In particular, it will look into wearable electronics embedded in textiles, capable of performing regular or exceptional measurements of vital physiological parameters and communicating them to an application server for further processing.
Sparse Coding and Counting for Robust Visual Tracking
Liu, Risheng; Wang, Jing; Shang, Xiaoke; Wang, Yiyang; Su, Zhixun; Cai, Yu
2016-01-01
In this paper, we propose a novel sparse coding and counting method under a Bayesian framework for visual tracking. In contrast to existing methods, the proposed method employs a combination of the L0 and L1 norms to regularize the linear coefficients of an incrementally updated linear basis. The sparsity constraint enables the tracker to handle difficult challenges, such as occlusion or image corruption, effectively. To achieve real-time processing, we propose a fast and efficient numerical algorithm for solving the proposed model. Although it is an NP-hard problem, the proposed accelerated proximal gradient (APG) approach is guaranteed to converge to a solution quickly. In addition, we provide a closed-form solution for the combined L0 and L1 regularized representation to obtain better sparsity. Experimental results on challenging video sequences demonstrate that the proposed method achieves state-of-the-art results in both accuracy and speed. PMID:27992474
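For a concrete picture of what a combined L0 plus L1 penalty does inside a proximal-gradient (APG-style) step, here is a small elementwise sketch. It is one standard way to evaluate such a proximal operator, not necessarily the closed-form solution derived in the paper; the penalty weights lam1 and lam0 are illustrative.

```python
import numpy as np

def prox_l1_l0(y, lam1, lam0):
    """Elementwise proximal operator of lam1*|x| + lam0*1{x != 0}.

    Soft-threshold for the L1 part, then keep the result only if it beats
    setting the coefficient exactly to zero (the L0 part).
    """
    z = np.sign(y) * np.maximum(np.abs(y) - lam1, 0.0)
    keep = 0.5 * (y - z) ** 2 + lam1 * np.abs(z) + lam0 < 0.5 * y ** 2
    return np.where(keep & (z != 0.0), z, 0.0)

# Inside an APG iteration this would be applied to the gradient-updated point,
# with lam1 and lam0 scaled by the step size.
y = np.array([-3.0, -0.4, 0.05, 0.2, 1.5])
print(prox_l1_l0(y, lam1=0.3, lam0=0.1))   # -> [-2.7  0.   0.   0.   1.2]
```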
Adult smokers in Colombia: Who isn’t giving it up
Storr, Carla L.; Cheng, Hui; Posada-Villa, Jose; Aguilar-Gaxiola, Sergio; Anthony, James C.
2008-01-01
Without ongoing surveillance systems to assess tobacco product demand and exposure levels, many low- and middle-income countries monitor smoking via periodic cross-sectional surveys. In this article, we seek to update estimates for the prevalence of adult smoking in Colombia and contribute additional information useful for tobacco control initiatives. Data are from the 2003 Colombian National Study of Mental Health (NSMH). A national probability sample of 4,426 adults (age 18-65) was assessed via a computer-assisted interview. An estimated 49% of the adult population had smoked at least once in their lifetimes; one in three adults (31%) had smoked regularly. Nearly half of regular smokers had been able to quit (44%; 95% CI = 40-48). Several personal and smoking-related characteristics were associated with failing to quit: younger age, being employed (as compared with being a homemaker), and a history of daily use. Quitters and non-quitters were equivalent with respect to sex, educational status, and age of smoking onset. In conclusion, our findings describe the characteristics of regular smokers in Colombia and identify subgroups of non-quitters that may help guide tobacco control activities. PMID:18006241
Land cover classification of VHR airborne images for citrus grove identification
NASA Astrophysics Data System (ADS)
Amorós López, J.; Izquierdo Verdiguier, E.; Gómez Chova, L.; Muñoz Marí, J.; Rodríguez Barreiro, J. Z.; Camps Valls, G.; Calpe Maravilla, J.
Managing land resources using remote sensing techniques is becoming common practice. However, data analysis procedures should satisfy the high accuracy levels demanded by users (public or private companies and governments) in order to be extensively used. This paper presents a multi-stage classification scheme to update the citrus Geographical Information System (GIS) of the Comunidad Valenciana region (Spain). Spain is the leading citrus fruit producer in Europe and the fourth largest in the world. In particular, citrus fruits represent 67% of the agricultural production in this region, with a total production of 4.24 million tons (2006-2007 campaign). The citrus GIS inventory, created in 2001, needs to be regularly updated in order to monitor changes quickly enough and allow appropriate policy making and citrus production forecasting. Automatic methods are proposed in this work to facilitate this update; the processing scheme is summarized as follows. First, an object-oriented feature extraction process is carried out for each cadastral parcel from very high spatial resolution aerial images (0.5 m). Next, several automatic classifiers (decision trees, artificial neural networks, and support vector machines) are trained and combined to improve the final classification accuracy. Finally, the citrus GIS is automatically updated if a high enough level of confidence, based on the agreement between classifiers, is achieved. This is the case for 85% of the parcels, and accuracy results exceed 94%. The remaining parcels are classified by expert photo-interpreters in order to guarantee the high accuracy demanded by policy makers.
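A condensed sketch of the agreement-based update idea from the abstract above: several classifiers are trained on parcel features, and a parcel is updated automatically only when they all agree, with disagreements left to photo-interpreters. The synthetic features, model settings, and the unanimity rule are illustrative assumptions, not the paper's exact pipeline.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

# stand-in for object-oriented parcel features and reference labels
X, y = make_classification(n_samples=2000, n_features=20, n_informative=8,
                           n_classes=2, random_state=0)
X_tr, X_new, y_tr, _ = train_test_split(X, y, test_size=0.3, random_state=0)

models = [DecisionTreeClassifier(random_state=0),
          MLPClassifier(max_iter=500, random_state=0),
          SVC(random_state=0)]
preds = np.array([m.fit(X_tr, y_tr).predict(X_new) for m in models])

agree = (preds == preds[0]).all(axis=0)       # all classifiers give the same label
auto_update = preds[0][agree]                 # parcels updated automatically in the GIS
needs_expert = np.flatnonzero(~agree)         # parcels sent to photo-interpretation
print(f"auto-updated: {agree.mean():.0%}, to experts: {len(needs_expert)}")
```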
Regulatory sequence analysis tools.
van Helden, Jacques
2003-07-01
The web resource Regulatory Sequence Analysis Tools (RSAT) (http://rsat.ulb.ac.be/rsat) offers a collection of software tools dedicated to the prediction of regulatory sites in non-coding DNA sequences. These tools include sequence retrieval, pattern discovery, pattern matching, genome-scale pattern matching, feature-map drawing, random sequence generation and other utilities. Alternative formats are supported for the representation of regulatory motifs (strings or position-specific scoring matrices) and several algorithms are proposed for pattern discovery. RSAT currently holds >100 fully sequenced genomes and these data are regularly updated from GenBank.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1982-10-01
The study was requested by the city to provide information reflecting current flood conditions in order for the community to better administer its floodplain management program and to qualify for participation in the regular phase of the National Flood Insurance Program (NFIP). This report updates and expands the coverage of a previous TVA report published in April 1967. Profiles and flooded area and floodway maps are provided for Roseberry Creek, Wacker Branch, and three previously unstudied tributaries to Roseberry Creek.
The national elevation data set
Gesch, Dean B.; Oimoen, Michael J.; Greenlee, Susan K.; Nelson, Charles A.; Steuck, Michael J.; Tyler, Dean J.
2002-01-01
The NED is a seamless raster dataset from the USGS that fulfills many of the concepts of framework geospatial data as envisioned for the NSDI, allowing users to focus on analysis rather than data preparation. It is regularly maintained and updated, and it provides basic elevation data for many GIS applications. The NED is one of several seamless datasets that the USGS is making available through the Web. The techniques and approaches developed for producing, maintaining, and distributing the NED are the type that will be used for implementing the USGS National Map (http://nationalmap.usgs.gov/).
Lang, C; Kolaj-Robin, O; Cirefice, G; Taconet, L; Pel, E; Jouette, S; Buda, M; Milne, C; Charton, E
2018-01-01
Since the opening for signature of the European Convention for the Protection of Animals Used for Experimental and Other Scientific Purposes in 1986, the European Pharmacopoeia Commission and its experts have carried out a programme of work committed to Replacing, Reducing and Refining (3Rs) the use of animals for test purposes. While updates on achievements in the field of the 3Rs are regularly provided, this article summarises the activities of the Ph. Eur. Commission in this field within the last decade.
e-Learning development in medical physics and engineering
Tabakov, S
2008-01-01
Medical Physics and Engineering was among the first professions to develop and apply e-Learning (e-L). The profession provides excellent background for application of simulations and other e-L materials. The paper describes several layers for e-L development: Programming specific simulations; Building e-L modules; Development of e-L web-based programmes. The paper shows examples from these layers and outlines their specificities. At the end, the newest e-L development (project EMITEL) is briefly introduced and the necessity of a regularly updated list of e-L activities is emphasised. PMID:21614312
Numerical solution of inverse scattering for near-field optics.
Bao, Gang; Li, Peijun
2007-06-01
A novel regularized recursive linearization method is developed for a two-dimensional inverse medium scattering problem that arises in near-field optics, which reconstructs the scatterer of an inhomogeneous medium located on a substrate from data accessible through photon scanning tunneling microscopy experiments. Based on multiple frequency scattering data, the method starts from the Born approximation corresponding to weak scattering at a low frequency, and each update is obtained by continuation on the wavenumber from solutions of one forward problem and one adjoint problem of the Helmholtz equation.
Lougheed, M Diane; Lemiere, Catherine; Ducharme, Francine M; Licskai, Chris; Dell, Sharon D; Rowe, Brian H; FitzGerald, Mark; Leigh, Richard; Watson, Wade; Boulet, Louis-Philippe
2012-01-01
BACKGROUND: In 2010, the Canadian Thoracic Society (CTS) published a Consensus Summary for the diagnosis and management of asthma in children six years of age and older, and adults, including an updated Asthma Management Continuum. The CTS Asthma Clinical Assembly subsequently began a formal clinical practice guideline update process, focusing, in this first iteration, on topics of controversy and/or gaps in the previous guidelines. METHODS: Four clinical questions were identified as a focus for the updated guideline: the role of noninvasive measurements of airway inflammation for the adjustment of anti-inflammatory therapy; the initiation of adjunct therapy to inhaled corticosteroids (ICS) for uncontrolled asthma; the role of a single inhaler of an ICS/long-acting beta2-agonist combination as a reliever, and as a reliever and a controller; and the escalation of controller medication for acute loss of asthma control as part of a self-management action plan. The expert panel followed an adaptation process to identify and appraise existing guidelines on the specified topics. In addition, literature searches were performed to identify relevant systematic reviews and randomized controlled trials. The panel formally assessed and graded the evidence, and made 34 recommendations. RESULTS: The updated guideline recommendations outline a role for inclusion of assessment of sputum eosinophils, in addition to standard measures of asthma control, to guide adjustment of controller therapy in adults with moderate to severe asthma. Appraisal of the evidence regarding which adjunct controller therapy to add to ICS and at what ICS dose to begin adjunct therapy in children and adults with poor asthma control supported the 2010 CTS Consensus Summary recommendations. New recommendations for the adjustment of controller medication within written action plans are provided. Finally, priority areas for future research were identified. CONCLUSIONS: The present clinical practice guideline is the first update of the CTS Asthma Guidelines following the Canadian Respiratory Guidelines Committee’s new guideline development process. Tools and strategies to support guideline implementation will be developed and the CTS will continue to regularly provide updates reflecting new evidence. PMID:22536582
A summary of the new GINA strategy: a roadmap to asthma control
Bateman, Eric D.; Becker, Allan; Boulet, Louis-Philippe; Cruz, Alvaro A.; Drazen, Jeffrey M.; Haahtela, Tari; Hurd, Suzanne S.; Inoue, Hiromasa; de Jongste, Johan C.; Lemanske, Robert F.; Levy, Mark L.; O'Byrne, Paul M.; Paggiaro, Pierluigi; Pedersen, Soren E.; Pizzichini, Emilio; Soto-Quiroz, Manuel; Szefler, Stanley J.; Wong, Gary W.K.; FitzGerald, J. Mark
2015-01-01
Over the past 20 years, the Global Initiative for Asthma (GINA) has regularly published and annually updated a global strategy for asthma management and prevention that has formed the basis for many national guidelines. However, uptake of existing guidelines is poor. A major revision of the GINA report was published in 2014, and updated in 2015, reflecting an evolving understanding of heterogeneous airways disease, a broader evidence base, increasing interest in targeted treatment, and evidence about effective implementation approaches. During development of the report, the clinical utility of recommendations and strategies for their practical implementation were considered in parallel with the scientific evidence. This article provides a summary of key changes in the GINA report, and their rationale. The changes include a revised asthma definition; tools for assessing symptom control and risk factors for adverse outcomes; expanded indications for inhaled corticosteroid therapy; a framework for targeted treatment based on phenotype, modifiable risk factors, patient preference, and practical issues; optimisation of medication effectiveness by addressing inhaler technique and adherence; revised recommendations about written asthma action plans; diagnosis and initial treatment of the asthma−chronic obstructive pulmonary disease overlap syndrome; diagnosis in wheezing pre-school children; and updated strategies for adaptation and implementation of GINA recommendations. PMID:26206872
Face Transplantation: An Update for the United States Trauma System.
Farber, Scott J; Kantar, Rami S; Diaz-Siso, J Rodrigo; Rodriguez, Eduardo D
2018-05-15
Face transplantation has evolved over the last 12 years into a safe and feasible reconstructive solution, with good aesthetic and functional outcomes for patients with severe facial defects who are not amenable to reconstruction through conventional and autologous approaches. Among patients who underwent face transplantation to date, a significant proportion did so following trauma, mostly ballistic and thermal injuries. It is therefore important for trauma surgeons who deal with these injuries regularly to be familiar with the literature on face transplantation following traumatic injuries. In this study, we provide a focused review on this topic, with an emphasis on highlighting the limitations of conventional craniomaxillofacial reconstruction, while emphasizing data available on the risks, benefits, surgical indications, contraindications, as well as aesthetic and functional outcomes of face transplantation. The authors also provide an update on all face transplants performed to date including traumatic mechanisms of injury, and extent of defects. They finally describe 2 cases performed by the senior author for patients presenting with devastating facial ballistic and thermal injuries. The authors hope that this work serves as an update for the trauma surgery community regarding the current role and limitations of face transplantation as a craniomaxillofacial reconstructive option for their patient population. This can potentially expedite the reconstructive process for patients who may benefit from face transplantation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chow, Edward, E-mail: Edward.Chow@sunnybrook.ca; Hoskin, Peter; Mitera, Gunita
2012-04-01
Purpose: To update the international consensus on palliative radiotherapy endpoints for future clinical trials in bone metastases by surveying international experts regarding previous uncertainties within the 2002 consensus and changes that may be necessary based on practice pattern changes and research findings since that time. Methods and Materials: A two-phase survey was used to determine revisions and new additions to the 2002 consensus. A total of 49 experts from the American Society for Radiation Oncology, the European Society for Therapeutic Radiology and Oncology, the Faculty of Radiation Oncology of the Royal Australian and New Zealand College of Radiologists, and the Canadian Association of Radiation Oncology who are directly involved in the care of patients with bone metastases participated in this survey. Results: Consensus was established in areas involving response definitions, eligibility criteria for future trials, reirradiation, changes in systemic therapy, radiation techniques, parameters at follow-up, and timing of assessments. Conclusion: An outline for trials in bone metastases was updated based on the survey and consensus. Investigators leading trials in bone metastases are encouraged to adopt the revised guideline to promote consistent reporting. Areas for future research were identified. It is intended for the consensus to be re-examined in the future on a regular basis.
Kunz, Matthew Ross; Ottaway, Joshua; Kalivas, John H; Georgiou, Constantinos A; Mousdis, George A
2011-02-23
Detecting and quantifying extra virgin olive oil adulteration is of great importance to the olive oil industry. Many spectroscopic methods in conjunction with multivariate analysis have been used to address these issues. However, successes to date are limited because calibration models are built for a specific set of geographical regions, growing seasons, cultivars, and oil extraction methods (the composite primary condition). Samples from new geographical regions, growing seasons, etc. (secondary conditions) are not always correctly predicted by the primary model because the olive oil and/or adulterant compositions stemming from the secondary conditions do not match the primary conditions. Three Tikhonov regularization (TR) variants are used in this paper to allow adulterant (sunflower oil) concentration predictions in samples from geographical regions not part of the original primary calibration domain. Of the three TR variants, ridge regression with an additional 2-norm penalty provides the smallest validation sample prediction errors. Although the paper reports on using TR for model updating to predict adulterant oil concentration, the methods should also be applicable to updating models that distinguish adulterated samples from pure extra virgin olive oil. Additionally, the approaches are general and can be used with other spectroscopic methods and adulterants as well as with other agricultural products.
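As a rough picture of the kind of Tikhonov-regularization-based model updating described above, the sketch below augments a primary calibration set with a few weighted secondary-condition samples and adds a ridge (2-norm) penalty on the regression vector. The stacking, the weights lam and tau, and the synthetic data are assumptions for illustration, not the specific TR variants evaluated in the paper.

```python
import numpy as np

def tr_update(X_primary, y_primary, X_secondary, y_secondary, lam=1.0, tau=5.0):
    """Stacked least-squares realization of ridge-style model updating."""
    n_feat = X_primary.shape[1]
    X_aug = np.vstack([X_primary,
                       tau * X_secondary,            # weighted secondary-condition samples
                       lam * np.eye(n_feat)])        # ridge (2-norm) penalty rows
    y_aug = np.concatenate([y_primary,
                            tau * y_secondary,
                            np.zeros(n_feat)])
    b, *_ = np.linalg.lstsq(X_aug, y_aug, rcond=None)
    return b

rng = np.random.default_rng(2)
Xp, yp = rng.standard_normal((80, 50)), rng.standard_normal(80)   # primary calibration set
Xs, ys = rng.standard_normal((5, 50)), rng.standard_normal(5)     # few secondary samples
b = tr_update(Xp, yp, Xs, ys)
print(b.shape)
```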
Using Tranformation Group Priors and Maximum Relative Entropy for Bayesian Glaciological Inversions
NASA Astrophysics Data System (ADS)
Arthern, R. J.; Hindmarsh, R. C. A.; Williams, C. R.
2014-12-01
One of the key advances that has allowed better simulations of the large ice sheets of Greenland and Antarctica has been the use of inverse methods. These have allowed poorly known parameters such as the basal drag coefficient and ice viscosity to be constrained using a wide variety of satellite observations. Inverse methods used by glaciologists have broadly followed one of two related approaches. The first is minimization of a cost function that describes the misfit to the observations, often accompanied by some kind of explicit or implicit regularization that promotes smallness or smoothness in the inverted parameters. The second approach is a probabilistic framework that makes use of Bayes' theorem to update prior assumptions about the probability of parameters, making use of data with known error estimates. Both approaches have much in common and questions of regularization often map onto implicit choices of prior probabilities that are made explicit in the Bayesian framework. In both approaches questions can arise that seem to demand subjective input. What should the functional form of the cost function be if there are alternatives? What kind of regularization should be applied, and how much? How should the prior probability distribution for a parameter such as basal slipperiness be specified when we know so little about the details of the subglacial environment? Here we consider some approaches that have been used to address these questions and discuss ways that probabilistic prior information used for regularizing glaciological inversions might be specified with greater objectivity.
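One standard identity makes the regularization-to-prior correspondence sketched above concrete (it is generic, not specific to any particular glaciological inversion): with Gaussian observational errors of covariance $C_d$ and a Gaussian prior on the parameters $m$ with mean $m_0$ and covariance $C_m$, maximizing the posterior is equivalent to minimizing

$$ J(m) = \tfrac{1}{2}\,(F(m)-d)^{\mathsf T} C_d^{-1}\,(F(m)-d) + \tfrac{1}{2}\,(m-m_0)^{\mathsf T} C_m^{-1}\,(m-m_0), \qquad p(m \mid d) \propto e^{-J(m)}, $$

so a Tikhonov-style penalty $\tfrac{\lambda}{2}\|m-m_0\|^2$ added to the data misfit corresponds to the implicit Gaussian prior $m \sim \mathcal{N}(m_0, \lambda^{-1} I)$.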
Cyber-T web server: differential analysis of high-throughput data.
Kayala, Matthew A; Baldi, Pierre
2012-07-01
The Bayesian regularization method for high-throughput differential analysis, described in Baldi and Long (A Bayesian framework for the analysis of microarray expression data: regularized t-test and statistical inferences of gene changes. Bioinformatics 2001;17:509-519) and implemented in the Cyber-T web server, is one of the most widely validated. Cyber-T implements a t-test using a Bayesian framework to compute a regularized variance of the measurements associated with each probe under each condition. This regularized estimate is derived by flexibly combining the empirical measurements with a prior, or background, derived from pooling measurements associated with probes in the same neighborhood. This approach flexibly addresses problems associated with low replication levels and technology biases, not only for DNA microarrays, but also for other technologies, such as protein arrays, quantitative mass spectrometry and next-generation sequencing (RNA-seq). Here we present an update to the Cyber-T web server, incorporating several useful new additions and improvements. Several preprocessing data normalization options, including logarithmic and variance stabilizing normalization (VSN) transforms, are included. To augment two-sample t-tests, a one-way analysis of variance is implemented. Several methods for multiple testing correction, including standard frequentist methods and a probabilistic mixture model treatment, are available. Diagnostic plots allow visual assessment of the results. The web server provides comprehensive documentation and example data sets. The Cyber-T web server, with R source code and data sets, is publicly available at http://cybert.ics.uci.edu/.
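A rough sketch of the regularized-variance idea behind the t-test described above: the empirical per-gene variance is blended with a background variance estimated from probes with similar mean expression. The blending formula follows the commonly quoted Baldi-Long form, and the prior strength nu0 and the neighborhood window are assumptions, not a reimplementation of the web server.

```python
import numpy as np

def regularized_tstat(x_a, x_b, nu0=10, window=101):
    """x_a, x_b: (genes, replicates) expression arrays for two conditions."""
    def reg_var(x):
        n = x.shape[1]
        s2 = x.var(axis=1, ddof=1)
        order = np.argsort(x.mean(axis=1))        # probes sorted by mean expression
        bg = np.empty_like(s2)
        half = window // 2
        for rank, g in enumerate(order):          # local background (prior) variance
            lo, hi = max(0, rank - half), min(len(order), rank + half + 1)
            bg[g] = s2[order[lo:hi]].mean()
        # regularized variance: prior blended with the empirical variance
        return (nu0 * bg + (n - 1) * s2) / (nu0 + n - 2), n
    va, na = reg_var(x_a)
    vb, nb = reg_var(x_b)
    return (x_a.mean(axis=1) - x_b.mean(axis=1)) / np.sqrt(va / na + vb / nb)

rng = np.random.default_rng(3)
t = regularized_tstat(rng.normal(0.0, 1.0, (5000, 3)), rng.normal(0.2, 1.0, (5000, 3)))
print(t[:5])
```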
NASA Astrophysics Data System (ADS)
Hoeksema, J. T.; Baldner, C. S.; Bush, R. I.; Schou, J.; Scherrer, P. H.
2018-03-01
The Helioseismic and Magnetic Imager (HMI) instrument is a major component of NASA's Solar Dynamics Observatory (SDO) spacecraft. Since commencement of full regular science operations on 1 May 2010, HMI has operated with remarkable continuity, e.g. during the more than five years of the SDO prime mission that ended 30 September 2015, HMI collected 98.4% of all possible 45-second velocity maps; minimizing gaps in these full-disk Dopplergrams is crucial for helioseismology. HMI velocity, intensity, and magnetic-field measurements are used in numerous investigations, so understanding the quality of the data is important. This article describes the calibration measurements used to track the performance of the HMI instrument, and it details trends in important instrument parameters during the prime mission. Regular calibration sequences provide information used to improve and update the calibration of HMI data. The set-point temperature of the instrument front window and optical bench is adjusted regularly to maintain instrument focus, and changes in the temperature-control scheme have been made to improve stability in the observable quantities. The exposure time has been changed to compensate for a 20% decrease in instrument throughput. Measurements of the performance of the shutter and tuning mechanisms show that they are aging as expected and continue to perform according to specification. Parameters of the tunable optical-filter elements are regularly adjusted to account for drifts in the central wavelength. Frequent measurements of changing CCD-camera characteristics, such as gain and flat field, are used to calibrate the observations. Infrequent expected events such as eclipses, transits, and spacecraft off-points interrupt regular instrument operations and provide the opportunity to perform additional calibration. Onboard instrument anomalies are rare and seem to occur quite uniformly in time. The instrument continues to perform very well.
An adaptive regularization parameter choice strategy for multispectral bioluminescence tomography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Feng Jinchao; Qin Chenghu; Jia Kebin
2011-11-15
Purpose: Bioluminescence tomography (BLT) provides an effective tool for monitoring physiological and pathological activities in vivo. However, the measured data in bioluminescence imaging are corrupted by noise. Therefore, regularization methods are commonly used to find a regularized solution. Nevertheless, for the quality of the reconstructed bioluminescent source obtained by regularization methods, the choice of the regularization parameters is crucial. To date, the selection of regularization parameters remains challenging. With regard to the above problems, the authors proposed a BLT reconstruction algorithm with an adaptive parameter choice rule. Methods: The proposed reconstruction algorithm uses a diffusion equation for modeling the bioluminescent photon transport. The diffusion equation is solved with a finite element method. Computed tomography (CT) images provide anatomical information regarding the geometry of the small animal and its internal organs. To reduce the ill-posedness of BLT, spectral information and the optimal permissible source region are employed. Then, the relationship between the unknown source distribution and multiview and multispectral boundary measurements is established based on the finite element method and the optimal permissible source region. Since the measured data are noisy, the BLT reconstruction is formulated with an ℓ2 data fidelity term and a general regularization term. When choosing the regularization parameters for BLT, an efficient model function approach is proposed, which does not require knowledge of the noise level. This approach only requires the computation of the residual and regularized solution norm. With this knowledge, we construct the model function to approximate the objective function, and the regularization parameter is updated iteratively. Results: First, a micro-CT-based mouse phantom was used for simulation verification. Simulation experiments were used to illustrate why multispectral data were used rather than monochromatic data. Furthermore, the study conducted using an adaptive regularization parameter demonstrated our ability to accurately localize the bioluminescent source. With the adaptively estimated regularization parameter, the reconstructed center position of the source was (20.37, 31.05, 12.95) mm, and the distance to the real source was 0.63 mm. The results of the dual-source experiments further showed that our algorithm could localize the bioluminescent sources accurately. The authors then presented experimental evidence that the proposed algorithm exhibited superior computational efficiency over the heuristic method. The effectiveness of the new algorithm was also confirmed by comparing it with the L-curve method. Furthermore, various initial guesses for the regularization parameter were used to illustrate the convergence of our algorithm. Finally, an in vivo mouse experiment further illustrates the effectiveness of the proposed algorithm. Conclusions: Utilizing numerical, physical phantom and in vivo examples, we demonstrated that the bioluminescent sources could be reconstructed accurately with automatic regularization parameters. The proposed algorithm exhibited superior performance compared with both the heuristic regularization parameter choice method and the L-curve method in terms of computational speed and localization error.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dall'Anese, Emiliano; Simonetto, Andrea; Dhople, Sairaj
This paper focuses on power distribution networks featuring inverter-interfaced distributed energy resources (DERs), and develops feedback controllers that drive the DER output powers to solutions of time-varying AC optimal power flow (OPF) problems. Control synthesis is grounded on primal-dual-type methods for regularized Lagrangian functions, as well as linear approximations of the AC power-flow equations. Convergence and OPF-solution-tracking capabilities are established while acknowledging: i) communication-packet losses, and ii) partial updates of control signals. The latter case is particularly relevant since it enables asynchronous operation of the controllers where DER setpoints are updated at a fast time scale based on local voltage measurements, and information on the network state is utilized if and when available, based on communication constraints. As an application, the paper considers distribution systems with high photovoltaic integration, and demonstrates that the proposed framework provides fast voltage-regulation capabilities, while enabling the near real-time pursuit of solutions of AC OPF problems.
Bruland, Philipp; Doods, Justin; Storck, Michael; Dugas, Martin
2017-01-01
Data dictionaries provide structural meta-information about data definitions in health information technology (HIT) systems. In this regard, reusing healthcare data for secondary purposes offers several advantages (e.g. reduced documentation times or increased data quality). Prerequisites for data reuse are its quality, availability and identical meaning of data. In diverse projects, research data warehouses serve as core components between heterogeneous clinical databases and various research applications. Given the complexity (high number of data elements) and dynamics (regular updates) of electronic health record (EHR) data structures, we propose a clinical metadata warehouse (CMDW) based on a metadata registry standard. Metadata of two large hospitals were automatically inserted into two CMDWs containing 16,230 forms and 310,519 data elements. Automatic updates of metadata are possible as well as semantic annotations. A CMDW allows metadata discovery, data quality assessment and similarity analyses. Common data models for distributed research networks can be established based on similarity analyses.
Breast cancer in the 21st century: from early detection to new therapies.
Merino Bonilla, J A; Torres Tabanera, M; Ros Mendoza, L H
Analysis of the causes behind the change in trend in breast cancer incidence and mortality rates over the last few decades yields important insights into the role of breast screening, the regular application of adjuvant therapies and changing risk factors. The benefits of early detection have been accompanied by certain adverse effects, including an excessive number of prophylactic mastectomies. Recently, several updates have been published on the recommendations for breast cancer screening at an international level. On the other hand, advances in genomics have made it possible to establish a new molecular classification of breast cancer. Our aim is to present an updated overview of the epidemiological situation of breast cancer, as well as some relevant issues from the point of view of diagnosis, such as molecular classification and different strategies for both population-based and opportunistic screening. Copyright © 2017 SERAM. Publicado por Elsevier España, S.L.U. All rights reserved.
Bayesian image reconstruction for improving detection performance of muon tomography.
Wang, Guobao; Schultz, Larry J; Qi, Jinyi
2009-05-01
Muon tomography is a novel technology that is being developed for detecting high-Z materials in vehicles or cargo containers. Maximum likelihood methods have been developed for reconstructing the scattering density image from muon measurements. However, the instability of maximum likelihood estimation often results in noisy images and low detectability of high-Z targets. In this paper, we propose using regularization to improve the image quality of muon tomography. We formulate the muon reconstruction problem in a Bayesian framework by introducing a prior distribution on scattering density images. An iterative shrinkage algorithm is derived to maximize the log posterior distribution. At each iteration, the algorithm obtains the maximum a posteriori update by shrinking an unregularized maximum likelihood update. Inverse quadratic shrinkage functions are derived for generalized Laplacian priors and inverse cubic shrinkage functions are derived for generalized Gaussian priors. Receiver operating characteristic studies using simulated data demonstrate that the Bayesian reconstruction can greatly improve the detection performance of muon tomography.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, J; Tsui, B; Noo, F
Purpose: To develop a feature-preserving model-based image reconstruction (MBIR) method that improves performance in pancreatic lesion classification at equal or reduced radiation dose. Methods: A set of pancreatic lesion models was created with both benign and premalignant lesion types. These two classes of lesions are distinguished by their fine internal structures; their delineation is therefore crucial to the task of pancreatic lesion classification. To reduce image noise while preserving the features of the lesions, we developed an MBIR method with curvature-based regularization. The novel regularization encourages formation of smooth surfaces that model both the exterior shape and the internal features of pancreatic lesions. Given that the curvature depends on the unknown image, image reconstruction or denoising becomes a non-convex optimization problem; to address this issue an iterative-reweighting scheme was used to calculate and update the curvature using the image from the previous iteration. Evaluation was carried out with insertion of the lesion models into the pancreas of a patient CT image. Results: Visual inspection was used to compare conventional TV regularization with our curvature-based regularization. Several penalty strengths were considered for TV regularization, all of which resulted in erasing portions of the septation (thin partition) in a premalignant lesion. At matched noise variance (50% noise reduction in the patient stomach region), the connectivity of the septation was well preserved using the proposed curvature-based method. Conclusion: The curvature-based regularization is able to reduce image noise while simultaneously preserving the lesion features. This method could potentially improve task performance for pancreatic lesion classification at equal or reduced radiation dose. The result is of high significance for longitudinal surveillance studies of patients with pancreatic cysts, which may develop into pancreatic cancer. The Senior Author receives financial support from Siemens GmbH Healthcare.
Metastable Behavior for Bootstrap Percolation on Regular Trees
NASA Astrophysics Data System (ADS)
Biskup, Marek; Schonmann, Roberto H.
2009-08-01
We examine bootstrap percolation on a regular $(b+1)$-ary tree with initial law given by Bernoulli($p$). The sites are updated according to the usual rule: a vacant site becomes occupied if it has at least $\theta$ occupied neighbors, occupied sites remain occupied forever. It is known that, when $b > \theta \ge 2$, the limiting density $q = q(p)$ of occupied sites exhibits a jump at some $p_T = p_T(b, \theta) \in (0,1)$ from $q_T := q(p_T) < 1$ to $q(p) = 1$ when $p > p_T$. We investigate the metastable behavior associated with this transition. Explicitly, we pick $p = p_T + h$ with $h > 0$ and show that, as $h \downarrow 0$, the system lingers around the “critical” state for time of order $h^{-1/2}$ and then passes to the fully occupied state in time $O(1)$. The law of the entire configuration observed when the occupation density is $q \in (q_T, 1)$ converges, as $h \downarrow 0$, to a well-defined measure.
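A small numeric sketch of the setting above: on a rooted $b$-ary tree, a vertex ends up occupied if it was initially occupied (probability $p$) or if at least $\theta$ of its $b$ children become occupied, which gives the standard recursion iterated below to its smallest fixed point. It makes the discontinuous jump in the occupation density visible (for $b = 3$, $\theta = 2$ the jump in this rooted recursion sits near $p \approx 0.11$); it does not reproduce the paper's metastability analysis, and the scanned $p$ values are illustrative.

```python
from math import comb

def occupied_density(p, b=3, theta=2, max_iter=100000, tol=1e-12):
    """Smallest fixed point of x = p + (1 - p) * P[Binomial(b, x) >= theta]."""
    x = p
    for _ in range(max_iter):
        tail = sum(comb(b, k) * x**k * (1 - x)**(b - k) for k in range(theta, b + 1))
        x_new = p + (1 - p) * tail
        if abs(x_new - x) < tol:
            break
        x = x_new
    return x

for p in (0.08, 0.10, 0.11, 0.112, 0.13):     # scan across the jump (values illustrative)
    print(f"p = {p:.3f} -> q(p) ≈ {occupied_density(p):.4f}")
```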
A New Continuous-Time Equality-Constrained Optimization to Avoid Singularity.
Quan, Quan; Cai, Kai-Yuan
2016-02-01
In equality-constrained optimization, a standard regularity assumption is often associated with feasible point methods, namely, that the gradients of the constraints are linearly independent. In practice, the regularity assumption may be violated. In order to avoid such a singularity, a new projection matrix is proposed, based on which a feasible point method for continuous-time, equality-constrained optimization is developed. First, the equality constraint is transformed into a continuous-time dynamical system with solutions that always satisfy the equality constraint. Second, a new projection matrix without singularity is proposed to realize the transformation. An update (or, say, a controller) is subsequently designed to decrease the objective function along the solutions of the transformed continuous-time dynamical system. The invariance principle is then applied to analyze the behavior of the solution. Furthermore, the proposed method is modified to address cases in which solutions do not satisfy the equality constraint. Finally, the proposed optimization approach is applied to three examples to demonstrate its effectiveness.
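For orientation, the sketch below shows the standard feasible-direction construction that such methods start from: the gradient is projected with $P = I - J^\top (J J^\top)^{-1} J$, which requires exactly the regularity assumption (full row rank of the constraint Jacobian $J$) discussed above. The singularity-free projection matrix proposed in the paper is not reproduced here; the toy objective, constraint, step size and iteration count are illustrative.

```python
import numpy as np

# Euler-discretized projected gradient flow for min c.x subject to ||x||^2 = 1.
# The projector below assumes J has full row rank (the regularity assumption).
c = np.array([1.0, 2.0, -1.0])

def J(x):                                    # constraint Jacobian of h(x) = x.x - 1
    return 2.0 * x[None, :]

x, dt = np.array([1.0, 0.0, 0.0]), 0.005
for _ in range(4000):
    Jx = J(x)
    P = np.eye(3) - Jx.T @ np.linalg.solve(Jx @ Jx.T, Jx)   # tangent-space projector
    x = x - dt * P @ c                                      # decrease c.x along the constraint
print(x, x @ x - 1.0)   # x approaches -c/||c||; small constraint drift from the Euler steps
```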
Potential for improvement of population diet through reformulation of commonly eaten foods.
van Raaij, Joop; Hendriksen, Marieke; Verhagen, Hans
2009-03-01
FOOD REFORMULATION: Reformulation of foods is considered one of the key options to achieve population nutrient goals. The compositions of many foods are modified to assist the consumer bring his or her daily diet more in line with dietary recommendations. INITIATIVES ON FOOD REFORMULATION: Over the past few years the number of reformulated foods introduced on the European market has increased enormously and it is expected that this trend will continue for the coming years. LIMITS TO FOOD REFORMULATION: Limitations to food reformulation in terms of choice of foods appropriate for reformulation and level of feasible reformulation relate mainly to consumer acceptance, safety aspects, technological challenges and food legislation. IMPACT ON KEY NUTRIENT INTAKE AND HEALTH: The potential impact of reformulated foods on key nutrient intake and health is obvious. Evaluation of the actual impact requires not only regular food consumption surveys, but also regular updates of the food composition table including the compositions of newly launched reformulated foods.
De Sanctis, Vincenzo; Bernasconi, Sergio; Bianchin, Luigi; Bona, Gianni; Bozzola, Mauro; Buzi, Fabio; De Sanctis, Carlo; Rigon, Franco; Tatò, Luciano; Tonini, Giorgio; Perissinotto, Egle
2014-11-01
Healthcare professionals need updated information about what is the range of "normal" variation of menstrual cycle features to support young girls and their parents in managing reproductive health, and to detect diseases early. This cross-sectional study aimed to provide an updated picture of age at menarche and main menstrual cycle characteristics and complaints in an Italian population-based sample of 3,783 adolescents attending secondary school. Girls filled in a self-administered anonymous questionnaire including questions about demography, anthropometry, smoking and drinking habits, use of contraceptive, socioeconomic status, age at menarche, menstrual pattern, and physical/psychological menstrual complaints. Mean age at menarche and prevalence of polymenorrhea (cycle length < 21 days), oligomenorrhea (cycle length > 35 days), irregularity, dysmenorrhea, and of physical/psychological complaints were computed. Factors associated with age at menarche and menstrual disturbances were explored by using multiple logistic models. The girls' mean age was 17.1 years (SD 1.4 years) and the mean age at menarche was 12.4 years (SD 1.3 years); menarche occurred with two monthly peaks of frequency in July-September and in December-January (P < 0.0001). Age at menarche was significantly associated with geographic genetics (as expressed by parents' birth area), mother's menarcheal age, BMI, family size, and age at data collection. The prevalence of polymenorrhea was about 2.5%, oligomenorrhea was declared by 3.7%, irregular length by 8.3%, while long bleeding (>6 days) was shown in 19.6% of girls. Gynecological age was significantly associated with cycle length (P < 0.0001) with long cycles becoming more regular within the fourth year after menarche, while frequency of polymenorrhea stabilized after the second gynecological year. Oligomenorrhea and irregularity were both significantly associated with long menstrual bleeding (adjusted OR = 2.36; 95% CI = 1.55-3.60, and adjusted OR = 2.59; 95% CI = 1.95-3.44, respectively). The findings of the study support the levelling-off of secular trend in menarche anticipation in Italy and confirm the timing in menstrual cycle regularization. The study provides updated epidemiological data on frequency of menstrual abnormalities to help reproductive health professionals in managing adolescent gynecology.
Gene selection for microarray data classification via subspace learning and manifold regularization.
Tang, Chang; Cao, Lijuan; Zheng, Xiao; Wang, Minhui
2017-12-19
With the rapid development of DNA microarray technology, a large amount of genomic data has been generated. Classification of these microarray data is a challenging task since gene expression data often contain thousands of genes but only a small number of samples. In this paper, an effective gene selection method is proposed to select the best subset of genes for microarray data with the irrelevant and redundant genes removed. Compared with the original data, the selected gene subset can benefit the classification task. We formulate the gene selection task as a manifold regularized subspace learning problem. In detail, a projection matrix is used to project the original high dimensional microarray data into a lower dimensional subspace, with the constraint that the original genes can be well represented by the selected genes. Meanwhile, the local manifold structure of the original data is preserved by a Laplacian graph regularization term on the low-dimensional data space. The projection matrix can serve as an importance indicator of the different genes. An iterative update algorithm is developed for solving the problem. Experimental results on six publicly available microarray datasets and one clinical dataset demonstrate that the proposed method performs better than other state-of-the-art methods in terms of microarray data classification.
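In the spirit of the formulation above (selected genes should represent all genes, the sample manifold is preserved through a graph Laplacian term, and row sparsity of the coefficient matrix indicates gene importance), here is a hedged sketch of a closely related objective solved by the standard iteratively reweighted closed-form update. The exact objective, solver and parameters in the paper may differ; alpha, beta, the kNN graph and the synthetic data are assumptions.

```python
import numpy as np

def select_genes(X, alpha=1.0, beta=1.0, n_iter=30, k=5):
    """X: (n_samples, n_genes). Returns a per-gene importance score."""
    n, d = X.shape
    # simple kNN similarity graph over samples and its Laplacian
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    A = np.zeros((n, n))
    for i in range(n):
        for j in np.argsort(dist[i])[1:k + 1]:
            A[i, j] = A[j, i] = 1.0
    L = np.diag(A.sum(axis=1)) - A
    G, M = X.T @ X, X.T @ L @ X
    C = np.eye(d)
    for _ in range(n_iter):
        # reweighting for the row-sparsity (l2,1) term
        D = np.diag(1.0 / (2.0 * np.maximum(np.linalg.norm(C, axis=1), 1e-8)))
        # closed-form minimizer of ||X - XC||_F^2 + alpha*tr((XC)^T L (XC)) + beta*||C||_{2,1}
        C = np.linalg.solve(G + alpha * M + beta * D, G)
    return np.linalg.norm(C, axis=1)            # row norms rank gene importance

rng = np.random.default_rng(5)
scores = select_genes(rng.standard_normal((40, 200)))
print(np.argsort(scores)[::-1][:10])            # indices of the ten top-ranked genes
```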
Manifold optimization-based analysis dictionary learning with an ℓ1∕2-norm regularizer.
Li, Zhenni; Ding, Shuxue; Li, Yujie; Yang, Zuyuan; Xie, Shengli; Chen, Wuhui
2018-02-01
Recently there has been increasing attention towards analysis dictionary learning. In analysis dictionary learning, it is an open problem to obtain strong sparsity-promoting solutions efficiently while simultaneously avoiding trivial solutions of the dictionary. In this paper, to obtain strong sparsity-promoting solutions, we employ the ℓ1/2 norm as a regularizer. The very recent study on ℓ1/2 norm regularization theory in compressive sensing shows that its solutions can give sparser results than using the ℓ1 norm. We transform a complex nonconvex optimization into a number of one-dimensional minimization problems. Then the closed-form solutions can be obtained efficiently. To avoid trivial solutions, we apply manifold optimization to update the dictionary directly on the manifold satisfying the orthonormality constraint, so that the dictionary can avoid the trivial solutions well while simultaneously capturing the intrinsic properties of the dictionary. The experiments with synthetic and real-world data verify that the proposed algorithm for analysis dictionary learning can not only obtain strong sparsity-promoting solutions efficiently, but also learn a more accurate dictionary in terms of dictionary recovery and image processing than the state-of-the-art algorithms. Copyright © 2017 Elsevier Ltd. All rights reserved.
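To illustrate the manifold step mentioned above, the sketch below keeps an analysis dictionary on the orthonormality manifold by retracting after each gradient step, replacing the updated matrix with its polar factor (the closest matrix with orthonormal rows in the Frobenius sense). The smoothed surrogate loss, step size and data are illustrative assumptions; the paper's ℓ1/2 sparse-coding step and its closed-form solutions are not reproduced.

```python
import numpy as np

def retract_rows_orthonormal(Omega):
    """Closest matrix with orthonormal rows: the polar factor from an SVD."""
    U, _, Vt = np.linalg.svd(Omega, full_matrices=False)
    return U @ Vt

rng = np.random.default_rng(6)
Y = rng.standard_normal((20, 500))                                   # training signals (illustrative)
Omega = retract_rows_orthonormal(rng.standard_normal((15, 20)))      # initial analysis dictionary

step = 1e-3
for _ in range(100):
    # gradient of a smooth surrogate of the co-sparsity sum |Omega @ Y|
    G = np.tanh(10.0 * (Omega @ Y)) @ Y.T
    Omega = retract_rows_orthonormal(Omega - step * G)               # step, then retract to the manifold

print(np.allclose(Omega @ Omega.T, np.eye(15)))                      # rows remain orthonormal
```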
Distress and rumor exposure on social media during a campus lockdown
Jones, Nickolas M.; Thompson, Rebecca R.; Dunkel Schetter, Christine
2017-01-01
During crisis events, people often seek out event-related information to stay informed of what is happening. However, when information from official channels is lacking or disseminated irregularly, people may be at risk for exposure to rumors that fill the information void. We studied information-seeking during a university lockdown following an active-shooter event. In study 1, students in the lockdown (n = 3,890) completed anonymous surveys 1 week later. Those who indicated receiving conflicting information about the lockdown reported greater acute stress [standardized regression coefficient (b) = 0.07; SE = 0.01; 95% confidence interval (CI), 0.04, 0.10]. Additionally, those who reported direct contact with close others via text message (or phone) and used Twitter for critical updates during the lockdown were exposed to more conflicting information. Higher acute stress was reported by heavy social media users who trusted social media for critical updates (b = 0.06, SE = 0.01; 95% CI, 0.03, 0.10). In study 2, we employed a big data approach to explore the time course of rumor transmission across 5 hours surrounding the lockdown within a subset of the university’s Twitter followers. We also examined the patterning of distress in the hours during the lockdown as rumors about what was happening (e.g., presence of multiple shooters) spread among Twitter users. During periods without updates from official channels, rumors and distress increased. Results highlight the importance of releasing substantive updates at regular intervals during a crisis event and monitoring social media for rumors to mitigate rumor exposure and distress. PMID:29042513
Radiometric characterization of Landsat Collection 1 products
Micijevic, Esad; Haque, Md. Obaidul; Mishra, Nischal
2017-01-01
Landsat data in the U.S. Geological Survey (USGS) archive are being reprocessed to generate a tiered collection of consistently geolocated and radiometrically calibrated products that are suitable for time series analyses. With the implementation of collection management, no major updates will be made to the calibration of the Landsat sensors within a collection. Only calibration parameters needed to maintain the established calibration trends without an effect on derived environmental records will be regularly updated, while all other changes will be deferred to a new collection. This first collection, Collection 1, incorporates various radiometric calibration updates to all Landsat sensors, including absolute and relative gains for the Landsat 8 Operational Land Imager (OLI), stray light correction for the Landsat 8 Thermal Infrared Sensor (TIRS), absolute gains for the Landsat 4 and 5 Thematic Mappers (TM), recalibration of the Landsat 1-5 Multispectral Scanners (MSS) to ensure radiometric consistency among different formats of archived MSS data, and a transfer of the Landsat 8 OLI reflectance-based calibration to all previous Landsat sensors. While all OLI/TIRS and ETM+ data and the majority of TM data have already been reprocessed to Collection 1, completion of MSS and remaining TM data reprocessing is expected by the end of this year. It is important to note that, although still available for download from the USGS web pages, the products generated using the Pre-Collection processing do not benefit from the latest radiometric calibration updates. In this paper, we assess the radiometry of the solar reflective bands in Landsat Collection 1 products through analysis of trends in on-board calibrator and pseudo-invariant calibration site (PICS) responses.
Trust Based Evaluation of Wikipedia's Contributors
NASA Astrophysics Data System (ADS)
Krupa, Yann; Vercouter, Laurent; Hübner, Jomi Fred; Herzig, Andreas
Wikipedia is an encyclopedia whose content anybody can change. Some users, self-proclaimed "patrollers", regularly check recent changes in order to delete or correct those that damage article integrity. The huge quantity of updates means that some articles remain polluted for some time before being corrected. In this work, we show how a multiagent trust model can help patrollers in their task of controlling Wikipedia. To direct the patrollers' verification towards suspicious contributors, our work relies on a formalisation of Castelfranchi & Falcone's social trust theory, assisting patrollers by representing their trust model in a cognitive way.
[The global medical record + (DMG+), tool for prevention in first line care].
Schetgen, M
2012-09-01
The "global medical record +" can be offered to all 45 to 75 year-old patients in the form of a prevention module within the global medical record and which the general practitioner and the patient will regularly update. It will include in particular an assessment of cardiovascular risk, cervical, breast and colon cancer screening, a check of main adult vaccinations, as well as a primary prevention section focused on smoking, alcohol consumption and various hygiene and dietary measures. The inclusion of this module in a computerized medical record will make it more efficient and will lighten the practitioner's workload.
Uptake and trans-membrane transport of petroleum hydrocarbons by microorganisms
Hua, Fei; Wang, Hong Qi
2014-01-01
Petroleum-based products are a primary energy source in industry and daily life. During the exploration, processing, transport and storage of petroleum and petroleum products, water and soil pollution occurs regularly. Biodegradation of hydrocarbon pollutants by indigenous microorganisms is one of the primary mechanisms by which petroleum compounds are removed from the environment. However, the limited physical contact between microorganisms and hydrophobic hydrocarbons constrains the biodegradation rate. This paper presents an updated review of petroleum hydrocarbon uptake and transport across the outer membrane of microorganisms with the help of outer membrane proteins. PMID:26740752
Critical review on refractive surgical lasers
NASA Astrophysics Data System (ADS)
Lin, J. T.
1995-03-01
The current status of refractive surgical lasers (including excimer and nonexcimer lasers) is reviewed, with an emphasis on photorefractive keratectomy (PRK). The correlation of engineering parameters and clinical requirements with optimal conditions is presented. The fundamentals of corneal reshaping, with formulas for ablation profiles and the advantages of the multizone method, are discussed. Updated information on the Mini-Excimer PRK laser system, with an emphasis on the scanning delivery device, is presented. PMMA ablation profiles produced in the standard diaphragm and scanning modes are compared for surface ablation quality. Scanning-mode ablation patterns for myopia, hyperopia, and regular and irregular astigmatism are presented.
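For orientation, the central ablation depth for a myopic correction is often estimated with the Munnerlyn approximation; this is a commonly quoted rule of thumb given here only as an illustration, not as the exact profile formula used in the paper:

t_0 \approx \frac{D\,S^2}{3},

where t_0 is the central ablation depth in micrometres, D is the intended dioptric correction, and S is the diameter of the optical (ablation) zone in millimetres.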
Generalized Sheet Transition Condition FDTD Simulation of Metasurface
NASA Astrophysics Data System (ADS)
Vahabzadeh, Yousef; Chamanara, Nima; Caloz, Christophe
2018-01-01
We propose an FDTD scheme based on Generalized Sheet Transition Conditions (GSTCs) for the simulation of polychromatic, nonlinear and space-time varying metasurfaces. This scheme consists of placing the metasurface at a virtual nodal plane introduced between regular nodes of the staggered Yee grid and inserting the fields determined by the GSTCs at this plane into the standard FDTD algorithm. The resulting update equations are an elegant generalization of the standard FDTD equations. Indeed, in the limiting case of a null surface susceptibility ($\chi_\text{surf}=0$), they reduce to the latter, while in the next limiting case of a time-invariant metasurface $[\chi_\text{surf}$
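As background for the update equations mentioned above, the sketch below shows a minimal one-dimensional vacuum Yee/FDTD update loop in Python. It reproduces only the standard staggered-grid scheme; the grid parameters and source are arbitrary choices, and the GSTC correction terms that the paper inserts at the virtual metasurface node are not included.

import numpy as np

# Minimal 1D vacuum FDTD (Yee) update -- the baseline scheme that a GSTC
# approach would generalize by modifying fields at a virtual node between
# the E and H grid points. Those GSTC terms are NOT reproduced here.
c0 = 299792458.0
eps0 = 8.8541878128e-12
mu0 = 4e-7 * np.pi
dz = 1e-3                      # spatial step (m), illustrative choice
dt = 0.99 * dz / c0            # Courant-limited time step

nz, nt = 400, 1000
Ez = np.zeros(nz)              # E samples on integer nodes
Hy = np.zeros(nz - 1)          # H samples on staggered half nodes

for n in range(nt):
    Hy += dt / (mu0 * dz) * (Ez[1:] - Ez[:-1])          # H half-step
    Ez[1:-1] += dt / (eps0 * dz) * (Hy[1:] - Hy[:-1])   # E update (interior)
    Ez[50] += np.exp(-((n - 80) / 20.0) ** 2)           # soft Gaussian source
    # no absorbing boundaries: fields simply reflect at the grid ends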
Game of life on phyllosilicates: Gliders, oscillators and still life
NASA Astrophysics Data System (ADS)
Adamatzky, Andrew
2013-10-01
A phyllosilicate is a sheet of silicate tetrahedra bound by basal oxygens. A phyllosilicate automaton is a regular network of finite state machines - silicon nodes and oxygen nodes - which mimics the structure of the phyllosilicate. A node takes state 0 or 1. Each node updates its state in discrete time depending on the sum of the states of its three (silicon) or six (oxygen) neighbours. Phyllosilicate automata exhibit localisations attributed to Conway's Game of Life: gliders, oscillators, still lifes, and a glider gun. Configurations and behaviour of typical localisations, and interactions between the localisations, are illustrated.
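A generic synchronous update of this kind is easy to sketch in Python; the neighbour lists and the birth/survival sums below are hypothetical placeholders, not the transition rules actually used in the paper.

# Synchronous update of a binary-state automaton on a fixed network,
# in the spirit of the phyllosilicate automaton described above.
# 'birth' and 'survive' are HYPOTHETICAL rule sets, for illustration only.
def step(state, neighbours, birth=(2,), survive=(1, 2)):
    """state: dict node -> 0/1; neighbours: dict node -> list of nodes."""
    new_state = {}
    for node, s in state.items():
        total = sum(state[m] for m in neighbours[node])   # 3 or 6 neighbours
        if s == 0:
            new_state[node] = 1 if total in birth else 0
        else:
            new_state[node] = 1 if total in survive else 0
    return new_state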
Maguire, Sabine; Mann, Mala
2013-03-07
Dogma has long prevailed regarding the ageing of bruises and whether certain patterns of bruising are suggestive or diagnostic of child abuse. We conducted the first systematic reviews addressing these two issues, to determine the scientific basis for current clinical practice. There have been seven updates since 2004. An all-language literature search was performed across 13 databases, 1951-2004, using >60 key words, supplemented by 'snowballing' techniques. Quality standards included a novel confirmation-of-abuse scale. Updates used expanded key words and a higher standard for confirmation of abuse. Of 1495 potential studies, only three met the inclusion criteria for ageing of bruises in 2004, confirming that bruises cannot be accurately aged with the naked eye. This finding was roundly rejected when first reported, generating a wave of new studies attempting to determine a scientifically valid method of ageing bruises, none of which is yet applicable in children. Regarding patterns of bruising that may be suggestive or diagnostic of abuse, we included 23 of 167 studies reviewed in 2004, although only two were comparative studies. Included studies noted that unintentional bruises occur predominantly on the front of the body and over bony prominences, and that their presence is directly correlated with the child's level of independent mobility. Bruising patterns in abused children differed in location (the most common sites being the face, neck, ear, head, trunk, buttocks and arms), and the bruises tended to be larger. Updates have included a further 14 studies, covering bruising in disabled children, patterns distinguishing severely injured abused and non-abused children, and the importance of petechiae. These systematic reviews of bruising challenged accepted wisdom regarding the ageing of bruises, which had no scientific basis; stimulated higher-quality research on patterns distinguishing abusive from non-abusive bruising; and highlighted the benefits of regular updates of these reviews. Copyright © 2013 The Cochrane Collaboration. Published by John Wiley & Sons, Ltd.
An Investigation of Automatic Change Detection for Topographic Map Updating
NASA Astrophysics Data System (ADS)
Duncan, P.; Smit, J.
2012-08-01
Changes to the landscape are constantly occurring, and it is essential for geospatial and mapping organisations that these changes are regularly detected and captured so that map databases can be updated to reflect the current status of the landscape. The Chief Directorate of National Geospatial Information (CD: NGI), South Africa's national mapping agency, currently relies on manual methods of detecting and capturing these changes. These manual methods are time consuming and labour intensive, and rely on the skills and interpretation of the operator. It is therefore necessary to move towards more automated methods in the production process at CD: NGI. The aim of this research is to investigate a methodology for automatic or semi-automatic change detection for the purpose of updating topographic databases. The method investigated detects changes through image classification as well as spatial analysis and is focussed on urban landscapes. The major data inputs into this study are high-resolution aerial imagery and existing topographic vector data. Initial results indicate that traditional pixel-based image classification approaches are unsatisfactory for large-scale land-use mapping and that object-oriented approaches hold more promise. Even with object-oriented image classification, generalization of techniques on a broad scale has produced inconsistent results. A solution may lie in a hybrid approach combining pixel-based and object-oriented techniques.
Urbinello, Damiano; Röösli, Martin
2013-01-01
When moving around, mobile phones in stand-by mode periodically send data about their positions. The aim of this paper is to evaluate how personal radiofrequency electromagnetic field (RF-EMF) measurements are affected by such location updates. Exposure from a mobile phone handset (uplink) was measured during commuting by using a randomized cross-over study with three different scenarios: a disabled mobile phone (reference), an activated dual-band phone and a quad-band phone. In the reference scenario, uplink exposure was highest during train rides (1.19 mW/m(2)) and lowest during car rides in rural areas (0.001 mW/m(2)). In public transport, the impact of one's own mobile phone on personal RF-EMF measurements was not observable because of the high background uplink radiation from other people's mobile phones. In a car, uplink exposure with an activated phone was orders of magnitude higher compared with the reference scenario. This study demonstrates that personal RF-EMF exposure is affected by one's own mobile phone in stand-by mode because of its regular location updates. Further dosimetric studies should quantify the contribution of location updates to total RF-EMF exposure in order to clarify whether the duration of mobile phone use, the most common exposure surrogate in epidemiological RF-EMF research, is actually an adequate exposure proxy.
Prophylaxis and treatment of HIV-1 infection in pregnancy - Swedish Recommendations 2017.
Navér, Lars; Albert, Jan; Carlander, Christina; Flamholc, Leo; Gisslén, Magnus; Karlström, Olof; Svedhem-Johansson, Veronica; Sönnerborg, Anders; Westling, Katarina; Yilmaz, Aylin; Pettersson, Karin
2018-01-24
Prophylaxis and treatment with antiretroviral drugs have resulted in a very low rate of mother-to-child transmission (MTCT) of HIV during recent years. Registration of new antiretroviral drugs, modification of clinical practice, updated general treatment guidelines and increasing knowledge about MTCT have necessitated regular revisions of the recommendations for 'Prophylaxis and treatment of HIV-1 infection in pregnancy'. The Swedish Reference Group for Antiviral Therapy (RAV) updated the recommendations from 2013 at an expert meeting on 19 September 2017. The new text also takes into account current treatment guidelines for non-pregnant individuals. The most important revisions are that: (1) Caesarean section and infant prophylaxis with three drugs are recommended when maternal HIV RNA >150 copies/mL (previously >50 copies/mL); the treatment target of undetectable HIV RNA remains unchanged at <50 copies/mL; (2) obstetric management and mode of delivery at premature rupture of the membranes and rupture of the membranes at full term follow the same procedures as in HIV-negative women; (3) vaginal delivery is recommended for a well-treated woman with HIV RNA <150 copies/mL regardless of gestational age, if no obstetric contraindications are present; (4) treatment during pregnancy should begin as soon as possible and should continue after delivery; (5) an ongoing, well-functioning HIV treatment at the start of pregnancy should usually be retained; and (6) recommended drugs and drug combinations have been updated.
NASA Astrophysics Data System (ADS)
García-Mayordomo, Julián; Martín-Banda, Raquel; Insua-Arévalo, Juan M.; Álvarez-Gómez, José A.; Martínez-Díaz, José J.; Cabral, João
2017-08-01
Active fault databases are a very powerful and useful tool in seismic hazard assessment, particularly when individual faults are considered as seismogenic sources. Active fault databases are also a very relevant source of information for earth scientists, earthquake engineers and even teachers or journalists. Hence, active fault databases should be updated and thoroughly reviewed on a regular basis in order to maintain a standard of quality and uniform criteria. Ideally, active fault databases should also indicate the quality of the geological data and, particularly, the reliability attributed to crucial fault-seismic parameters, such as maximum magnitude and recurrence interval. In this paper we explain how we tackled these issues during the process of updating and reviewing the Quaternary Active Fault Database of Iberia (QAFI) to its current version 3. We devote particular attention to describing the scheme devised for classifying the quality and representativeness of the geological evidence of Quaternary activity and the accuracy of the slip-rate estimation in the database. Subsequently, we use this information as input for a straightforward rating of the level of reliability of the maximum magnitude and recurrence interval fault seismic parameters. We conclude that QAFI v.3 is a much better database than version 2, both for proper use in seismic hazard applications and as an informative source for non-specialized users. However, we already envision new improvements for a future update.
Unstable vicinal crystal growth from cellular automata
NASA Astrophysics Data System (ADS)
Krasteva, A.; Popova, H.; KrzyŻewski, F.; Załuska-Kotur, M.; Tonchev, V.
2016-03-01
In order to study unstable step motion on vicinal crystal surfaces we devise vicinal Cellular Automata. Each cell in the colony has a value equal to its height in the vicinal surface; initially the steps are regularly distributed. Another array keeps the adatoms, initially distributed randomly over the surface. The growth rule specifies that each adatom at a right-nearest-neighbour position to a (multi-)step attaches to it. The whole colony is updated at once, and then time is incremented. This execution of the growth rule is followed by compensation of the consumed particles and by diffusional update(s) of the adatom population. Two principal sources of instability are employed - biased diffusion and an infinite inverse Ehrlich-Schwoebel barrier (iiSE). Since these factors are not opposed by step-step repulsion, the formation of multi-steps is observed, but in general the step bunches preserve a finite width. We monitor the developing surface patterns and quantify the observations by scaling laws, with focus on the eventual transition from a diffusion-limited to a kinetics-limited phenomenon. The time-scaling exponent of the bunch size N is 1/2 for the case of biased diffusion and 1/3 for the case of the iiSE. An additional distinction is possible based on the time-scaling exponents of the multi-step size N_multi; these are 0.36-0.4 (for biased diffusion) and 1/4 (iiSE).
Zhu, Dianwen; Li, Changqing
2014-12-01
Fluorescence molecular tomography (FMT) is a promising imaging modality and has been actively studied in the past two decades, since it can locate a specific tumor position three-dimensionally in small animals. However, it remains a challenging task to obtain fast, robust and accurate reconstructions of the fluorescent probe distribution in small animals due to the large computational burden, the noisy measurements and the ill-posed nature of the inverse problem. In this paper we propose a nonuniform preconditioning method in combination with L1 regularization and an ordered-subsets technique (NUMOS) to take care of the different updating needs at different pixels, to enhance sparsity and suppress noise, and to further boost the convergence of approximate solutions for fluorescence molecular tomography. Using both simulated data and a phantom experiment, we found that the proposed nonuniform updating method outperforms its popular uniform counterpart by obtaining a more localized, less noisy, more accurate image. The computational cost was greatly reduced as well. The ordered-subsets (OS) technique provided additional 5-fold and 3-fold speed enhancements for the simulation and phantom experiments, respectively, without degrading image quality. When compared with popular L1 algorithms such as the iterative soft-thresholding algorithm (ISTA) and the fast iterative soft-thresholding algorithm (FISTA), NUMOS also outperforms them by obtaining a better image in a much shorter period of time.
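To illustrate the general idea of pixel-dependent (nonuniform) update steps combined with L1 regularization, a generic diagonally preconditioned iterative soft-thresholding sketch is given below. The diagonal majorizer and parameters are assumptions, the ordered-subsets acceleration is omitted, and this is not the authors' NUMOS implementation.

import numpy as np

def soft_threshold(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def preconditioned_ista(A, b, lam, n_iter=200):
    """L1-regularized least squares, min 0.5*||Ax - b||^2 + lam*||x||_1,
    solved with a separable quadratic surrogate so that each unknown
    gets its own step size (a nonuniform update)."""
    absA = np.abs(A)
    # Diagonal majorizer of A^T A: D_jj = sum_i |a_ij| * (sum_k |a_ik|).
    D = absA.T @ absA.sum(axis=1)
    step = 1.0 / (D + 1e-12)                  # pixel-dependent step sizes
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)
        x = soft_threshold(x - step * grad, step * lam)
    return x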
Khatcheressian, James L; Wolff, Antonio C; Smith, Thomas J; Grunfeld, Eva; Muss, Hyman B; Vogel, Victor G; Halberg, Francine; Somerfield, Mark R; Davidson, Nancy E
2006-11-01
To update the 1999 American Society of Clinical Oncology (ASCO) guideline on breast cancer follow-up and management in the adjuvant setting. An ASCO Expert Panel reviewed pertinent information from the literature through March 2006. More weight was given to studies that tested a hypothesis directly relating testing to one of the primary outcomes in a randomized design. The evidence supports regular history, physical examination, and mammography as the cornerstone of appropriate breast cancer follow-up. All patients should have a careful history and physical examination performed by a physician experienced in the surveillance of cancer patients and in breast examination. Examinations should be performed every 3 to 6 months for the first 3 years, every 6 to 12 months for years 4 and 5, and annually thereafter. For those who have undergone breast-conserving surgery, a post-treatment mammogram should be obtained 1 year after the initial mammogram and at least 6 months after completion of radiation therapy. Thereafter, unless otherwise indicated, a yearly mammographic evaluation should be performed. Patients at high risk for familial breast cancer syndromes should be referred for genetic counseling. The use of CBCs, chemistry panels, bone scans, chest radiographs, liver ultrasounds, computed tomography scans, [18F]fluorodeoxyglucose-positron emission tomography scanning, magnetic resonance imaging, or tumor markers (carcinoembryonic antigen, CA 15-3, and CA 27.29) is not recommended for routine breast cancer follow-up in an otherwise asymptomatic patient with no specific findings on clinical examination. Careful history taking, physical examination, and regular mammography are recommended for appropriate detection of breast cancer recurrence.
Etard, Cécile; Bigand, Emeline; Salvat, Cécile; Vidal, Vincent; Beregi, Jean Paul; Hornbeck, Amaury; Greffier, Joël
2017-10-01
A national retrospective survey on patient doses was performed by the French Society of Medical Physicists to assess reference levels (RLs) in interventional radiology, as required by the European Directive 2013/59/Euratom. Fifteen interventional procedures in neuroradiology, vascular radiology and osteoarticular interventions were analysed. Kerma-area product (KAP), fluoroscopy time (FT), reference air kerma and number of images were recorded for 10 to 30 patients per procedure. RLs were calculated as the 3rd quartiles of the distributions. Results on 4600 procedures from 36 departments confirmed the large variability in patient dose for the same procedure. RLs were proposed for the four dosimetric estimators and the 15 procedures. RLs in terms of KAP and FT were 90 Gy.cm2 and 11 min for cerebral angiography, 35 Gy.cm2 and 16 min for biliary drainage, 75 Gy.cm2 and 6 min for lower-limb arteriography and 70 Gy.cm2 and 11 min for vertebroplasty. For these four procedures, RLs were defined according to the complexity of the procedure. For all the procedures, the results were lower than most of those already published. This study reports RLs in interventional radiology based on a national survey. Continual evolution of practices and technologies requires regular updates of RLs. • Delivered dose in interventional radiology depends on procedure, practice and patient. • National RLs are proposed for 15 interventional procedures. • Reference levels (RLs) are useful to benchmark practices and optimize protocols. • RLs are proposed for kerma-area product, air kerma, fluoroscopy time and number of images. • RLs should be adapted to the procedure complexity and updated regularly.
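The reference-level computation itself is simply the 3rd quartile of the pooled dose distribution for a given procedure; a minimal example follows, with hypothetical kerma-area-product values.

import numpy as np

# Hypothetical KAP values (Gy.cm2) for one procedure; the survey data
# themselves are not reproduced here.
kap = np.array([42.0, 55.0, 61.0, 78.0, 90.0, 103.0, 120.0])
reference_level = np.percentile(kap, 75)      # 3rd quartile of the distribution
print(f"proposed RL: {reference_level:.0f} Gy.cm2")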
Mehta, Shamir R; Bainey, Kevin R; Cantor, Warren J; Lordkipanidzé, Marie; Marquis-Gravel, Guillaume; Robinson, Simon D; Sibbald, Matthew; So, Derek Y; Wong, Graham C; Abunassar, Joseph G; Ackman, Margaret L; Bell, Alan D; Cartier, Raymond; Douketis, James D; Lawler, Patrick R; McMurtry, Michael S; Udell, Jacob A; van Diepen, Sean; Verma, Subodh; Mancini, G B John; Cairns, John A; Tanguay, Jean-François
2018-03-01
Antiplatelet therapy (APT) has become an important tool in the treatment and prevention of atherosclerotic events, particularly those associated with coronary artery disease. A large evidence base has evolved regarding the relationship between APT prescription in various clinical contexts and risk/benefit relationships. The Guidelines Committee of the Canadian Cardiovascular Society and Canadian Association of Interventional Cardiology publishes regular updates of its recommendations, taking into consideration the most recent clinical evidence. The present update to the 2011 and 2013 Canadian Cardiovascular Society APT guidelines incorporates new evidence on how to optimize APT use, particularly in situations in which few to no data were previously available. The recommendations update focuses on the following primary topics: (1) the duration of dual APT (DAPT) in patients who undergo percutaneous coronary intervention (PCI) for acute coronary syndrome and non-acute coronary syndrome indications; (2) management of DAPT in patients who undergo noncardiac surgery; (3) management of DAPT in patients who undergo elective and semiurgent coronary artery bypass graft surgery; (4) when and how to switch between different oral antiplatelet therapies; and (5) management of antiplatelet and anticoagulant therapy in patients who undergo PCI. For PCI patients, we specifically analyze the particular considerations in patients with atrial fibrillation, mechanical or bioprosthetic valves (including transcatheter aortic valve replacement), venous thromboembolic disease, and established left ventricular thrombus or possible left ventricular thrombus with reduced ejection fraction after ST-segment elevation myocardial infarction. In addition to specific recommendations, we provide values and preferences and practical tips to aid the practicing clinician in the day to day use of these important agents. Copyright © 2018. Published by Elsevier Inc.
Curran, Vernon; Fleet, Lisa; Greene, Melanie
2012-01-01
Resuscitation and life support skills training comprises a significant proportion of continuing education programming for health professionals. The purpose of this study was to explore the perceptions and attitudes of certified resuscitation providers toward the retention of resuscitation skills, regular skills updating, and methods for enhancing retention. A mixed-methods, explanatory study design was undertaken utilizing focus groups and an online survey-questionnaire of rural and urban health care providers. Rural providers reported less experience with real codes and lower abilities across a variety of resuscitation areas. Mock codes, practice with an instructor and a team, self-practice with a mannequin, and e-learning were popular methods for skills updating. Aspects of team performance that were felt to influence resuscitation performance included: discrepancies in skill levels, lack of communication, and team leaders not up to date on their skills. Confidence in resuscitation abilities was greatest after one had recently practiced or participated in an update or an effective debriefing session. Lowest confidence was reported when team members did not work well together, there was no clear leader of the resuscitation code, or if team members did not communicate. The study findings highlight the importance of access to update methods for improving providers' confidence and abilities, and the need for emphasis on teamwork training in resuscitation. An eclectic approach combining methods may be the best strategy for addressing the needs of health professionals across various clinical departments and geographic locales. Copyright © 2012 The Alliance for Continuing Education in the Health Professions, the Society for Academic Continuing Medical Education, and the Council on CME, Association for Hospital Medical Education.
Update on rehabilitation in multiple sclerosis.
Donzé, Cécile
2015-04-01
Given that mobility impairment is a hallmark of multiple sclerosis, people with this disease are likely to benefit from rehabilitation therapy throughout the course of their illness. This review provides an update on rehabilitation focused on balance and walking impairment. Classical rehabilitation focusing on muscle rehabilitation and neurotherapeutic facilitation is effective and recommended. Other techniques have not proven superior and need more studies before conclusions can be drawn: transcutaneous neurostimulation, repetitive magnetic stimulation, electromagnetic therapy, whole-body vibration and robot-assisted gait rehabilitation. Cooling therapy, hydrotherapy, orthoses and textured insoles could complement other techniques in specific conditions. Multidisciplinary rehabilitation programmes provide positive effects and high satisfaction for patients with multiple sclerosis but need more evaluation. New technologies using serious games and telerehabilitation seem to be interesting approaches to promote physical activity, self-management and quality of life. Rehabilitation, like any other therapy, needs regular clinical evaluation to adapt the programme and propose appropriate techniques. Moreover, the objectives of rehabilitation need to be decided with the patient, with realistic expectations. Copyright © 2015 Elsevier Masson SAS. All rights reserved.
Voter model with non-Poissonian interevent intervals
NASA Astrophysics Data System (ADS)
Takaguchi, Taro; Masuda, Naoki
2011-09-01
Recent analysis of social communications among humans has revealed that the interval between interactions for a pair of individuals and for an individual often follows a long-tail distribution. We investigate the effect of such a non-Poissonian nature of human behavior on dynamics of opinion formation. We use a variant of the voter model and numerically compare the time to consensus of all the voters with different distributions of interevent intervals and different networks. Compared with the exponential distribution of interevent intervals (i.e., the standard voter model), the power-law distribution of interevent intervals slows down consensus on the ring. This is because of the memory effect; in the power-law case, the expected time until the next update event on a link is large if the link has not had an update event for a long time. On the complete graph, the consensus time in the power-law case is close to that in the exponential case. Regular graphs bridge these two results such that the slowing down of the consensus in the power-law case as compared to the exponential case is less pronounced as the degree increases.
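An event-driven sketch of such a link-update voter model on a ring, with power-law (Pareto) interevent intervals, is shown below; the system size, tail exponent and update convention are illustrative assumptions rather than the exact simulation protocol of the paper.

import heapq
import random

def consensus_time_ring(n=50, alpha=2.5, seed=0):
    """Voter model on a ring where each link fires update events whose
    interevent intervals follow a Pareto distribution (pdf tail ~ t**-alpha).
    At each event one endpoint copies the other's opinion at random."""
    rng = random.Random(seed)
    opinion = [rng.choice([0, 1]) for _ in range(n)]
    links = [(i, (i + 1) % n) for i in range(n)]          # ring topology

    def wait():
        return rng.paretovariate(alpha - 1)               # power-law interval

    events = [(wait(), k) for k in range(len(links))]
    heapq.heapify(events)
    t = 0.0
    while len(set(opinion)) > 1:                          # run until consensus
        t, k = heapq.heappop(events)
        i, j = links[k]
        if rng.random() < 0.5:
            opinion[i] = opinion[j]
        else:
            opinion[j] = opinion[i]
        heapq.heappush(events, (t + wait(), k))
    return t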
Sulfur Dioxide Emission Rates from Kilauea Volcano, Hawai`i, an Update: 1998-2001
Elias, Tamar; Sutton, A. Jefferson
2002-01-01
Introduction Sulfur dioxide (SO2) emission rates from Kilauea Volcano were first measured by Stoiber and Malone (1975) and have been measured on a regular basis since 1979 (Greenland and others, 1985; Casadevall and others, 1987; Elias and others, 1998; Sutton and others, 2001). A compilation of SO2 emission-rate and wind-vector data from 1979 through 1997 is available as Open-File Report 98-462 (Elias and others, 1998) and on the web at http://hvo.wr.usgs.gov/products/OF98462/. The purpose of this report is to update the existing database through 2001. Kilauea releases SO2 gas predominantly from its summit caldera and east rift zone (ERZ) (fig. 1), as described in previous reports (Elias and others, 1998; Sutton and others, 2001). These two distinct sources are quantified independently. The summit and east rift zone emission rates reported here were derived using vehicle-based Correlation Spectrometry (COSPEC) measurements as described in Elias and others (1998). In 1998 and 1999, these measurements were augmented with airborne and tripod-based surveys.
NASA Astrophysics Data System (ADS)
Manzke, Nina; Kada, Martin; Kastler, Thomas; Xu, Shaojuan; de Lange, Norbert; Ehlers, Manfred
2016-06-01
Urban sprawl and the related landscape fragmentation is a Europe-wide challenge in the context of sustainable urban planning. The URBan land recycling Information services for Sustainable cities (URBIS) project aims for the development, implementation, and validation of web-based information services for urban vacant land in European functional urban areas in order to provide end-users with site specific characteristics and to facilitate the identification and evaluation of potential development areas. The URBIS services are developed based on open geospatial data. In particular, the Copernicus Urban Atlas thematic layers serve as the main data source for an initial inventory of sites. In combination with remotely sensed data like SPOT5 images and ancillary datasets like OpenStreetMap, detailed site specific information is extracted. Services are defined for three main categories: i) baseline services, which comprise an initial inventory and typology of urban land, ii) update services, which provide a regular inventory update as well as an analysis of urban land use dynamics and changes, and iii) thematic services, which deliver specific information tailored to end-users' needs.
Personal computer security: part 1. Firewalls, antivirus software, and Internet security suites.
Caruso, Ronald D
2003-01-01
Personal computer (PC) security in the era of the Health Insurance Portability and Accountability Act of 1996 (HIPAA) involves two interrelated elements: safeguarding the basic computer system itself and protecting the information it contains and transmits, including personal files. HIPAA regulations have toughened the requirements for securing patient information, requiring every radiologist with such data to take further precautions. Security starts with physically securing the computer. Account passwords and a password-protected screen saver should also be set up. A modern antivirus program can easily be installed and configured. File scanning and updating of virus definitions are simple processes that can largely be automated and should be performed at least weekly. A software firewall is also essential for protection from outside intrusion, and an inexpensive hardware firewall can provide yet another layer of protection. An Internet security suite yields additional safety. Regular updating of the security features of installed programs is important. Obtaining a moderate degree of PC safety and security is somewhat inconvenient but is necessary and well worth the effort. Copyright RSNA, 2003
Opinion dynamics on an adaptive random network
NASA Astrophysics Data System (ADS)
Benczik, I. J.; Benczik, S. Z.; Schmittmann, B.; Zia, R. K. P.
2009-04-01
We revisit the classical model for voter dynamics in a two-party system with two basic modifications. In contrast to the original voter model studied in regular lattices, we implement the opinion formation process in a random network of agents in which interactions are no longer restricted by geographical distance. In addition, we incorporate the rapidly changing nature of the interpersonal relations in the model. At each time step, agents can update their relationships. This update is determined by their own opinion, and by their preference to make connections with individuals sharing the same opinion, or rather with opponents. In this way, the network is built in an adaptive manner, in the sense that its structure is correlated and evolves with the dynamics of the agents. The simplicity of the model allows us to examine several issues analytically. We establish criteria to determine whether consensus or polarization will be the outcome of the dynamics and on what time scales these states will be reached. In finite systems consensus is typical, while in infinite systems a disordered metastable state can emerge and persist for infinitely long time before consensus is reached.
[Plug-in Based Centralized Control System in Operating Rooms].
Wang, Yunlong
2017-05-30
Centralized equipment control in an operating room (OR) is crucial to an efficient workflow in the OR. To achieve centralized control, an integrative OR needs a control panel that can appropriately incorporate equipment from different manufacturers with various connecting ports and controls. Here we propose to achieve equipment integration using plug-in modules. Each OR will be equipped with a dynamic plug-in control panel containing physically removable connecting ports. Matching outlets will be installed on the control panels of each piece of equipment used at any given time. This dynamic control panel will be backed by a database containing plug-in modules that can connect any two types of connecting ports common among medical equipment manufacturers. The correct connecting ports will be called dynamically using reflection. This database will be updated regularly to include new connecting ports on the market, making it easy to maintain, update and expand, and keeping it relevant as new equipment is developed. Together, the physical panel and the database will achieve centralized equipment control in the OR that can be easily adapted to any equipment in the OR.
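A minimal sketch of the plug-in registry idea is given below; the connector names, adapter class and lookup convention are hypothetical and do not describe the system's actual interface or database schema.

# Registry standing in for the database of plug-in modules: it maps a pair
# of connector types to an adapter class that is looked up at run time.
ADAPTER_REGISTRY = {}

def register_adapter(src_port, dst_port):
    def wrap(cls):
        ADAPTER_REGISTRY[(src_port, dst_port)] = cls
        return cls
    return wrap

@register_adapter("RS-232", "TCP/IP")         # hypothetical connector types
class SerialToEthernetAdapter:
    def send(self, command: bytes) -> None:
        ...                                   # translate and forward the command

def connect(src_port, dst_port):
    try:
        return ADAPTER_REGISTRY[(src_port, dst_port)]()
    except KeyError:
        raise ValueError(f"no plug-in module for {src_port} -> {dst_port}")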
Im, Eun-Ok; Chee, Wonshik
2005-11-01
In this paper, issues in Internet recruitment of ethnic minorities in the US are explored through an analysis of an Internet survey study. The issues include those related to: (a) the difficulties in ensuring authenticity; (b) a lack of cooperation by gatekeepers; (c) the flexibility required in the recruitment process; (d) a very low response rate; and (e) selected groups of ethnic minorities. Based on the discussion of these issues, we propose regular updates of knowledge and skills related to Internet interactions and technologies, the use of multiple recruitment sources, pluralistic recruitment approaches, and a quota sampling method.
Updating our thinking on the role of human activity in wolf recovery
Mech, L. David
1993-01-01
It is common for land managers and administrators involved in wolf (Canis lupus) management to assume that human development within wolf habitat is a hindrance to wolf recovery. Thus, researchers are often asked questions like "We are planning to build a new dock and parking area on Wolf Lake; how will that affect wolf recovery?" This kind of question pervades national forest and national park plans and has been pondered with regularity by both federal and state resource managers. Meanwhile, wolves have been extending their ranges into regions of much greater human development. The apparent contradiction has created both confusion and misdirected effort.
NSI customer service representatives and user support office: NASA Science Internet
NASA Technical Reports Server (NTRS)
1991-01-01
The NASA Science Internet (NSI) was established in 1987 to provide NASA's Office of Space Science and Applications (OSSA) missions with transparent wide-area data connectivity to NASA's researchers, computational resources, and databases. The NSI Office at NASA/Ames Research Center has the lead responsibility for implementing a total, open networking program to serve the OSSA community. NSI is a full-service communications provider whose services include science network planning, network engineering, applications development, network operations, and network information center/user support services. NSI's mission is to provide reliable high-speed communications to the NASA science community. To this end, the NSI Office manages and operates the NASA Science Internet, a multiprotocol network currently supporting both DECnet and TCP/IP protocols. NSI utilizes state-of-the-art network technology to meet its customers' requirements. The NASA Science Internet interconnects with other national networks, including the National Science Foundation's NSFNET, the Department of Energy's ESnet, and the Department of Defense's MILNET. NSI also has international connections to Japan, Australia, New Zealand, Chile, and several European countries. NSI cooperates with other government agencies as well as academic and commercial organizations to implement networking technologies which foster interoperability, improve reliability and performance, increase security and control, and expedite migration to the OSI protocols.
Ammari, Maha Al; Sultana, Khizra; Yunus, Faisal; Ghobain, Mohammed Al; Halwan, Shatha M. Al
2016-01-01
Objectives: To assess the proportion of critical errors committed while demonstrating the inhaler technique in hospitalized patients diagnosed with asthma and chronic obstructive pulmonary disease (COPD). Methods: This cross-sectional observational study was conducted in 47 asthmatic and COPD patients using inhaler devices. The study took place at King Abdulaziz Medical City, Riyadh, Saudi Arabia between September and December 2013. Two pharmacists independently assessed inhaler technique with a validated checklist. Results: Seventy percent of patients made at least one critical error while demonstrating their inhaler technique, and the mean number of critical errors per patient was 1.6. Most patients used metered dose inhaler (MDI), and 73% of MDI users and 92% of dry powder inhaler users committed at least one critical error. Conclusion: Inhaler technique in hospitalized Saudi patients was inadequate. Health care professionals should understand the importance of reassessing and educating patients on a regular basis for inhaler technique, recommend the use of a spacer when needed, and regularly assess and update their own inhaler technique skills. PMID:27146622
Explicit instructions and consolidation promote rewiring of automatic behaviors in the human mind.
Szegedi-Hallgató, Emese; Janacsek, Karolina; Vékony, Teodóra; Tasi, Lia Andrea; Kerepes, Leila; Hompoth, Emőke Adrienn; Bálint, Anna; Németh, Dezső
2017-06-29
One major challenge in human behavior and brain sciences is to understand how we can rewire already existing perceptual, motor, cognitive, and social skills or habits. Here we aimed to characterize one aspect of rewiring, namely, how we can update our knowledge of sequential/statistical regularities when they change. The dynamics of rewiring was explored from learning to consolidation using a unique experimental design which is suitable to capture the effect of implicit and explicit processing and the proactive and retroactive interference. Our results indicate that humans can rewire their knowledge of such regularities incidentally, and consolidation has a critical role in this process. Moreover, old and new knowledge can coexist, leading to effective adaptivity of the human mind in the changing environment, although the execution of the recently acquired knowledge may be more fluent than the execution of the previously learned one. These findings can contribute to a better understanding of the cognitive processes underlying behavior change, and can provide insights into how we can boost behavior change in various contexts, such as sports, educational settings or psychotherapy.
[French guidelines for the management of adult sickle cell disease: 2015 update].
Habibi, A; Arlet, J-B; Stankovic, K; Gellen-Dautremer, J; Ribeil, J-A; Bartolucci, P; Lionnet, F
2015-05-11
Sickle cell disease is a systemic genetic disorder, causing many functional and tissue modifications. As the prevalence of patients with sickle cell disease increases gradually in France, every physician can potentially be involved in the care of these patients. Complications of sickle cell disease can be acute or chronic. Pain is the main symptom and should be treated quickly and aggressively. In order to reduce the fatality rate associated with acute chest syndrome, it must be detected and treated early. Chronic complications are one of the main concerns in adults and should be identified as early as possible in order to prevent end-organ damage. Many organs can be involved, including bones, kidneys, eyes, lungs, etc. The indications for a specific treatment (blood transfusion or hydroxyurea) should be regularly discussed. Coordinated health care should be carefully organized to allow regular follow-up close to the patient's place of residence and access to specialized departments. We present in this article the French guidelines for the management of sickle cell disease in adulthood. Copyright © 2015 Elsevier Inc. All rights reserved.
Travel time tomography with local image regularization by sparsity constrained dictionary learning
NASA Astrophysics Data System (ADS)
Bianco, M.; Gerstoft, P.
2017-12-01
We propose a regularization approach for 2D seismic travel time tomography which models small rectangular groups of slowness pixels, within an overall or 'global' slowness image, as sparse linear combinations of atoms from a dictionary. The groups of slowness pixels are referred to as patches, and a dictionary corresponds to a collection of functions or 'atoms' describing the slowness in each patch. These functions could, for example, be wavelets. The patch regularization is incorporated into the global slowness image. The global image models the broad features, while the local patch images incorporate prior information from the dictionary. Further, high-resolution slowness within patches is permitted if the travel times from the global estimates support it. The proposed approach is formulated as an algorithm, which is repeated until convergence is achieved: 1) from travel times, find the global slowness image with a minimum-energy constraint on the pixel variance relative to a reference; 2) find the patch-level solutions that fit the global estimate as sparse linear combinations of dictionary atoms; 3) update the reference as the weighted average of the patch-level solutions. This approach relies on the redundancy of the patches in the seismic image. Redundancy means that the patches are repetitions of a finite number of patterns, which are described by the dictionary atoms. Redundancy in the earth's structure was demonstrated in previous works in seismics, where dictionaries of wavelet functions regularized the inversion. We further exploit the redundancy of the patches by using dictionary learning algorithms, a form of unsupervised machine learning, to estimate optimal dictionaries from the data in parallel with the inversion. We demonstrate our approach on densely but irregularly sampled synthetic seismic images.
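A compact sketch of that three-step loop is given below, under simplifying assumptions: non-overlapping equal-size patches, damped least squares for the global step, and a crude soft-thresholded coding step standing in for both the sparse solver and the dictionary learning. It illustrates the structure of the iteration, not the authors' algorithm.

import numpy as np

def soft(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def patch_regularized_inversion(G, t_obs, D, s_ref, patch_idx,
                                eta=1.0, lam=0.1, n_outer=10):
    """G: travel-time sensitivity matrix; t_obs: observed travel times;
    D: patch dictionary (atoms in columns); s_ref: initial reference
    slowness; patch_idx: index arrays of the non-overlapping patches."""
    s_global = s_ref.copy()
    for _ in range(n_outer):
        # 1) Global step: damped least squares, penalizing deviation
        #    of the slowness from the current reference image.
        A = np.vstack([G, np.sqrt(eta) * np.eye(G.shape[1])])
        b = np.concatenate([t_obs, np.sqrt(eta) * s_ref])
        s_global, *_ = np.linalg.lstsq(A, b, rcond=None)
        # 2) Patch step: sparse fit of each patch with dictionary atoms
        #    (soft-thresholded coefficients; exact only for orthonormal atoms).
        s_patch = np.empty_like(s_global)
        for idx in patch_idx:
            coef = soft(D.T @ s_global[idx], lam)
            s_patch[idx] = D @ coef
        # 3) Update the reference with the patch-level solution.
        s_ref = s_patch
    return s_global, s_ref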
Recommendations for management of diabetes during Ramadan: update 2015
Ibrahim, Mahmoud; Abu Al Magd, Megahed; Annabi, Firas A; Assaad-Khalil, Samir; Ba-Essa, Ebtesam M; Fahdil, Ibtihal; Karadeniz, Sehnaz; Meriden, Terry; Misha'l, Aly A; Pozzilli, Paolo; Shera, Samad; Thomas, Abraham; Bahijri, Suhad; Tuomilehto, Jaakko; Yilmaz, Temel; Umpierrez, Guillermo E
2015-01-01
Since the first ADA working group report on the recommendations for management of diabetes during Ramadan in 2005 and our update in 2010, we have received many inquiries asking for regular updates on information regarding education, nutritional habits and new oral and injectable agents that may be useful for the management of patients with diabetes during Ramadan. Patients can be stratified by their risk of hypoglycemia and/or complications prior to the start of the fasting period of Ramadan. Those at high risk of hypoglycemia and with multiple diabetic complications should be advised against prolonged fasting. Even in the lower hypoglycemia-risk group, adverse effects may still occur. In order to minimize adverse side effects during fasting in patients with diabetes and to improve or maintain glucose control, education and discussion of glucose monitoring and treatment regimens should occur several weeks prior to Ramadan. Agents such as metformin, thiazolidinediones and dipeptidyl peptidase-4 inhibitors appear to be safe and do not need dose adjustment. Most sulfonylureas should be used during Ramadan only with extreme caution; in addition, older agents, such as chlorpropamide or glyburide, should not be used. Reduction of the sulfonylurea dosage is needed depending on the degree of control prior to fasting. Misconceptions and local habits should be addressed and dealt with in any educational intervention and therapeutic planning with patients with diabetes. In this regard, controlled prospective studies of the efficacy and safety of the different interventions during the Ramadan fast are still needed. PMID:26113983
Comparative Study of foF2 Measurements with IRI-2007 Model Predictions During Extended Solar Minimum
NASA Technical Reports Server (NTRS)
Zakharenkova, I. E.; Krankowski, A.; Bilitza, D.; Cherniak, Iu.V.; Shagimuratov, I.I.; Sieradzki, R.
2013-01-01
The unusually deep and extended solar minimum of cycle 23/24 made it very difficult to predict the solar indices 1 or 2 years into the future. Most of the predictions were proven wrong by the actual observed indices. IRI gets its solar, magnetic, and ionospheric indices from an indices file that is updated twice a year. In recent years, due to the unusual solar minimum, predictions had to be corrected downward with every new indices update. In this paper we analyse how much the uncertainties in the predictability of solar activity indices affect the IRI outcome and how the IRI values calculated with predicted and observed indices compare to the actual measurements. Monthly median values of the F2-layer critical frequency (foF2) derived from ionosonde measurements at the mid-latitude ionospheric station Juliusruh were compared with the International Reference Ionosphere (IRI-2007) model predictions. The analysis found that IRI provides reliable results that compare well with actual measurements when the definite (observed and adjusted) indices of solar activity are used, while IRI values based on earlier predictions of these indices noticeably overestimated the measurements during the solar minimum. One of the principal objectives of this paper is to direct the attention of IRI users to updating their solar activity indices files regularly. Use of an older index file can lead to serious IRI overestimations of F-region electron density during the recent extended solar minimum.
NASA Astrophysics Data System (ADS)
Schumacher, Florian; Friederich, Wolfgang
Due to increasing computational resources, the development of new numerically demanding methods and software for imaging Earth's interior remains of high interest in Earth sciences. Here, we give a description, from a user's and programmer's perspective, of the highly modular, flexible and extendable software package ASKI (Analysis of Sensitivity and Kernel Inversion), recently developed for iterative scattering-integral-based seismic full waveform inversion. In ASKI, the three fundamental steps of solving the seismic forward problem, computing waveform sensitivity kernels and deriving a model update are solved by independent software programs that interact via file output/input only. Furthermore, the spatial discretizations of the model space used for solving the seismic forward problem and for deriving model updates, respectively, are kept completely independent. For this reason, ASKI does not contain a specific forward solver but instead provides a general interface to established community wave propagation codes. Moreover, the third fundamental step of deriving a model update can be repeated at relatively low cost, applying different kinds of model regularization or re-selecting/weighting the inverted dataset, without the need to re-solve the forward problem or re-compute the kernels. Additionally, ASKI offers the user sensitivity and resolution analysis tools based on the full sensitivity matrix and allows the user to compose customized workflows in a consistent computational environment. ASKI is written in modern Fortran and Python, it is well documented and freely available under the terms of the GNU General Public License (http://www.rub.de/aski).
Efficient dynamic graph construction for inductive semi-supervised learning.
Dornaika, F; Dahbi, R; Bosaghzadeh, A; Ruichek, Y
2017-10-01
Most graph construction techniques assume a transductive setting in which the whole data collection is available at construction time. Addressing graph construction for the inductive setting, in which data arrive sequentially, has received much less attention. For inductive settings, constructing the graph from scratch can be very time consuming. This paper introduces a generic framework that is able to make any graph construction method incremental. This framework yields an efficient and dynamic graph construction method that adds new samples (labeled or unlabeled) to a previously constructed graph. As a case study, we use the recently proposed Two Phase Weighted Regularized Least Square (TPWRLS) graph construction method. The paper has two main contributions. First, we use the TPWRLS coding scheme to represent new sample(s) with respect to an existing database. The representative coefficients are then used to update the graph affinity matrix. The proposed method not only appends the new samples to the graph but also updates the whole graph structure by discovering which nodes are affected by the introduction of new samples and by updating their edge weights. The second contribution of the article is the application of the proposed framework to the problem of graph-based label propagation using multiple observations for vision-based recognition tasks. Experiments on several image databases show that, without any significant loss in the accuracy of the final classification, the proposed dynamic graph construction is more efficient than batch graph construction. Copyright © 2017 Elsevier Ltd. All rights reserved.
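The incremental step can be illustrated generically: code the incoming sample over the existing database and use the (normalized) coefficient magnitudes as the weights of its new edges. The ridge-regression coding below is only a stand-in for the TPWRLS coding, and the re-weighting of previously existing edges described in the paper is omitted.

import numpy as np

def add_sample(W, X, x_new, gamma=1e-2):
    """Append one sample to an affinity matrix W built over the columns
    of X (features x samples). Returns the enlarged (n+1) x (n+1) matrix."""
    n = X.shape[1]
    # Code the new sample over the existing database (ridge stand-in).
    c = np.linalg.solve(X.T @ X + gamma * np.eye(n), X.T @ x_new)
    a = np.abs(c)
    a /= a.max() + 1e-12                      # normalized edge weights
    W_new = np.zeros((n + 1, n + 1))
    W_new[:n, :n] = W                         # keep the existing graph
    W_new[:n, n] = a                          # new column ...
    W_new[n, :n] = a                          # ... and symmetric new row
    return W_new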
Estcourt, Lise J; Fortin, Patricia M; Hopewell, Sally; Trivella, Marialena; Hambleton, Ian R; Cho, Gavin
2016-01-01
Background Sickle cell disease is a genetic haemoglobin disorder, which can cause severe pain, significant end-organ damage, pulmonary complications, and premature death. Sickle cell disease is one of the most common severe monogenic disorders in the world, due to the inheritance of two abnormal haemoglobin (beta globin) genes. The two most common chronic chest complications due to sickle cell disease are pulmonary hypertension and chronic sickle lung disease. These complications can lead to morbidity (such as reduced exercise tolerance) and increased mortality. This is an update of a Cochrane review first published in 2011 and updated in 2014. Objectives We wanted to determine whether trials involving people with sickle cell disease that compare regular long-term blood transfusion regimens with standard care, hydroxycarbamide (hydroxyurea), or any other drug treatment show differences in the following: mortality associated with chronic chest complications; severity of established chronic chest complications; development and progression of chronic chest complications; serious adverse events. Search methods We searched the Cochrane Cystic Fibrosis and Genetic Disorders Group’s Haemoglobinopathies Trials Register. Date of the last search: 25 April 2016. We also searched for randomised controlled trials in the Cochrane Central Register of Controlled Trials (CENTRAL) (The Cochrane Library, Issue 1, 26 January 2016), MEDLINE (from 1946), Embase (from 1974), CINAHL (from 1937), the Transfusion Evidence Library (from 1950), and ongoing trial databases to 26 January 2016. Selection criteria We included randomised controlled trials of people of any age with one of four common sickle cell disease genotypes, i.e. Hb SS, Sß0, SC, or Sß+, that compared regular red blood cell transfusion regimens (either simple or exchange transfusions) to hydroxycarbamide, any other drug treatment, or standard care, and that were aimed at reducing the development or progression of chronic chest complications (chronic sickle lung and pulmonary hypertension). Data collection and analysis We used the standard methodological procedures expected by Cochrane. Main results No studies matching the selection criteria were found. Authors’ conclusions There is a need for randomised controlled trials looking at the role of long-term transfusion therapy in pulmonary hypertension and chronic sickle lung disease. Due to the chronic nature of the conditions, such trials should aim to use a combination of objective and subjective measures to assess participants repeatedly before and after the intervention. PMID:27198469
Time-invariant component-based normalization for a simultaneous PET-MR scanner.
Belzunce, M A; Reader, A J
2016-05-07
Component-based normalization is a method used to compensate for the sensitivity of each of the lines of response acquired in positron emission tomography. This method consists of modelling the sensitivity of each line of response as a product of multiple factors, which can be classified as time-invariant, time-variant and acquisition-dependent components. Typical time-variant factors are the intrinsic crystal efficiencies, which need to be updated by a regular normalization scan. Failure to do so would in principle generate artifacts in the reconstructed images due to the use of out-of-date time-variant factors. For this reason, an assessment of the variability and the impact of the crystal efficiencies in the reconstructed images is important to determine the frequency needed for the normalization scans, as well as to estimate the error obtained when an inappropriate normalization is used. Furthermore, if the fluctuations of these components are low enough, they could be neglected and nearly artifact-free reconstructions become achievable without performing a regular normalization scan. In this work, we analyse the impact of the time-variant factors in the component-based normalization used in the Biograph mMR scanner, but the work is applicable to other PET scanners. These factors are the intrinsic crystal efficiencies and the axial factors. For the latter, we propose a new method to obtain fixed axial factors that was validated with simulated data. Regarding the crystal efficiencies, we assessed their fluctuations during a period of 230 d and we found that they had good stability and low dispersion. We studied the impact of not including the intrinsic crystal efficiencies in the normalization when reconstructing simulated and real data. Based on this assessment and using the fixed axial factors, we propose the use of a time-invariant normalization that is able to achieve comparable results to the standard, daily updated, normalization factors used in this scanner. Moreover, to extend the analysis to other scanners, we generated distributions of crystal efficiencies with greater fluctuations than those found in the Biograph mMR scanner and evaluated their impact in simulations with a wide variety of noise levels. An important finding of this work is that a regular normalization scan is not needed in scanners with photodetectors with relatively low dispersion in their efficiencies.
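In generic form, the component-based model writes the normalization factor of the line of response between crystals i and j as a product of components, for example

n_{ij} = \varepsilon_i \, \varepsilon_j \cdot g^{\mathrm{geom}}_{ij} \cdot a_{r(i,j)} \cdot d_{ij}(t),

where \varepsilon are the intrinsic (time-variant) crystal efficiencies, g^{geom} the time-invariant geometric factors, a the axial factor for the ring pair r(i,j), and d the acquisition-dependent terms (e.g. dead time). The component set and notation vary by scanner; this factorization is illustrative and is not claimed to be the Biograph mMR's exact model.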
Predicting silicon pore optics
NASA Astrophysics Data System (ADS)
Vacanti, Giuseppe; Barriére, Nicolas; Bavdaz, Marcos; Chatbi, Abdelhakim; Collon, Maximilien; Dekker, Danielle; Girou, David; Günther, Ramses; van der Hoeven, Roy; Landgraf, Boris; Sforzini, Jessica; Vervest, Mark; Wille, Eric
2017-09-01
Continuing improvement of Silicon Pore Optics (SPO) calls for regular extension and validation of the tools used to model and predict their X-ray performance. In this paper we present an updated geometrical model for the SPO optics and describe how we make use of the surface metrology collected during each of the SPO manufacturing runs. The new geometrical model affords the user a finer degree of control over the mechanical details of the SPO stacks, while a standard interface has been developed to make use of any type of metrology that can return changes in the local surface normal of the reflecting surfaces. Comparisons between the predicted and actual performance of sample optics will be shown and discussed.
Far-infrared heterodyne spectrometer
NASA Technical Reports Server (NTRS)
Boreiko, Rita T.; Betz, Al L.
1995-01-01
A far-infrared heterodyne spectrometer was designed and built by our group for observations of atomic and molecular lines from interstellar clouds. Linewidths as narrow as 1 km/s can be expected from such regions, and so the spectrometer is designed with sub-km/s resolution so that observed line profiles will be resolved. Since its debut on the Kuiper Airborne Observatory (KAO) in 1985, the instrument has been used in regular annual flight programs from both Moffett Field, CA and Christchurch, NZ. The basic plan of the spectrometer remains unchanged from the original design presented at the previous airborne science symposium. Numerous improvements and updates to the technical capability have of course been included over the many years of operational service.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cletcher, J.W.
1995-10-01
This is a regular report of summary statistics relating to recent reactor shutdown experience. The information includes both the number of events and rates of occurrence. It was compiled from data about operating events that were entered into the SCSS data system by the Nuclear Operations Analysis Center at the Oak Ridge National Laboratory and covers the six-month period of July 1 to December 31, 1994. Cumulative information, starting from May 1, 1994, is also reported. Updates on shutdown events included in earlier reports are excluded. Information on shutdowns as a function of reactor power at the time of the shutdown is given for both BWR and PWR reactors. Data are also broken down by shutdown type and reactor age.
NASA Technical Reports Server (NTRS)
Coppin, Ann
2013-01-01
For a number of years, ongoing bibliographies of various JPL missions (AIRS, ASTER, Cassini, GRACE, Earth Science, Mars Exploration Rovers (Spirit & Opportunity)) have been compiled by the JPL Library. Mission-specific bibliographies are compiled by the Library and sent to mission scientists and managers in the form of regular (usually quarterly) updates. Charts showing publications by year are periodically provided to the ASTER, Cassini, and GRACE missions to support Senior Review/ongoing funding requests, and on other occasions as a measure of the impact of the missions. The Web of Science, Compendex, and sometimes the Inspec, GeoRef, and Aerospace databases are searched for the mission name in the title, abstract, and assigned keywords. All records are coded to indicate which are refereed journal publications.
Zeinali Sehrig, Fatemeh; Majidi, Sima; Asvadi, Sahar; Hsanzadeh, Arash; Rasta, Seyed Hossein; Emamverdy, Masumeh; Akbarzadeh, Jamshid; Jahangiri, Sahar; Farahkhiz, Shahrzad; Akbarzadeh, Abolfazl
2016-11-01
Today, technologies based on magnetic nanoparticles (MNPs) are regularly applied to biological systems with diagnostic or therapeutic aims. Nanoparticles made of the elements iron (Fe), gadolinium (Gd) or manganese (Mn) are generally used in many diagnostic applications performed under magnetic resonance imaging (MRI). Similar to molecular-based contrast agents, nanoparticles can be used to increase the resolution of imaging while offering good biocompatibility, low toxicity and favourable biodistribution. Application of MNPs enhances MRI sensitivity owing to the accumulation of iron in the liver caused by the discriminating action of the hepatobiliary system. This study reviews the use, properties and advantages of MNPs in MRI.
An Update on Statistical Boosting in Biomedicine.
Mayr, Andreas; Hofner, Benjamin; Waldmann, Elisabeth; Hepp, Tobias; Meyer, Sebastian; Gefeller, Olaf
2017-01-01
Statistical boosting algorithms have triggered a lot of research during the last decade. They combine a powerful machine learning approach with classical statistical modelling, offering various practical advantages like automated variable selection and implicit regularization of effect estimates. They are extremely flexible, as the underlying base-learners (regression functions defining the type of effect for the explanatory variables) can be combined with any kind of loss function (target function to be optimized, defining the type of regression setting). In this review article, we highlight the most recent methodological developments on statistical boosting regarding variable selection, functional regression, and advanced time-to-event modelling. Additionally, we provide a short overview on relevant applications of statistical boosting in biomedicine.
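As a concrete illustration of the ideas summarised above, the following is a minimal sketch of component-wise L2 boosting with simple univariate linear base-learners; the step size, iteration count and toy data are assumptions, and real analyses would rely on established packages (e.g. mboost in R) rather than this sketch.

import numpy as np

def l2_boost(X, y, n_iter=200, nu=0.1):
    # Component-wise L2 boosting: in each iteration, fit every candidate
    # base-learner (one coefficient per variable) to the current residuals
    # and update only the best-fitting component, shrunk by the step size nu.
    n, p = X.shape
    intercept = y.mean()
    beta = np.zeros(p)
    resid = y - intercept
    for _ in range(n_iter):
        coefs = X.T @ resid / np.sum(X**2, axis=0)
        rss = [np.sum((resid - X[:, j] * coefs[j])**2) for j in range(p)]
        j_best = int(np.argmin(rss))
        beta[j_best] += nu * coefs[j_best]          # implicit regularization via shrinkage
        resid = y - intercept - X @ beta
    return intercept, beta

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 10))
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + 0.1 * rng.standard_normal(100)
b0, b = l2_boost(X, y)   # informative variables dominate the selected components

Early stopping (choosing n_iter, for example by cross-validation) is what provides the automated variable selection and shrinkage mentioned in the abstract.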
Cuthbertson, Andrew G. S.
2013-01-01
The sweetpotato whitefly Bemisia tabaci (Gennadius) (Hemiptera: Aleyrodidae) continues to be a serious threat to crops worldwide. The UK holds Protected Zone status against this pest and, as a result, B. tabaci entering on plant material is subjected to a policy of eradication. Both B and Q Bemisia biotypes are now regularly intercepted entering the UK. With increasing reports of neonicotinoid resistance in both these biotypes, it is becoming more problematic to control or eradicate. Therefore, alternative means of control are necessary. Entomopathogenic fungi (Lecanicillium muscarium and Beauveria bassiana) offer much potential as control agents of B. tabaci within eradication programmes in the UK. PMID:26464385
Recommendations for accreditation of laboratories in molecular biology of hematologic malignancies.
Flandrin-Gresta, Pascale; Cornillet, Pascale; Hayette, Sandrine; Gachard, Nathalie; Tondeur, Sylvie; Mauté, Carole; Cayuela, Jean-Michel
2015-01-01
Over recent years, the development of molecular biology techniques has improved the diagnosis and follow-up of hematological diseases. Consequently, these techniques are widely used in the biological screening of these diseases; therefore, hemato-oncology molecular diagnostics laboratories must be actively involved in the accreditation process according to the ISO 15189 standard. The French group of molecular biologists (GBMHM) provides requirements for the implementation of quality assurance in medical molecular laboratories. This guideline states the recommendations for the pre-analytical, analytical (method validation procedures, quality controls, reagents), and post-analytical conditions. In addition, we describe a strategy for internal quality control management. These recommendations will be regularly updated.
SU-E-T-446: Group-Sparsity Based Angle Generation Method for Beam Angle Optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gao, H
2015-06-15
Purpose: This work develops an effective algorithm for beam angle optimization (BAO), with the emphasis on enabling further improvement over existing treatment-dependent templates based on clinical knowledge and experience. Methods: The proposed BAO algorithm utilizes a priori beam angle templates as the initial guess and iteratively generates angular updates for this initial set, namely the angle generation method, with improved dose conformality as quantitatively measured by the objective function. During each iteration, we select “the test angle” in the initial set and use group-sparsity based fluence map optimization to identify “the candidate angle” for updating “the test angle”: all the angles in the initial set except “the test angle”, namely “the fixed set”, are set free, i.e., with no group-sparsity penalty, and the rest of the angles, including “the test angle”, form “the working set” during this iteration. “The candidate angle” is then selected as the angle in “the working set” with locally maximal group sparsity and the smallest objective function value, and it replaces “the test angle” if “the fixed set” with “the candidate angle” yields a smaller objective function value when solving the standard fluence map optimization (with no group-sparsity regularization). Similarly, the other angles in the initial set are in turn selected as “the test angle” for angular updates, and this chain of updates is iterated until no further new angular update is identified over a full loop. Results: Tests using the MGH public prostate dataset demonstrated the effectiveness of the proposed BAO algorithm. For example, the optimized angular set from the proposed BAO algorithm was better than the MGH template. Conclusion: A new BAO algorithm is proposed based on the angle generation method via group sparsity, with improved dose conformality over the given template. Hao Gao was partially supported by the NSFC (#11405105), the 973 Program (#2015CB856000) and the Shanghai Pujiang Talent Program (#14PJ1404500)
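To make the group-sparsity idea concrete, here is a minimal proximal-gradient sketch of a group-lasso penalized fluence map optimization, in which each candidate beam's fluence is one group and beams whose whole block shrinks to zero are effectively excluded; the dose-influence matrices, prescription vector, penalty weight and the absence of clinical constraints (e.g. non-negative fluence) are simplifying assumptions, not the algorithm described in the abstract.

import numpy as np

def group_sparse_fmo(A_list, d, lam=1.0, n_iter=500):
    # Minimize 0.5*||sum_b A_b x_b - d||^2 + lam * sum_b ||x_b||_2 by ISTA,
    # where each block x_b is the fluence of one candidate beam.
    A = np.hstack(A_list)
    sizes = [Ab.shape[1] for Ab in A_list]
    idx = np.cumsum([0] + sizes)
    step = 1.0 / np.linalg.norm(A, 2)**2          # 1 / Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = x - step * (A.T @ (A @ x - d))        # gradient step on the data term
        for b in range(len(A_list)):              # block soft-thresholding (group prox)
            g = z[idx[b]:idx[b + 1]]
            norm_g = np.linalg.norm(g)
            scale = max(0.0, 1.0 - step * lam / norm_g) if norm_g > 0 else 0.0
            x[idx[b]:idx[b + 1]] = scale * g
    return [x[idx[b]:idx[b + 1]] for b in range(len(A_list))]

rng = np.random.default_rng(2)
beams = [rng.standard_normal((50, 8)) for _ in range(6)]   # toy dose-influence matrices
d = rng.standard_normal(50)                                # toy prescription
fluences = group_sparse_fmo(beams, d, lam=5.0)             # weak blocks shrink toward (or exactly to) zero

In the procedure described above, the angles whose groups survive such a penalized solve play the role of candidate angles, which are then re-evaluated with the standard (unpenalized) fluence map optimization.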
Literature searches on Ayurveda: An update.
Aggithaya, Madhur G; Narahari, Saravu R
2015-01-01
The journals that publish on Ayurveda have increasingly been indexed by popular medical databases in recent years. However, many Eastern journals are not indexed in biomedical journal databases such as PubMed. Literature searches for Ayurveda continue to be challenging due to the nonavailability of active, unbiased, dedicated databases for Ayurvedic literature. In 2010, the authors identified 46 databases that can be used for systematic searches of Ayurvedic papers and theses. This update reviewed our previous recommendation and identified current and relevant databases. The aim was to update the Ayurveda literature search strategy so as to retrieve the maximum number of publications. The authors used psoriasis as an example to search the previously listed databases and to identify new ones. The population, intervention, control, and outcome table included keywords related to psoriasis and Ayurvedic terminologies for skin diseases. The current citation update status, search results, and search options of the previous databases were assessed. Eight search strategies were developed. One hundred and five journals, both biomedical and Ayurveda, which publish on Ayurveda, were identified. Variability in the databases was explored to identify bias in journal citation. Five of the 46 databases are now relevant - AYUSH research portal, Annotated Bibliography of Indian Medicine, Digital Helpline for Ayurveda Research Articles (DHARA), PubMed, and Directory of Open Access Journals. Search options in these databases are not uniform, and only PubMed allows a complex search strategy. "The Researches in Ayurveda" and "Ayurvedic Research Database" (ARD) are important grey resources for hand searching. About 44/105 (41.5%) journals publishing Ayurvedic studies are not indexed in any database. Only 11/105 (10.4%) exclusive Ayurveda journals are indexed in PubMed. The AYUSH research portal and DHARA are the two major portals established after 2010. It is mandatory to search PubMed and the four other databases because all five carry citations from different groups of journals. Hand searching is important to identify Ayurveda publications that are not indexed elsewhere. Availability information for citations in Ayurveda libraries from the National Union Catalogue of Scientific Serials in India, if regularly updated, will improve the efficacy of hand searching. A grey database (ARD) contains unpublished PG/Ph.D. theses. The AYUSH portal, DHARA (funded by the Ministry of AYUSH), and ARD should be merged to form a single larger database to simplify Ayurveda literature searches.
Strategies to assess the validity of recommendations: a study protocol
2013-01-01
Background Clinical practice guidelines (CPGs) become quickly outdated and require a periodic reassessment of the research evidence to maintain their validity. However, there is little research on this topic. Our project will provide evidence for some of the most pressing questions in this field: 1) what is the average time for recommendations to become out of date?; 2) what is the comparative performance of two restricted search strategies to evaluate the need to update recommendations?; and 3) what is the feasibility of a more regular monitoring and updating strategy compared to usual practice? In this protocol we will focus on questions one and two. Methods The CPG Development Programme of the Spanish Ministry of Health developed 14 CPGs between 2008 and 2009. We will stratify guidelines by topic and by publication year, and include one CPG per stratum. We will develop a strategy to assess the validity of CPG recommendations, which includes a baseline survey of clinical experts, an update of the original exhaustive literature searches, the identification of key references (references that trigger a potential recommendation update), and the assessment of the potential changes in each recommendation. We will run two alternative search strategies to efficiently identify important new evidence: 1) a PLUS search based on the McMaster Premium LiteratUre Service (PLUS) database; and 2) a Restrictive Search (ReSe) based on the least number of MeSH terms and free-text words needed to locate all the references of each original recommendation. We will perform a survival analysis of recommendations using the Kaplan-Meier method and use the log-rank test to analyse differences between survival curves according to the topic, the purpose, the strength of recommendations and the turnover. We will retrieve key references from the exhaustive search and evaluate their presence in the PLUS and ReSe search results. Discussion Our project, using a highly structured and transparent methodology, will provide guidance on when recommendations are likely to be at risk of being out of date. We will also assess two novel restrictive search strategies which could reduce the workload without compromising rigour when CPG developers check for the need for updating. PMID:23967896
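For question one, the survival analysis mentioned in the Methods can be sketched with a small, self-contained Kaplan-Meier estimator; the durations (time until a recommendation is judged out of date) and censoring flags below are invented toy values, and an actual analysis would use a validated statistics package.

import numpy as np

def kaplan_meier(durations, events):
    # durations : time (e.g. months) until a recommendation became outdated or was censored
    # events    : 1 if the recommendation became outdated, 0 if still valid at last follow-up
    durations = np.asarray(durations, dtype=float)
    events = np.asarray(events, dtype=int)
    event_times = np.unique(durations[events == 1])
    surv, s = [], 1.0
    for t in event_times:
        at_risk = np.sum(durations >= t)                 # recommendations still "alive" at t
        d = np.sum((durations == t) & (events == 1))     # recommendations outdated at t
        s *= 1.0 - d / at_risk
        surv.append(s)
    return event_times, np.array(surv)

# Toy data: months of validity for ten recommendations (0 = still valid, i.e. censored).
months = [6, 12, 12, 18, 24, 30, 36, 36, 48, 60]
outdated = [1, 1, 0, 1, 1, 0, 1, 0, 1, 0]
times, S = kaplan_meier(months, outdated)   # S[i] estimates the probability of still being valid after times[i]

A log-rank test comparing such curves across guideline topics would then address whether some topics become outdated faster than others.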
A general framework for regularized, similarity-based image restoration.
Kheradmand, Amin; Milanfar, Peyman
2014-12-01
Any image can be represented as a function defined on a weighted graph, in which the underlying structure of the image is encoded in kernel similarity and associated Laplacian matrices. In this paper, we develop an iterative graph-based framework for image restoration based on a new definition of the normalized graph Laplacian. We propose a cost function, which consists of a new data fidelity term and a regularization term derived from the specific definition of the normalized graph Laplacian. The normalizing coefficients used in the definition of the Laplacian and the associated regularization term are obtained using fast symmetry-preserving matrix balancing. This results in desirable spectral properties for the normalized Laplacian, such as being symmetric, positive semidefinite, and returning the zero vector when applied to a constant image. Our algorithm comprises outer and inner iterations: in each outer iteration, the similarity weights are recomputed using the previous estimate and the updated objective function is minimized using inner conjugate gradient iterations. This procedure improves the performance of the algorithm for image deblurring, where we do not have access to a good initial estimate of the underlying image. In addition, the specific form of the cost function allows us to carry out a spectral analysis of the solutions of the corresponding linear equations. Moreover, the proposed approach is general in the sense that we have shown its effectiveness for different restoration problems, including deblurring, denoising, and sharpening. Experimental results verify the effectiveness of the proposed algorithm on both synthetic and real examples.
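A generic, small-scale illustration of Laplacian-regularized restoration of this kind is given below; it builds a kernel similarity matrix for a 1-D toy signal, applies a simple symmetric (Sinkhorn-style) balancing so that the kernel becomes approximately doubly stochastic, forms L = I - K, and solves a single regularized least-squares problem with conjugate gradients. The cost function, kernel bandwidth and toy data are assumptions; the paper's actual framework uses a different data fidelity term and re-estimates the weights in outer iterations.

import numpy as np
from scipy.sparse.linalg import cg

def similarity_graph(y, sigma=0.1):
    # Dense kernel similarity between samples of a small 1-D signal (toy scale only).
    d2 = (y[:, None] - y[None, :])**2
    return np.exp(-d2 / (2.0 * sigma**2))

def sinkhorn_balance(W, n_iter=50):
    # Symmetric balancing so that rows and columns sum to approximately one.
    K = W.copy()
    for _ in range(n_iter):
        r = K.sum(axis=1)
        K = K / np.sqrt(r)[:, None] / np.sqrt(r)[None, :]
    return K

def restore(y, beta=0.5, sigma=0.1):
    # Solve (I + beta*L) x = y with L = I - K; the system is symmetric positive
    # definite, so conjugate gradients (the "inner iterations") apply directly.
    K = sinkhorn_balance(similarity_graph(y, sigma))
    n = y.size
    A = np.eye(n) + beta * (np.eye(n) - K)
    x, _ = cg(A, y, atol=1e-8)
    return x

rng = np.random.default_rng(3)
clean = np.sin(np.linspace(0.0, 3.0 * np.pi, 80))
noisy = clean + 0.2 * rng.standard_normal(80)
denoised = restore(noisy)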
Manifold regularized discriminative nonnegative matrix factorization with fast gradient descent.
Guan, Naiyang; Tao, Dacheng; Luo, Zhigang; Yuan, Bo
2011-07-01
Nonnegative matrix factorization (NMF) has become a popular data-representation method and has been widely used in image processing and pattern-recognition problems. This is because the learned bases can be interpreted as a natural parts-based representation of data, and this interpretation is consistent with the psychological intuition of combining parts to form a whole. For practical classification tasks, however, NMF ignores both the local geometry of the data and the discriminative information of different classes. In addition, existing research results show that the learned basis is not necessarily parts-based, because there is neither an explicit nor an implicit constraint to ensure that the representation is parts-based. In this paper, we introduce manifold regularization and margin maximization to NMF and obtain the manifold regularized discriminative NMF (MD-NMF) to overcome the aforementioned problems. The multiplicative update rule (MUR) can be applied to optimize MD-NMF, but it converges slowly. We therefore propose a fast gradient descent (FGD) to optimize MD-NMF. FGD contains a Newton method that searches for the optimal step length, and thus FGD converges much faster than MUR. In addition, FGD includes MUR as a special case and can be applied to optimize NMF and its variants. For a problem with 165 samples in R(1600), FGD converges in 28 s, while MUR requires 282 s. We also apply FGD to a variant of MD-NMF, and experimental results confirm its efficiency. Experimental results on several face image datasets suggest the effectiveness of MD-NMF.
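For reference, the baseline multiplicative update rule (MUR) that MD-NMF inherits, and that FGD is designed to outperform, can be written in a few lines; the sketch below is plain NMF under the Frobenius loss and omits the manifold and margin terms of MD-NMF, with the rank, iteration count and toy data chosen arbitrarily.

import numpy as np

def nmf_mur(V, rank, n_iter=200, eps=1e-9, seed=0):
    # Lee-Seung multiplicative updates minimizing ||V - W H||_F^2.
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, rank)) + eps
    H = rng.random((rank, m)) + eps
    for _ in range(n_iter):
        # Each factor is rescaled by the ratio of the negative and positive parts
        # of its gradient, which keeps W and H non-negative throughout.
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

V = np.abs(np.random.default_rng(4).standard_normal((64, 40)))
W, H = nmf_mur(V, rank=5)

The slow, step-size-free rescaling visible here is exactly what motivates replacing MUR with a gradient method that searches for an optimal step length.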
A blind deconvolution method based on L1/L2 regularization prior in the gradient space
NASA Astrophysics Data System (ADS)
Cai, Ying; Shi, Yu; Hua, Xia
2018-02-01
In image restoration, the restored result can differ greatly from the true image because of noise. To address this ill-posed problem, a blind deconvolution method based on an L1/L2 regularization prior in the gradient domain is proposed. The method first introduces the ratio of the L1 norm to the L2 norm as prior knowledge and uses it as the penalty term in the high-frequency domain of the image. This function is then iteratively updated, and the iterative shrinkage-thresholding algorithm is applied to solve for the high-frequency image. Because gradient-domain information is better suited to blur-kernel estimation, the blur kernel is estimated in the gradient domain. This problem can be implemented quickly in the frequency domain by means of the fast Fourier transform. In addition, to improve the effectiveness of the algorithm, we add a multi-scale iterative optimization method. The proposed blind deconvolution method based on an L1/L2 regularization prior in the gradient space obtains a unique and stable solution during image restoration, preserving the edges and details of the image while ensuring the accuracy of the results.
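The shrinkage step at the heart of such solvers can be illustrated with a generic iterative shrinkage-thresholding (ISTA) sketch for a 1-D deconvolution with a known kernel and a plain l1 penalty; it conveys the soft-thresholding update but not the L1/L2-ratio prior, the blind kernel estimation or the multi-scale scheme, and the kernel, signal and parameters are toy assumptions.

import numpy as np

def ista_deconv(y, kernel, lam=0.05, n_iter=300):
    # ISTA for min_x 0.5*||K x - y||^2 + lam*||x||_1, where K applies a known 1-D blur.
    n = y.size
    # Build a dense convolution matrix for this toy-sized problem (column i = kernel * e_i).
    K = np.array([np.convolve(np.eye(n)[i], kernel, mode="same") for i in range(n)]).T
    step = 1.0 / np.linalg.norm(K, 2)**2                  # 1 / Lipschitz constant
    x = np.zeros(n)
    for _ in range(n_iter):
        z = x - step * (K.T @ (K @ x - y))                # gradient step on the data term
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)   # soft-thresholding (shrinkage)
    return x

rng = np.random.default_rng(5)
truth = np.zeros(128)
truth[[20, 50, 90]] = [1.0, -0.7, 0.5]                    # sparse toy signal
kernel = np.array([0.25, 0.5, 0.25])
y = np.convolve(truth, kernel, mode="same") + 0.01 * rng.standard_normal(128)
x_hat = ista_deconv(y, kernel)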
An update on new antibiotic prophylaxis and treatment for urinary tract infections in children.
Delbet, Jean Daniel; Lorrot, Mathie; Ulinski, Tim
2017-10-01
This review focuses on the treatment of urinary tract infections (UTI) in children and in particular its recent changes. Areas covered: Acute pyelonephritis, acute cystitis and asymptomatic bacteriuria or asymptomatic infections have to be clearly distinguished. Prompt treatment is required for pyelonephritis and cystitis, but not for asymptomatic bacteriuria or infection, in order to avoid selection of more virulent strains. This concept should be considered even in immunocompromised or bedridden children. In the case of pyelonephritis, there should be no delay in beginning antibiotic treatment, in order to decrease the risk of long-term complications such as renal scars. Predisposing conditions for UTI, such as voiding anomalies and urinary tract malformations, should be carefully evaluated. Expert opinion: One major concern is the increasing resistance to 3rd-generation cephalosporins. Therefore, overconsumption in low-risk settings should be absolutely avoided. The prevalence of infections with E. coli producing extended-spectrum ß-lactamase (ESBL) is increasing, and pediatricians should be aware of the specific treatment options. Any recommendation about (initial) antibiotic treatment should be regularly updated and adapted to local resistance profiles and to economic factors in different health systems.
The Canterbury Charity Hospital: an update (2010-2012) and effects of the earthquakes.
Bagshaw, Philip F; Maimbo-M'siska, Miriam; Nicholls, M Gary; Shaw, Carl G; Allardyce, Randall A; Bagshaw, Susan N; McNabb, Angela L; Johnson, Stuart S; Frampton, Christopher M; Stokes, Brian W
2013-11-22
To update the activities of the Canterbury Charity Hospital (CCH) and its Trust over the 3 years 2010-2012, during which the devastating Christchurch earthquakes occurred. Patients' treatments, the establishment of new services, expansion of the CCH, staffing and finances were reviewed. Previously established services, including general surgery, continued as before; some services such as ophthalmology declined; and new services were established, including colonoscopy, dentistry and some gynaecological procedures; counselling was provided following the earthquakes. Teaching and research endeavours increased. An adjacent property was purchased and renovated to accommodate the expansion. The Trust became financially self-sustaining in 2010; running costs of $340,000/year were maintained but were anticipated to increase soon. Of the money generously donated by the community to the Trust, 82% went directly to patient care. Although not formally recorded, hundreds of appointment requests were rejected because of service unavailability or unmet referral criteria. This 3-year review highlights substantial, undocumented unmet healthcare needs in the region, which were exacerbated by the 2010/2011 earthquakes. We contend that the level of unmet healthcare need in Canterbury and throughout the country should be regularly documented to inform the planning of public healthcare services.
Arrieta, Francisco; Iglesias, Pedro; Pedro-Botet, Juan; Tébar, Francisco Javier; Ortega, Emilio; Nubiola, Andreu; Pardo, Jose Luis; Maldonado, Gonzálo Fernando; Obaya, Juan Carlos; Matute, Pablo; Petrecca, Romina; Alonso, Nuria; Sarabia, Elena; Sánchez-Margalet, Victor; Alemán, José Juan; Navarro, Jorge; Becerra, Antonio; Duran, Santiago; Aguilar, Manuel; Escobar-Jiménez, Fernando
2015-01-01
The present paper updates the Clinical Practice Recommendations for the management of cardiovascular risk factors (CVRF) in diabetes mellitus. This is a medical consensus agreed by an independent panel of experts from the Spanish Society of Diabetes (SED). Several consensuses have been proposed by scientific and medical societies to achieve clinical goals. However, risk scores for the general population may lack sensitivity for individual assessment or for particular groups at risk, such as diabetics. Traditional risk factors together with non-traditional factors are reviewed throughout this paper. Intervention strategies for managing CVRF in the diabetic patient are reviewed in detail: balanced food intake, weight reduction, physical exercise, smoking cessation, reduction in HbA1c, therapy for high blood pressure, obesity, lipid disorders, and platelet anti-aggregation. It is hoped that these guidelines will help clinicians in their clinical decision-making. This regular update by the SED Cardiovascular Disease Group presents the most relevant concepts, of practical and realistic clinical interest, in order to reduce the cardiovascular risk of diabetic patients. Copyright © 2014 Sociedad Española de Arteriosclerosis. Published by Elsevier España. All rights reserved.
Arrieta, Francisco; Iglesias, Pedro; Pedro-Botet, Juan; Tébar, Francisco Javier; Ortega, Emilio; Nubiola, Andreu; Pardo, Jose Luis; Maldonado, Gonzálo Fernando; Obaya, Juan Carlos; Matute, Pablo; Petrecca, Romina; Alonso, Nuria; Sarabia, Elena; Sánchez-Margalet, Victor; Alemán, José Juan; Navarro, Jorge; Becerra, Antonio; Duran, Santiago; Aguilar, Manuel; Escobar-Jiménez, Fernando
2016-05-01
The present paper updates the Clinical Practice Recommendations for the management of cardiovascular risk factors (CVRF) in diabetes mellitus. This is a medical consensus agreed by an independent panel of experts from the Spanish Society of Diabetes (SED). Several consensuses have been proposed by scientific and medical societies to achieve clinical goals. However, risk scores for the general population may lack sensitivity for individual assessment or for particular groups at risk, such as diabetics. Traditional risk factors together with non-traditional factors are reviewed throughout this paper. Intervention strategies for managing CVRF in the diabetic patient are reviewed in detail: balanced food intake, weight reduction, physical exercise, smoking cessation, reduction in HbA1c, therapy for high blood pressure, obesity, lipid disorders, and platelet anti-aggregation. It is hoped that these guidelines will help clinicians in their clinical decision-making. This regular update by the SED Cardiovascular Disease Group presents the most relevant concepts, of practical and realistic clinical interest, in order to reduce the cardiovascular risk of diabetic patients. Copyright © 2015. Publicado por Elsevier España, S.L.U.
Jali, Pramod K; Singh, Shamsher; Babaji, Prashant; Chaurasia, Vishwajit Rampratap; Somasundaram, P; Lau, Himani
2014-01-01
The internet is a useful tool for updating knowledge. The aim of the present study was to assess the current level of knowledge of computers and the internet among undergraduate dental students. The study consisted of a self-administered, close-ended questionnaire survey. Questionnaires were distributed to undergraduate dental students. The study was conducted during July to September 2012. Among the selected sample, the response rate was 100%. Most (94.4%) of the students had computer knowledge and 77.4% had their own computer and access at home. Nearly 40.8% of students used the computer for general purposes, 28.5% for entertainment and 22.8% for research purposes. Most of the students had internet knowledge (92.9%) and used it independently (79.1%). About 42.1% used the internet occasionally, whereas 34.4% used it regularly, 21.7% rarely and 1.8% not at all. The internet was preferred for getting information (48.8%) owing to its easy accessibility and recent updates. For dental purposes, students used the internet 2-3 times/week (45.3%). Most (95.3%) of the students were in favour of having a computer-based learning programme in the curriculum. Computer knowledge was observed to be good among dental students.
GANESH: software for customized annotation of genome regions.
Huntley, Derek; Hummerich, Holger; Smedley, Damian; Kittivoravitkul, Sasivimol; McCarthy, Mark; Little, Peter; Sergot, Marek
2003-09-01
GANESH is a software package designed to support the genetic analysis of regions of human and other genomes. It provides a set of components that may be assembled to construct a self-updating database of DNA sequence, mapping data, and annotations of possible genome features. Once one or more remote sources of data for the target region have been identified, all sequences for that region are downloaded, assimilated, and subjected to a (configurable) set of standard database-searching and genome-analysis packages. The results are stored in compressed form in a relational database, and are updated automatically on a regular schedule so that they are always immediately available in their most up-to-date versions. A Java front-end, executed as a stand-alone application or web applet, provides a graphical interface for navigating the database and for viewing the annotations. There are facilities for importing and exporting data in the format of the Distributed Annotation System (DAS), enabling a GANESH database to be used as a component of a DAS configuration. The system has been used to construct databases for about a dozen regions of human chromosomes and for three regions of mouse chromosomes.
Changing viewer perspectives reveals constraints to implicit visual statistical learning.
Jiang, Yuhong V; Swallow, Khena M
2014-10-07
Statistical learning, the learning of environmental regularities to guide behavior, likely plays an important role in natural human behavior. One potential use is in search for valuable items. Because visual statistical learning can be acquired quickly and without intention or awareness, it could optimize search and thereby conserve energy. For this to be true, however, visual statistical learning needs to be viewpoint invariant, facilitating search even when people walk around. To test whether implicit visual statistical learning of spatial information is viewpoint independent, we asked participants to perform a visual search task from variable locations around a monitor placed flat on a stand. Unbeknownst to participants, the target was more often in some locations than others. In contrast to previous research on stationary observers, visual statistical learning failed to produce a search advantage for targets in high-probability regions that were stable within the environment but variable relative to the viewer. This failure was observed even when conditions for spatial updating were optimized. However, learning was successful when the rich locations were referenced relative to the viewer. We conclude that changing viewer perspective disrupts implicit learning of the target's location probability. This form of learning shows limited integration with spatial updating or spatiotopic representations. © 2014 ARVO.
The Colorado River and its deposits downstream from Grand Canyon in Arizona, California, and Nevada
Crow, Ryan S.; Block, Debra L.; Felger, Tracey J.; House, P. Kyle; Pearthree, Philip A.; Gootee, Brian F.; Youberg, Ann M.; Howard, Keith A.; Beard, L. Sue
2018-02-05
Understanding the evolution of the Colorado River system has direct implications for (1) the processes and timing of continental-scale river system integration, (2) the formation of iconic landscapes like those in and around Grand Canyon, and (3) the availability of groundwater resources. Spatial patterns in the position and type of Colorado River deposits, only discernible through geologic mapping, can be used to test models related to Colorado River evolution. This is particularly true downstream from Grand Canyon where ancestral Colorado River deposits are well-exposed. We are principally interested in (1) regional patterns in the minimum and maximum elevation of each depositional unit, which are affected by depositional mechanism and postdepositional deformation; and (2) the volume of each unit, which reflects regional changes in erosion, transport efficiency, and accommodation space. The volume of Colorado River deposits below Grand Canyon has implications for groundwater resources, as the primary regional aquifer there is composed of those deposits. To this end, we are presently mapping Colorado River deposits and compiling and updating older mapping. This preliminary data release shows the current status of our mapping and compilation efforts. We plan to update it at regular intervals in conjunction with ongoing mapping.
Sequential Dictionary Learning From Correlated Data: Application to fMRI Data Analysis.
Seghouane, Abd-Krim; Iqbal, Asif
2017-03-22
Sequential dictionary learning via the K-SVD algorithm has been revealed as a successful alternative to conventional data-driven methods such as independent component analysis (ICA) for functional magnetic resonance imaging (fMRI) data analysis. fMRI datasets are, however, structured data matrices with notions of spatio-temporal correlation and temporal smoothness. This prior information has not been included in the K-SVD algorithm when applied to fMRI data analysis. In this paper we propose three variants of the K-SVD algorithm dedicated to fMRI data analysis that account for this prior information. The proposed algorithms differ from the K-SVD in their sparse coding and dictionary update stages. The first two algorithms account for the known correlation structure in the fMRI data by using the squared Q, R-norm instead of the Frobenius norm for matrix approximation. The third and last algorithm accounts for both the known correlation structure in the fMRI data and the temporal smoothness. The temporal smoothness is incorporated in the dictionary update stage via penalized regularization of the dictionary atoms. The performance of the proposed dictionary learning algorithms is illustrated through simulations and applications on real fMRI data.
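As a point of reference for the variants described above, the baseline K-SVD loop (orthogonal matching pursuit for sparse coding followed by per-atom rank-1 SVD updates) can be sketched compactly; the sketch uses scikit-learn's orthogonal_mp, omits the correlation-aware norms and the smoothness penalty of the proposed variants, and its dimensions, sparsity level and data are arbitrary toy choices.

import numpy as np
from sklearn.linear_model import orthogonal_mp

def ksvd(Y, n_atoms, sparsity, n_iter=10, seed=0):
    # Baseline K-SVD: alternate sparse coding (OMP) and dictionary update (per-atom SVD).
    rng = np.random.default_rng(seed)
    D = rng.standard_normal((Y.shape[0], n_atoms))
    D /= np.linalg.norm(D, axis=0)
    for _ in range(n_iter):
        X = orthogonal_mp(D, Y, n_nonzero_coefs=sparsity)      # sparse coding stage
        for k in range(n_atoms):                               # dictionary update stage
            users = np.nonzero(X[k, :])[0]                     # signals that use atom k
            if users.size == 0:
                continue
            # Residual with atom k's contribution removed, restricted to its users.
            E = Y[:, users] - D @ X[:, users] + np.outer(D[:, k], X[k, users])
            U, s, Vt = np.linalg.svd(E, full_matrices=False)
            D[:, k] = U[:, 0]                                  # rank-1 refit of the atom
            X[k, users] = s[0] * Vt[0, :]                      # and of its coefficients
    return D, X

Y = np.random.default_rng(6).standard_normal((30, 200))        # toy data matrix
D, X = ksvd(Y, n_atoms=40, sparsity=4)

The fMRI-specific variants modify exactly these two stages, replacing the Frobenius-norm fit with correlation-aware norms and penalizing the atoms for temporal smoothness.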
Pant, Jeevan K; Krishnan, Sridhar
2014-04-01
A new algorithm for the reconstruction of electrocardiogram (ECG) signals and a dictionary learning algorithm for the enhancement of its reconstruction performance for a class of signals are proposed. The signal reconstruction algorithm is based on minimizing the lp pseudo-norm of the second-order difference of the signal, called the lp(2d) pseudo-norm. The optimization involved is carried out using a sequential conjugate-gradient algorithm. The dictionary learning algorithm uses an iterative procedure wherein signal reconstruction and dictionary update steps are repeated until a convergence criterion is satisfied. The signal reconstruction step is implemented by using the proposed signal reconstruction algorithm, and the dictionary update step is implemented by using the linear least-squares method. Extensive simulation results demonstrate that the proposed algorithm yields improved reconstruction performance for temporally correlated ECG signals relative to the state-of-the-art lp(1d)-regularized least-squares and Bayesian-learning-based algorithms. Also, for a known class of signals, the reconstruction performance of the proposed algorithm can be improved by applying it in conjunction with a dictionary obtained using the proposed dictionary learning algorithm.
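To give a flavour of the objective described above, the following sketch recovers a signal from a few random measurements by minimizing a data misfit plus a smoothed lp pseudo-norm of the second-order difference; the epsilon-smoothing, the L-BFGS solver and all parameters are assumptions standing in for the paper's sequential conjugate-gradient scheme, and the data are synthetic rather than ECG recordings.

import numpy as np
from scipy.optimize import minimize

def second_diff(n):
    # (n-2) x n second-order difference operator.
    D = np.zeros((n - 2, n))
    for i in range(n - 2):
        D[i, i:i + 3] = [1.0, -2.0, 1.0]
    return D

def reconstruct(Phi, y, p=0.5, lam=0.1, eps=1e-6):
    n = Phi.shape[1]
    D = second_diff(n)

    def objective(x):
        r = Phi @ x - y
        d = D @ x
        f = 0.5 * r @ r + lam * np.sum((d**2 + eps)**(p / 2))      # smoothed |D x|^p penalty
        grad = Phi.T @ r + lam * (D.T @ (p * (d**2 + eps)**(p / 2 - 1) * d))
        return f, grad

    res = minimize(objective, np.zeros(n), jac=True, method="L-BFGS-B")
    return res.x

rng = np.random.default_rng(7)
x_true = np.cumsum(np.cumsum(0.05 * rng.standard_normal(100)))    # smooth toy signal
Phi = rng.standard_normal((40, 100)) / np.sqrt(40)                # compressive measurement matrix
y = Phi @ x_true
x_rec = reconstruct(Phi, y)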
Grissa, Ibtissem; Vergnaud, Gilles; Pourcel, Christine
2007-01-01
Background In Archaea and Bacteria, the repeated elements called CRISPRs, for "clustered regularly interspaced short palindromic repeats", are believed to participate in the defence against viruses. Short sequences called spacers are stored in between the repeated elements. In the current model, motifs comprising spacers and repeats may target an invading DNA and lead to its degradation through a proposed mechanism similar to RNA interference. Analysis of intra-species polymorphism shows that new motifs (one spacer and one repeated element) are added in a polarised fashion. Although their principal characteristics have been described, a lot remains to be discovered about the way CRISPRs are created and evolve. As new genome sequences become available, it appears necessary to develop automated scanning tools to make CRISPR-related information available and to facilitate additional investigations. Description We have produced a program, CRISPRFinder, which identifies CRISPRs and extracts the repeated and unique sequences. Using this software, a database is constructed which is automatically updated monthly from newly released genome sequences. Additional tools were created to allow the alignment of flanking sequences in search of similarities between different loci and to build dictionaries of unique sequences. To date, almost six hundred CRISPRs have been identified in 475 published genomes. Two Archaea out of thirty-seven and about half of the Bacteria do not possess a CRISPR. Fine analysis of repeated sequences strongly supports the current view that new motifs are added at one end of the CRISPR adjacent to the putative promoter. Conclusion It is hoped that the availability of a public database, regularly updated and queryable on the web, will help in further dissecting and understanding CRISPR structure and the evolution of flanking sequences. Subsequent analyses of intra-species CRISPR polymorphism will be facilitated by CRISPRFinder and the dictionary creator. CRISPRdb is accessible at PMID:17521438
flyDIVaS: A Comparative Genomics Resource for Drosophila Divergence and Selection
Stanley, Craig E.; Kulathinal, Rob J.
2016-01-01
With arguably the best finished and expertly annotated genome assembly, Drosophila melanogaster is a formidable genetics model to study all aspects of biology. Nearly a decade ago, the 12 Drosophila genomes project expanded D. melanogaster’s breadth as a comparative model through the community-development of an unprecedented genus- and genome-wide comparative resource. However, since its inception, these datasets for evolutionary inference and biological discovery have become increasingly outdated, outmoded, and inaccessible. Here, we provide an updated and upgradable comparative genomics resource of Drosophila divergence and selection, flyDIVaS, based on the latest genomic assemblies, curated FlyBase annotations, and recent OrthoDB orthology calls. flyDIVaS is an online database containing D. melanogaster-centric orthologous gene sets, CDS and protein alignments, divergence statistics (% gaps, dN, dS, dN/dS), and codon-based tests of positive Darwinian selection. Out of 13,920 protein-coding D. melanogaster genes, ∼80% have one aligned ortholog in the closely related species, D. simulans, and ∼50% have 1–1 12-way alignments in the original 12 sequenced species that span over 80 million yr of divergence. Genes and their orthologs can be chosen from four different taxonomic datasets differing in phylogenetic depth and coverage density, and visualized via interactive alignments and phylogenetic trees. Users can also batch download entire comparative datasets. A functional survey finds conserved mitotic and neural genes, highly diverged immune and reproduction-related genes, more conspicuous signals of divergence across tissue-specific genes, and an enrichment of positive selection among highly diverged genes. flyDIVaS will be regularly updated and can be freely accessed at www.flydivas.info. We encourage researchers to regularly use this resource as a tool for biological inference and discovery, and in their classrooms to help train the next generation of biologists to creatively use such genomic big data resources in an integrative manner. PMID:27226167
flyDIVaS: A Comparative Genomics Resource for Drosophila Divergence and Selection.
Stanley, Craig E; Kulathinal, Rob J
2016-08-09
With arguably the best finished and expertly annotated genome assembly, Drosophila melanogaster is a formidable genetics model to study all aspects of biology. Nearly a decade ago, the 12 Drosophila genomes project expanded D. melanogaster's breadth as a comparative model through the community-development of an unprecedented genus- and genome-wide comparative resource. However, since its inception, these datasets for evolutionary inference and biological discovery have become increasingly outdated, outmoded, and inaccessible. Here, we provide an updated and upgradable comparative genomics resource of Drosophila divergence and selection, flyDIVaS, based on the latest genomic assemblies, curated FlyBase annotations, and recent OrthoDB orthology calls. flyDIVaS is an online database containing D. melanogaster-centric orthologous gene sets, CDS and protein alignments, divergence statistics (% gaps, dN, dS, dN/dS), and codon-based tests of positive Darwinian selection. Out of 13,920 protein-coding D. melanogaster genes, ∼80% have one aligned ortholog in the closely related species, D. simulans, and ∼50% have 1-1 12-way alignments in the original 12 sequenced species that span over 80 million yr of divergence. Genes and their orthologs can be chosen from four different taxonomic datasets differing in phylogenetic depth and coverage density, and visualized via interactive alignments and phylogenetic trees. Users can also batch download entire comparative datasets. A functional survey finds conserved mitotic and neural genes, highly diverged immune and reproduction-related genes, more conspicuous signals of divergence across tissue-specific genes, and an enrichment of positive selection among highly diverged genes. flyDIVaS will be regularly updated and can be freely accessed at www.flydivas.info. We encourage researchers to regularly use this resource as a tool for biological inference and discovery, and in their classrooms to help train the next generation of biologists to creatively use such genomic big data resources in an integrative manner. Copyright © 2016 Stanley and Kulathinal.
Yang, Ian A; Brown, Juliet L; George, Johnson; Jenkins, Sue; McDonald, Christine F; McDonald, Vanessa M; Phillips, Kirsten; Smith, Brian J; Zwar, Nicholas A; Dabscheck, Eli
2017-11-20
Chronic obstructive pulmonary disease (COPD) is characterised by persistent respiratory symptoms and chronic airflow limitation, and is associated with exacerbations and comorbidities. Advances in the management of COPD are updated quarterly in the national COPD guidelines, the COPD-X plan, published by Lung Foundation Australia in conjunction with the Thoracic Society of Australia and New Zealand and available at http://copdx.org.au. Main recommendations: Spirometry detects persistent airflow limitation (post-bronchodilator FEV1/FVC < 0.7) and must be used to confirm the diagnosis. Non-pharmacological and pharmacological therapies should be considered as they optimise function (ie, improve symptoms and quality of life) and prevent deterioration (ie, prevent exacerbations and reduce decline). Pulmonary rehabilitation and regular exercise are highly beneficial and should be provided to all symptomatic COPD patients. Short- and long-acting inhaled bronchodilators and, in more severe disease, anti-inflammatory agents (inhaled corticosteroids) should be considered in a stepwise approach. Given the wide range of inhaler devices available, inhaler technique and adherence should be checked regularly. Smoking cessation is essential, and influenza and pneumococcal vaccinations reduce the risk of exacerbations. A plan of care should be developed with the multidisciplinary team. COPD action plans reduce hospitalisations and are recommended as part of COPD self-management. Exacerbations should be managed promptly with bronchodilators, corticosteroids and antibiotics as appropriate to prevent hospital admission and delay COPD progression. Comorbidities of COPD require identification and appropriate management. Supportive, palliative and end-of-life care are beneficial for patients with advanced disease. Education of patients, carers and clinicians, and a strong partnership between primary and tertiary care, facilitate evidence-based management of COPD. Changes in management as a result of the guideline: Spirometry remains the gold standard for diagnosing airflow obstruction and COPD. Non-pharmacological and pharmacological treatment should be used in a stepwise fashion to control symptoms and reduce exacerbation risk.
Belcher, Wayne R.; Sweetkind, Donald S.; Faunt, Claudia C.; Pavelko, Michael T.; Hill, Mary C.
2017-01-19
Since the original publication of the Death Valley regional groundwater flow system (DVRFS) numerical model in 2004, more information on the regional groundwater flow system in the form of new data and interpretations has been compiled. Cooperators such as the Bureau of Land Management, National Park Service, U.S. Fish and Wildlife Service, the Department of Energy, and Nye County, Nevada, recognized a need to update the existing regional numerical model to maintain its viability as a groundwater management tool for regional stakeholders. The existing DVRFS numerical flow model was converted to MODFLOW-2005, updated with the latest available data, and recalibrated. Five main data sets were revised: (1) recharge from precipitation varying in time and space, (2) pumping data, (3) water-level observations, (4) an updated regional potentiometric map, and (5) a revision to the digital hydrogeologic framework model. The resulting DVRFS version 2.0 (v. 2.0) numerical flow model simulates groundwater flow conditions for the Death Valley region from 1913 to 2003 to correspond to the time frame for the most recently published (2008) water-use data. The DVRFS v. 2.0 model was calibrated by using the Tikhonov regularization functionality in the parameter estimation and predictive uncertainty software PEST. In order to assess the accuracy of the numerical flow model in simulating regional flow, the fit of simulated to target values (consisting of hydraulic heads and flows, including evapotranspiration and spring discharge, flow across the model boundary, and interbasin flow; the regional water budget; values of parameter estimates; and sensitivities) was evaluated. This evaluation showed that DVRFS v. 2.0 simulates conditions similar to DVRFS v. 1.0. Comparisons of the target values with simulated values also indicate that they match reasonably well, and in some cases (boundary flows and discharge) significantly better than in DVRFS v. 1.0.
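The role of Tikhonov regularization in such a calibration can be illustrated in a generic, linearized form: the update is damped towards preferred parameter values so that poorly constrained parameters stay close to prior estimates. The sensitivity matrix, observations and preferred values below are toy stand-ins, not DVRFS quantities, and PEST's actual implementation is considerably more elaborate.

import numpy as np

def tikhonov_step(J, resid, p_current, p_preferred, lam=1.0):
    # One Tikhonov-regularized Gauss-Newton update:
    # minimize ||J*dp - resid||^2 + lam*||(p_current + dp) - p_preferred||^2 over dp.
    n = J.shape[1]
    A = J.T @ J + lam * np.eye(n)
    b = J.T @ resid + lam * (p_preferred - p_current)
    return p_current + np.linalg.solve(A, b)

rng = np.random.default_rng(8)
J = rng.standard_normal((50, 6))                      # toy sensitivities of heads/flows to parameters
p_true = np.array([1.0, 0.5, -0.3, 2.0, 0.0, 1.5])
obs = J @ p_true + 0.05 * rng.standard_normal(50)     # toy observations
p0 = np.zeros(6)
p1 = tikhonov_step(J, obs - J @ p0, p0, p_preferred=np.ones(6), lam=0.5)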
NASA Technical Reports Server (NTRS)
Stupl, Jan; Faber, Nicolas; Foster, Cyrus; Yang, Fan Yang; Nelson, Bron; Aziz, Jonathan; Nuttall, Andrew; Henze, Chris; Levit, Creon
2014-01-01
This paper provides an updated efficiency analysis of the LightForce space debris collision avoidance scheme. LightForce aims to prevent collisions on warning by utilizing photon pressure from ground based, commercial off the shelf lasers. Past research has shown that a few ground-based systems consisting of 10 kilowatt class lasers directed by 1.5 meter telescopes with adaptive optics could lower the expected number of collisions in Low Earth Orbit (LEO) by an order of magnitude. Our simulation approach utilizes the entire Two Line Element (TLE) catalogue in LEO for a given day as initial input. Least-squares fitting of a TLE time series is used for an improved orbit estimate. We then calculate the probability of collision for all LEO objects in the catalogue for a time step of the simulation. The conjunctions that exceed a threshold probability of collision are then engaged by a simulated network of laser ground stations. After those engagements, the perturbed orbits are used to re-assess the probability of collision and evaluate the efficiency of the system. This paper describes new simulations with three updated aspects: 1) By utilizing a highly parallel simulation approach employing hundreds of processors, we have extended our analysis to a much broader dataset. The simulation time is extended to one year. 2) We analyze not only the efficiency of LightForce on conjunctions that naturally occur, but also take into account conjunctions caused by orbit perturbations due to LightForce engagements. 3) We use a new simulation approach that regularly updates the LightForce engagement strategy, as it would be during actual operations. In this paper we present our simulation approach to parallelize the efficiency analysis, its computational performance and the resulting expected efficiency of the LightForce collision avoidance system. Results indicate that utilizing a network of four LightForce stations with 20 kilowatt lasers, 85% of all conjunctions with a probability of collision Pc > 10^-6 can be mitigated.
NASA Astrophysics Data System (ADS)
Konacki, M.; Lejba, P.; Sybilski, P.; Pawłaszek, R.; Kozłowski, S.; Suchodolski, T.; Słonina, M.; Litwicki, M.; Sybilska, A.; Rogowska, B.; Kolb, U.; Burwitz, V.; Baader, J.; Groot, P.; Bloemen, S.; Ratajczak, M.; Hełminiak, K.; Borek, R.; Chodosiewicz, P.; Chimicz, A.
We present an update on the preparation of our assets, which consist of a robotic network of eight optical telescopes and a laser ranging station, for regular services in the SST domain. We report the development of new optical assets that include a double telescope system, Panoptes-1AB, and a new astrograph on our Solaris-3 telescope at the Siding Spring Observatory, Australia. Progress in the software development necessary for smooth SST operation includes a web-based portal and XML Azure Queue scheduling for the network, giving easy access to our sensors. Astrometry24.net, our new prototype cloud service for fast astrometry, streak detection and measurement, is also described, together with its precision and performance results. In the laser domain, for more than a year, the Space Research Centre Borowiec laser station has regularly tracked cooperative and uncooperative space debris targets. The efforts of the station's staff have been focused on the tracking of typical rocket bodies in the LEO regime. Additionally, a second independent laser system fully dedicated to SST activities is under development. It will allow for an increased pace of operation of our consortium in the global SST laser domain.
NASA Astrophysics Data System (ADS)
Ji, Yuanbo; van der Geest, Rob J.; Nazarian, Saman; Lelieveldt, Boudewijn P. F.; Tao, Qian
2018-03-01
Anatomical objects in medical images very often have dual contours or surfaces that are highly correlated. Manually segmenting both of them by following local image details is tedious and subjective. In this study, we propose a two-layer region-based level set method with a soft distance constraint, which not only regularizes the level set evolution at two levels, but also imposes prior information on wall thickness in an effective manner. By updating the level set function and the distance constraint functions alternately, the method simultaneously optimizes both contours while regularizing their distance. The method was applied to segment the inner and outer walls of both the left atrium (LA) and the left ventricle (LV) from MR images, using a rough initialization from inside the blood pool. Compared to manual annotation from experienced observers, the proposed method achieved an average perpendicular distance (APD) of less than 1 mm for the LA segmentation, and less than 1.5 mm for the LV segmentation, at both the inner and outer contours. The method can be used as a practical tool for fast and accurate dual-wall annotation given proper initialization.
Abdomen and spinal cord segmentation with augmented active shape models.
Xu, Zhoubing; Conrad, Benjamin N; Baucom, Rebeccah B; Smith, Seth A; Poulose, Benjamin K; Landman, Bennett A
2016-07-01
Active shape models (ASMs) have been widely used for extracting human anatomies in medical images, given their capability for shape regularization and topology preservation. However, sensitivity to model initialization and local correspondence search often undermines their performance, especially around highly variable contexts in computed tomography (CT) and magnetic resonance (MR) images. In this study, we propose an augmented ASM (AASM) that integrates the multi-atlas label fusion (MALF) and level set (LS) techniques into the traditional ASM framework. Using AASM, landmark updates are optimized globally via a region-based LS evolution applied to the probability map generated from MALF. This augmentation effectively extends the searching range of correspondent landmarks while reducing sensitivity to the image contexts, and improves the segmentation robustness. We propose the AASM framework as a two-dimensional segmentation technique targeting structures with one axis of regularity. We apply the AASM approach to abdominal CT and spinal cord (SC) MR segmentation challenges. On 20 CT scans, the AASM segmentation of the whole abdominal wall enables subcutaneous/visceral fat measurement, with high correlation to the measurement derived from manual segmentation. On 28 3T MR scans, AASM yields better performance than other state-of-the-art approaches in segmenting white/gray matter in the SC.
Severus, Emanuel; Bauer, Michael; Geddes, John
2018-06-13
For more than 40 years, lithium has been the gold standard in the long-term treatment of bipolar disorders. Over the last 15 years, other drugs have been approved for this indication and are widely used in clinical practice at the expense of lithium. New research from the last few years, however, indicates that lithium is still the first-line treatment in this indication. Against this background, and given lithium's proven acute antimanic efficacy, we should perhaps be using lithium more regularly (in combination with an atypical antipsychotic, if necessary) right from the start for the acute treatment of a manic episode and, once remission has been achieved and euthymia maintained during continuation treatment, regularly taper off the atypical antipsychotic, if possible, and continue with lithium as monotherapy for prophylactic treatment. This might lead to lithium being used more consistently with the scientific evidence in the long-term treatment of bipolar disorders. It remains difficult, however, to predict who will respond to and tolerate lithium prophylactically, and more research is needed to deliver the best possible individualized care to our patients. © Georg Thieme Verlag KG Stuttgart · New York.
Extortion provides alternative routes to the evolution of cooperation in structured populations
NASA Astrophysics Data System (ADS)
Xu, Xiongrui; Rong, Zhihai; Wu, Zhi-Xi; Zhou, Tao; Tse, Chi Kong
2017-05-01
In this paper, we study the evolution of cooperation in structured populations (individuals are located on either a regular lattice or a scale-free network) in the context of repeated games by involving three types of strategies, namely, unconditional cooperation, unconditional defection, and extortion. The strategy updating of the players is ruled by the replicator-like dynamics. We find that extortion strategies can act as catalysts to promote the emergence of cooperation in structured populations via different mechanisms. Specifically, on regular lattice, extortioners behave as both a shield, which can enwrap cooperators inside and keep them away from defectors, and a spear, which can defeat those surrounding defectors with the help of the neighboring cooperators. Particularly, the enhancement of cooperation displays a resonance-like behavior, suggesting the existence of optimal extortion strength mostly favoring the evolution of cooperation, which is in good agreement with the predictions from the generalized mean-field approximation theory. On scale-free network, the hubs, who are likely occupied by extortioners or defectors at the very beginning, are then prone to be conquered by cooperators on small-degree nodes as time elapses, thus establishing a bottom-up mechanism for the emergence and maintenance of cooperation.
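A small Monte Carlo sketch of strategy updating on a regular lattice in the spirit described above is given below; it uses a Fermi imitation rule as a stand-in for the replicator-like dynamics, and the 3x3 payoff matrix (including a crudely simplified extortion payoff), lattice size and noise parameter are illustrative assumptions rather than the paper's actual game.

import numpy as np

def lattice_game(L=20, n_sweeps=200, K=0.1, seed=0):
    # Strategies: 0 = cooperate, 1 = defect, 2 = extort.
    # M[s, t] is the (assumed, placeholder) payoff of strategy s against strategy t.
    M = np.array([[1.0, 0.0, 0.3],
                  [1.4, 0.2, 0.2],
                  [1.2, 0.2, 0.0]])
    rng = np.random.default_rng(seed)
    S = rng.integers(0, 3, size=(L, L))
    nbrs = [(1, 0), (-1, 0), (0, 1), (0, -1)]      # von Neumann neighbourhood, periodic boundaries

    def payoff(i, j):
        return sum(M[S[i, j], S[(i + di) % L, (j + dj) % L]] for di, dj in nbrs)

    for _ in range(n_sweeps * L * L):
        i, j = rng.integers(0, L, size=2)
        di, dj = nbrs[rng.integers(0, 4)]
        ni, nj = (i + di) % L, (j + dj) % L
        # Fermi rule: imitate the neighbour with a probability that grows with
        # the payoff difference; K controls the noise of the selection.
        prob = 1.0 / (1.0 + np.exp((payoff(i, j) - payoff(ni, nj)) / K))
        if rng.random() < prob:
            S[i, j] = S[ni, nj]
    return S

S_final = lattice_game()
fractions = np.bincount(S_final.ravel(), minlength=3) / S_final.size   # final strategy abundances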
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dykstra, Dave; Garzoglio, Gabriele; Kim, Hyunwoo
As of 2012, a number of US Department of Energy (DOE) National Laboratories have access to a 100 Gb/s wide-area network backbone. The ESnet Advanced Networking Initiative (ANI) project is intended to develop a prototype network based on emerging 100 Gb/s Ethernet technology. The ANI network will support DOE's science research programs. A 100 Gb/s network test bed is a key component of the ANI project. The test bed offers the opportunity for early evaluation of 100 Gb/s network infrastructure for supporting the high-impact data movement typical of science collaborations and experiments. In order to make effective use of this advanced infrastructure, the applications and middleware currently used by the distributed computing systems of large-scale science need to be adapted and tested within the new environment, with gaps in functionality identified and corrected. As a user of the ANI test bed, Fermilab aims to study the issues related to end-to-end integration and use of 100 Gb/s networks for the event simulation and analysis applications of physics experiments. In this paper we discuss our findings from evaluating existing HEP middleware and application components, including GridFTP, Globus Online, etc., in the high-speed environment. These include possible recommendations to system administrators and to application and middleware developers on changes that would enable production use of 100 Gb/s networks, including data storage, caching and wide-area access.
Quality-improvement analytics for intravenous infusion pumps.
Skledar, Susan J; Niccolai, Cynthia S; Schilling, Dennis; Costello, Susan; Mininni, Nicolette; Ervin, Kelly; Urban, Alana
2013-04-15
The implementation of a smart-pump continuous quality-improvement (CQI) program across a large health system is described, with an emphasis on key metrics for outcomes analyses and program refinement. Three years ago, the University of Pittsburgh Medical Center health system launched a CQI initiative to help ensure the safe use of 6000 smart pumps in its 14 inpatient facilities. A centralized team led by pharmacists is responsible for the retrieval and interpretation of smart-pump data, which is continuously transmitted to a main server. CQI findings are regularly posted on the health system's interdisciplinary intranet. Monitored metrics include rates of compliance with preprogrammed infusion limits, the top 20 drugs involved in alerts, drugs associated with alert-override rates of ≥90%, numbers of alerts by infusion type, nurse responses to alerts, and alert rate per drug library update. Based on the collected CQI data and site-specific requests, four systemwide updates of the smart-pump drug library were performed during the first 18 months of the program, reducing "nuisance alerts" by about 10% per update cycle and enabling targeted interventions to reduce rapid-infusion errors, other adverse drug events (ADEs), and pump-programming workarounds. Over one 12-month period, bedside alerts prompted nurses to reprogram or cancel continuous infusions an average of 400 times per month, potentially averting i.v. medication ADEs. A smart-pump CQI program is an effective tool for enhancing the safety of i.v. medication administration. The ongoing refinement of the drug library through the development and implementation of key interventions promotes the growth and sustainability of the smart-pump initiative systemwide.
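As an illustration of the kind of metric the CQI team tracks, the short pandas sketch below computes two of them from a hypothetical export of pump alert events: the top drugs by alert count and the drugs whose alert-override rate is at or above 90%. The column names and sample rows are invented, not the UPMC data schema.

    import pandas as pd

    # Hypothetical smart-pump alert export; columns and rows are illustrative only.
    events = pd.DataFrame({
        "drug":     ["heparin", "heparin", "insulin", "propofol", "insulin", "heparin"],
        "alert":    [True, True, True, False, True, True],
        "override": [True, False, True, False, True, True],
    })

    alerts = events[events["alert"]]

    # Top drugs by number of alerts (the program reports the top 20).
    top_drugs = alerts["drug"].value_counts().head(20)

    # Drugs with an alert-override rate >= 90%, flagged for drug-library review.
    override_rate = alerts.groupby("drug")["override"].mean()
    review_candidates = override_rate[override_rate >= 0.90]

    print(top_drugs)
    print(review_candidates)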
A fast iterative method for solving the Eikonal equation on triangulated surfaces
Fu, Zhisong; Jeong, Won-Ki; Pan, Yongsheng; Kirby, Robert M.; Whitaker, Ross T.
2012-01-01
This paper presents an efficient, fine-grained parallel algorithm for solving the Eikonal equation on triangular meshes. The Eikonal equation, and the broader class of Hamilton–Jacobi equations to which it belongs, have a wide range of applications from geometric optics and seismology to biological modeling and analysis of geometry and images. The ability to solve such equations accurately and efficiently provides new capabilities for exploring and visualizing parameter spaces and for solving inverse problems that rely on such equations in the forward model. Efficient solvers on state-of-the-art, parallel architectures require new algorithms that are not, in many cases, optimal, but are better suited to synchronous updates of the solution. In previous work [W. K. Jeong and R. T. Whitaker, SIAM J. Sci. Comput., 30 (2008), pp. 2512–2534], the authors proposed the fast iterative method (FIM) to efficiently solve the Eikonal equation on regular grids. In this paper we extend the fast iterative method to solve Eikonal equations efficiently on triangulated domains on the CPU and on parallel architectures, including graphics processors. We propose a new local update scheme that provides solutions of first-order accuracy for both architectures. We also propose a novel triangle-based update scheme and its corresponding data structure for efficient irregular data mapping to parallel single-instruction multiple-data (SIMD) processors. We provide detailed descriptions of the implementations on a single CPU, a multicore CPU with shared memory, and SIMD architectures with comparative results against state-of-the-art Eikonal solvers. PMID:22641200
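To illustrate the active-list idea behind the fast iterative method, here is a small Python sketch of the earlier regular-grid variant (a uniform 2-D grid, unit speed, Godunov upwind local solver) rather than the triangulated, SIMD-mapped version the paper develops; grid size, source location, and tolerance are arbitrary.

    import numpy as np
    from collections import deque

    N, h, EPS = 101, 1.0, 1e-9
    INF = np.inf
    u = np.full((N, N), INF)
    src = (N // 2, N // 2)
    u[src] = 0.0                              # boundary condition: a single point source

    def neighbours(i, j):
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            if 0 <= i + di < N and 0 <= j + dj < N:
                yield i + di, j + dj

    def local_solve(i, j):
        """Godunov upwind update for |grad u| = 1 at node (i, j)."""
        a = min(u[i - 1, j] if i > 0 else INF, u[i + 1, j] if i < N - 1 else INF)
        b = min(u[i, j - 1] if j > 0 else INF, u[i, j + 1] if j < N - 1 else INF)
        if abs(a - b) >= h:
            return min(a, b) + h
        return 0.5 * (a + b + np.sqrt(2.0 * h * h - (a - b) ** 2))

    active = deque(neighbours(*src))          # start from the source's neighbours
    in_active = set(active)
    while active:
        i, j = active.popleft()
        in_active.discard((i, j))
        new = local_solve(i, j)
        if abs(u[i, j] - new) > EPS:          # value still changing: keep the node active
            u[i, j] = new
            active.append((i, j)); in_active.add((i, j))
        else:                                 # converged: wake neighbours that can improve
            u[i, j] = new
            for p in neighbours(i, j):
                if p not in in_active and local_solve(*p) < u[p] - EPS:
                    active.append(p); in_active.add(p)

    print(u[src[0], src[1] + 10])             # ~10.0: exact along a grid axis for unit speed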
Catalogue of extreme wave events in Ireland: revised and updated for 14 680 BP to 2017
NASA Astrophysics Data System (ADS)
O'Brien, Laura; Renzi, Emiliano; Dudley, John M.; Clancy, Colm; Dias, Frédéric
2018-03-01
This paper aims to extend and update the survey of extreme wave events in Ireland that was previously carried out by O'Brien et al. (2013). The original catalogue highlighted the frequency of such events dating back as far as the turn of the last ice age and as recently as 2012. Ireland's marine territory extends far beyond its coastline and is one of the largest seabed territories in Europe. It is therefore not surprising that extreme waves have continued to occur regularly since 2012, particularly considering the severity of weather during the winters of 2013-2014 and 2015-2016. In addition, a large number of storm surges have been identified since the publication of the original catalogue. This paper updates the O'Brien et al. (2013) catalogue to include events up to the end of 2017. Storm surges are included as a new category and events are categorised into long waves (tsunamis and storm surges) and short waves (storm and rogue waves). New results prior to 2012 are also included and some of the events previously documented are reclassified. Important questions regarding public safety, services and the influence of climate change are also highlighted. An interactive map has been created to allow the reader to navigate through events: https://drive.google.com/open?id=19cZ59pDHfDnXKYIziYAVWV6AfoE&usp=sharing.
Overhauling, updating and augmenting NASA spacelink electronic information system
NASA Technical Reports Server (NTRS)
Blake, Jean A.
1991-01-01
NASA/Spacelink is a collection of NASA information and educational materials stored on a computer at the MSFC. It is provided by the NASA Educational Affairs Division and is operated by the Education Branch of the Marshall Center Public Affairs Office. It is designed to communicate with a wide variety of computers and modems, especially those most commonly found in classrooms and homes. It was made available to the public in February 1988. The system may be accessed by educators and the public over regular telephone lines. NASA/Spacelink is free except for the cost of long distance calls. The overhaul and update were undertaken to refurbish NASA/Spacelink, a highly valuable resource medium. Several new classroom activities and miscellaneous topics were edited and entered into Spacelink. One of the areas that received a major overhaul (under the guidance of Amos Crisp) was the SPINOFF BENEFITS section, covering the benefits resulting from America's space exploration. The Spinoff Benefits include information on a variety of topics including agriculture, communication, the computer, consumer, energy, equipment and materials, food, health, home, industry, medicine, natural resources, public services, recreation, safety, sports, and transportation. In addition to the Space Program Spinoff Benefits, the following is a partial list of some of the material updated and introduced: Astronaut Biographies, Miscellaneous Aeronautics Classroom Activities, Miscellaneous Astronomy Classroom Activities, Miscellaneous Rocketry Classroom Activities, Miscellaneous Classroom Activities, NASA and Its Center, NASA Areas of Research, NASA Patents, Licensing, NASA Technology Transfer, Pictures from Space Classroom Activities, Status of Current NASA Projects, Using Art to Teach Science, and Word Puzzles for Use in the Classroom.
Evaluation of Roadmap to Achieve Energy Delivery Systems Cybersecurity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chavez, Adrian R.
The Department of Energy/Office of Electricity Delivery and Energy Reliability (DOE/OE) Cybersecurity for Energy Delivery Systems (CEDS) program is currently evaluating the Roadmap to Achieve Energy Delivery Systems Cybersecurity document that sets a vision and outlines a set of milestones. The milestones are divided into five strategic focus areas that include: 1. Build a Culture of Security; 2. Assess and Monitor Risk; 3. Develop and Implement New Protective Measures to Reduce Risk; 4. Manage Incidents; and 5. Sustain Security Improvements. The most current version of the roadmap was last updated in September of 2016. Sandia National Laboratories (SNL) has been tasked with revisiting the roadmap to update the current state of energy delivery systems cybersecurity protections. SNL is currently working with previous and current partners to provide feedback on which of the roadmap milestones have been met and to identify any preexisting or new gaps that are not addressed by the roadmap. The specific focus areas SNL was asked to evaluate are: 1. Develop and Implement New Protective Measures to Reduce Risk and 2. Sustain Security Improvements. SNL has formed an Industry Advisory Board (IAB) to assist in answering these questions. The IAB consists of previous partners on past CEDS funded efforts as well as new collaborators that have unique insights into the current state of cybersecurity within energy delivery systems. The IAB includes asset owners, utilities and vendors of control systems. SNL will continue to maintain regular communications with the IAB to provide various perspectives on potential future updates to further improve the breadth of cybersecurity coverage of the roadmap.
TRMM Version 7 Near-Realtime Data Products
NASA Technical Reports Server (NTRS)
Tocker, Erich Franz; Kelley, Owen
2012-01-01
The TRMM data system has been providing near-realtime data products to the community since late 1999. While the TRMM project never had near-realtime production requirements, the science and applications communities had a great interest in receiving TRMM data as quickly as possible. As a result, these NRT data are provided under a best-effort scenario but with the objective of having the swath data products available within three hours of data collection 90% of the time. In July of 2011 the Joint Precipitation Measurement Missions Science Team (JPST) authorized the reprocessing of TRMM mission data using the new version 7 algorithms. The reprocessing of the 14+ years of the mission was concluded within 30 days. Version 7 algorithms had substantial changes in the data product file formats both for data and metadata. In addition, the algorithms themselves had major modifications and improvements. The general approach to updating the NRT to a new version is to wait for the regular production algorithms to run for a while and shake out any issues that might arise from the new version before updating the NRT products. Because of the substantial changes in data/metadata formats as well as the algorithm improvements themselves, the update of NRT to V7 followed an even more conservative path than usual. This was done to ensure that applications agencies and other users of the TRMM NRT would not be faced with short timeframes for conversion to the new format. This paper will describe the process by which the TRMM NRT was updated to V7 and the V7 data products themselves.
Maternity leave in normal pregnancy.
Leduc, Dean
2011-08-01
To assist maternity care providers in recognizing and discussing health- and illness-related issues in pregnancy and their relationship to maternity benefits. Published literature was retrieved through searches of PubMed or Medline, CINAHL, and The Cochrane Library in 2009 using appropriate controlled vocabulary (e.g., maternity benefits) and key words (e.g., maternity, benefits, pregnancy). Results were restricted to systematic reviews, randomized controlled trials/controlled clinical trials, and observational studies. There were no date or language restrictions. Searches were updated on a regular basis and incorporated in the guideline to December 2009. Grey (unpublished) literature was identified through searching the web sites of health technology assessment and health technology assessment-related agencies, clinical practice guideline collections, clinical trial registries, and national and international medical specialty societies.
SUstaiNability: a science communication website on environmental research
NASA Astrophysics Data System (ADS)
Gravina, Teresita; Muselli, Maurizio; Ligrone, Roberto; Rutigliano, Flora Angela
2017-08-01
Social networks enable anyone to publish potentially boundless amounts of information. However, such information is also highly prone to creating and/or diffusing mistakes and misunderstandings in scientific issues. In 2013 we produced a website (www.sunability.unina2.it) reporting on some research outputs from the University of Campania Luigi Vanvitelli (formerly the Second University of Naples, SUN), and shared it on Facebook and Twitter to analyse the effectiveness of these platforms in scientific dissemination. The study results suggest that (i) a regular update of the website stimulates the user's interest, (ii) Campania's citizens are more concerned with pollution problems than natural hazards, and (iii) direct involvement of researchers effectively enhances web-mediated scientific dissemination.
Data analysis and theoretical studies for atmospheric Explorer C, D and E
NASA Technical Reports Server (NTRS)
Dalgarno, A.
1983-01-01
The research concentrated on construction of a comprehensive model of the chemistry of the ionosphere. It proceeded by comparing detailed predictions of the atmospheric parameters observed by the instrumentation on board the Atmospheric Explorer Satellites with the measured values and modifying the chemistry to bring about consistency. Full account was taken of laboratory measurements of the processes identified as important. The research programs were made available to the AE team members. Regularly updated tables of recommended values of photoionization cross sections and electron impact excitation and ionization cross sections were provided. The research did indeed lead to a chemistry model in which the main pathways are quantitatively secure. The accuracy was sufficient that remaining differences are small.
Modeling the Radiation Belts During a Geomagnetic Storm
NASA Astrophysics Data System (ADS)
Glocer, A.; Fok, M.; Toth, G.
2009-05-01
We utilize the Radiation Belt Environment (RBE) model to simulate the radiation belt electrons during a geomagnetic storm. In particular, we focus on the relative contribution of whistler mode wave-particle interactions and radial diffusion associated with rapid changes in the magnetospheric magnetic field. In our study, the RBE model obtains a realistic magnetic field from the BATS-R-US magnetosphere model at a regular, but adjustable, cadence. We simulate the storm with and without wave-particle interactions, and with different frequencies for updating the magnetic field. The impacts of the wave-particle interactions, and the rapid variations in the magnetospheric magnetic field, can then be studied. Simulation results are also extracted along various satellite trajectories for direct comparison where appropriate.
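The coupling pattern described (a field refreshed at an adjustable cadence, with wave-particle effects switchable) can be sketched schematically as below. This is a toy Python loop with made-up timescales and a stand-in field function, not the RBE or BATS-R-US code; it only shows where the cadence and the wave-interaction switch enter the time stepping.

    import numpy as np

    DT, T_END = 60.0, 6 * 3600.0            # 1-minute step, 6-hour toy storm
    FIELD_CADENCE = 300.0                   # refresh the field every 5 minutes (adjustable)
    INCLUDE_WAVES = True                    # toggle whistler-mode wave-particle losses
    TAU_TRANSPORT, TAU_WAVE = 2 * 3600.0, 3 * 3600.0

    def external_field(t):
        """Stand-in for a field snapshot from a global magnetosphere model (toy storm dip)."""
        return 1.0 - 0.5 * np.exp(-((t - 2 * 3600.0) / 3600.0) ** 2)

    flux, field, t_last = 1.0, external_field(0.0), 0.0
    for step in range(int(T_END / DT)):
        t = step * DT
        if t - t_last >= FIELD_CADENCE:     # field coupling at the chosen cadence
            field, t_last = external_field(t), t
        flux += DT / TAU_TRANSPORT * (field - flux)   # toy radial-transport relaxation
        if INCLUDE_WAVES:
            flux -= DT / TAU_WAVE * flux              # toy wave-driven loss
    print(f"final normalized flux: {flux:.3f}")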
U-10Mo Baseline Fuel Fabrication Process Description
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hubbard, Lance R.; Arendt, Christina L.; Dye, Daniel F.
This document provides a description of the U.S. High Power Research Reactor (USHPRR) low-enriched uranium (LEU) fuel fabrication process. This document is intended to be used in conjunction with the baseline process flow diagram (PFD) presented in Appendix A. The baseline PFD is used to document the fabrication process, communicate gaps in technology or manufacturing capabilities, convey alternatives under consideration, and as the basis for a dynamic simulation model of the fabrication process. The simulation model allows for the assessment of production rates, costs, and manufacturing requirements (manpower, fabrication space, numbers and types of equipment, etc.) throughout the lifecycle of the USHPRR program. This document, along with the accompanying PFD, is updated regularly.
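As a trivial illustration of what a PFD-driven throughput assessment looks like, the snippet below finds the bottleneck step of a notional plate-fabrication line from per-step cycle times and station counts. The step names and numbers are placeholders, not USHPRR data, and a real dynamic simulation would also model queues, yields, and downtime.

    # Toy steady-state capacity check for a notional fuel-plate fabrication line.
    steps = [                       # (name, hours per plate, parallel stations)
        ("casting",    4.0, 2),
        ("rolling",    1.5, 1),
        ("cladding",   3.0, 2),
        ("inspection", 0.5, 1),
    ]

    rates = {name: stations / hours for name, hours, stations in steps}   # plates per hour
    bottleneck = min(rates, key=rates.get)

    print({name: round(rate, 2) for name, rate in rates.items()})
    print(f"line throughput limited by '{bottleneck}' at {rates[bottleneck]:.2f} plates/hour")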
Concepts for national assessment of water availability and use
2002-01-01
In response to a directive from Congress to the U.S. Geological Survey to 'prepare a report describing the scope and magnitude of the efforts needed to provide periodic assessments of the status and trends in the availability and use of freshwater resources,' of the United States, a program is proposed to develop and report on indicators of the status and trends in storage volume, flow rates, and uses of water nationwide. This program would be analogous to the task of other Federal statistical programs that produce and regularly update indicator variables that describe economic, demographic, and health conditions of the Nation. The assessment also would provide regional estimates of recharge, evapotranspiration, interbasin transfers, and other components of the water cycle.
Surface-water quality-assurance plan for the U.S. Geological Survey Washington Water Science Center
Mastin, Mark C.
2016-02-19
This Surface-Water Quality-Assurance Plan documents the standards, policies, and procedures used by the U.S. Geological Survey Washington Water Science Center (WAWSC) for activities related to the collection, processing, storage, analysis, and publication of surface-water data. This plan serves as a guide to all WAWSC personnel involved in surface-water data activities, and changes as the needs and requirements of the WAWSC change. Regular updates to this plan represent an integral part of the quality-assurance process. In the WAWSC, direct oversight and responsibility by the hydrographer(s) assigned to a surface-water station, combined with team approaches in all work efforts, assure high-quality data, analyses, reviews, and reports for cooperating agencies and the public.
VizieR Online Data Catalog: The Gemini Observation Log (CADC, 2001-)
NASA Astrophysics Data System (ADS)
Association of Universities For Research in Astronomy
2018-01-01
This database contains a log of the Gemini Telescope observations since 2001, managed by the Canadian Astronomical Data Center (CADC). The data are regularly updated (see the date of the last version at the end of this file). The Gemini Observatory consists of twin 8.1-meter diameter optical/infrared telescopes located on two of the best observing sites on the planet. From their locations on mountains in Hawai'i and Chile, Gemini Observatory's telescopes can collectively access the entire sky. Gemini is operated by a partnership of five countries including the United States, Canada, Brazil, Argentina and Chile. Any astronomer in these countries can apply for time on Gemini, which is allocated in proportion to each partner's financial stake. (1 data file).
Hamacher, Michael; Eisenacher, Martin; Tribl, Florian; Stephan, Christian; Marcus, Katrin; Hardt, Tanja; Wiltfang, Jens; Martens, Lennart; Desiderio, Dominic; Gutstein, Howard; Park, Young Mok; Meyer, Helmut E
2008-06-01
The Human Brain Proteome Project (HUPO BPP) aims at advancing knowledge and the understanding of neurodiseases and aging with the purpose of identifying prognostic and diagnostic biomarkers, as well as to push new diagnostic approaches and medications. The participating groups meet in semi-annual workshops to discuss the progress, as well as the needs, within the field of proteomics. The 9th HUPO BPP workshop took place in Barbados from 9-10 January 2008. Discussing the future HUPO BPP Roadmap, the attendees drafted the so-called HUPO BPP wish list containing timelines, suggestions and missions. This wish list will be updated regularly and will serve as a guideline for the next phase.
Sulfur dioxide emission rates from Kīlauea Volcano, Hawai‘i, 2007–2010
Elias, T.; Sutton, A.J.
2012-01-01
Kīlauea Volcano has one of the longest running volcanic sulfur dioxide (SO2) emission rate databases on record. Sulfur dioxide emission rates from Kīlauea Volcano were first measured by Stoiber and Malone (1975) and have been measured on a regular basis since 1979 (Elias and Sutton, 2007, and references within). Compilations of SO2 emission-rate and wind-vector data from 1979 through 2006 are available on the USGS Web site (Elias and others, 1998; Elias and Sutton, 2002; Elias and Sutton, 2007). This report updates the database, documents the changes in data collection and processing methods, and highlights how SO2 emissions have varied with eruptive activity at Kīlauea Volcano for the interval 2007–2010.
Design and implementation of a wearable healthcare monitoring system.
Sagahyroon, Assim; Raddy, Hazem; Ghazy, Ali; Suleman, Umair
2009-01-01
A wearable healthcare monitoring unit that integrates various technologies was developed to provide patients with the option of leading a healthy and independent life without risks or confinement to medical facilities. The unit consists of various sensors integrated to a microcontroller and attached to the patient's body, reading vital signs and transmitting these readings via a Bluetooth link to the patient's mobile phone. Short-Messaging-Service (SMS) is incorporated in the design to alert a physician in emergency cases. Additionally, an application program running on the mobile phone uses the internet to update (at regular intervals) the patient records in a hospital database with the most recent readings. To reduce development costs, the components used were both off-the-shelf and affordable.
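The flow described (sense, alert over SMS on out-of-range values, push records to the hospital database at regular intervals) can be sketched as a simple monitoring loop. The Python below is schematic: the sensor, SMS, and upload functions are placeholders for the Bluetooth link, the phone's messaging service, and the hospital web service, and the thresholds are invented.

    import time

    HR_LIMITS = (50, 120)              # illustrative alert thresholds (beats per minute)
    UPLOAD_EVERY = 15 * 60             # push buffered readings every 15 minutes

    def read_vitals():                 # placeholder for the sensor/Bluetooth link
        return {"heart_rate": 72, "spo2": 97, "timestamp": time.time()}

    def send_sms(message):             # placeholder for the phone's SMS gateway
        print("SMS to physician:", message)

    def upload(records):               # placeholder for the hospital-database web call
        print(f"uploaded {len(records)} readings")

    buffer, last_upload = [], time.time()
    for _ in range(3):                 # the real unit loops continuously
        vitals = read_vitals()
        buffer.append(vitals)
        if not HR_LIMITS[0] <= vitals["heart_rate"] <= HR_LIMITS[1]:
            send_sms(f"abnormal heart rate: {vitals['heart_rate']} bpm")
        if time.time() - last_upload >= UPLOAD_EVERY:
            upload(buffer)
            buffer, last_upload = [], time.time()
        time.sleep(1)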
Ongoing Analysis of Jupiter's Equatorial Hotspots and Plumes from Cassini
NASA Technical Reports Server (NTRS)
Choi, D. S.; Showmwn, A. P.; Vasavada, A. R.; Simon-Miller, A. A.
2012-01-01
We present updated results from our ongoing analysis of Cassini observations of Jupiter's equatorial meteorology. For two months preceding the spacecraft's closest approach of the planet, the ISS instrument onboard Cassini regularly imaged the atmosphere of Jupiter. We created time-lapse movies from this period that show the complex activity and interactions of the equatorial atmosphere. During this period, hot spots exhibited significant variations in size and shape over timescales of days and weeks. Some of these changes appear to be a result of interactions with passing vortex systems in adjacent latitudes. Strong anticyclonic gyres to the southeast of the dark areas converge with flow from the west and appear to circulate into a hot spot at its southwestern corner.
Collective Behaviors in Spatially Extended Systems with Local Interactions and Synchronous Updating
NASA Astrophysics Data System (ADS)
Chaté, H.; Manneville, P.
1992-01-01
Assessing the extent to which dynamical systems with many degrees of freedom can be described within a thermodynamics formalism is a problem that currently attracts much attention. In this context, synchronously updated regular lattices of identical, chaotic elements with local interactions are promising models for which statistical mechanics may be hoped to provide some insights. This article presents a large class of cellular automata rules and coupled map lattices of the above type in space dimensions d = 2 to 6. Such simple models can be approached by a mean-field approximation which usually reduces the dynamics to that of a map governing the evolution of some extensive density. While this approximation is exact in the d = ∞ limit, where macroscopic variables must display the time-dependent behavior of the mean-field map, basic intuition from equilibrium statistical mechanics rules out any such behavior in low-dimensional systems, since it would involve the collective motion of locally disordered elements. The models studied are chosen to be as close as possible to mean-field conditions, i.e., rather high space dimension, large connectivity, and equal-weight coupling between sites. While the mean-field evolution is never observed, a new type of non-trivial collective behavior is found, at odds with the predictions of equilibrium statistical mechanics. Both in the cellular automata models and in the coupled map lattices, macroscopic variables frequently display a non-transient, time-dependent, low-dimensional dynamics emerging out of local disorder. Striking examples are period-3 cycles in two-state cellular automata and a Hopf bifurcation for a d = 5 lattice of coupled logistic maps. An extensive account of the phenomenology is given, including a catalog of behaviors, classification tables for the cellular automata rules, and bifurcation diagrams for the coupled map lattices. The observed underlying dynamics is accompanied by an intrinsic quasi-Gaussian noise (stemming from the local disorder) which disappears in the infinite-size limit. The collective behaviors constitute a robust phenomenon, resisting external noise, small changes in the local dynamics, and modifications of the initial and boundary conditions. Synchronous updating, high space dimension and the regularity of connections are shown to be crucial ingredients in the subtle build-up of correlations giving rise to the collective motion. The discussion stresses the need for a theoretical understanding that neither equilibrium statistical mechanics nor higher-order mean-field approximations are able to provide.
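A minimal example of the class of models discussed is a synchronously updated, diffusively coupled lattice of logistic-type maps; the Python sketch below runs a d = 2 version (the article studies d = 2 to 6) and tracks the macroscopic mean of the lattice. Lattice size, coupling strength, and map parameter are illustrative choices.

    import numpy as np

    rng = np.random.default_rng(1)
    L, g, a, STEPS = 64, 0.2, 1.7, 500       # lattice size, coupling, map parameter, steps

    def f(x):                                # local chaotic map, x in [-1, 1]
        return 1.0 - a * x * x

    x = rng.uniform(-1.0, 1.0, size=(L, L))  # disordered initial condition on a d = 2 lattice
    means = []
    for t in range(STEPS):
        fx = f(x)
        nb = sum(np.roll(fx, s, axis=ax)     # equal-weight nearest-neighbour coupling
                 for s in (1, -1) for ax in (0, 1))
        x = (1 - g) * fx + (g / 4.0) * nb    # synchronous update of every site
        means.append(x.mean())               # macroscopic (mean-field-like) variable

    tail = np.array(means[-100:])
    print("mean of the macroscopic variable:", tail.mean(), "fluctuation:", tail.std())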
A new approach to blind deconvolution of astronomical images
NASA Astrophysics Data System (ADS)
Vorontsov, S. V.; Jefferies, S. M.
2017-05-01
We readdress the strategy of finding approximate regularized solutions to the blind deconvolution problem, when both the object and the point-spread function (PSF) have finite support. Our approach consists in addressing fixed points of an iteration in which both the object x and the PSF y are approximated in an alternating manner, discarding the previous approximation for x when updating x (similarly for y), and considering the resultant fixed points as candidates for a sensible solution. Alternating approximations are performed by truncated iterative least-squares descents. The numbers of descents in the object space and in the PSF space play the role of two regularization parameters. Selection of appropriate fixed points (which may not be unique) is performed by relaxing the regularization gradually, using the previous fixed point as an initial guess for finding the next one, which yields an approximation with better spatial resolution. We report the results of artificial experiments with noise-free data, targeted at examining the potential capability of the technique to deconvolve images of high complexity. We also show the results obtained with two sets of satellite images acquired using ground-based telescopes with and without adaptive optics compensation. The new approach yields much better results than an alternating minimization technique based on positivity-constrained conjugate gradients, where the iterations stagnate when addressing data of high complexity. In the alternating-approximation step, we examine the performance of three different non-blind iterative deconvolution algorithms. The best results are provided by the non-negativity-constrained successive over-relaxation technique (+SOR) supplemented with an adaptive scheduling of the relaxation parameter. Results of comparable quality are obtained with steepest descents modified by imposing the non-negativity constraint, at the expense of higher numerical costs. The Richardson-Lucy (or expectation-maximization) algorithm fails to locate stable fixed points in our experiments, apparently due to its inappropriate regularization properties.
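The alternating-approximation idea (a fixed, small number of projected least-squares descents in the object space, then in the PSF space, with the two descent counts acting as regularization knobs) can be sketched in one dimension as below. This sketch uses circular FFT convolution and plain non-negativity-projected steepest descent rather than the paper's +SOR solver, and the test signals, initial guesses, step sizes, and iteration counts are arbitrary; the recovered pair is, as usual, only defined up to shift and scale.

    import numpy as np

    n = 128
    x_true = np.zeros(n); x_true[[30, 50, 90]] = [1.0, 0.6, 0.8]     # sparse "object"
    y_true = np.zeros(n)
    y_true[:21] = np.exp(-0.5 * ((np.arange(21) - 10) / 3.0) ** 2)   # compact "PSF"
    y_true /= y_true.sum()

    conv = lambda a, b: np.fft.irfft(np.fft.rfft(a) * np.fft.rfft(b), n)
    corr = lambda a, b: np.fft.irfft(np.fft.rfft(a) * np.conj(np.fft.rfft(b)), n)
    d = conv(x_true, y_true)                                         # noise-free data

    def descend(u, v, data, n_steps, step):
        """A few non-negativity-projected steepest-descent steps on ||u*v - data||^2 in u."""
        for _ in range(n_steps):
            grad = corr(conv(u, v) - data, v)                        # adjoint of circular convolution
            u = np.maximum(u - step * grad, 0.0)
        return u

    x = np.maximum(d, 0.0)                                           # start the object at the data
    y = np.zeros(n); y[:31] = 1.0 / 31                               # crude initial PSF guess
    for outer in range(300):                                         # alternating approximations
        x = descend(x, y, d, n_steps=5, step=1.0)
        y = descend(y, x, d, n_steps=5, step=1.0 / max(x.sum() ** 2, 1e-12))
        y /= max(y.sum(), 1e-12)                                     # fix the scale ambiguity

    print("relative data-fit residual:", np.linalg.norm(conv(x, y) - d) / np.linalg.norm(d))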
Safety of regular formoterol or salmeterol in children with asthma: an overview of Cochrane reviews
Cates, Christopher J; Oleszczuk, Marta; Stovold, Elizabeth; Wieland, L. Susan
2014-01-01
Background Two large surveillance studies in adults with asthma have found an increased risk of asthma-related mortality in those who took regular salmeterol as monotherapy in comparison to placebo or regular salbutamol. No similar sized surveillance studies have been carried out in children with asthma, and we remain uncertain about the comparative safety of regular combination therapy with either formoterol or salmeterol in children with asthma. Objectives We have used the paediatric trial results from Cochrane systematic reviews to assess the safety of regular formoterol or salmeterol, either as monotherapy or as combination therapy, in children with asthma. Methods We included Cochrane reviews relating to the safety of regular formoterol and salmeterol from a search of the Cochrane Database of Systematic Reviews conducted in May 2012, and ran updated searches for each of the reviews. These were independently assessed. All the reviews were assessed for quality using the AMSTAR tool. We extracted the data relating to children from each review and from new trials found in the updated searches (including risks of bias, study characteristics, serious adverse event outcomes, and control arm event rates). The safety of regular formoterol and salmeterol were assessed directly from the paediatric trials in the Cochrane reviews of monotherapy and combination therapy with each product. Then monotherapy was indirectly compared to combination therapy by looking at the differences between the pooled trial results for monotherapy and the pooled results for combination therapy. The comparative safety of formoterol and salmeterol was assessed using direct evidence from trials that randomised children to each treatment; this was combined with the result of an indirect comparison of the combination therapy trials, which represents the difference between the pooled results of each product when randomised against inhaled corticosteroids alone. Main results We identified six high quality, up to date Cochrane reviews. Four of these related to the safety of regular formoterol or salmeterol (as monotherapy or combination therapy) and these included 19 studies in children. We added data from two recent studies on salmeterol combination therapy in 689 children which were published after the relevant Cochrane review had been completed, making a total of 21 trials on 7474 children (from four to 17 years of age). The two remaining reviews compared the safety of formoterol with salmeterol from trials randomising participants to one or other treatment, but the reviews only included a single trial in children in which there were 156 participants. Only one child died across all the trials, so impact on mortality could not be assessed. We found a statistically significant increase in the odds of suffering a non-fatal serious adverse event of any cause in children on formoterol monotherapy (Peto odds ratio (OR) 2.48; 95% confidence interval (CI) 1.27 to 4.83, I2 = 0%, 5 trials, N = 1335, high quality) and smaller increases in odds which were not statistically significant for salmeterol monotherapy (Peto OR 1.30; 95% CI 0.82 to 2.05, I2 = 17%, 5 trials, N = 1333, moderate quality), formoterol combination therapy (Peto OR 1.60; 95% CI 0.80 to 3.28, I2 = 32%, 7 trials, N = 2788, moderate quality) and salmeterol combination therapy (Peto OR 1.20; 95% CI 0.37 to 2.91, I2 = 0%, 5 trials, N = 1862, moderate quality). We compared the pooled results of the monotherapy and combination therapy trials. 
There was no significant difference between the pooled ORs of children with a serious adverse event (SAE) from long-acting beta2-agonist (LABA) monotherapy (Peto OR 1.60; 95% CI 1.10 to 2.33, 10 trials, N = 2668) and combination trials (Peto OR 1.50; 95% CI 0.82 to 2.75, 12 trials, N = 4650). However, there were fewer children with an SAE in the regular inhaled corticosteroid (ICS) control group (0.7%) than in the placebo control group (3.6%). As a result, there was an absolute increase of an additional 21 children (95% CI 4 to 45) suffering such an SAE of any cause for every 1000 children treated over six months with either regular formoterol or salmeterol monotherapy, whilst for combination therapy the increased risk was an additional three children (95% CI 1 fewer to 12 more) per 1000 over three months. We only found a single trial in 156 children comparing the safety of regular salmeterol to regular formoterol monotherapy, and even with the additional evidence from indirect comparisons between the combination formoterol and salmeterol trials, the CI around the effect on SAEs is too wide to tell whether there is a difference in the comparative safety of formoterol and salmeterol (OR 1.26; 95% CI 0.37 to 4.32). Authors’ conclusions We do not know if regular combination therapy with formoterol or salmeterol in children alters the risk of dying from asthma. Regular combination therapy is likely to be less risky than monotherapy in children with asthma, but we cannot say that combination therapy is risk free. There are probably an additional three children per 1000 who suffer a non-fatal serious adverse event on combination therapy in comparison to ICS over three months. This is currently our best estimate of the risk of using LABA combination therapy in children and has to be balanced against the symptomatic benefit obtained for each child. We await the results of large on-going surveillance studies to further clarify the risks of combination therapy in children and adolescents with asthma. The relative safety of formoterol in comparison to salmeterol remains unclear, even when all currently available direct and indirect trial evidence is combined. PMID:23076961
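The review's conversion from a pooled Peto odds ratio plus a control-arm event rate to an absolute risk difference per 1000 children can be reproduced with the standard odds-to-risk transformation; the small sketch below uses the figures quoted above (small rounding differences against the published 21 per 1000 are expected).

    def extra_events_per_1000(control_risk, odds_ratio):
        """Absolute risk difference per 1000 implied by an odds ratio and control-arm risk."""
        control_odds = control_risk / (1.0 - control_risk)
        treated_odds = control_odds * odds_ratio
        treated_risk = treated_odds / (1.0 + treated_odds)
        return 1000.0 * (treated_risk - control_risk)

    # Monotherapy: placebo control-arm SAE risk 3.6%, pooled Peto OR 1.60 -> ~20 per 1000
    print(round(extra_events_per_1000(0.036, 1.60), 1))
    # Combination therapy: ICS control-arm SAE risk 0.7%, pooled Peto OR 1.50 -> ~3 per 1000
    print(round(extra_events_per_1000(0.007, 1.50), 1))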
Systematic meta-review of supported self-management for asthma: a healthcare perspective.
Pinnock, Hilary; Parke, Hannah L; Panagioti, Maria; Daines, Luke; Pearce, Gemma; Epiphaniou, Eleni; Bower, Peter; Sheikh, Aziz; Griffiths, Chris J; Taylor, Stephanie J C
2017-03-17
Supported self-management has been recommended by asthma guidelines for three decades; improving current suboptimal implementation will require commitment from professionals, patients and healthcare organisations. The Practical Systematic Review of Self-Management Support (PRISMS) meta-review and Reducing Care Utilisation through Self-management Interventions (RECURSIVE) health economic review were commissioned to provide a systematic overview of supported self-management to inform implementation. We sought to investigate if supported asthma self-management reduces use of healthcare resources and improves asthma control; for which target groups it works; and which components and contextual factors contribute to effectiveness. Finally, we investigated the costs to healthcare services of providing supported self-management. We undertook a meta-review (systematic overview) of systematic reviews updated with randomised controlled trials (RCTs) published since the review search dates, and health economic meta-analysis of RCTs. Twelve electronic databases were searched in 2012 (updated in 2015; pre-publication update January 2017) for systematic reviews reporting RCTs (and update RCTs) evaluating supported asthma self-management. We assessed the quality of included studies and undertook a meta-analysis and narrative synthesis. A total of 27 systematic reviews (n = 244 RCTs) and 13 update RCTs revealed that supported self-management can reduce hospitalisations, accident and emergency attendances and unscheduled consultations, and improve markers of control and quality of life for people with asthma across a range of cultural, demographic and healthcare settings. Core components are patient education, provision of an action plan and regular professional review. Self-management is most effective when delivered in the context of proactive long-term condition management. The total cost (n = 24 RCTs) of providing self-management support is offset by a reduction in hospitalisations and accident and emergency visits (standard mean difference 0.13, 95% confidence interval -0.09 to 0.34). Evidence from a total of 270 RCTs confirms that supported self-management for asthma can reduce unscheduled care and improve asthma control, can be delivered effectively for diverse demographic and cultural groups, is applicable in a broad range of clinical settings, and does not significantly increase total healthcare costs. Informed by this comprehensive synthesis of the literature, clinicians, patient-interest groups, policy-makers and providers of healthcare services should prioritise provision of supported self-management for people with asthma as a core component of routine care. RECURSIVE: PROSPERO CRD42012002694 ; PRISMS: PROSPERO does not register meta-reviews.
Literature searches on Ayurveda: An update
Aggithaya, Madhur G.; Narahari, Saravu R.
2015-01-01
Introduction: The journals that publish on Ayurveda have increasingly been indexed by popular medical databases in recent years. However, many Eastern journals are not indexed in biomedical journal databases such as PubMed. Literature searches for Ayurveda continue to be challenging due to the nonavailability of active, unbiased dedicated databases for Ayurvedic literature. In 2010, the authors identified 46 databases that can be used for systematic search of Ayurvedic papers and theses. This update reviewed our previous recommendation and identified current and relevant databases. Aims: To provide an update on Ayurveda literature searching and a strategy to retrieve the maximum number of publications. Methods: The authors used psoriasis as an example to search the previously listed databases and to identify new ones. The population, intervention, control, and outcome table included keywords related to psoriasis and Ayurvedic terminologies for skin diseases. Current citation update status, search results, and search options of previous databases were assessed. Eight search strategies were developed. One hundred and five journals, both biomedical and Ayurveda, which publish on Ayurveda, were identified. Variability in databases was explored to identify bias in journal citation. Results: Five among 46 databases are now relevant – AYUSH research portal, Annotated Bibliography of Indian Medicine, Digital Helpline for Ayurveda Research Articles (DHARA), PubMed, and Directory of Open Access Journals. Search options in these databases are not uniform, and only PubMed allows a complex search strategy. “The Researches in Ayurveda” and “Ayurvedic Research Database” (ARD) are important grey resources for hand searching. About 44/105 (41.5%) journals publishing Ayurvedic studies are not indexed in any database. Only 11/105 (10.4%) exclusive Ayurveda journals are indexed in PubMed. Conclusion: AYUSH research portal and DHARA are two major portals after 2010. It is mandatory to search PubMed and four other databases because all five carry citations from different groups of journals. Hand searching is important to identify Ayurveda publications that are not indexed elsewhere. Availability information for citations in Ayurveda libraries from the National Union Catalogue of Scientific Serials in India, if regularly updated, will improve the efficacy of hand searching. A grey database (ARD) contains unpublished PG/Ph.D. theses. The AYUSH portal, DHARA (funded by Ministry of AYUSH), and ARD should be merged to form a single larger database to limit Ayurveda literature searches. PMID:27313409
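Because PubMed is the only one of the five databases that supports a complex search strategy, a scripted query is often the easiest way to keep such a search current. The sketch below uses Biopython's Entrez interface; the e-mail address and the psoriasis/Ayurveda search string are illustrative stand-ins, not one of the paper's eight strategies.

    from Bio import Entrez

    Entrez.email = "you@example.org"      # NCBI requires a contact address

    # Illustrative query only, not one of the eight strategies developed in the paper.
    query = ("(ayurveda[Title/Abstract] OR ayurvedic[Title/Abstract]) "
             "AND psoriasis[Title/Abstract]")

    handle = Entrez.esearch(db="pubmed", term=query, retmax=100)
    record = Entrez.read(handle)
    handle.close()

    print("hits:", record["Count"])
    print("first PMIDs:", record["IdList"][:10])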
Robinson, James; Waller, Matthew J.; Fail, Sylvie C.; McWilliam, Hamish; Lopez, Rodrigo; Parham, Peter; Marsh, Steven G. E.
2009-01-01
It is 10 years since the IMGT/HLA database was released, providing the HLA community with a searchable repository of highly curated HLA sequences. The HLA complex is located within the 6p21.3 region of human chromosome 6 and contains more than 220 genes of diverse function. Many of the genes encode proteins of the immune system and are highly polymorphic. The naming of these HLA genes and alleles, and their quality control is the responsibility of the WHO Nomenclature Committee for Factors of the HLA System. Through the work of the HLA Informatics Group and in collaboration with the European Bioinformatics Institute, we are able to provide public access to this data through the website http://www.ebi.ac.uk/imgt/hla/. The first release contained 964 sequences, the most recent release 3300 sequences, with around 450 new sequences being added each year. The tools provided on the website have been updated to allow more complex alignments, which include genomic sequence data, as well as the development of tools for probe and primer design and the inclusion of data from the HLA Dictionary. Regular updates to the website ensure that new and confirmatory sequences are dispersed to the HLA community, and the wider research and clinical communities. PMID:18838392
Jali, Pramod K.; Singh, Shamsher; Babaji, Prashant; Chaurasia, Vishwajit Rampratap; Somasundaram, P; Lau, Himani
2014-01-01
Background: The internet is a useful tool for updating knowledge. The aim of the present study was to assess the current level of knowledge of computers and the internet among undergraduate dental students. Materials and Methods: The study consisted of a self-administered, close-ended questionnaire survey. Questionnaires were distributed to undergraduate dental students. The study was conducted during July to September 2012. Results: In the selected sample, the response rate was 100%. Most (94.4%) of the students had computer knowledge and 77.4% had their own computer and access at home. About 40.8% of students used the computer for general purposes, 28.5% for entertainment, and 22.8% for research. Most of the students had internet knowledge (92.9%) and they used it independently (79.1%). About 42.1% used the internet occasionally, 34.4% regularly, 21.7% rarely, and 1.8% not at all. The internet was preferred for getting information (48.8%) due to easy accessibility and recent updates. For dental purposes, 45.3% of students used the internet 2-3 times/week. Most (95.3%) of the students favored having a computer-based learning program in the curriculum. Conclusion: Computer knowledge was observed to be good among dental students. PMID:24818091
Human Ageing Genomic Resources: new and updated databases
Tacutu, Robi; Thornton, Daniel; Johnson, Emily; Budovsky, Arie; Barardo, Diogo; Craig, Thomas; Diana, Eugene; Lehmann, Gilad; Toren, Dmitri; Wang, Jingwei; Fraifeld, Vadim E
2018-01-01
Abstract In spite of a growing body of research and data, human ageing remains a poorly understood process. Over 10 years ago we developed the Human Ageing Genomic Resources (HAGR), a collection of databases and tools for studying the biology and genetics of ageing. Here, we present HAGR’s main functionalities, highlighting new additions and improvements. HAGR consists of six core databases: (i) the GenAge database of ageing-related genes, in turn composed of a dataset of >300 human ageing-related genes and a dataset with >2000 genes associated with ageing or longevity in model organisms; (ii) the AnAge database of animal ageing and longevity, featuring >4000 species; (iii) the GenDR database with >200 genes associated with the life-extending effects of dietary restriction; (iv) the LongevityMap database of human genetic association studies of longevity with >500 entries; (v) the DrugAge database with >400 ageing or longevity-associated drugs or compounds; (vi) the CellAge database with >200 genes associated with cell senescence. All our databases are manually curated by experts and regularly updated to ensure high-quality data. Cross-links across our databases and to external resources help researchers locate and integrate relevant information. HAGR is freely available online (http://genomics.senescence.info/). PMID:29121237
Evaluation of a new eLearning platform for distance teaching of microsurgery.
Messaoudi, T; Bodin, F; Hidalgo Diaz, J J; Ichihara, S; Fikry, T; Lacreuse, I; Liverneaux, P; Facca, S
2015-06-01
Online learning (or eLearning) is in constant evolution in medicine. An analytical survey of the websites of eight academic societies and medical schools was carried out. These sites were evaluated against parameters that define the quality of an eLearning website, as well as the shareable content object reference model (SCORM) technical standards. All studied platforms were maintained by a webmaster and regularly updated. Only two platforms had teleconference opportunities, five had courses in PDF format, and four allowed online testing. Based on SCORM standards, only four platforms allowed direct access without a password. The content of all platforms was adaptable, interoperable and reusable, but their sustainability was difficult to assess. In parallel, we developed the first eLearning platform to be used as part of a university diploma in microsurgery in France. The platform was evaluated by students enrolled in this diploma program. A satisfaction survey and platform evaluation showed that students were generally satisfied and had used the platform for microsurgery education, especially the seven students living abroad. ELearning for microsurgery allows the content to be continuously updated, makes for fewer classroom visits, provides easy remote access, and especially better training time management and cost savings in terms of travel and accommodations. Copyright © 2015 Elsevier Masson SAS. All rights reserved.
Soy Consumption and the Risk of Prostate Cancer: An Updated Systematic Review and Meta-Analysis
Ranard, Katherine M.; Jeon, Sookyoung; Erdman, John W.
2018-01-01
Prostate cancer (PCa) is the second most commonly diagnosed cancer in men, accounting for 15% of all cancers in men worldwide. Asian populations consume soy foods as part of a regular diet, which may contribute to the lower PCa incidence observed in these countries. This meta-analysis provides a comprehensive updated analysis that builds on previously published meta-analyses, demonstrating that soy foods and their isoflavones (genistein and daidzein) are associated with a lower risk of prostate carcinogenesis. Thirty articles were included for analysis of the potential impacts of soy food intake, isoflavone intake, and circulating isoflavone levels, on both primary and advanced PCa. Total soy food (p < 0.001), genistein (p = 0.008), daidzein (p = 0.018), and unfermented soy food (p < 0.001) intakes were significantly associated with a reduced risk of PCa. Fermented soy food intake, total isoflavone intake, and circulating isoflavones were not associated with PCa risk. Neither soy food intake nor circulating isoflavones were associated with advanced PCa risk, although very few studies currently exist to examine potential associations. Combined, this evidence from observational studies shows a statistically significant association between soy consumption and decreased PCa risk. Further studies are required to support soy consumption as a prophylactic dietary approach to reduce PCa carcinogenesis. PMID:29300347
Langhorst, J; Heldmann, P; Henningsen, P; Kopke, K; Krumbein, L; Lucius, H; Winkelmann, A; Wolf, B; Häuser, W
2017-06-01
The regular update of the guidelines on fibromyalgia syndrome, AWMF number 145/004, was scheduled for April 2017. The guidelines were developed by 13 scientific societies and 2 patient self-help organizations coordinated by the German Pain Society. Working groups (n =8) with a total of 42 members were formed balanced with respect to gender, medical expertise, position in the medical or scientific hierarchy and potential conflicts of interest. A search of the literature for systematic reviews of randomized controlled trials of complementary and alternative therapies from December 2010 to May 2016 was performed in the Cochrane library, MEDLINE, PsycINFO and Scopus databases. Levels of evidence were assigned according to the classification system of the Oxford Centre for Evidence-Based Medicine version 2009. The strength of recommendations was formed by multiple step formalized procedures to reach a consensus. Efficacy, risks, patient preferences and applicability of available therapies were weighed up against each other. The guidelines were reviewed and approved by the board of directors of the societies engaged in the development of the guidelines. Meditative movement therapies (e.g. qi gong, tai chi and yoga) are strongly recommended. Acupuncture and weight reduction in cases of obesity can be considered.
Winkelmann, A; Bork, H; Brückle, W; Dexl, C; Heldmann, P; Henningsen, P; Krumbein, L; Pullwitt, V; Schiltenwolf, M; Häuser, W
2017-06-01
The regular update of the guidelines on fibromyalgia syndrome, AWMF number 145/004, was scheduled for April 2017. The guidelines were developed by 13 scientific societies and 2 patient self-help organizations coordinated by the German Pain Society. Working groups (n =8) with a total of 42 members were formed balanced with respect to gender, medical expertise, position in the medical or scientific hierarchy and potential conflicts of interest. A literature search for systematic reviews of randomized, controlled trials on physiotherapy, occupational therapy and physical therapy from December 2010 to May 2016 was performed in the Cochrane library, MEDLINE, PsycINFO and Scopus databases. Levels of evidence were assigned according to the classification system of the Oxford Centre for Evidence-Based Medicine version 2009. The strength of recommendations was achieved by multiple step formalized procedures to reach a consensus. Efficacy, risks, patient preferences and applicability of available therapies were weighed up against each other. The guidelines were reviewed and approved by the board of directors of the societies engaged in the development of the guidelines. Low to moderate intensity endurance and strength training are strongly recommended. Chiropractic, laser therapy, magnetic field therapy, massage and transcranial magnetic stimulation are not recommended.
Can efficient supply management in the operating room save millions?
Park, Kyung W; Dickerson, Cheryl
2009-04-01
Supply expenses occupy an ever-increasing portion of the expense budget in today's increasingly technologically complex operating rooms. Yet, little has been studied and published in the anesthesia literature. This review attempts to bring the topic of supply management to anesthesiologists, who play a significant role in operating room management. Little investigative work has been performed on supply management. Anecdotal reports suggest the benefits of a perpetual inventory system over a periodic inventory system. A perpetual inventory system uses utilization data to update inventory on hand continually and this information is linked to purchasing and restocking, whereas a periodic inventory system counts inventory at some regular intervals (such as annually) and uses average utilization to set par levels. On the basis of application of operational management concepts, ways of taking advantage of a perpetual inventory system to achieve savings in supply expenses are outlined. These include linking the operating room scheduling and supply order system, distributor-driven just-in-time delivery of case carts, continual updating of preference lists based on utilization patterns, increasing inventory turnovers, standardizing surgical practices, and vendor consignment of high unit-cost items such as implants. In addition, Lean principles of visual management and elimination of eight wastes may be applicable to supply management.
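The mechanics of a perpetual system are easy to sketch: stock is decremented as each case consumes supplies and compared against a par level immediately, so the reorder signal is generated at the moment of use rather than at the next periodic count. The Python below is only an illustration; item names, quantities, and par levels are invented.

    # Minimal sketch of perpetual inventory updating for operating-room supplies.
    inventory = {"suture_3-0": {"on_hand": 40, "par": 25},
                 "lap_sponge": {"on_hand": 120, "par": 80}}

    def record_case_usage(used):
        """Decrement stock as a case consumes supplies; flag items that fall below par."""
        reorders = []
        for item, qty in used.items():
            inventory[item]["on_hand"] -= qty
            if inventory[item]["on_hand"] < inventory[item]["par"]:
                reorders.append((item, inventory[item]["par"] - inventory[item]["on_hand"]))
        return reorders                    # feeds purchasing and restocking immediately

    print(record_case_usage({"suture_3-0": 18, "lap_sponge": 30}))
    # Under a periodic system the same shortfall would only surface at the next scheduled
    # count, which is what pushes par levels (and tied-up capital) higher.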
A brief qualitative survey on the utilization of Yoga research resources by Yoga teachers.
Bhavanani, Ananda Balayogi
2016-01-01
Yoga has become popular worldwide with increasing research on its therapeutic potential. However, it remains to be determined whether such findings actually percolate down into the teaching and practice of Yoga teachers/therapists. The aim of this survey was to document awareness of Yoga research findings in the Yoga community and find out how these were utilized. It was undertaken with a select group of 34 international Yoga teachers and therapists utilizing email and social media between August and December 2015. The majority of responders had well-established reputations in Yoga and were from diverse lineages, with 30 of them having more than 5 years of experience in the field. A set of eight questions was sent to them relating to the essentiality of Yoga research, how they updated themselves on research findings and whether such studies influenced their teaching and practice. Responses were compiled and appropriate statistics determined for quantitative aspects while feedback, comments and suggestions were noted in detail. About 89% agreed that it was essential to be up-to-date on Yoga research but only 70% updated themselves regularly, with the average number of papers read fully per year being <10. Most accessed information through general news reports, emails from contacts, and articles on internet sites, whereas only 7% did so through PubMed. About 60% felt these studies helped them in general teaching whereas 20% said that such studies had not really influenced it in any way. This survey provides a basic picture of a general lack of awareness of Yoga research amongst practicing Yoga teachers and therapists. Though a majority agree research is important, few seriously update themselves on this through scientific channels. With regard to future studies, most wanted "proof" that could be used to convince potential clients and felt that more qualitative methods should be applied.
Kozar, Mark D.; Kahle, Sue C.
2013-01-01
This report documents the standard procedures, policies, and field methods used by the U.S. Geological Survey’s (USGS) Washington Water Science Center staff for activities related to the collection, processing, analysis, storage, and publication of groundwater data. This groundwater quality-assurance plan changes through time to accommodate new methods and requirements developed by the Washington Water Science Center and the USGS Office of Groundwater. The plan is based largely on requirements and guidelines provided by the USGS Office of Groundwater, or the USGS Water Mission Area. Regular updates to this plan represent an integral part of the quality-assurance process. Because numerous policy memoranda have been issued by the Office of Groundwater since the previous groundwater quality assurance plan was written, this report is a substantial revision of the previous report, supplants it, and contains significant additional policies not covered in the previous report. This updated plan includes information related to the organization and responsibilities of USGS Washington Water Science Center staff, training, safety, project proposal development, project review procedures, data collection activities, data processing activities, report review procedures, and archiving of field data and interpretative information pertaining to groundwater flow models, borehole aquifer tests, and aquifer tests. Important updates from the previous groundwater quality assurance plan include: (1) procedures for documenting and archiving of groundwater flow models; (2) revisions to procedures and policies for the creation of sites in the Groundwater Site Inventory database; (3) adoption of new water-level forms to be used within the USGS Washington Water Science Center; (4) procedures for future creation of borehole geophysics, surface geophysics, and aquifer-test archives; and (5) use of the USGS Multi Optional Network Key Entry System software for entry of routine water-level data collected as part of long-term water-level monitoring networks.
Soares-Miranda, Luisa; Siscovick, David S; Psaty, Bruce M; Longstreth, W T; Mozaffarian, Dariush
2016-01-12
Although guidelines suggest that older adults engage in regular physical activity (PA) to reduce cardiovascular disease (CVD), surprisingly few studies have evaluated this relationship, especially in those >75 years. In addition, with advancing age the ability to perform some types of PA might decrease, making light-moderate exercise such as walking especially important to meet recommendations. A prospective cohort analysis was conducted among 4207 US men and women with a mean age of 73 years (standard deviation=6) who were free of CVD at baseline in the Cardiovascular Health Study and were followed from 1989 to 1999. PA was assessed and cumulatively updated over time to minimize misclassification and assess the long-term effects of habitual activity. Walking (pace, blocks, combined walking score) was updated annually from baseline through 1999. Leisure-time activity and exercise intensity were updated at baseline, 1992, and 1996. Incident CVD (fatal or nonfatal myocardial infarction, coronary death, or stroke) was adjudicated using medical records. During 41,995 person-years of follow-up, 1182 CVD events occurred. After multivariable adjustment, greater PA was inversely associated with coronary heart disease, stroke (especially ischemic stroke), and total CVD, even in those ≥75 years. Walking pace, distance, and overall walking score, leisure-time activity, and exercise intensity were each associated with lower risk. For example, in comparison with a walking pace <2 mph, those who habitually walked at a pace >3 mph had a lower risk of coronary heart disease (0.50; confidence interval, 0.38-0.67), stroke (0.47; confidence interval, 0.33-0.66), and CVD (0.50; confidence interval, 0.40-0.62). These data provide empirical evidence supporting PA recommendations, in particular, walking, to reduce the incidence of CVD among older adults. © 2015 American Heart Association, Inc.
Social Media Use and Access to Digital Technology in US Young Adults in 2016.
Villanti, Andrea C; Johnson, Amanda L; Ilakkuvan, Vinu; Jacobs, Megan A; Graham, Amanda L; Rath, Jessica M
2017-06-07
In 2015, 90% of US young adults with Internet access used social media. Digital and social media are highly prevalent modalities through which young adults explore identity formation, and by extension, learn and transmit norms about health and risk behaviors during this developmental life stage. The purpose of this study was to provide updated estimates of social media use from 2014 to 2016 and correlates of social media use and access to digital technology in data collected from a national sample of US young adults in 2016. Young adult participants aged 18-24 years in Wave 7 (October 2014, N=1259) and Wave 9 (February 2016, N=989) of the Truth Initiative Young Adult Cohort Study were asked about use frequency for 11 social media sites and access to digital devices, in addition to sociodemographic characteristics. Regular use was defined as using a given social media site at least weekly. Weighted analyses estimated the prevalence of use of each social media site, overlap between regular use of specific sites, and correlates of using a greater number of social media sites regularly. Bivariate analyses identified sociodemographic correlates of access to specific digital devices. In 2014, 89.42% (weighted n, 1126/1298) of young adults reported regular use of at least one social media site. This increased to 97.5% (weighted n, 965/989) of young adults in 2016. Among regular users of social media sites in 2016, the top five sites were Tumblr (85.5%), Vine (84.7%), Snapchat (81.7%), Instagram (80.7%), and LinkedIn (78.9%). Respondents reported regularly using an average of 7.6 social media sites, with 85% using 6 or more sites regularly. Overall, 87% of young adults reported access or use of a smartphone with Internet access, 74% a desktop or laptop computer with Internet access, 41% a tablet with Internet access, 29% a smart TV or video game console with Internet access, 11% a cell phone without Internet access, and 3% none of these. Access to all digital devices with Internet was lower in those reporting a lower subjective financial situation; there were also significant differences in access to specific digital devices with Internet by race, ethnicity, and education. The high mean number of social media sites used regularly and the substantial overlap in use of multiple social media sites reflect the rapidly changing social media environment. Mobile devices are a primary channel for social media, and our study highlights disparities in access to digital technologies with Internet access among US young adults by race/ethnicity, education, and subjective financial status. Findings from this study may guide the development and implementation of future health interventions for young adults delivered via the Internet or social media sites. ©Andrea C Villanti, Amanda L Johnson, Vinu Ilakkuvan, Megan A Jacobs, Amanda L Graham, Jessica M Rath. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 07.06.2017.
Nolte, Florian; Angelucci, Emanuele; Breccia, Massimo; Gattermann, Norbert; Santini, Valeria; Vey, Norbert; Hofmann, Wolf-Karsten
2015-10-01
Myelodysplastic syndromes (MDS) are oligoclonal hematopoietic disorders characterized by peripheral cytopenias, with anemia being the most prevalent feature. The majority of patients will depend on regular transfusions of packed red blood cells (PRBC) during the course of the disease. In particular, patients with MDS at low risk for transformation into acute myeloid leukemia and at low risk of early death will receive PRBC transfusions on a regular basis, which puts them at high risk for transfusional iron overload. Transfusion dependence has been associated with a negative impact on organ function and reduced life expectancy. Recently, several retrospective but also some prospective studies have indicated that transfusion-dependent patients with MDS might benefit from consistent iron chelation with regard to morbidity and mortality. However, low treatment adherence due to adverse events, mainly gastrointestinal in nature, is an important obstacle to achieving sufficient iron chelation in MDS patients. Here, we summarize and discuss the existing data on deferasirox in low-risk MDS published so far and provide recommendations for the optimal management of gastrointestinal adverse events during iron chelation, aiming at improving treatment compliance and, hence, sufficiently removing excess iron from the patients. Copyright © 2015. Published by Elsevier Ltd.
Mouratidou, T; Miguel, M L; Androutsos, O; Manios, Y; De Bourdeaudhuij, I; Cardon, G; Kulaga, Z; Socha, P; Galcheva, S; Iotova, V; Payr, A; Koletzko, B; Moreno, L A
2014-08-01
The ToyBox-intervention is a kindergarten-based, family-involved intervention targeting multiple lifestyle behaviours in preschool children, their teachers and their families. This intervention was conducted in six European countries, namely Belgium, Bulgaria, Germany, Greece, Poland and Spain. The aim of this paper is to provide a descriptive overview of the harmonization and standardization procedures of the baseline and follow-up evaluation of the study (and substudies). Steps related to the study's operational, standardization and harmonization procedures as well as the impact and outcome evaluation assessment tools used are presented. Experiences from the project highlight the importance of safeguarding the measurement process to minimize data heterogeneity derived from potential measurement error and country-by-country differences. In addition, it was made clear that continuing quality control and support is an important component of such studies. For this reason, well-supported communication channels, such as regular email updates and teleconferences, and regular internal and external meetings to ensure smooth and accurate implementation were in place during the study. The ToyBox-intervention and its harmonized and standardized procedures can serve as a successful case study for future studies evaluating the efficacy of similar interventions. © 2014 World Obesity.
Total variation-based method for radar coincidence imaging with model mismatch for extended target
NASA Astrophysics Data System (ADS)
Cao, Kaicheng; Zhou, Xiaoli; Cheng, Yongqiang; Fan, Bo; Qin, Yuliang
2017-11-01
Originating from traditional optical coincidence imaging, radar coincidence imaging (RCI) is a staring/forward-looking imaging technique. In RCI, the reference matrix must be computed precisely to reconstruct the image as desired; unfortunately, such precision is almost impossible to achieve due to the existence of model mismatch in practical applications. Although some conventional sparse recovery algorithms have been proposed to solve the model-mismatch problem, they are inapplicable to nonsparse targets. We therefore derive the signal model of RCI with model mismatch by replacing the sparsity constraint with total variation (TV) regularization in the sparse total least squares optimization problem; in this manner, we obtain the objective function of RCI with model mismatch for an extended target. A more robust and efficient algorithm called TV-TLS is proposed, in which the objective function is divided into two parts and the perturbation matrix and scattering coefficients are updated alternately. Moreover, owing to the ability of TV regularization to recover a sparse signal or an image with a sparse gradient, the TV-TLS method is also applicable to sparse recovery. Results of numerical experiments demonstrate that, for uniform extended targets, sparse targets, and real extended targets, the algorithm achieves good imaging performance both in suppressing noise and in adapting to model mismatch.
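The alternating structure described above can be illustrated with a small numerical sketch. The code below is not the authors' TV-TLS algorithm; it is a minimal stand-in that alternates a gradient step on a least-squares term with a smoothed isotropic TV penalty (for the scattering coefficients) and a closed-form ridge-regularized update of the perturbation matrix. All function names, step sizes, and the smoothed-TV formulation are illustrative assumptions.

```python
import numpy as np

def tv_grad(x_img, eps=1e-6):
    """Gradient of a smoothed isotropic total-variation penalty (approximate,
    periodic-boundary divergence via np.roll)."""
    dx = np.diff(x_img, axis=1, append=x_img[:, -1:])
    dy = np.diff(x_img, axis=0, append=x_img[-1:, :])
    mag = np.sqrt(dx**2 + dy**2 + eps)
    px, py = dx / mag, dy / mag
    div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
    return -div

def tv_tls_sketch(y, S, shape, lam=1e-2, mu=1e-2,
                  n_outer=20, n_inner=50, step=1e-3):
    """Alternate between scattering coefficients x (image of `shape`,
    S.shape[1] == prod(shape)) and a perturbation E of the reference
    matrix S, so the effective model is (S + E) x ~ y."""
    x = np.zeros(S.shape[1])
    E = np.zeros_like(S)
    for _ in range(n_outer):
        # (1) update x: gradient descent on 0.5*||(S+E)x - y||^2 + lam*TV(x)
        A = S + E
        for _ in range(n_inner):        # step size is illustrative only
            r = A @ x - y
            g = A.T @ r + lam * tv_grad(x.reshape(shape)).ravel()
            x -= step * g
        # (2) update E: minimizes ||r - E x||^2 + mu*||E||_F^2 in closed form
        r = y - S @ x
        E = np.outer(r, x) / (x @ x + mu)
    return x.reshape(shape), E
```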
Recommendations for the management of sickle cell disease in South Africa.
Alli, N A; Patel, M; Alli, H D; Bassa, F; Coetzee, M J; Davidson, A; Essop, M R; Lakha, A; Louw, V J; Novitzky, N; Philip, V; Poole, J E; Wainwright, R D
2014-11-01
The spectrum of sickle cell disease (SCD) encompasses a heterogeneous group of disorders that include: (i) homozygous SCD (HbSS), also referred to as sickle cell anaemia; (ii) heterozygous SCD (HbAS), also referred to as sickle cell trait; and (iii) compound heterozygous states such as HbSC disease, HbSβ thalassaemia, etc. Homozygous or compound heterozygous SCD patients manifest with clinical disease of varying severity that is influenced by biological and environmental factors, whereas subjects with sickle cell trait are largely asymptomatic. SCD is characterized by vaso-occlusive episodes that result in tissue ischaemia and pain in the affected region. Repeated infarctive episodes cause organ damage and may eventually lead to organ failure. For effective management, regular follow-up with support from a multidisciplinary healthcare team is necessary. The chronic nature of the disease, the steady increase in patient numbers, and relapsing acute episodes have cost implications that are likely to impact on provincial and national health budgets. Limited resources mandate local management protocols for the purposes of consistency and standardisation, which could also facilitate sharing of resources between centres for maximal utility. These recommendations have been developed for the South African setting, and it is intended to update them regularly to meet new demands and challenges.
What Drives the Variability of the Mid-Latitude Ionosphere?
NASA Astrophysics Data System (ADS)
Goncharenko, L. P.; Zhang, S.; Erickson, P. J.; Harvey, L.; Spraggs, M. E.; Maute, A. I.
2016-12-01
The state of the ionosphere is determined by the superposition of regular changes and stochastic variations of the ionospheric parameters. Regular variations are represented by diurnal, seasonal and solar cycle changes, and can be well described by empirical models. Short-term perturbations that vary from a few seconds to a few hours or days can be induced in the ionosphere by solar flares, changes in the solar wind, coronal mass ejections, travelling ionospheric disturbances, or meteorological influences. We use over 40 years of observations by the Millstone Hill incoherent scatter radar (42.6°N, 288.5°E) to develop an updated empirical model of ionospheric parameters, and wintertime data collected in 2004-2016 to study variability in ionospheric parameters. We also use NASA MERRA-2 atmospheric reanalysis data to examine possible connections between the state of the stratosphere and mesosphere and the upper atmosphere (250-400 km). The major SSW of January 2013 is selected for in-depth study and reveals large anomalies in ionospheric parameters. Modeling with the NCAR Thermosphere-Ionosphere-Mesosphere-Electrodynamics General Circulation Model (TIME-GCM), nudged by a WACCM-GEOS5 simulation, indicates that during the 2013 SSW the neutral and ion temperatures in the polar through mid-latitude region deviate from their seasonal behavior.
Abdomen and spinal cord segmentation with augmented active shape models
Xu, Zhoubing; Conrad, Benjamin N.; Baucom, Rebeccah B.; Smith, Seth A.; Poulose, Benjamin K.; Landman, Bennett A.
2016-01-01
Active shape models (ASMs) have been widely used for extracting human anatomies in medical images given their capability for shape regularization and topology preservation. However, sensitivity to model initialization and local correspondence search often undermines their performance, especially around highly variable contexts in computed-tomography (CT) and magnetic resonance (MR) images. In this study, we propose an augmented ASM (AASM) by integrating the multiatlas label fusion (MALF) and level set (LS) techniques into the traditional ASM framework. Using AASM, landmark updates are optimized globally via a region-based LS evolution applied on the probability map generated from MALF. This augmentation effectively extends the search range of correspondent landmarks while reducing sensitivity to the image contexts and improves the segmentation robustness. We propose the AASM framework as a two-dimensional segmentation technique targeting structures with one axis of regularity. We apply the AASM approach to abdominal CT and spinal cord (SC) MR segmentation challenges. On 20 CT scans, the AASM segmentation of the whole abdominal wall enables subcutaneous/visceral fat measurement, with high correlation to the measurement derived from manual segmentation. On 28 3T MR scans, AASM yields better performance than other state-of-the-art approaches in segmenting white/gray matter in the SC. PMID:27610400
[Exercise contacts in the treatment of substance dependence and mental disorders].
Skrede, Atle; Munkvold, Harald; Watne, Øyvind; Martinsen, Egil W
2006-08-10
Physical exercise is useful for individuals with mental disorders and additional substance dependency or abuse. Many exercise actively while in an institution, but a major challenge is to continue after discharge. Many patients are isolated and lonely and find it hard to motivate themselves to exercise on their own. In Sogn og Fjordane county, Norway, the problem was addressed through a training program for exercise contacts. These are social support persons who were thus assigned a new function. By way of a 40-hour course that covered physical activity, psychological problems, and substance abuse and dependency, lay people were trained to help people in their home environment. By the end of 2005, almost 300 exercise contacts, living in 25 of the 26 municipalities in the county, had passed the course exam. Their expertise is in high demand and more courses have been requested. The course evaluations have been quite positive. In particular, the practical instruction in how to exercise, in combination with updated theory on substance abuse/dependence and mental disorders, was highly appreciated. Clients were helped to continue with regular physical activity, and they appreciated the improved physical and mental health associated with regular exercise. Moreover, the exercise contacts help clients break social isolation and have given them access to common social arenas.
He, Xiaowei; Liang, Jimin; Wang, Xiaorui; Yu, Jingjing; Qu, Xiaochao; Wang, Xiaodong; Hou, Yanbin; Chen, Duofang; Liu, Fang; Tian, Jie
2010-11-22
In this paper, we present an incomplete variables truncated conjugate gradient (IVTCG) method for bioluminescence tomography (BLT). Considering the sparse characteristic of the light source and the insufficient surface measurements in BLT scenarios, we combine a sparseness-inducing (ℓ1 norm) regularization term with a quadratic error term in the IVTCG-based framework for solving the inverse problem. By limiting the number of variables updated at each iteration and combining a variable splitting strategy to find the search direction more efficiently, the method obtains fast and stable source reconstruction, even without a priori information about the permissible source region and multispectral measurements. Numerical experiments on a mouse atlas validate the effectiveness of the method. In vivo mouse experimental results further indicate its potential for a practical BLT system.
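For orientation, an objective combining an ℓ1 regularization term with a quadratic error term can be minimized with the standard iterative shrinkage-thresholding algorithm (ISTA). The sketch below shows that baseline only; it is not the IVTCG method itself, which truncates conjugate-gradient updates over a limited set of variables and uses variable splitting.

```python
import numpy as np

def ista_l1(A, y, lam, n_iter=500):
    """Minimize 0.5*||A x - y||^2 + lam*||x||_1 by ISTA."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        g = A.T @ (A @ x - y)              # gradient of the quadratic term
        z = x - g / L                      # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return x
```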
No. 263-Maternity Leave in Normal Pregnancy.
Leduc, Dean
2017-10-01
To assist maternity care providers in recognizing and discussing health- and illness-related issues in pregnancy and their relationship to maternity benefits. Published literature was retrieved through searches of PubMed or Medline, CINAHL, and The Cochrane Library in 2009 using appropriate controlled vocabulary (e.g., maternity benefits) and key words (e.g., maternity, benefits, pregnancy). Results were restricted to systematic reviews, randomized controlled trials/controlled clinical trials, and observational studies. There were no date or language restrictions. Searches were updated on a regular basis and incorporated in the guideline to December 2009. Grey (unpublished) literature was identified through searching the web sites of health technology assessment and health technology assessment-related agencies, clinical practice guideline collections, clinical trial registries, and national and international medical specialty societies. Copyright © 2017. Published by Elsevier Inc.
Recursive Bayesian recurrent neural networks for time-series modeling.
Mirikitani, Derrick T; Nikolaev, Nikolay
2010-02-01
This paper develops a probabilistic approach to recursive second-order training of recurrent neural networks (RNNs) for improved time-series modeling. A general recursive Bayesian Levenberg-Marquardt algorithm is derived to sequentially update the weights and the covariance (Hessian) matrix. The main strengths of the approach are a principled handling of the regularization hyperparameters that leads to better generalization, and stable numerical performance. The framework involves the adaptation of a noise hyperparameter and local weight prior hyperparameters, which represent the noise in the data and the uncertainties in the model parameters. Experimental investigations using artificial and real-world data sets show that RNNs equipped with the proposed approach outperform standard real-time recurrent learning and extended Kalman training algorithms for recurrent networks, as well as other contemporary nonlinear neural models, on time-series modeling.
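A hedged sketch of the sequential, second-order flavor of training described here is given below, written as an extended-Kalman-style update of the flattened weight vector with a covariance matrix P. The paper's recursive Bayesian Levenberg-Marquardt additionally adapts the noise and weight-prior hyperparameters, which is not reproduced; f and jac are hypothetical callables for the network output and its Jacobian at the current time step.

```python
import numpy as np

def sequential_weight_update(w, P, y_t, f, jac, r=1e-2, q=1e-6):
    """One sequential second-order update of an RNN weight vector.

    w   : flattened weight vector
    P   : weight covariance (surrogate for the inverse Hessian)
    y_t : scalar target at time t
    f   : callable, f(w) -> model prediction at time t (hypothetical)
    jac : callable, jac(w) -> d f / d w, shape (len(w),) (hypothetical)
    r   : output-noise variance hyperparameter
    q   : small process noise keeping P well conditioned (regularization)
    """
    H = jac(w)                              # output Jacobian w.r.t. weights
    s = H @ P @ H + r                       # innovation variance
    K = P @ H / s                           # gain vector
    w_new = w + K * (y_t - f(w))            # weight update
    P_new = P - np.outer(K, H @ P) + q * np.eye(len(w))
    return w_new, P_new
```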
MEPD: a Medaka gene expression pattern database
Henrich, Thorsten; Ramialison, Mirana; Quiring, Rebecca; Wittbrodt, Beate; Furutani-Seiki, Makoto; Wittbrodt, Joachim; Kondoh, Hisato
2003-01-01
The Medaka Expression Pattern Database (MEPD) stores and integrates information on gene expression during embryonic development of the small freshwater fish Medaka (Oryzias latipes). Expression patterns of genes identified by ESTs are documented by images and by descriptions through parameters such as staining intensity, category and comments and through a comprehensive, hierarchically organized dictionary of anatomical terms. Sequences of the ESTs are available and searchable through BLAST. ESTs in the database are clustered upon entry and have been blasted against public databases. The BLAST results are updated regularly, stored within the database and searchable. The MEPD is a project within the Medaka Genome Initiative (MGI) and entries will be interconnected to integrated genomic map databases. MEPD is accessible through the WWW at http://medaka.dsp.jst.go.jp/MEPD. PMID:12519950
A narrative review of alcohol consumption as a risk factor for global burden of disease.
Rehm, Jürgen; Imtiaz, Sameer
2016-10-28
Since the original Comparative Risk Assessment (CRA) for alcohol consumption as part of the Global Burden of Disease Study for 1990, there had been regular updates of CRAs for alcohol from the World Health Organization and/or the Institute for Health Metrics and Evaluation. These studies have become more and more refined with respect to establishing causality between dimensions of alcohol consumption and different disease and mortality (cause of death) outcomes, refining risk relations, and improving the methodology for estimating exposure and alcohol-attributable burden. The present review will give an overview on the main results of the CRAs with respect to alcohol consumption as a risk factor, sketch out new trends and developments, and draw implications for future research and policy.
Guideline.gov: A Database of Clinical Specialty Guidelines.
El-Khayat, Yamila M; Forbes, Carrie S; Coghill, Jeffrey G
2017-01-01
The National Guidelines Clearinghouse (NGC), also known as Guideline.gov, is a database of resources to assist health care providers with a central depository of guidelines for clinical specialty areas in medicine. The database is provided free of charge and is sponsored by the U.S. Department of Health and Human Services and the Agency for Healthcare Research and Quality. The guidelines for treatment are updated regularly, with new guidelines replacing older guidelines every five years. There are hundreds of current guidelines with more added each week. The purpose and goal of NGC is to provide physicians, nurses, and other health care providers, insurance companies, and others in the field of health care with a unified database of the most current, detailed, relevant, and objective clinical practice guidelines.
CancerNet redistribution via WWW.
Quade, G; Püschel, N; Far, F
1996-01-01
CancerNet from the National Cancer Institute contains nearly 500 ASCII files, updated monthly, with up-to-date information about cancer and the "gold standard" in tumor therapy. Perl scripts are used to convert these files to HTML documents. A complex algorithm, using regular expression matching and extensive exception handling, detects headlines, listings and other constructs of the original ASCII text and converts them into their HTML counterparts. A table of contents is also created during the process. The resulting files are indexed for full-text search via WAIS. Building the complete CancerNet WWW redistribution takes less than two hours with a minimum of manual work. At 26,000 requests for information from our service per month, the average cost of delivering one document worldwide is about 19 cents.
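The original pipeline used Perl; the fragment below re-imagines the core idea in Python (detect ALL-CAPS headlines and bulleted items with regular expressions, emit HTML, and collect a table of contents). The conversion rules shown are illustrative assumptions, not the actual CancerNet scripts.

```python
import re
import html

BULLET = re.compile(r"^\s*[-*o]\s+")
HEADLINE = re.compile(r"[A-Z][A-Z .,'()-]{3,}")

def ascii_to_html(text):
    """Convert a plain-ASCII document into simple HTML (illustrative only)."""
    body, toc = [], []
    for line in text.splitlines():
        stripped = line.strip()
        if not stripped:
            body.append("<p></p>")
        elif HEADLINE.fullmatch(stripped):            # ALL-CAPS headline
            anchor = re.sub(r"\W+", "-", stripped.lower()).strip("-")
            title = html.escape(stripped.title())
            toc.append('<li><a href="#%s">%s</a></li>' % (anchor, title))
            body.append('<h2 id="%s">%s</h2>' % (anchor, title))
        elif BULLET.match(line):                      # bulleted list item
            body.append("<li>%s</li>" % html.escape(BULLET.sub("", line)))
        else:
            body.append(html.escape(line))
    return "<ul>%s</ul>\n%s" % ("".join(toc), "\n".join(body))

print(ascii_to_html("TREATMENT OVERVIEW\n- option one\n- option two\nPlain text."))
```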
Update on antibacterial soaps: the FDA takes a second look at triclosans.
Bergstrom, Kendra Gail
2014-04-01
In December 2013 the Food and Drug Administration announced it would look further into the safety and efficacy of the biocide triclosan and requested further safety data as part of a new review with the Environmental Protection Agency. The use of triclosan has increased exponentially since its introduction in 1972, to the point that 75% of commercial soap brands contain triclosan and 76% of a nationwide sample of adults and children excrete triclosan in the urine. This announcement has prompted an important dialog about the appropriate use of all over-the-counter biocides. Particular concerns include whether these biocides are more effective than regular soaps, whether they may create new drug-resistant bacteria, and whether they may also act as hormone disruptors in humans or the environment.
Data mining in newt-omics, the repository for omics data from the newt.
Looso, Mario; Braun, Thomas
2015-01-01
Salamanders are an excellent model organism to study regenerative processes due to their unique ability to regenerate lost appendages or organs. Straightforward bioinformatics tools to analyze and take advantage of the growing number of "omics" studies performed in salamanders were lacking so far. To overcome this limitation, we have generated a comprehensive data repository for the red-spotted newt Notophthalmus viridescens, named newt-omics, merging omics style datasets on the transcriptome and proteome level including expression values and annotations. The resource is freely available via a user-friendly Web-based graphical user interface ( http://newt-omics.mpi-bn.mpg.de) that allows access and queries to the database without prior bioinformatical expertise. The repository is updated regularly, incorporating new published datasets from omics technologies.
Kurtz, M.; Bennett, T.; Garvin, P.; Manuel, F.; Williams, M.; Langreder, S.
1991-01-01
Because of the rapid evolution of the heart, heart/lung, liver, kidney and kidney/pancreas transplant programs at our institution, and because of a lack of an existing comprehensive database, we were required to develop a computerized management information system capable of supporting both clinical and research requirements of a multifaceted transplant program. SLUMIS (ST. LOUIS UNIVERSITY MULTI-ORGAN INFORMATION SYSTEM) was developed for the following reasons: 1) to comply with the reporting requirements of various transplant registries, 2) for reporting to an increasing number of government agencies and insurance carriers, 3) to obtain updates of our operative experience at regular intervals, 4) to integrate the Histocompatibility and Immunogenetics Laboratory (HLA) for online test result reporting, and 5) to facilitate clinical investigation. PMID:1807741
Design of a secure remote management module for a software-operated medical device.
Burnik, Urban; Dobravec, Štefan; Meža, Marko
2017-12-09
Software-based medical devices need to be maintained throughout their entire life cycle. The efficiency of after-sales maintenance can be improved by managing medical systems remotely. This paper presents how to design remote-access function extensions so as to prevent the risks imposed by uncontrolled remote access. A thorough analysis of standards and legislative requirements regarding safe operation and risk management of medical devices is presented. Based on the formal requirements, a multi-layer design solution is proposed that eliminates remote-connectivity risks by strictly separating regular device functionality from the remote management service, deploying encrypted communication links, and using digital signatures to prevent mishandling of software images. The proposed approach may also be used to provide efficient version updates for existing medical device designs.
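The abstract names digitally signed software images but no particular mechanism or library; as one possible illustration, the sketch below signs and verifies an image with Ed25519 from the Python cryptography package. The key handling and the placeholder image bytes are assumptions.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Vendor side: sign the software image before it is pushed to the device.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()        # shipped with the device
image = b"example medical-device firmware image"   # stand-in for the real image bytes
signature = private_key.sign(image)

# Device side: refuse to install any image whose signature does not verify.
def verify_image(image_bytes, signature_bytes):
    try:
        public_key.verify(signature_bytes, image_bytes)
        return True
    except InvalidSignature:
        return False

print(verify_image(image, signature))            # True
print(verify_image(image + b"tampered", signature))  # False
```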
Surface-Water Quality-Assurance Plan for the Tallahassee Office, U.S. Geological Survey
Tomlinson, Stewart A.
2006-01-01
This Tallahassee Office Surface-Water Quality-Assurance Plan documents the standards, policies, and procedures used by the Tallahassee Office for activities related to the collection, processing, storage, analysis, and publication of surface-water data. This plan serves as a guide to all Tallahassee Office personnel involved in surface-water data activities, and changes as the needs and requirements of the Tallahassee Office, Florida Integrated Science Center, and Water Discipline change. Regular updates to this Plan represent an integral part of the quality-assurance process. In the Tallahassee Office, direct oversight and responsibility by the employee(s) assigned to a surface-water station, combined with team approaches in all work efforts, assure high-quality data, analyses, reviews, and reports for cooperating agencies and the public.
Application of CRISPR/Cas9 in plant biology.
Liu, Xuan; Wu, Surui; Xu, Jiao; Sui, Chun; Wei, Jianhe
2017-05-01
The CRISPR/Cas (clustered regularly interspaced short palindromic repeats/CRISPR-associated proteins) system was first identified in bacteria and archaea, where it can degrade exogenous substrates. It was developed as a gene editing technology in 2013. Over the subsequent years, it has received extensive attention owing to its easy manipulation, high efficiency, and wide application in gene mutation and transcriptional regulation in mammals and plants. The CRISPR/Cas process is being optimized constantly and its applications have expanded dramatically. Therefore, CRISPR/Cas is considered a revolutionary technology in plant biology. Here, we introduce the mechanism of the type II CRISPR/Cas system known as CRISPR/Cas9, summarize its recent advances in various applications in plants, and discuss its future prospects to provide an argument for its use in the study of medicinal plants.
Library preparation and data analysis packages for rapid genome sequencing.
Pomraning, Kyle R; Smith, Kristina M; Bredeweg, Erin L; Connolly, Lanelle R; Phatale, Pallavi A; Freitag, Michael
2012-01-01
High-throughput sequencing (HTS) has quickly become a valuable tool for comparative genetics and genomics and is now regularly carried out in laboratories that are not connected to large sequencing centers. Here we describe an updated version of our protocol for constructing single- and paired-end Illumina sequencing libraries, beginning with purified genomic DNA. The present protocol can also be used for "multiplexing," i.e. the analysis of several samples in a single flowcell lane by generating "barcoded" or "indexed" Illumina sequencing libraries in a way that is independent from Illumina-supported methods. To analyze sequencing results, we suggest several independent approaches but end users should be aware that this is a quickly evolving field and that currently many alignment (or "mapping") and counting algorithms are being developed and tested.
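As a hedged illustration of the "barcoded" or "indexed" library idea, the snippet below demultiplexes reads by an exact-match inline barcode at the 5' end; the barcode length, placement, and sample sheet are assumptions, not the protocol's actual scripts.

```python
from collections import defaultdict

# Hypothetical sample sheet: barcode sequence -> sample name.
BARCODES = {"ACGTAC": "sample_A", "TGCATG": "sample_B"}

def demultiplex(reads, barcode_len=6):
    """Assign reads to samples by an exact-match inline barcode at the read start."""
    bins = defaultdict(list)
    for read in reads:
        tag = read[:barcode_len]
        sample = BARCODES.get(tag, "undetermined")
        bins[sample].append(read[barcode_len:])   # trim the barcode off
    return bins

reads = ["ACGTACTTAGGCAT", "TGCATGCCGTTAAC", "NNNNNNAAAAAACG"]
print({sample: len(seqs) for sample, seqs in demultiplex(reads).items()})
```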
The zebrafish as a model for complex tissue regeneration
Gemberling, Matthew; Bailey, Travis J.; Hyde, David R.; Poss, Kenneth D.
2013-01-01
For centuries, philosophers and scientists have been fascinated by the principles and implications of regeneration in lower vertebrate species. Two features have made zebrafish an informative model system for determining mechanisms of regenerative events. First, they are highly regenerative, able to regrow amputated fins, as well as a lesioned brain, retina, spinal cord, heart, and other tissues. Second, they are amenable to both forward and reverse genetic approaches, with a research toolset regularly updated by an expanding community of zebrafish researchers. Zebrafish studies have helped identify new mechanistic underpinnings of regeneration in multiple tissues, and in some cases have served as a guide for contemplating regenerative strategies in mammals. Here, we review the recent history of zebrafish as a genetic model system for understanding how and why tissue regeneration occurs. PMID:23927865
Network Communication as a Service-Oriented Capability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnston, William; Johnston, William; Metzger, Joe
2008-01-08
In widely distributed systems generally, and in science-oriented Grids in particular, software, CPU time, storage, etc., are treated as "services" -- they can be allocated and used with service guarantees that allow them to be integrated into systems that perform complex tasks. Network communication is currently not a service -- it is provided, in general, as a "best effort" capability with no guarantees and only statistical predictability. In order for Grids (and most types of systems with widely distributed components) to be successful in performing the sustained, complex tasks of large-scale science -- e.g., the multi-disciplinary simulation of next generation climate modeling and the management and analysis of the petabytes of data that will come from the next generation of scientific instruments (which is very soon for the LHC at CERN) -- networks must provide communication capability that is service-oriented: that is, it must be configurable, schedulable, predictable, and reliable. In order to accomplish this, the research and education network community is undertaking a strategy that involves changes in network architecture to support multiple classes of service; development and deployment of service-oriented communication services; and monitoring and reporting in a form that is directly useful to the application-oriented system so that it may adapt to communications failures. In this paper we describe ESnet's approach to each of these -- an approach that is part of an international community effort to have intra-distributed system communication be based on a service-oriented capability.
Little, Mark; Cooper, Jim; Gope, Monica; Hahn, Kelly A; Kibar, Cem; McCoubrie, David; Ng, Conrad; Robinson, Annie; Soderstrom, Jessamine; Leclercq, Muriel
2012-08-01
The Royal Perth Hospital (RPH; Perth, Australia) has been the receiving facility for burns patients in two separate disasters. In 2002, RPH received 28 severely injured burns patients after the Bali bombing, and in 2009 RPH received 23 significantly burnt patients as a result of an explosion on board a foreign vessel in the remote Ashmore Reef Islands (840 km west of Darwin). The aim of this paper is to identify the interventions developed following the Bali bombing in 2002 and to review the effectiveness of their implementation in the subsequent burns disaster. A comparative case study analysis using a standardised approach was used to describe the context, with debrief reports and ED photographs from both disasters used for evaluation. The implementation of regular ED disaster response planning and training, early Code Brown notification of the entire hospital with regular updates, early clearing of inpatient beds, use of the Short Message Service to communicate regularly with ED staff, control of public and media access to the ED, visual identification of staff within the ED, early panendoscopy to ascertain intubation needs, and senior clinical decision makers in all areas of the ED were all acknowledged as effective based on the debrief reports. There was a reduction in ED length of stay (150 to 55 min) and no deaths occurred; however, the quantitative analysis can only be suggestive rather than a direct measure of improvement, given the likelihood of other system changes. A number of lessons observed from the Bali experience in 2002 have led to improvements in practice and to lessons learned. © 2012 The Authors. EMA © 2012 Australasian College for Emergency Medicine and Australasian Society for Emergency Medicine.
An efficient decoding for low density parity check codes
NASA Astrophysics Data System (ADS)
Zhao, Ling; Zhang, Xiaolin; Zhu, Manjie
2009-12-01
Low density parity check (LDPC) codes are a class of forward-error-correction codes. They are among the best-known codes capable of achieving low bit error rates (BER) approaching Shannon's capacity limit. Recently, LDPC codes have been adopted by the European Digital Video Broadcasting (DVB-S2) standard, and have also been proposed for the emerging IEEE 802.16 fixed and mobile broadband wireless-access standard. The Consultative Committee for Space Data Systems (CCSDS) has also recommended using LDPC codes in deep space and near-Earth communications. It is clear that LDPC codes will be widely used in wired and wireless communication, magnetic recording, optical networking, DVB, and other fields in the near future. Efficient hardware implementation of LDPC codes is therefore of great interest, since LDPC codes are being considered for a wide range of applications. This paper presents an efficient partially parallel decoder architecture suited for quasi-cyclic (QC) LDPC codes, using the belief propagation algorithm for decoding. Algorithmic transformation and architecture-level optimization are incorporated to reduce the critical path. First, the parity-check matrix of the LDPC code is analyzed to find the relationship between the row weight and the column weight. The sharing level of the check node updating units (CNUs) and the variable node updating units (VNUs) is then determined according to this relationship. The CNUs and VNUs are then rearranged and divided into several smaller parts; with the help of some assistant logic circuitry, these smaller parts can be grouped into CNUs during check node updating and into VNUs during variable node updating. These smaller parts are called node update kernel units (NKUs) and the assistant logic circuits are called node update auxiliary units (NAUs). With the NAUs' help, the two steps of the iterative operation are completed by the NKUs, which brings a great reduction in hardware resources. Meanwhile, efficient techniques have been developed to reduce the computation delay of the node processing units and to minimize hardware overhead for parallel processing. This method may be applied not only to regular LDPC codes, but also to irregular ones. Based on the proposed architecture, a (7493, 6096) irregular QC-LDPC code decoder is described in the Verilog hardware description language and implemented on an Altera Stratix II EP2S130 field programmable gate array (FPGA). The implementation results show that over 20% of the logic core size can be saved compared with conventional partially parallel decoder architectures, without any performance degradation. With a 100 MHz decoding clock, the proposed decoder achieves a maximum (source data) decoding throughput of 133 Mb/s at 18 iterations.
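The hardware sharing scheme is specific to the paper, but the underlying message-passing arithmetic can be sketched in software. The functions below implement the min-sum approximation of the belief-propagation check-node and variable-node updates for a single node (degree at least two assumed); they illustrate the computation the CNUs and VNUs perform, not the FPGA architecture.

```python
import numpy as np

def check_node_update(msgs_in):
    """Min-sum CNU: each outgoing message is the sign product and the minimum
    magnitude of all incoming messages except the edge's own."""
    signs = np.sign(msgs_in)
    signs[signs == 0] = 1.0
    total_sign = np.prod(signs)
    mags = np.abs(msgs_in)
    order = np.argsort(mags)
    min1, min2 = mags[order[0]], mags[order[1]]
    others_min = np.where(np.arange(len(msgs_in)) == order[0], min2, min1)
    return total_sign * signs * others_min   # total_sign*sign_i excludes edge i's sign

def variable_node_update(channel_llr, msgs_in):
    """VNU: channel LLR plus all incoming check messages except the edge's own."""
    total = channel_llr + msgs_in.sum()
    return total - msgs_in

print(check_node_update(np.array([1.5, -0.4, 2.0])))
print(variable_node_update(0.7, np.array([0.2, -1.1, 0.5])))
```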
NASA Astrophysics Data System (ADS)
Eberle, J.; Gerlach, R.; Hese, S.; Schmullius, C.
2012-04-01
To provide Earth observation products for the area of Siberia, the Siberian Earth System Science Cluster (SIB-ESS-C) was established as a spatial data infrastructure at the University of Jena (Germany), Department for Earth Observation. This spatial data infrastructure implements standards published by the Open Geospatial Consortium (OGC) and the International Organization for Standardization (ISO) for data discovery, data access, data processing and data analysis. The objective of SIB-ESS-C is to facilitate environmental research and Earth system science in Siberia. The region for this project covers the entire Asian part of the Russian Federation, approximately between 58°E - 170°W and 48°N - 80°N. To provide discovery, access and analysis services, a webportal was published for searching and visualisation of the available data. This webportal is based on current web technologies such as AJAX, the Drupal content management system as backend software, and a user-friendly interface with drag-and-drop and further mouse events. To offer a wide range of regularly updated Earth observation products, several products from the MODIS sensor on the Aqua and Terra satellites are processed. A direct connection to NASA archive servers makes it possible to download MODIS Level 3 and 4 products and integrate them into the SIB-ESS-C infrastructure. These data are delivered in the Hierarchical Data Format (HDF). For visualisation and further analysis, the data are reprojected and converted to GeoTIFF, and global products are clipped to the project area. All these steps are implemented as an automatic process chain, which is executed whenever new MODIS data become available within the infrastructure. Through the link to a MODIS catalogue system, the system receives new data daily. With the implemented analysis processes, time-series data can be analysed, for example to plot a trend or to compare different time series against one another. Scientists working in this area with MODIS data can make use of this service through the webportal, either by searching the NASA archive for MODIS data and having them processed automatically for download and further use, or by working with the regularly updated products directly.
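A minimal sketch of the reproject/convert/clip step of such a process chain, using the GDAL Python bindings, is shown below. The subdataset name, target projection, and bounding box are placeholders, and the actual SIB-ESS-C chain is not reproduced.

```python
from osgeo import gdal

def process_modis_tile(hdf_subdataset, out_tif,
                       dst_srs="EPSG:4326",
                       project_bounds=(58.0, 48.0, 190.0, 80.0)):
    """Reproject a MODIS HDF subdataset, convert it to GeoTIFF, and clip it to
    the project area. Bounds are (minX, minY, maxX, maxY) in the target SRS and
    are placeholders here (the real project area crosses the antimeridian)."""
    gdal.Warp(
        out_tif,
        hdf_subdataset,          # e.g. an 'HDF4_EOS:...' subdataset name
        dstSRS=dst_srs,
        format="GTiff",
        outputBounds=project_bounds,
    )

# Hypothetical usage:
# process_modis_tile("HDF4_EOS:EOS_GRID:MOD13Q1.hdf:...:250m_16_days_NDVI", "ndvi.tif")
```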
Application of Modis Data to Assess the Latest Forest Cover Changes of Sri Lanka
NASA Astrophysics Data System (ADS)
Perera, K.; Herath, S.; Apan, A.; Tateishi, R.
2012-07-01
Assessing the forest cover of Sri Lanka is becoming important for lowering the pressure on forest lands as well as reducing human-elephant conflicts. Furthermore, land access to north-east Sri Lanka after the end of the 30-year civil war has increased the need for regularly updated land cover information for proper planning. This study produced an assessment of the forest cover of Sri Lanka using two satellite-based maps spanning 23 years. For the old forest cover map, the study used one of the first island-wide digital land cover classifications, produced by the main author in 1988. The old land cover classification was produced at 80 m spatial resolution using Landsat MSS data. A previously published study by the author investigated the feasibility of applying MODIS and Landsat MSS imagery to a selected sub-section of Sri Lanka to identify forest cover changes. In light of these two studies, the present assessment investigated the applicability of 250 m MODIS data over a small island such as Sri Lanka. The relation between the definition of forest used in the study and the spatial resolution of the satellite data sets was considered, since the 2012 map was based on MODIS data. The forest cover map of 1988 was resampled to 250 m spatial resolution for integration with the GIS database. The results demonstrated the advantages as well as the disadvantages of MODIS data in a study at this scale. Successful monitoring of forest depends largely on the ability to update field conditions on a regular basis. Freely available MODIS data provide a valuable set of information on relatively large green patches on the ground in near real time. Based on the changes in forest cover from 1988 to 2012, the study recommends the use of MODIS data as a reasonable method for forest assessment and for identifying hotspots to be re-investigated. It is noteworthy that small isolated pockets of forest, or sub-pixel-size forest patches, may go uncounted when 250 m x 250 m MODIS data are used in small regions.
Z-Earth: 4D topography from space combining short-baseline stereo and lidar
NASA Astrophysics Data System (ADS)
Dewez, T. J.; Akkari, H.; Kaab, A. M.; Lamare, M. L.; Doyon, G.; Costeraste, J.
2013-12-01
The advent of the free-of-charge global topographic data sets SRTM and ASTER GDEM has enabled testing a host of geoscience hypotheses. Availability of such data is now considered standard, and though resolved at 30 m to 90 m pixel size, they are today regarded as obsolete and inappropriate given the regularly updated sub-meter imagery coming through web services like Google Earth. Two features will thus help meet the current topographic data needs of the geoscience communities: field-scale-compatible elevation datasets (i.e. meter-scale digital models and sub-meter elevation precision) and provision for regularly updated topography to tackle earth surface changes in 4D, while retaining the key to success: data availability at no charge. A new space-borne instrument concept called Z-Earth has undergone a phase 0 study at CNES, the French space agency, to fulfill these aims. The scientific communities backing this proposal are those of natural hazards, glaciology and biomass. The system under study combines a short-baseline native stereo imager and a lidar profiler. This combination provides spatially resolved elevation swaths together with absolute along-track elevation control point profiles. Acquisition is designed for a revisit time better than a year. Intended products not only target single-pass digital surface models, color orthoimages and small-footprint full-waveform lidar profiles to update existing topographic coverage, but also time series of them. 3D change detection targets centimetre-scale horizontal precision and metric vertical precision, in complement of now-traditional spectral change detection. To assess the actual concept value, two real-size experiments were carried out. We used sub-meter-scale Pleiades panchromatic stereo-images to generate digital surface models and check them against dense airborne lidar coverages, one heliborne set purposely flown in Corsica (50-100 pts/sq. m) and a second one retrieved from OpenTopography.org (~10 pts/sq. m). In Corsica, over a challenging 45-degree-grade tree-covered mountain side, the Pleiades 2-m-grid-posting digital surface model described the topography with a median error of -4.75 m +/- 2.59 m (NMAD). A planimetric bias between the two datasets was found to be about 7 m to the south. This planimetric misregistration, though well within Pleiades specifications, partly explains the dramatic effect on the elevation differences. In the Redmond area (eastern Oregon), a very gentle desert landscape, elevation differences also contained a vertical median bias of -4.02 m +/- 1.22 m (NMAD), although here sub-pixel planimetric registration between the stereo DSM and the lidar coverage was enforced. This real-size experiment hints that sub-meter accuracy for a 2-m-grid-posting DSM is an achievable goal when combining stereo imaging and lidar.
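For reference, a median error and NMAD like those quoted can be computed from co-registered elevation differences as below; the 1.4826 factor scales the median absolute deviation so that NMAD is comparable to a standard deviation for normally distributed errors. The synthetic arrays are placeholders, not the Pleiades or lidar data.

```python
import numpy as np

def elevation_error_stats(dsm, lidar):
    """Median error and NMAD of DSM-minus-lidar elevation differences."""
    dh = (dsm - lidar).ravel()
    dh = dh[np.isfinite(dh)]                 # drop nodata / NaN cells
    median_err = np.median(dh)
    nmad = 1.4826 * np.median(np.abs(dh - median_err))
    return median_err, nmad

rng = np.random.default_rng(0)
lidar = rng.normal(100.0, 1.0, (500, 500))   # synthetic reference surface
dsm = lidar + rng.normal(-4.75, 1.8, (500, 500))   # synthetic biased, noisy DSM
print(elevation_error_stats(dsm, lidar))
```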
W-phase estimation of first-order rupture distribution for megathrust earthquakes
NASA Astrophysics Data System (ADS)
Benavente, Roberto; Cummins, Phil; Dettmer, Jan
2014-05-01
Estimating the rupture pattern for large earthquakes during the first hour after the origin time can be crucial for rapid impact assessment and tsunami warning. However, the estimation of coseismic slip distribution models generally involves complex methodologies that are difficult to implement rapidly. Further, while model parameter uncertainties can be crucial for meaningful estimation, they are often ignored. In this work we develop a finite fault inversion for megathrust earthquakes which rapidly generates good first-order estimates and uncertainties of spatial slip distributions. The algorithm uses W-phase waveforms and a linear automated regularization approach to invert for rupture models of some recent megathrust earthquakes. The W phase is a long-period (100-1000 s) wave which arrives together with the P wave. Because it is fast, has small amplitude and a long-period character, the W phase is regularly used to estimate point source moment tensors by the NEIC and PTWC, among others, within an hour of earthquake occurrence. We use W-phase waveforms processed in a manner similar to that used for such point-source solutions. The inversion makes use of three-component W-phase records retrieved from the Global Seismographic Network. The inverse problem is formulated by a multiple time window method, resulting in a linear over-parametrized problem. The over-parametrization is addressed by Tikhonov regularization, and the regularization parameters are chosen according to the discrepancy principle by grid search. Noise in the data is addressed by estimating the data covariance matrix from data residuals. The matrix is obtained by starting with an a priori covariance matrix and then iteratively updating it based on the residual errors of consecutive inversions. A covariance matrix for the model parameters is then computed using a Bayesian approach. The application of this approach to recent megathrust earthquakes produces models which capture the most significant features of their slip distributions. Reliable solutions are generally obtained with data in a 30-minute window following the origin time, suggesting that a real-time system could obtain solutions in less than one hour after the origin time.
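A hedged sketch of the regularization step follows: Tikhonov (damped least squares) solves over a grid of regularization parameters, with the parameter chosen by the discrepancy principle. The iterative data-covariance update and the Bayesian parameter covariance are omitted, and the noise level is assumed known.

```python
import numpy as np

def tikhonov_solve(G, d, alpha):
    """Damped least squares: minimize ||G m - d||^2 + alpha^2 * ||m||^2."""
    n = G.shape[1]
    return np.linalg.solve(G.T @ G + alpha**2 * np.eye(n), G.T @ d)

def discrepancy_grid_search(G, d, noise_level, alphas):
    """Pick the largest alpha whose residual norm stays at the noise level,
    per the discrepancy principle (residual grows monotonically with alpha)."""
    target = noise_level * np.sqrt(len(d))
    best = min(alphas)
    for alpha in sorted(alphas):
        m = tikhonov_solve(G, d, alpha)
        if np.linalg.norm(G @ m - d) <= target:
            best = alpha            # keep increasing damping while feasible
    return best, tikhonov_solve(G, d, best)
```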
Evolving network simulation study. From regular lattice to scale free network
NASA Astrophysics Data System (ADS)
Makowiec, D.
2005-12-01
The Watts-Strogatz algorithm for transforming the square lattice into a small-world network is modified by introducing preferential rewiring constrained by a connectivity demand. The evolution of the network proceeds in two steps: sequential preferential rewiring of edges controlled by p, and updating of the information about the changes made. The evolving system self-organizes into stationary states. A topological transition in the graph structure is observed with respect to p. A leafy phase, in which the graph consists of multiply connected vertices (the graph skeleton) with many leaves attached to each skeleton vertex, emerges when p is small enough to emulate asynchronous evolution. A tangling phase, in which edges of the graph circulate frequently among low-degree vertices, occurs when p is large. There exist conditions under which the resulting stationary network ensemble provides networks whose degree distribution exhibits power-law decay over a large interval of degrees.
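One plausible reading of the modified rewiring rule is sketched below: detach one end of a randomly chosen edge and, with probability p, reattach it preferentially by degree, subject to a minimum-degree constraint standing in for the connectivity demand. The exact update and constraint used in the paper are not reproduced.

```python
import random
import networkx as nx

def preferential_rewire(G, p, n_steps=1000, min_degree=2):
    """Sequentially rewire edges: with probability p the freed end is
    reattached preferentially by degree, otherwise uniformly at random."""
    nodes = list(G.nodes())
    for _ in range(n_steps):
        u, v = random.choice(list(G.edges()))
        if G.degree(v) <= min_degree:        # connectivity demand (assumed form)
            continue
        if random.random() < p:              # preferential reattachment
            weights = [G.degree(w) for w in nodes]
            w = random.choices(nodes, weights=weights, k=1)[0]
        else:                                # uniform reattachment
            w = random.choice(nodes)
        if w not in (u, v) and not G.has_edge(u, w):
            G.remove_edge(u, v)
            G.add_edge(u, w)
    return G

G = preferential_rewire(nx.grid_2d_graph(20, 20, periodic=True), p=0.8)
print(max(dict(G.degree()).values()))
```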
GénoPlante-Info (GPI): a collection of databases and bioinformatics resources for plant genomics
Samson, Delphine; Legeai, Fabrice; Karsenty, Emmanuelle; Reboux, Sébastien; Veyrieras, Jean-Baptiste; Just, Jeremy; Barillot, Emmanuel
2003-01-01
Génoplante is a partnership program between public French institutes (INRA, CIRAD, IRD and CNRS) and private companies (Biogemma, Bayer CropScience and Bioplante) that aims at developing genome analysis programs for crop species (corn, wheat, rapeseed, sunflower and pea) and model plants (Arabidopsis and rice). The outputs of these programs form a wealth of information (genomic sequence, transcriptome, proteome, allelic variability, mapping and synteny, and mutation data) and tools (databases, interfaces, analysis software), that are being integrated and made public at the public bioinformatics resource centre of Génoplante: GénoPlante-Info (GPI). This continuous flood of data and tools is regularly updated and will grow continuously during the coming two years. Access to the GPI databases and tools is available at http://genoplante-info.infobiogen.fr/. PMID:12519976
Zahraei, Seyed Mohsen; Marandi, Alireza; Sadrizadeh, Bijan; Gouya, Mehdi Mohammad; Rezaei, Parviz; Vazirian, Parviz; Yaghini, Fatheme
2010-04-19
The National Immunization Technical Advisory Group (NITAG) was established in Iran in 1982 and has made many important technical recommendations (e.g., regarding polio eradication, introduction of new vaccines, organizing special studies) that have contributed to a dramatic decline in vaccine preventable disease burden. The NITAG consists of experts from the Ministry of Health and Medical Education (MOHME), vaccine manufacturers, and medical universities with national Expanded Program of Immunization (EPI) staff serving as the secretariat. It is not completely independent from MOHME or EPI. It meets on a quarterly basis, and publishes national guidelines and immunization schedules that are updated regularly. Although primarily an advisory body, representation from MOHME members, including the EPI manager, ensures almost universal implementation of NITAG recommendations. Copyright © 2010 Elsevier Ltd. All rights reserved.
50 years of the International Committee on Taxonomy of Viruses: progress and prospects.
Adams, Michael J; Lefkowitz, Elliot J; King, Andrew M Q; Harrach, Balázs; Harrison, Robert L; Knowles, Nick J; Kropinski, Andrew M; Krupovic, Mart; Kuhn, Jens H; Mushegian, Arcady R; Nibert, Max L; Sabanadzovic, Sead; Sanfaçon, Hélène; Siddell, Stuart G; Simmonds, Peter; Varsani, Arvind; Zerbini, Francisco Murilo; Orton, Richard J; Smith, Donald B; Gorbalenya, Alexander E; Davison, Andrew J
2017-05-01
We mark the 50th anniversary of the International Committee on Taxonomy of Viruses (ICTV) by presenting a brief history of the organization since its foundation, showing how it has adapted to advancements in our knowledge of virus diversity and the methods used to characterize it. We also outline recent developments, supported by a grant from the Wellcome Trust (UK), that are facilitating substantial changes in the operations of the ICTV and promoting dialogue with the virology community. These developments will generate improved online resources, including a freely available and regularly updated ICTV Virus Taxonomy Report. They also include a series of meetings between the ICTV and the broader community focused on some of the major challenges facing virus taxonomy, with the outcomes helping to inform the future policy and practice of the ICTV.
Financial and materiel management.
Willock, M; Motley, C
1998-01-01
Hospitals have to purchase new technology, update equipment, and replenish supplies continually to meet the needs of patients and the medical and nursing staff in a sound financial way. Thus, inventories must be maintained accurately and adequately with proper controls. Awareness of the cost of capital and operational supplies is essential to meeting budget allocations. With or without centralized buying, the MM department has the expertise to assist every department in purchasing to meet its needs and in setting and resetting inventory levels for its supplies. Explanations and formulas for handling capital equipment and regular supplies and some formats have been presented to facilitate the process. Because OR items are both expensive and numerous and OR storage space the most costly space in the hospital, physicians and nurse managers must understand the financial processes and inventory management and educate their staffs in these matters.
SSEP: secondary structural elements of proteins
Shanthi, V.; Selvarani, P.; Kiran Kumar, Ch.; Mohire, C. S.; Sekar, K.
2003-01-01
SSEP is a comprehensive resource for accessing information related to the secondary structural elements present in the 25 and 90% non-redundant protein chains. The database contains 1771 protein chains from 1670 protein structures and 6182 protein chains from 5425 protein structures in the 25 and 90% non-redundant protein chains, respectively. The current version provides information about the α-helical segments and β-strand fragments of varying lengths. In addition, it also contains information about 3₁₀-helices, β- and ν-turns and hairpin loops. The free graphics program RASMOL has been interfaced with the search engine to visualize the three-dimensional structures of the user-queried secondary structural fragment. The database is updated regularly and is available through the Bioinformatics web server at http://cluster.physics.iisc.ernet.in/ssep/ or http://144.16.71.148/ssep/. PMID:12824336
NASA Astrophysics Data System (ADS)
Bontemps, S.; Defourny, P.; Van Bogaert, E.; Weber, J. L.; Arino, O.
2010-12-01
Regular and global land cover mapping contributes to evaluating the impact of human activities on the environment. Jointly supported by the European Space Agency and the European Environment Agency, the GlobCorine project builds on the GlobCover findings and aims at making full use of the MERIS time series for frequent land cover monitoring. The GlobCover automated classification approach has been tuned to the pan-European continent and adjusted towards a classification compatible with the Corine typology. The GlobCorine 2005 land cover map has been produced, validated and made available to a broad stakeholder community from the ESA website. A first version of the GlobCorine 2009 map has also been produced, demonstrating the possibility of an operational production of frequent and updated global land cover maps.
Utility-sized Madaras wind plants
NASA Astrophysics Data System (ADS)
Whitford, D. H.; Minardi, J. E.
1981-01-01
An analysis and technological updating were conducted for the Madaras Rotor Power Plant concept, to determine its ability to compete both technically and economically with horizontal axis wind turbine generators currently under development. The Madaras system uses large cylinders rotating vertically atop each regularly spaced flatcar of a train to propel them, by means of Magnus-effect interaction with the wind, along a circular or oval track. Alternators geared to the wheels of each car generate electrical power, which is transmitted to a power station by a trolley system. The study, consisting of electromechanical design, wind tunnel testing, and performance and cost analyses, shows that utility-sized plants greater than 228 MW in capacity and producing 975,000 kWh/year are feasible. Energy costs for such plants are projected to be between 22% lower and 12% higher than horizontal axis turbine plants of comparable output.
Bodina, Annalisa; Brizzolara, Antonella; Vadruccio, Gianluca; Castaldi, Silvana
2012-01-01
This paper describes the experience of a hospital which has introduced a system of computerized management of letters of authorization for healthcare workers to access sensitive health data, through the use of open source software. A new corporate intranet portal was created, with access given only to the privacy contacts of each operational unit of the hospital. Once the privacy contact has entered the relevant user authorizations, these must be approved first by the Directors of the respective operational units and finally by the privacy officer. The introduction of this system has allowed a systematic approach to managing staff authorizations for access to health data, regular updating and monitoring of those authorizations, and the start of a process of document digitalization.
Correlated Noise: How it Breaks NMF, and What to Do About It.
Plis, Sergey M; Potluru, Vamsi K; Lane, Terran; Calhoun, Vince D
2011-01-12
Non-negative matrix factorization (NMF) is a problem of decomposing multivariate data into a set of features and their corresponding activations. When applied to experimental data, NMF has to cope with noise, which is often highly correlated. We show that correlated noise can break the Donoho and Stodden separability conditions of a dataset, so that a regular NMF algorithm will fail to decompose it even when given the freedom to represent the noise as a separate feature. To cope with this issue, we present an algorithm for NMF with a generalized least squares objective function (glsNMF), derive multiplicative updates for the method, and prove their convergence. The new algorithm successfully recovers the true representation from noisy data. Its robust performance can make glsNMF a valuable tool for analyzing empirical data.
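For readers unfamiliar with multiplicative updates, the sketch below shows plain Frobenius-norm NMF with the classic Lee-Seung update rules, not the generalized-least-squares (glsNMF) updates derived in the paper; the matrix sizes and toy data are purely illustrative.

```python
import numpy as np

def nmf_multiplicative(V, rank, n_iter=500, eps=1e-9, seed=0):
    """Plain least-squares NMF via Lee-Seung multiplicative updates.

    V (m x n, non-negative) is approximated by W (m x rank) @ H (rank x n).
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank)) + eps
    H = rng.random((rank, n)) + eps
    for _ in range(n_iter):
        # Update H, then W; eps guards against division by zero.
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Toy usage: two non-negative features mixed with mild noise.
rng = np.random.default_rng(1)
W_true = np.abs(rng.normal(size=(30, 2)))
H_true = np.abs(rng.normal(size=(2, 40)))
V = W_true @ H_true + 0.01 * rng.random((30, 40))
W_est, H_est = nmf_multiplicative(V, rank=2)
print("reconstruction error:", np.linalg.norm(V - W_est @ H_est))
```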
Correlated Noise: How it Breaks NMF, and What to Do About It
Plis, Sergey M.; Potluru, Vamsi K.; Lane, Terran; Calhoun, Vince D.
2010-01-01
Non-negative matrix factorization (NMF) is a problem of decomposing multivariate data into a set of features and their corresponding activations. When applied to experimental data, NMF has to cope with noise, which is often highly correlated. We show that correlated noise can break the Donoho and Stodden separability conditions of a dataset, so that a regular NMF algorithm will fail to decompose it even when given the freedom to represent the noise as a separate feature. To cope with this issue, we present an algorithm for NMF with a generalized least squares objective function (glsNMF), derive multiplicative updates for the method, and prove their convergence. The new algorithm successfully recovers the true representation from noisy data. Its robust performance can make glsNMF a valuable tool for analyzing empirical data. PMID:23750288
The Potential for Predicting Precipitation on Seasonal-to-Interannual Timescales
NASA Technical Reports Server (NTRS)
Koster, R. D.
1999-01-01
The ability to predict precipitation several months in advance would have a significant impact on water resource management. This talk provides an overview of a project aimed at developing this prediction capability. NASA's Seasonal-to-Interannual Prediction Project (NSIPP) will generate seasonal-to-interannual sea surface temperature predictions through detailed ocean circulation modeling and will then translate these SST forecasts into forecasts of continental precipitation through the application of an atmospheric general circulation model and a "SVAT"-type land surface model. As part of the process, ocean variables (e.g., height) and land variables (e.g., soil moisture) will be updated regularly via data assimilation. The overview will include a discussion of the variability inherent in such a modeling system and will provide some quantitative estimates of the absolute upper limits of seasonal-to-interannual precipitation predictability.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dall'Anese, Emiliano; Simonetto, Andrea
This paper considers distribution networks featuring inverter-interfaced distributed energy resources, and develops distributed feedback controllers that continuously drive the inverter output powers to solutions of AC optimal power flow (OPF) problems. In particular, the controllers update the power setpoints based on voltage measurements as well as given (time-varying) OPF targets, and entail elementary operations implementable on low-cost microcontrollers that accompany the power-electronics interfaces of gateways and inverters. The design of the control framework is based on suitable linear approximations of the AC power-flow equations as well as Lagrangian regularization methods. Convergence and OPF-target tracking capabilities of the controllers are analytically established. Overall, the proposed method makes it possible to bypass traditional hierarchical setups, where feedback control and optimization operate at distinct time scales, and to enable real-time optimization of distribution systems.
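As a rough illustration of the feedback-optimization idea described above (not the controllers derived in the paper), the following sketch runs a projected primal-dual gradient loop on a linearized voltage model; the sensitivity matrix R, voltage limits, gains, and the quadratic target-tracking cost are all invented for the example.

```python
import numpy as np

# All data below are invented: a hypothetical 2-inverter feeder with a
# linearized voltage sensitivity matrix R (p.u. volts per p.u. watt).
R = np.array([[0.05, 0.02],
              [0.02, 0.06]])
v_nom, v_max = 1.00, 1.05          # flat nominal voltage and upper limit (p.u.)
p_max = np.array([0.8, 1.0])       # inverter active-power limits (p.u.)
p_ref = p_max.copy()               # OPF target: track the available power
alpha_p, alpha_d = 0.2, 20.0       # primal and dual gains (step sizes)

def project(p):
    """Keep setpoints inside their feasible box [0, p_max]."""
    return np.clip(p, 0.0, p_max)

p = p_ref.copy()
lam = np.zeros(2)                  # multipliers for the upper-voltage constraints
for _ in range(1000):
    v = v_nom + R @ p              # stand-in for actual voltage *measurements*
    lam = np.maximum(0.0, lam + alpha_d * (v - v_max))      # dual ascent
    p = project(p - alpha_p * ((p - p_ref) + R.T @ lam))    # primal descent
print("setpoints (p.u.):", np.round(p, 3),
      "voltages (p.u.):", np.round(v_nom + R @ p, 3))
```

In this toy loop the measured voltages stand in for the feedback signal: the controller curtails only as much power as is needed to bring the voltages back under the limit while tracking its target.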
Bartlett, D
2016-08-26
The management challenge with erosive tooth wear is that the condition involves erosion with contributions from attrition and abrasion, both of which affect the longevity of restorations. Severe erosive tooth wear results in visibly shorter teeth, exposure of dentine, and adaptive changes which complicate restorative management. There is increasing evidence to suggest that if risk factors, such as frequent consumption of acidic foods and drinks, are reduced, the progression of tooth wear slows and follows a normal pattern of wear. But once teeth become shorter, patients often seek advice from dentists on restorative intervention. Composite restorations are successful in some patients, but they often involve regular maintenance with repairs and rebuilds, which for some patients is unacceptable. Full-coverage crowns, although destructive of tooth tissue, remain an option for restorations.
Promoting the safety performance of industrial radiography using a quantitative assessment system.
Kardan, M R; Mianji, F A; Rastkhah, N; Babakhani, A; Azad, S Borhan
2006-12-01
The increasing number of industrial radiographers and their considerable occupational exposure has been one of the main concerns of the Iran Nuclear Regulatory Authority (INRA) in recent years. In 2002, a quantitative system of evaluating the safety performance of licensees and a complementary enforcement system was introduced by the National Radiation Protection Department (NRPD). Each parameter of the practice is given a weighting factor according to its importance to safety. Assessment of the licensees is done quantitatively by summing up their scores using prepared tables. Implementing this system of evaluation showed a considerable decrease in deficiencies in the various centres. Tables are updated regularly as a result of findings during the inspections. This system is used in addition to enforcement to promote safety performance and to increase the culture of safety in industrial radiography.
RADS Version 4: An Efficient Way to Analyse the Multi-Mission Altimeter Database
NASA Astrophysics Data System (ADS)
Scharroo, Remko; Leuliette, Eric; Naeije, Marc; Martin-Puig, Cristina; Pires, Nelson
2016-08-01
The Radar Altimeter Database System (RADS) has grown to become a mature altimeter database. Over the last 18 years it has been continuously developed, first at Delft University of Technology and now also at the National Oceanic and Atmospheric Administration (NOAA) and the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT). RADS now serves as a fundamental Climate Data Record for sea level. Because of the multiple users involved in vetting the data and the regular updates to the database, RADS is one of the most accurate and complete databases of satellite altimeter data available. RADS version 4 is a major change from the previous version. While the database is compatible with both software versions, the new software provides new tools, allows easier expansion, and has a better and more standardised interface.
Improve Performance of Data Warehouse by Query Cache
NASA Astrophysics Data System (ADS)
Gour, Vishal; Sarangdevot, S. S.; Sharma, Anand; Choudhary, Vinod
2010-11-01
The primary goal of a data warehouse is to free the information locked up in the operational database so that decision makers and business analysts can query, analyze, and plan regardless of data changes in the operational database. Because the number of queries is large, there is in many cases a reasonable probability that the same query is submitted by one or more users at different times. Each time a query is executed, the warehouse data are analyzed to generate its result. In this paper we study how a query cache improves data warehouse performance and examine the common problems faced by data warehouse administrators, namely minimizing response time and improving overall query efficiency, particularly when the warehouse is updated at regular intervals.
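A minimal sketch of the idea, assuming a simple whole-cache invalidation policy triggered by each regular warehouse load; the SQLite table and queries are purely illustrative stand-ins for a real warehouse.

```python
import sqlite3

class QueryCache:
    """Cache query results; flush everything when the warehouse is reloaded."""
    def __init__(self, conn):
        self.conn = conn
        self.cache = {}

    def query(self, sql):
        key = " ".join(sql.split()).lower()   # normalize whitespace and case
        if key not in self.cache:             # miss: hit the warehouse
            self.cache[key] = self.conn.execute(sql).fetchall()
        return self.cache[key]

    def on_refresh(self):
        """Call after each regular warehouse load to avoid stale answers."""
        self.cache.clear()

# Toy warehouse with one fact table (illustrative schema and data).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("north", 120.0), ("south", 80.0), ("north", 40.0)])
qc = QueryCache(conn)
print(qc.query("SELECT region, SUM(amount) FROM sales GROUP BY region"))  # miss
print(qc.query("select region, sum(amount) from sales group by region"))  # hit
qc.on_refresh()   # warehouse updated at its regular interval -> flush cache
```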
Winsor, Geoffrey L; Griffiths, Emma J; Lo, Raymond; Dhillon, Bhavjinder K; Shay, Julie A; Brinkman, Fiona S L
2016-01-04
The Pseudomonas Genome Database (http://www.pseudomonas.com) is well known for the application of community-based annotation approaches for producing a high-quality Pseudomonas aeruginosa PAO1 genome annotation, and facilitating whole-genome comparative analyses with other Pseudomonas strains. To aid analysis of potentially thousands of complete and draft genome assemblies, this database and analysis platform was upgraded to integrate curated genome annotations and isolate metadata with enhanced tools for larger scale comparative analysis and visualization. Manually curated gene annotations are supplemented with improved computational analyses that help identify putative drug targets and vaccine candidates or assist with evolutionary studies by identifying orthologs, pathogen-associated genes and genomic islands. The database schema has been updated to integrate isolate metadata that will facilitate more powerful analysis of genomes across datasets in the future. We continue to place an emphasis on providing high-quality updates to gene annotations through regular review of the scientific literature and using community-based approaches including a major new Pseudomonas community initiative for the assignment of high-quality gene ontology terms to genes. As we further expand from thousands of genomes, we plan to provide enhancements that will aid data visualization and analysis arising from whole-genome comparative studies including more pan-genome and population-based approaches. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
Problem gambling worldwide: An update and systematic review of empirical research (2000–2015)
Calado, Filipa; Griffiths, Mark D.
2016-01-01
Background and aims: Problem gambling has been identified as an emergent public health issue, and there is a need to identify gambling trends and to regularly update worldwide gambling prevalence rates. This paper aims to review recent research on adult gambling and problem gambling (since 2000) and then, in the context of a growing liberalization of the gambling market in the European Union, intends to provide a more detailed analysis of adult gambling behavior across European countries. Methods: A systematic literature search was carried out using academic databases, Internet, and governmental websites. Results: Following this search and utilizing exclusion criteria, 69 studies on adult gambling prevalence were identified. These studies demonstrated that there are wide variations in past-year problem gambling rates across different countries in the world (0.12–5.8%) and in Europe (0.12–3.4%). However, it is difficult to directly compare studies due to different methodological procedures, instruments, cut-offs, and time frames. Despite the variability among instruments, some consistent results with regard to demographics were found. Discussion and conclusion: The findings highlight the need for continuous monitoring of problem gambling prevalence rates in order to examine the influence of cultural context on gambling patterns, assess the effectiveness of policies on gambling-related harms, and establish priorities for future research. PMID:27784180
NASA Astrophysics Data System (ADS)
Martín-Luis, Antonio; Arbelo, Manuel; Hernández-Leal, Pedro; Arbelo-Bayó, Manuel
2016-10-01
Reliable and updated maps of vegetation in protected natural areas are essential for proper management and conservation. Remote sensing is a valid tool for this purpose. In this study, a methodology based on a WorldView-2 (WV-2) satellite image and in situ spectral signature measurements was applied to map the Canarian Monteverde ecosystem located in the north of Tenerife (Canary Islands, Spain). Due to the high spectral similarity of vegetation species in the study zone, a Multiple Endmember Spectral Mixture Analysis (MESMA) was performed. MESMA determines the fractional cover of different components within one pixel and allows the endmembers to vary from pixel to pixel. Two libraries of endmembers were collected for the most abundant species in the test area. The first library was collected from in situ spectral signatures measured with an ASD spectroradiometer during a field campaign in June 2015. The second library was obtained from pure pixels identified in the satellite image for the same species. The accuracy of the mapping process was assessed from a set of independent validation plots. The overall accuracy for the ASD-based method was 60.51%, compared with 86.67% for the WV-2-based mapping. The results suggest the possibility of using WV-2 images for monitoring and regularly updating the maps of the Monteverde forest on the island of Tenerife.
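MESMA evaluates many candidate endmember combinations per pixel and keeps the best-fitting one; the sketch below shows only the core unmixing step for a single candidate set, using non-negative least squares with a soft sum-to-one constraint and made-up spectra (the class names and band values are illustrative).

```python
import numpy as np
from scipy.optimize import nnls

def fractional_cover(pixel, endmembers, sum_weight=1e3):
    """Non-negative least-squares unmixing with a soft sum-to-one constraint.

    endmembers: (n_bands x n_endmembers) matrix of reference spectra.
    pixel: (n_bands,) observed spectrum. Returns fractions and RMSE.
    """
    A = np.vstack([endmembers, sum_weight * np.ones(endmembers.shape[1])])
    b = np.append(pixel, sum_weight)      # pushes fractions toward summing to 1
    fractions, _ = nnls(A, b)
    rmse = np.sqrt(np.mean((endmembers @ fractions - pixel) ** 2))
    return fractions, rmse

# Illustrative 4-band reflectance spectra for two classes (values are made up).
laurel = np.array([0.05, 0.08, 0.30, 0.45])
heath  = np.array([0.04, 0.06, 0.22, 0.35])
E = np.column_stack([laurel, heath])
mixed_pixel = 0.7 * laurel + 0.3 * heath + 0.005  # 70/30 mixture plus offset
print(fractional_cover(mixed_pixel, E))
```

In a full MESMA run this step would be repeated for every allowed endmember combination, and the model with the lowest RMSE would be retained for each pixel.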
Hartman, Victoria; Castillo-Pelayo, Tania; Babinszky, Sindy; Dee, Simon; Leblanc, Jodi; Matzke, Lise; O'Donoghue, Sheila; Carpenter, Jane; Carter, Candace; Rush, Amanda; Byrne, Jennifer; Barnes, Rebecca; Mes-Messons, Anne-Marie; Watson, Peter
2018-02-01
Ongoing quality management is an essential part of biobank operations and the creation of high quality biospecimen resources. Adhering to the standards of a national biobanking network is a way to reduce variability between individual biobank processes, resulting in cross biobank compatibility and more consistent support for health researchers. The Canadian Tissue Repository Network (CTRNet) implemented a set of required operational practices (ROPs) in 2011 and these serve as the standards and basis for the CTRNet biobank certification program. A review of these 13 ROPs covering 314 directives was conducted after 5 years to identify areas for revision and update, leading to changes to 7/314 directives (2.3%). A review of all internal controlled documents (including policies, standard operating procedures and guides, and forms for actions and processes) used by the BC Cancer Agency's Tumor Tissue Repository (BCCA-TTR) to conform to these ROPs was then conducted. Changes were made to 20/106 (19%) of BCCA-TTR documents. We conclude that a substantial fraction of internal controlled documents require updates at regular intervals to accommodate changes in best practices. Reviewing documentation is an essential aspect of keeping up to date with best practices and ensuring the quality of biospecimens and data managed by biobanks.
NASA Astrophysics Data System (ADS)
Olugboji, T. M.; Lekic, V.; McDonough, W.
2017-07-01
We present a new approach for evaluating existing crustal models using ambient noise data sets and its associated uncertainties. We use a transdimensional hierarchical Bayesian inversion approach to invert ambient noise surface wave phase dispersion maps for Love and Rayleigh waves using measurements obtained from Ekström (2014). Spatiospectral analysis shows that our results are comparable to a linear least squares inverse approach (except at higher harmonic degrees), but the procedure has additional advantages: (1) it yields an autoadaptive parameterization that follows Earth structure without making restricting assumptions on model resolution (regularization or damping) and data errors; (2) it can recover non-Gaussian phase velocity probability distributions while quantifying the sources of uncertainties in the data measurements and modeling procedure; and (3) it enables statistical assessments of different crustal models (e.g., CRUST1.0, LITHO1.0, and NACr14) using variable resolution residual and standard deviation maps estimated from the ensemble. These assessments show that in the stable old crust of the Archean, the misfits are statistically negligible, requiring no significant update to crustal models from the ambient noise data set. In other regions of the U.S., significant updates to regionalization and crustal structure are expected especially in the shallow sedimentary basins and the tectonically active regions, where the differences between model predictions and data are statistically significant.
NASA Astrophysics Data System (ADS)
Jakovetic, Dusan; Xavier, João; Moura, José M. F.
2011-08-01
We study distributed optimization in networked systems, where nodes cooperate to find the optimal quantity of common interest, x = x*. The objective function of the corresponding optimization problem is the sum of private (known only to a node) convex objectives, and each node imposes a private convex constraint on the allowed values of x. We solve this problem for generic connected network topologies with asymmetric random link failures using a novel distributed, decentralized algorithm. We refer to this algorithm as AL-G (augmented Lagrangian gossiping) and to its variants as AL-MG (augmented Lagrangian multi-neighbor gossiping) and AL-BG (augmented Lagrangian broadcast gossiping). The AL-G algorithm is based on the augmented Lagrangian dual function. Dual variables are updated by the standard method of multipliers at a slow time scale. To update the primal variables, we propose a novel Gauss-Seidel-type randomized algorithm operating at a fast time scale. AL-G uses unidirectional gossip communication, only between immediate neighbors in the network, and is resilient to random link failures. For networks with reliable communication (i.e., no failures), the simplified AL-BG algorithm reduces communication, computation, and data storage costs. We prove convergence for all proposed algorithms and demonstrate their effectiveness by simulation on two applications: l1-regularized logistic regression for classification and cooperative spectrum sensing for cognitive radio networks.
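The slow-time-scale multiplier update mentioned above is the standard method of multipliers. The sketch below applies it to a toy centralized equality-constrained quadratic problem, which is far simpler than the distributed gossip setting of AL-G; the problem data and penalty parameter are invented for illustration.

```python
import numpy as np

# Toy problem: minimize 0.5*||x - c||^2  subject to  a @ x = b.
c = np.array([3.0, -1.0, 2.0])
a = np.array([1.0, 1.0, 1.0])
b = 1.0
rho = 1.0          # augmentation (penalty) parameter
lam = 0.0          # multiplier, updated at the "slow" time scale

x = np.zeros(3)
for _ in range(20):
    # Primal step: minimize the augmented Lagrangian exactly in x by solving
    # (I + rho * a a^T) x = c - lam * a + rho * b * a.
    A = np.eye(3) + rho * np.outer(a, a)
    x = np.linalg.solve(A, c - lam * a + rho * b * a)
    # Dual step: standard method-of-multipliers update.
    lam += rho * (a @ x - b)

print("x* ~", np.round(x, 4), " constraint residual:", a @ x - b)
```

The iterates converge to the projection of c onto the constraint hyperplane; in AL-G the analogous primal minimization is itself carried out by randomized gossip among neighbors rather than by a single solve.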
European LeukemiaNet recommendations for the management of chronic myeloid leukemia: 2013
Deininger, Michael W.; Rosti, Gianantonio; Hochhaus, Andreas; Soverini, Simona; Apperley, Jane F.; Cervantes, Francisco; Clark, Richard E.; Cortes, Jorge E.; Guilhot, François; Hjorth-Hansen, Henrik; Hughes, Timothy P.; Kantarjian, Hagop M.; Kim, Dong-Wook; Larson, Richard A.; Lipton, Jeffrey H.; Mahon, François-Xavier; Martinelli, Giovanni; Mayer, Jiri; Müller, Martin C.; Niederwieser, Dietger; Pane, Fabrizio; Radich, Jerald P.; Rousselot, Philippe; Saglio, Giuseppe; Saußele, Susanne; Schiffer, Charles; Silver, Richard; Simonsson, Bengt; Steegmann, Juan-Luis; Goldman, John M.; Hehlmann, Rüdiger
2013-01-01
Advances in chronic myeloid leukemia treatment, particularly regarding tyrosine kinase inhibitors, mandate regular updating of concepts and management. A European LeukemiaNet expert panel reviewed prior and new studies to update recommendations made in 2009. We recommend as initial treatment imatinib, nilotinib, or dasatinib. Response is assessed with standardized real-time quantitative polymerase chain reaction and/or cytogenetics at 3, 6, and 12 months. BCR-ABL1 transcript levels ≤10% at 3 months, <1% at 6 months, and ≤0.1% from 12 months onward define optimal response, whereas >10% at 6 months and >1% from 12 months onward define failure, mandating a change in treatment. Similarly, partial cytogenetic response (PCyR) at 3 months and complete cytogenetic response (CCyR) from 6 months onward define optimal response, whereas no CyR (Philadelphia chromosome–positive [Ph+] >95%) at 3 months, less than PCyR at 6 months, and less than CCyR from 12 months onward define failure. Between optimal and failure, there is an intermediate warning zone requiring more frequent monitoring. Similar definitions are provided for response to second-line therapy. Specific recommendations are made for patients in the accelerated and blastic phases, and for allogeneic stem cell transplantation. Optimal responders should continue therapy indefinitely, with careful surveillance, or they can be enrolled in controlled studies of treatment discontinuation once a deeper molecular response is achieved. PMID:23803709
Retrievable payload carrier, next generation Long Duration Exposure Facility: Update 1992
NASA Technical Reports Server (NTRS)
Perry, A. T.; Cagle, J. A.; Newman, S. C.
1993-01-01
Access to space and cost have been two major inhibitors of low Earth orbit research. The Retrievable Payload Carrier (RPC) Program is a commercial space program which strives to overcome these two barriers to space experimentation. The RPC Program's fleet of spacecraft, ground communications station, payload processing facility, and experienced integration and operations team will provide a convenient 'one-stop shop' for investigators seeking to use the unique vantage point and environment of low Earth orbit for research. The RPC is a regularly launched and retrieved, free-flying spacecraft providing resources adequate to meet modest payload/experiment requirements, and presenting ample surface area, volume, mass, and growth capacity for investigator usage. Enhanced capabilities of ground communications, solar-array-supplied electrical power, central computing, and on-board data storage pick up on the path where NASA's Long Duration Exposure Facility (LDEF) blazed the original technology trail. Mission lengths of 6-18 months, or longer, are envisioned. The year 1992 was designated as the 'International Space Year' and coincides with the 500th anniversary of Christopher Columbus's voyage to the New World. This is a fitting year in which to launch the full scale development of our unique shop of discovery whose intent is to facilitate retrieving technological rewards from another new world: space. Presented is an update on progress made on the RPC Program's development since the November 1991 LDEF Materials Workshop.
Alabi, Olatunji; Doctor, Henry V; Jumare, Abdulazeez; Sahabi, Nasiru; Abdulwahab, Ahmad; Findley, Sally E; Abubakar, Sani D
2014-12-01
The Nahuche Health and Demographic Surveillance System (HDSS) study site, established in 2009 with 137 823 individuals, is located in Zamfara State, north-western Nigeria. North-west Nigeria is a region with some of the worst maternal and child health indicators in Nigeria. For example, the 2013 Nigeria Demographic and Health Survey estimated an under-five mortality rate of 185 deaths per 1000 live births for the north-west geo-political zone, compared with a national average of 128 deaths per 1000 live births. The site comprises over 100 villages under the leadership of six district heads. Virtually all residents of the catchment population are Hausa by ethnicity. After a baseline census in 2010, regular update rounds of data collection are conducted every 6 months. Data collection on births, deaths, migration events, pregnancies, marriages and marriage termination events is routinely conducted. Verbal autopsy (VA) data are collected on all deaths reported during routine data collection. Annual update data on antenatal care and household characteristics are also collected. Opportunities for collaboration are available at Nahuche HDSS. The Director of the Nahuche HDSS, M.O. Oche (ochedr@hotmail.com), is the contact person for all forms of collaboration. © The Author 2014; all rights reserved. Published by Oxford University Press on behalf of the International Epidemiological Association.
Home artificial nutrition: an update seven years after the regional regulation.
Santarpia, Lidia; Pagano, Maria Carmen; Pasanisi, Fabrizio; Contaldo, Franco
2014-10-01
Home Artificial Nutrition (HAN) is a well-established extra-hospital therapy that contributes to decreased health care costs by reducing the number and length of hospitalizations. Knowledge of the epidemiology of HAN helps in planning health-care funding and in analyzing the factors that can improve the HAN service. An update on the prevalence of HAN in the Campania region (Southern Italy) and on patients' clinical characteristics has been carried out regularly in the seven years since a specific regional regulation was issued in 2005. The total number of patients on HAN increased from 355 in April 2005 to 1165 in April 2012 (+228.2%); in particular, patients on Home Parenteral Nutrition (HPN) increased from 156 in April 2005 to 306 in April 2012 (+96.2%) and patients on Home Enteral Nutrition (HEN) from 199 to 838 (+321.1%), respectively. The HEN/HPN ratio in adults changed from 1.3/1 in April 2005 to 2.7/1 in April 2012, gradually nearing the expected national mean ratio of 5/1 observed in the 2005 national survey. The specific regional regulation in Campania has contributed to increasing the prescription of HAN and to improving its indications; in particular, through the years, HEN has been gradually nearing national standards. Copyright © 2013 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.
Sulfur Dioxide Emission Rates from Kilauea Volcano, Hawai`i, an Update: 2002-2006
Elias, Tamar; Sutton, A.J.
2007-01-01
Introduction: Sulfur dioxide (SO2) emission rates from Kilauea Volcano were first measured by Stoiber and Malone (1975) and have been measured on a regular basis since 1979 (Greenland and others, 1985; Casadevall and others, 1987; Elias and others, 1998; Sutton and others, 2001; Elias and Sutton, 2002; Sutton and others, 2003). Compilations of SO2 emission-rate and wind-vector data from 1979 through 2001 are available on the web (Elias and others, 1998 and 2002). This report updates the database through 2006, and documents the changes in data collection and processing that have occurred during the interval 2002-2006. During the period covered by this report, Kilauea continued to release SO2 gas predominantly from its summit caldera and east rift zone (ERZ) (Elias and others, 1998; Sutton and others, 2001; Elias and others, 2002; Sutton and others, 2003). These two distinct sources are always measured independently (fig. 1). Sulphur Banks is a minor source of SO2 and does not contribute significantly to the total emissions for Kilauea (Stoiber and Malone, 1975). From 1979 until 2003, summit and east rift zone emission rates were derived using vehicle- and tripod-based Correlation Spectrometry (COSPEC) measurements. In late 2003, we began to augment traditional COSPEC measurements with data from one of the new generation of miniature spectrometer systems, the FLYSPEC (Horton and others, 2006; Elias and others, 2006; Williams-Jones and others, 2006).
[The added value of information summaries supporting clinical decisions at the point-of-care].
Banzi, Rita; González-Lorenzo, Marien; Kwag, Koren Hyogene; Bonovas, Stefanos; Moja, Lorenzo
2016-11-01
Evidence-based healthcare requires the integration of the best research evidence with clinical expertise and patients' values. International publishers are developing evidence-based information services and resources designed to overcome the difficulties in retrieving, assessing and updating medical information, as well as to facilitate rapid access to valid clinical knowledge. Point-of-care information summaries are defined as web-based medical compendia that are specifically designed to deliver pre-digested, rapidly accessible, comprehensive, and periodically updated information to health care providers. Their validity must be assessed against marketing claims that they are evidence-based. We periodically evaluate the content development processes of several international point-of-care information summaries. The number of these products has increased along with their quality. The last analysis, done in 2014, identified 26 products and found that three of them (Best Practice, DynaMed, and UpToDate) scored the highest across all evaluated dimensions (volume, quality of the editorial process, and evidence-based methodology). Point-of-care information summaries, as stand-alone products or integrated with other systems, are gaining ground as support for clinical decisions. The choice of one product over another depends both on the properties of the service and on the preferences of users. However, even the most innovative information system must rely on transparent and valid contents. Individuals and institutions should regularly assess the value of point-of-care summaries, as their quality changes rapidly over time.
Policies and practices on competing interests of academic staff in Australian universities.
Chapman, Simon; Morrell, Bronwen; Forsyth, Rowena; Kerridge, Ian; Stewart, Cameron
2012-04-16
To document the existence and provisions of Australian universities' policies on the competing interests of academic staff and university practices in recording, updating and making these declarations publicly accessible. A 14-item survey was sent to the vice-chancellors of 39 Australian universities and university websites were searched for relevant policies. Twelve universities declined to provide any information. Of the 27 that did, all had policies on staff competing interests. Fifteen did not require regular declarations from staff and only four required annual declarations. Eight universities maintained a centralised register of COIs of all staff and six had a mechanism in place that allowed members of the public to access information on COIs. None reported that they required that staff place their COI declarations on their website profiles and none had policies that indicated that staff should declare COIs when making a public comment. Australian universities vary significantly in their approaches to the declaration and management of competing interests. While two-thirds of Australian universities require staff to declare competing interests, this information is mostly inaccessible to the public. Australian universities should adopt a standard approach to the declaration and management of competing interests and commit to meaningful transparency and public accountability. This could include frequently updated declarations on website profiles of all staff. In addition, dialogue about what is needed to effectively deal with competing interests should be encouraged.
Lewis, Nicola S; Anderson, Tavis K; Kitikoon, Pravina; Skepner, Eugene; Burke, David F; Vincent, Amy L
2014-05-01
Swine influenza A virus is an endemic and economically important pathogen in pigs, with the potential to infect other host species. The hemagglutinin (HA) protein is the primary target of protective immune responses and the major component in swine influenza A vaccines. However, as a result of antigenic drift, vaccine strains must be regularly updated to reflect currently circulating strains. Characterizing the cross-reactivity between strains in pigs and seasonal influenza virus strains in humans is also important in assessing the relative risk of interspecies transmission of viruses from one host population to the other. Hemagglutination inhibition (HI) assay data for swine and human H3N2 viruses were used with antigenic cartography to quantify the antigenic differences among H3N2 viruses isolated from pigs in the United States from 1998 to 2013 and the relative cross-reactivity between these viruses and current human seasonal influenza A virus strains. Two primary antigenic clusters were found circulating in the pig population, but with enough diversity within and between the clusters to suggest updates in vaccine strains are needed. We identified single amino acid substitutions that are likely responsible for antigenic differences between the two primary antigenic clusters and between each antigenic cluster and outliers. The antigenic distance between current seasonal influenza virus H3 strains in humans and those endemic in swine suggests that population immunity may not prevent the introduction of human viruses into pigs, and possibly vice versa, reinforcing the need to monitor and prepare for potential incursions. Influenza A virus (IAV) is an important pathogen in pigs and humans. The hemagglutinin (HA) protein is the primary target of protective immune responses and the major target of vaccines. However, vaccine strains must be updated to reflect current strains. Characterizing the differences between seasonal IAV in humans and swine IAV is important in assessing the relative risk of interspecies transmission of viruses. We found two primary antigenic clusters of H3N2 in the U.S. pig population, with enough diversity to suggest updates in swine vaccine strains are needed. We identified changes in the HA protein that are likely responsible for these differences and that may be useful in predicting when vaccines need to be updated. The difference between human H3N2 viruses and those in swine is enough that population immunity is unlikely to prevent new introductions of human IAV into pigs or vice versa, reinforcing the need to monitor and prepare for potential introductions.
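Antigenic cartography software performs multidimensional scaling on full HI titer tables; the sketch below computes only the standard log2 fold-drop that is commonly used as the input distance, with made-up titers (the strain labels and values are illustrative, not data from this study).

```python
import numpy as np

def log2_fold_drop(homologous_titer, heterologous_titer):
    """Standard antigenic-distance input: each two-fold drop in HI titer
    relative to the homologous titer counts as one antigenic unit."""
    return np.log2(homologous_titer) - np.log2(heterologous_titer)

# Made-up HI titers: antiserum raised against strain A tested against A and B.
titer_A_vs_A = 1280   # homologous titer
titer_A_vs_B = 160    # heterologous titer
print(log2_fold_drop(titer_A_vs_A, titer_A_vs_B), "antigenic units")  # 3.0
```

A drop of several two-fold dilutions relative to the homologous titer is commonly taken to indicate a meaningful antigenic difference, which is the kind of signal that drives vaccine-strain updates.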
Updating ARI Educational Benefits Usage Database for Army Regular, Reserve, and Guard: 2009-2010
2011-05-01
Prepared for the U.S. Army Research Institute for the Behavioral and Social Sciences (ARI) under Delivery Order #5 of the Contract for Manpower, Personnel, Leader Development, and Training. The remaining content of this record is tabular MGIB benefit-usage data for the Regular Army (data as of September 2009).
InSAR data for monitoring land subsidence: time to think big
NASA Astrophysics Data System (ADS)
Ferretti, A.; Colombo, D.; Fumagalli, A.; Novali, F.; Rucci, A.
2015-11-01
Satellite interferometric synthetic aperture radar (InSAR) data have proven effective and valuable in the analysis of urban subsidence phenomena based on multi-temporal radar images. Results obtained by processing data acquired by different radar sensors have shown the potential of InSAR and highlighted the key points for an operational use of this technology, namely: (1) regular acquisition of interferometric data stacks over large areas; (2) use of advanced processing algorithms capable of estimating and removing atmospheric disturbances; (3) access to significant processing power for regular updates of the information over large areas. In this paper, we show how the operational potential of InSAR has been realized thanks to recent advances in InSAR processing algorithms, the advent of cloud computing, and the launch of new satellite platforms specifically designed for InSAR analyses (e.g., Sentinel-1A operated by ESA and ALOS-2 operated by JAXA). The processing of thousands of SAR scenes to cover an entire nation has been performed successfully in Italy in a project financed by the Italian Ministry of the Environment. The challenge for the future is to move from the historical analysis of SAR scenes already held in digital archives to a near real-time monitoring program in which up-to-date deformation data are routinely provided to end users and decision makers.
Physical health care monitoring for people with serious mental illness.
Tosh, Graeme; Clifton, Andrew V; Xia, Jun; White, Margueritte M
2014-01-17
Current guidance suggests that we should monitor the physical health of people with serious mental illness, and there has been a significant financial investment over recent years to provide this. To assess the effectiveness of physical health monitoring, compared with standard care for people with serious mental illness. We searched the Cochrane Schizophrenia Group Trials Register (October 2009, update in October 2012), which is based on regular searches of CINAHL, EMBASE, MEDLINE and PsycINFO. All randomised clinical trials focusing on physical health monitoring versus standard care, or comparing i) self monitoring versus monitoring by a healthcare professional; ii) simple versus complex monitoring; iii) specific versus non-specific checks; iv) once only versus regular checks; or v) different guidance materials. Initially, review authors (GT, AC, SM) independently screened the search results and identified three studies as possibly fulfilling the review's criteria. On examination, however, all three were subsequently excluded. Forty-two additional citations were identified in October 2012 and screened by two review authors (JX and MW), 11 of which underwent full screening. No relevant randomised trials which assess the effectiveness of physical health monitoring in people with serious mental illness have been completed. We identified one ongoing study. There is still no evidence from randomised trials to support or refute current guidance and practice. Guidance and practice are based on expert consensus, clinical experience and good intentions rather than high quality evidence.
Kim, Jeongho; Yu, Il Je
2016-01-01
A national survey on workplace environments where nanomaterials are handled and manufactured was conducted in 2014. The most common nanomaterials at the surveyed workplaces were TiO2 (91 workplaces), SiO2 (88), carbon black (84), Ag (35), Al2O3 (35), ZnO (34), Pb (33), and CeO2 (31). The survey results indicated that 340 workplaces (0.27% of the 126,846 total) handled or manufactured nanomaterials. The numbers of nanomaterials used and of products were 546 (1.60 per company) and 583 (1.71 per company), respectively. For most workplaces, exposures to hazardous particulate materials, including nanomaterials, were below current OELs, yet a few workplaces were above the action level. As regards the health status of workers, 9 workers were diagnosed with a suspected respiratory occupational disease, of whom 7 were recommended for regular follow-up health monitoring. A total of 125 safety data sheets (SDSs) were collected from the nanomaterial-relevant workplaces and evaluated for completeness and reliability. Only 4 CNT SDSs (3.2%) included the term nanomaterial, while most nanomaterial SDSs were not regularly updated and lacked hazard information. Taken together, the current analysis provides valuable national-level information on the exposure and health status of workers that can guide the next policy steps for nanomaterial management in the workplace.
Levy, Mark L; Dekhuijzen, P N R; Barnes, P J; Broeders, M; Corrigan, C J; Chawes, B L; Corbetta, L; Dubus, J C; Hausen, Th; Lavorini, F; Roche, N; Sanchis, J; Usmani, Omar S; Viejo, J; Vincken, W; Voshaar, Th; Crompton, G K; Pedersen, Soren
2016-04-21
Health professionals tasked with advising patients with asthma and chronic obstructive pulmonary disease (COPD) how to use inhaler devices properly and what to do about unwanted effects will be aware of a variety of commonly held precepts. The evidence for many of these is, however, lacking or old and therefore in need of re-examination. Few would disagree that facilitating and encouraging regular and proper use of inhaler devices for the treatment of asthma and COPD is critical for successful outcomes. It seems logical that the abandonment of unnecessary or ill-founded practices forms an integral part of this process: the use of inhalers is bewildering enough, particularly with regular introduction of new drugs, devices and ancillary equipment, without unnecessary and pointless adages. We review the evidence, or lack thereof, underlying ten items of inhaler 'lore' commonly passed on by health professionals to each other and thence to patients. The exercise is intended as a pragmatic, evidence-informed review by a group of clinicians with appropriate experience. It is not intended to be an exhaustive review of the literature; rather, we aim to stimulate debate, and to encourage researchers to challenge some of these ideas and to provide new, updated evidence on which to base relevant, meaningful advice in the future. The discussion on each item is followed by a formal, expert opinion by members of the ADMIT Working Group.
3D superwide-angle one-way propagator and its application in seismic modeling and imaging
NASA Astrophysics Data System (ADS)
Jia, Xiaofeng; Jiang, Yunong; Wu, Ru-Shan
2018-07-01
Traditional one-way wave-equation-based propagators have been widely used in past decades. Compared with two-way propagators, one-way methods have higher efficiency and lower memory demands. These two features are especially important in solving large-scale 3D problems. However, regular one-way propagators cannot simulate waves propagating at large angles approaching 90° because of their inherent wide-angle limitation. A traditional one-way propagator can only march along a predetermined direction (e.g., the z-direction), so simulation of turning waves is beyond the ability of one-way methods. We develop a 3D superwide-angle one-way propagator to overcome the angle limitation and to simulate turning waves with propagation angles beyond 90° for modeling and imaging complex geological structures. Wavefields propagating along the vertical and horizontal directions are combined using a stacking scheme. A weight function related to the propagation angle is used for combining and updating the wavefields in each propagation step. In the implementation, we use graphics processing units (GPUs) to accelerate the process. A typical workflow is designed to exploit the advantages of the GPU architecture. Numerical examples show that the method achieves higher accuracy in modeling and imaging steep structures than regular one-way propagators. In fact, the superwide-angle one-way propagator can be built on any one-way method to improve seismic modeling and imaging.
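As a schematic of the combination step only (not the paper's propagator), the sketch below blends wavefields propagated along the vertical and horizontal directions with an angle-dependent weight; the weight function, grid, and field values are invented for illustration.

```python
import numpy as np

def combine_wavefields(u_vertical, u_horizontal, theta):
    """Blend wavefields from two propagation directions with a weight that
    favors the vertical propagator at small angles and the horizontal one
    near and beyond 90 degrees (theta: local propagation angle, radians)."""
    w = np.cos(theta) ** 2          # illustrative weight function in [0, 1]
    return w * u_vertical + (1.0 - w) * u_horizontal

# Toy fields on a small grid, with an angle map (all values are made up).
nx = nz = 4
u_v = np.ones((nz, nx))
u_h = 2.0 * np.ones((nz, nx))
theta = np.linspace(0.0, np.pi / 2, nx) * np.ones((nz, 1))
print(combine_wavefields(u_v, u_h, theta))
```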
Evolutionary Games of Multiplayer Cooperation on Graphs
Arranz, Jordi; Traulsen, Arne
2016-01-01
There has been much interest in studying evolutionary games in structured populations, often modeled as graphs. However, most analytical results so far have been obtained only for two-player or linear games, while the study of more complex multiplayer games has usually been tackled by computer simulations. Here we investigate evolutionary multiplayer games on graphs updated with a Moran death-Birth process. For cycles, we obtain an exact analytical condition for cooperation to be favored by natural selection, given in terms of the payoffs of the game and a set of structure coefficients. For regular graphs of degree three and larger, we estimate this condition using a combination of pair approximation and diffusion approximation. For a large class of cooperation games, our approximations suggest that graph-structured populations are stronger promoters of cooperation than populations lacking spatial structure. Computer simulations validate our analytical approximations for random regular graphs and cycles, but show systematic differences for graphs with many loops such as lattices. In particular, our simulation results show that these kinds of graphs can even lead to more stringent conditions for the evolution of cooperation than well-mixed populations. Overall, we provide evidence suggesting that the complexity arising from many-player interactions and spatial structure can be captured by pair approximation in the case of random graphs, but that it needs to be handled with care for graphs with high clustering. PMID:27513946
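To make the Moran death-Birth update concrete, here is a minimal simulation on a cycle with a public-goods-style multiplayer game; the payoff parameters, selection intensity, and population size are illustrative and unrelated to the paper's structure-coefficient calculations.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 60                 # cycle length
r, c = 3.0, 1.0        # public-goods multiplier and cooperation cost
delta = 0.1            # selection intensity

def group_payoff(state, center):
    """Public-goods payoffs earned in the group centered at `center`
    (the node and its two neighbors on the cycle)."""
    members = [(center - 1) % N, center, (center + 1) % N]
    pot = r * c * sum(state[m] for m in members) / len(members)
    return {m: pot - c * state[m] for m in members}

def total_payoff(state, node):
    """Sum payoffs from the three overlapping groups the node belongs to."""
    return sum(group_payoff(state, center)[node]
               for center in [(node - 1) % N, node, (node + 1) % N])

state = np.zeros(N, dtype=int)
state[: N // 2] = 1                      # start with half cooperators
for _ in range(20000):
    dead = rng.integers(N)               # death: uniformly random node
    nbrs = [(dead - 1) % N, (dead + 1) % N]
    fit = np.array([1.0 + delta * total_payoff(state, v) for v in nbrs])
    winner = rng.choice(nbrs, p=fit / fit.sum())   # birth: fitness-proportional
    state[dead] = state[winner]
    if state.sum() in (0, N):            # fixation reached
        break
print("cooperators at end:", state.sum(), "of", N)
```

Repeating such runs many times and comparing the fixation frequency of cooperators with the neutral expectation is the usual simulation counterpart of the analytical condition described in the abstract.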
NASA Technical Reports Server (NTRS)
Zanley, Nancy L.
1991-01-01
The NASA Science Internet (NSI) Network Operations Staff is responsible for providing reliable communication connectivity for the NASA science community. As the NSI user community expands, so does the demand for greater interoperability with users and resources on other networks (e.g., NSFnet, ESnet), both nationally and internationally. Coupled with the science community's demand for greater access to other resources is the demand for more reliable communication connectivity. Recognizing this, the NASA Science Internet Project Office (NSIPO) expanded its Operations activities. By January 1990, Network Operations was equipped with a telephone hotline, and its staff was expanded to six Network Operations Analysts. These six analysts provide 24-hour-a-day, 7-day-a-week coverage to assist site managers with problem determination and resolution. The NSI Operations staff monitors network circuits and their associated routers. In most instances, NSI Operations diagnoses and reports problems before users realize a problem exists. Monitoring of the NSI TCP/IP Network is currently being done with Proteon's Overview monitoring system. The Overview monitoring system displays a map of the NSI network, using various colors to indicate the condition of the components being monitored. Each node or site is polled via the Simple Network Management Protocol (SNMP). If a circuit goes down, Overview alerts the Network Operations staff with an audible alarm and changes the color of the component. When an alert is received, Network Operations personnel immediately verify and diagnose the problem, coordinate repair with other networking service groups, track problems, and document the problem and its resolution in a trouble-ticket database. NSI Operations offers the NSI science community reliable connectivity by exercising prompt assessment and resolution of network problems.
Screening for breast cancer: U.S. Preventive Services Task Force recommendation statement.
2009-11-17
Update of the 2002 U.S. Preventive Services Task Force (USPSTF) recommendation statement on screening for breast cancer in the general population. The USPSTF examined the evidence on the efficacy of 5 screening modalities in reducing mortality from breast cancer: film mammography, clinical breast examination, breast self-examination, digital mammography, and magnetic resonance imaging in order to update the 2002 recommendation. To accomplish this update, the USPSTF commissioned 2 studies: 1) a targeted systematic evidence review of 6 selected questions relating to benefits and harms of screening, and 2) a decision analysis that used population modeling techniques to compare the expected health outcomes and resource requirements of starting and ending mammography screening at different ages and using annual versus biennial screening intervals. The USPSTF recommends against routine screening mammography in women aged 40 to 49 years. The decision to start regular, biennial screening mammography before the age of 50 years should be an individual one and take into account patient context, including the patient's values regarding specific benefits and harms. (Grade C recommendation) The USPSTF recommends biennial screening mammography for women between the ages of 50 and 74 years. (Grade B recommendation) The USPSTF concludes that the current evidence is insufficient to assess the additional benefits and harms of screening mammography in women 75 years or older. (I statement) The USPSTF concludes that the current evidence is insufficient to assess the additional benefits and harms of clinical breast examination beyond screening mammography in women 40 years or older. (I statement) The USPSTF recommends against clinicians teaching women how to perform breast self-examination. (Grade D recommendation) The USPSTF concludes that the current evidence is insufficient to assess additional benefits and harms of either digital mammography or magnetic resonance imaging instead of film mammography as screening modalities for breast cancer. (I statement).
Performance of two updated blood glucose monitoring systems: an evaluation following ISO 15197:2013.
Pleus, Stefan; Baumstark, Annette; Rittmeyer, Delia; Jendrike, Nina; Haug, Cornelia; Freckmann, Guido
2016-05-01
Objective: For patients with diabetes, regular self-monitoring of blood glucose (SMBG) is essential to ensure adequate glycemic control. Therefore, accurate and reliable blood glucose measurements with SMBG systems are necessary. The international standard ISO 15197 describes requirements for SMBG systems, such as limits within which 95% of glucose results have to fall to reach acceptable system accuracy. The 2013 version of this standard sets higher demands, especially regarding system accuracy, than the currently still valid edition. ISO 15197 can be applied by manufacturers to receive a CE mark for their system. Research design and methods: This study was an accuracy evaluation following ISO 15197:2013 section 6.3 of two recently updated SMBG systems (Contour and Contour TS; Bayer Consumer Care AG, Basel, Switzerland) with an improved algorithm, to investigate whether the systems fulfill the requirements of the new standard. For this purpose, capillary blood samples of approximately 100 participants were measured with three test strip lots of both systems, and deviations from glucose values obtained with a hexokinase-based comparison method (Cobas Integra 400 plus; Roche Instrument Center, Rotkreuz, Switzerland) were determined. Percentages of values within the acceptance criteria of ISO 15197:2013 were calculated. This study was registered at clinicaltrials.gov (NCT02358408). Main outcome: Both updated systems fulfilled the system accuracy requirements of ISO 15197:2013, as 98.5% to 100% of the results were within the stipulated limits. Furthermore, all results were within the clinically non-critical zones A and B of the consensus error grid for type 1 diabetes. Conclusions: The technical improvement of the systems ensured compliance with ISO 15197 in the hands of healthcare professionals, even in its more stringent 2013 version. Alternative presentation of system accuracy results in radar plots provides additional information with certain advantages. In addition, the surveillance error grid offers a modern tool to assess a system's clinical performance.
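For reference, the headline criterion of ISO 15197:2013 counts a result as accurate if it falls within ±15 mg/dL of the comparison value for references below 100 mg/dL and within ±15% at or above 100 mg/dL, with at least 95% of results required to comply. The sketch below applies that check to made-up paired readings; it is not the study's evaluation code.

```python
import numpy as np

def within_iso15197_2013(measured, reference):
    """ISO 15197:2013 system-accuracy check: a result counts as accurate if it
    lies within +/-15 mg/dL of the reference below 100 mg/dL, or within
    +/-15% of the reference at 100 mg/dL and above."""
    measured = np.asarray(measured, dtype=float)
    reference = np.asarray(reference, dtype=float)
    limit = np.where(reference < 100.0, 15.0, 0.15 * reference)
    return np.abs(measured - reference) <= limit

# Made-up paired readings (mg/dL): meter result vs. hexokinase reference.
meter = np.array([68, 95, 142, 210, 305])
ref   = np.array([75, 90, 150, 200, 290])
ok = within_iso15197_2013(meter, ref)
print(ok, f"{100 * ok.mean():.1f}% within limits (standard requires >= 95%)")
```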
Viazzi, Francesca; Piscitelli, Pamela; Ceriello, Antonio; Fioretto, Paola; Giorda, Carlo; Guida, Pietro; Russo, Giuseppina; De Cosmo, Salvatore; Pontremoli, Roberto
2017-09-22
Apparent treatment resistant hypertension (aTRH) is highly prevalent in patients with type 2 diabetes mellitus (T2D) and entails worse cardiovascular prognosis. The impact of aTRH and long-term achievement of recommended blood pressure (BP) values on renal outcome remains largely unknown. We assessed the role of aTRH and BP on the development of chronic kidney disease in patients with T2D and hypertension in real-life clinical practice. Clinical records from a total of 29 923 patients with T2D and hypertension, with normal baseline estimated glomerular filtration rate and regular visits during a 4-year follow-up, were retrieved and analyzed. The association between time-updated BP control (i.e., 75% of visits with BP <140/90 mm Hg) and the occurrence of estimated glomerular filtration rate <60 and/or a reduction ≥30% from baseline was assessed. At baseline, 17% of patients had aTRH. Over the 4-year follow-up, 19% developed low estimated glomerular filtration rate and 12% an estimated glomerular filtration rate reduction ≥30% from baseline. Patients with aTRH showed an increased risk of developing both renal outcomes (adjusted odds ratio, 1.31 and 1.43; P<0.001, respectively), as compared with those with non-aTRH. No association was found between BP control and renal outcomes in non-aTRH, whereas in aTRH, BP control was associated with a 30% (P=0.036) greater risk of developing the renal end points. aTRH entails a worse renal prognosis in T2D with hypertension. BP control is not associated with a more-favorable renal outcome in aTRH. The relationship between time-updated BP and renal function seems to be J-shaped, with optimal systolic BP values between 120 and 140 mm Hg. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.
CycADS: an annotation database system to ease the development and update of BioCyc databases
Vellozo, Augusto F.; Véron, Amélie S.; Baa-Puyoulet, Patrice; Huerta-Cepas, Jaime; Cottret, Ludovic; Febvay, Gérard; Calevro, Federica; Rahbé, Yvan; Douglas, Angela E.; Gabaldón, Toni; Sagot, Marie-France; Charles, Hubert; Colella, Stefano
2011-01-01
In recent years, genomes from an increasing number of organisms have been sequenced, but their annotation remains a time-consuming process. The BioCyc databases offer a framework for the integrated analysis of metabolic networks. The Pathway Tools software suite allows the automated construction of a database starting from an annotated genome, but it requires prior integration of all annotations into a specific summary file or into a GenBank file. To allow the easy creation and updating of a BioCyc database starting from the multiple genome annotation resources available over time, we have developed an ad hoc data management system that we have called the Cyc Annotation Database System (CycADS). CycADS is centred on a specific database model and on a set of Java programs to import, filter and export relevant information. Data from GenBank and other annotation sources (including, for example, KAAS, PRIAM, Blast2GO and PhylomeDB) are collected into a database to be subsequently filtered and extracted to generate a complete annotation file. This file is then used to build an enriched BioCyc database using the PathoLogic program of Pathway Tools. The CycADS pipeline for annotation management was used to build the AcypiCyc database for the pea aphid (Acyrthosiphon pisum), whose genome was recently sequenced. The AcypiCyc database webpage also includes, for comparative analyses, two other metabolic reconstruction BioCyc databases generated using CycADS: TricaCyc for Tribolium castaneum and DromeCyc for Drosophila melanogaster. Thanks to its flexible design, CycADS offers a powerful software tool for the generation and regular updating of enriched BioCyc databases. The CycADS system is particularly suited for metabolic gene annotation and network reconstruction in newly sequenced genomes. Because of the uniform annotation used for metabolic network reconstruction, CycADS is particularly useful for comparative analysis of the metabolism of different organisms. Database URL: http://www.cycadsys.org PMID:21474551
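The merge-and-export idea behind CycADS can be illustrated with a small, purely hypothetical sketch: the gene IDs, source names, field names, and JSON output below stand in for the real database schema and for the PathoLogic input files the actual pipeline produces.

```python
import json
from collections import defaultdict

# Hypothetical per-source annotations keyed by gene ID (all values made up).
sources = {
    "GenBank":  {"gene001": {"product": "trehalose-6-phosphate synthase"}},
    "PRIAM":    {"gene001": {"ec": "2.4.1.15"}},
    "Blast2GO": {"gene001": {"go": ["GO:0005992"]},
                 "gene002": {"go": ["GO:0008152"]}},
}

def merge_annotations(per_source):
    """Collect every source's fields for each gene into one record,
    keeping track of which source contributed each value."""
    merged = defaultdict(dict)
    for source, genes in per_source.items():
        for gene_id, fields in genes.items():
            for key, value in fields.items():
                merged[gene_id][key] = {"value": value, "source": source}
    return dict(merged)

# Export a consolidated annotation file (JSON here purely for illustration).
with open("merged_annotations.json", "w") as fh:
    json.dump(merge_annotations(sources), fh, indent=2)
```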
NASA Astrophysics Data System (ADS)
Semken, S. C.; Arrowsmith, R.; Fouch, M. J.; Garnero, E. J.; Taylor, W. L.; Bohon, W.; Pacheco, H. A.; Schwab, P.; Baumback, D.; Pettis, L.; Colunga, J.; Robinson, S.; Dick, C.
2012-12-01
The EarthScope Program (www.earthscope.org) funded by the National Science Foundation fosters interdisciplinary exploration of the geologic structure and evolution of the North American continent by means of seismology, geodesy, magnetotellurics, in-situ fault-zone sampling, geochronology, and high-resolution topographic measurements. EarthScope scientific data and findings are transforming the study of Earth structure and processes throughout the planet. These data enhance the understanding and mitigation of hazards and inform environmental and economic applications of geoscience. The EarthScope Program also offers significant resources and opportunities for education and outreach (E&O) in the Earth system sciences. The EarthScope National Office (ESNO) at Arizona State University serves all EarthScope stakeholders, including researchers, educators, students, and the general public. ESNO continues to actively support and promote E&O with programmatic activities such as a regularly updated presence on the web and social media, newsletters, biannual national conferences, workshops for E&O providers and informal educators (interpreters), collaborative interaction with other Earth science organizations, continuing education for researchers, promotion of place-based education, and support for regional K-12 teacher professional-development programs led by EarthScope stakeholders. EarthScope E&O, coordinated by ESNO, leads the compilation and dissemination of the data, findings, and legacy of the epic EarthScope Program. In this presentation we offer updated reports and outcomes from ESNO E&O activities, including web and social-media upgrades, the Earth Science E&O Provider Summit for partnering organizations, the Central Appalachian Interpretive Workshop for informal Earth science educators, the U.S. Science and Engineering Fair, and collaborative efforts with partner organizations. The EarthScope National Office is supported by the National Science Foundation under grants EAR-1101100 and EAR-1216301. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.
Patient data system for monitoring shunts.
Frank, E; Su, E; Smith, K
1988-01-01
Rapidly locating accurate data on a patient's shunt system is often extremely difficult. We have developed a simple system to fill a perceived need for recording current data on a patient's shunt. This system employs an easily updated record in the patient's hospital or clinic chart as well as a wallet-sized data card for the patient or his family to carry. The data in the chart include the configuration of the patient's current shunt system and a graphic record of previous shunt problems. The small patient data card describes the age of the shunt system and its current configuration. We have found that this system provides assistance in the routine follow-up of patients with shunts and plays an essential role in the emergency evaluation of these patients, particularly when an emergency evaluation is undertaken in facilities distant from the location of regular treatment.
Development and Operations of the Astrophysics Data System
NASA Technical Reports Server (NTRS)
Murray, Stephen S.; Oliversen, Ronald (Technical Monitor)
2003-01-01
SAO TASKS ACCOMPLISHED: Abstract Service: (1) Continued regular updates of abstracts in the databases, both at SAO and at all mirror sites; (2) Established a new naming convention of QB books in preparation for adding physics books from Hollis or Library of Congress; (3) Modified handling of object tag so as not to interfere with XHTML definition; (4) Worked on moving 'what's new' announcements to a majordomo email list so as not to interfere with divisional mail handling; (5) Implemented and tested new first author feature following suggestions from users at the AAS meeting; (6) Added SSRv entries back to volume 1 in preparation for scanning of the journal; (7) Assisted in the re-configuration of the ADS mirror site at the CDS and sent a new set of tapes containing article data to allow re-creation of the ADS article data lost during the move; (8) Created scripts to automatically download Astrobiology.
2016-02-01
The Environmental Effects Assessment Panel (EEAP) is one of three Panels that regularly informs the Parties (countries) to the Montreal Protocol on the effects of ozone depletion and the consequences of climate change interactions with respect to human health, animals, plants, biogeochemistry, air quality, and materials. The Panels provide a detailed assessment report every four years. The most recent 2014 Quadrennial Assessment by the EEAP was published as a special issue of seven papers in 2015 (Photochem. Photobiol. Sci., 2015, 14, 1-184). The next Quadrennial Assessment will be published in 2018/2019. In the interim, the EEAP generally produces an annual update or progress report of the relevant scientific findings. The present progress report for 2015 assesses some of the highlights and new insights with regard to the interactive nature of the effects of UV radiation, atmospheric processes, and climate change.
Binary catalogue of exoplanets
NASA Astrophysics Data System (ADS)
Schwarz, Richard; Bazso, Akos; Zechner, Renate; Funk, Barbara
2016-02-01
Since 1995, the Extrasolar Planets Encyclopaedia (http://exoplanet.eu/) has listed most of the known exoplanets. With the growing number of exoplanets detected in binary and multiple star systems, it has become important to flag these systems and gather them in a dedicated database, information that is not available in the Extrasolar Planets Encyclopaedia. We therefore established an online database (available at http://www.univie.ac.at/adg/schwarz/multiple.html) for all known exoplanets in binary star systems and, in addition, in multiple star systems; it will be updated regularly and linked to the Extrasolar Planets Encyclopaedia. The binary catalogue of exoplanets is available online as a data file and can be used for statistical purposes. Our database is divided into two parts, with the data of the stars and of the planets given in separate lists. We also describe the different parameters of the exoplanetary systems and present some applications.
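Since the catalogue is distributed as a plain data file intended for statistical use, a minimal sketch of that kind of use might look like the following; the file name and column names are hypothetical and would need to be adapted to the actual download format.

```python
# Illustrative sketch: basic statistics on a binary-star exoplanet catalogue
# downloaded as a plain data file. File name and column names are hypothetical.
import pandas as pd

catalogue = pd.read_csv("binary_exoplanets.csv")          # assumed: one row per planet
planets_per_system = catalogue.groupby("system_name").size()

print("systems:", planets_per_system.shape[0])
print("planets:", len(catalogue))
print("multi-planet binaries:", int((planets_per_system > 1).sum()))
```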
[Calculation of standardised unit costs from a societal perspective for health economic evaluation].
Bock, J-O; Brettschneider, C; Seidl, H; Bowles, D; Holle, R; Greiner, W; König, H H
2015-01-01
Due to demographic aging, economic evaluation of health care technologies for the elderly becomes more important. A standardised questionnaire to measure the health-related resource utilisation has been designed. The monetary valuation of the resource use documented by the questionnaire is a central step towards the determination of the corresponding costs. The aim of this paper is to provide unit costs for the resources in the questionnaire from a societal perspective. The unit costs are calculated pragmatically based on regularly published sources. Thus, an easy update is possible. This paper presents the calculated unit costs for outpatient medical care, inpatient care, informal and formal nursing care and pharmaceuticals from a societal perspective. The calculated unit costs can serve as a reference case in health economic evaluations and hence help to increase their comparability. © Georg Thieme Verlag KG Stuttgart · New York.
Tenofovir Nephrotoxicity: 2011 Update
Fernandez-Fernandez, Beatriz; Montoya-Ferrer, Ana; Sanz, Ana B.; Sanchez-Niño, Maria D.; Izquierdo, Maria C.; Poveda, Jonay; Sainz-Prestel, Valeria; Ortiz-Martin, Natalia; Parra-Rodriguez, Alejandro; Selgas, Rafael; Ruiz-Ortega, Marta; Egido, Jesus; Ortiz, Alberto
2011-01-01
Tenofovir is an acyclic nucleotide analogue reverse-transcriptase inhibitor structurally similar to the nephrotoxic drugs adefovir and cidofovir. Tenofovir is widely used to treat HIV infection and approved for treatment of hepatitis B virus. Despite initial cell culture and clinical trials results supporting the renal safety of tenofovir, its clinical use is associated with a low, albeit significant, risk of kidney injury. Proximal tubular cell secretion of tenofovir explains the accumulation of the drug in these mitochondria-rich cells. Tenofovir nephrotoxicity is characterized by proximal tubular cell dysfunction that may be associated with acute kidney injury or chronic kidney disease. Withdrawal of the drug leads to improvement of analytical parameters that may be partial. Understanding the risk factors for nephrotoxicity and regular monitoring of proximal tubular dysfunction and serum creatinine in high-risk patients is required to minimize nephrotoxicity. Newer, structurally similar molecular derivatives that do not accumulate in proximal tubules are under study. PMID:21716719
van Dam, Alje P; van Ogtrop, Marc L; Golparian, Daniel; Mehrtens, Jan; de Vries, Henry J C; Unemo, Magnus
2014-11-01
We describe the first case in the Netherlands of gonorrhoea treatment failure with a third-generation cephalosporin, cefotaxime 1 g intramuscularly. The case occurred in a high-frequency transmitting population (men who have sex with men) and was caused by the internationally spreading multidrug-resistant gonococcal NG-MAST ST1407 clone. The patient was clinically cured after treatment with ceftriaxone 500 mg intramuscularly, which is the only third-generation cephalosporin that should be used for first-line empirical treatment of gonorrhoea. Increased awareness of failures with third-generation cephalosporins, enhanced monitoring and appropriate verification of treatment failures including more frequent tests-of-cure, and strict adherence to regularly updated treatment guidelines are essential globally. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
Neuroendocrine Causes of Amenorrhea—An Update
Fourman, Lindsay T.
2015-01-01
Context: Secondary amenorrhea—the absence of menses for three consecutive cycles—affects approximately 3–4% of reproductive age women, and infertility—the failure to conceive after 12 months of regular intercourse—affects approximately 6–10%. Neuroendocrine causes of amenorrhea and infertility, including functional hypothalamic amenorrhea and hyperprolactinemia, constitute a majority of these cases. Objective: In this review, we discuss the physiologic, pathologic, and iatrogenic causes of amenorrhea and infertility arising from perturbations in the hypothalamic-pituitary-adrenal axis, including potential genetic causes. We focus extensively on the hormonal mechanisms involved in disrupting the hypothalamic-pituitary-ovarian axis. Conclusions: A thorough understanding of the neuroendocrine causes of amenorrhea and infertility is critical for properly assessing patients presenting with these complaints. Prompt evaluation and treatment are essential to prevent loss of bone mass due to hypoestrogenemia and/or to achieve the time-sensitive treatment goal of conception. PMID:25581597
[Teleradiology - update 2014].
Pinto dos Santos, D; Hempel, J-M; Kloeckner, R; Düber, C; Mildenberger, P
2014-05-01
Due to economic considerations and thanks to technological advances, there is a growing interest in the integration of teleradiological applications into the regular radiological workflow. The legal and technical hurdles which are still to be overcome are being discussed in politics as well as by national and international radiological societies. The European Commission as well as the German Federal Ministry of Health placed a focus on telemedicine with their recent eHealth initiatives. The European Society of Radiology (ESR) recently published a white paper on teleradiology. In Germany, §3 section 4 of the Röntgenverordnung (RöV, X-ray regulations) and DIN 6868-159 set a framework in which teleradiology can also be used for primary reads. These possibilities are already being used by various networks and some commercial providers across Germany. With regard to cross-border teleradiology, which currently stands in conflict with the RöV, many issues remain unresolved.
SigReannot-mart: a query environment for expression microarray probe re-annotations.
Moreews, François; Rauffet, Gaelle; Dehais, Patrice; Klopp, Christophe
2011-01-01
Expression microarrays are commonly used to study transcriptomes. Most of the arrays are now based on oligonucleotide probes. Because probe design is a tedious task, it often takes place only once, at the beginning of a project. The oligo set is then used for several years. During this time period, the knowledge gathered by the community on the genome and the transcriptome increases and becomes more precise. Therefore re-annotating the set is essential to supply biologists with up-to-date annotations. SigReannot-mart is a query environment populated with regularly updated annotations for different oligo sets. It stores the results of the SigReannot pipeline, which has mainly been used on farm and aquaculture species. It permits easy extraction in different formats using filters. It is used to compare probe sets on different criteria, to choose the set best suited to a given experiment, or to mix probe sets in order to create a new one.
Teen smoking cessation help via the Internet: a survey of search engines.
Edwards, Christine C; Elliott, Sean P; Conway, Terry L; Woodruff, Susan I
2003-07-01
The objective of this study was to assess Web sites related to teen smoking cessation on the Internet. Seven Internet search engines were searched using the keywords teen quit smoking. The top 20 hits from each search engine were reviewed and categorized. The keywords teen quit smoking produced between 35 and 400,000 hits depending on the search engine. Of 140 potential hits, 62% were active, unique sites; 85% were listed by only one search engine; and 40% focused on cessation. Findings suggest that legitimate on-line smoking cessation help for teens is constrained by search engine choice and the amount of time teens spend looking through potential sites. Resource listings should be updated regularly. Smoking cessation Web sites need to be picked up by multiple search engines. Further evaluation of smoking cessation Web sites needs to be conducted to identify the most effective help for teens.
On the accuracy of ERS-1 orbit predictions
NASA Technical Reports Server (NTRS)
Koenig, Rolf; Li, H.; Massmann, Franz-Heinrich; Raimondo, J. C.; Rajasenan, C.; Reigber, C.
1993-01-01
Since the launch of ERS-1, the D-PAF (German Processing and Archiving Facility) has regularly provided orbit predictions for the worldwide SLR (Satellite Laser Ranging) tracking network. The weekly distributed orbital elements are so-called tuned IRVs and tuned SAO elements. The tuning procedure, designed to improve the accuracy of the recovery of the orbit at the stations, is discussed based on numerical results. This shows that tuning of elements is essential for ERS-1 with the currently applied tracking procedures. The orbital elements are updated by daily distributed time bias functions. The generation of the time bias function is explained, and problems and numerical results are presented. The time bias function increases the prediction accuracy considerably. Finally, the quality assessment of ERS-1 orbit predictions is described. The accuracy is compiled for about 250 days since launch. The average accuracy lies in the range of 50-100 ms and has improved considerably.
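To make the role of a time bias function more concrete, the sketch below applies a simple linear time bias to a predicted station pass. This is only an illustration of the general idea under assumed coefficients, not the actual D-PAF model or distribution format.

```python
# Illustrative sketch: the along-track error of an orbit prediction is expressed
# as a time offset that grows with time from the prediction epoch; stations shift
# the predicted pass times by this offset. Coefficients and epochs are hypothetical.
from datetime import datetime, timedelta

def time_bias_ms(t, epoch, a0_ms, a1_ms_per_day):
    """Linear time bias model: offset (ms) = a0 + a1 * days since epoch."""
    days = (t - epoch).total_seconds() / 86400.0
    return a0_ms + a1_ms_per_day * days

epoch = datetime(1993, 1, 1)
predicted_pass = datetime(1993, 1, 4, 12, 30, 0)
bias = time_bias_ms(predicted_pass, epoch, a0_ms=20.0, a1_ms_per_day=15.0)
corrected_pass = predicted_pass + timedelta(milliseconds=bias)
print(f"time bias: {bias:.1f} ms, corrected pass time: {corrected_pass}")
```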
NPP-VIIRS DNB-based reallocating subpopulations to mercury in Urumqi city cluster, central Asia
NASA Astrophysics Data System (ADS)
Zhou, X.; Feng, X. B.; Dai, W.; Li, P.; Ju, C. Y.; Bao, Z. D.; Han, Y. L.
2017-02-01
Accurate and up-to-date assignment of population-related environmental exposures onto fine grid cells in the oasis cities of arid areas remains challenging. We present an approach based on the Suomi National Polar-orbiting Partnership (S-NPP) Visible Infrared Imaging Radiometer Suite (VIIRS) Day/Night Band (DNB) to reallocate population onto a finer regular grid. The population potentially exposed to mercury was reallocated onto a 0.1 x 0.1 km reference grid in the Urumqi city cluster of China's Xinjiang, central Asia. Monte Carlo modelling indicated that a range of 0.5 to 2.4 million people is reliable. The study highlights that the NPP-VIIRS DNB-based multi-layered, dasymetric, spatial method enhances our ability to remotely estimate the distribution and size of a target population at the street-level scale and has the potential to transform control strategies for epidemiology, public policy and other socioeconomic fields.
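The core dasymetric step, redistributing a known administrative-unit total over grid cells in proportion to their DNB radiance, can be sketched as follows. The array values and the simple proportional-weighting rule are illustrative assumptions, not the exact multi-layered method used in the study.

```python
# Illustrative sketch of dasymetric population reallocation weighted by nighttime
# light radiance: each grid cell receives a share of the administrative-unit total
# proportional to its DNB radiance. Array values are hypothetical.
import numpy as np

def reallocate_population(total_population, dnb_radiance):
    """Spread a known total over grid cells in proportion to DNB radiance."""
    radiance = np.clip(dnb_radiance, 0.0, None)   # negative noise set to zero
    weights = radiance / radiance.sum()
    return total_population * weights

dnb = np.array([[0.2, 1.5, 4.0],
                [0.0, 2.5, 8.0],
                [0.1, 0.8, 3.0]])                 # radiance per 0.1 x 0.1 km cell
pop_grid = reallocate_population(total_population=1_200_000, dnb_radiance=dnb)
print(pop_grid.round(0))
```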
A review of drug-induced liver injury databases.
Luo, Guangwen; Shen, Yiting; Yang, Lizhu; Lu, Aiping; Xiang, Zheng
2017-09-01
Drug-induced liver injuries have been a major focus of current research in drug development, and are also one of the major reasons for the failure and withdrawal of drugs in development. Drug-induced liver injuries have been systematically recorded in many public databases, which have become valuable resources in this field. In this study, we provide an overview of these databases, including the liver injury-specific databases LiverTox, LTKB, Open TG-GATEs, LTMap and Hepatox, and the general databases, T3DB, DrugBank, DITOP, DART, CTD and HSDB. The features and limitations of these databases are summarized and discussed in detail. Apart from their powerful functions, we believe that these databases can be improved in several ways: by providing the data about the molecular targets involved in liver toxicity, by incorporating information regarding liver injuries caused by drug interactions, and by regularly updating the data.
The new powder diffractometer D1B of the Institut Laue Langevin
NASA Astrophysics Data System (ADS)
Puente Orench, I.; Clergeau, J. F.; Martínez, S.; Olmos, M.; Fabelo, O.; Campo, J.
2014-11-01
D1B is a medium-resolution, high-flux powder diffractometer located at the Institut Laue Langevin (ILL). D1B is a suitable instrument for studying a large variety of polycrystalline materials. It has been running since 1998 as a CRG (collaborating research group) instrument, exploited by the CNRS (Centre National de la Recherche Scientifique, France) and the CSIC (Consejo Superior de Investigaciones Cientificas, Spain). In 2008 the Spanish CRG started an upgrade program which included a new detector and a radial oscillating collimator (ROC). The detector, which has a sensitive height of 100 mm, covers an angular range of 128°. Its 1280 gold wires provide a neutron detection point every 0.1°. The ROC is made of 198 gadolinium-based absorbing collimation blades, regularly placed every 0.67°. Here the present characteristics of D1B are reviewed and its experimental performance is presented.
The WHO Green Page - Assessment of the Environmental Health Risks in Children.
Kurpas, Donata; Church, Joseph; Mroczek, Bożena; Hans-Wytrychowska, Anna; Rudkowski, Zbigniew
2014-01-01
The objective of this study was to assess the possibility of implementation of the WHO Green Page as a tool to supplement basic medical interviews with environmental health risk factors for children. The WHO Green Page questionnaire was tested on parents of children who visited family practice doctors. A total of 159 parents took part in the study. It was noted that 24.3% of caregivers expressed concern about their children's environment without naming the risk factors. It was also found that 23.7% of the parents demonstrated knowledge and awareness of existing real environmental risks, and 7.0% of them stated that their children had sustained injuries in connection with road traffic prior to the questionnaire study. The WHO Green Page will provide additional information to the basic medical interview and, if regularly updated, will allow for monitoring of changing environmental conditions of children.
The WHO Green Page – Assessment of the Environmental Health Risks in Children
Kurpas, Donata; Church, Joseph; Mroczek, Bożena; Hans-Wytrychowska, Anna; Rudkowski, Zbigniew
2013-01-01
Background: The objective of this study was to assess the possibility of implementation of the WHO Green Page as a tool to supplement basic medical interviews with environmental health risk factors for children. Methods: The WHO Green Page questionnaire was tested on parents of children who visited family practice doctors. Results: A total of 159 parents took part in the study. It was noted that 24.3% of caregivers expressed concern about their children’s environment without naming the risk factors. It was also found that 23.7% of the parents demonstrated knowledge and awareness of existing real environmental risks, and 7.0% of them stated that their children had sustained injuries in connection with road traffic prior to the questionnaire study. Conclusions: The WHO Green Page will provide additional information to the basic medical interview and, if regularly updated, will allow for monitoring of changing environmental conditions of children. PMID:25648271
The World Hypertension League: where now and where to in salt reduction
Lackland, Daniel T.; Lisheng, Liu; Zhang, Xin-Hua; Nilsson, Peter M.; Niebylski, Mark L.
2015-01-01
High dietary salt is a leading risk for death and disability, largely by causing increased blood pressure. Other associated health risks include gastric and renal cell cancers, osteoporosis, renal stones, increased disease activity in multiple sclerosis, headache, increased body fat and Meniere's disease. The World Hypertension League (WHL) has prioritized advocacy for salt reduction. WHL resources and actions include a non-governmental organization policy statement, a dietary salt fact sheet, development of standardized nomenclature, a call for quality research, collaboration in a weekly salt science update, development of a process to set recommended dietary salt research standards and regular literature reviews, development of adoptable PowerPoint slide sets to support WHL positions and resources, and critique of weak research studies on dietary salt. The WHL plans to continue to work with multiple governmental and non-governmental organizations to promote dietary salt reduction towards the World Health Organization (WHO) recommendations. PMID:26090335
CADB: Conformation Angles DataBase of proteins
Sheik, S. S.; Ananthalakshmi, P.; Bhargavi, G. Ramya; Sekar, K.
2003-01-01
Conformation Angles DataBase (CADB) provides an online resource to access data on the conformation angles (both main-chain and side-chain) of protein structures in two data sets corresponding to 25% and 90% sequence identity between any two proteins available in the Protein Data Bank. In addition, the database contains the necessary crystallographic parameters. The package has several flexible options and display facilities to visualize the main-chain and side-chain conformation angles for a particular amino acid residue. The package can also be used to study the interrelationship between the main-chain and side-chain conformation angles. A Web-based Java graphics interface has been deployed to display the information of interest to the user on the client machine. The database is updated at regular intervals and can be accessed over the World Wide Web at the following URL: http://144.16.71.148/cadb/. PMID:12520049
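A conformation angle of the kind stored in CADB is a torsion (dihedral) angle defined by four consecutive atoms. The sketch below shows the standard calculation from Cartesian coordinates; the coordinates are purely hypothetical and the code has no connection to CADB's own implementation.

```python
# Illustrative sketch: a backbone or side-chain conformation angle is the signed
# dihedral defined by four consecutive atoms. Coordinates below are hypothetical.
import numpy as np

def dihedral_deg(p0, p1, p2, p3):
    """Signed dihedral angle in degrees for points p0-p1-p2-p3."""
    b0 = p0 - p1
    b1 = p2 - p1
    b2 = p3 - p2
    b1 = b1 / np.linalg.norm(b1)
    # Components of b0 and b2 perpendicular to the central bond b1.
    v = b0 - np.dot(b0, b1) * b1
    w = b2 - np.dot(b2, b1) * b1
    x = np.dot(v, w)
    y = np.dot(np.cross(b1, v), w)
    return np.degrees(np.arctan2(y, x))

atoms = [np.array(p, dtype=float) for p in
         [(24.9, 13.5, 9.6), (24.4, 12.8, 8.4), (23.0, 12.3, 8.6), (22.6, 11.2, 8.0)]]
print(f"torsion angle: {dihedral_deg(*atoms):.1f} degrees")
```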
Exercise training and cardiometabolic diseases: focus on the vascular system.
Roque, Fernanda R; Hernanz, Raquel; Salaices, Mercedes; Briones, Ana M
2013-06-01
The regular practice of physical activity is a well-recommended strategy for the prevention and treatment of several cardiovascular and metabolic diseases. Physical exercise prevents the progression of vascular diseases and reduces cardiovascular morbidity and mortality. Exercise training also ameliorates vascular changes including endothelial dysfunction and arterial remodeling and stiffness, usually present in type 2 diabetes, obesity, hypertension and metabolic syndrome. Common to these diseases is excessive oxidative stress, which plays an important role in the processes underlying vascular changes. At the vascular level, exercise training improves the redox state and consequently NO availability. Moreover, growing evidence indicates that other mediators such as prostanoids might be involved in the beneficial effects of exercise. The purpose of this review is to update recent findings describing the adaptation response induced by exercise in cardiovascular and metabolic diseases, focusing more specifically on the beneficial effects of exercise in the vasculature and the underlying mechanisms.
Evaluation of low emission zone policy on vehicle emission reduction in Beijing, China
NASA Astrophysics Data System (ADS)
Zhang, Yi; Andre, Michel; Liu, Yao; Wu, Lin; Jing, Boyu; Mao, Hongjun
2018-02-01
This study evaluates the effect of the low emission zone (LEZ) in Beijing from the perspective of vehicle emission reduction, using an urban street-scale vehicle emission inventory built via a bottom-up approach from local emission factors and dynamic and static traffic data. In 2016, before the implementation of the LEZ, vehicle emissions of CO, HC, NOx, and PM were 49.01 × 10⁴, 6.31 × 10⁴, 5.96 × 10⁴, and 0.12 × 10⁴ t, respectively. According to the simulation results, the LEZ policy would have a clearly positive effect on emission reduction, especially for CO and HC. In order to realize the long-term mitigation target, it is necessary to update and amend the detailed terms of the LEZ policy regularly according to traffic development and changes in vehicle emissions.
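The bottom-up street-scale calculation mentioned above reduces to multiplying traffic activity by segment length and per-kilometre emission factors and summing over vehicle categories. The sketch below illustrates that arithmetic with entirely hypothetical emission factors and traffic counts, not the study's actual Beijing data.

```python
# Illustrative sketch of a bottom-up street-scale emission estimate:
# emissions = traffic volume x street length x per-km emission factor,
# summed over vehicle categories. All numbers are hypothetical.
def street_emissions_g(traffic_by_type, street_length_km, emission_factors_g_per_km):
    """Total emissions (grams) of one pollutant on one street segment."""
    return sum(count * street_length_km * emission_factors_g_per_km[vtype]
               for vtype, count in traffic_by_type.items())

ef_co = {"passenger_car": 1.2, "light_duty_truck": 2.8, "bus": 6.5}       # g/km
traffic = {"passenger_car": 18000, "light_duty_truck": 1500, "bus": 400}  # vehicles/day

daily_co_g = street_emissions_g(traffic, street_length_km=2.4,
                                emission_factors_g_per_km=ef_co)
print(f"CO on this segment: {daily_co_g / 1000:.1f} kg/day")
```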
IMG: the integrated microbial genomes database and comparative analysis system
Markowitz, Victor M.; Chen, I-Min A.; Palaniappan, Krishna; Chu, Ken; Szeto, Ernest; Grechkin, Yuri; Ratner, Anna; Jacob, Biju; Huang, Jinghua; Williams, Peter; Huntemann, Marcel; Anderson, Iain; Mavromatis, Konstantinos; Ivanova, Natalia N.; Kyrpides, Nikos C.
2012-01-01
The Integrated Microbial Genomes (IMG) system serves as a community resource for comparative analysis of publicly available genomes in a comprehensive integrated context. IMG integrates publicly available draft and complete genomes from all three domains of life with a large number of plasmids and viruses. IMG provides tools and viewers for analyzing and reviewing the annotations of genes and genomes in a comparative context. IMG's data content and analytical capabilities have been continuously extended through regular updates since its first release in March 2005. IMG is available at http://img.jgi.doe.gov. Companion IMG systems provide support for expert review of genome annotations (IMG/ER: http://img.jgi.doe.gov/er), teaching courses and training in microbial genome analysis (IMG/EDU: http://img.jgi.doe.gov/edu) and analysis of genomes related to the Human Microbiome Project (IMG/HMP: http://www.hmpdacc-resources.org/img_hmp). PMID:22194640
[Haemovigilance and blood safety in overseas military].
Sailliol, A; Plang, S; Martinaud, C; Pouget, T; Vedy, S; Clavier, B; Cellarier, V; Roche, C; Civadier, C; Ausset, S
2014-11-01
The French Military Blood Institute (FMBI) is the only military blood supplier in France. FMBI operates independently and autonomously under the Ministry of Defense's supervision and according to French, European and NATO technical and safety guidelines. FMBI is in charge of the collection, preparation and distribution of blood products to provide transfusion support to the armed forces, especially during overseas operations. During overseas operations, a primary care physician is responsible for haemovigilance, in permanent contact with an FMBI expert to manage any adverse reaction. Additionally, traceability of blood products delivered or collected during overseas operations is a priority, allowing appropriate management of transfusion inquiries and assessment of practices in order to improve and update procedures and training. Transfusion safety in overseas operations is based on regular and specific training of the personnel involved in the blood supply chain in exceptional situations. Copyright © 2014 Elsevier Masson SAS. All rights reserved.
[Comparison among various software for LMS growth curve fitting methods].
Han, Lin; Wu, Wenhong; Wei, Qiuxia
2015-03-01
To explore how LMS (lambda-mu-sigma: skewness, median and coefficient of variation) growth curve fitting can be carried out in different software packages, and to identify the most suitable statistical tool for grass-roots child and adolescent health workers. Routine physical examination data of head circumference for normal infants aged 3, 6, 9 and 12 months in Baotou City were analyzed. Statistical packages including SAS, R, STATA and SPSS were used to fit the LMS growth curves, and the results were evaluated with respect to ease of use, learning curve, user interface, presentation of results, and software updating and maintenance. All packages produced the same fitting results, and each had its own advantages and disadvantages. Taking all evaluation aspects into consideration, R performed best for LMS growth curve fitting and is the most suitable choice for grass-roots child and adolescent health workers.
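Once L, M and S values have been fitted for an age group, converting an individual measurement to a z-score uses the standard LMS formula. The sketch below shows that step with hypothetical parameter values for head circumference at 6 months.

```python
# Illustrative sketch: converting a measurement to a z-score from fitted LMS
# parameters (L = Box-Cox power, M = median, S = coefficient of variation).
# The parameter values below are hypothetical.
import math

def lms_z_score(x, L, M, S):
    """z = ((x/M)**L - 1) / (L*S) for L != 0, and ln(x/M)/S for L == 0."""
    if abs(L) < 1e-12:
        return math.log(x / M) / S
    return ((x / M) ** L - 1.0) / (L * S)

# Hypothetical LMS values for head circumference (cm) at 6 months.
print(f"z = {lms_z_score(x=44.5, L=0.30, M=43.8, S=0.028):.2f}")
```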
Madore, Amy; Rosenberg, Julie; Muyindike, Winnie R; Bangsberg, David R; Bwana, Mwebesa B; Martin, Jeffrey N; Kanyesigye, Michael; Weintraub, Rebecca
2015-12-01
Implementation lessons: • Technology alone does not necessarily lead to improvement in health service delivery, in contrast to the common assumption that advanced technology goes hand in hand with progress. • Implementation of electronic medical record (EMR) systems is a complex, resource-intensive process that, in addition to software, hardware, and human resource investments, requires careful planning, change management skills, adaptability, and continuous engagement of stakeholders. • Research requirements and goals must be balanced with service delivery needs when determining how much information is essential to collect and who should be interfacing with the EMR system. • EMR systems require ongoing monitoring and regular updates to ensure they are responsive to evolving clinical use cases and research questions. • High-quality data and analyses are essential for EMRs to deliver value to providers, researchers, and patients. Copyright © 2015 Elsevier Inc. All rights reserved.
The ESA activities on future launchers
NASA Technical Reports Server (NTRS)
Pfeffer, H.
1984-01-01
A future launcher development scenario depends on many assumptions, such as the impetus provided by the probability of future missions and the political willingness of member states to undertake future developments. Because of the long timescale implied by a coherent launcher development, a step-wise approach within an overall future launcher development plan appears essential. The definition of development steps allows the launcher developments to be adapted to the driving external forces, so that no opportunity for Europe in the space launch business is missed because of improper planning or the absence of a long-term goal. The launcher scenario, to be presented in 1985, forms part of Europe's overall STS plan for the future. This overall STS plan is one product of the complete STS LTPP, a first draft of which should exist by 1985 and which will be updated regularly to take into account the changing political and economic perspectives.
2003-01-01
The Portuguese Respiratory Society makes a series of recommendations as to the state of the art of the diagnostic, therapeutic and preventive approach to community-acquired pneumonia in immunocompetent adults in Portugal. These proposals should be regarded as general guidelines and are not intended to replace the clinical sense used in resolving each individual case. Our main goal is to stratify the patients according to the risk of morbidity and mortality in order to justify the following decisions more rationally: the choice of place of treatment (outpatient or inpatient), diagnostic tests and antimicrobial therapy. We also make a set of recommendations for the prevention of CAP. We plan to conduct multi-centre prospective studies, preferably in collaboration with other scientific societies, in order to be able to characterise the situation in Portugal more accurately and regularly update this document.
A World Wide Web selected bibliography for pediatric infectious diseases.
Jenson, H B; Baltimore, R S
1999-02-01
A pediatric infectious diseases bibliography of selected medical reference citations has been developed and placed on the World Wide Web (WWW) at http://www.pedid.uthscsa.edu. A regularly updated bibliography of >2,500 selected literature citations, representing general reviews and key articles, has been organized under a standard outline for individual infectious diseases and related topics that cover the breadth of pediatric infectious diseases. Citations are categorized by infectious disease or clinical syndrome, and access can be achieved by disease, by syndrome, or by the name of the pathogen. Abstracts, and in some cases the complete text of articles, may be viewed by use of hypertext links. The bibliography provides medical students, residents, fellows, and clinicians with a constantly available resource of current literature citations in pediatric infectious diseases. The WWW is an emerging educational and clinical resource for the practice of clinical infectious diseases.
BrucellaBase: Genome information resource.
Sankarasubramanian, Jagadesan; Vishnu, Udayakumar S; Khader, L K M Abdul; Sridhar, Jayavel; Gunasekaran, Paramasamy; Rajendhran, Jeyaprakash
2016-09-01
Brucella sp. causes a major zoonotic disease, brucellosis. Brucella belongs to the family Brucellaceae under the order Rhizobiales of the Alphaproteobacteria. We present BrucellaBase, a web-based platform providing the features of a genome database together with unique analysis tools. We have developed web versions of multilocus sequence typing (MLST) (Whatmore et al., 2007) and phylogenetic analysis for Brucella spp. BrucellaBase currently contains genome data for 510 Brucella strains along with user interfaces for BLAST, VFDB, CARD, pairwise genome alignment and MLST typing. Availability of these tools will enable researchers interested in Brucella to obtain meaningful information from Brucella genome sequences. BrucellaBase will be updated regularly with new genome sequences and new features, along with improvements in genome annotations. BrucellaBase is available online at http://www.dbtbrucellosis.in/brucellabase.html or http://59.99.226.203/brucellabase/homepage.html. Copyright © 2016 Elsevier B.V. All rights reserved.
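At its core, MLST typing assigns a sequence type (ST) by matching an isolate's allele numbers at the typing loci against a profile table. The sketch below illustrates only that lookup step, with generic locus names and invented allele numbers and STs; it is not BrucellaBase's actual implementation.

```python
# Illustrative sketch of the MLST lookup step: an isolate's allele numbers at the
# typing loci are matched against a profile table to assign a sequence type (ST).
# Locus order, allele numbers and STs below are invented for illustration.
PROFILE_TABLE = {
    # (locus1, locus2, locus3, locus4, locus5, locus6, locus7, locus8, locus9): ST
    (1, 2, 1, 1, 1, 5, 1, 1, 1): 8,
    (3, 2, 3, 2, 1, 5, 3, 2, 1): 23,
}

def assign_sequence_type(allele_profile):
    """Return the ST for an exact allele-profile match, or None if the profile is new."""
    return PROFILE_TABLE.get(tuple(allele_profile))

isolate_profile = [3, 2, 3, 2, 1, 5, 3, 2, 1]
print("sequence type:", assign_sequence_type(isolate_profile))
```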
Orbit Determination and Navigation Software Testing for the Mars Reconnaissance Orbiter
NASA Technical Reports Server (NTRS)
Pini, Alex
2011-01-01
During the extended science phase of the Mars Reconnaissance Orbiter's (MRO's) lifecycle, the operational duties pertaining to navigation primarily involve orbit determination. The orbit determination process utilizes radiometric tracking data and is used for the prediction and reconstruction of MRO's trajectories. Predictions are done twice per week for ephemeris updates on board the spacecraft and for planning purposes. Orbit Trim Maneuvers (OTMs) are also designed using the predicted trajectory. Reconstructions, which incorporate a batch estimator, provide precise information about the spacecraft state to be synchronized with scientific measurements. These tasks were conducted regularly to validate the results obtained by the MRO Navigation Team. Additionally, the team is in the process of converting to newer versions of the navigation software and operating system. The capability to model multiple densities in the Martian atmosphere is also being implemented. However, testing outputs among these different configurations was necessary to ensure agreement to a satisfactory degree.
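A batch estimator of the kind mentioned above corrects the spacecraft state by solving a weighted least-squares problem over a whole arc of tracking residuals. The sketch below shows only that normal-equations step with made-up matrices; it is not the MRO navigation software.

```python
# Illustrative sketch of the batch weighted least-squares update at the heart of
# orbit reconstruction: given partials H of the observations with respect to the
# state, weights W, and residuals dy, the correction is dx = (H^T W H)^-1 H^T W dy.
# Matrices below are hypothetical.
import numpy as np

def batch_least_squares(H, W, dy):
    """Solve the normal equations for the state correction."""
    normal = H.T @ W @ H
    rhs = H.T @ W @ dy
    return np.linalg.solve(normal, rhs)

H = np.array([[1.0, 0.5], [0.8, 1.2], [0.3, 0.9]])   # 3 observations, 2 state parameters
W = np.diag([1.0, 2.0, 0.5])                         # observation weights
dy = np.array([0.12, -0.05, 0.03])                   # observed-minus-computed residuals
print("state correction:", batch_least_squares(H, W, dy))
```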
CRISPR-Cas: Adapting to change.
Jackson, Simon A; McKenzie, Rebecca E; Fagerlund, Robert D; Kieper, Sebastian N; Fineran, Peter C; Brouns, Stan J J
2017-04-07
Bacteria and archaea are engaged in a constant arms race to defend against the ever-present threats of viruses and invasion by mobile genetic elements. The most flexible weapons in the prokaryotic defense arsenal are the CRISPR-Cas adaptive immune systems. These systems are capable of selective identification and neutralization of foreign DNA and/or RNA. CRISPR-Cas systems rely on stored genetic memories to facilitate target recognition. Thus, to keep pace with a changing pool of hostile invaders, the CRISPR memory banks must be regularly updated with new information through a process termed CRISPR adaptation. In this Review, we outline the recent advances in our understanding of the molecular mechanisms governing CRISPR adaptation. Specifically, the conserved protein machinery Cas1-Cas2 is the cornerstone of adaptive immunity in a range of diverse CRISPR-Cas systems. Copyright © 2017, American Association for the Advancement of Science.
Update on rheumatology: part 1.
Neal-Boylan, Leslie
2009-05-01
There are many rheumatic diseases. Part 1 of this 2 part series on rheumatology presented a few of those most commonly seen in the community. Home health clinicians can be helpful in managing these diseases and preventing progression by watching for new symptoms or acute attacks of pain or disability, ensuring that patients take their medications appropriately, reminding patients to see their rheumatology providers and have their lab work done regularly, and reporting adverse effects to medications promptly. Additionally, as with most home health patients, an interdisciplinary approach that includes physical and occupational therapy, social work, nursing, nutrition, and other disciplines as needed should be implemented so that all patient needs are met and the patient is discharged at the highest level of self-care that is possible. Part 2 of this series will discuss the care of the patient with rheumatic disease at home and will provide a more in-depth look at lab diagnosis of rheumatic diseases.
Ziese, Thomas; Moebus, Susanne
2017-11-01
Good communication is an essential feature of public health. The existing communication channels from sender to receiver are increasingly supplemented or even replaced by new forms of communication, such as social media, in all areas of life. Public health must adapt to these changes in order to make its concerns and results accessible to different user groups. 1. Many groups of the population (e.g. migrants, the socially disadvantaged) are hard to reach for purposes of communication. Different addressees need different forms of communication, including social media. Appropriate access routes must be identified and used for communication. 2. Strategies must be developed for how public health information can be effectively communicated via social media. The information must be professionally sound, reliable and quality-assured, and regular updating must be ensured. 3. Participation and dialogue are important elements of effective public health communication. © Georg Thieme Verlag KG Stuttgart · New York.
Sequential and parallel image restoration: neural network implementations.
Figueiredo, M T; Leitao, J N
1994-01-01
Sequential and parallel image restoration algorithms and their implementations on neural networks are proposed. For images degraded by linear blur and contaminated by additive white Gaussian noise, maximum a posteriori (MAP) estimation and regularization theory lead to the same high-dimensional convex optimization problem. The commonly adopted strategy (in using neural networks for image restoration) is to map the objective function of the optimization problem into the energy of a predefined network, taking advantage of its energy minimization properties. Departing from this approach, we propose neural implementations of iterative minimization algorithms which are first proved to converge. The developed schemes are based on modified Hopfield (1985) networks of graded elements, with both sequential and parallel updating schedules. An algorithm based on a fully standard Hopfield network (binary elements and zero autoconnections) is also considered. Robustness with respect to finite numerical precision is studied, and examples with real images are presented.
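For images degraded by linear blur and additive Gaussian noise, the MAP/regularization objective referred to above is a convex quadratic of the form ||y - Hx||² + λ||Dx||². The sketch below minimizes a small instance by plain gradient descent, which only illustrates the objective; it is not the Hopfield-network implementation proposed in the paper.

```python
# Illustrative sketch: minimize J(x) = ||y - Hx||^2 + lam * ||Dx||^2 by gradient
# descent on a tiny 1-D example. The blur operator, regularizer and signal are
# hypothetical stand-ins for the image-sized problem.
import numpy as np

def restore(y, H, D, lam=0.1, step=0.05, iters=2000):
    """Gradient descent on the regularized least-squares objective."""
    x = np.zeros(H.shape[1])
    for _ in range(iters):
        grad = 2.0 * (H.T @ (H @ x - y)) + 2.0 * lam * (D.T @ (D @ x))
        x -= step * grad
    return x

n = 16
rng = np.random.default_rng(0)
H = np.eye(n) + 0.3 * np.diag(np.ones(n - 1), -1)   # simple lower-triangular "blur"
D = np.diff(np.eye(n), axis=0)                      # first-difference regularizer
x_true = np.sin(np.linspace(0, 3, n))
y = H @ x_true + 0.01 * rng.standard_normal(n)
x_hat = restore(y, H, D)
print("restoration error:", float(np.linalg.norm(x_hat - x_true)))
```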
FragFit: a web-application for interactive modeling of protein segments into cryo-EM density maps.
Tiemann, Johanna K S; Rose, Alexander S; Ismer, Jochen; Darvish, Mitra D; Hilal, Tarek; Spahn, Christian M T; Hildebrand, Peter W
2018-05-21
Cryo-electron microscopy (cryo-EM) is a standard method for determining the three-dimensional structures of molecular complexes. However, easy-to-use tools for modeling protein segments into cryo-EM maps are scarce. Here, we present the FragFit web-application, a web server for interactive modeling of segments of up to 35 amino acids in length into cryo-EM density maps. The fragments are provided by a regularly updated database containing at the moment about 1 billion entries extracted from PDB structures, and can be readily integrated into a protein structure. Fragments are selected based on geometric criteria, sequence similarity and fit into a given cryo-EM density map. Web-based molecular visualization with the NGL Viewer allows interactive selection of fragments. The FragFit web-application, accessible at http://proteinformatics.de/FragFit, is free and open to all users, without any login requirements.
Evolution and classification of the CRISPR-Cas systems
Makarova, Kira S.; Haft, Daniel H.; Barrangou, Rodolphe; Brouns, Stan J. J.; Charpentier, Emmanuelle; Horvath, Philippe; Moineau, Sylvain; Mojica, Francisco J. M.; Wolf, Yuri I.; Yakunin, Alexander F.; van der Oost, John; Koonin, Eugene V.
2012-01-01
The CRISPR–Cas (clustered regularly interspaced short palindromic repeats–CRISPR-associated proteins) modules are adaptive immunity systems that are present in many archaea and bacteria. These defence systems are encoded by operons that have an extraordinarily diverse architecture and a high rate of evolution for both the cas genes and the unique spacer content. Here, we provide an updated analysis of the evolutionary relationships between CRISPR–Cas systems and Cas proteins. Three major types of CRISPR–Cas system are delineated, with a further division into several subtypes and a few chimeric variants. Given the complexity of the genomic architectures and the extremely dynamic evolution of the CRISPR–Cas systems, a unified classification of these systems should be based on multiple criteria. Accordingly, we propose a `polythetic' classification that integrates the phylogenies of the most common cas genes, the sequence and organization of the CRISPR repeats and the architecture of the CRISPR–cas loci. PMID:21552286
Matching by linear programming and successive convexification.
Jiang, Hao; Drew, Mark S; Li, Ze-Nian
2007-06-01
We present a novel convex programming scheme to solve matching problems, focusing on the challenging problem of matching in a large search range and with cluttered background. Matching is formulated as metric labeling with L1 regularization terms, for which we propose a novel linear programming relaxation method and an efficient successive convexification implementation. The unique feature of the proposed relaxation scheme is that a much smaller set of basis labels is used to represent the original label space. This greatly reduces the size of the searching space. A successive convexification scheme solves the labeling problem in a coarse to fine manner. Importantly, the original cost function is reconvexified at each stage, in the new focus region only, and the focus region is updated so as to refine the searching result. This makes the method well-suited for large label set matching. Experiments demonstrate successful applications of the proposed matching scheme in object detection, motion estimation, and tracking.
Toward Near Real-Time Tomography of the Upper Mantle
NASA Astrophysics Data System (ADS)
Debayle, E.; Dubuffet, F.
2014-12-01
We added a layer of automation to Debayle and Ricard's (2012) waveform modeling scheme for fundamental- and higher-mode surface waves in the period range 50-160 s. We processed all the Rayleigh waveforms recorded on the LHZ channel by the virtual networks GSN_broadband, FDSN_all, and US_backbone between January 1996 and December 2013. Six million waveforms were obtained from the IRIS DMC. We check that all the necessary information (instrument response, global CMT determination) is available and that each record includes a velocity window which encompasses the surface wave. Selected data must also have a signal-to-noise ratio greater than 3 in a range covering at least the periods between 50 and 100 s. About 3 million waveforms are selected (92% of the rejections are due to the signal-to-noise ratio criterion) and processed using Debayle and Ricard's (2012) scheme, which allows the successful modeling of about 1.5 million waveforms. We complete this database with 60,000 waveforms recorded between 1976 and 1996 or after 1996 during various temporary experiments, and with 161,730 Rayleigh waveforms analyzed at longer periods, between 120 and 360 s. The whole data set is inverted using Debayle and Sambridge's (2004) scheme to produce a 3D shear velocity model. A simple shell command, "update_tomo", can then update our seismic model in an entirely automated way. Currently, this command checks the CMT catalog for potential data available at the GSN_broadband, FDSN_all, and US_backbone virtual networks, uses web services to request these data from the IRIS DMC, and applies the processing chain described above to update our seismic model. We plan to update our seismic model on a regular basis in the near future and to make it available on the web. Our most recent seismic model includes azimuthal anisotropy, achieves a lateral resolution of a few hundred kilometers and a vertical resolution of a few tens of kilometers. The correlation with surface tectonics is very strong in the uppermost 200 km. Regions deeper than 400 km show no velocity contrasts larger than 1%, except for high-velocity slabs, which produce broad high-velocity regions within the transition zone. The use of higher modes and long-period surface waves allows us to extract the shear velocity structure down to about 1000 km depth.
Oeffinger, Kevin C; Fontham, Elizabeth T H; Etzioni, Ruth; Herzig, Abbe; Michaelson, James S; Shih, Ya-Chen Tina; Walter, Louise C; Church, Timothy R; Flowers, Christopher R; LaMonte, Samuel J; Wolf, Andrew M D; DeSantis, Carol; Lortet-Tieulent, Joannie; Andrews, Kimberly; Manassaram-Baptiste, Deana; Saslow, Debbie; Smith, Robert A; Brawley, Otis W; Wender, Richard
2015-10-20
Breast cancer is a leading cause of premature mortality among US women. Early detection has been shown to be associated with reduced breast cancer morbidity and mortality. To update the American Cancer Society (ACS) 2003 breast cancer screening guideline for women at average risk for breast cancer. The ACS commissioned a systematic evidence review of the breast cancer screening literature to inform the update and a supplemental analysis of mammography registry data to address questions related to the screening interval. Formulation of recommendations was based on the quality of the evidence and judgment (incorporating values and preferences) about the balance of benefits and harms. Screening mammography in women aged 40 to 69 years is associated with a reduction in breast cancer deaths across a range of study designs, and inferential evidence supports breast cancer screening for women 70 years and older who are in good health. Estimates of the cumulative lifetime risk of false-positive examination results are greater if screening begins at younger ages because of the greater number of mammograms, as well as the higher recall rate in younger women. The quality of the evidence for overdiagnosis is not sufficient to estimate a lifetime risk with confidence. Analysis examining the screening interval demonstrates more favorable tumor characteristics when premenopausal women are screened annually vs biennially. Evidence does not support routine clinical breast examination as a screening method for women at average risk. The ACS recommends that women with an average risk of breast cancer should undergo regular screening mammography starting at age 45 years (strong recommendation). Women aged 45 to 54 years should be screened annually (qualified recommendation). Women 55 years and older should transition to biennial screening or have the opportunity to continue screening annually (qualified recommendation). Women should have the opportunity to begin annual screening between the ages of 40 and 44 years (qualified recommendation). Women should continue screening mammography as long as their overall health is good and they have a life expectancy of 10 years or longer (qualified recommendation). The ACS does not recommend clinical breast examination for breast cancer screening among average-risk women at any age (qualified recommendation). These updated ACS guidelines provide evidence-based recommendations for breast cancer screening for women at average risk of breast cancer. These recommendations should be considered by physicians and women in discussions about breast cancer screening.
Skoeries, B A; Ulbricht, S; Koepsell, S; Rumpf, H-J; John, U; Meyer, C
2010-04-01
The effectiveness of brief interventions for smoking cessation delivered during regular visits to general practitioners (GPs) has been proven. Nevertheless, the guidelines for smoking cessation are currently not implemented sufficiently. A lack of financial resources, time, and counselling skills prevents GPs from offering systematic advice on smoking cessation. This study examines 1) to what extent GPs ask their patients about their smoking habits and to what extent they document this, 2) how willing and 3) how confident GPs are to offer counselling to all smoking patients, and 4) which factors influence their level of confidence. From August 2005 until May 2006, a questionnaire was sent to all 1,247 GPs in Brandenburg. In all, 68 practices were excluded for several reasons (closed practice, death, not providing primary care); a total of 54.0% (n=637) of the GPs took part. 30.0% of the GPs documented the smoking status of their patients during the first consultation. 12.9% had already offered advice to all their smoking patients, while 27.6% were not willing to offer advice to all smoking patients. The average confidence of GPs in offering all smoking patients advice on smoking cessation was 4.1 (SD=2.6) on a scale of 1 to 10 (1=not at all confident and 10=very confident). The confidence of non-smoking GPs in offering advice was higher than that of smoking GPs. To motivate GPs to offer advice on smoking cessation, it seems necessary to change some conditions. This includes programmes, initiated by professional medical associations, to help colleagues stop smoking. Further studies should indicate whether the inclusion of practice colleagues in screening and regular updating of the patient's smoking status increases the frequency of regular counselling. Georg Thieme Verlag KG Stuttgart, New York.
Petzke, F; Brückle, W; Eidmann, U; Heldmann, P; Köllner, V; Kühn, T; Kühn-Becker, H; Strunk-Richter, M; Schiltenwolf, M; Settan, M; von Wachter, M; Weigl, M; Häuser, W
2017-06-01
The regular update of the guidelines on fibromyalgia syndrome, AWMF number 145/004, was scheduled for April 2017. The guidelines were developed by 13 scientific societies and 2 patient self-help organizations coordinated by the German Pain Society. Working groups (n =8) with a total of 42 members were formed balanced with respect to gender, medical expertise, position in the medical or scientific hierarchy and potential conflicts of interest. A search of the literature for systematic reviews on randomized, controlled trials on patient education and shared decision-making from December 2010 to May 2016 was performed in the Cochrane library, MEDLINE, PsycINFO and Scopus databases. Levels of evidence were assigned according to the classification system of the Oxford Centre for Evidence-Based Medicine version 2009. The strength of recommendations was achieved by multiple step formalized procedures to reach a consensus. Efficacy, risks, patient preferences, clinical and practical applicability of available therapies were weighed up against each other. The guidelines were reviewed and approved by the board of directors of the societies engaged in the development of the guidelines. The diagnosis of fibromyalgia syndrome should be explicitly communicated to the affected individual. Shared decision-making with the patient on the therapeutic options based on individual preferences of the patient, comorbidities and the success of previous treatment is recommended. A step-wise treatment approach depending on the severity of fibromyalgia syndrome and the response to therapeutic measures is recommended.
Kristensen, David M.; Wolf, Yuri I.; Koonin, Eugene V.
2017-01-01
The Alignable Tight Genomic Clusters (ATGCs) database is a collection of closely related bacterial and archaeal genomes that provides several tools to aid research into evolutionary processes in the microbial world. Each ATGC is a taxonomy-independent cluster of 2 or more completely sequenced genomes that meet the objective criteria of a high degree of local gene order (synteny) and a small number of synonymous substitutions in the protein-coding genes. As such, each ATGC is suited for analysis of microevolutionary variations within a cohesive group of organisms (e.g. species), whereas the entire collection of ATGCs is useful for macroevolutionary studies. The ATGC database includes many forms of pre-computed data, in particular ATGC-COGs (Clusters of Orthologous Genes), multiple sequence alignments, a set of ‘index’ orthologs representing the most well-conserved members of each ATGC-COG, the phylogenetic tree of the organisms within each ATGC, etc. Although the ATGC database contains several million proteins from thousands of genomes organized into hundreds of clusters (roughly a 4-fold increase since the last version of the ATGC database), it is now built with completely automated methods and will be regularly updated following new releases of the NCBI RefSeq database. The ATGC database is hosted jointly at the University of Iowa at dmk-brain.ecn.uiowa.edu/ATGC/ and the NCBI at ftp.ncbi.nlm.nih.gov/pub/kristensen/ATGC/atgc_home.html. PMID:28053163
Smith, Jeffrey Michael; Gupta, Shivam; Williams, Emma; Brickson, Kate; Ly Sotha, Keth; Tep, Navuth; Calibo, Anthony; Castro, Mary Christine; Marinduque, Bernabe; Hathaway, Mark
2016-12-01
To determine whether a simple quality improvement initiative consisting of a technical update and regular audit and feedback sessions will result in increased use of antenatal corticosteroids among pregnant women at risk of imminent preterm birth delivering at health facilities in the Philippines and Cambodia. Non-randomized, observational study using a pre-/post-intervention design conducted between October 2013 and June 2014. A total of 12 high volume facilities providing Emergency Obstetric and Newborn Care services in Cambodia (6) and Philippines (6). A technical update on preterm birth and use of antenatal corticosteroids, followed by monthly audit and feedback sessions. The proportion of women at risk of imminent preterm birth who received at least one dose of dexamethasone. Coverage of at least one dose of dexamethasone increased from 35% at baseline to 86% at endline in Cambodia (P < 0.0001) and from 34% at baseline to 56% at endline in the Philippines (P < 0.0001), among women who had births at 24-36 weeks. In both settings baseline coverage and magnitude of improvement varied notably by facility. Availability of dexamethasone, knowledge of use and cost were not major barriers to coverage. A simple quality improvement strategy was feasible and effective in increasing use of dexamethasone in the management of preterm birth in 12 hospitals in Cambodia and Philippines. © The Author 2016. Published by Oxford University Press in association with the International Society for Quality in Health Care.
Riegman, Peter H J; de Jong, Bas W D; Llombart-Bosch, Antonio
2010-04-01
Today's translational cancer research increasingly depends on international multi-center studies. Biobanking infrastructure or comprehensive sample exchange platforms to enable networking of clinical cancer biobanks are instrumental to facilitate communication, uniform sample quality, and rules for exchange. The Organization of European Cancer Institutes (OECI) Pathobiology Working Group supports European biobanking infrastructure by maintaining the OECI-TuBaFrost exchange platform and organizing regular meetings. This platform originated from a European Commission project and is updated with knowledge from ongoing and new biobanking projects. This overview describes how European biobanking projects that have a large impact on clinical biobanking, including EuroBoNeT, SPIDIA, and BBMRI, contribute to the update of the OECI-TuBaFrost exchange platform. Combining the results of these European projects enabled the creation of an open (upon valid registration only) catalogue view of cancer biobanks and their available samples to initiate research projects. In addition, closed environments supporting active projects could be developed together with the latest views on quality, access rules, ethics, and law. With these contributions, the OECI Pathobiology Working Group contributes to and stimulates a professional attitude within biobanks at the European comprehensive cancer centers. Improving the fundamentals of cancer sample exchange in Europe stimulates the performance of large multi-center studies, resulting in experiments with the desired statistical significance outcome. With this approach, future innovation in cancer patient care can be realized faster and more reliably.
NASA Astrophysics Data System (ADS)
Gatto, Francesca; Katsanevakis, Stelios; Vandekerkhove, Jochen; Zenetos, Argyro; Cardoso, Ana Cristina
2013-06-01
Europe is severely affected by alien invasions, which impact biodiversity, ecosystem services, economy, and human health. A large number of national, regional, and global online databases provide information on the distribution, pathways of introduction, and impacts of alien species. The sufficiency and efficiency of the current online information systems to assist the European policy on alien species was investigated by a comparative analysis of occurrence data across 43 online databases. Large differences among databases were found which are partially explained by variations in their taxonomical, environmental, and geographical scopes but also by the variable efforts for continuous updates and by inconsistencies on the definition of "alien" or "invasive" species. No single database covered all European environments, countries, and taxonomic groups. In many European countries national databases do not exist, which greatly affects the quality of reported information. To be operational and useful to scientists, managers, and policy makers, online information systems need to be regularly updated through continuous monitoring on a country or regional level. We propose the creation of a network of online interoperable web services through which information in distributed resources can be accessed, aggregated and then used for reporting and further analysis at different geographical and political scales, as an efficient approach to increase the accessibility of information. Harmonization, standardization, conformity on international standards for nomenclature, and agreement on common definitions of alien and invasive species are among the necessary prerequisites.
Autoinflammatory diseases: update on classification diagnosis and management.
Pathak, Shelly; McDermott, Michael F; Savic, Sinisa
2017-01-01
The spectrum of systemic autoinflammatory disorders broadens continually. In part, this is due to the more widespread application of massively parallel sequencing, helping with novel gene discovery in this and other areas of rare diseases. Some of the conditions that have been described fit neatly into a conventional idea of autoinflammation. Others, such as the interferon-mediated autoinflammatory diseases, are broadening the concept of what we consider to be autoinflammatory disorders. There is also a widening of the clinical phenotypes associated with certain genetic mutations, as genetic testing is used more regularly and increasing numbers of patients are screened. It is also increasingly evident that both autoinflammatory and autoimmune problems are frequently seen as complications of primary immunodeficiency disorders. The aim of this review is to provide an update on some recently discovered conditions and to discuss how these disorders help to define the concept of autoinflammation. The review will also cover recent discoveries in the biology of innate-immune-mediated inflammation and describe how this has provided the biological rationale for using anti-interleukin-1 therapies in the treatment of many such conditions. Finally, we discuss the importance of recognising somatic mutations as causes of autoinflammatory clinical phenotypes and provide practical advice on how this could be tackled in everyday clinical practice. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
Expectancy Learning from Probabilistic Input by Infants
Romberg, Alexa R.; Saffran, Jenny R.
2013-01-01
Across the first few years of life, infants readily extract many kinds of regularities from their environment, and this ability is thought to be central to development in a number of domains. Numerous studies have documented infants’ ability to recognize deterministic sequential patterns. However, little is known about the processes infants use to build and update representations of structure in time, and how infants represent patterns that are not completely predictable. The present study investigated how infants’ expectations for a simple structure develop over time, and how infants update their representations with new information. We measured 12-month-old infants’ anticipatory eye movements to targets that appeared in one of two possible locations. During the initial phase of the experiment, infants either saw targets that appeared consistently in the same location (Deterministic condition) or probabilistically in either location, with one side more frequent than the other (Probabilistic condition). After this initial divergent experience, both groups saw the same sequence of trials for the rest of the experiment. The results show that infants readily learn from both deterministic and probabilistic input, with infants in both conditions reliably predicting the most likely target location by the end of the experiment. Local context had a large influence on behavior: infants adjusted their predictions to reflect changes in the target location on the previous trial. This flexibility was particularly evident in infants with more variable prior experience (the Probabilistic condition). The results provide some of the first data showing how infants learn in real time. PMID:23439947
Hansen, Lars Jørgen; Drivsholm, Thomas B
2002-01-28
This review should be cited as: Renders CM, Valk GD, Griffin S. Wagner EH, Eijk JThM van, Assendelft WJJ. Interventions to improve the management of diabetes mellitus in primary care, outpatient and community settings (Cochrane Review). In: The Cochrane Library, Issue 2, 2001. Oxford: Update Software. A substantive amendment to this systematic review was last made on 29 June 2000. Cochrane reviews are regularly checked and updated if necessary. Diabetes is a common chronic disease that is increasingly managed in primary care. Different systems have been proposed to manage diabetes care. To assess the effects of different interventions, targeted at health professionals or the structure in which they deliver care, on the management of patients with diabetes in primary care, outpatient and community settings. We searched the Cochrane Effective Practice and Organisation of Care Group specialised register, the Cochrane Controlled Trials Register (Issue 4 1999), MEDLINE (1966-1999), EMBASE (1980-1999), Cinahl (1982-1999), and reference lists of articles. Randomised trials (RCTs), controlled clinical trials (CCTs), controlled before and after studies (CBAs) and interrupted time series (ITS) analyses of professional, financial and organisational strategies aimed at improving care for people with Type 1 or Type 2 diabetes. The participants were health care professionals, including physicians, nurses and pharmacists. The outcomes included objectively measured health professional performance or patient outcomes, and self-report measures with known validity and reliability. Two reviewers independently extracted data and assessed study quality. Forty-one studies were included involving more than 200 practices and 48,000 patients. Twenty-seven studies were RCTs, 12 were CBAs, and two were ITS. The studies were heterogeneous in terms of interventions, participants, settings and outcomes. The methodological quality of the studies was often poor. In all studies the intervention strategy was multifaceted. In 12 studies the interventions were targeted at health professionals, in nine they were targeted at the organization of care, and 20 studies targeted both. In 15 studies patient education was added to the professional and organisational interventions. A combination of professional interventions improved process outcomes. The effect on patient outcomes remained less clear as these were rarely assessed. Arrangements for follow-up (organisational intervention) also showed a favourable effect on process outcomes. Multiple interventions in which patient education was added or in which the role of the nurse was enhanced also reported favourable effects on patients' health outcomes. REVIEWERS' CONCLUSION: Multifaceted professional interventions can enhance the performance of health professionals in managing patients with diabetes. Organisational interventions that improve regular prompted recall and review of patients (central computerised tracking systems or nurses who regularly contact the patient) can also improve diabetes management. The addition of patient-oriented interventions can lead to improved patient health outcomes. Nurses can play an important role in patient-oriented interventions, through patient education or facilitating adherence to treatment.
Wolf, Andrew M D; Fontham, Elizabeth T H; Church, Timothy R; Flowers, Christopher R; Guerra, Carmen E; LaMonte, Samuel J; Etzioni, Ruth; McKenna, Matthew T; Oeffinger, Kevin C; Shih, Ya-Chen Tina; Walter, Louise C; Andrews, Kimberly S; Brawley, Otis W; Brooks, Durado; Fedewa, Stacey A; Manassaram-Baptiste, Deana; Siegel, Rebecca L; Wender, Richard C; Smith, Robert A
2018-05-30
In the United States, colorectal cancer (CRC) is the fourth most common cancer diagnosed among adults and the second leading cause of death from cancer. For this guideline update, the American Cancer Society (ACS) used an existing systematic evidence review of the CRC screening literature and microsimulation modeling analyses, including a new evaluation of the age to begin screening by race and sex and additional modeling that incorporates changes in US CRC incidence. Screening with any one of multiple options is associated with a significant reduction in CRC incidence through the detection and removal of adenomatous polyps and other precancerous lesions and with a reduction in mortality through incidence reduction and early detection of CRC. Results from modeling analyses identified efficient and model-recommendable strategies that started screening at age 45 years. The ACS Guideline Development Group applied the Grades of Recommendations, Assessment, Development, and Evaluation (GRADE) criteria in developing and rating the recommendations. The ACS recommends that adults aged 45 years and older with an average risk of CRC undergo regular screening with either a high-sensitivity stool-based test or a structural (visual) examination, depending on patient preference and test availability. As a part of the screening process, all positive results on noncolonoscopy screening tests should be followed up with timely colonoscopy. The recommendation to begin screening at age 45 years is a qualified recommendation. The recommendation for regular screening in adults aged 50 years and older is a strong recommendation. The ACS recommends (qualified recommendations) that: 1) average-risk adults in good health with a life expectancy of more than 10 years continue CRC screening through the age of 75 years; 2) clinicians individualize CRC screening decisions for individuals aged 76 through 85 years based on patient preferences, life expectancy, health status, and prior screening history; and 3) clinicians discourage individuals older than 85 years from continuing CRC screening. The options for CRC screening are: fecal immunochemical test annually; high-sensitivity, guaiac-based fecal occult blood test annually; multitarget stool DNA test every 3 years; colonoscopy every 10 years; computed tomography colonography every 5 years; and flexible sigmoidoscopy every 5 years. CA Cancer J Clin 2018;000:000-000. © 2018 American Cancer Society. © 2018 American Cancer Society.
Lewis, Nicola S.; Anderson, Tavis K.; Kitikoon, Pravina; Skepner, Eugene; Burke, David F.
2014-01-01
ABSTRACT Swine influenza A virus is an endemic and economically important pathogen in pigs, with the potential to infect other host species. The hemagglutinin (HA) protein is the primary target of protective immune responses and the major component in swine influenza A vaccines. However, as a result of antigenic drift, vaccine strains must be regularly updated to reflect currently circulating strains. Characterizing the cross-reactivity between strains in pigs and seasonal influenza virus strains in humans is also important in assessing the relative risk of interspecies transmission of viruses from one host population to the other. Hemagglutination inhibition (HI) assay data for swine and human H3N2 viruses were used with antigenic cartography to quantify the antigenic differences among H3N2 viruses isolated from pigs in the United States from 1998 to 2013 and the relative cross-reactivity between these viruses and current human seasonal influenza A virus strains. Two primary antigenic clusters were found circulating in the pig population, but with enough diversity within and between the clusters to suggest updates in vaccine strains are needed. We identified single amino acid substitutions that are likely responsible for antigenic differences between the two primary antigenic clusters and between each antigenic cluster and outliers. The antigenic distance between current seasonal influenza virus H3 strains in humans and those endemic in swine suggests that population immunity may not prevent the introduction of human viruses into pigs, and possibly vice versa, reinforcing the need to monitor and prepare for potential incursions. IMPORTANCE Influenza A virus (IAV) is an important pathogen in pigs and humans. The hemagglutinin (HA) protein is the primary target of protective immune responses and the major target of vaccines. However, vaccine strains must be updated to reflect current strains. Characterizing the differences between seasonal IAV in humans and swine IAV is important in assessing the relative risk of interspecies transmission of viruses. We found two primary antigenic clusters of H3N2 in the U.S. pig population, with enough diversity to suggest updates in swine vaccine strains are needed. We identified changes in the HA protein that are likely responsible for these differences and that may be useful in predicting when vaccines need to be updated. The difference between human H3N2 viruses and those in swine is enough that population immunity is unlikely to prevent new introductions of human IAV into pigs or vice versa, reinforcing the need to monitor and prepare for potential introductions. PMID:24522915
A compressed sensing based 3D resistivity inversion algorithm for hydrogeological applications
NASA Astrophysics Data System (ADS)
Ranjan, Shashi; Kambhammettu, B. V. N. P.; Peddinti, Srinivasa Rao; Adinarayana, J.
2018-04-01
Image reconstruction from discrete electrical responses poses a number of computational and mathematical challenges. Application of smoothness-constrained regularized inversion from limited measurements may fail to detect resistivity anomalies and sharp interfaces separated by hydrostratigraphic units. Under favourable conditions, compressed sensing (CS) can be thought of as an alternative to reconstruct the image features by finding sparse solutions to highly underdetermined linear systems. This paper deals with the development of a CS-assisted, 3-D resistivity inversion algorithm for use by hydrogeologists and groundwater scientists. A CS-based l1-regularized least-squares algorithm was applied to solve the resistivity inversion problem. Sparseness in the model update vector is introduced through block-oriented discrete cosine transformation, with recovery of the signal achieved through convex optimization. The equivalent quadratic program was solved using a primal-dual interior point method. Applicability of the proposed algorithm was demonstrated using synthetic and field examples drawn from hydrogeology. The proposed algorithm outperformed the conventional (smoothness-constrained) least-squares method in recovering the model parameters with much fewer data, yet preserving the sharp resistivity fronts separated by geologic layers. Resistivity anomalies represented by discrete homogeneous blocks embedded in contrasting geologic layers were better imaged using the proposed algorithm. In comparison to the conventional algorithm, CS resulted in an efficient (an increase in R2 from 0.62 to 0.78; a decrease in RMSE from 125.14 Ω-m to 72.46 Ω-m), reliable, and fast-converging (run time decreased by about 25%) solution.
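The abstract above describes an l1-regularized least-squares formulation solved with a primal-dual interior point method. The sketch below is not that solver; it illustrates the same sparse-recovery step with a much simpler proximal-gradient (ISTA) iteration on a toy underdetermined system, with all matrix sizes and parameter values chosen purely for illustration.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of the l1 norm."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, b, lam, n_iter=500):
    """Minimize 0.5*||A x - b||^2 + lam*||x||_1 by proximal gradient (ISTA)."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the data-fit gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Toy underdetermined problem: 40 measurements, 200 unknowns, 5 nonzeros
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 200))
x_true = np.zeros(200)
x_true[rng.choice(200, 5, replace=False)] = rng.standard_normal(5)
b = A @ x_true
x_hat = ista(A, b, lam=0.1)
print("nonzeros recovered:", np.sum(np.abs(x_hat) > 1e-3))
```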
Schizophrenia—Time to Commit to Policy Change
Fleischhacker, W. Wolfgang
2014-01-01
Care and outcomes for people with schizophrenia have improved in recent years, but further progress is needed to help more individuals achieve an independent and fulfilled life. This report sets out the current need, informs policy makers and all relevant stakeholders who influence care quality, and supports their commitment to creating a better future. The authors recommend the following policy actions, based on research evidence, stakeholder consultation, and examples of best practice worldwide. (1) Provide an evidence-based, integrated care package for people with schizophrenia that addresses their mental and physical health needs. (2) Provide support for people with schizophrenia to enter and to remain in their community, and develop mechanisms to help guide them through the complex benefit and employment systems. (3) Provide concrete support, information, and educational programs to families and carers on how to enhance care for an individual living with schizophrenia in a manner that entails minimal disruption to their lives. (4) All stakeholders, including organizations that support people living with schizophrenia, should be consulted to regularly revise, update, and improve policy on the management of schizophrenia. (5) Provide support, which is proportionate to the impact of the disease, for research and development of new treatments. (6) Establish adequately funded, ongoing, and regular awareness-raising campaigns that form an integral part of routine plans of action. Implementation of the above recommendations will require engagement by every stakeholder, but with commitment from all, change can be achieved. PMID:24778411
Woods, Robert A; Artz, Jennifer D; Carrière, Benoit; Field, Simon; Huffman, James; Dong, Sandy L; Bhanji, Farhan; Yiu, Stella; Smith, Sheila; Mengual, Rose; Hicks, Chris; Frank, Jason
2017-05-01
To develop consensus recommendations for training future clinician educators (CEs) in emergency medicine (EM). A panel of EM education leaders was assembled from across Canada and met regularly by teleconference over the course of 1 year. Recommendations for CE training were drafted based on the panel's experience, a literature review, and a survey of current and past EM education leaders in Canada. Feedback was sought from attendees at the Canadian Association of Emergency Physicians (CAEP) annual academic symposium. Recommendations were distributed to the society's Academic Section for further feedback and updated by a consensus of the expert panel. Recommendations were categorized for one of three audiences: 1) Future CEs; 2) Academic departments and divisions (AD&D) that support training to fulfill their education leadership goals; and 3) The CAEP Academic Section. Advanced medical education training is recommended for any emergency physician or resident who pursues an education leadership role. Individuals should seek out mentorship in making decisions about career opportunities and training options. AD&D should regularly perform a needs assessment of their future CE needs and identify and encourage potential individuals who fulfill education leadership roles. AD&D should develop training opportunities at their institution, provide support to complete this training, and advocate for the recognition of education scholarship in their institutional promotions process. The CAEP Academic Section should support mentorship of future CEs on a national scale. These recommendations serve as a framework for training and supporting the next generation of Canadian EM medical educators.
Recursive regularization for inferring gene networks from time-course gene expression profiles
Shimamura, Teppei; Imoto, Seiya; Yamaguchi, Rui; Fujita, André; Nagasaki, Masao; Miyano, Satoru
2009-01-01
Background Inferring gene networks from time-course microarray experiments with vector autoregressive (VAR) model is the process of identifying functional associations between genes through multivariate time series. This problem can be cast as a variable selection problem in Statistics. One of the promising methods for variable selection is the elastic net proposed by Zou and Hastie (2005). However, VAR modeling with the elastic net succeeds in increasing the number of true positives while it also results in increasing the number of false positives. Results By incorporating relative importance of the VAR coefficients into the elastic net, we propose a new class of regularization, called recursive elastic net, to increase the capability of the elastic net and estimate gene networks based on the VAR model. The recursive elastic net can reduce the number of false positives gradually by updating the importance. Numerical simulations and comparisons demonstrate that the proposed method succeeds in reducing the number of false positives drastically while keeping the high number of true positives in the network inference and achieves two or more times higher true discovery rate (the proportion of true positives among the selected edges) than the competing methods even when the number of time points is small. We also compared our method with various reverse-engineering algorithms on experimental data of MCF-7 breast cancer cells stimulated with two ErbB ligands, EGF and HRG. Conclusion The recursive elastic net is a powerful tool for inferring gene networks from time-course gene expression profiles. PMID:19386091
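As a rough illustration of the idea of reweighting an elastic net by the relative importance of the VAR coefficients, the sketch below fits a VAR(1) model gene-by-gene with scikit-learn's ElasticNet and recursively rescales the predictors by the current coefficient magnitudes. This is a simplified stand-in, not the authors' exact recursive elastic net; the toy data, penalty values, and number of reweighting rounds are arbitrary.

```python
import numpy as np
from sklearn.linear_model import ElasticNet

def reweighted_elastic_net_var(X, n_rounds=3, alpha=0.05, l1_ratio=0.5, eps=1e-6):
    """Estimate VAR(1) coefficients gene-by-gene with an elastic net whose
    penalties are recursively reweighted by the current coefficient magnitudes."""
    T, p = X.shape
    X_past, X_now = X[:-1], X[1:]           # predictors: expression at time t-1
    B = np.zeros((p, p))                     # B[i, j]: effect of gene j on gene i
    for i in range(p):
        w = np.ones(p)                       # importance weights, start uniform
        for _ in range(n_rounds):
            Xw = X_past * w                  # rescale predictors by importance
            model = ElasticNet(alpha=alpha, l1_ratio=l1_ratio, max_iter=10000)
            model.fit(Xw, X_now[:, i])
            beta = model.coef_ * w           # map coefficients back to original scale
            w = np.abs(beta) + eps           # update importance for the next round
        B[i] = beta
    return B

rng = np.random.default_rng(1)
expr = rng.standard_normal((30, 20))         # 30 time points, 20 genes (toy data)
B_hat = reweighted_elastic_net_var(expr)
print("edges selected:", np.sum(np.abs(B_hat) > 1e-3))
```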
Wireless Relay Selection in Pocket Switched Networks Based on Spatial Regularity of Human Mobility
Huang, Jianhui; Cheng, Xiuzhen; Bi, Jingping; Chen, Biao
2016-01-01
Pocket switched networks (PSNs) take advantage of human mobility to deliver data. Investigations on real-world trace data indicate that human mobility shows an obvious spatial regularity: a human being usually visits a few places at high frequencies. These most frequently visited places form the home of a node, which is exploited in this paper to design two HomE based Relay selectiOn (HERO) algorithms. Both algorithms input single data copy into the network at any time. In the basic HERO, only the first node encountered by the source and whose home overlaps a destination’s home is selected as a relay while the enhanced HERO keeps finding more optimal relay that visits the destination’s home with higher probability. The two proposed algorithms only require the relays to exchange the information of their home and/or the visiting frequencies to their home when two nodes meet. As a result, the information update is reduced and there is no global status information that needs to be maintained. This causes light loads on relays because of the low communication cost and storage requirements. Additionally, only simple operations are needed in the two proposed algorithms, resulting in little computation overhead at relays. At last, a theoretical analysis is performed on some key metrics and then the real-world based simulations indicate that the two HERO algorithms are efficient and effective through employing only one or a few relays. PMID:26797609
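A minimal sketch of the home-based selection rules, assuming a toy representation in which each node carries a list of visited places; the basic rule picks the first encountered node whose home overlaps the destination's home, while the enhanced rule keeps switching to a relay that visits the destination's home more frequently. The names and the home size k are illustrative.

```python
from collections import Counter

def home(visits, k=3):
    """A node's 'home': its k most frequently visited places."""
    return set(p for p, _ in Counter(visits).most_common(k))

def home_visit_freq(visits, dest_home):
    """Fraction of a node's visits that fall inside the destination's home."""
    return sum(1 for p in visits if p in dest_home) / max(len(visits), 1)

def basic_hero(encounters, dest_home):
    """Basic HERO: hand the single data copy to the first encountered node
    whose home overlaps the destination's home."""
    for node, visits in encounters:
        if home(visits) & dest_home:
            return node
    return None

def enhanced_hero(encounters, dest_home):
    """Enhanced HERO: keep switching to relays that visit the destination's
    home more frequently than the current relay."""
    relay, best = None, 0.0
    for node, visits in encounters:
        f = home_visit_freq(visits, dest_home)
        if f > best:
            relay, best = node, f
    return relay

# Toy trace: each encountered node carries its own visit history
dest_home = home(["cafe", "office", "cafe", "gym", "office"])
encounters = [("a", ["mall", "park", "mall"]),
              ("b", ["office", "park", "office", "cafe"]),
              ("c", ["cafe", "cafe", "office", "office"])]
print(basic_hero(encounters, dest_home), enhanced_hero(encounters, dest_home))
```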
Self-prior strategy for organ reconstruction in fluorescence molecular tomography
Zhou, Yuan; Chen, Maomao; Su, Han; Luo, Jianwen
2017-01-01
The purpose of this study is to propose a strategy for organ reconstruction in fluorescence molecular tomography (FMT) without prior information from other imaging modalities, and to overcome the high cost and ionizing radiation caused by the traditional structural prior strategy. The proposed strategy is designed as an iterative architecture to solve the inverse problem of FMT. In each iteration, a short time Fourier transform (STFT) based algorithm is used to extract the self-prior information in the space-frequency energy spectrum with the assumption that the regions with higher fluorescence concentration have larger energy intensity, then the cost function of the inverse problem is modified by the self-prior information, and lastly an iterative Laplacian regularization algorithm is conducted to solve the updated inverse problem and obtains the reconstruction results. Simulations and in vivo experiments on liver reconstruction are carried out to test the performance of the self-prior strategy on organ reconstruction. The organ reconstruction results obtained by the proposed self-prior strategy are closer to the ground truth than those obtained by the iterative Tikhonov regularization (ITKR) method (traditional non-prior strategy). Significant improvements are shown in the evaluation indexes of relative locational error (RLE), relative error (RE) and contrast-to-noise ratio (CNR). The self-prior strategy improves the organ reconstruction results compared with the non-prior strategy and also overcomes the shortcomings of the traditional structural prior strategy. Various applications such as metabolic imaging and pharmacokinetic study can be aided by this strategy. PMID:29082094
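The sketch below mimics the overall iterative architecture (extract a prior from the current estimate, reweight the regularizer, re-solve) on a 1-D toy problem. A simple magnitude-based weighting stands in for the paper's STFT-based space-frequency energy analysis, and plain normal equations replace the FMT forward model, so this is only a schematic of the self-prior loop, not the authors' reconstruction method.

```python
import numpy as np

def weighted_laplacian_inverse(A, b, n_outer=5, lam=1.0, eps=1e-3):
    """Generic 'self-prior' loop: solve a regularized least-squares problem,
    derive weights from the current estimate (regions with more energy are
    smoothed less), and re-solve with the reweighted regularizer."""
    n = A.shape[1]
    L = np.eye(n) - np.diag(np.ones(n - 1), 1)   # 1-D first-difference operator
    x = np.zeros(n)
    for _ in range(n_outer):
        w = 1.0 / (np.abs(x) + eps)              # self-prior: weaken smoothing where x is large
        w /= w.max()
        Lw = np.diag(w) @ L                      # reweighted regularization operator
        # normal equations of 0.5*||A x - b||^2 + 0.5*lam*||Lw x||^2
        x = np.linalg.solve(A.T @ A + lam * Lw.T @ Lw, A.T @ b)
    return x

rng = np.random.default_rng(2)
A = rng.standard_normal((60, 100))
x_true = np.zeros(100)
x_true[40:55] = 1.0                              # a single bright 'organ'
b = A @ x_true + 0.01 * rng.standard_normal(60)
print(np.round(weighted_laplacian_inverse(A, b)[35:60], 2))
```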
NASA Astrophysics Data System (ADS)
Pathak, Sayan D.; Haynor, David R.; Thompson, Carol L.; Lein, Ed; Hawrylycz, Michael
2009-02-01
Understanding the geography of genetic expression in the mouse brain has opened previously unexplored avenues in neuroinformatics. The Allen Brain Atlas (www.brain-map.org) (ABA) provides genome-wide colorimetric in situ hybridization (ISH) gene expression images at high spatial resolution, all mapped to a common three-dimensional 200μm3 spatial framework defined by the Allen Reference Atlas (ARA) and is a unique data set for studying expression based structural and functional organization of the brain. The goal of this study was to facilitate an unbiased data-driven structural partitioning of the major structures in the mouse brain. We have developed an algorithm that uses nonnegative matrix factorization (NMF) to perform parts based analysis of ISH gene expression images. The standard NMF approach and its variants are limited in their ability to flexibly integrate prior knowledge, in the context of spatial data. In this paper, we introduce spatial connectivity as an additional regularization in NMF decomposition via the use of Markov Random Fields (mNMF). The mNMF algorithm alternates neighborhood updates with iterations of the standard NMF algorithm to exploit spatial correlations in the data. We present the algorithm and show the sub-divisions of hippocampus and somatosensory-cortex obtained via this approach. The results are compared with established neuroanatomic knowledge. We also highlight novel gene expression based sub divisions of the hippocampus identified by using the mNMF algorithm.
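As a crude stand-in for the Markov-random-field-regularized NMF (mNMF), the sketch below alternates standard multiplicative NMF updates with a simple 4-neighbor averaging of the spatial coefficient matrix on a 2-D voxel grid. The toy data, rank, and blending weight are arbitrary, and the periodic (wrap-around) neighborhood is a simplification of the paper's neighborhood update.

```python
import numpy as np

def mnmf_like(V, grid_shape, rank=4, n_iter=100, alpha=0.2, eps=1e-9):
    """Parts-based decomposition V ~ W @ H (genes x voxels) where each
    multiplicative NMF update is alternated with a spatial smoothing of H
    over the voxel grid, a crude stand-in for the MRF neighborhood update."""
    rng = np.random.default_rng(0)
    n_genes, n_vox = V.shape
    W = rng.random((n_genes, rank)) + eps
    H = rng.random((rank, n_vox)) + eps
    for _ in range(n_iter):
        # standard multiplicative updates for the Frobenius objective
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
        # neighborhood update: blend each voxel's coefficients with its
        # 4-neighbor average on the 2-D grid to encourage spatial coherence
        Hg = H.reshape(rank, *grid_shape)
        nb = (np.roll(Hg, 1, 1) + np.roll(Hg, -1, 1) +
              np.roll(Hg, 1, 2) + np.roll(Hg, -1, 2)) / 4.0
        H = ((1 - alpha) * Hg + alpha * nb).reshape(rank, n_vox)
    return W, H

# Toy 'expression volume': 50 genes over a 20x20 voxel grid
rng = np.random.default_rng(3)
V = np.abs(rng.standard_normal((50, 400)))
W, H = mnmf_like(V, grid_shape=(20, 20))
labels = H.argmax(axis=0).reshape(20, 20)   # data-driven spatial partition
print(np.unique(labels))
```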
Optical engineering capstone design projects with industry sponsors
NASA Astrophysics Data System (ADS)
Bunch, Robert M.; Leisher, Paul O.; Granieri, Sergio C.
2014-09-01
Capstone senior design is the culmination of a student's undergraduate engineering education that prepares them for engineering practice. In fact, any engineering degree program that pursues accreditation by the Engineering Accreditation Commission of ABET must contain "a major design experience based on the knowledge and skills acquired in earlier course work and incorporating appropriate engineering standards and multiple realistic constraints." At Rose-Hulman, we offer an interdisciplinary Optical Engineering / Engineering Physics senior design curriculum that meets this requirement. Part of this curriculum is a two-course sequence where students work in teams on a design project leading to a functional prototype. The students begin work on their capstone project during the first week of their senior year. The courses are deliverable-driven and the students are held accountable for regular technical progress through weekly updates with their faculty advisor and mid-term design reviews. We have found that client-sponsored projects offer students an enriched engineering design experience as it ensures consideration of constraints and standards requirements similar to those that they will encounter as working engineers. Further, client-sponsored projects provide teams with an opportunity for regular customer interactions which help shape the product design. The process that we follow in both soliciting and helping to scope appropriate industry-related design projects will be described. In addition, an outline of the capstone course structure as well as methods used to hold teams accountable for technical milestones will be discussed. Illustrative examples of past projects will be provided.
Fei, Juntao; Lu, Cheng
2018-04-01
In this paper, an adaptive sliding mode control system using a double loop recurrent neural network (DLRNN) structure is proposed for a class of nonlinear dynamic systems. A new three-layer RNN is proposed to approximate unknown dynamics with two different kinds of feedback loops, where the firing weights and the output signal calculated in the last step are stored and used as the feedback signals in each feedback loop. Since the new structure combines the advantages of internal-feedback and external-feedback NNs, it can acquire the internal state information while the output signal is also captured; thus the newly designed DLRNN can achieve better approximation performance than regular NNs without feedback loops or regular RNNs with a single feedback loop. The proposed DLRNN structure is employed in an equivalent controller to approximate the unknown nonlinear system dynamics, and the parameters of the DLRNN are updated online by adaptive laws to obtain favorable approximation performance. To investigate the effectiveness of the proposed controller, the designed adaptive sliding mode controller with the DLRNN is applied to a z-axis microelectromechanical system gyroscope to control the vibrating dynamics of the proof mass. Simulation results demonstrate that the proposed methodology can achieve good tracking performance, and comparisons of the approximation performance between a radial basis function NN, an RNN, and the DLRNN show that the DLRNN can accurately estimate the unknown dynamics with a fast speed while the internal states of the DLRNN are more stable.
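A minimal numpy sketch of the double-feedback idea: the hidden layer receives the current input, the previous hidden state (internal feedback), and the previous output (external feedback). It omits the paper's firing-weight storage and the adaptive update laws, and all weights here are random placeholders.

```python
import numpy as np

def dlrnn_step(x, h_prev, y_prev, Wx, Wh, Wy, Wo, bh, bo):
    """One step of a double-feedback recurrent approximator: the hidden layer
    sees the current input, the previous hidden state (internal feedback) and
    the previous output (external feedback)."""
    h = np.tanh(Wx @ x + Wh @ h_prev + Wy @ y_prev + bh)
    y = Wo @ h + bo
    return h, y

rng = np.random.default_rng(4)
n_in, n_hid, n_out = 2, 8, 1
Wx = rng.standard_normal((n_hid, n_in))
Wh = rng.standard_normal((n_hid, n_hid))
Wy = rng.standard_normal((n_hid, n_out))
Wo = rng.standard_normal((n_out, n_hid))
bh, bo = np.zeros(n_hid), np.zeros(n_out)
h, y = np.zeros(n_hid), np.zeros(n_out)
for t in range(100):                      # run the network over a toy input sequence
    x = np.array([np.sin(0.1 * t), np.cos(0.1 * t)])
    h, y = dlrnn_step(x, h, y, Wx, Wh, Wy, Wo, bh, bo)
print(y)
```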
Spatio-temporal imaging of the hemoglobin in the compressed breast with diffuse optical tomography
NASA Astrophysics Data System (ADS)
Boverman, Gregory; Fang, Qianqian; Carp, Stefan A.; Miller, Eric L.; Brooks, Dana H.; Selb, Juliette; Moore, Richard H.; Kopans, Daniel B.; Boas, David A.
2007-07-01
We develop algorithms for imaging the time-varying optical absorption within the breast given diffuse optical tomographic data collected over a time span that is long compared to the dynamics of the medium. Multispectral measurements allow for the determination of the time-varying total hemoglobin concentration and of oxygen saturation. To facilitate the image reconstruction, we decompose the hemodynamics in time into a linear combination of spatio-temporal basis functions, the coefficients of which are estimated using all of the data simultaneously, making use of a Newton-based nonlinear optimization algorithm. The solution of the extremely large least-squares problem which arises in computing the Newton update is obtained iteratively using the LSQR algorithm. A Laplacian spatial regularization operator is applied, and, in addition, we make use of temporal regularization which tends to encourage similarity between the images of the spatio-temporal coefficients. Results are shown for an extensive simulation, in which we are able to image and quantify localized changes in both total hemoglobin concentration and oxygen saturation. Finally, a breast compression study has been performed for a normal breast cancer screening subject, using an instrument which allows for highly accurate co-registration of multispectral diffuse optical measurements with an x-ray tomosynthesis image of the breast. We are able to quantify the global return of blood to the breast following compression, and, in addition, localized changes are observed which correspond to the glandular region of the breast.
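The core numerical step, solving a regularized Newton update with LSQR, can be sketched by stacking the Jacobian with a scaled regularization operator and calling SciPy's lsqr on the augmented system. The matrices below are random toys, and a 1-D Laplacian replaces the spatial and temporal operators, so this only illustrates the solver pattern, not the diffuse optical forward model.

```python
import numpy as np
from scipy.sparse import diags, vstack, csr_matrix
from scipy.sparse.linalg import lsqr

def regularized_update(J, residual, L, lam):
    """Solve the damped least-squares problem
        min ||J dx - r||^2 + lam * ||L dx||^2
    by stacking the operators and calling LSQR on the augmented system."""
    A = vstack([csr_matrix(J), np.sqrt(lam) * L])
    b = np.concatenate([residual, np.zeros(L.shape[0])])
    return lsqr(A, b)[0]

# Toy sizes: 300 measurements, 500 unknown spatio-temporal coefficients
rng = np.random.default_rng(5)
J = rng.standard_normal((300, 500))
r = rng.standard_normal(300)
n = 500
L = diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n)).tocsr()  # 1-D Laplacian
dx = regularized_update(J, r, L, lam=10.0)
print(dx.shape)
```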
A Stochastic Model for Detecting Overlapping and Hierarchical Community Structure
Cao, Xiaochun; Wang, Xiao; Jin, Di; Guo, Xiaojie; Tang, Xianchao
2015-01-01
Community detection is a fundamental problem in the analysis of complex networks. Recently, many researchers have concentrated on the detection of overlapping communities, where a vertex may belong to more than one community. However, most current methods require the number (or the size) of the communities as a priori information, which is usually unavailable in real-world networks. Thus, a practical algorithm should not only find the overlapping community structure, but also automatically determine the number of communities. Furthermore, it is preferable if this method is able to reveal the hierarchical structure of networks as well. In this work, we first propose a generative model that employs a nonnegative matrix factorization (NMF) formulation with an l2,1-norm regularization term, balanced by a resolution parameter. The NMF naturally provides overlapping community structure by assigning soft membership variables to each vertex; the l2,1 regularization term is a group-sparsity technique that can automatically determine the number of communities by penalizing too many nonempty communities; and the resolution parameter enables us to explore the hierarchical structure of networks. Thereafter, we derive the multiplicative update rule to learn the model parameters, and offer the proof of its correctness. Finally, we test our approach on a variety of synthetic and real-world networks, and compare it with some state-of-the-art algorithms. The results validate the superior performance of our new method. PMID:25822148
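A small sketch of how the l2,1 group-sparsity term lets the effective number of communities be read off the fitted membership matrix: rows (candidate communities) whose l2 norm is driven to zero are treated as empty. This omits the generative model and the multiplicative updates; the membership matrix below is hand-made for illustration.

```python
import numpy as np

def l21_norm(H):
    """l2,1 norm of a community-by-vertex membership matrix:
    sum over communities of the l2 norm of that community's memberships."""
    return np.sum(np.linalg.norm(H, axis=1))

def effective_communities(H, tol=1e-3):
    """Communities whose membership row was not driven (near) to zero."""
    return int(np.sum(np.linalg.norm(H, axis=1) > tol))

# Toy membership matrix: 6 candidate communities, only 3 survive the penalty
H = np.zeros((6, 10))
H[0, :4] = 0.9      # overlapping soft memberships
H[1, 3:7] = 0.8
H[2, 6:] = 0.7
print(l21_norm(H), effective_communities(H))
```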
NASA Astrophysics Data System (ADS)
Ajo Franklin, J. B.; Lindsey, N.; Dou, S.; Freifeld, B. M.; Daley, T. M.; Tracy, C.; Monga, I.
2017-12-01
"Dark Fiber" refers to the large number of fiber-optic lines installed for telecommunication purposes but not currently utilized. With the advent of distributed acoustic sensing (DAS), these unused fibers have the potential to become a seismic sensing network with unparalleled spatial extent and density with applications to monitoring both natural seismicity as well as near-surface soil properties. While the utility of DAS for seismic monitoring has now been conclusively shown on built-for-purpose networks, dark fiber deployments have been challenged by the heterogeneity of fiber installation procedures in telecommunication as well as access limitations. However, the potential of telecom networks to augment existing broadband monitoring stations provides a strong incentive to explore their utilization. We present preliminary results demonstrating the application of DAS to seismic monitoring on a 20 km run of "dark" telecommunications fiber between West Sacramento, CA and Woodland CA, part of the Dark Fiber Testbed maintained by the DOE's ESnet user facility. We show a small catalog of local and regional earthquakes detected by the array and evaluate fiber coupling by using variations in recorded frequency content. Considering the low density of broadband stations across much of the Sacramento Basin, such DAS recordings could provide a crucial data source to constrain small-magnitude local events. We also demonstrate the application of ambient noise interferometry using DAS-recorded waveforms to estimate soil properties under selected sections of the dark fiber transect; the success of this test suggests that the network could be utilized for environmental monitoring at the basin scale. The combination of these two examples demonstrates the exciting potential for combining DAS with ubiquitous dark fiber to greatly extend the reach of existing seismic monitoring networks.
Ranking 93 health interventions for low- and middle-income countries by cost-effectiveness
Gelband, Hellen; Jamison, Dean; Levin, Carol; Nugent, Rachel; Watkins, David
2017-01-01
Background Cost-effectiveness rankings of health interventions are useful inputs for national healthcare planning and budgeting. Previous comprehensive rankings for low- and middle- income countries were undertaken in 2005 and 2006, accompanying the development of strategies for the Millennium Development Goals. We update the rankings using studies published since 2000, as strategies are being considered for the Sustainable Development Goals. Methods Expert systematic searches of the literature were undertaken for a broad range of health interventions. Cost-effectiveness results using Disability Adjusted Life-Years (DALYs) as the health outcome were standardized to 2012 US dollars. Results 149 individual studies of 93 interventions qualified for inclusion. Interventions for Reproductive, Maternal, Newborn and Child Health accounted for 37% of interventions, and major infectious diseases (AIDS, TB, malaria and neglected tropical diseases) for 24%, consistent with the priorities of the Millennium Development Goals. More than half of the interventions considered cost less than $200 per DALY and hence can be considered for inclusion in Universal Health Care packages even in low-income countries. Discussion Important changes have occurred in rankings since 2006. Priorities have changed as a result of new technologies, new methods for changing behavior, and significant price changes for some vaccines and drugs. Achieving the Sustainable Development Goals will require LMICs to study a broader range of health interventions, particularly in adult health. Some interventions are no longer studied, in some cases because they have become usual care, in other cases because they are no longer relevant. Updating cost-effectiveness rankings on a regular basis is potentially a valuable exercise. PMID:28797115
Management of scars: updated practical guidelines and use of silicones.
Meaume, Sylvie; Le Pillouer-Prost, Anne; Richert, Bertrand; Roseeuw, Diane; Vadoud, Javid
2014-01-01
Hypertrophic scars and keloids resulting from surgery, burns, trauma and infection can be associated with substantial physical and psychological distress. Various non-invasive and invasive options are currently available for the prevention and treatment of these scars. Recently, an international multidisciplinary group of 24 experts on scar management (dermatologists; plastic and reconstructive surgeons; general surgeons; physical medicine, rehabilitation and burns specialists; psychosocial and behavioural researchers; epidemiologists; beauticians) convened to update a set of practical guidelines for the prevention and treatment of hypertrophic and keloid scars on the basis of the latest published clinical evidence on existing scar management options. Silicone-based products such as sheets and gels are recommended as the gold standard, first-line, non-invasive option for both the prevention and treatment of scars. Other general scar preventative measures include avoiding sun exposure, compression therapy, taping and the use of moisturisers. Invasive treatment options include intralesional injections of corticosteroids and/or 5-fluorouracil, cryotherapy, radiotherapy, laser therapy and surgical excision. All of these options may be used alone or as part of combination therapy. Of utmost importance is the regular re-evaluation of patients every four to eight weeks to evaluate whether additional treatment is warranted. The amount of scar management measures that are applied to each wound depends on the patient's risk of developing a scar and their level of concern about the scar's appearance. The practical advice presented in the current guidelines should be combined with clinical judgement when deciding on the most appropriate scar management measures for an individual patient.
Claassens, Lily; van Meerbeeck, Jan; Coens, Corneel; Quinten, Chantal; Ghislain, Irina; Sloan, Elizabeth K.; Wang, Xin Shelly; Velikova, Galina; Bottomley, Andrew
2011-01-01
Purpose This study is an update of a systematic review of health-related quality-of-life (HRQOL) methodology reporting in non–small-cell lung cancer (NSCLC) randomized controlled trials (RCTs). The objective was to evaluate HRQOL methodology reporting over the last decade and its benefit for clinical decision making. Methods A MEDLINE systematic literature review was performed. Eligible RCTs implemented patient-reported HRQOL assessments and regular oncology treatments for newly diagnosed adult patients with NSCLC. Included studies were published in English from August 2002 to July 2010. Two independent reviewers evaluated all included RCTs. Results Fifty-three RCTs were assessed. Of the 53 RCTs, 81% reported that there was no significant difference in overall survival (OS). However, 50% of RCTs that were unable to find OS differences reported a significant difference in HRQOL scores. The quality of HRQOL reporting has improved; both reporting of clinically significant differences and statistical testing of HRQOL have improved. A European Organisation for Research and Treatment of Cancer HRQOL questionnaire was used in 57% of the studies. However, reporting of HRQOL hypotheses and rationales for choosing HRQOL instruments were significantly less than before 2002 (P < .05). Conclusion The number of NSCLC RCTs incorporating HRQOL assessments has considerably increased. HRQOL continues to demonstrate its importance in RCTs, especially in those studies in which no OS difference is found. Despite the improved quality of HRQOL methodology reporting, certain aspects remain underrepresented. Our findings suggest need for an international standardization of HRQOL reporting similar to the CONSORT guidelines for clinical findings. PMID:21464420
Pandemic influenza A (H1N1) 2009 vaccine: an update.
Goel, M K; Goel, M; Khanna, P; Mittal, K
2011-01-01
The world witnessed the first influenza pandemic of this century, and the fourth overall since the first flu pandemic was reported during World War I. Past experience with influenza viruses and this H1N1 pandemic placed a considerable strain on health services and resulted in serious illnesses and a large number of deaths. Developing countries were considered more likely to be at risk from the pandemic's effects, as they faced the dual problem of highly vulnerable populations and limited resources with which to respond to H1N1. Public health experts agreed that vaccination is the most effective way to mitigate the negative effects of the pandemic. The vaccines for the H1N1 virus have been used in over 40 countries and administered to over 200 million people, helping greatly, and on August 10, 2010, the World Health Organization (WHO) announced that H1N1 had entered the post-pandemic period. However, based on knowledge about past pandemics, the H1N1 (2009) virus is expected to continue to circulate as a seasonal virus and may undergo some antigenic variation. As WHO strongly recommends vaccination, vigilance is necessary for regular updating of the composition of influenza vaccines, based on an assessment of the future impact of circulating viruses, along with safety surveillance of the vaccines. This review takes stock of the currently available H1N1 vaccines and their possible use as a public health intervention in the post-pandemic period.
Planned and ongoing projects (pop) database: development and results.
Wild, Claudia; Erdös, Judit; Warmuth, Marisa; Hinterreiter, Gerda; Krämer, Peter; Chalon, Patrice
2014-11-01
The aim of this study was to present the development, structure and results of a database on planned and ongoing health technology assessment (HTA) projects (POP Database) in Europe. The POP Database (POP DB) was set up in an iterative process from a basic Excel sheet to a multifunctional electronic online database. The functionalities, such as the search terminology, the procedures to fill and update the database, the access rules to enter the database, as well as the maintenance roles, were defined in a multistep participatory feedback loop with EUnetHTA Partners. The POP Database has become an online database that hosts not only the titles and MeSH categorizations, but also some basic information on status and contact details about the listed projects of EUnetHTA Partners. Currently, it stores more than 1,200 planned, ongoing or recently published projects of forty-three EUnetHTA Partners from twenty-four countries. Because the POP Database aims to facilitate collaboration, it also provides a matching system to assist in identifying similar projects. Overall, more than 10 percent of the projects in the database are identical both in terms of pathology (indication or disease) and technology (drug, medical device, intervention). In addition, approximately 30 percent of the projects are similar, meaning that they have at least some overlap in content. Although the POP DB is successful concerning regular updates of most national HTA agencies within EUnetHTA, little is known about its actual effects on collaborations in Europe. Moreover, many non-nationally nominated HTA producing agencies neither have access to the POP DB nor can share their projects.
Emerging collective behavior and local properties of financial dynamics in a public investment game
NASA Astrophysics Data System (ADS)
da Silva, Roberto; Bazzan, Ana L. C.; Baraviera, Alexandre T.; Dahmen, Sílvio R.
2006-11-01
In this paper we consider a simple model of a society of economic agents, namely a variation of the well known “public investment game”, where each agent can contribute a discrete quantity, i.e., cooperate to increase the benefits of the group. Interactions take place among nearest neighbors and depend on the motivation level (insider information, economy prospects). The profit is used to update individual motivations. We first explore a deterministic scenario and the existence of fixed points and attractors. We also consider the presence of noise, where profits fluctuate stochastically. In this scenario we analyze the global persistence as a function of time, a measure of the probability that the amount of money of the entire group remains at least equal to its initial value. Our simulations show that this quantity has a power law behavior. We have also performed simulations with a population of heterogeneous agents, including deceivers and conservatives. We show that, although there is no regular pattern regarding the average wealth, robust power laws for persistence do exist and argue that this can be used to characterize the emerging collective behavior. The influence of the motivation updating and the presence of conservatives and deceivers on persistence is also studied. Simulations were also performed for the local persistence, exploring two different versions of this concept: the probability of a particular agent not going bankrupt (i.e., wealth remaining ⩾ 0 up to time t) and the probability of a particular agent making more money than he initially had. Different power law behaviors are also observed in these situations.
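The sketch below is a schematic variant of such a game, not the authors' exact rules: agents contribute when sufficiently motivated, the pooled contribution is multiplied and shared, noisy profits update wealth and motivation, and global persistence is estimated as the fraction of runs whose total wealth has never dropped below its initial value up to time t. All parameter values are arbitrary.

```python
import numpy as np

def global_persistence(n_agents=50, t_max=200, n_runs=500, factor=1.5, sigma=0.3):
    """Fraction of runs in which the group's total wealth has stayed at or
    above its initial value up to each time t (a schematic variant of the
    public investment game with stochastic profits)."""
    rng = np.random.default_rng(7)
    persistent = np.zeros(t_max)
    for _ in range(n_runs):
        wealth = np.ones(n_agents)
        motivation = rng.random(n_agents)
        total0 = wealth.sum()
        alive = True
        for t in range(t_max):
            contrib = np.where(motivation > 0.5, 1.0, 0.0)         # discrete contribution
            payoff = factor * contrib.sum() / n_agents             # shared group benefit
            profit = payoff - contrib + sigma * rng.standard_normal(n_agents)
            wealth += profit
            motivation = np.clip(motivation + 0.1 * profit, 0, 1)  # profit updates motivation
            if wealth.sum() < total0:
                alive = False                                      # persistence broken for good
            if alive:
                persistent[t] += 1
    return persistent / n_runs

P = global_persistence()
print(P[[0, 9, 49, 199]])     # persistence decays with time
```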
NASA Astrophysics Data System (ADS)
Galbraith, N. R.; Graybeal, J.; Bermudez, L. E.; Wright, D.
2005-12-01
The Marine Metadata Interoperability (MMI) initiative promotes the exchange, integration and use of marine data through enhanced data publishing, discovery, documentation and accessibility. The project, operating since late 2004, presents several cultural and organizational challenges because of the diversity of participants: scientists, technical experts, and data managers from around the world, all working in organizations with different corporate cultures, funding structures, and systems of decision-making. MMI provides educational resources at several levels. For instance, short introductions to metadata concepts are available, as well as guides and "cookbooks" for the quick and efficient preparation of marine metadata. For those who are building major marine data systems, including ocean-observing capabilities, there are training materials, marine metadata content examples, and resources for mapping elements between different metadata standards. The MMI also provides examples of good metadata practices in existing data systems, including the EU's Marine XML project, and functioning ocean/coastal clearinghouses and atlases developed by MMI team members. MMI uses several communication tools that help build community: 1) Website, used to introduce the initiative to new visitors, and to provide in-depth guidance and resources to members and visitors. The site is built using Plone, an open source web content management system. Plone allows the site to serve as a wiki, to which every user can contribute material. This keeps the membership engaged and spreads the responsibility for the tasks of updating and expanding the site. 2) Email lists, to engage the broad ocean sciences community. The discussion forums "news," "ask," and "site-help" are available for receiving regular updates on MMI activities, seeking advice or support on projects and standards, or for assistance with using the MMI site. Internal email lists are provided for the Technical Team, the Steering Committee and Executive Committee, and for several content-centered teams. These lists help keep committee members connected, and have been very successful in building consensus and momentum. 3) Regularly scheduled telecons, to provide the chance for interaction between members without the need to physically attend meetings. Both the steering committee and the technical team convene via phone every month. Discussions are guided by agendas published in advance, and minutes are kept on-line for reference. These telecons have been an important tool in moving the MMI project forward; they give members an opportunity for informal discussion and provide a timeframe for accomplishing tasks. 4) Workshops, to make progress towards community agreement, such as the technical workshop "Advancing Domain Vocabularies" held August 9-11, 2005, in Boulder, Colorado, where featured domain and metadata experts developed mappings between existing marine metadata vocabularies. Most of the work of the meeting was performed in six small, carefully organized breakout teams, oriented around specific domains. 5) Calendar of events, to keep users updated; any event related to marine metadata and interoperability can be posted there. 6) Specific tools to reach agreements among distributed communities; for example, we developed a tool called the Vocabulary Integration Environment (VINE), which supports formalized agreement on mappings across different vocabularies.
NASA Astrophysics Data System (ADS)
Nuber, André; Manukyan, Edgar; Maurer, Hansruedi
2014-05-01
Conventional methods of interpreting seismic data rely on filtering and processing limited portions of the recorded wavefield. Typically, either reflections, refractions or surface waves are considered in isolation. Particularly in near-surface engineering and environmental investigations (depths less than, say 100 m), these wave types often overlap in time and are difficult to separate. Full waveform inversion is a technique that seeks to exploit and interpret the full information content of the seismic records without the need for separating events first; it yields models of the subsurface at sub-wavelength resolution. We use a finite element modelling code to solve the 2D elastic isotropic wave equation in the frequency domain. This code is part of a Gauss-Newton inversion scheme which we employ to invert for the P- and S-wave velocities as well as for density in the subsurface. For shallow surface data the use of an elastic forward solver is essential because surface waves often dominate the seismograms. This leads to high sensitivities (partial derivatives contained in the Jacobian matrix of the Gauss-Newton inversion scheme) and thus large model updates close to the surface. Reflections from deeper structures may also include useful information, but the large sensitivities of the surface waves often preclude this information from being fully exploited. We have developed two methods that balance the sensitivity distributions and thus may help resolve the deeper structures. The first method includes equilibrating the columns of the Jacobian matrix prior to every inversion step by multiplying them with individual scaling factors. This is expected to also balance the model updates throughout the entire subsurface model. It can be shown that this procedure is mathematically equivalent to balancing the regularization weights of the individual model parameters. A proper choice of the scaling factors required to balance the Jacobian matrix is critical. We decided to normalise the columns of the Jacobian based on their absolute column sum, but defining an upper threshold for the scaling factors. This avoids particularly small and therefore insignificant sensitivities being over-boosted, which would produce unstable results. The second method proposed includes adjusting the inversion cell size with depth. Multiple cells of the forward modelling grid are merged to form larger inversion cells (typical ratios between forward and inversion cells are in the order of 1:100). The irregular inversion grid is adapted to the expected resolution power of full waveform inversion. Besides stabilizing the inversion, this approach also reduces the number of model parameters to be recovered. Consequently, the computational costs and the memory consumption are reduced significantly. This is particularly critical when Gauss-Newton type inversion schemes are employed. Extensive tests with synthetic data demonstrated that both methods stabilise the inversion and improve the inversion results. The two methods have some redundancy, which can be seen when both are applied simultaneously, that is, when scaling of the Jacobian matrix is applied to an irregular inversion grid. The calculated scaling factors are quite balanced and span a much smaller range than in the case of a regular inversion grid.
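A minimal numpy sketch of the first method, column equilibration of the Jacobian with a cap on the scaling factors so that insignificant sensitivities are not over-boosted. The matrix, the cap, and the normalization target are illustrative choices, not the values used in the study.

```python
import numpy as np

def balance_jacobian(J, max_scale=100.0, eps=1e-12):
    """Column-equilibrate a sensitivity (Jacobian) matrix: each column is
    divided by its absolute column sum, but the scaling factor is capped so
    that insignificant sensitivities are not over-boosted."""
    col_sums = np.abs(J).sum(axis=0) + eps
    scale = np.minimum(1.0 / col_sums, max_scale)    # capped scaling factors
    return J * scale, scale                           # model update maps back as dm = scale * dm_scaled

rng = np.random.default_rng(8)
J = rng.standard_normal((200, 50))
J[:, :10] *= 100.0            # near-surface parameters with dominant sensitivities
J_bal, s = balance_jacobian(J)
print(np.abs(J).sum(axis=0)[[0, 40]], np.abs(J_bal).sum(axis=0)[[0, 40]])
```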
NASA Astrophysics Data System (ADS)
Hadgu, T.; Kalinina, E.; Klise, K. A.; Wang, Y.
2016-12-01
Disposal of high-level radioactive waste in a deep geological repository in crystalline host rock is one of the potential options for long-term isolation. Characterization of the natural barrier system is an important component of the disposal option. In this study we present numerical modeling of flow and transport in fractured crystalline rock using an updated fracture continuum model (FCM). The FCM is a stochastic method that maps the permeability of discrete fractures onto a regular grid. The original method by McKenna and Reeves (2005) has been updated to provide capabilities that enhance representation of fractured rock. As reported in Hadgu et al. (2015), the method was first modified to include fully three-dimensional representations of anisotropic permeability, multiple independent fracture sets, arbitrary fracture dips and orientations, and spatial correlation. More recently the FCM has been extended to include three different methods. (1) The Sequential Gaussian Simulation (SGSIM) method uses spatial correlation to generate fractures and define their properties for the FCM. (2) The ELLIPSIM method randomly generates a specified number of ellipses with properties defined by probability distributions; each ellipse represents a single fracture. (3) Direct conversion of discrete fracture network (DFN) output. Test simulations were conducted to simulate flow and transport using the ELLIPSIM and direct DFN conversion methods. The simulations used a 1 km x 1 km x 1 km model domain and a structured grid with blocks of size 10 m x 10 m x 10 m, resulting in a total of 10^6 grid blocks. Distributions of fracture parameters were used to generate a selected number of realizations. For each realization, the different methods were applied to generate representative permeability fields. The PFLOTRAN (Hammond et al., 2014) code was used to simulate flow and transport in the domain. Simulation results and analysis are presented. The results indicate that the FCM approach is a viable method to model fractured crystalline rocks. The FCM is a computationally efficient way to generate a realistic representation of complex fracture systems. This approach is of interest for nuclear waste disposal models applied over large domains. SAND2016-7509 A
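A 2-D toy illustration of the fracture-continuum idea behind ELLIPSIM: randomly generated fractures (line segments here, standing in for 3-D ellipses) are rasterized onto a regular grid whose crossed cells receive an elevated permeability. Fracture counts, lengths, and permeability values are arbitrary.

```python
import numpy as np

def fracture_continuum_2d(n_frac=40, n=100, k_matrix=1e-18, k_frac=1e-12, seed=9):
    """Map randomly generated fractures (2-D line segments standing in for the
    3-D ellipses of ELLIPSIM) onto a regular n x n permeability grid."""
    rng = np.random.default_rng(seed)
    k = np.full((n, n), k_matrix)
    for _ in range(n_frac):
        x0, y0 = rng.random(2)                        # fracture centre in the unit square
        length = rng.uniform(0.05, 0.3)               # fracture length
        theta = rng.uniform(0, np.pi)                 # orientation
        t = np.linspace(-0.5, 0.5, 200) * length
        xs, ys = x0 + t * np.cos(theta), y0 + t * np.sin(theta)
        i = np.clip((ys * n).astype(int), 0, n - 1)
        j = np.clip((xs * n).astype(int), 0, n - 1)
        k[i, j] = k_frac                              # grid cells crossed by the fracture
    return k

k = fracture_continuum_2d()
print("fraction of fractured cells:", np.mean(k > 1e-15))
```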
Multi-Hazard Assessment of Scour Damaged Bridges with UAS-Based Measurements
NASA Astrophysics Data System (ADS)
Özcan, O.; Ozcan, O.
2017-12-01
Flood- and stream-induced scour at bridge piers constructed on rivers is one of the most commonly observed causes of bridge failure. Determining the scour-induced failure risk of bridges and the resulting changes in bridge safety under seismic effects is therefore of utmost importance. Thus, to assess bridge safety under scour effects, the amount of scour beneath bridge piers should be estimated realistically and tracked and updated continuously. In this way, scour-induced failures in bridge foundation systems can be prevented and bridge substructure design can be conducted safely. In this study, unmanned aircraft system (UAS) based measurement methods were implemented in order to measure the amount of scour in the bridge load-bearing system (pile foundations and pile abutments) and to obtain very high resolution three-dimensional models of the river flood plain for the flood analysis. UAS-based measurement provides a new and practical approach and delivers high-precision, reliable solutions compared with conventional measurement systems. For this purpose, the reinforced concrete (RC) bridge located on the Antalya Boğaçayı River, Turkey, which failed in 2003 due to flood-induced scour, was selected as the case study. The amount of scour that occurred at the bridge piers and piles was determined realistically, and the behavior of the bridge piers under scour effects was investigated. Future flood effects and the resulting amount of scour were estimated with the HEC-RAS software, using digital surface models of the riverbed obtained at regular intervals with the UAS. In light of the measured scour and the scour expected after a probable flood event, the behavior of the scour-damaged RC bridge was investigated by pushover and time-history analyses under lateral and vertical seismic loadings. In the analyses, the load and displacement capacities of the bridge were observed to diminish significantly under the expected scour. Thus, the deterioration in the multi-hazard performance of the bridge was quantified in light of the updated capacity of the bridge load-bearing system. Based on the case study, a UAS-based, continuously updated multi-hazard risk detection system was established that can be applied to bridges located on riverbeds.
Distorted Born iterative T-matrix method for inversion of CSEM data in anisotropic media
NASA Astrophysics Data System (ADS)
Jakobsen, Morten; Tveit, Svenn
2018-05-01
We present a direct iterative solution to the nonlinear controlled-source electromagnetic (CSEM) inversion problem in the frequency domain, based on a volume integral equation formulation of the forward modelling problem in anisotropic conductive media. Our vectorial nonlinear inverse scattering approach effectively replaces an ill-posed nonlinear inverse problem with a series of linear ill-posed inverse problems, for which efficient (regularized) solution methods already exist. The solution updates the dyadic Green's functions from the source to the scattering volume and from the scattering volume to the receivers after each iteration. The T-matrix approach of multiple scattering theory is used for efficient updating of all dyadic Green's functions after each linearized inversion step. This means that we have developed a T-matrix variant of the Distorted Born Iterative (DBI) method, which is often used in the acoustic and electromagnetic (medical) imaging communities as an alternative to contrast-source inversion. The main advantage of using the T-matrix approach in this context is that it eliminates the need to perform a full forward simulation at each iteration of the DBI method, which is known to be consistent with the Gauss-Newton method. The T-matrix allows for a natural domain decomposition, in the sense that a large model can be decomposed into an arbitrary number of domains that can be treated independently and in parallel. The T-matrix we use for efficient model updating is also independent of the source-receiver configuration, which could be an advantage when performing fast-repeat modelling and time-lapse inversion. The T-matrix is also compatible with the use of modern renormalization methods that can potentially help reduce the sensitivity of the CSEM inversion results to the starting model. To illustrate the performance and potential of our T-matrix variant of the DBI method for CSEM inversion, we performed numerical experiments based on synthetic CSEM data associated with 2D VTI and 3D orthorhombic model inversions. The results of our numerical experiments suggest that the DBIT method for inversion of CSEM data in anisotropic media is both accurate and efficient.
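A minimal sketch of a distorted-Born-iterative loop is given below for a toy problem; the forward map, the Green's-function update and the regularized linear solve are generic placeholders, not the authors' T-matrix machinery.

```python
# Hedged sketch of a generic distorted-Born-iterative (DBI) loop on a toy
# contrast model: linearize about the current background, solve a regularized
# linear step, then re-linearize by updating the Green's kernel.
import numpy as np

def forward(m, G):
    # toy "scattered field": linear in the contrast for a fixed background kernel
    return G @ m

def update_greens(G0, m):
    # placeholder for updating the dyadic Green's functions with the new
    # background; here a simple perturbation of the initial kernel (assumption)
    return G0 @ np.diag(1.0 + 0.1 * m)

rng = np.random.default_rng(0)
n, nd = 50, 30
G0 = rng.normal(size=(nd, n))                        # initial background kernel
m_true = np.zeros(n); m_true[20:30] = 1.0
d_obs = G0 @ np.diag(1.0 + 0.1 * m_true) @ m_true    # synthetic data

m, G = np.zeros(n), G0.copy()
lam = 1e-2                                           # Tikhonov regularization weight
for it in range(10):
    r = d_obs - forward(m, G)                        # residual with current Green's fns
    # regularized linearized (Born) step: (G^T G + lam I) dm = G^T r
    dm = np.linalg.solve(G.T @ G + lam * np.eye(n), G.T @ r)
    m = m + dm
    G = update_greens(G0, m)                         # re-linearize about the updated model
    print(f"iter {it}: residual norm {np.linalg.norm(r):.3e}")
```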
Historical sources on climate and extreme events before XX century in Calabria (Italy)
NASA Astrophysics Data System (ADS)
Aurora Pasqua, Angela; Petrucci, Olga
2014-05-01
Damaging Hydrogeological Events (DHEs) are defined as the occurrence of destructive phenomena, such as landslides and floods, triggered by extreme rain events. Because of the huge damage that they can cause to people and properties, DHEs are often described in a wide range of historical sources. The historical series of DHEs that affected a study region can supply useful information about the climatic trend of the area. Moreover, it can reveal temporal and spatial increases in vulnerability affecting sectors where urbanization increased over time. On the other hand, it can highlight further vulnerability variations that occurred over the decades, related to specific defensive measures undertaken (or abandoned) in order to prevent damage caused by either landslides or floods. We present the historical series of catastrophic DHEs that affected Calabria, a Mediterranean region located in southern Italy. The data presented come from the database named ASICal (the Italian acronym for historically flooded areas in Calabria), which was built at the beginning of 2000 at CNR-IRPI of Cosenza and has been continuously updated since then. Currently, this database includes more than 11,000 records about floods and landslides that have occurred in Calabria since the XVI century. These data come from different information sources such as newspapers, archives of regional and national agencies, scientific and technical reports, on-site survey reports, and so on. ASICal is constantly updated. The updating concerns both the current DHEs that affect the region every year and the results of specific historical research that we regularly perform in order to fill data gaps for older epochs. In this work we present the results of a recent survey carried out in some regional public libraries, focusing on the early-to-mid XIX century. The types of data sources available for the regional framework are described, and a sketch of the DHE trend during the last three centuries is presented. Moreover, a panoramic view of both proxy data and irregularly measured parameters concerning the climatic trend of the region, obtained from the analysed historical sources, is also shown.
Strategic planning for skills and simulation labs in colleges of nursing.
Gantt, Laura T
2010-01-01
While simulation laboratories for clinical nursing education are predicted to grow, budget cuts may threaten these programs. One of the ways to develop a new lab, as well as to keep an existing one on track, is to develop and regularly update a strategic plan. The process of planning not only helps keep the lab faculty and staff apprised of the challenges to be faced, but it also helps to keep senior level management engaged by reason of the need for their input and approval of the plan. The strategic planning documents drafted by those who supervised the development of the new building and Concepts Integration Labs (CILs) helped guide and orient faculty and other personnel hired to implement the plan and fulfill the vision. As the CILs strategic plan was formalized, the draft plans, including the SWOT analysis, were reviewed to provide historical perspective, stimulate discussion, and to make sure old or potential mistakes were not repeated.
2014-01-01
Introduction Halitosis can be caused by oral disease or by respiratory tract conditions such as sinusitis, tonsillitis, and bronchiectasis, but an estimated 40% of affected individuals have no underlying organic disease. Methods and outcomes We conducted a systematic review and aimed to answer the following clinical question: What are the effects of treatments in people with physiological halitosis? We searched: Medline, Embase, The Cochrane Library, and other important databases up to July 2013 (Clinical evidence reviews are updated periodically; please check our website for the most up-to-date version of this review). We included harms alerts from relevant organisations such as the US Food and Drug Administration (FDA) and the UK Medicines and Healthcare products Regulatory Agency (MHRA). Results We found 11 studies that met our inclusion criteria. We performed a GRADE evaluation of the quality of evidence for interventions. Conclusions In this systematic review, we present information relating to the effectiveness and safety of the following interventions: artificial saliva; cleaning, brushing, or scraping the tongue; regular use of mouthwash; sugar-free chewing gums; and zinc toothpastes. PMID:25234037
NASA Astrophysics Data System (ADS)
Liu, Di; Mishra, Ashok K.; Yu, Zhongbo
2016-07-01
This paper examines the combination of support vector machines (SVM) and the dual ensemble Kalman filter (EnKF) technique to estimate root zone soil moisture at different soil layers up to 100 cm depth. Multiple experiments are conducted in a data-rich environment to construct and validate the SVM model and to explore the effectiveness and robustness of the EnKF technique. It was observed that the performance of the SVM relies more on the length of the initial training set than on other factors (e.g., cost function, regularization parameter, and kernel parameters). The dual EnKF technique proved efficient in improving the SVM with observed data either at each time step or at flexible time steps. The EnKF technique reaches its maximum efficiency when the updating ensemble size approaches a certain threshold. It was also observed that the SVM model performance for multi-layer soil moisture estimation can be influenced by the rainfall magnitude (e.g., dry and wet spells).
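For orientation, the following is a minimal sketch of a stochastic (perturbed-observation) EnKF analysis step of the kind used to update a soil-moisture forecast with an observation; the state vector, observation operator and error levels are illustrative assumptions, not the paper's configuration.

```python
# Hedged sketch of one EnKF analysis step: update a forecast ensemble of
# soil-moisture values in three layers using a single surface observation.
import numpy as np

rng = np.random.default_rng(1)
n_ens, n_state = 50, 3                                 # ensemble size, soil layers
X = rng.normal(0.25, 0.05, size=(n_state, n_ens))      # forecast ensemble
H = np.array([[1.0, 0.0, 0.0]])                        # observe the top layer only
obs, obs_err = 0.30, 0.02

D = obs + obs_err * rng.normal(size=(1, n_ens))        # perturbed observations

Xm = X.mean(axis=1, keepdims=True)
A = X - Xm                                             # ensemble anomalies
P_HT = A @ (H @ A).T / (n_ens - 1)                     # state-observation covariance
HPHT = (H @ A) @ (H @ A).T / (n_ens - 1)
K = P_HT @ np.linalg.inv(HPHT + obs_err**2 * np.eye(1))  # Kalman gain

X_a = X + K @ (D - H @ X)                              # analysis ensemble
print("prior mean:", Xm.ravel(), "posterior mean:", X_a.mean(axis=1))
```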
IAU astroEDU: an open-access platform for peer-reviewed astronomy education activities
NASA Astrophysics Data System (ADS)
Heenatigala, Thilina; Russo, Pedro; Strubbe, Linda; Gomez, Edward
2015-08-01
astroEDU is an open-access platform for peer-reviewed astronomy education activities. It addresses key problems in educational repositories, such as variable quality, infrequent maintenance and updates, and limited content review. This is achieved through a peer-review process similar to that used for scholarly articles. Submitted activities are peer-reviewed by an educator and a professional astronomer, which gives credibility to the activities. astroEDU activities are open access in order to make them available to educators around the world, letting them discover, review, distribute and remix the activities. The activity submission process allows authors to learn how to apply enquiry-based learning in the activity, identify the process skills required, develop core goals and objectives, and evaluate the activity to determine the outcome. astroEDU is endorsed by the International Astronomical Union, meaning that each activity is given an official stamp by the international organisation for professional astronomers.
Algorithm for Lossless Compression of Calibrated Hyperspectral Imagery
NASA Technical Reports Server (NTRS)
Kiely, Aaron B.; Klimesh, Matthew A.
2010-01-01
A two-stage predictive method was developed for lossless compression of calibrated hyperspectral imagery. The first prediction stage uses a conventional linear predictor intended to exploit spatial and/or spectral dependencies in the data. The compressor tabulates counts of the past values of the difference between this initial prediction and the actual sample value. To form the ultimate predicted value, in the second stage, these counts are combined with an adaptively updated weight function intended to capture information about data regularities introduced by the calibration process. Finally, prediction residuals are losslessly encoded using adaptive arithmetic coding. Algorithms of this type are commonly tested on a readily available collection of images from the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) hyperspectral imager. On the standard calibrated AVIRIS hyperspectral images that are most widely used for compression benchmarking, the new compressor provides more than 0.5 bits/sample improvement over the previous best compression results. The algorithm has been implemented in Mathematica. The compression algorithm was demonstrated as beneficial on 12-bit calibrated AVIRIS images.
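The following toy sketch illustrates the general two-stage idea only (a simple predictor followed by an adaptively estimated, context-dependent bias correction); it is not the NASA compressor, and the arithmetic coder is omitted. The toy image, context definition and predictor are assumptions.

```python
# Hedged sketch of two-stage prediction: stage 1 predicts from the previous
# sample, stage 2 adds a bias learned adaptively from counts of past residuals
# per context. Residuals would then be entropy coded (not shown).
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(2)
img = np.cumsum(rng.integers(-2, 3, size=(4, 64, 64)), axis=2) + 1000  # toy bands

bias_sum = defaultdict(int)   # running sum of stage-1 residuals per context
bias_cnt = defaultdict(int)   # running count per context
residuals = []

for b in range(img.shape[0]):
    for y in range(img.shape[1]):
        for x in range(1, img.shape[2]):
            pred1 = int(img[b, y, x - 1])       # stage 1: previous sample (assumption)
            ctx = (b, pred1 % 8)                # toy context for the correction table
            bias = bias_sum[ctx] // bias_cnt[ctx] if bias_cnt[ctx] else 0
            pred2 = pred1 + bias                # stage 2: adaptively corrected prediction
            r = int(img[b, y, x]) - pred2       # residual to be entropy coded
            residuals.append(r)
            bias_sum[ctx] += int(img[b, y, x]) - pred1
            bias_cnt[ctx] += 1

print("mean |residual|:", np.mean(np.abs(residuals)))
```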
The antigenic evolution of influenza: drift or thrift?
Wikramaratna, Paul S.; Sandeman, Michi; Recker, Mario; Gupta, Sunetra
2013-01-01
It is commonly assumed that antibody responses against the influenza virus are polarized in the following manner: strong antibody responses are directed at highly variable antigenic epitopes, which consequently undergo ‘antigenic drift’, while weak antibody responses develop against conserved epitopes. As the highly variable epitopes are in a constant state of flux, current antibody-based vaccine strategies are focused on the conserved epitopes in the expectation that they will provide some level of clinical protection after appropriate boosting. Here, we use a theoretical model to suggest the existence of epitopes of low variability, which elicit a high degree of both clinical and transmission-blocking immunity. We show that several epidemiological features of influenza and its serological and molecular profiles are consistent with this model of ‘antigenic thrift’, and that identifying the protective epitopes of low variability predicted by this model could offer a more viable basis for regularly updating the influenza vaccine than exploiting responses to weakly immunogenic conserved regions. PMID:23382423
NASA Technical Reports Server (NTRS)
Borysow, Aleksandra
1998-01-01
Accurate knowledge of certain collision-induced absorption continua of molecular pairs such as H2-H2, H2-He, H2-CH4, CO2-CO2, etc., is a prerequisite for most spectral analyses and modelling attempts of atmospheres of planets and cold stars. We collect and regularly update simple, state-of-the-art computer programs for the calculation of the absorption coefficient of such molecular pairs over a broad range of temperatures and frequencies, for the various rotovibrational bands. The computational results are in agreement with the existing laboratory measurements of such absorption continua, recorded with a spectral resolution of a few wavenumbers, but reliable computational results may be expected even in the far wings, and at temperatures for which laboratory measurements do not exist. Detailed information is given concerning the systems studied, the temperature and frequency ranges considered, the rotovibrational bands modelled, and how one may obtain copies of the FORTRAN77 computer programs by e-mail.
Adjunctive use of antibiotics in periodontal therapy
Barca, Ece; Cifcibasi, Emine; Cintan, Serdar
2015-01-01
Periodontal diseases are infectious diseases with a mixed microbial aetiology and marked inflammatory response leading to destruction of underlying tissue. Periodontal therapy aims to eliminate pathogens associated with the disease and attain periodontal health. Periodontitis is generally treated by nonsurgical mechanical debridement and regular periodontal maintenance care. Periodontal surgery may be indicated for some patients to improve access to the root surface; however, mechanical debridement alone may not be helpful in all cases. In such cases, adjunctive systemic antibiotic therapy remains the treatment of choice. It can reach microorganisms at the base of the deep periodontal pockets and furcation areas via serum, and also affects organisms residing within gingival epithelium and connective tissue. This review aims to provide an update on clinical issues regarding when and how to prescribe systemic antibiotics in periodontal therapy. The points discussed are the mode of antibiotic action, susceptible periodontal pathogens, antibiotic dosage, antibiotic use in treatment of periodontal disease, and mechanism of bacterial resistance to each antibiotic. PMID:28955547
2D joint inversion of CSAMT and magnetic data based on cross-gradient theory
NASA Astrophysics Data System (ADS)
Wang, Kun-Peng; Tan, Han-Dong; Wang, Tao
2017-06-01
A two-dimensional forward and inverse algorithm for the controlled-source audio-frequency magnetotelluric (CSAMT) method is developed to invert data in the entire region (near, transition, and far field) and deal with the effects of artificial sources. First, a regularization factor is introduced into the 2D magnetic inversion, and the magnetic susceptibility is updated in logarithmic form so that the inverted magnetic susceptibility is always positive. Second, the joint inversion of the CSAMT and magnetic methods is completed with the introduction of the cross gradient. By searching for the weight of the cross-gradient term in the objective function, the mutual influence between two different physical properties at different locations is avoided. Model tests show that the joint inversion based on cross-gradient theory offers better results than the single-method inversions. The 2D forward and inverse algorithm for CSAMT with a source can effectively deal with artificial sources and ensures the reliability of the final joint inversion algorithm.
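The cross-gradient coupling term can be sketched as follows for two toy 2D property models; the grid spacing and the models themselves are illustrative assumptions, not the paper's test models.

```python
# Hedged sketch of the 2D cross-gradient term used to couple two property
# models in joint inversion: t = dm1/dx * dm2/dz - dm1/dz * dm2/dx, which
# vanishes where the two models have parallel (structurally consistent) gradients.
import numpy as np

nx, nz, h = 64, 64, 10.0                       # grid size and spacing in metres
x = np.arange(nx) * h
z = np.arange(nz) * h
X, Z = np.meshgrid(x, z, indexing="ij")

m1 = np.exp(-((X - 300)**2 + (Z - 300)**2) / 2e4)   # e.g. a log-resistivity anomaly
m2 = 0.5 * m1 + 0.1 * Z / Z.max()                   # partly correlated susceptibility

dm1_dx, dm1_dz = np.gradient(m1, h, h)
dm2_dx, dm2_dz = np.gradient(m2, h, h)

t = dm1_dx * dm2_dz - dm1_dz * dm2_dx          # cross-gradient scalar field
print("cross-gradient objective ||t||^2 =", np.sum(t**2))
```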
NASA Astrophysics Data System (ADS)
van Rensburg, L.; Claassens, S.; Bezuidenhout, J. J.; Jansen van Rensburg, P. J.
2009-03-01
The much-publicised problem of major asbestos pollution and related health issues in South Africa has called for action to be taken to remedy the situation. The aim of this project was to establish a prioritisation index that would provide a scientifically based sequence in which polluted asbestos mines in Southern Africa ought to be rehabilitated. It was reasoned that a computerised database capable of calculating such a Rehabilitation Prioritisation Index (RPI) would be a fruitful departure from the previously used subjective selection, which was prone to human bias. The database was developed in Microsoft Access, and both quantitative and qualitative data were used for the calculation of the RPI value. The logical database structure consists of a number of mines, each comprising a number of dumps, for which a number of samples have been analysed to determine asbestos fibre content. For this system to be accurate as well as relevant, the data in the database should be revalidated and updated on a regular basis.
NASA Astrophysics Data System (ADS)
Alpi, Danielle Marie
The 16 sectors of critical infrastructure in the US are susceptible to cyber-attacks. Potential attacks come from internal and external threats. These attacks target the industrial control systems (ICS) of companies within critical infrastructure. Weaknesses in the energy sector's ICS, specifically in the oil and gas industry, can result in economic and ecological disaster. The purpose of this study was to establish means for oil companies to identify and stop cyber-attacks, specifically advanced persistent threats (APTs). This research reviewed current cyber vulnerabilities and ways in which a cyber-attack may be deterred. This research found that there are insecure devices within ICS that are not regularly updated; as a result, security issues have accumulated. Safety procedures and the training for them are often neglected. Jurisdiction is unclear in regard to critical infrastructure. The recommendations this research offers are further examination of information-sharing methods, development of analytic platforms, and better methods for the implementation of defense-in-depth security measures.
Marinucci, Gino D.; Luber, George; Uejio, Christopher K.; Saha, Shubhayu; Hess, Jeremy J.
2014-01-01
Climate change is anticipated to have several adverse health impacts. Managing these risks to public health requires an iterative approach. As with many risk management strategies related to climate change, using modeling to project impacts, engaging a wide range of stakeholders, and regularly updating models and risk management plans with new information—hallmarks of adaptive management—are considered central tenets of effective public health adaptation. The Centers for Disease Control and Prevention has developed a framework, entitled Building Resilience Against Climate Effects, or BRACE, to facilitate this process for public health agencies. Its five steps are laid out here. Following the steps laid out in BRACE will enable an agency to use the best available science to project likely climate change health impacts in a given jurisdiction and prioritize interventions. Adopting BRACE will also reinforce public health’s established commitment to evidence-based practice and institutional learning, both of which will be central to successfully engaging the significant new challenges that climate change presents. PMID:24991665
All-Sky Earth Occultation Observations with the Fermi Gamma Ray Burst Monitor
NASA Technical Reports Server (NTRS)
Wilson-Hodge, C. A.; Beklen, E.; Bhat, P. N.; Briggs, M.; Camero-Arranz, A.; Case, G.; Chaplin, V.; Cherry, M.; Connaughton, V.; Finger, M.;
2010-01-01
Using the Gamma Ray Burst Monitor (GBM) on-board Fermi, we are monitoring the hard X-ray/soft gamma ray sky using the Earth occultation technique. Each time a source in our catalog is occulted by (or exits occultation by) the Earth, we measure its flux using the change in count rates due to the occultation. Currently we are using CTIME data with 8 energy channels spanning 8 keV to 1 MeV for the GBM NaI detectors and spanning 150 keV to 40 MeV for the GBM BGO detectors. Our preliminary catalog consists of galactic X-ray binaries, the Crab Nebula, and active galactic nuclei. New sources are added to our catalog as they become active or upon request. In addition to Earth occultations, we have observed numerous occultations with Fermi's solar panels. We will present early results. Regularly updated results will be found on our website http://gammaray.nsstc.nasa.gov/gbm/science/occultation.
The regulation of mobile medical applications.
Yetisen, Ali Kemal; Martinez-Hurtado, J L; da Cruz Vasconcellos, Fernando; Simsekler, M C Emre; Akram, Muhammad Safwan; Lowe, Christopher R
2014-03-07
The rapidly expanding number of mobile medical applications has the potential to transform the patient-healthcare provider relationship by improving turnaround time and reducing costs. In September 2013, the U.S. Food and Drug Administration (FDA) issued guidance to regulate these applications and protect consumers by minimising the risks associated with their unintended use. This guidance distinguishes between the subset of mobile medical apps which may be subject to regulation and those that are not. The marketing claims of the application determine the intent. Areas of concern include compliance with regular updates of the operating systems and of the mobile medical apps themselves. In this article, we explain the essence of this FDA guidance by providing examples and evaluating the impact on academia, industry and other key stakeholders, such as patients and clinicians. Our assessment indicates that awareness and incorporation of the guidelines into product development can hasten the commercialisation and market-entry process. Furthermore, potential obstacles are discussed and directions for future development suggested.
NASA Astrophysics Data System (ADS)
Zhang, Wancheng; Xu, Yejun; Wang, Huimin
2016-01-01
The aim of this paper is to put forward a consensus reaching method for multi-attribute group decision-making (MAGDM) problems with linguistic information, in which the weight information of experts and attributes is unknown. First, some basic concepts and operational laws of the 2-tuple linguistic label are introduced. Then, a grey relational analysis method and a maximising deviation method are proposed to calculate the incomplete weight information of experts and attributes, respectively. To eliminate conflict in the group, a weight-updating model is employed to derive the weights of experts based on their contribution to the consensus reaching process. After conflict elimination, the final group preference can be obtained, which gives the ranking of the alternatives. The model can effectively avoid the information distortion that regularly occurs in linguistic information processing. Finally, an illustrative example is given to demonstrate the application of the proposed method, and comparative analyses with existing methods are offered to show its advantages.
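The maximising-deviation idea for attribute weights can be sketched on a made-up numeric decision matrix (a stand-in for the paper's 2-tuple linguistic assessments): an attribute on which the alternatives differ more receives a larger weight.

```python
# Hedged sketch of maximising-deviation attribute weighting on a toy matrix.
import numpy as np

R = np.array([[0.6, 0.9, 0.3],      # rows: alternatives, columns: attributes
              [0.7, 0.4, 0.3],
              [0.2, 0.8, 0.4]])

# total pairwise deviation of the alternatives under each attribute
dev = np.abs(R[:, None, :] - R[None, :, :]).sum(axis=(0, 1))
w = dev / dev.sum()                  # normalised attribute weights
print("attribute weights:", np.round(w, 3))
```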
A practical globalization of one-shot optimization for optimal design of tokamak divertors
NASA Astrophysics Data System (ADS)
Blommaert, Maarten; Dekeyser, Wouter; Baelmans, Martine; Gauger, Nicolas R.; Reiter, Detlev
2017-01-01
In past studies, nested optimization methods were successfully applied to the design of the magnetic divertor configuration in nuclear fusion reactors. In this paper, so-called one-shot optimization methods are pursued. Due to convergence issues, a globalization strategy for the one-shot solver is sought. Whereas Griewank introduced a globalization strategy using a doubly augmented Lagrangian function that includes primal and adjoint residuals, its practical usability is limited by the necessity of second-order derivatives and expensive line-search iterations. In this paper, a practical alternative is offered that avoids these drawbacks by using a regular augmented Lagrangian merit function that penalizes only state residuals. Additionally, robust rank-two Hessian estimation is achieved by adaptation of Powell's damped BFGS update rule. The application of the novel one-shot approach to magnetic divertor design is considered in detail. For this purpose, the approach is adapted to be complementary with practical in-parts adjoint sensitivities. Using the globalization strategy, stable convergence of the one-shot approach is achieved.
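Powell's damped BFGS update referred to above can be sketched as follows, with the classical 0.2/0.8 damping constants; the toy quadratic and step used for the demonstration are illustrative assumptions, not the divertor-design code.

```python
# Hedged sketch of Powell's damped BFGS update: a safeguard that keeps the
# Hessian estimate positive definite when the curvature condition s^T y > 0
# is violated or weak.
import numpy as np

def damped_bfgs_update(B, s, y):
    Bs = B @ s
    sBs = s @ Bs
    sy = s @ y
    # damping factor theta in (0, 1]; theta = 1 recovers the plain BFGS update
    theta = 1.0 if sy >= 0.2 * sBs else 0.8 * sBs / (sBs - sy)
    r = theta * y + (1.0 - theta) * Bs            # damped curvature pair
    return B - np.outer(Bs, Bs) / sBs + np.outer(r, r) / (s @ r)

# toy usage on a quadratic with Hessian diag(2, 10)
B = np.eye(2)
g = np.array([2.0, 10.0])                         # gradient at the current point
s = -np.linalg.solve(B, g) * 0.1                  # small quasi-Newton step
y = np.array([2.0 * s[0], 10.0 * s[1]])           # gradient change for this quadratic
B = damped_bfgs_update(B, s, y)
print(B)
```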
Common questions about the diagnosis and management of fibromyalgia.
Kodner, Charles
2015-04-01
Fibromyalgia has a distinct pathophysiology involving central amplification of peripheral sensory signals. Core symptoms are chronic widespread pain, fatigue, and sleep disturbance. Most patients with fibromyalgia have muscle pain and tenderness, forgetfulness or problems concentrating, and significant functional limitations. Fibromyalgia is diagnosed using an updated set of clinical criteria that no longer depend on tender point examination; laboratory testing may rule out other disorders that commonly present with fatigue, such as anemia and thyroid disease. Patients with fibromyalgia should be evaluated for comorbid functional pain syndromes and mood disorders. Management of fibromyalgia should include patient education, symptom relief, and regular aerobic physical activity. Serotonin-norepinephrine reuptake inhibitors, tricyclic antidepressants, antiepileptics, and muscle relaxants have the strongest evidence of benefit for improving pain, fatigue, sleep symptoms, and quality of life. Multiple complementary and alternative medicine therapies have been used but have limited evidence of effectiveness. Opioids should be used to relieve pain in carefully selected patients only if alternative therapies are ineffective.
Piéron’s Law and Optimal Behavior in Perceptual Decision-Making
van Maanen, Leendert; Grasman, Raoul P. P. P.; Forstmann, Birte U.; Wagenmakers, Eric-Jan
2012-01-01
Piéron’s Law is a psychophysical regularity in signal detection tasks that states that mean response times decrease as a power function of stimulus intensity. In this article, we extend Piéron’s Law to perceptual two-choice decision-making tasks, and demonstrate that the law holds as the discriminability between two competing choices is manipulated, even though the stimulus intensity remains constant. This result is consistent with predictions from a Bayesian ideal observer model. The model assumes that in order to respond optimally in a two-choice decision-making task, participants continually update the posterior probability of each response alternative, until the probability of one alternative crosses a criterion value. In addition to predictions for two-choice decision-making tasks, we extend the ideal observer model to predict Piéron’s Law in signal detection tasks. We conclude that Piéron’s Law is a general phenomenon that may be caused by optimality constraints. PMID:22232572
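A minimal simulation of the ideal-observer account can be sketched as follows: posterior odds for two alternatives are accumulated from noisy samples until a criterion is crossed, and mean decision time falls as discriminability increases, in keeping with a Piéron-like trend. All numerical choices are toy assumptions, not the article's model parameters.

```python
# Hedged sketch of a Bayesian sequential-sampling observer for a two-choice task.
import numpy as np

rng = np.random.default_rng(3)

def decision_time(drift, criterion=0.95, noise=1.0, max_steps=10_000):
    log_post = 0.0                          # log posterior odds of H1 vs H2
    for t in range(1, max_steps + 1):
        x = rng.normal(drift, noise)        # evidence sample; H1 mean +drift, H2 -drift
        log_post += 2 * drift * x / noise**2    # log-likelihood-ratio increment
        p = 1.0 / (1.0 + np.exp(-log_post))
        if p >= criterion or p <= 1 - criterion:
            return t
    return max_steps

for d in (0.1, 0.2, 0.4, 0.8):              # increasing discriminability
    rts = [decision_time(d) for _ in range(500)]
    print(f"discriminability {d:.1f}: mean decision time {np.mean(rts):.1f} samples")
```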
Yeast flocculation: New story in fuel ethanol production.
Zhao, X Q; Bai, F W
2009-01-01
Yeast flocculation has long been used in the brewing industry to facilitate biomass recovery, and thus the mechanism of yeast flocculation has been intensively studied. However, the application of flocculating yeast in ethanol production garnered attention mainly in the 1980s and 1990s. In this article, updated research progress on the molecular mechanism of yeast flocculation and the impact of environmental conditions on yeast flocculation is reviewed. Construction of flocculating yeast strains by genetic approaches and the utilization of yeast flocculation for ethanol production from various feedstocks are presented. The concept of self-immobilized yeast cells through their flocculation is revisited through a case study of continuous ethanol fermentation with the flocculating yeast SPSC01, and their technical and economic advantages are highlighted by comparison with yeast cells immobilized on supporting materials as well as with regular free yeast cells. Taking the flocculating yeast SPSC01 as an example, the ethanol tolerance of flocculating yeast is also discussed.
Zhang, Shixuan; Kriza, Christine; Kolominsky-Rabas, Peter L
2014-09-01
The objective of this paper is to provide a systematic overview of the Chinese medical device registration processes, identify challenges, and suggest how these can be addressed. In addition, the paper outlines the impact of new policies and regulations since the restructuring of the China FDA. A systematic review was performed for journal articles published between 2009 and 2013 in the following databases: PubMed, ScienceDirect and Zhongguozhiwang. The review identified 184 potentially relevant papers. Seventeen articles were included in the review, which highlights the challenges and opportunities related to the medical device registration process. In order to understand the actual impact of the regulatory environment and its policies, including the lack of regulatory guidance, regular assessment updates are crucial. The results of this paper are aimed at informing regulatory bodies, health policy decision makers, national and international Health Technology Assessment networks, as well as medical device manufacturers.
Actuator with built-in viscous damping for isolation and structural control
NASA Astrophysics Data System (ADS)
Hyde, T. Tupper; Anderson, Eric H.
1994-05-01
This paper describes the development and experimental application of an actuator with built-in viscous damping. An existing passive damper was modified for use as a novel actuation device for isolation and structural control. The device functions by using the same fluid for viscous damping and as a hydraulic lever for a voice coil actuator. Applications for such an actuator include structural control and active isolation. Lumped-parameter models capturing structural and fluid effects are presented. Component tests of free stroke, blocked force, and passive complex stiffness are used to update the assumed model parameters. In a complex testbed structure, the structural damping effectiveness of the new actuator is shown to match that of a regular D-strut passively and that of a piezoelectric strut with load-cell feedback actively. Open- and closed-loop results are presented for a force isolation application, showing an 8 dB passive and 20 dB active improvement over an undamped mount. An optimized design for a future experimental testbed is developed.
A Comprehensive Curation Shows the Dynamic Evolutionary Patterns of Prokaryotic CRISPRs.
Mai, Guoqin; Ge, Ruiquan; Sun, Guoquan; Meng, Qinghan; Zhou, Fengfeng
2016-01-01
Motivation. Clustered regularly interspaced short palindromic repeat (CRISPR) is a genetic element with active regulation roles for foreign invasive genes in prokaryotic genomes and has been engineered to work with the CRISPR-associated sequence (Cas) gene Cas9 as one of the modern genome editing technologies. Due to inconsistent definitions, the existing CRISPR detection programs seem to have missed some weak CRISPR signals. Results. This study manually curates all the currently annotated CRISPR elements in the prokaryotic genomes and proposes 95 updates to the annotations. A new definition is proposed to cover all the CRISPRs. A comprehensive comparison of CRISPR numbers at the taxonomic levels of both domain and genus shows high variation among closely related species, even within the same genus. A detailed investigation of how CRISPRs are evolutionarily manipulated in the 8 completely sequenced species of the genus Thermoanaerobacter demonstrates that transposons act as a frequent tool for splitting long CRISPRs into shorter ones over a long evolutionary history.
Surgical treatment of lung metastases in patients with embryonal pediatric solid tumors: an update.
Fuchs, Joerg; Seitz, Guido; Handgretinger, Rupert; Schäfer, Juergen; Warmann, Steven W
2012-02-01
Distant metastases regularly occur in children with solid tumors. The organ most affected is the lung. In nearly all extracranial pediatric solid tumors, the presence of lung metastases is associated with an adverse prognosis for the children. Therefore, correct treatment of lung metastases is essential and influences the outcome. Despite different national and international trials for pediatric tumor entities, specific surgical aspects or guidelines for lung metastases are usually not addressed thoroughly in these protocols. The aim of this article is to present the diagnostic challenges and principles of surgical treatment, focusing on the influence of surgery on the outcome of children. Special points of interest are discussed with emphasis on sarcomas, nephroblastomas, hepatoblastomas, and other tumors. Surgery of lung metastases is safe, has a positive impact on the patients' prognosis, and should be aggressive depending on the tumor entity. An interdisciplinary approach, including pediatric oncology and radiology, is mandatory in any case. Copyright © 2012 Elsevier Inc. All rights reserved.
APPLICATION OF INFORMATION AND COMMUNICATION TECHNOLOGIES IN MEDICAL EDUCATION
Al-Tamimi, Dalal M.
2003-01-01
The recognition that information and communication technologies should play an increasingly important role in medical education is key to educating physicians in the 21st century. Computer use in medical education includes Internet hypermedia/multimedia technologies, medical informatics, distance learning and telemedicine. Adaptation to the use of these technologies should ideally start from the elementary school level. Medical schools must introduce medical informatics courses very early in the medical curriculum. Teachers will need regular CME courses to prepare and update themselves in the changing circumstances. Our infrastructure must be prepared for the new developments with computer labs, basic skills labs, closed-circuit television facilities, virtual classrooms, smart classrooms, simulated teaching facilities, and distance teaching by tele-techniques. Our existing manpower, including doctors, nurses, technicians, librarians, and administrative personnel, requires hands-on training, while new recruitment will have to emphasize compulsory knowledge of and familiarity with information technology. This paper highlights these subjects in detail as a means to prepare us to meet the challenges of the 21st century. PMID:23011983
Standardization of Analysis Sets for Reporting Results from ADNI MRI Data
Wyman, Bradley T.; Harvey, Danielle J.; Crawford, Karen; Bernstein, Matt A.; Carmichael, Owen; Cole, Patricia E.; Crane, Paul; DeCarli, Charles; Fox, Nick C.; Gunter, Jeffrey L.; Hill, Derek; Killiany, Ronald J.; Pachai, Chahin; Schwarz, Adam J.; Schuff, Norbert; Senjem, Matthew L.; Suhy, Joyce; Thompson, Paul M.; Weiner, Michael; Jack, Clifford R.
2013-01-01
The ADNI 3D T1-weighted MRI acquisitions provide a rich dataset for developing and testing analysis techniques for extracting structural endpoints. To promote greater rigor in analysis and meaningful comparison of different algorithms, the ADNI MRI Core has created standardized analysis sets of data comprising scans that met minimum quality control requirements. We encourage researchers to test and report their techniques against these data. Standard analysis sets of volumetric scans from ADNI-1 have been created, comprising: screening visits, 1 year completers (subjects who all have screening, 6 and 12 month scans), two year annual completers (screening, 1, and 2 year scans), two year completers (screening, 6 months, 1 year, 18 months (MCI only) and 2 years) and complete visits (screening, 6 months, 1 year, 18 months (MCI only), 2, and 3 year (normal and MCI only) scans). As the ADNI-GO/ADNI-2 data becomes available, updated standard analysis sets will be posted regularly. PMID:23110865
Gene context analysis in the Integrated Microbial Genomes (IMG) data management system.
Mavromatis, Konstantinos; Chu, Ken; Ivanova, Natalia; Hooper, Sean D; Markowitz, Victor M; Kyrpides, Nikos C
2009-11-24
Computational methods for determining the function of genes in newly sequenced genomes have traditionally been based on sequence similarity to genes whose function has been identified experimentally. Function prediction methods can be extended using gene context analysis approaches such as examining the conservation of chromosomal gene clusters, gene fusion events and co-occurrence profiles across genomes. Context analysis is based on the observation that functionally related genes often have a similar gene context, and it relies on the identification of such events across a phylogenetically diverse collection of genomes. We have used the data management system of the Integrated Microbial Genomes (IMG) as the framework to implement and explore the power of gene context analysis methods because it provides one of the largest available genome integrations. Visualization and search tools to facilitate gene context analysis have been developed and applied across all publicly available archaeal and bacterial genomes in IMG. These computations are now maintained as part of IMG's regular genome content update cycle. IMG is available at: http://img.jgi.doe.gov.
The MR-Base platform supports systematic causal inference across the human phenome
Wade, Kaitlin H; Haberland, Valeriia; Baird, Denis; Laurin, Charles; Burgess, Stephen; Bowden, Jack; Langdon, Ryan; Tan, Vanessa Y; Yarmolinsky, James; Shihab, Hashem A; Timpson, Nicholas J; Evans, David M; Relton, Caroline; Martin, Richard M; Davey Smith, George
2018-01-01
Results from genome-wide association studies (GWAS) can be used to infer causal relationships between phenotypes, using a strategy known as 2-sample Mendelian randomization (2SMR) and bypassing the need for individual-level data. However, 2SMR methods are evolving rapidly and GWAS results are often insufficiently curated, undermining efficient implementation of the approach. We therefore developed MR-Base (http://www.mrbase.org): a platform that integrates a curated database of complete GWAS results (no restrictions according to statistical significance) with an application programming interface, web app and R packages that automate 2SMR. The software includes several sensitivity analyses for assessing the impact of horizontal pleiotropy and other violations of assumptions. The database currently comprises 11 billion single nucleotide polymorphism-trait associations from 1673 GWAS and is updated on a regular basis. Integrating data with software ensures more rigorous application of hypothesis-driven analyses and allows millions of potential causal relationships to be efficiently evaluated in phenome-wide association studies. PMID:29846171
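For orientation, the core two-sample MR calculation that such platforms automate (per-SNP Wald ratios combined by inverse-variance weighting) can be sketched with made-up summary statistics; this is not MR-Base output and does not use the TwoSampleMR package API.

```python
# Hedged sketch of a basic 2SMR inverse-variance-weighted (IVW) estimate from
# per-SNP Wald ratios, using fabricated illustrative summary statistics.
import numpy as np

beta_exp = np.array([0.10, 0.08, 0.12, 0.05])       # SNP effects on the exposure
beta_out = np.array([0.020, 0.018, 0.030, 0.008])   # SNP effects on the outcome
se_out = np.array([0.005, 0.006, 0.007, 0.004])     # SEs of the outcome effects

wald = beta_out / beta_exp                 # per-SNP causal effect estimates
w = (beta_exp / se_out) ** 2               # first-order inverse-variance weights
ivw = np.sum(w * wald) / np.sum(w)         # IVW combined estimate
se_ivw = np.sqrt(1.0 / np.sum(w))
print(f"IVW causal estimate: {ivw:.3f} (SE {se_ivw:.3f})")
```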
A Nursing Intelligence System to Support Secondary Use of Nursing Routine Data
Rauchegger, F.; Ammenwerth, E.
2015-01-01
Summary Background Nursing care is facing exponential growth of information from nursing documentation. This amount of electronically available data collected routinely opens up new opportunities for secondary use. Objectives To present a case study of a nursing intelligence system for reusing routinely collected nursing documentation data for multiple purposes, including quality management of nursing care. Methods The SPIRIT framework for systematically planning the reuse of clinical routine data was leveraged to design a nursing intelligence system which then was implemented using open source tools in a large university hospital group following the spiral model of software engineering. Results The nursing intelligence system is in routine use now and updated regularly, and includes over 40 million data sets. It allows the outcome and quality analysis of data related to the nursing process. Conclusions Following a systematic approach for planning and designing a solution for reusing routine care data appeared to be successful. The resulting nursing intelligence system is useful in practice now, but remains malleable for future changes. PMID:26171085
Kong, Zehui; Zou, Yuan; Liu, Teng
2017-01-01
To further improve the fuel economy of series hybrid electric tracked vehicles, a reinforcement learning (RL)-based real-time energy management strategy is developed in this paper. In order to utilize the statistical characteristics of the online driving schedule effectively, a recursive algorithm for the transition probability matrix (TPM) of the power request is derived. Reinforcement learning (RL) is applied to calculate and update the control policy at regular time intervals, adapting to the varying driving conditions. A forward-facing powertrain model is built in detail, including the engine-generator model, battery model and vehicle dynamics model. The robustness and adaptability of the real-time energy management strategy are validated through comparison with a stationary control strategy based on an initial transition probability matrix (TPM) generated from a long naturalistic driving cycle in the simulation. Results indicate that the proposed method achieves better fuel economy than the stationary one and is more effective in real-time control. PMID:28671967
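A minimal sketch of a recursive TPM update of the kind described is given below; the power-request discretization and the toy driving schedule are illustrative assumptions, not the paper's powertrain data.

```python
# Hedged sketch: fold each observed power-request transition into smoothed
# counts and renormalize, so the RL control policy can be recomputed at
# regular intervals from the current TPM.
import numpy as np

n_levels = 4                                   # discretized power-request levels
counts = np.ones((n_levels, n_levels))         # Laplace-smoothed transition counts

def update_tpm(counts, prev_level, new_level):
    """Fold one observed transition into the counts and return the TPM."""
    counts[prev_level, new_level] += 1
    return counts / counts.sum(axis=1, keepdims=True)

rng = np.random.default_rng(4)
levels = rng.integers(0, n_levels, size=200)   # toy online driving schedule
for a, b in zip(levels[:-1], levels[1:]):
    tpm = update_tpm(counts, a, b)

print(np.round(tpm, 2))                        # estimated transition probabilities
```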
Koczyk, Grzegorz; Berezovsky, Igor N.
2008-01-01
Domain hierarchy and closed loops (DHcL) (http://sitron.bccs.uib.no/dhcl/) is a web server that delineates the energy hierarchy of protein domain structure and detects domains at different levels of this hierarchy. The server also identifies closed loops and van der Waals locks, which constitute a structural basis for the protein domain hierarchy. DHcL can be a useful tool for an express analysis of protein structures and their alternative domain decompositions. The user submits a PDB identifier(s) or uploads a 3D protein structure in PDB format. The results of the analysis are the locations of domains at different levels of hierarchy, closed loops, and van der Waals locks, together with their interactive visualization. The server maintains a regularly updated database of domains, closed loops and van der Waals locks for all X-ray structures in the PDB. The DHcL server is available at: http://sitron.bccs.uib.no/dhcl. PMID:18502776