Sample records for "easily update development"

  1. Crucial role of strategy updating for coexistence of strategies in interaction networks.

    PubMed

    Zhang, Jianlei; Zhang, Chunyan; Cao, Ming; Weissing, Franz J

    2015-04-01

    Network models are useful tools for studying the dynamics of social interactions in a structured population. After a round of interactions with the players in their local neighborhood, players update their strategy based on the comparison of their own payoff with the payoff of one of their neighbors. Here we show that the assumptions made on strategy updating are of crucial importance for the strategy dynamics. In the first step, we demonstrate that seemingly small deviations from the standard assumptions on updating have major implications for the evolutionary outcome of two cooperation games: cooperation can more easily persist in a Prisoner's Dilemma game, while it can go more easily extinct in a Snowdrift game. To explain these outcomes, we develop a general model for the updating of states in a network that allows us to derive conditions for the steady-state coexistence of states (or strategies). The analysis reveals that coexistence crucially depends on the number of agents consulted for updating. We conclude that updating rules are as important for evolution on a network as network structure and the nature of the interaction.
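
    A minimal sketch of the kind of updating rule at stake, in Python: a focal player consults k neighbors and imitates the best-scoring one only if it outperforms the player's own payoff. The function, its arguments, and the imitate-the-best choice are illustrative assumptions, not the paper's exact protocol; the point is that k, the number of agents consulted, is an explicit parameter of the dynamics.

```python
import random

def imitation_update(own_strategy, own_payoff, neighbors, k, rng):
    """One strategy-update step for a focal player.

    neighbors: list of (strategy, payoff) pairs in the local neighborhood.
    k: number of neighbors consulted -- the quantity the analysis shows
       coexistence depends on. Imitate-the-best is an illustrative choice.
    """
    consulted = rng.sample(neighbors, min(k, len(neighbors)))
    best_strategy, best_payoff = max(consulted, key=lambda sp: sp[1])
    # Imitate only if the best consulted neighbor strictly outperforms us.
    return best_strategy if best_payoff > own_payoff else own_strategy
```

    With k = 1 this reduces to a standard pairwise comparison; raising k changes which strategies can persist at a steady state.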

  2. Crucial role of strategy updating for coexistence of strategies in interaction networks

    NASA Astrophysics Data System (ADS)

    Zhang, Jianlei; Zhang, Chunyan; Cao, Ming; Weissing, Franz J.

    2015-04-01

    Network models are useful tools for studying the dynamics of social interactions in a structured population. After a round of interactions with the players in their local neighborhood, players update their strategy based on the comparison of their own payoff with the payoff of one of their neighbors. Here we show that the assumptions made on strategy updating are of crucial importance for the strategy dynamics. In the first step, we demonstrate that seemingly small deviations from the standard assumptions on updating have major implications for the evolutionary outcome of two cooperation games: cooperation can more easily persist in a Prisoner's Dilemma game, while it can go more easily extinct in a Snowdrift game. To explain these outcomes, we develop a general model for the updating of states in a network that allows us to derive conditions for the steady-state coexistence of states (or strategies). The analysis reveals that coexistence crucially depends on the number of agents consulted for updating. We conclude that updating rules are as important for evolution on a network as network structure and the nature of the interaction.

  3. Blended Learning: An Evolving Praxis

    ERIC Educational Resources Information Center

    Fogal, Gary G.; Graham, Floyd H., III.; Lavigne, Anthony G.

    2014-01-01

    TED (Technology Entertainment Design), a collection of regularly updated talks, offers a web-based platform that is easily accessible. This platform affords language learners across multiple proficiency levels an opportunity to develop autonomy and critical thinking skills alongside their second language (L2) development. With an international…

  4. Knowledge structure representation and automated updates in intelligent information management systems

    NASA Technical Reports Server (NTRS)

    Corey, Stephen; Carnahan, Richard S., Jr.

    1990-01-01

    A continuing effort to apply rapid prototyping and Artificial Intelligence techniques to problems associated with projected Space Station-era information management systems is examined. In particular, timely updating of the various databases and knowledge structures within the proposed intelligent information management system (IIMS) is critical to support decision making processes. Because of the significantly large amounts of data entering the IIMS on a daily basis, information updates will need to be performed automatically, with some systems requiring that data be incorporated and made available to users within a few hours. Meeting these demands depends, first, on the design and implementation of information structures that are easily modified and expanded and, second, on the incorporation of intelligent automated update techniques that will allow meaningful information relationships to be established. Potential techniques are studied for developing such an automated update capability, and IIMS update requirements are examined in light of results obtained from the IIMS prototyping effort.

  5. Computer program CDCID: an automated quality control program using CDC update

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singer, G.L.; Aguilar, F.

    1984-04-01

    A computer program, CDCID, has been developed in coordination with a quality control program to provide a highly automated method of documenting changes to computer codes at EG and G Idaho, Inc. The method uses the standard CDC UPDATE program in such a manner that updates and their associated documentation are easily made and retrieved in various formats. The method allows each card image of a source program to point to the document which describes it, who created the card, and when it was created. The method described is applicable to the quality control of computer programs in general. The computer program described is executable only on CDC computing systems, but the program could be modified and applied to any computing system with an adequate updating program.

  6. Auto Draw from Excel Input Files

    NASA Technical Reports Server (NTRS)

    Strauss, Karl F.; Goullioud, Renaud; Cox, Brian; Grimes, James M.

    2011-01-01

    The design process often involves the use of Excel files during project development. To facilitate communications of the information in the Excel files, drawings are often generated. During the design process, the Excel files are updated often to reflect new input. The problem is that the drawings often lag the updates, often leading to confusion of the current state of the design. The use of this program allows visualization of complex data in a format that is more easily understandable than pages of numbers. Because the graphical output can be updated automatically, the manual labor of diagram drawing can be eliminated. The more frequent update of system diagrams can reduce confusion and reduce errors and is likely to uncover symmetric problems earlier in the design cycle, thus reducing rework and redesign.

  7. UPDATE: Applications of Research in Music Education. UPDATE Yearbook

    ERIC Educational Resources Information Center

    Rowman & Littlefield Education, 2005

    2005-01-01

    The Fall 2004 and Spring 2005 issues of "UPDATE: Applications of Research in Music Education," in one print volume, present hard facts and statistical data in a style that can be easily understood and appreciated by music researchers, teachers, graduates, and undergraduates alike. Includes advice to first-year music teachers, instrument…

  8. Publishing web-based guidelines using interactive decision models.

    PubMed

    Sanders, G D; Nease, R F; Owens, D K

    2001-05-01

    Commonly used methods for guideline development and dissemination do not enable developers to tailor guidelines systematically to specific patient populations and update guidelines easily. We developed a web-based system, ALCHEMIST, that uses decision models and automatically creates evidence-based guidelines that can be disseminated, tailored and updated over the web. Our objective was to demonstrate the use of this system with clinical scenarios that provide challenges for guideline development. We used the ALCHEMIST system to develop guidelines for three clinical scenarios: (1) Chlamydia screening for adolescent women, (2) antiarrhythmic therapy for the prevention of sudden cardiac death; and (3) genetic testing for the BRCA breast-cancer mutation. ALCHEMIST uses information extracted directly from the decision model, combined with the additional information from the author of the decision model, to generate global guidelines. ALCHEMIST generated electronic web-based guidelines for each of the three scenarios. Using ALCHEMIST, we demonstrate that tailoring a guideline for a population at high-risk for Chlamydia changes the recommended policy for control of Chlamydia from contact tracing of reported cases to a population-based screening programme. We used ALCHEMIST to incorporate new evidence about the effectiveness of implantable cardioverter defibrillators (ICD) and demonstrate that the cost-effectiveness of use of ICDs improves from $74 400 per quality-adjusted life year (QALY) gained to $34 500 per QALY gained. Finally, we demonstrate how a clinician could use ALCHEMIST to incorporate a woman's utilities for relevant health states and thereby develop patient-specific recommendations for BRCA testing; the patient-specific recommendation improved quality-adjusted life expectancy by 37 days. The ALCHEMIST system enables guideline developers to publish both a guideline and an interactive decision model on the web. 
This web-based tool enables guideline developers to tailor guidelines systematically, to update guidelines easily, and to make the underlying evidence and analysis transparent for users.

  9. EPA Releases Update to Popular School Integrated Pest Management Publication

    EPA Pesticide Factsheets

    An updated version reflects recent innovations in school IPM, provides links to new information, and has been redesigned into an easily printable format. It provides an overview of IPM and details the steps a school can follow to establish an IPM program.

  10. Updating Risk Prediction Tools: A Case Study in Prostate Cancer

    PubMed Central

    Ankerst, Donna P.; Koniarski, Tim; Liang, Yuanyuan; Leach, Robin J.; Feng, Ziding; Sanda, Martin G.; Partin, Alan W.; Chan, Daniel W; Kagan, Jacob; Sokoll, Lori; Wei, John T; Thompson, Ian M.

    2013-01-01

    Online risk prediction tools for common cancers are now easily accessible and widely used by patients and doctors for informed decision-making concerning screening and diagnosis. A practical problem is that as cancer research moves forward and new biomarkers and risk factors are discovered, there is a need to update the risk algorithms to include them. Typically, the new markers and risk factors cannot be retrospectively measured on the same study participants used to develop the original prediction tool, necessitating the merging of a separate study of different participants, which may be much smaller in sample size and of a different design. Validation of the updated tool on a third independent data set is warranted before the updated tool can go online. This article reports on the application of Bayes rule for updating risk prediction tools to include a set of biomarkers measured in an external study to the original study used to develop the risk prediction tool. The procedure is illustrated in the context of updating the online Prostate Cancer Prevention Trial Risk Calculator to incorporate the new markers %freePSA and [−2]proPSA measured on an external case-control study performed in Texas, U.S. Recent state-of-the-art methods in validation of risk prediction tools and evaluation of the improvement of updated to original tools are implemented using an external validation set provided by the U.S. Early Detection Research Network. PMID:22095849
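
    In odds form, the Bayes-rule update the article describes can be sketched in a few lines: the original calculator supplies a prior risk, and a new marker measured in the external study enters through a likelihood ratio. The single-marker, odds-form simplification below is for illustration only, not the calculator's actual model.

```python
def updated_risk(prior_risk, likelihood_ratio):
    """Fold one new marker into an existing risk estimate via Bayes rule:
    posterior odds = prior odds * likelihood ratio of the marker result."""
    prior_odds = prior_risk / (1.0 - prior_risk)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1.0 + posterior_odds)
```

    For example, a 20% prior risk combined with a marker result carrying a likelihood ratio of 4 yields a 50% posterior risk, while an uninformative result (LR = 1) leaves the risk unchanged.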

  11. Updating risk prediction tools: a case study in prostate cancer.

    PubMed

    Ankerst, Donna P; Koniarski, Tim; Liang, Yuanyuan; Leach, Robin J; Feng, Ziding; Sanda, Martin G; Partin, Alan W; Chan, Daniel W; Kagan, Jacob; Sokoll, Lori; Wei, John T; Thompson, Ian M

    2012-01-01

    Online risk prediction tools for common cancers are now easily accessible and widely used by patients and doctors for informed decision-making concerning screening and diagnosis. A practical problem is that as cancer research moves forward and new biomarkers and risk factors are discovered, there is a need to update the risk algorithms to include them. Typically, the new markers and risk factors cannot be retrospectively measured on the same study participants used to develop the original prediction tool, necessitating the merging of a separate study of different participants, which may be much smaller in sample size and of a different design. Validation of the updated tool on a third independent data set is warranted before the updated tool can go online. This article reports on the application of Bayes rule for updating risk prediction tools to include a set of biomarkers measured in an external study to the original study used to develop the risk prediction tool. The procedure is illustrated in the context of updating the online Prostate Cancer Prevention Trial Risk Calculator to incorporate the new markers %freePSA and [-2]proPSA measured on an external case-control study performed in Texas, U.S. Recent state-of-the-art methods in validation of risk prediction tools and evaluation of the improvement of updated to original tools are implemented using an external validation set provided by the U.S. Early Detection Research Network.

  12. CCD Analog Programmable Microprocessor (APUP) Study

    DTIC Science & Technology

    1980-08-01

    so important to electronic warfare support measures applications. A comprehensive imager development program is currently being formulated to... comprehensive treatment of this subject could easily fill a book, as it has at least twice in the past (1,2). These textbooks (3) are periodically updated... comprehensive treatment of circuit modeling, the resultant noise predictions are included here as expected values in further describing critical

  13. A microprocessor-based cardiotachometer

    NASA Technical Reports Server (NTRS)

    Donaldson, J. A.; Crosier, W. G.

    1979-01-01

    The development of a highly accurate and reliable cardiotachometer for measuring the heart rate of test subjects is discussed. It measures heart rate over the range of 30 to 250 beats/minute and gives instantaneous (beat-to-beat) updates of the system output, so that occasional noise artifacts or ectopic beats can be identified more easily; occasional missed beats caused by switching ECG leads, however, do not cause a change in the output. The cardiotachometer uses an improved analog filter and R-wave detector and an Intel 8080A microprocessor to handle all of the necessary logic and arithmetic. By using the microprocessor, future hardware modifications could easily be made if functional changes were needed.
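
    The beat-to-beat behavior described above (instantaneous updates, with a hold across suspected missed beats) can be sketched as follows; the 1.8x missed-beat threshold and the function names are illustrative assumptions, not the instrument's actual logic.

```python
def beat_rate(rr_ms):
    """Instantaneous heart rate in beats/minute from one R-R interval."""
    return 60000.0 / rr_ms

def output_rates(rr_intervals_ms, miss_factor=1.8):
    """Beat-to-beat output: a suspected missed beat (an R-R interval far
    longer than its predecessor, e.g. from switching ECG leads) holds the
    previous output instead of briefly halving the displayed rate."""
    out = []
    prev_rr = None
    for rr in rr_intervals_ms:
        if prev_rr is not None and rr > miss_factor * prev_rr and out:
            out.append(out[-1])          # hold output across the missed beat
        else:
            out.append(beat_rate(rr))
        prev_rr = rr
    return out
```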

  14. An Updated Nuclear Equation of State for Neutron Stars and Supernova Simulations

    NASA Astrophysics Data System (ADS)

    Meixner, M. A.; Mathews, G. J.; Dalhed, H. E.; Lan, N. Q.

    2011-10-01

    We present an updated and improved equation of state (EoS) based upon the framework originally developed by Bowers & Wilson. The details of the EoS and its improvements are described, along with a description of how to access this EoS for numerical simulations. Among the improvements are an updated compressibility based upon recent measurements, the possibility of the formation of proton-excess (Ye > 0.5) material, and an improved treatment of nuclear statistical equilibrium and the transition to pasta nuclei as the density approaches nuclear matter density. The possibility of a QCD chiral phase transition is also included at densities above nuclear matter density. We show comparisons of this EoS with the other two publicly available equations of state used in supernova collapse simulations. The advantage of the present EoS is that it is easily amenable to phenomenological parameterization to fit observed explosion properties and to accommodate new physical parameters.

  15. SIDECACHE: Information access, management and dissemination framework for web services.

    PubMed

    Doderer, Mark S; Burkhardt, Cory; Robbins, Kay A

    2011-06-14

    Many bioinformatics algorithms and data sets are deployed using web services so that the results can be explored via the Internet and easily integrated into other tools and services. These services often include data from other sites that is accessed either dynamically or through file downloads. Developers of these services face several problems because of the dynamic nature of the information from the upstream services. Many publicly available repositories of bioinformatics data frequently update their information. When such an update occurs, the developers of the downstream service may also need to update. For file downloads, this process is typically performed manually, followed by a web service restart. Requests for information obtained by dynamic access of upstream sources are sometimes subject to rate restrictions. SideCache provides a framework for deploying web services that integrate information extracted from other databases and from web sources that are periodically updated. This situation occurs frequently in biotechnology, where new information is being continuously generated and the latest information is important. SideCache provides several types of services, including proxy access and rate control, local caching, and automatic web service updating. We have used the SideCache framework to automate the deployment and updating of a number of bioinformatics web services and tools that extract information from remote primary sources such as NCBI, NCIBI, and Ensembl. The SideCache framework has also been used to share research results through the use of a SideCache-derived web service.
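
    The local-caching and periodic-refresh behavior can be sketched with a small time-to-live cache; the class name, the fetch callback, and the injectable clock are illustrative assumptions, not SideCache's actual API.

```python
import time

class TTLCache:
    """Serve cached upstream data while fresh; refetch once it expires."""

    def __init__(self, fetch, ttl_seconds, clock=time.monotonic):
        self.fetch = fetch            # callable that hits the upstream source
        self.ttl = ttl_seconds
        self.clock = clock            # injectable for testing
        self._store = {}              # key -> (expires_at, value)

    def get(self, key):
        now = self.clock()
        hit = self._store.get(key)
        if hit is not None and hit[0] > now:
            return hit[1]             # fresh: no upstream call, no rate cost
        value = self.fetch(key)       # stale or missing: refresh upstream
        self._store[key] = (now + self.ttl, value)
        return value
```

    Rate control and automatic service updating layer on the same idea: the cache decides when the upstream source is actually contacted.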

  16. A computational procedure for multibody systems including flexible beam dynamics

    NASA Technical Reports Server (NTRS)

    Downer, J. D.; Park, K. C.; Chiou, J. C.

    1990-01-01

    A computational procedure suitable for the solution of equations of motions for flexible multibody systems has been developed. The flexible beams are modeled using a fully nonlinear theory which accounts for both finite rotations and large deformations. The present formulation incorporates physical measures of conjugate Cauchy stress and covariant strain increments. As a consequence, the beam model can easily be interfaced with real-time strain measurements and feedback control systems. A distinct feature of the present work is the computational preservation of total energy for undamped systems; this is obtained via an objective strain increment/stress update procedure combined with an energy-conserving time integration algorithm which contains an accurate update of angular orientations. The procedure is demonstrated via several example problems.

  17. Integration of Narrative Processing, Data Fusion, and Database Updating Techniques in an Automated System.

    DTIC Science & Technology

    1981-10-29

    are implemented, respectively, in the files "W-Update," "W-combine," and "W-Copy," listed in the appendix. The appendix begins with a typescript of an... the typescript) and the copying process (steps 45 and 46) are shown as human actions in the typescript, but can be performed easily by a "master... for Natural Language, M. Marcus, MIT Press, 1980. APPENDIX: DATABASE UPDATING EXPERIMENT. CONTENTS: Typescript of an experiment in Rosie

  18. Simple colonoscopy reporting system checking the detection rate of colon polyps.

    PubMed

    Kim, Jae Hyun; Choi, Youn Jung; Kwon, Hye Jung; Park, Seun Ja; Park, Moo In; Moon, Won; Kim, Sung Eun

    2015-08-21

    To present a simple colonoscopy reporting system with which the detection rate of colon polyps can be checked easily. A simple colonoscopy reporting system, Kosin Gastroenterology (the KG quality reporting system), was developed. The polyp detection rate (PDR), adenoma detection rate (ADR), serrated polyp detection rate (SDR), and advanced adenoma detection rate (AADR) are easily calculated using this system. In our gastroenterology center, the PDR, ADR, SDR, and AADR results for each gastroenterologist were updated every month. Between June 2014, when the program was started, and December 2014, the overall PDR and ADR in our center were 62.5% and 41.4%, respectively, and the overall SDR and AADR were 7.5% and 12.1%, respectively. We envision that the KG quality reporting system can be applied to develop a comprehensive system to check colon polyp detection rates in other gastroenterology centers.
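
    The four rates are simple proportions over a period's examinations, which is what makes a monthly per-gastroenterologist report easy to automate. A minimal sketch, with record fields that are illustrative rather than the KG system's actual schema:

```python
def detection_rates(exams):
    """Percentage of colonoscopies in which each finding was detected.

    exams: list of dicts with boolean flags per examination (illustrative
    field names, not the KG quality reporting system's schema).
    """
    n = len(exams)
    def rate(flag):
        return 100.0 * sum(1 for e in exams if e[flag]) / n
    return {"PDR": rate("polyp"), "ADR": rate("adenoma"),
            "SDR": rate("serrated"), "AADR": rate("advanced_adenoma")}
```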

  19. GeoRePORT Input Spreadsheet

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    St. Onge, Melinda

    The Geothermal Resource Portfolio Optimization and Reporting Tool (GeoRePORT) was developed as a way to distill large amounts of geothermal project data into an objective, reportable data set that can be used to communicate with experts and non-experts. GeoRePORT summarizes (1) resource grade and certainty and (2) project readiness. This Excel file allows users to easily navigate through the resource grade attributes, using drop-down menus to pick grades and project readiness, and then easily print and share the summary with others. This spreadsheet is the first draft, for which we are soliciting expert feedback. The spreadsheet will be updated based on this feedback to increase usability of the tool. If you have any comments, please feel free to contact us.

  20. An Algorithm Using Twelve Properties of Antibiotics to Find the Recommended Antibiotics, as in CPGs.

    PubMed

    Tsopra, R; Venot, A; Duclos, C

    2014-01-01

    Clinical Decision Support Systems (CDSS) incorporating justifications, updating and adjustable recommendations can considerably improve the quality of healthcare. We propose a new approach to the design of CDSS for empiric antibiotic prescription, based on implementation of the deeper medical reasoning used by experts in the development of clinical practice guidelines (CPGs), to deduce the recommended antibiotics. We investigated two methods ("exclusion" versus "scoring") for reproducing this reasoning based on antibiotic properties. The "exclusion" method reproduced expert reasoning more accurately, retrieving the full list of recommended antibiotics for almost all clinical situations. This approach has several advantages: (i) it provides convincing explanations for physicians; (ii) updating could easily be incorporated into the CDSS; (iii) it can provide recommendations for clinical situations missing from CPGs.
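
    The "exclusion" method lends itself to a transparent implementation: start from every antibiotic and drop those whose properties conflict with the clinical situation, so each exclusion doubles as an explanation. The property names and rules below are illustrative; the paper works from twelve actual antibiotic properties.

```python
def recommend(antibiotics, situation):
    """Sketch of the 'exclusion' method: filter the full antibiotic list
    by successive property checks (illustrative properties, not the
    paper's twelve)."""
    candidates = [a for a in antibiotics
                  if situation["pathogen"] in a["spectrum"]]
    if situation.get("pregnancy"):
        candidates = [a for a in candidates if a["safe_in_pregnancy"]]
    if situation.get("allergy"):
        candidates = [a for a in candidates
                      if a["class"] != situation["allergy"]]
    return [a["name"] for a in candidates]
```

    Because every dropped candidate corresponds to a named property check, the same trace can be shown to the physician as justification, and a guideline update becomes a change to a property table rather than to the recommendation logic.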

  1. The Progress of CDAS

    NASA Technical Reports Server (NTRS)

    Zhu, Renjie; Zhang, Xiuzhong; Wei, Wenren; Xiang, Ying; Li, Bin; Wu, Yajun; Shu, Fengchun; Luo, Jintao; Wang, Jinqing; Xue, Zhuhe

    2010-01-01

    The Chinese Data Acquisition System (CDAS), based on FPGA techniques, has been developed in China to replace the traditional analog baseband converter. CDAS is a high-speed data acquisition and processing system with a 1024 Msps sample rate for 512 MHz-bandwidth input and up to 16 channels (both USB and LSB) of VSI-compatible output. The instrument is a flexible environment which can be updated easily. In this paper, the construction, the performance, the experiment results, and the future plans of CDAS are reported.

  2. The Master Archive Collection Inventory (MACI)

    NASA Astrophysics Data System (ADS)

    Lief, C. J.; Arnfield, J.; Sprain, M.

    2014-12-01

    The Master Archive Collection Inventory (MACI) project at the NOAA National Climatic Data Center (NCDC) is an effort to re-inventory all digital holdings, streamline data set and product titles, and update documentation to the ISO 19115-2 discovery level. Subject Matter Experts (SMEs) are being identified for each of the holdings and will be responsible for creating and maintaining metadata records. New user-friendly tools are available for the SMEs to easily create and update this documentation. Updated metadata will be available for retrieval by other aggregators and discovery tools, increasing the usability of NCDC data and products.

  3. Current Development at the Southern California Earthquake Data Center (SCEDC)

    NASA Astrophysics Data System (ADS)

    Appel, V. L.; Clayton, R. W.

    2005-12-01

    Over the past year, the SCEDC completed or is near completion of three featured projects: Station Information System (SIS) Development: The SIS will provide users with an interface into complete and accurate station metadata for all current and historic data at the SCEDC. The goal of this project is to develop a system that can interact with a single database source to enter, update and retrieve station metadata easily and efficiently. The system will provide accurate station/channel information for active stations to the SCSN real-time processing system, as well as station/channel information for stations that have parametric data at the SCEDC, i.e., for users retrieving data via STP. Additionally, the SIS will supply information required to generate dataless SEED and COSMOS V0 volumes and allow stations to be added to the system with a minimum, but incomplete, set of information using predefined defaults that can be easily updated as more information becomes available. Finally, the system will facilitate statewide metadata exchange for both real-time processing and provide a common approach to CISN historic station metadata. Moment Tensor Solutions: The SCEDC is currently archiving and delivering Moment Magnitudes and Moment Tensor Solutions (MTS) produced by the SCSN in real time, and post-processing solutions for events spanning back to 1999. The automatic MTS runs on all local events with magnitudes > 3.0, and all regional events > 3.5. The distributed solution automatically creates links from all USGS Simpson Maps to a text e-mail summary solution, creates a .gif image of the solution, and updates the moment tensor database tables at the SCEDC. Searchable Scanned Waveforms Site: The Caltech Seismological Lab has made available 12,223 scanned images of pre-digital analog recordings of major earthquakes recorded in Southern California between 1962 and 1992 at http://www.data.scec.org/research/scans/.
The SCEDC has developed a searchable web interface that allows users to search the available files, select multiple files for download and then retrieve a zipped file containing the results. Scanned images of paper records for M>3.5 southern California earthquakes and several significant teleseisms are available for download via the SCEDC through this search tool.

  4. An Algorithm Using Twelve Properties of Antibiotics to Find the Recommended Antibiotics, as in CPGs

    PubMed Central

    Tsopra, R.; Venot, A.; Duclos, C.

    2014-01-01

    Background Clinical Decision Support Systems (CDSS) incorporating justifications, updating and adjustable recommendations can considerably improve the quality of healthcare. We propose a new approach to the design of CDSS for empiric antibiotic prescription, based on implementation of the deeper medical reasoning used by experts in the development of clinical practice guidelines (CPGs), to deduce the recommended antibiotics. Methods We investigated two methods (“exclusion” versus “scoring”) for reproducing this reasoning based on antibiotic properties. Results The “exclusion” method reproduced expert reasoning more accurately, retrieving the full list of recommended antibiotics for almost all clinical situations. Discussion This approach has several advantages: (i) it provides convincing explanations for physicians; (ii) updating could easily be incorporated into the CDSS; (iii) it can provide recommendations for clinical situations missing from CPGs. PMID:25954422

  5. Real-time, interactive, visually updated simulator system for telepresence

    NASA Technical Reports Server (NTRS)

    Schebor, Frederick S.; Turney, Jerry L.; Marzwell, Neville I.

    1991-01-01

    Time delays and limited sensory feedback of remote telerobotic systems tend to disorient teleoperators and dramatically decrease the operator's performance. To remove the effects of time delays, key components were designed and developed of a prototype forward simulation subsystem, the Global-Local Environment Telerobotic Simulator (GLETS) that buffers the operator from the remote task. GLETS totally immerses an operator in a real-time, interactive, simulated, visually updated artificial environment of the remote telerobotic site. Using GLETS, the operator will, in effect, enter into a telerobotic virtual reality and can easily form a gestalt of the virtual 'local site' that matches the operator's normal interactions with the remote site. In addition to use in space based telerobotics, GLETS, due to its extendable architecture, can also be used in other teleoperational environments such as toxic material handling, construction, and undersea exploration.

  6. FACETS: using open data to measure community social determinants of health.

    PubMed

    Cantor, Michael N; Chandras, Rajan; Pulgarin, Claudia

    2018-04-01

    To develop a dataset based on open data sources reflective of community-level social determinants of health (SDH). We created FACETS (Factors Affecting Communities and Enabling Targeted Services), an architecture that incorporates open data related to SDH into a single dataset mapped at the census-tract level for New York City. FACETS (https://github.com/mcantor2/FACETS) can be easily used to map individual addresses to their census-tract-level SDH. This dataset facilitates analysis across different determinants that are often not easily accessible. Wider access to open data from government agencies at the local, state, and national level would facilitate the aggregation and analysis of community-level determinants. Timeliness of updates to federal non-census data sources may limit their usefulness. FACETS is an important first step in standardizing and compiling SDH-related data in an open architecture that can give context to a patient's condition and enable better decision-making when developing a plan of care.

  7. Liquid belt radiator design study

    NASA Technical Reports Server (NTRS)

    Teagan, W. P.; Fitzgerald, K. F.

    1986-01-01

    The Liquid Belt Radiator (LBR) is an advanced concept developed to meet the needs of anticipated future space missions. A previous study documented the advantages of this concept as a lightweight, easily deployable alternative to present-day space heat rejection systems. The technical efforts associated with this study concentrate on refining the concept of the LBR as well as examining the issues of belt dynamics and potential application of the LBR to intermediate- and high-temperature heat rejection applications. A low-temperature point design developed in previous work is updated assuming the use of diffusion pump oil, Santovac-6, as the heat transfer medium. Additional analytical and design effort is directed toward determining the impact of interface heat exchanger, fluid bath sealing, and belt drive mechanism designs on system performance and mass. The updated design supports the earlier result by indicating a significant reduction in specific system mass as compared to heat pipe or pumped fluid radiator concepts currently under consideration (1.3 kg/sq m versus 5 kg/sq m).

  8. Update: cholera--Western Hemisphere, and recommendations for treatment of cholera.

    PubMed

    1991-08-16

    Epidemic cholera appeared in Peru in January 1991 and subsequently spread to Ecuador, Colombia, Chile, Brazil, Mexico, and Guatemala. Cholera can be a severe, life-threatening illness but is highly preventable and easily treated; however, few health-care practitioners in the United States have experience identifying and treating cholera. This report provides an update on cholera in the Western Hemisphere and provides recommendations on the clinical diagnosis and treatment of cholera in the United States.

  9. Quantifying radionuclide signatures from a γ-γ coincidence system.

    PubMed

    Britton, Richard; Jackson, Mark J; Davies, Ashley V

    2015-11-01

    A method for quantifying gamma coincidence signatures has been developed, and tested in conjunction with a high-efficiency multi-detector system to quickly identify trace amounts of radioactive material. The γ-γ system utilises fully digital electronics and list-mode acquisition to time-stamp each event, allowing coincidence matrices to be easily produced alongside typical 'singles' spectra. To quantify the coincidence signatures a software package has been developed to calculate efficiency and cascade summing corrected branching ratios. This utilises ENSDF records as an input, and can be fully automated, allowing the user to quickly and easily create/update a coincidence library that contains all possible γ and conversion electron cascades, associated cascade emission probabilities, and true-coincidence summing corrected γ cascade detection probabilities. It is also fully searchable by energy, nuclide, coincidence pair, γ multiplicity, cascade probability and half-life of the cascade. The probabilities calculated were tested using measurements performed on the γ-γ system, and found to provide accurate results for the nuclides investigated. Given the flexibility of the method (it relies only on evaluated nuclear data and accurate efficiency characterisations), the software can now be utilised for a variety of systems, quickly and easily calculating coincidence signature probabilities. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.
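
The core quantity such a library tabulates can be shown with a toy calculation: the detection probability of a two-gamma coincidence is the cascade emission probability times the product of the detector efficiencies at the two energies. The efficiency curve and cascade below are invented; a real system would use its fitted efficiency characterisation and ENSDF-derived cascade data.

```python
# Toy true-coincidence detection probability for a two-gamma cascade.
# Energies, the efficiency curve, and the cascade probability are
# illustrative, not values from the system described above.

def peak_efficiency(energy_kev):
    """Hypothetical full-energy-peak efficiency curve; a real system
    would use a fitted efficiency characterisation."""
    return 0.2 * (1000.0 / (energy_kev + 1000.0))

def coincidence_probability(cascade_prob, e1_kev, e2_kev):
    """P(detect both gammas in coincidence) = cascade emission
    probability times the product of the two peak efficiencies."""
    return cascade_prob * peak_efficiency(e1_kev) * peak_efficiency(e2_kev)

# 60Co-like cascade: 1173 keV followed by 1332 keV, cascade probability ~1.
p = coincidence_probability(1.0, 1173.0, 1332.0)
```

A full treatment would also apply the true-coincidence summing corrections mentioned in the abstract; those are omitted here.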

  10. Object-oriented fault tree models applied to system diagnosis

    NASA Technical Reports Server (NTRS)

    Iverson, David L.; Patterson-Hine, F. A.

    1990-01-01

    When a diagnosis system is used in a dynamic environment, such as the distributed computer system planned for use on Space Station Freedom, it must execute quickly and its knowledge base must be easily updated. Representing system knowledge as object-oriented augmented fault trees provides both features. The diagnosis system described here is based on the failure cause identification process of the diagnostic system described by Narayanan and Viswanadham. Their system has been enhanced in this implementation by replacing the knowledge base of if-then rules with an object-oriented fault tree representation. This allows the system to perform its task much faster and facilitates dynamic updating of the knowledge base in a changing diagnosis environment. Accessing the information contained in the objects is more efficient than performing a lookup operation on an indexed rule base. Additionally, the object-oriented fault trees can be easily updated to represent current system status. This paper describes the fault tree representation, the diagnosis algorithm extensions, and an example application of this system. Comparisons are made between the object-oriented fault tree knowledge structure solution and one implementation of a rule-based solution. Plans for future work on this system are also discussed.
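
A minimal sketch of the idea, assuming a simple AND/OR gate structure; the node names and traversal details are illustrative, not the authors' implementation:

```python
# Sketch of an object-oriented fault-tree node: each gate object holds
# its children, so diagnosis traverses objects directly instead of
# scanning an if-then rule base, and status fields are easily updated.

class FaultTreeNode:
    def __init__(self, name, gate=None, children=None):
        self.name = name
        self.gate = gate              # "AND", "OR", or None for a leaf
        self.children = children or []
        self.failed = False           # current status, updated dynamically

    def evaluate(self):
        """Propagate leaf status up through the gates."""
        if not self.children:
            return self.failed
        results = [c.evaluate() for c in self.children]
        return all(results) if self.gate == "AND" else any(results)

    def failure_causes(self):
        """Collect the failed leaves that explain a top-level failure."""
        if not self.children:
            return [self.name] if self.failed else []
        return [cause for c in self.children for cause in c.failure_causes()]

# Hypothetical example: top event fails if either subsystem fails.
sensor = FaultTreeNode("sensor")
bus = FaultTreeNode("data_bus")
top = FaultTreeNode("telemetry_loss", gate="OR", children=[sensor, bus])
bus.failed = True                     # dynamic status update
```

Because status lives on the objects themselves, updating the knowledge base amounts to flipping fields or re-parenting nodes, which matches the paper's argument for speed over an indexed rule base.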

  11. Definition of Tire Properties Required for Landing System Analysis

    NASA Technical Reports Server (NTRS)

    Clark, S. K.; Dodge, R. N.; Luchini, J. R.

    1978-01-01

    The data bank constructed provided two basic advantages for the user of aircraft tire information. First, computerization of the data bank allowed mechanical property data to be stored, corrected, updated, and revised quickly and easily as more reliable tests and measurements were carried out. Secondly, the format of the book which can be printed from the computerized data bank can be easily adjusted to suit the needs of the users without the great expense normally associated with reprinting and editing books set by ordinary typography.

  12. MedlinePlus FAQ: What's the difference between MedlinePlus and MedlinePlus Connect?

    MedlinePlus

    ... MedlinePlus Connect is a free service that allows electronic health record (EHR) systems to easily link users to MedlinePlus, ...

  13. Model documentation for relations between continuous real-time and discrete water-quality constituents in Cheney Reservoir near Cheney, Kansas, 2001--2009

    USGS Publications Warehouse

    Stone, Mandy L.; Graham, Jennifer L.; Gatotho, Jackline W.

    2013-01-01

    Cheney Reservoir, located in south-central Kansas, is one of the primary water supplies for the city of Wichita, Kansas. The U.S. Geological Survey has operated a continuous real-time water-quality monitoring station in Cheney Reservoir since 2001; continuously measured physicochemical properties include specific conductance, pH, water temperature, dissolved oxygen, turbidity, fluorescence (wavelength range 650 to 700 nanometers; estimate of total chlorophyll), and reservoir elevation. Discrete water-quality samples were collected during 2001 through 2009 and analyzed for sediment, nutrients, taste-and-odor compounds, cyanotoxins, phytoplankton community composition, actinomycetes bacteria, and other water-quality measures. Regression models were developed to establish relations between discretely sampled constituent concentrations and continuously measured physicochemical properties to compute concentrations of constituents that are not easily measured in real time. The water-quality information in this report is important to the city of Wichita because it allows quantification and characterization of potential constituents of concern in Cheney Reservoir. This report updates linear regression models published in 2006 that were based on data collected during 2001 through 2003. The update uses discrete and continuous data collected during May 2001 through December 2009. Updated models to compute dissolved solids, sodium, chloride, and suspended solids were similar to previously published models. However, several other updated models changed substantially from previously published models. In addition to updating relations that were previously developed, models also were developed for four new constituents, including magnesium, dissolved phosphorus, actinomycetes bacteria, and the cyanotoxin microcystin. 
In addition, a conversion factor of 0.74 was established to convert the Yellow Springs Instruments (YSI) model 6026 turbidity sensor measurements to the newer YSI model 6136 sensor at the Cheney Reservoir site. Because a high percentage of geosmin and microcystin data were below analytical detection thresholds (censored data), multiple logistic regression was used to develop models that best explained the probability of geosmin and microcystin concentrations exceeding relevant thresholds. The geosmin and microcystin models are particularly important because geosmin is a taste-and-odor compound and microcystin is a cyanotoxin.
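
As a small illustration of how such relations are applied, the sketch below uses the stated 0.74 turbidity conversion factor together with a hypothetical linear surrogate model; the regression coefficients shown are invented, not the report's.

```python
# Illustrative use of the report's sensor conversion factor, plus a
# hypothetical linear surrogate model of the kind the report develops.
# The slope and intercept below are invented for illustration.

YSI_6026_TO_6136 = 0.74   # stated factor: model 6026 -> model 6136 turbidity

def convert_turbidity(fnu_6026):
    """Express a YSI model 6026 turbidity reading in model 6136 terms."""
    return YSI_6026_TO_6136 * fnu_6026

def suspended_solids(turbidity_6136, slope=1.2, intercept=3.0):
    """Hypothetical surrogate regression: concentration from turbidity."""
    return slope * turbidity_6136 + intercept

tss = suspended_solids(convert_turbidity(50.0))
```

The censored-data constituents (geosmin, microcystin) would instead use logistic models that return an exceedance probability rather than a concentration.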

  14. How to Set Up an Electronic Bulletin Board.

    ERIC Educational Resources Information Center

    Lukas, Terrence

    1981-01-01

    Describes a versatile, inexpensive information system using microcomputers and television sets which enables Indiana University Northwest to relay information for students to different sites simultaneously and to update information quickly and easily. Illustrates how to set up the hardware, discusses programing, and includes the actual program…

  15. Novel Targeted Therapies for Inflammatory Bowel Disease.

    PubMed

    Coskun, Mehmet; Vermeire, Severine; Nielsen, Ole Haagen

    2017-02-01

    Our growing understanding of the immunopathogenesis of inflammatory bowel disease (IBD) has opened new avenues for developing targeted therapies. These advances in treatment options targeting different mechanisms of action offer new hope for personalized management. In this review we highlight emerging novel and easily administered therapeutics that may be viable candidates for the management of IBD, such as antibodies against interleukin 6 (IL-6) and IL-12/23, small molecules including Janus kinase inhibitors, antisense oligonucleotide against SMAD7 mRNA, and inhibitors of leukocyte trafficking to intestinal sites of inflammation (e.g., sphingosine 1-phosphate receptor modulators). We also provide an update on the current status in clinical development of these new classes of therapeutics. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. The Environment-Power System Analysis Tool development program. [for spacecraft power supplies

    NASA Technical Reports Server (NTRS)

    Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Wilcox, Katherine G.; Stevens, N. John; Putnam, Rand M.; Roche, James C.

    1989-01-01

    The Environment Power System Analysis Tool (EPSAT) is being developed to provide engineers with the ability to assess the effects of a broad range of environmental interactions on space power systems. A unique user-interface-data-dictionary code architecture oversees a collection of existing and future environmental modeling codes (e.g., neutral density) and physical interaction models (e.g., sheath ionization). The user-interface presents the engineer with tables, graphs, and plots which, under supervision of the data dictionary, are automatically updated in response to parameter change. EPSAT thus provides the engineer with a comprehensive and responsive environmental assessment tool and the scientist with a framework into which new environmental or physical models can be easily incorporated.

  17. Food for Thought: Expanding School Breakfast to NJ Students. [Updated

    ERIC Educational Resources Information Center

    Advocates for Children of New Jersey, 2014

    2014-01-01

    Often, school districts are reluctant to adopt innovative approaches to serving children breakfast in school because of logistical concerns that are easily overcome. Districts that adopt these more innovative approaches report significant increases in participation rates and improvement in student behavior and performance. This report provides…

  18. Think Visual

    ERIC Educational Resources Information Center

    Thomas, Lisa Carlucci

    2012-01-01

    Photographs tell a story and illustrate an experience more profoundly than words alone. Real-time, text-based communication is an increasingly normal part of daily life as mobile devices and social networks proliferate. Yet, in the steady stream of tweets, comments, status updates, notifications, and e-mail, the details are easily lost in the…

  19. Mechanization of Library Procedures in the Medium-sized Medical Library: IX. Holding Statements in PHILSOM: a Study of their Activity *

    PubMed Central

    Beckwith, Helen K.

    1970-01-01

    A study was made of the serial holding statements in PHILSOM over a six-month period, in order to determine the desirability of printing the complete serial holding statements monthly. Attention was given to the frequency of internal and update changes in both active and dead entries. The results indicate that while sufficient activity is observed in active serial entries to warrant their monthly updating, dead serial entries remain constant over this period. This indicates that a large group of PHILSOM entries can be easily identified and isolated, facilitating division and independent updating of the resultant lists. The desirability of such a division, however, must also take into consideration the user's ease in handling such a segmented listing. PMID:5439902

  20. Photon underproduction crisis and the redshift evolution of escape fraction of hydrogen ionizing photons from galaxies

    NASA Astrophysics Data System (ADS)

    Khaire, Vikram; Srianand, Raghunathan

    2016-01-01

    In the standard picture, the only sources of the cosmic UV background are quasars and star-forming galaxies. The hydrogen ionizing emissivity from galaxies depends on a parameter known as the escape fraction (fesc), the ratio of hydrogen ionizing photons escaping from galaxies to the total produced by their stellar populations. Using available multi-wavelength and multi-epoch galaxy luminosity function measurements, we update the galaxy emissivity by estimating a self-consistent combination of the star formation rate density and dust attenuation. Using recent quasar luminosity function measurements, we present an updated hydrogen ionizing emissivity from quasars, which shows a factor of ~2 increase compared with previous estimates at z<2. We use these in a cosmological radiative transfer code developed by us to generate the UV background and show that the recently inferred high values of hydrogen photoionization rates at low redshifts can be easily obtained with reasonable values of fesc. This resolves the 'photon underproduction crisis' and shows that there is no need to invoke non-standard sources of the UV background such as decaying dark matter particles. We will present the implications of this updated quasar and galaxy emissivity on the values of fesc at high redshifts and on cosmic reionization. We will also present the effect of the updated UV background on the inferred properties of the intergalactic medium, especially the Lyman alpha forest and metal line absorption systems.

  1. Simplified Metadata Curation via the Metadata Management Tool

    NASA Astrophysics Data System (ADS)

    Shum, D.; Pilone, D.

    2015-12-01

    The Metadata Management Tool (MMT) is the newest capability developed as part of NASA Earth Observing System Data and Information System's (EOSDIS) efforts to simplify metadata creation and improve metadata quality. The MMT was developed via an agile methodology, taking into account inputs from GCMD's science coordinators and other end-users. In its initial release, the MMT uses the Unified Metadata Model for Collections (UMM-C) to allow metadata providers to easily create and update collection records in the ISO-19115 format. Through a simplified UI experience, metadata curators can create and edit collections without full knowledge of the NASA Best Practices implementation of ISO-19115 format, while still generating compliant metadata. More experienced users are also able to access raw metadata to build more complex records as needed. In future releases, the MMT will build upon recent work done in the community to assess metadata quality and compliance with a variety of standards through application of metadata rubrics. The tool will provide users with clear guidance as to how to easily change their metadata in order to improve their quality and compliance. Through these features, the MMT allows data providers to create and maintain compliant and high quality metadata in a short amount of time.

  2. Simultaneous Assimilation of AMSR-E Brightness Temperature and MODIS LST to Improve Soil Moisture with Dual Ensemble Kalman Smoother

    NASA Astrophysics Data System (ADS)

    Huang, Chunlin; Chen, Weijin; Wang, Weizhen; Gu, Juan

    2017-04-01

    Uncertainties in model parameters can easily cause systematic differences between model states and observations from the ground or satellites, which significantly affect the accuracy of soil moisture estimation in data assimilation systems. In this paper, a novel soil moisture assimilation scheme is developed to simultaneously assimilate AMSR-E brightness temperature (TB) and MODIS Land Surface Temperature (LST), correcting model bias by simultaneously updating model states and parameters with a dual ensemble Kalman smoother (DEnKS). The Common Land Model (CoLM) and a Q-h Radiative Transfer Model (RTM) are adopted as the model operator and observation operator, respectively. The assimilation experiment was conducted in Naqu, on the Tibetan Plateau, from May 31 to September 27, 2011. Compared with in-situ measurements, the accuracy of soil moisture estimation is substantially improved across a variety of scales. The soil temperature updated by assimilating MODIS LST, used as input to the RTM, reduces the differences between the simulated and observed brightness temperatures, which in turn helps to improve the estimation of soil moisture and model parameters. The updated parameters differ considerably from the default values and effectively reduce the state bias of CoLM. Results demonstrate the potential of assimilating both microwave TB and MODIS LST to improve the estimation of soil moisture and related parameters. Furthermore, this study also indicates that the developed scheme is an effective soil moisture downscaling approach for coarse-scale microwave TB.
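
The essence of joint state-parameter ensemble updating can be shown in one dimension: augment each ensemble member with an uncertain parameter so that a single observation corrects both. The toy below is not the paper's CoLM/RTM dual-ensemble scheme; the linear "model", noise levels, and the observation value are invented.

```python
import random

# One-dimensional toy of joint state-parameter ensemble Kalman updating:
# each member carries an uncertain parameter, the modeled state depends
# on it, and one observation corrects both through sampled covariances.

random.seed(0)
N = 200
obs, obs_var = 0.30, 0.01 ** 2        # "observed" soil moisture and error

# Prior ensemble: biased parameter; state generated through a toy model.
params = [random.gauss(1.5, 0.2) for _ in range(N)]
states = [0.15 * p + random.gauss(0.0, 0.02) for p in params]

def mean(xs):
    return sum(xs) / len(xs)

def cov(xs, ys):
    mx, my = mean(xs), mean(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)

# Kalman gains from ensemble statistics (observation operator = identity).
s_var = cov(states, states)
k_state = s_var / (s_var + obs_var)
k_param = cov(params, states) / (s_var + obs_var)

# Update every member toward a perturbed observation; the parameter is
# corrected via its sampled covariance with the state.
for i in range(N):
    innov = (obs + random.gauss(0.0, 0.01)) - states[i]
    states[i] += k_state * innov
    params[i] += k_param * innov

posterior_state = mean(states)
posterior_param = mean(params)
```

Here the parameter is pulled toward the value consistent with the observation (roughly obs/0.15 = 2.0) because the prior ensemble encodes how the state varies with the parameter, which is the mechanism the paper exploits to correct model bias.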

  3. Developing and refining the methods for a 'one-stop shop' for research evidence about health systems.

    PubMed

    Lavis, John N; Wilson, Michael G; Moat, Kaelan A; Hammill, Amanda C; Boyko, Jennifer A; Grimshaw, Jeremy M; Flottorp, Signe

    2015-02-25

    Policymakers, stakeholders and researchers have not been able to find research evidence about health systems using an easily understood taxonomy of topics, know when they have conducted a comprehensive search of the many types of research evidence relevant to them, or rapidly identify decision-relevant information in their search results. To address these gaps, we developed an approach to building a 'one-stop shop' for research evidence about health systems. We developed a taxonomy of health system topics and iteratively refined it by drawing on existing categorization schemes and by using it to categorize progressively larger bundles of research evidence. We identified systematic reviews, systematic review protocols, and review-derived products through searches of Medline, hand searches of several databases indexing systematic reviews, hand searches of journals, and continuous scanning of listservs and websites. We developed an approach to providing 'added value' to existing content (e.g., coding systematic reviews according to the countries in which included studies were conducted) and to expanding the types of evidence eligible for inclusion (e.g., economic evaluations and health system descriptions). Lastly, we developed an approach to continuously updating the online one-stop shop in seven supported languages. The taxonomy is organized by governance, financial, and delivery arrangements and by implementation strategies. The 'one-stop shop', called Health Systems Evidence, contains a comprehensive inventory of evidence briefs, overviews of systematic reviews, systematic reviews, systematic review protocols, registered systematic review titles, economic evaluations and costing studies, health reform descriptions and health system descriptions, and many types of added-value coding. It is continuously updated and new content is regularly translated into Arabic, Chinese, English, French, Portuguese, Russian, and Spanish. 
Policymakers and stakeholders can now easily access and use a wide variety of types of research evidence about health systems to inform decision-making and advocacy. Researchers and research funding agencies can use Health Systems Evidence to identify gaps in the current stock of research evidence and domains that could benefit from primary research, systematic reviews, and review overviews.

  4. The Functioning of Autonomous Colleges

    ERIC Educational Resources Information Center

    Rao, V. Pala Prasada; Rao, Digumarti Bhaskara

    2012-01-01

    An autonomous college is partially separated from its university, as is the practice in India. The academic package becomes flexible and decision-making is internalized, so changes and updating can easily be carried out, depending on the need as reflected in feedback taken from alumni, user sectors,…

  5. The Art of Showing Art. Revised and Updated.

    ERIC Educational Resources Information Center

    Reeve, James K.

    This book focuses attention on the art objects collections and how to display them. Designing the effective placement of objects is an easily learned art. Starting with the basics, the book takes the reader step by step through a systematic method, to solutions for display problems. The first chapter covers basic concepts of display including…

  6. The interpretation of ERTS-1 imagery for soil survey of the Merida region, Spain

    NASA Technical Reports Server (NTRS)

    Hilwig, F. W.; Goosen, D. (Principal Investigator); Katsieris, D.

    1975-01-01

    The author has identified the following significant results. Major landforms and some subdivisions could be easily recognized. Water bodies, river courses, extensive areas of miocene clays, and more recent coarse textured deposits could be delineated and existing soil maps at scales up to 1:100,000 could be updated.

  7. A data storage, retrieval and analysis system for endocrine research. [for Skylab

    NASA Technical Reports Server (NTRS)

    Newton, L. E.; Johnston, D. A.

    1975-01-01

    This retrieval system builds, updates, retrieves, and performs basic statistical analyses on blood, urine, and diet parameters for the M071 and M073 Skylab and Apollo experiments. This system permits data entry from cards to build an indexed sequential file. Programs are easily modified for specialized analyses.

  8. A rapid learning and dynamic stepwise updating algorithm for flat neural networks and the application to time-series prediction.

    PubMed

    Chen, C P; Wan, J Z

    1999-01-01

    A fast learning algorithm is proposed to find the optimal weights of flat neural networks (in particular, the functional-link network). Although flat networks are used for nonlinear function approximation, they can be formulated as linear systems. Thus, the weights of the networks can be solved for easily using a linear least-squares method. This formulation makes it easier to update the weights instantly for both a newly added pattern and a newly added enhancement node. A dynamic stepwise updating algorithm is proposed to update the weights of the system on the fly. The model is tested on several time-series data sets including an infrared laser data set, a chaotic time series, a monthly flour price data set, and a nonlinear system identification problem. The simulation results are compared to existing models in which more complex architectures and more costly training are needed. The results indicate that the proposed model is very attractive for real-time processes.
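
Because a flat network is linear in its weights, least squares yields them directly, and maintaining sufficient statistics lets a new pattern update the fit cheaply. The sketch below fits a one-input linear model via running sums; it illustrates the idea only and is not the paper's stepwise matrix-update algorithm.

```python
# Sketch of the key idea: a network that is linear in its weights can be
# solved by least squares, and keeping running sums (sufficient
# statistics) lets a newly added pattern update the fit without full
# retraining. Toy one-input model, not the paper's exact algorithm.

class IncrementalLeastSquares:
    def __init__(self):
        self.n = self.sx = self.sy = self.sxx = self.sxy = 0.0

    def add_pattern(self, x, y):
        """Fold one training pattern into the running sums."""
        self.n += 1
        self.sx += x; self.sy += y
        self.sxx += x * x; self.sxy += x * y

    def weights(self):
        """Solve the 2x2 normal equations for (intercept, slope)."""
        det = self.n * self.sxx - self.sx ** 2
        w1 = (self.n * self.sxy - self.sx * self.sy) / det
        w0 = (self.sy - w1 * self.sx) / self.n
        return w0, w1

model = IncrementalLeastSquares()
for x, y in [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0)]:
    model.add_pattern(x, y)
w0, w1 = model.weights()          # data lie exactly on y = 1 + 2x

model.add_pattern(3.0, 7.0)       # new pattern: weights update cheaply
w0b, w1b = model.weights()
```

The paper's functional-link network extends this to many basis ("enhancement") nodes, where the same linearity allows an analogous stepwise update of a weight matrix.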

  9. Responding rapidly to FDA drug withdrawals: design and application of a new approach for a consumer health website.

    PubMed

    Embi, Peter J; Acharya, Prasad; McCuistion, Mark; Kishman, Charles P; Haag, Doris; Marine, Stephen

    2006-09-06

    Information about drug withdrawals may not reach patients in a timely manner, and this could result in adverse events. Increasingly, the public turns to consumer health websites for health information, but such sites may not update their content for days or weeks following important events like Food and Drug Administration (FDA) drug withdrawal actions. There is no recognized standard for how quickly consumer health websites should respond to such events, and reports addressing this issue are lacking. The objective of this study was to develop and implement an approach to enhance the efficiency with which a consumer health website (NetWellness.org) responds to FDA drug withdrawal actions. Evaluation of the current approach used by NetWellness staff to update content affected by FDA action revealed a slow process driven by the goal of performing thorough and comprehensive review and editing. To achieve our desired goal of accurately updating affected content within 24 hours of FDA action, we developed a strategy that included rapid updating of affected Web pages with warning boxes and hyperlinks to the information about the withdrawal. With the next FDA withdrawal event, that of valdecoxib (Bextra) on April 7, 2005, we applied this new approach, observed the time and resource requirements, and monitored the rate at which consumers viewed the updated information to gauge its potential impact. Application of the new approach allowed one person to modify the affected Web pages in less than 1 hour and within 18 hours of the FDA announcement. Using the old strategy, response to a similar event, the withdrawal of rofecoxib (Vioxx) 6 months earlier, had taken over 3 weeks and the efforts of several personnel. Updated valdecoxib content received 188 hits within the first month and 4285 hits within 1 year. Rapid updating of a consumer health website's content in response to an FDA drug withdrawal event was easily accomplished by applying the approach described. 
This allowed consumers to view accurate information regarding the withdrawn drug much sooner than would otherwise have been the case. Given that consumers increasingly turn to websites for their health information, adoption of a rapid response standard for important health events like FDA drug withdrawals should be considered by the consumer health informatics community.

  10. Evaluating an approach to improving the adoption rate of wireless drug library updates for smart pumps.

    PubMed

    Poppe, Lindsey B; Eckel, Stephen F

    2011-01-15

    An academic medical center's approach to improving the adoption rate of wireless drug library updates for smart pumps was evaluated. A multidisciplinary team composed of pharmacy, nursing, medical engineering, materials management, and patient equipment personnel at an academic medical center collaborated to update the drug libraries of more than 1800 smart pumps via a wireless control system. Two pilot tests were completed to identify and resolve issues before the live wireless update was attempted. The second pilot test, a passive approach, produced an adoption rate of 42% of 1804 pumps at the end of one week and a rate of 56% on day 10. The goal of 80% was not achieved until day 22. The change to an active multidisciplinary process three months later produced an adoption rate of 80% for 1869 pumps on day 10, resulting in a 45.4% increase in the adoption rate between the two trials on day 10 (p < 0.001). Communication regarding the updates was disseminated via e-mail to the entire organization, with fliers posted on all patient care units, and verbally during staff meetings. Patient equipment personnel manually tagged each pump with a blue zip tie after verifying the update to easily identify which pumps had been updated. Areas for improvement include increasing communication to the staff detailing when the update will occur and changing the day of the week the update is performed. A multidisciplinary team actively engaged in the updating of wireless i.v. smart pump drug libraries reduced the amount of time required to reach a goal adoption rate of 80%.

  11. Nuclear Science References Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pritychenko, B., E-mail: pritychenko@bnl.gov; Běták, E.; Singh, B.

    2014-06-15

    The Nuclear Science References (NSR) database together with its associated Web interface, is the world's only comprehensive source of easily accessible low- and intermediate-energy nuclear physics bibliographic information for more than 210,000 articles since the beginning of nuclear science. The weekly-updated NSR database provides essential support for nuclear data evaluation, compilation and research activities. The principles of the database and Web application development and maintenance are described. Examples of nuclear structure, reaction and decay applications are specifically included. The complete NSR database is freely available at the websites of the National Nuclear Data Center (http://www.nndc.bnl.gov/nsr) and the International Atomic Energy Agency (http://www-nds.iaea.org/nsr).

  12. [Plug-in Based Centralized Control System in Operating Rooms].

    PubMed

    Wang, Yunlong

    2017-05-30

    Centralized equipment control in an operating room (OR) is crucial to an efficient workflow in the OR. To achieve centralized control, an integrated OR needs a control panel designed to appropriately incorporate equipment from different manufacturers with various connecting ports and controls. Here we propose to achieve equipment integration using plug-in modules. Each OR will be equipped with a dynamic plug-in control panel containing physically removable connecting ports. Matching outlets will be installed onto the control panels of each piece of equipment used at any given time. This dynamic control panel will be backed by a database of plug-in modules that can connect any two types of connecting ports common among medical equipment manufacturers. The correct connecting module will be resolved dynamically using reflection. This database will be updated regularly to include new connecting ports on the market, making the system easy to maintain, update, and expand, and keeping it relevant as new equipment is developed. Together, the physical panel and the database will achieve centralized equipment control in the OR that can be easily adapted to any equipment in the OR.
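
The reflection-based lookup can be sketched as a registry keyed by port pair whose entries name driver classes resolved at run time. All class and port names below are invented for illustration.

```python
# Sketch of a plug-in registry for heterogeneous equipment connectors,
# resolved by reflection as described above. Port names and driver
# classes are invented; this is not the authors' implementation.

class HDMIToSDIModule:
    def connect(self):
        return "HDMI->SDI bridge active"

class SerialToUSBModule:
    def connect(self):
        return "RS232->USB bridge active"

# Database of plug-in modules keyed by (source port, target port);
# updating this table admits new connector types without changing
# the control-panel code.
PLUGIN_DB = {
    ("HDMI", "SDI"): "HDMIToSDIModule",
    ("RS232", "USB"): "SerialToUSBModule",
}

def load_plugin(src, dst):
    """Look up the module name, then resolve the class by reflection."""
    class_name = PLUGIN_DB[(src, dst)]
    cls = globals()[class_name]       # reflection: name -> class object
    return cls()

bridge = load_plugin("HDMI", "SDI")
```

Because the registry holds names rather than hard-wired references, adding a connector is a database update plus a new driver class, which is the maintainability property the proposal emphasizes.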

  13. A New Labwork Course for Physics Students: Devices, Methods and Research Projects

    ERIC Educational Resources Information Center

    Neumann, Knut; Welzel, Manuela

    2007-01-01

    Physics labwork has for a long time now been an important part of academic physics education. But demands on physics education have changed. However, while seminars and lectures have easily been updated with the latest content, it is much more difficult to modernize labwork courses: mere changes of content require expensive new equipment, tight…

  14. Lazy Updating of hubs can enable more realistic models by speeding up stochastic simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ehlert, Kurt; Loewe, Laurence, E-mail: loewe@wisc.edu; Wisconsin Institute for Discovery, University of Wisconsin-Madison, Madison, Wisconsin 53715

    2014-11-28

    To respect the nature of discrete parts in a system, stochastic simulation algorithms (SSAs) must update for each action (i) all part counts and (ii) each action's probability of occurring next and its timing. This makes it expensive to simulate biological networks with well-connected “hubs” such as ATP that affect many actions. Temperature and volume also affect many actions and may be changed significantly in small steps by the network itself during fever and cell growth, respectively. Such trends matter for evolutionary questions, as cell volume determines doubling times and fever may affect survival, both key traits for biological evolution. Yet simulations often ignore such trends and assume constant environments to avoid many costly probability updates. Such computational convenience precludes analyses of important aspects of evolution. Here we present “Lazy Updating,” an add-on for SSAs designed to reduce the cost of simulating hubs. When a hub changes, Lazy Updating postpones all probability updates for reactions depending on this hub until a threshold is crossed. Speedup is substantial if most computing time is spent on such updates. We implemented Lazy Updating for the Sorting Direct Method, and it is easily integrated into other SSAs such as Gillespie's Direct Method or the Next Reaction Method. Testing on several toy models and a cellular metabolism model showed >10× faster simulations for its use cases, with a small loss of accuracy. Thus we see Lazy Updating as a valuable tool for some special but important simulation problems that are difficult to address efficiently otherwise.
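
A stripped-down illustration of the idea on Gillespie's direct method: propensities of reactions that depend on a hub species are recomputed only after the hub count drifts past a threshold. Species, rates, and the threshold below are invented; this is a sketch of the technique, not the authors' implementation.

```python
import random

# Toy Lazy Updating on the direct method: both reactions consume the
# hub species "atp", so their propensities depend on it, but we refresh
# them only after the hub has drifted past THRESHOLD since the last
# snapshot. All species, rates, and the threshold are invented.

random.seed(1)
counts = {"atp": 10000, "a": 100, "b": 0}
rates = {"use_atp": 1e-4, "make_b": 1e-4}
THRESHOLD = 50                        # allowed hub drift before refresh

atp_snapshot = counts["atp"]          # hub value the propensities assume

def propensities():
    return {
        "use_atp": rates["use_atp"] * atp_snapshot * counts["a"],
        "make_b": rates["make_b"] * atp_snapshot * counts["a"],
    }

props = propensities()
t, lazy_refreshes = 0.0, 0
for _ in range(2000):
    total = sum(props.values())
    if total <= 0:
        break
    t += random.expovariate(total)    # direct method: exponential waiting time
    # Pick the next reaction proportionally to its (possibly stale) propensity.
    r = random.uniform(0, total)
    reaction = "use_atp" if r < props["use_atp"] else "make_b"
    counts["atp"] -= 1                # both reactions consume one ATP
    if reaction == "make_b":
        counts["b"] += 1
    # Lazy step: refresh hub-dependent propensities only on large drift.
    if abs(counts["atp"] - atp_snapshot) >= THRESHOLD:
        atp_snapshot = counts["atp"]
        props = propensities()
        lazy_refreshes += 1
```

With 2000 reaction events and a drift threshold of 50, the hub-dependent propensities are recomputed only 40 times instead of 2000, which is the source of the speedup, at the cost of selecting reactions from slightly stale probabilities between refreshes.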

  15. Issues concerning the updating of finite-element models from experimental data

    NASA Technical Reports Server (NTRS)

    Dunn, Shane A.

    1994-01-01

    Some issues concerning the updating of dynamic finite-element models by incorporation of experimental data are examined here. It is demonstrated how the number of unknowns can be greatly reduced if the physical nature of the model is maintained. The issue of uniqueness is also examined and it is shown that a number of previous workers have been mistaken in their attempts to define both sufficient and necessary measurement requirements for the updating problem to be solved uniquely. The relative merits of modal and frequency response function (frf) data are discussed and it is shown that for measurements at fewer degrees of freedom than are present in the model, frf data will be unlikely to converge easily to a solution. It is then examined how such problems may become more tractable by using new experimental techniques which would allow measurements at all degrees of freedom present in the mathematical model.

  16. Fast fingerprint database maintenance for indoor positioning based on UGV SLAM.

    PubMed

    Tang, Jian; Chen, Yuwei; Chen, Liang; Liu, Jingbin; Hyyppä, Juha; Kukko, Antero; Kaartinen, Harri; Hyyppä, Hannu; Chen, Ruizhi

    2015-03-04

    Indoor positioning technology has become more and more important in the last two decades. Utilizing Received Signal Strength Indicator (RSSI) fingerprints of Signals of OPportunity (SOP) is a promising alternative navigation solution. However, as the RSSIs vary during operation due to their physical nature and are easily affected by environmental change, one challenge of the indoor fingerprinting method is maintaining the RSSI fingerprint database in a timely and effective manner. In this paper, a solution for rapidly updating the fingerprint database is presented, based on a self-developed Unmanned Ground Vehicle (UGV) platform, NAVIS. Several SOP sensors were installed on NAVIS for collecting indoor fingerprint information: a digital compass collecting magnetic field intensity, a light sensor collecting light intensity, and a smartphone collecting the access point number and RSSIs of the pre-installed WiFi network. The NAVIS platform generates a map of the indoor environment and collects the SOPs during the mapping process, and the SOP fingerprint database is then interpolated and updated in real time. Field tests were carried out to evaluate the effectiveness and efficiency of the proposed method. The results showed that the fingerprint databases can be quickly created and updated with a higher sampling frequency (5 Hz) and denser reference points compared with traditional methods, and that the indoor map can be generated without prior information. Moreover, environmental changes could also be detected quickly for fingerprint indoor positioning.
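    Once such a database exists, positioning against it is typically a weighted nearest-neighbour match in RSSI space. The following sketch is the generic textbook approach, not the NAVIS pipeline; the data layout is an assumption made for illustration:

```python
import math

def wknn_position(db, observed, k=3):
    """Weighted k-nearest-neighbour fingerprint matching.

    db       -- dict: (x, y) reference point -> {ap_id: mean RSSI in dBm}
    observed -- live scan: {ap_id: RSSI in dBm}
    Returns the weight-averaged (x, y) of the k closest fingerprints.
    """
    def distance(fingerprint):
        shared = set(fingerprint) & set(observed)
        if not shared:
            return float('inf')
        return math.sqrt(sum((fingerprint[ap] - observed[ap]) ** 2
                             for ap in shared))

    ranked = sorted(db, key=lambda p: distance(db[p]))[:k]
    weights = [1.0 / (distance(db[p]) + 1e-9) for p in ranked]
    total = sum(weights)
    return (sum(w * p[0] for w, p in zip(weights, ranked)) / total,
            sum(w * p[1] for w, p in zip(weights, ranked)) / total)
```

    The denser the reference points in the maintained database, the finer the interpolation this matching step can achieve, which is why the 5 Hz mobile survey matters.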

  17. Forecasting daily streamflow using online sequential extreme learning machines

    NASA Astrophysics Data System (ADS)

    Lima, Aranildo R.; Cannon, Alex J.; Hsieh, William W.

    2016-06-01

    While nonlinear machine learning methods have been widely used in environmental forecasting, in situations where new data arrive continually, the need to make frequent model updates can become cumbersome and computationally costly. To alleviate this problem, an online sequential learning algorithm for single hidden layer feedforward neural networks - the online sequential extreme learning machine (OSELM) - can be updated inexpensively as new data arrive (and the new data can then be discarded). OSELM was applied to forecast daily streamflow at two small watersheds in British Columbia, Canada, at lead times of 1-3 days. Predictors used were weather forecast data generated by the NOAA Global Ensemble Forecasting System (GEFS) and local hydro-meteorological observations. OSELM forecasts were tested with daily, monthly or yearly model updates. More frequent updating gave smaller forecast errors, including errors for data above the 90th percentile. Larger datasets used in the initial training of OSELM helped to find better parameters (number of hidden nodes) for the model, yielding better predictions. With online sequential multiple linear regression (OSMLR) as a benchmark, we concluded that OSELM is an attractive approach, as it easily outperformed OSMLR in forecast accuracy.
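    The appeal of OSELM is that each new observation updates the output weights in closed form and can then be discarded. Below is a minimal single-output version with chunk size 1, using the Sherman-Morrison form of the recursion. It is a sketch of the standard OSELM idea, not the study's code, and it skips the usual batch initialization phase in favour of a weak ridge prior:

```python
import math
import random

class OSELM:
    """Online sequential extreme learning machine, one sample at a time."""

    def __init__(self, n_in, n_hidden, seed=0):
        rng = random.Random(seed)
        # random, fixed hidden layer: the defining trait of an ELM
        self.W = [[rng.uniform(-1, 1) for _ in range(n_in)]
                  for _ in range(n_hidden)]
        self.b = [rng.uniform(-1, 1) for _ in range(n_hidden)]
        self.beta = [0.0] * n_hidden            # trainable output weights
        # P approximates (H^T H)^-1; large diagonal acts as a weak prior
        self.P = [[1e4 if i == j else 0.0 for j in range(n_hidden)]
                  for i in range(n_hidden)]

    def _hidden(self, x):
        return [math.tanh(sum(w * xi for w, xi in zip(row, x)) + bi)
                for row, bi in zip(self.W, self.b)]

    def predict(self, x):
        h = self._hidden(x)
        return sum(hi * bi for hi, bi in zip(h, self.beta))

    def partial_fit(self, x, target):
        """Recursive least-squares update; the sample may be discarded
        afterwards, which is the appeal noted in the abstract."""
        h = self._hidden(x)
        Ph = [sum(pij * hj for pij, hj in zip(row, h)) for row in self.P]
        denom = 1.0 + sum(hi * phi for hi, phi in zip(h, Ph))
        for i in range(len(h)):                 # P <- P - P h h^T P / denom
            for j in range(len(h)):
                self.P[i][j] -= Ph[i] * Ph[j] / denom
        err = target - self.predict(x)
        Ph_new = [sum(pij * hj for pij, hj in zip(row, h)) for row in self.P]
        for i in range(len(h)):                 # beta <- beta + P_new h err
            self.beta[i] += Ph_new[i] * err
```

    In a forecasting loop, `partial_fit` would be called once per day as verifying observations arrive, mirroring the daily-update experiments above.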

  18. Description of third instars of Cochliomyia minima (Diptera: Calliphoridae) from West Indies, and updated identification key.

    PubMed

    Yusseff-Vanegas, S

    2014-09-01

    The blow fly Cochliomyia minima Shannon is endemic to the Caribbean, and it has great potential for forensic applications because of its abundance and broad distribution in the region. However, its larval stages are unknown. Here, I update previously published identification keys by describing for the first time the morphology of C. minima larvae. The larvae of C. minima are found to be very similar to those of Cochliomyia macellaria F., but the former can be easily identified by their completely pigmented oral sclerite, visible as a spike between the mouth hooks. The description of C. minima larvae in this study will be useful to forensic scientists in the Caribbean region.

  19. Ethics in Public Health Research

    PubMed Central

    Myers, Julie; Frieden, Thomas R.; Bherwani, Kamal M.; Henning, Kelly J.

    2008-01-01

    Public health agencies increasingly use electronic means to acquire, use, maintain, and store personal health information. Electronic data formats can improve performance of core public health functions, but potentially threaten privacy because they can be easily duplicated and transmitted to unauthorized people. Although such security breaches do occur, electronic data can be better secured than paper records, because authentication, authorization, auditing, and accountability can be facilitated. Public health professionals should collaborate with law and information technology colleagues to assess possible threats, implement updated policies, train staff, and develop preventive engineering measures to protect information. Tightened physical and electronic controls can prevent misuse of data, minimize the risk of security breaches, and help maintain the reputation and integrity of public health agencies. PMID:18382010

  20. Recent Advances in Preparation, Structure, Properties and Applications of Graphite Oxide.

    PubMed

    Srivastava, Suneel Kumar; Pionteck, Jürgen

    2015-03-01

    Graphite oxide, also referred to as graphitic oxide or graphitic acid, is an oxidized bulk product of graphite with a variable composition. However, it did not receive immense attention until it was identified as an important and easily obtainable precursor for the preparation of graphene. This inspired many researchers to explore facts related to graphite oxide in exploiting its fascinating features. The present article provides an updated review of the different preparative methods, morphology, and characterization of the physical/chemical properties of graphite oxide by XRD, XPS, FTIR, Raman, NMR, UV-visible, and DRIFT analyses. Finally, recent developments on intercalation and applications of GO in the multifaceted areas of catalysis, sensors, supercapacitors, water purification, hydrogen storage, and magnetic shielding have also been reviewed.

  1. Automatic domain updating technique for improving computational efficiency of 2-D flood-inundation simulation

    NASA Astrophysics Data System (ADS)

    Tanaka, T.; Tachikawa, Y.; Ichikawa, Y.; Yorozu, K.

    2017-12-01

    Flood is one of the most hazardous disasters and causes serious damage to people and property around the world. To prevent/mitigate flood damage through early warning systems and/or river management planning, numerical modelling of flood-inundation processes is essential. In the literature, flood-inundation models have been extensively developed and improved to achieve flood flow simulation with complex topography at high resolution. With increasing demands on flood-inundation modelling, its computational burden is now one of the key issues. Improvements to the computational efficiency of full shallow water equations have been made from various perspectives, such as approximations of the momentum equations, parallelization techniques, and coarsening approaches. To complement these techniques and further improve the computational efficiency of flood-inundation simulations, this study proposes an Automatic Domain Updating (ADU) method for 2-D flood-inundation simulation. The ADU method traces the wet/dry interface and automatically updates the simulation domain in response to the progress and recession of flood propagation. The updating algorithm is as follows: first, register the simulation cells potentially flooded at the initial stage (such as floodplains near river channels); then, whenever a registered cell floods, register its surrounding cells. The cost of this additional process is kept small by checking only cells at the wet/dry interface, and the overall computation time is reduced by skipping the processing of non-flooded areas. This algorithm is easily applied to any type of 2-D flood-inundation model. The proposed ADU method is implemented with 2-D local inertial equations for the Yodo River basin, Japan. Case studies for two flood events show that the simulation finishes two to ten times faster while producing the same results as the simulation without the ADU method.
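    The registration rule can be illustrated with a toy diffusion-style spreading model. This is a schematic sketch of the domain-updating bookkeeping only; the study itself solves 2-D local inertial equations, and the spreading rule below is invented for the example:

```python
def simulate_with_adu(depth, sources, n_steps, spread=0.25, eps=1e-6):
    """Toy 2-D flood spread with Automatic Domain Updating: only cells in
    the active set are processed each step, and the set grows at the
    wet/dry interface as cells become wet."""
    rows, cols = len(depth), len(depth[0])

    def neighbours(r, c):
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < rows and 0 <= cc < cols:
                yield rr, cc

    # initial registration: source cells plus their surroundings
    active = set(sources)
    for r, c in list(active):
        active.update(neighbours(r, c))

    for _ in range(n_steps):
        new = [row[:] for row in depth]
        for r, c in list(active):
            if depth[r][c] <= eps:
                continue                      # dry cell: nothing to route
            share = spread * depth[r][c] / 4.0
            for rr, cc in neighbours(r, c):
                new[rr][cc] += share
                new[r][c] -= share
                if depth[rr][cc] <= eps:      # newly wetted: register it
                    active.add((rr, cc))      # ...and its surroundings
                    active.update(neighbours(rr, cc))
        depth = new
    return depth, active
```

    Dry cells far from the flood never enter `active`, so the per-step cost scales with the flooded area rather than the whole grid, which is the source of the reported speedup.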

  2. Programmer's reference manual for the VAX-Gerber link software package. Revision 1. 0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Isobe, G.W.

    1985-10-01

    This guide provides the information necessary to edit, modify, and run the VAX-Gerber software link. Since the project is in the testing stage and still being modified, this guide discusses the final desired stage along with the current stage. The current stage is set up to allow the programmer to easily modify and update codes as necessary.

  3. Next Generation Monitoring: Tier 2 Experience

    NASA Astrophysics Data System (ADS)

    Fay, R.; Bland, J.; Jones, S.

    2017-10-01

    Monitoring IT infrastructure is essential for maximizing availability and minimizing disruption by detecting failures and developing issues. The HEP group at Liverpool have recently updated our monitoring infrastructure with the goal of increasing coverage, improving visualization capabilities, and streamlining configuration and maintenance. Here we present a summary of Liverpool’s experience, the monitoring infrastructure, and the tools used to build it. In brief, system checks are configured in Puppet using Hiera, and managed by Sensu, replacing Nagios. Centralised logging is managed with Elasticsearch, together with Logstash and Filebeat. Kibana provides an interface for interactive analysis, including visualization and dashboards. Metric collection is also configured in Puppet, managed by collectd and stored in Graphite, with Grafana providing a visualization and dashboard tool. The Uchiwa dashboard for Sensu provides a web interface for viewing infrastructure status. Alert capabilities are provided via external handlers. A custom alert handler is in development to provide an easily configurable, extensible and maintainable alert facility.

  4. Modeling fuel succession

    USGS Publications Warehouse

    Davis, Brett; Van Wagtendonk, Jan W.; Beck, Jen; van Wagtendonk, Kent A.

    2009-01-01

    Surface fuels data are of critical importance for supporting fire incident management, risk assessment, and fuel management planning, but the development of surface fuels data can be expensive and time consuming. The data development process is extensive, generally beginning with acquisition of remotely sensed spatial data such as aerial photography or satellite imagery (Keane and others 2001). The spatial vegetation data are then crosswalked to a set of fire behavior fuel models that describe the available fuels (the burnable portions of the vegetation) (Anderson 1982, Scott and Burgan 2005). Finally, spatial fuels data are used as input to tools such as FARSITE and FlamMap to model current and potential fire spread and behavior (Finney 1998, Finney 2006). The capture date of the remotely sensed data defines the period for which the vegetation, and, therefore, fuels, data are most accurate. The more time that passes after the capture date, the less accurate the data become due to vegetation growth and processes such as fire. Subsequently, the results of any fire simulation based on these data become less accurate as the data age. Because of the amount of labor and expense required to develop these data, keeping them updated may prove to be a challenge. In this article, we describe the Sierra Nevada Fuel Succession Model, a modeling tool that can quickly and easily update surface fuel models with a minimum of additional input data. Although it was developed for use by Yosemite, Sequoia, and Kings Canyon National Parks, it is applicable to much of the central and southern Sierra Nevada. Furthermore, the methods used to develop the model have national applicability.

  5. The Voronoi spatio-temporal data structure

    NASA Astrophysics Data System (ADS)

    Mioc, Darka

    2002-04-01

    Current GIS models cannot easily integrate the temporal dimension of spatial data. Indeed, current GISs do not support incremental (local) addition and deletion of spatial objects, and they cannot support the temporal evolution of spatial data. Spatio-temporal facilities would be very useful in many GIS applications: harvesting and forest planning, cadastre, urban and regional planning, and emergency planning. The spatio-temporal model that can overcome these problems is based on a topological model---the Voronoi data structure. Voronoi diagrams are irregular tessellations of space that adapt to spatial objects, and they are therefore a synthesis of raster and vector spatial data models. The main advantage of the Voronoi data structure is its local and sequential map updates, which allow us to automatically record each event and the performed map updates within the system. These map updates are executed through map construction commands that are composed of atomic actions (geometric algorithms for addition, deletion, and motion of spatial objects) on the dynamic Voronoi data structure. The formalization of map commands led to the development of a spatial language comprising a set of atomic operations or constructs on spatial primitives (points and lines), powerful enough to define complex operations. This resulted in a new formal model for spatio-temporal change representation, where each update is uniquely characterized by the numbers of newly created and inactivated Voronoi regions. This is used for the extension of the model towards the hierarchical Voronoi data structure. In this model, spatio-temporal changes induced by map updates are preserved in a hierarchical data structure that combines events and corresponding changes in topology. This hierarchical Voronoi data structure has an implicit time ordering of events visible through changes in topology, and it is equivalent to an event structure that can support temporal data without precise temporal information. This formal model of spatio-temporal change representation is currently applied to retroactive map updates and visualization of map evolution. It offers new possibilities in the domains of temporal GIS, transaction processing, spatio-temporal queries, spatio-temporal analysis, map animation and map visualization.
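    Stripped of the Voronoi geometry, the event-ordering idea amounts to a log of updates in which each entry records the regions it creates and inactivates, so that any past state can be replayed. A minimal sketch, illustrative only and far simpler than the hierarchical structure described above:

```python
class SpatioTemporalLog:
    """Event-ordered log of map updates: each update records the regions
    it creates and inactivates, giving an implicit time ordering without
    requiring precise timestamps."""

    def __init__(self):
        self.events = []            # (event_id, created, inactivated)
        self.next_id = 0

    def update(self, created, inactivated):
        """Record one map update; returns its event id."""
        self.events.append((self.next_id, frozenset(created),
                            frozenset(inactivated)))
        self.next_id += 1
        return self.next_id - 1

    def state_at(self, event_id):
        """Replay the log to reconstruct the set of active regions just
        after the given event (supports retroactive queries)."""
        active = set()
        for eid, created, inactivated in self.events:
            if eid > event_id:
                break
            active -= inactivated
            active |= created
        return active
```

    Replaying to an arbitrary event id is what enables retroactive map updates and visualization of map evolution.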

  6. Animated Cell Biology: A Quick and Easy Method for Making Effective, High-Quality Teaching Animations

    PubMed Central

    2006-01-01

    There is accumulating evidence that animations aid learning of dynamic concepts in cell biology. However, existing animation packages are expensive and difficult to learn, and the subsequent production of even short animations can take weeks to months. Here I outline the principles and sequence of steps for producing high-quality PowerPoint animations in less than a day that are suitable for teaching in high school through college/university. After developing the animation it can be easily converted to any appropriate movie file format using Camtasia Studio for Internet or classroom presentations. Thus anyone who can use PowerPoint has the potential to make animations. Students who viewed the approximately 3-min PowerPoint/Camtasia Studio animation “Calcium and the Dual Signalling Pathway” over 15 min scored significantly higher marks on a subsequent quiz than those who had viewed still graphics with text for an equivalent time. In addition, results from student evaluations provided some data validating the use of such animations in cell biology teaching with some interesting caveats. Information is also provided on how such animations can be modified or updated easily or shared with others who can modify them to fit their own needs. PMID:17012217

  7. Traditional Medicines in Africa: An Appraisal of Ten Potent African Medicinal Plants

    PubMed Central

    Mahomoodally, M. Fawzi

    2013-01-01

    The use of medicinal plants as a fundamental component of the African traditional healthcare system is perhaps the oldest and the most assorted of all therapeutic systems. In many parts of rural Africa, traditional healers prescribing medicinal plants are the most easily accessible and affordable health resource available to the local community and at times the only therapy that subsists. Nonetheless, there is still a paucity of updated comprehensive compilation of promising medicinal plants from the African continent. The major focus of the present review is to provide an updated overview of 10 promising medicinal plants from the African biodiversity which have short- as well as long-term potential to be developed as future phytopharmaceuticals to treat and/or manage panoply of infectious and chronic conditions. In this endeavour, key scientific databases have been probed to investigate trends in the rapidly increasing number of scientific publications on African traditional medicinal plants. Within the framework of enhancing the significance of traditional African medicinal plants, aspects such as traditional use, phytochemical profile, in vitro, in vivo, and clinical studies and also future challenges pertaining to the use of these plants have been explored. PMID:24367388

  8. Ray-tracing 3D dust radiative transfer with DART-Ray: code upgrade and public release

    NASA Astrophysics Data System (ADS)

    Natale, Giovanni; Popescu, Cristina C.; Tuffs, Richard J.; Clarke, Adam J.; Debattista, Victor P.; Fischera, Jörg; Pasetto, Stefano; Rushton, Mark; Thirlwall, Jordan J.

    2017-11-01

    We present an extensively updated version of the purely ray-tracing 3D dust radiation transfer code DART-Ray. The new version includes five major upgrades: 1) a series of optimizations for the ray-angular density and the scattered radiation source function; 2) the implementation of several data and task parallelizations using hybrid MPI+OpenMP schemes; 3) the inclusion of dust self-heating; 4) the ability to produce surface brightness maps for observers within the models in HEALPix format; 5) the possibility to set the expected numerical accuracy already at the start of the calculation. We tested the updated code with benchmark models where the dust self-heating is not negligible. Furthermore, we performed a study of the extent of the source influence volumes, using galaxy models, which are critical in determining the efficiency of the DART-Ray algorithm. The new code is publicly available, documented for both users and developers, and accompanied by several programmes to create input grids for different model geometries and to import the results of N-body and SPH simulations. These programmes can be easily adapted to different input geometries, and for different dust models or stellar emission libraries.

  9. Fast Fingerprint Database Maintenance for Indoor Positioning Based on UGV SLAM

    PubMed Central

    Tang, Jian; Chen, Yuwei; Chen, Liang; Liu, Jingbin; Hyyppä, Juha; Kukko, Antero; Kaartinen, Harri; Hyyppä, Hannu; Chen, Ruizhi

    2015-01-01

    Indoor positioning technology has become more and more important in the last two decades. Utilizing Received Signal Strength Indicator (RSSI) fingerprints of Signals of OPportunity (SOP) is a promising alternative navigation solution. However, as the RSSIs vary during operation due to their physical nature and are easily affected by environmental change, one challenge of the indoor fingerprinting method is maintaining the RSSI fingerprint database in a timely and effective manner. In this paper, a solution for rapidly updating the fingerprint database is presented, based on a self-developed Unmanned Ground Vehicle (UGV) platform, NAVIS. Several SOP sensors were installed on NAVIS for collecting indoor fingerprint information: a digital compass collecting magnetic field intensity, a light sensor collecting light intensity, and a smartphone collecting the access point number and RSSIs of the pre-installed WiFi network. The NAVIS platform generates a map of the indoor environment and collects the SOPs during the mapping process, and the SOP fingerprint database is then interpolated and updated in real time. Field tests were carried out to evaluate the effectiveness and efficiency of the proposed method. The results showed that the fingerprint databases can be quickly created and updated with a higher sampling frequency (5 Hz) and denser reference points compared with traditional methods, and that the indoor map can be generated without prior information. Moreover, environmental changes could also be detected quickly for fingerprint indoor positioning. PMID:25746096

  10. Upgrades to the TPSX Material Properties Database

    NASA Technical Reports Server (NTRS)

    Squire, T. H.; Milos, F. S.; Partridge, Harry (Technical Monitor)

    2001-01-01

    The TPSX Material Properties Database is a web-based tool that serves as a database for properties of advanced thermal protection materials. TPSX provides an easy user interface for retrieving material property information in a variety of forms, both graphical and text. The primary purpose and advantage of TPSX is to maintain a high quality source of often-used thermal protection material properties in a convenient, easily accessible form, for distribution to government and aerospace industry communities. Last year a major upgrade to the TPSX web site was completed. This year, through the efforts of researchers at several NASA centers, the Office of the Chief Engineer awarded funds to update and expand the databases in TPSX. The FY01 effort focuses on updating and correcting the Ames and Johnson thermal protection materials databases. In this session we will summarize the improvements made to the web site last year, report on the status of the on-going database updates, describe the planned upgrades for FY02 and FY03, and provide a demonstration of TPSX.

  11. Reversing Implicit First Impressions through Reinterpretation after a Two-Day Delay

    PubMed Central

    Mann, Thomas C.; Ferguson, Melissa J.

    2016-01-01

    People are adept at forming impressions of others, but how easily can impressions be updated? Although implicit first impressions have been characterized as difficult to overturn, recent work shows that they can be reversed through reinterpretation of earlier learning. However, such reversal has been demonstrated only in the same experimental session in which the impression formed, suggesting that implicit updating might be possible only within a brief temporal window, before impressions are consolidated and when memory about the initial information is strongest. Implicit impressions may be unable to be revised when reinterpreting details are learned later, due to memory consolidation or forgetting of the details to be reinterpreted. This study tested whether implicit first impressions can be reversed through reinterpretation after a two-day delay following the initial formation. Results showed that implicit revision emerged after the delay, even among those with poor explicit recall or who were not cued to recall. We discuss implications for theory on impression formation and updating. PMID:28017977

  12. Reversing Implicit First Impressions through Reinterpretation after a Two-Day Delay.

    PubMed

    Mann, Thomas C; Ferguson, Melissa J

    2017-01-01

    People are adept at forming impressions of others, but how easily can impressions be updated? Although implicit first impressions have been characterized as difficult to overturn, recent work shows that they can be reversed through reinterpretation of earlier learning. However, such reversal has been demonstrated only in the same experimental session in which the impression formed, suggesting that implicit updating might be possible only within a brief temporal window, before impressions are consolidated and when memory about the initial information is strongest. Implicit impressions may be unable to be revised when reinterpreting details are learned later, due to memory consolidation or forgetting of the details to be reinterpreted. This study tested whether implicit first impressions can be reversed through reinterpretation after a two-day delay following the initial formation. Results showed that implicit revision emerged after the delay, even among those with poor explicit recall or who were not cued to recall. We discuss implications for theory on impression formation and updating.

  13. High resolution regional soil carbon mapping in Madagascar : towards easy to update maps

    NASA Astrophysics Data System (ADS)

    Grinand, Clovis; Dessay, Nadine; Razafimbelo, Tantely; Razakamanarivo, Herintsitoaina; Albrecht, Alain; Vaudry, Romuald; Tiberghien, Matthieu; Rasamoelina, Maminiaina; Bernoux, Martial

    2013-04-01

    The soil organic carbon plays an important role in climate change regulation through carbon emissions and sequestration due to land use changes, notably tropical deforestation. Monitoring soil carbon emissions from shifting cultivation requires evaluating the amount of carbon stored at plot scale with a sufficient level of accuracy to be able to detect changes. The objective of this work was to map soil carbon stocks (30 cm and 100 cm depths) for different land uses at regional scale using a high resolution satellite dataset. The Andohahela National Parc and its surroundings (south-east Madagascar) - a region with the largest deforestation rate in the country - was selected as a pilot area for the development of the methodology. A three-step approach was set up: (i) carbon inventory using mid-infrared spectroscopy and stock calculation, (ii) spatial data processing, and (iii) modeling and mapping. Soil spectroscopy was successfully used for measuring organic carbon in this region. The results show that Random Forest was the inference model that produced the best estimates on the calibration and validation datasets. By using a simple and robust method, we estimated uncertainty levels of 35% and 43% for the 30-cm and 100-cm carbon maps, respectively. The approach developed in this study was based on open data and open source software and can be easily replicated to other regions and for other time periods using updated satellite images.
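    The "simple and robust" uncertainty estimate is not spelled out in the abstract; one common approach of that kind is to bootstrap an error statistic over the validation residuals. The following is a hypothetical sketch of such an estimator, not the authors' exact method:

```python
import random
import statistics

def bootstrap_uncertainty(residuals, n_boot=1000, seed=0):
    """Bootstrap the RMSE of validation residuals to get a map-level
    uncertainty estimate and its sampling spread.

    residuals -- list of (predicted - observed) carbon stocks
    Returns (mean RMSE over bootstrap resamples, stdev of those RMSEs).
    """
    rng = random.Random(seed)
    rmses = []
    for _ in range(n_boot):
        sample = [rng.choice(residuals) for _ in residuals]
        rmses.append((sum(r * r for r in sample) / len(sample)) ** 0.5)
    return statistics.mean(rmses), statistics.stdev(rmses)
```

    Dividing the bootstrapped RMSE by the mean observed stock would yield a relative uncertainty comparable to the 35% and 43% figures quoted above.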

  14. The National energy modeling system

    NASA Astrophysics Data System (ADS)

    The DOE uses a variety of energy and economic models to forecast energy supply and demand. It also uses a variety of more narrowly focused analytical tools to examine energy policy options. For the purposes of this work, this set of models and analytical tools is called the National Energy Modeling System (NEMS). The NEMS is the result of many years of development of energy modeling and analysis tools, many of which were developed for different applications and under different assumptions. As such, NEMS is believed to be less than satisfactory in certain areas. For example, NEMS is difficult to keep updated and expensive to use. Various outputs are often difficult to reconcile. Products were not required to interface, but were designed to stand alone. Because different developers were involved, the inner workings of the NEMS are often not easily or fully understood. Even with these difficulties, however, NEMS comprises the best tools currently identified to deal with our global, national and regional energy modeling and energy analysis needs.

  15. Development and Validation of a New Air Carrier Block Time Prediction Model and Methodology

    NASA Astrophysics Data System (ADS)

    Litvay, Robyn Olson

    Commercial airline operations rely on predicted block times as the foundation for critical, successive decisions that include fuel purchasing, crew scheduling, and airport facility usage planning. Small inaccuracies in the predicted block times have the potential to result in huge financial losses and, with profit margins for airline operations currently almost nonexistent, potentially negate any possible profit. Although optimization techniques have resulted in many models targeting airline operations, the challenge of accurately predicting and quantifying variables months in advance remains elusive. The objective of this work is the development of an airline block time prediction model and methodology that is practical, easily implemented, and easily updated. Actual U.S. domestic flight data from a major airline were used to develop a model that predicts airline block times with increased accuracy and smaller variance of the actual times from the predicted times. This reduction in variance represents tens of millions of dollars (U.S.) per year in operational cost savings for an individual airline. A new methodology for block time prediction is constructed using a regression model as the base, as it has both deterministic and probabilistic components, and historic block time distributions. The estimation of block times for commercial domestic airline operations requires a probabilistic, general model that can be easily customized for a specific airline's network. As individual block times vary by season, by day, and by time of day, the challenge is to make general, long-term estimations representing the average actual block times while minimizing the variation. Predictions of block times for the third-quarter months of July and August of 2011 were calculated using this new model. The resulting actual block times were obtained from the Research and Innovative Technology Administration, Bureau of Transportation Statistics (Airline On-time Performance Data, 2008-2011) for comparison and analysis. Future block times are shown to be predicted with greater accuracy, without exception and network-wide, for a major U.S. domestic airline.
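    The combination of a deterministic base with a probabilistic pad drawn from historic distributions can be caricatured as follows. This is a toy illustration of the general idea, not the validated model; the quantile padding rule is invented for the example:

```python
import statistics

def predict_block_time(history, pad_quantile=0.65):
    """Toy block-time estimate for one city pair.

    history      -- past actual block times in minutes for the city pair
    pad_quantile -- empirical quantile used as a probabilistic pad
    The deterministic base (the historical mean) is padded up to the
    chosen quantile of the empirical distribution, trading a little
    scheduled time for fewer late arrivals.
    """
    base = statistics.mean(history)
    ordered = sorted(history)
    idx = min(int(pad_quantile * len(ordered)), len(ordered) - 1)
    return max(base, ordered[idx])
```

    Raising `pad_quantile` shrinks the variance of actual minus scheduled times at the cost of longer published block times, which is exactly the trade-off the abstract quantifies in dollars.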

  16. Using Third Party Data to Update a Reference Dataset in a Quality Evaluation Service

    NASA Astrophysics Data System (ADS)

    Xavier, E. M. A.; Ariza-López, F. J.; Ureña-Cámara, M. A.

    2016-06-01

    Nowadays it is easy to find many data sources for various regions around the globe. In this 'data overload' scenario there is little, if any, information available about the quality of these data sources. In order to easily provide this data quality information, we presented the architecture of a web service for the automation of quality control of spatial datasets running over a Web Processing Service (WPS). For quality procedures that require an external reference dataset, like positional accuracy or completeness, the architecture permits using a reference dataset. However, this reference dataset is not ageless, since it suffers the natural time degradation inherent to geospatial features. In order to mitigate this problem we propose the Time Degradation & Updating Module, which applies assessed data to keep the reference database updated. The main idea is to utilize datasets sent to the quality evaluation service as a source of 'candidate data elements' for the updating of the reference database. After the evaluation, if some elements of a candidate dataset reach a determined quality level, they can be used as input data to improve the current reference database. In this work we present the first design of the Time Degradation & Updating Module. We believe that the outcomes can be applied in the pursuit of a fully automatic on-line quality evaluation platform.
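    The selection rule at the heart of the module - promote candidate elements that pass the quality evaluation into the reference dataset - can be sketched as below. The data layout and the scalar quality score are illustrative assumptions, not the module's actual interface:

```python
def update_reference(reference, candidates, quality, threshold=0.9):
    """Merge evaluated candidate elements into the reference dataset when
    they reach a quality threshold.

    reference  -- dict: element_id -> feature (mutated in place)
    candidates -- dict: element_id -> feature from an assessed dataset
    quality    -- dict: element_id -> score in [0, 1] from the evaluation
    Returns the accepted elements.
    """
    accepted = {}
    for eid, feature in candidates.items():
        if quality.get(eid, 0.0) >= threshold:
            reference[eid] = feature    # newer, high-quality: replace/add
            accepted[eid] = feature
    return accepted
```

    Each evaluation request thus doubles as a potential refresh of the reference database, countering its time degradation without a dedicated survey.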

  17. Design Tools for Assessing Manufacturing Environmental Impact.

    DTIC Science & Technology

    1997-11-26

    the material report alone. In order to more easily design, update and verify the output report, many of the cells which contained the information...needed for the material balance calculations were named. The cell name was then used in the calculations. Where possible the same names that were used in...Material balance information was used extensively to ensure all the equations were correct and were put into the appropriate cells. A summary of the

  18. Enhanced Virtual Presence for Immersive Visualization of Complex Situations for Mission Rehearsal

    DTIC Science & Technology

    1997-06-01

    taken. We propose to join both these technologies together in a registration device. The registration device would be small and portable and easily...registering the panning of the camera (or other sensing device) and also stitch together the shots to automatically generate panoramic files necessary to...database and as the base information changes each of the linked drawings is automatically updated. Filename Format A specific naming convention should be

  19. OLTARIS: On-Line Tool for the Assessment of Radiation in Space

    NASA Technical Reports Server (NTRS)

    Sandridge, Chris A.; Blattnig, Steve R.; Clowdsley, Martha S.; Norbury, John; Qualis, Garry D.; Simonsen, Lisa C.; Singleterry, Robert C.; Slaba, Tony C.; Walker, Steven A.; Badavi, Francis F.

    2009-01-01

    The effects of ionizing radiation on humans in space are a major technical challenge for exploration to the moon and beyond. The radiation shielding team at NASA Langley Research Center has been working for over 30 years to develop techniques that can efficiently assist the engineer throughout the entire design process. OLTARIS: On-Line Tool for the Assessment of Radiation in Space is a new NASA website (http://oltaris.larc.nasa.gov) that allows engineers and physicists to access a variety of tools and models to study the effects of ionizing space radiation on humans and shielding materials. The site is intended to be an analysis and design tool for those working on radiation issues for current and future manned missions, as well as a research tool for developing advanced material and shielding concepts. The site, along with the analysis tools and models within, has been developed using strict software practices to ensure reliable and reproducible results in a production environment. It has also been developed as a modular system so that models and algorithms can be easily added or updated.

  20. A real-time robot arm collision detection system

    NASA Technical Reports Server (NTRS)

    Shaffer, Clifford A.; Herb, Gregory M.

    1990-01-01

    A data structure and update algorithm are presented for a prototype real-time collision detection safety system for a multi-robot environment. The data structure is a variant of the octree, which serves as a spatial index. An octree recursively decomposes 3-D space into eight equal cubic octants until each octant meets some decomposition criteria. The octree stores cylspheres (cylinders with spheres on each end) and rectangular solids as primitives (other primitives can easily be added as required). These primitives make up the two seven-degree-of-freedom robot arms and the environment modeled by the system. Octree nodes containing more than a predetermined number N of primitives are decomposed. This rule keeps the octree small, as the entire environment for the application can be modeled using a few dozen primitives. As robot arms move, the octree is updated to reflect their changed positions. During most update cycles, any given primitive does not change which octree nodes it is in. Thus, modification to the octree is rarely required. Incidents in which one robot arm comes too close to another arm or an object are reported. Cycle time for interpreting current joint angles, updating the octree, and detecting/reporting imminent collisions averages 30 milliseconds on an Intel 80386 processor running at 20 MHz.
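The decomposition rule described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: primitives are reduced to axis-aligned bounding boxes, cylsphere geometry and the per-cycle update are omitted, and all names are assumptions.

```python
# Sketch of the abstract's octree spatial index: an octant is decomposed when
# it holds more than a predetermined number N of primitives. Primitives here
# are axis-aligned bounding boxes given as ((lo), (hi)) corner tuples.

N_MAX = 4  # decomposition threshold (the paper's predetermined N)

class Octree:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi            # corners of this cubic octant
        self.items, self.children = [], None

    def overlaps(self, box):
        blo, bhi = box
        return all(blo[i] <= self.hi[i] and bhi[i] >= self.lo[i]
                   for i in range(3))

    def insert(self, box):
        if self.children is not None:        # interior node: push down
            for child in self.children:
                if child.overlaps(box):      # a box may span several octants
                    child.insert(box)
            return
        self.items.append(box)
        if len(self.items) > N_MAX:
            self._subdivide()

    def _subdivide(self):
        mid = tuple((self.lo[i] + self.hi[i]) / 2.0 for i in range(3))
        self.children = []
        for dx in (0, 1):
            for dy in (0, 1):
                for dz in (0, 1):
                    d = (dx, dy, dz)
                    lo = tuple(self.lo[i] if d[i] == 0 else mid[i] for i in range(3))
                    hi = tuple(mid[i] if d[i] == 0 else self.hi[i] for i in range(3))
                    self.children.append(Octree(lo, hi))
        old, self.items = self.items, []
        for box in old:                      # redistribute into child octants
            self.insert(box)

tree = Octree((0.0, 0.0, 0.0), (8.0, 8.0, 8.0))
for box in [((0, 0, 0), (1, 1, 1)), ((7, 0, 0), (8, 1, 1)),
            ((0, 7, 0), (1, 8, 1)), ((0, 0, 7), (1, 1, 8)),
            ((7, 7, 7), (8, 8, 8))]:
    tree.insert(box)                         # fifth insert triggers subdivision
```

Because primitives rarely change octants between cycles, an update step would mostly leave this structure untouched, which is what keeps the 30 ms cycle time feasible.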

  1. Web Based Tool for Mission Operations Scenarios

    NASA Technical Reports Server (NTRS)

    Boyles, Carole A.; Bindschadler, Duane L.

    2008-01-01

    A conventional practice for spaceflight projects is to document scenarios in a monolithic Operations Concept document. Such documents can be hundreds of pages long and may require laborious updates. Software development practice utilizes scenarios in the form of smaller, individual use cases, which are often structured and managed using UML. We have developed a process and a web-based scenario tool that utilizes a similar philosophy of smaller, more compact scenarios (but avoids the formality of UML). The need for a scenario process and tool became apparent during the authors' work on a large astrophysics mission. It was noted that every phase of the Mission (e.g., formulation, design, verification and validation, and operations) looked back to scenarios to assess completeness of requirements and design. It was also noted that terminology needed to be clarified and structured to assure communication across all levels of the project. Attempts to manage, communicate, and evolve scenarios at all levels of a project using conventional tools (e.g., Excel) and methods (Scenario Working Group meetings) were not effective given limitations on budget and staffing. The objective of this paper is to document the scenario process and tool created to offer projects a low-cost capability to create, communicate, manage, and evolve scenarios throughout project development. The process and tool have the further benefit of allowing the association of requirements with particular scenarios, establishing and viewing relationships between higher- and lower-level scenarios, and the ability to place all scenarios in a shared context. The resulting structured set of scenarios is widely visible (using a web browser), easily updated, and can be searched according to various criteria including the level (e.g., Project, System, and Team) and Mission Phase. 
Scenarios are maintained in a web-accessible environment that provides a structured set of scenario fields and allows for maximum visibility across the project. One key aspect is that the tool was built for a scenario process that accounts for stakeholder input, review, comment, and concurrence. By creating well-designed opportunities for stakeholder input and concurrence and by making the scenario content easily accessible to all project personnel, we maximize the opportunities for stakeholders to both understand and agree on the concepts for how their mission is to be carried out.

  2. Customized laboratory information management system for a clinical and research leukemia cytogenetics laboratory.

    PubMed

    Bakshi, Sonal R; Shukla, Shilin N; Shah, Pankaj M

    2009-01-01

    We developed a Microsoft Access-based laboratory information management system to facilitate database management of leukemia patients referred for cytogenetic tests, namely karyotyping and fluorescence in situ hybridization (FISH). The database is custom-made for entry of patient data, clinical details, sample details, cytogenetics test results, and data mining for various ongoing research areas. A number of clinical research laboratory-related tasks are carried out faster using specific "queries." The tasks include tracking clinical progression of a particular patient for multiple visits, treatment response, morphological and cytogenetics response, survival time, automatic grouping of patients by inclusion criteria in a research project, tracking various processing steps of samples, turn-around time, and revenue generated. Since 2005 we have collected over 5,000 samples. The database is easily updated and is being adapted for various data maintenance and mining needs.

  3. Chelation in Metal Intoxication

    PubMed Central

    Flora, Swaran J.S.; Pachauri, Vidhu

    2010-01-01

    Chelation therapy is the preferred medical treatment for reducing the toxic effects of metals. Chelating agents are capable of binding to toxic metal ions to form complex structures which are easily excreted from the body, removing them from intracellular or extracellular spaces. 2,3-Dimercaprol has long been the mainstay of chelation therapy for lead or arsenic poisoning; however, its serious side effects have led researchers to develop less toxic analogues. Hydrophilic chelators like meso-2,3-dimercaptosuccinic acid effectively promote renal metal excretion, but their ability to access intracellular metals is weak. Newer strategies to address these drawbacks, such as combination therapy (use of structurally different chelating agents) or co-administration of antioxidants, have been reported recently. In this review we provide an update on the existing chelating agents and the various strategies available for the treatment of heavy metal and metalloid intoxications. PMID:20717537

  4. Evaluation of liquefaction potential for building code

    NASA Astrophysics Data System (ADS)

    Nunziata, C.; De Nisco, G.; Panza, G. F.

    2008-07-01

    The standard approach for the evaluation of liquefaction susceptibility is based on the estimation of a safety factor between the cyclic shear resistance to liquefaction and the earthquake-induced shear stress. Recently, an updated procedure based on shear-wave velocities (Vs) has been proposed which could be more easily applied. These methods have been applied at La Plaja beach of Catania, which experienced liquefaction during the 1693 earthquake. The detailed geotechnical and Vs information and the realistic ground motion computed for the 1693 event allowed us to compare the two approaches. The successful application of the Vs procedure, slightly modified to fit historical and safety-factor information, even if additional field performance data are needed, encourages the development of a guide for liquefaction potential analysis, based on well-defined Vs profiles, to be included in the Italian seismic code.
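The safety factor referred to above is conventionally FS = CRR / CSR: the cyclic resistance ratio divided by the earthquake-induced cyclic stress ratio. A minimal sketch using the standard simplified CSR expression of Seed and Idriss; CRR is taken here as a given input rather than derived from Vs (which is what the updated procedure would supply), and all numeric values are illustrative.

```python
# FS = CRR / CSR, with the standard simplified cyclic stress ratio
# CSR = 0.65 * (a_max/g) * (sigma_v / sigma_v') * rd
# sigma_v  -- total vertical stress at the depth of interest
# sigma_v' -- effective vertical stress; rd -- stress reduction factor

def cyclic_stress_ratio(a_max_g, sigma_v, sigma_v_eff, rd):
    return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * rd

def safety_factor(crr, csr):
    return crr / csr   # FS < 1 indicates liquefaction is expected

csr = cyclic_stress_ratio(a_max_g=0.3, sigma_v=100.0, sigma_v_eff=60.0, rd=0.95)
fs = safety_factor(crr=0.25, csr=csr)   # here FS < 1: liquefiable
```

The Vs-based procedure replaces the laboratory- or penetration-based CRR with a correlation to the measured shear-wave velocity profile, which is why a well-defined Vs profile is the key input for a code-oriented guide.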

  5. Graphics enhanced computer emulation for improved timing-race and fault tolerance control system analysis. [of Centaur liquid-fuel booster

    NASA Technical Reports Server (NTRS)

    Szatkowski, G. P.

    1983-01-01

    A computer simulation system has been developed for the Space Shuttle's advanced Centaur liquid fuel booster rocket, in order to conduct systems safety verification and flight operations training. This simulation utility is designed to analyze functional system behavior by integrating control avionics with mechanical and fluid elements, and is able to emulate any system operation, from simple relay logic to complex VLSI components, with wire-by-wire detail. A novel graphics data entry system offers a pseudo-wire wrap data base that can be easily updated. Visual subsystem operations can be selected and displayed in color on a six-monitor graphics processor. System timing and fault verification analyses are conducted by injecting component fault modes and min/max timing delays, and then observing system operation through a red line monitor.

  6. Automated identification of drug and food allergies entered using non-standard terminology.

    PubMed

    Epstein, Richard H; St Jacques, Paul; Stockin, Michael; Rothman, Brian; Ehrenfeld, Jesse M; Denny, Joshua C

    2013-01-01

    An accurate computable representation of food and drug allergy is essential for safe healthcare. Our goal was to develop a high-performance, easily maintained algorithm to identify medication and food allergies and sensitivities from unstructured allergy entries in electronic health record (EHR) systems. An algorithm was developed in Transact-SQL to identify ingredients to which patients had allergies in a perioperative information management system. The algorithm used RxNorm and natural language processing techniques developed on a training set of 24 599 entries from 9445 records. Accuracy, specificity, precision, recall, and F-measure were determined for the training dataset and repeated for the testing dataset (24 857 entries from 9430 records). Accuracy, precision, recall, and F-measure for medication allergy matches were all above 98% in the training dataset and above 97% in the testing dataset for all allergy entries. Corresponding values for food allergy matches were above 97% and above 93%, respectively. Specificities of the algorithm were 90.3% and 85.0% for drug matches and 100% and 88.9% for food matches in the training and testing datasets, respectively. The algorithm had high performance for identification of medication and food allergies. Maintenance is practical, as updates are managed through upload of new RxNorm versions and additions to companion database tables. However, direct entry of codified allergy information by providers (through autocompleters or drop lists) is still preferred to post-hoc encoding of the data. Data tables used in the algorithm are available for download. A high performing, easily maintained algorithm can successfully identify medication and food allergies from free text entries in EHR systems.
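The matching idea can be illustrated with a toy example: normalize a free-text allergy entry, then search it for known ingredient names and their synonyms. In the paper this lookup is driven by RxNorm tables in Transact-SQL; the tiny synonym dictionary and function below are purely illustrative stand-ins.

```python
# Toy sketch: map free-text allergy entries to canonical ingredients via a
# synonym table (an assumed mini-vocabulary, not actual RxNorm content).
import re

SYNONYMS = {
    "pcn": "penicillin", "penicillin": "penicillin",
    "asa": "aspirin", "aspirin": "aspirin",
    "peanut": "peanut", "peanuts": "peanut",
}

def match_allergies(entry):
    """Return the set of canonical ingredients mentioned in a free-text entry."""
    tokens = re.findall(r"[a-z]+", entry.lower())   # crude normalization
    return {SYNONYMS[t] for t in tokens if t in SYNONYMS}

hits = match_allergies("PCN (rash); peanuts -> anaphylaxis")
```

This also shows why maintenance stays practical in the real system: updating the vocabulary (a new RxNorm release) changes the lookup table, not the matching logic.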

  7. Human-System Integration Scorecard Update to VB.Net

    NASA Technical Reports Server (NTRS)

    Sanders, Blaze D.

    2009-01-01

    The purpose of this project was to create Human-System Integration (HSI) scorecard software, which could be utilized to validate that human factors have been considered early in hardware/system specifications and design. The HSI scorecard is partially based upon the revised Human Rating Requirements (HRR) intended for NASA's Constellation program. This software scorecard will allow for quick appraisal of HSI factors, by using visual aids to highlight low and rapidly changing scores. This project consisted of creating a user-friendly Visual Basic program that could be easily distributed to, and updated by, fellow colleagues. Updating the Microsoft Word version of the HSI scorecard to a computer application will allow for the addition of useful features, improved ease of use, and decreased completion time for the user. One significant addition is the ability to create Microsoft Excel graphs automatically from scorecard data, to allow for clear presentation of problematic areas. The purpose of this paper is to describe the rationale and benefits of creating the HSI scorecard software, the problems and goals of the project, and future work that could be done.

  8. Web-based Altimeter Service

    NASA Astrophysics Data System (ADS)

    Callahan, P. S.; Wilson, B. D.; Xing, Z.; Raskin, R. G.

    2010-12-01

    We have developed a web-based system to allow updating and subsetting of TOPEX data. The Altimeter Service will be operated by PODAAC along with its other provision of oceanographic data. The Service could be easily expanded to other mission data. An Altimeter Service is crucial to the improvement and expanded use of altimeter data. A service is necessary for altimetry because the result of most interest - sea surface height anomaly (SSHA) - is composed of several components that are updated individually and irregularly by specialized experts. This makes it difficult for projects to provide the most up-to-date products. Some components are the subject of ongoing research, so the ability for investigators to make products for comparison or sharing is important. The service will allow investigators/producers to get their component models or processing into widespread use much more quickly. For coastal altimetry, the ability to subset the data to the area of interest and insert specialized models (e.g., tides) or data processing results is crucial. A key part of the Altimeter Service is having data producers provide updated or local models and data. In order for this to succeed, producers need to register their products with the Altimeter Service and to provide the product in a form consistent with the service update methods. We will describe the capabilities of the web service and the methods for providing new components. Currently the Service is providing TOPEX GDRs with Retracking (RGDRs) in netCDF format that has been coordinated with Jason data. Users can add new orbits, tide models, gridded geophysical fields such as mean sea surface, and along-track corrections as they become available and are installed by PODAAC. The updated fields are inserted into the netCDF files, while the previous values are retained for comparison. The Service will also generate SSH and SSHA. In addition, the Service showcases a feature that plots any variable from netCDF files.
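The update policy described above (new values installed, previous values retained for comparison) can be sketched with a plain dict standing in for the netCDF file. The function, variable names, and versioning scheme are illustrative assumptions, not the Service's actual file layout.

```python
# Sketch of the Service's update policy: when a component (e.g. a tide model)
# is updated, the new values are written under the variable name and the
# superseded values are kept under a versioned name for comparison.

def update_variable(record, name, new_values):
    """Install new_values as `name`, retaining the old array as `name_prev<k>`."""
    if name in record:
        k = 1
        while f"{name}_prev{k}" in record:   # keep every superseded version
            k += 1
        record[f"{name}_prev{k}"] = record[name]
    record[name] = new_values
    return record

gdr = {"ssh": [2.10, 2.12], "tide": [0.30, 0.31]}
update_variable(gdr, "tide", [0.28, 0.29])   # an updated tide model arrives
```

Keeping the previous field alongside the new one is what lets investigators compare their component models against the state of the art without reprocessing the whole record.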
The research described here was carried out at the Jet Propulsion Laboratory, California Institute of Technology, under a contract with the National Aeronautics and Space Administration.

  9. Integrating Radar Image Data with Google Maps

    NASA Technical Reports Server (NTRS)

    Chapman, Bruce D.; Gibas, Sarah

    2010-01-01

    A public Web site has been developed as a method for displaying the multitude of radar imagery collected by NASA's Airborne Synthetic Aperture Radar (AIRSAR) instrument during its 16-year mission. Utilizing NASA's internal AIRSAR site, the new Web site features more sophisticated visualization tools that enable the general public to have access to these images. The site was originally maintained at NASA on six computers: one that held the Oracle database, two that took care of the software for the interactive map, and three that were for the Web site itself. Several tasks were involved in moving this complicated setup to just one computer. First, the AIRSAR database was migrated from Oracle to MySQL. Then the back-end of the AIRSAR Web site was updated in order to access the MySQL database. To do this, a few of the scripts needed to be modified, specifically three Perl scripts that query the database. The database connections were then updated from Oracle to MySQL, numerous syntax errors were corrected, and a query was implemented that replaced one of the stored Oracle procedures. Lastly, the interactive map was designed, implemented, and tested so that users could easily browse and access the radar imagery through the Google Maps interface.

  10. A low-cost GPS/INS integrated vehicle heading angle measurement system

    NASA Astrophysics Data System (ADS)

    Wu, Ye; Gao, Tongyue; Ding, Yi

    2018-04-01

    GPS can provide continuous heading information, but its accuracy is easily affected by the vehicle velocity and by shadowing from buildings or trees. For vehicle systems, we propose a low-cost heading angle update algorithm. Building on the GPS/INS integrated navigation Kalman filter, we add the GPS heading angle to the measurement vector and establish its error model. The experimental results show that this algorithm can effectively improve the accuracy of the heading angle estimate.
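A minimal scalar sketch of the idea: treat the INS-propagated heading as the prediction and fold in the GPS heading as a measurement whose error variance grows at low speed, where GPS-derived heading degrades. The variances, the speed-dependent error model, and all names are illustrative assumptions, not the paper's tuned filter.

```python
# One scalar Kalman measurement update for heading (degrees), with a
# speed-dependent GPS heading variance standing in for the error model
# the abstract says is added to the measurement vector.

def gps_heading_var(speed, base_var=1.0):
    """Assumed model: GPS course error grows roughly as 1/speed^2."""
    return base_var + 25.0 / max(speed, 0.1) ** 2

def kf_heading_update(psi_pred, p_pred, psi_gps, speed):
    r = gps_heading_var(speed)
    k = p_pred / (p_pred + r)            # Kalman gain
    psi = psi_pred + k * (psi_gps - psi_pred)
    p = (1.0 - k) * p_pred               # reduced uncertainty after update
    return psi, p

# INS says 90 deg with variance 4; GPS says 94 deg at 10 m/s
psi, p = kf_heading_update(psi_pred=90.0, p_pred=4.0, psi_gps=94.0, speed=10.0)
```

At low speed the gain shrinks automatically, so an unreliable GPS heading barely moves the estimate; this is the practical payoff of putting the heading error model into the filter.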

  11. Updated Palaeotsunami Database for Aotearoa/New Zealand

    NASA Astrophysics Data System (ADS)

    Gadsby, M. R.; Goff, J. R.; King, D. N.; Robbins, J.; Duesing, U.; Franz, T.; Borrero, J. C.; Watkins, A.

    2016-12-01

    The updated configuration, design, and implementation of a national palaeotsunami (pre-historic tsunami) database for Aotearoa/New Zealand (A/NZ) is near completion. This tool enables correlation of events along different stretches of the NZ coastline, provides information on frequency and extent of local, regional and distant-source tsunamis, and delivers detailed information on the science and proxies used to identify the deposits. In A/NZ a plethora of data, scientific research and experience surrounds palaeotsunami deposits, but much of this information has been difficult to locate, has variable reporting standards, and lacked quality assurance. The original database was created by Professor James Goff while working at the National Institute of Water & Atmospheric Research in A/NZ, but has subsequently been updated during his tenure at the University of New South Wales. The updating and establishment of the national database was funded by the Ministry of Civil Defence and Emergency Management (MCDEM), led by Environment Canterbury Regional Council, and supported by all 16 regions of A/NZ's local government. Creation of a single database has consolidated a wide range of published and unpublished research contributions from many science providers on palaeotsunamis in A/NZ. The information is now easily accessible and quality assured, and allows examination of frequency, extent and correlation of events. This provides authoritative scientific support for coastal-marine planning and risk management. The database will complement the GNS New Zealand Historical Database, and contribute to a heightened public awareness of tsunami by being a "one-stop-shop" for information on past tsunami impacts. There is scope for this to become an international database, enabling the Pacific-wide correlation of large events, as well as identifying smaller regional ones.
The Australian research community has already expressed an interest, and the database is also compatible with a similar one currently under development in Japan. Expressions of interest in collaborating with the A/NZ team to expand the database are invited from other Pacific nations.

  12. National Hospital Management Portal (NHMP): a framework for e-health implementation.

    PubMed

    Adetiba, E; Eleanya, M; Fatumo, S A; Matthews, V O

    2009-01-01

    Health information represents the main basis for the health decision-making process, and there have been some efforts to increase access to health information in developing countries. However, most of these efforts are based on the internet, which has minimal penetration, especially in rural and sub-urban parts of developing countries. In this work, a platform for medical record acquisition via the ubiquitous 2.5G/3G wireless communications technologies is presented. The National Hospital Management Portal (NHMP) platform has a central database at each specific country's national hospital which can be updated/accessed from hosts at health centres, clinics, medical laboratories, teaching hospitals, private hospitals and specialist hospitals across the country. With this, doctors can access patients' medical records more easily, get immediate access to test results from laboratories, and deliver prescriptions directly to pharmacists. If a particular treatment can be provided to a patient more effectively in another country, NHMP makes it simpler to organise and carry out such treatment abroad.

  13. Updated Global Data from the Guvi Instrument: New Products, Updated Calibration, and a New Web Interface

    NASA Astrophysics Data System (ADS)

    Schaefer, R. K.; Paxton, L. J.; Romeo, G.; Wolven, B. C.; Zhang, Y.; Comberiate, J.

    2014-12-01

    With its high-inclination orbit, GUVI provides global coverage of the ionosphere/thermosphere system, covering both hemispheres and revisiting each polar region 15 times a day. The GUVI instrument has long been a resource for the ITM community, with a panoply of data products available from the GUVI website (http://guvi.jhuapl.edu). With the release last year of the data products from the DMSP/SSUSI instrument, particularly more detailed auroral zone products (Q, E0, hemispheric power, discrete auroral arcs, proton precipitation regions) and new equatorial ionospheric products (3D electron densities, bubbles), a whole new set of UV data products has become available. SSUSI data are available from http://ssusi.jhuapl.edu. To leverage the experience and knowledge gained from running all of these instruments, we have adapted the SSUSI products so they can be made from GUVI telemetry. There are now updated versions of GUVI legacy products as well as brand new products. In addition, better on-orbit calibration techniques developed for SSUSI have now been applied to the GUVI instrument calibration; there is now a common set of software for calibrating both instruments. With a common data format, calibration, and product definition, the data from the SSUSI and GUVI instruments can now be easily combined, using multiple instruments to cover the hemispheres for a variety of global studies. In addition, the GUVI spectrographic mode data provide great detail about spectrographic features (e.g. O/N2 ratios, NO band emission) that are important for understanding dynamical processes in the thermosphere. A new version of the GUVI website (with the same interface as the SSUSI website) has been launched at guvi.jhuapl.edu to showcase the legacy products made with the new calibration and also highlight the newly developed products for the GUVI imaging and spectrographic modes.

  14. Development of the updated system of city underground pipelines based on Visual Studio

    NASA Astrophysics Data System (ADS)

    Zhang, Jianxiong; Zhu, Yun; Li, Xiangdong

    2009-10-01

    Our city operates an integrated pipeline network management system built on ArcGIS Engine 9.1 as the underlying development platform, with Oracle9i as the database for storing data. In this system, ArcGIS SDE 9.1 serves as the spatial data engine, and the system is a comprehensive management application developed with the Visual Studio visual development tools. Because the pipeline update function of the original system was slow and occasionally lost data, we developed and added a new update module to ensure that the underground pipeline data can be updated conveniently and frequently in real time, preserving the currency and integrity of the underground pipeline data. The module provides powerful data update functions, including data input and output and rapid bulk updates. The new module was likewise developed with the Visual Studio visual development tools and uses Microsoft Access as the database for storing data. Graphics can be edited in the AutoCAD software, and the database is updated through a link between the graphics and the system. Practice shows that the update module is well compatible with the original system and that database updates are reliable and efficient.

  15. Nonstandard Logistics Success in Unconventional Warfare

    DTIC Science & Technology

    2015-11-01

    water from local farms, vendors, and markets. The PACE plan would be constantly updated as the operational picture changes, such as a variation in...sustainability. For example, if a forward element is purchasing a noticeable amount of rice from a local farm, the team might begin purchasing rice...another OE be forced to use another item that has value. Cash may work in a city center, but a goat in a rural area is just as valuable and easily sold

  16. Catalog Descriptions Using VOTable Files

    NASA Astrophysics Data System (ADS)

    Thompson, R.; Levay, K.; Kimball, T.; White, R.

    2008-08-01

    Additional information is frequently required to describe database table contents and make them understandable to users. For this reason, the Multimission Archive at Space Telescope (MAST) creates "description files" for each table/catalog. After trying various XML and CSV formats, we finally chose VOTable. These files are easy to update via an HTML form, are easily read using an XML parser such as (in our case) the PHP5 SimpleXML extension, and have found multiple uses in our data access/retrieval process.
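The appeal of VOTable as a description format is that fields and their metadata parse out with a few lines of standard XML handling. The paper uses PHP5 SimpleXML; the sketch below is a Python equivalent with an invented minimal table (real VOTables also carry namespaces and DATA sections, omitted here).

```python
# Parse FIELD metadata out of a minimal, made-up VOTable description file.
import xml.etree.ElementTree as ET

VOTABLE = """<VOTABLE><RESOURCE><TABLE name="example_catalog">
  <FIELD name="ra"  datatype="double" unit="deg"/>
  <FIELD name="dec" datatype="double" unit="deg"/>
  <FIELD name="target" datatype="char" arraysize="*"/>
</TABLE></RESOURCE></VOTABLE>"""

root = ET.fromstring(VOTABLE)
fields = [(f.get("name"), f.get("datatype"), f.get("unit"))
          for f in root.iter("FIELD")]
```

The same file that drives the human-readable catalog description page can thus drive the retrieval code, which is the "multiple uses" the abstract mentions.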

  17. Guideline on terminology and definitions of updating clinical guidelines: The Updating Glossary.

    PubMed

    Martínez García, Laura; Pardo-Hernández, Hector; Sanabria, Andrea Juliana; Alonso-Coello, Pablo; Penman, Katrina; McFarlane, Emma

    2018-03-01

    The Guidelines International Network (G-I-N) Updating Guidelines Working Group launched an initiative to develop a glossary (the Updating Glossary) with domains, terms, definitions, and synonyms related to the updating of clinical guidelines (CGs). The steering committee developed an initial list of domains, terms, definitions, and synonyms through brainstorming and discussion. The panel members participated in three rounds of feedback to discuss, refine, and clarify the proposed terms, definitions, and synonyms. Finally, the panel members were surveyed to assess their level of agreement regarding the glossary. Eighteen terms were identified and defined: (1) continuous updating, (2) decision to update, (3) fixed updating, (4) full updating, (5) impact of the new evidence, (6) partial updating, (7) prioritization process, (8) reporting process, (9) signal for an update, (10) surveillance process, (11) time of validity, (12) timeframe, (13) tools and resources, (14) up to date, (15) update cycle, (16) update unit, (17) updated version, and (18) updating strategy. Consensus was reached for all terms, definitions, and synonyms (median agreement scores ≥ 6), except for one term. The G-I-N Updating Guidelines Working Group assembled the Updating Glossary to facilitate and improve knowledge exchange among CG developers, researchers, and users. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. Implications of next generation attenuation ground motion prediction equations for site coefficients used in earthquake resistant design

    USGS Publications Warehouse

    Borcherdt, Roger D.

    2014-01-01

    Proposals are developed to update Tables 11.4-1 and 11.4-2 of Minimum Design Loads for Buildings and Other Structures, published as American Society of Civil Engineers Structural Engineering Institute standard 7-10 (ASCE/SEI 7-10). The updates are mean next generation attenuation (NGA) site coefficients inferred directly from the four NGA ground motion prediction equations used to derive the maximum considered earthquake response maps adopted in ASCE/SEI 7-10. Proposals include the recommendation to use straight-line interpolation to infer site coefficients at intermediate values of vS30 (average shear velocity to 30-m depth). The NGA coefficients are shown to agree well with adopted site coefficients at low levels of input motion (0.1 g) and with those observed from the Loma Prieta earthquake. For higher levels of input motion, the majority of the adopted values are within the 95% epistemic-uncertainty limits implied by the NGA estimates, with the exceptions being the mid-period site coefficient, Fv, for site class D and the short-period coefficient, Fa, for site class C, both of which are slightly less than the corresponding 95% limit. The NGA database shows that the median vS30 value of 913 m/s for site class B is more typical than 760 m/s as a value to characterize firm to hard rock sites as the uniform ground condition for future maximum considered earthquake response ground motion estimates. Future updates of NGA ground motion prediction equations can be incorporated easily into future adjustments of adopted site coefficients using the procedures presented herein.
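The recommended straight-line interpolation of site coefficients at intermediate shear velocities is a one-line computation between bracketing anchor points. The two (vS30, Fa) anchors below are purely illustrative numbers, not values from the ASCE/SEI 7-10 tables.

```python
# Linear interpolation of a site coefficient between two anchor points
# (vs30 in m/s, coefficient dimensionless). Anchor values are made up.

def interp_site_coeff(vs30, lo, hi):
    """Straight-line interpolation between (vs30, coeff) anchors lo and hi."""
    (v0, f0), (v1, f1) = lo, hi
    t = (vs30 - v0) / (v1 - v0)
    return f0 + t * (f1 - f0)

# e.g. short-period coefficient Fa at an intermediate vS30 of 540 m/s
fa = interp_site_coeff(540.0, lo=(360.0, 1.2), hi=(760.0, 1.0))
```

Interpolating on vS30 directly avoids the step discontinuities that occur when a site near a class boundary is forced to take one class's tabulated coefficient.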
  19. Utilizing semantic Wiki technology for intelligence analysis at the tactical edge

    NASA Astrophysics Data System (ADS)

    Little, Eric

    2014-05-01

    Challenges exist for intelligence analysts to efficiently and accurately process large amounts of data collected from a myriad of available data sources. These challenges are even more evident for analysts who must operate within small military units at the tactical edge. In such environments, decisions must be made quickly without guaranteed access to the kinds of large-scale data sources available to analysts working at intelligence agencies. Improved technologies must be provided to analysts at the tactical edge so that they can make informed, reliable decisions, since this is often a critical collection point for important intelligence data. To aid tactical edge users, new types of intelligent, automated technology interfaces are required to allow them to rapidly explore information associated with the intersection of hard and soft data fusion, such as multi-INT signals, semantic models, social network data, and natural language processing of text. The ability to fuse these types of data is paramount to providing decision superiority. For these types of applications, we have developed BLADE. BLADE allows users to dynamically add, delete and link data via a semantic wiki, enabling improved interaction between different users. Analysts can see information updates in near-real-time due to a common underlying set of semantic models operating within a triple store that allows for updates on related data points from independent users tracking different items (persons, events, locations, organizations, etc.). The wiki can capture pictures, videos and related information. New information added directly to pages is automatically updated in the triple store, and its provenance and pedigree are tracked over time, making the data more trustworthy and easily integrated with other users' pages.
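The backing idea of such a wiki can be illustrated with a toy in-memory triple store: edits become subject-predicate-object triples stamped with provenance, so independent analysts' pages see each other's updates when they query. The class, schema, and provenance fields below are illustrative assumptions, not BLADE's actual design.

```python
# Toy triple store with per-triple provenance, sketching how independent
# users' edits about the same entity become jointly queryable.
import time

class TripleStore:
    def __init__(self):
        self.triples = []   # (subject, predicate, object, provenance)

    def add(self, s, p, o, who):
        self.triples.append((s, p, o, {"by": who, "at": time.time()}))

    def query(self, s=None, p=None, o=None):
        """Return matching (s, p, o) triples; None acts as a wildcard."""
        return [(ts, tp, to) for ts, tp, to, _ in self.triples
                if s in (None, ts) and p in (None, tp) and o in (None, to)]

store = TripleStore()
store.add("person:42", "locatedIn", "city:A", who="analyst1")
store.add("person:42", "memberOf", "org:X", who="analyst2")  # different user
facts = store.query(s="person:42")   # both analysts' facts come back together
```

Keeping provenance on every triple is what lets pedigree be tracked over time and lets each analyst judge how much to trust a given assertion.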

  20. Patient data system for monitoring shunts.

    PubMed

    Frank, E; Su, E; Smith, K

    1988-01-01

    Rapidly locating accurate data on a patient's shunt system is often extremely difficult. We have developed a simple system to fill a perceived need for recording current data on a patient's shunt. This system employs an easily updated record in the patient's hospital or clinic chart as well as a wallet-sized data card for the patient or his family to carry. The data in the chart include the configuration of the patient's current shunt system and a graphic record of previous shunt problems. The small patient data card describes the age of the shunt system and its current configuration. We have found that this system provides assistance in the routine follow-up of patients with shunts and plays an essential role in the emergency evaluation of these patients, particularly when an emergency evaluation is undertaken in facilities distant from the location of regular treatment.

  1. PDBe: towards reusable data delivery infrastructure at protein data bank in Europe

    PubMed Central

    Alhroub, Younes; Anyango, Stephen; Armstrong, David R; Berrisford, John M; Clark, Alice R; Conroy, Matthew J; Dana, Jose M; Gupta, Deepti; Gutmanas, Aleksandras; Haslam, Pauline; Mak, Lora; Mukhopadhyay, Abhik; Nadzirin, Nurul; Paysan-Lafosse, Typhaine; Sehnal, David; Sen, Sanchayita; Smart, Oliver S; Varadi, Mihaly; Kleywegt, Gerard J

    2018-01-01

    The Protein Data Bank in Europe (PDBe, pdbe.org) is actively engaged in the deposition, annotation, remediation, enrichment and dissemination of macromolecular structure data. This paper describes new developments and improvements at PDBe addressing three challenging areas: data enrichment, data dissemination and functional reusability. New features of the PDBe Web site are discussed, including a context-dependent menu providing links to raw experimental data and improved presentation of structures solved by hybrid methods. The paper also summarizes the features of the LiteMol suite, a set of services enabling fast and interactive 3D visualization of structures, with associated experimental maps, annotations and quality assessment information. We introduce a library of Web components which can be easily reused to port data and functionality available at PDBe to other services. We also introduce updates to the SIFTS resource, which maps PDB data to other bioinformatics resources, and to the PDBe REST API. PMID:29126160

  2. Nonlinear and Digital Man-machine Control Systems Modeling

    NASA Technical Reports Server (NTRS)

    Mekel, R.

    1972-01-01

    An adaptive modeling technique is examined by which controllers can be synthesized to provide corrective dynamics to a human operator's mathematical model in closed-loop control systems. The technique utilizes a class of Liapunov functions formulated for this purpose, Liapunov's stability criterion, and a model-reference system configuration. The Liapunov function is formulated to possess variable characteristics that take the identification dynamics into consideration. The time derivative of the Liapunov function generates the identification and control laws for the mathematical model system. These laws permit the realization of a controller which updates the human operator's mathematical model parameters so that model and human operator produce the same response when subjected to the same stimulus. A very useful feature is the development of a digital computer program which is easily implemented and modified concurrently with experimentation. The program permits the modeling process to interact with the experimentation process in a mutually beneficial way.
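As a rough illustration of the parameter-update idea, the sketch below uses a generic gradient-type (MIT-rule style) adaptation law, not the paper's Liapunov-derived laws; the gains and the scalar plant are assumptions for demonstration only:

```python
import math

# Illustrative sketch: a gradient-type update drives a model's parameter
# toward the value that makes the model's response match the "operator"
# response for the same stimulus. Not the paper's exact formulation.
k_true = 2.5      # unknown operator gain to be identified
theta  = 0.0      # model parameter, updated online
gamma  = 0.1      # adaptation rate (assumed)

for step in range(2000):
    u = math.sin(0.01 * step)        # common stimulus to operator and model
    y_operator = k_true * u          # operator response
    y_model    = theta * u           # model response
    error      = y_operator - y_model
    theta     += gamma * error * u   # gradient (MIT-rule style) update

print(round(theta, 3))
```

With a persistently exciting stimulus, the error term shrinks each pass and the model parameter settles at the operator's gain, mirroring the requirement that model and operator produce the same response to the same stimulus.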

  3. Ultralight amorphous silicon alloy photovoltaic modules for space and terrestrial applications

    NASA Astrophysics Data System (ADS)

    Hanak, J. J.; Fulton, C.; Myatt, A.; Nath, P.; Woodyard, J. R.

    This paper gives a review and an update on recently developed ultralight photovoltaic modules based on amorphous silicon (a-Si) alloys. They consist of tandem-junction solar cells deposited by a continuous, roll-to-roll process onto thin foil substrates of bare metal, high-temperature resin, or metal coated with insulators. They have the following features: size, up to 71 cm x 30.5 cm; total thickness, 8 to 50 microns; power-to-weight ratio at AM1, 2.4 kW/kg; and power-to-volume ratio, 6.5 MW/cu m. Cells of a-Si alloys are over 50 times more tolerant to irradiation with 1 MeV and with 200 keV protons than crystalline cells, and the damage is easily annealable. The modules have high power density and stability; they are portable, stowable, deployable, retractable, tolerant to radiation and meteorite or projectile impact, and attractive for terrestrial and aerospace applications.

  4. Thirty Years, One Million Spectra: Public Access to the SAO Spectral Archives

    NASA Astrophysics Data System (ADS)

    Mink, J.; Moran, S.

    2015-09-01

    Over the last 30 years, the SAO Telescope Data Center has reduced and archived over 1,000,000 spectra, consisting of 287,000 spectra from five high-dispersion echelle spectrographs and 717,000 spectra from four low-dispersion spectrographs, across three telescopes. 151,000 spectra from six instruments are currently online and publicly available, covering many interesting objects in the northern sky, including most of the galaxies in the Updated Zwicky Catalog, which are reachable through NED or Simbad. A majority of the high-dispersion spectra will soon be made public, as will more data from the MMT multi-fiber spectrographs. Many objects in the archive have multiple spectra over time, making them a valuable resource for archival time-domain studies. We are now developing a system to make all of the public spectra more easily searchable and viewable through the Virtual Observatory.

  5. BioMAJ: a flexible framework for databanks synchronization and processing.

    PubMed

    Filangi, Olivier; Beausse, Yoann; Assi, Anthony; Legrand, Ludovic; Larré, Jean-Marc; Martin, Véronique; Collin, Olivier; Caron, Christophe; Leroy, Hugues; Allouche, David

    2008-08-15

    Large- and medium-scale computational molecular biology projects require accurate bioinformatics software and numerous heterogeneous biological databanks, which are distributed around the world. BioMAJ provides a flexible, robust, fully automated environment for managing such massive amounts of data. The Java application enables automation of the data update cycle and supervision of the locally mirrored data repository. We have developed workflows that handle some of the most commonly used bioinformatics databases. A set of scripts is also available for post-synchronization data treatment consisting of indexing or format conversion (for NCBI BLAST, SRS, EMBOSS, GCG, etc.). BioMAJ can easily be extended with custom processing scripts. Source history can be kept via HTML reports containing statements of locally managed databanks. http://biomaj.genouest.org. BioMAJ is free, open-source software, freely available under the CECILL version 2 license.

  6. SU-E-J-92: CERR: New Tools to Analyze Image Registration Precision.

    PubMed

    Apte, A; Wang, Y; Oh, J; Saleh, Z; Deasy, J

    2012-06-01

    To present new tools in CERR (the Computational Environment for Radiotherapy Research) to analyze image registration, along with other software updates and additions. CERR continues to be a key environment (cited more than 129 times to date) for numerous RT-research studies involving outcomes modeling, prototyping algorithms for segmentation and registration, experiments with phantom dosimetry, IMRT research, etc. Image registration is one of the key technologies required in many research studies. CERR has been interfaced with popular image registration frameworks such as Plastimatch and ITK. Once the images have been auto-registered, CERR provides tools to analyze the accuracy of registration using the following innovative approaches: (1) Distance Discordance Histograms (DDH), described in detail in a separate paper, and (2) 'MirrorScope', which works as follows: for any view plane, the 2D image is broken up into a 2D grid of medium-sized squares. Each square contains a right half, which shows the reference image, and a left half, which shows the mirror-flipped version of the overlay image. The user can increase or decrease the size of this grid to control the resolution of the analysis. Other updates to CERR include tools to extract image and dosimetric features programmatically and store them in a central database, and tools to interface with statistical analysis software such as SPSS and the Matlab Statistics toolbox. MirrorScope was compared on various examples, including 'perfect' registration examples and 'artificially translated' registrations. For 'perfect' registration, the patterns obtained within each square are symmetric and are easily, visually recognized as aligned. For registrations that are off, the squares located in the regions of imperfection show asymmetric patterns that are easily recognized. The new updates to CERR further increase its utility for RT-research. MirrorScope is a visually intuitive method of monitoring the accuracy of image registration that improves on the visual confusion of standard methods. © 2012 American Association of Physicists in Medicine.
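A minimal sketch of the MirrorScope tiling idea for a single grid square, assuming the half-and-half composition described above; the function name and details are illustrative, not taken from the CERR source:

```python
import numpy as np

# Sketch of the MirrorScope idea for one grid square: show the reference
# image in the right half and the horizontally mirrored overlay image in the
# left half. If the two images are perfectly registered, the composite square
# is left-right symmetric about its vertical midline.
def mirrorscope_tile(reference_tile, overlay_tile):
    h, w = reference_tile.shape
    composite = reference_tile.copy()
    mirrored = overlay_tile[:, ::-1]          # flip the overlay left-right
    composite[:, :w // 2] = mirrored[:, :w // 2]
    return composite

rng = np.random.default_rng(0)
tile = rng.random((8, 8))
perfect = mirrorscope_tile(tile, tile)        # overlay identical to reference
# symmetry check: the composite equals its own mirror when registration is perfect
print(np.allclose(perfect, perfect[:, ::-1]))
```

A misregistered overlay breaks this symmetry, which is what makes the pattern easy to recognize visually.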

  7. Development of an expert analysis tool based on an interactive subsidence hazard map for urban land use in the city of Celaya, Mexico

    NASA Astrophysics Data System (ADS)

    Alloy, A.; Gonzalez Dominguez, F.; Nila Fonseca, A. L.; Ruangsirikulchai, A.; Gentle, J. N., Jr.; Cabral, E.; Pierce, S. A.

    2016-12-01

    Land subsidence as a result of groundwater extraction in central Mexico's larger urban centers began in the 1980s as a result of population and economic growth. The city of Celaya has undergone subsidence for a few decades, and one consequence is the development of an active normal fault system that affects its urban infrastructure and residential areas. To facilitate analysis and land-use decision making, we created an online interactive map enabling users to easily obtain information associated with land subsidence. Geological and socioeconomic data for the city were collected, including fault locations and population data; other infrastructure and structural data were obtained from fieldwork carried out as part of a study-abroad undergraduate exchange course. The subsidence and associated faulting hazard map was created using an InSAR-derived subsidence velocity map and population data from INEGI, identifying hazard zones through a spatial analysis based on a subsidence-gradient and population risk matrix. This interactive map provides a simple perspective on the different vulnerable urban elements. As an accessible visualization tool, it will enhance communication between scientific and socio-economic disciplines. Our project also lays the groundwork for a future expert analysis system: an open-source, easily accessible, Python-coded, SQLite-driven website that archives fault and subsidence data along with visual documentation of damage to civil structures. The database takes field notes and provides an entry form for uniform datasets, which are used to generate JSON output. Such a database is useful because it gives geoscientists a centralized repository and access to their observations over time. Because the subsidence phenomenon is widespread throughout cities in central Mexico, the spatial analysis has been automated using the open-source software R. The raster, rgeos, shapefiles, and rgdal libraries have been used to develop a script that produces raster maps of horizontal gradient and population density. An advantage is that this analysis can be automated for periodic updates or repurposed for similar analyses in other cities, providing an easily accessible tool for land subsidence hazard assessments.
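The gradient-plus-population hazard classification can be sketched as follows. The abstract describes an R raster workflow; this illustration uses Python/NumPy instead, and the thresholds and risk scoring are assumptions, not values from the study:

```python
import numpy as np

# Illustrative hazard classification: combine the horizontal gradient of a
# subsidence-velocity raster with a population raster. Thresholds and the
# risk scoring below are assumed for demonstration.
def hazard_map(subsidence_mm_yr, population, grad_thresh, pop_thresh):
    gy, gx = np.gradient(subsidence_mm_yr)    # per-cell horizontal gradient
    grad_mag = np.hypot(gx, gy)
    # risk score: 0 = low, 1 = elevated (one factor), 2 = high (both factors)
    return (grad_mag > grad_thresh).astype(int) + (population > pop_thresh).astype(int)

subsidence = np.array([[0.0, 0.0, 0.0],
                       [0.0, 0.0, 10.0],
                       [0.0, 10.0, 20.0]])    # mm/yr; the sharp step mimics a fault zone
population = np.array([[0, 0, 0],
                       [0, 500, 500],
                       [0, 500, 500]])        # inhabitants per cell
print(hazard_map(subsidence, population, grad_thresh=4.0, pop_thresh=100))
```

Cells where a steep subsidence gradient coincides with high population density score highest, which is the matrix logic behind the hazard zones.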

  8. [Purity Detection Model Update of Maize Seeds Based on Active Learning].

    PubMed

    Tang, Jin-ya; Huang, Min; Zhu, Qi-bing

    2015-08-01

    Seed purity reflects the degree to which seed varieties show typical, consistent characteristics, so improving the reliability and accuracy of seed purity detection is of great importance for guaranteeing seed quality. Hyperspectral imaging can reflect the internal and external characteristics of seeds at the same time and has been widely used in nondestructive detection of agricultural products. The essence of nondestructive detection of agricultural products using hyperspectral imaging is to establish a mathematical model between the spectral information and the quality of the agricultural products. Since the spectral information is easily affected by the sample growth environment, the stability and generalization of a model weaken when test samples are harvested from a different origin or year. An active learning algorithm was investigated to add representative samples that expand the sample space of the original model, so as to rapidly update the model. Random selection (RS) and the Kennard-Stone algorithm (KS) were used to compare the model update effect with the active learning algorithm. The experimental results indicated that, for different sample-set split ratios (1:1, 3:1, 4:1), the updated purity detection model for maize seeds from 2010, after adding 40 samples from 2011 selected by the active learning algorithm, increased its prediction accuracy on new 2011 samples from 47%, 33.75%, and 49% to 98.89%, 98.33%, and 98.33%. For the updated 2011 model, prediction accuracy on new 2010 samples increased from 50.83%, 54.58%, and 53.75% to 94.57%, 94.02%, and 94.57% after adding 56 new samples from 2010. The effect of the model updated by the active learning algorithm was also better than that of RS and KS. Therefore, updating the purity detection model for maize seeds by active learning is feasible.
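The update loop described above (select informative new-season samples, add them to the training set, retrain) can be sketched as follows. The nearest-centroid classifier and margin-based selection rule are stand-ins, not the paper's hyperspectral model or its exact active learning criterion:

```python
import numpy as np

# Sketch of active-learning model updating: from a pool of new-season samples,
# add the ones the current model is least certain about, then retrain.
def nearest_centroid_fit(X, y):
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def predict(centroids, X):
    classes = sorted(centroids)
    d = np.stack([np.linalg.norm(X - centroids[c], axis=1) for c in classes], axis=1)
    return np.array(classes)[d.argmin(axis=1)], d

def select_uncertain(centroids, X_pool, k):
    _, d = predict(centroids, X_pool)
    d.sort(axis=1)
    margin = d[:, 1] - d[:, 0]          # small margin = uncertain sample
    return np.argsort(margin)[:k]

rng = np.random.default_rng(1)
X_old = np.vstack([rng.normal(0, 1, (30, 2)), rng.normal(4, 1, (30, 2))])
y_old = np.array([0] * 30 + [1] * 30)
# "new season" data is shifted, degrading the old model's fit
X_new = np.vstack([rng.normal(1, 1, (30, 2)), rng.normal(5, 1, (30, 2))])
y_new = np.array([0] * 30 + [1] * 30)

model = nearest_centroid_fit(X_old, y_old)
picked = select_uncertain(model, X_new, k=10)
# update: retrain with the selected new samples added to the training set
model = nearest_centroid_fit(np.vstack([X_old, X_new[picked]]),
                             np.hstack([y_old, y_new[picked]]))
pred, _ = predict(model, X_new)
print((pred == y_new).mean())
```

The point of the selection step is that a small number of well-chosen new samples is enough to pull the model toward the new season's spectral distribution, which matches the small sample counts (40 and 56) reported above.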

  9. Recommendations for kidney disease guideline updating: a report by the KDIGO Methods Committee

    PubMed Central

    Uhlig, Katrin; Berns, Jeffrey S.; Carville, Serena; Chan, Wiley; Cheung, Michael; Guyatt, Gordon H.; Hart, Allyson; Lewis, Sandra Zelman; Tonelli, Marcello; Webster, Angela C.; Wilt, Timothy J.; Kasiske, Bertram L.

    2017-01-01

    Updating rather than de novo guideline development now accounts for the majority of guideline activities for many guideline development organizations, including Kidney Disease: Improving Global Outcomes (KDIGO), an international kidney disease guideline development entity that has produced guidelines on kidney diseases since 2008. Increasingly, guideline developers are moving away from updating at fixed intervals in favor of more flexible approaches that use periodic expert assessment of guideline currency (with or without an updated systematic review) to determine the need for updating. Determining the need for guideline updating in an efficient, transparent, and timely manner is challenging, and updating of systematic reviews and guidelines is labor intensive. Ideally, guidelines should be updated dynamically when new evidence indicates a need for a substantive change in the guideline based on a priori criteria. This dynamic updating (sometimes referred to as a living guideline model) can be facilitated with the use of integrated electronic platforms that allow updating of specific recommendations. This report summarizes consensus-based recommendations from a panel of guideline methodology professionals on how to keep KDIGO guidelines up to date. PMID:26994574

  10. Finite element model updating of riveted joints of simplified model aircraft structure

    NASA Astrophysics Data System (ADS)

    Yunus, M. A.; Rani, M. N. Abdul; Sani, M. S. M.; Shah, M. A. S. Aziz

    2018-04-01

    Thin metal sheets are widely used to fabricate various types of aerospace structures because of their flexibility and the ease with which they can be formed into structures of almost any shape. The riveted joint has turned out to be one of the most popular joint types for joining aerospace structures because riveted assemblies can easily be disassembled, maintained and inspected. In this paper, thin metal sheet components are assembled via riveted joints to form a simplified model of an aerospace structure. However, modelling structures that are attached together via mechanical joints such as riveted joints is very difficult because of local effects. Understandably, the dynamic characteristics of the joined structure can be significantly affected by these joints due to local effects at the mating areas of the riveted joints, such as surface contact, clamping force and slip. A few types of element connectors available in MSC NASTRAN/PATRAN have been investigated in order to represent the rivet joints. The results obtained, in terms of natural frequencies and mode shapes, are then compared with their experimental counterparts in order to investigate the level of accuracy achieved by the element connectors used in modelling the rivet joints of the riveted structure. A reconciliation method via finite element model updating is used to minimise the discrepancy between the initial finite element model of the riveted structure and the experimental data, and the results are discussed.

  11. Balancing exploration and exploitation in population-based sampling improves fragment-based de novo protein structure prediction.

    PubMed

    Simoncini, David; Schiex, Thomas; Zhang, Kam Y J

    2017-05-01

    Conformational search space exploration remains a major bottleneck for protein structure prediction methods. Population-based meta-heuristics typically enable the possibility to control the search dynamics and to tune the balance between local energy minimization and search space exploration. EdaFold is a fragment-based approach that can guide search by periodically updating the probability distribution over the fragment libraries used during model assembly. We implement the EdaFold algorithm as a Rosetta protocol and provide two different probability update policies: a cluster-based variation (EdaRose_c) and an energy-based one (EdaRose_en). We analyze the search dynamics of our new Rosetta protocols and show that EdaRose_c is able to provide predictions with lower Cα RMSD to the native structure than EdaRose_en and the Rosetta AbInitio Relax protocol. Our software is freely available as a C++ patch for the Rosetta suite and can be downloaded from http://www.riken.jp/zhangiru/software/. Our protocols can easily be extended in order to create alternative probability update policies and generate new search dynamics. Proteins 2017; 85:852-858. © 2016 Wiley Periodicals, Inc.
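The core EdaFold-style idea of periodically re-weighting the fragment distribution can be sketched as follows; the scoring input and the smoothing rate are assumptions for illustration, not EdaFold's published update rule:

```python
import numpy as np

# Sketch of an EDA-style update: after each iteration, re-weight the
# probability distribution over a fragment library toward fragments that
# appeared in low-energy models. Counts and learning rate are illustrative.
def update_distribution(probs, fragment_counts, learning_rate=0.5):
    counts = np.asarray(fragment_counts, dtype=float)
    empirical = counts / counts.sum()             # frequency in selected low-energy models
    new = (1 - learning_rate) * probs + learning_rate * empirical
    return new / new.sum()                         # keep it a valid distribution

probs = np.full(4, 0.25)                           # uniform over a 4-fragment library
# suppose fragments 0 and 2 dominated the best-scoring decoys this round
probs = update_distribution(probs, [8, 1, 10, 1])
print(probs.round(3))
```

The learning rate is exactly the exploration/exploitation dial mentioned in the abstract: a small value keeps sampling close to uniform (exploration), while a large value concentrates probability on fragments from the best models (exploitation).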

  12. OLIVER: an online library of images for veterinary education and research.

    PubMed

    McGreevy, Paul; Shaw, Tim; Burn, Daniel; Miller, Nick

    2007-01-01

    As part of a strategic move by the University of Sydney toward increased flexibility in learning, the Faculty of Veterinary Science undertook a number of developments involving Web-based teaching and assessment. OLIVER underpins them by providing a rich, durable repository for learning objects. To integrate Web-based learning, case studies, and didactic presentations for veterinary and animal science students, we established an online library of images and other learning objects for use by academics in the Faculties of Veterinary Science and Agriculture. The objectives of OLIVER were to maximize the use of the faculty's teaching resources by providing a stable archiving facility for graphic images and other multimedia learning objects that allows flexible and precise searching, integrating indexing standards, thesauri, pull-down lists of preferred terms, and linking of objects within cases. OLIVER offers a portable and expandable Web-based shell that facilitates ongoing storage of learning objects in a range of media. Learning objects can be downloaded in common, standardized formats so that they can be easily imported for use in a range of applications, including Microsoft PowerPoint, WebCT, and Microsoft Word. OLIVER now contains more than 9,000 images relating to many facets of veterinary science; these are annotated and supported by search engines that allow rapid access to both images and relevant information. The Web site is easily updated and adapted as required.

  13. Self-emulsifying drug delivery systems (SEDDS): formulation development, characterization, and applications.

    PubMed

    Singh, Bhupinder; Bandopadhyay, Shantanu; Kapil, Rishi; Singh, Ramandeep; Katare, O

    2009-01-01

    Self-emulsifying drug delivery systems (SEDDS) possess unparalleled potential in improving the oral bioavailability of poorly water-soluble drugs. Following their oral administration, these systems rapidly disperse in gastrointestinal fluids, yielding micro- or nanoemulsions containing the solubilized drug. Owing to its minuscule globule size, the micro/nanoemulsified drug can easily be absorbed through lymphatic pathways, bypassing the hepatic first-pass effect. We present an exhaustive and updated account of numerous literature reports and patents on diverse types of self-emulsifying drug formulations, with emphasis on their formulation, characterization, and systematic optimization strategies. Recent advancements in the various methodologies employed to characterize their globule size and shape, ability to encapsulate the drug, gastrointestinal and thermodynamic stability, rheological characteristics, and so forth are discussed comprehensively to guide the formulator in preparing an effective and robust SEDDS formulation. Also, this exhaustive review offers an explicit discussion of vital applications of the SEDDS in bioavailability enhancement of various drugs, outlining an overview of the myriad in vitro, in situ, and ex vivo techniques to assess the absorption and/or permeation potential of drugs incorporated in the SEDDS in animal and cell line models, and the subsequent absorption pathways followed by them. In short, the current article furnishes an updated compilation of wide-ranging information on all the requisite vistas of self-emulsifying formulations, thus paving the way for accelerated progress in SEDDS application in pharmaceutical research.

  14. Rice Annotation Project Database (RAP-DB): an integrative and interactive database for rice genomics.

    PubMed

    Sakai, Hiroaki; Lee, Sung Shin; Tanaka, Tsuyoshi; Numa, Hisataka; Kim, Jungsok; Kawahara, Yoshihiro; Wakimoto, Hironobu; Yang, Ching-chia; Iwamoto, Masao; Abe, Takashi; Yamada, Yuko; Muto, Akira; Inokuchi, Hachiro; Ikemura, Toshimichi; Matsumoto, Takashi; Sasaki, Takuji; Itoh, Takeshi

    2013-02-01

    The Rice Annotation Project Database (RAP-DB, http://rapdb.dna.affrc.go.jp/) has been providing a comprehensive set of gene annotations for the genome sequence of rice, Oryza sativa (japonica group) cv. Nipponbare. Since the first release in 2005, RAP-DB has been updated several times along with the genome assembly updates. Here, we present our newest RAP-DB based on the latest genome assembly, Os-Nipponbare-Reference-IRGSP-1.0 (IRGSP-1.0), which was released in 2011. We detected 37,869 loci by mapping transcript and protein sequences of 150 monocot species. To provide plant researchers with highly reliable and up-to-date rice gene annotations, we have been incorporating literature-based manually curated data, and 1,626 loci currently incorporate literature-based annotation data, including commonly used gene names or gene symbols. Transcriptional activities are shown at the nucleotide level by mapping RNA-Seq reads derived from 27 samples. We also mapped the Illumina reads of a leading Japanese japonica cultivar, Koshihikari, and a Chinese indica cultivar, Guangluai-4, to the genome and show the alignments together with single nucleotide polymorphisms (SNPs) and gene functional annotations through a newly developed browser, the Short-Read Assembly Browser (S-RAB). We have developed two satellite databases, the Plant Gene Family Database (PGFD) and the Integrative Database of Cereal Gene Phylogeny (IDCGP), which display gene family and homologous gene relationships among diverse plant species. RAP-DB and the satellite databases offer simple and user-friendly web interfaces, enabling plant and genome researchers to access the data easily and facilitating a broad range of plant research topics.

  15. Monitoring of the data processing and simulated production at CMS with a web-based service: the Production Monitoring Platform (pMp)

    NASA Astrophysics Data System (ADS)

    Franzoni, G.; Norkus, A.; Pol, A. A.; Srimanobhas, N.; Walker, J.

    2017-10-01

    Physics analysis at the Compact Muon Solenoid requires both the production of simulated events and the processing of the data collected by the experiment. Since the end of LHC Run-I in 2012, CMS has produced over 20 billion simulated events, from 75 thousand processing requests organised in one hundred different campaigns. These campaigns emulate different configurations of collision events, the detector, and LHC running conditions. In the same time span, sixteen data processing campaigns have taken place to reconstruct different portions of the Run-I and Run-II data with ever-improving algorithms and calibrations. The scale and complexity of the event simulation and processing, and the requirement that multiple campaigns proceed in parallel, demand that comprehensive, frequently updated and easily accessible monitoring be made available. The monitoring must serve both the analysts, who want to know which datasets will become available and when, and the central production teams in charge of submitting, prioritizing, and running the requests across the distributed computing infrastructure. The Production Monitoring Platform (pMp), a web-based service, was developed in 2015 to address those needs. It aggregates information from multiple services used to define, organize, and run the processing requests. Information is updated hourly using a dedicated elastic database, and the monitoring provides multiple configurable views to assess the status of single datasets as well as entire production campaigns. This contribution describes the pMp development, the evolution of its functionalities, and one and a half years of operational experience.

  16. Two new species of obligatory termitophilous rove beetles from Brazil (Coleoptera: Staphylinidae: Termitomorpha Wasmann).

    PubMed

    Caron, Edilson; Bortoluzzi, Sidnei; Rosa, Cassiano S

    2018-04-23

    In recent surveys carried out in Brazil, we detected two new species of Termitomorpha Wasmann, a genus of Termitogastrina (Aleocharinae, Corotocini). This genus is easily recognizable in the subtribe by its evenly convex and smooth pronotum. The new species are Termitomorpha sinuosa Caron, Bortoluzzi & Rosa, sp. nov., and Termitomorpha alata Caron & Bortoluzzi, sp. nov. Both species are described and illustrated here, including scanning electron photographs. Their host termites were identified as species of Nasutitermes. The key to species of Termitomorpha is updated based on recent literature.

  17. Disturbing effects of attitude control maneuvers on the orbital motion of the Helios spacecraft

    NASA Technical Reports Server (NTRS)

    Georgevic, R. M.

    1976-01-01

    The position of the spin axis of the Helios A spacecraft has been maintained and updated by a series of attitude control maneuvers, executed by means of a sequence of unbalanced jet forces which produce an additional disturbed motion of the spacecraft's center of mass. The character of this motion, including its magnitude and direction, was studied. For the practical purposes of orbit determination, a computer program is given which shows how the components of the disturbing acceleration in the spacecraft-fixed reference frame can be easily computed.

  18. A Global Model for Effective Use and Evaluation of e-Learning in Health

    PubMed Central

    Farrington, Conor; Brayne, Carol

    2013-01-01

    Healthcare systems worldwide face a wide range of challenges, including demographic change, rising drug and medical technology costs, and persistent and widening health inequalities both within and between countries. Simultaneously, issues such as professional silos, static medical curricula, and perceptions of “information overload” have made it difficult for medical training and continued professional development (CPD) to adapt to the changing needs of healthcare professionals in increasingly patient-centered, collaborative, and/or remote delivery contexts. In response to these challenges, increasing numbers of medical education and CPD programs have adopted e-learning approaches, which have been shown to provide flexible, low-cost, user-centered, and easily updated learning. The effectiveness of e-learning varies from context to context, however, and has also been shown to make considerable demands on users' motivation and “digital literacy” and on providing institutions. Consequently, there is a need to evaluate the effectiveness of e-learning in healthcare as part of ongoing quality improvement efforts. This article outlines the key issues for developing successful models for analyzing e-health learning. PMID:23472702

  19. The Genome Portal of the Department of Energy Joint Genome Institute

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nordberg, Henrik; Cantor, Michael; Dushekyo, Serge

    2014-03-14

    The JGI Genome Portal (http://genome.jgi.doe.gov) provides unified access to all JGI genomic databases and analytical tools. A user can search, download and explore multiple data sets available for all DOE JGI sequencing projects, including their status, assemblies and annotations of sequenced genomes. The Genome Portal has been significantly updated in the past 2 years, with a specific emphasis on efficient handling of the rapidly growing amount of diverse genomic data accumulated in JGI. A critical aspect of handling big data in genomics is the development of visualization and analysis tools that allow scientists to derive meaning from what are otherwise terabases of inert sequence. An interactive visualization tool developed in the group allows us to explore contigs resulting from a single metagenome assembly. Implemented with modern web technologies that take advantage of the power of the computer's graphics processing unit (GPU), the tool allows the user to easily navigate over 100,000 data points in multiple dimensions, among many biologically meaningful parameters of a dataset such as relative abundance, contig length, and G+C content.

  20. A global model for effective use and evaluation of e-learning in health.

    PubMed

    Ruggeri, Kai; Farrington, Conor; Brayne, Carol

    2013-04-01

    Healthcare systems worldwide face a wide range of challenges, including demographic change, rising drug and medical technology costs, and persistent and widening health inequalities both within and between countries. Simultaneously, issues such as professional silos, static medical curricula, and perceptions of "information overload" have made it difficult for medical training and continued professional development (CPD) to adapt to the changing needs of healthcare professionals in increasingly patient-centered, collaborative, and/or remote delivery contexts. In response to these challenges, increasing numbers of medical education and CPD programs have adopted e-learning approaches, which have been shown to provide flexible, low-cost, user-centered, and easily updated learning. The effectiveness of e-learning varies from context to context, however, and has also been shown to make considerable demands on users' motivation and "digital literacy" and on providing institutions. Consequently, there is a need to evaluate the effectiveness of e-learning in healthcare as part of ongoing quality improvement efforts. This article outlines the key issues for developing successful models for analyzing e-health learning.

  1. A user friendly database for use in ALARA job dose assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zodiates, A.M.; Willcock, A.

    1995-03-01

    The pressurized water reactor (PWR) design chosen for adoption by Nuclear Electric plc was based on the Westinghouse Standard Nuclear Unit Power Plant (SNUPPS). This design was developed to meet the United Kingdom requirements and these improvements are embodied in the Sizewell B plant which will start commercial operation in 1994. A user-friendly database was developed to assist the station in the dose and ALARP assessments of the work expected to be carried out during station operation and outage. The database stores the information in an easily accessible form and enables updating, editing, retrieval, and searches of the information. The database contains job-related information such as job locations, number of workers required, job times, and the expected plant doserates. It also contains the means to flag job requirements such as requirements for temporary shielding, flushing, scaffolding, etc. Typical uses of the database are envisaged to be in the prediction of occupational doses, the identification of high collective and individual dose jobs, use in ALARP assessments, setting of dose targets, monitoring of dose control performance, and others.

  2. OntologyWidget – a reusable, embeddable widget for easily locating ontology terms

    PubMed Central

    Beauheim, Catherine C; Wymore, Farrell; Nitzberg, Michael; Zachariah, Zachariah K; Jin, Heng; Skene, JH Pate; Ball, Catherine A; Sherlock, Gavin

    2007-01-01

    Background Biomedical ontologies are being widely used to annotate biological data in a computer-accessible, consistent and well-defined manner. However, due to their size and complexity, annotating data with appropriate terms from an ontology is often challenging for experts and non-experts alike, because there exist few tools that allow one to quickly find relevant ontology terms to easily populate a web form. Results We have produced a tool, OntologyWidget, which allows users to rapidly search for and browse ontology terms. OntologyWidget can easily be embedded in other web-based applications. OntologyWidget is written using AJAX (Asynchronous JavaScript and XML) and has two related elements. The first is a dynamic auto-complete ontology search feature. As a user enters characters into the search box, the appropriate ontology is queried remotely for terms that match the typed-in text, and the query results populate a drop-down list with all potential matches. Upon selection of a term from the list, the user can locate this term within a generic and dynamic ontology browser, which comprises the second element of the tool. The ontology browser shows the paths from a selected term to the root as well as parent/child tree hierarchies. We have implemented web services at the Stanford Microarray Database (SMD), which provide the OntologyWidget with access to over 40 ontologies from the Open Biological Ontology (OBO) website [1]. Each ontology is updated weekly. Adopters of the OntologyWidget can either use SMD's web services, or elect to rely on their own. Deploying the OntologyWidget can be accomplished in three simple steps: (1) install Apache Tomcat [2] on one's web server, (2) download and install the OntologyWidget servlet stub that provides access to the SMD ontology web services, and (3) create an html (HyperText Markup Language) file that refers to the OntologyWidget using a simple, well-defined format. 
Conclusion We have developed OntologyWidget, an easy-to-use ontology search and display tool that can be used on any web page by creating a simple html description. OntologyWidget provides a rapid auto-complete search function paired with an interactive tree display. We have developed a web service layer that communicates between the web page interface and a database of ontology terms. We currently store 40 of the ontologies from the OBO website [1], as well as several others. These ontologies are automatically updated on a weekly basis. OntologyWidget can be used in any web-based application to take advantage of the ontologies we provide via web services or any other ontology that is provided elsewhere in the correct format. The full source code for the JavaScript and description of the OntologyWidget is available from http://smd.stanford.edu/ontologyWidget/. PMID:17854506
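
    The server-side step behind such an auto-complete widget — matching typed-in text against ontology term names — can be sketched simply. The term list and the case-insensitive substring rule here are assumptions for illustration, not the SMD implementation.

```python
# Minimal sketch of an auto-complete term lookup: as the user types, the
# index is queried for matching ontology terms to fill a drop-down list.

class TermIndex:
    def __init__(self, terms):
        self.terms = sorted(terms)

    def suggest(self, typed: str, limit: int = 10) -> list[str]:
        """Return up to `limit` terms containing the typed-in text."""
        t = typed.lower()
        return [term for term in self.terms if t in term.lower()][:limit]

index = TermIndex(["mitochondrion", "mitosis", "cytoplasm", "cytoskeleton"])
```

    A production widget would issue this query remotely (e.g. via AJAX) on each keystroke rather than holding the terms in memory.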

  3. OntologyWidget - a reusable, embeddable widget for easily locating ontology terms.

    PubMed

    Beauheim, Catherine C; Wymore, Farrell; Nitzberg, Michael; Zachariah, Zachariah K; Jin, Heng; Skene, J H Pate; Ball, Catherine A; Sherlock, Gavin

    2007-09-13

    Biomedical ontologies are being widely used to annotate biological data in a computer-accessible, consistent and well-defined manner. However, due to their size and complexity, annotating data with appropriate terms from an ontology is often challenging for experts and non-experts alike, because there exist few tools that allow one to quickly find relevant ontology terms to easily populate a web form. We have produced a tool, OntologyWidget, which allows users to rapidly search for and browse ontology terms. OntologyWidget can easily be embedded in other web-based applications. OntologyWidget is written using AJAX (Asynchronous JavaScript and XML) and has two related elements. The first is a dynamic auto-complete ontology search feature. As a user enters characters into the search box, the appropriate ontology is queried remotely for terms that match the typed-in text, and the query results populate a drop-down list with all potential matches. Upon selection of a term from the list, the user can locate this term within a generic and dynamic ontology browser, which comprises the second element of the tool. The ontology browser shows the paths from a selected term to the root as well as parent/child tree hierarchies. We have implemented web services at the Stanford Microarray Database (SMD), which provide the OntologyWidget with access to over 40 ontologies from the Open Biological Ontology (OBO) website [1]. Each ontology is updated weekly. Adopters of the OntologyWidget can either use SMD's web services, or elect to rely on their own. Deploying the OntologyWidget can be accomplished in three simple steps: (1) install Apache Tomcat [2] on one's web server, (2) download and install the OntologyWidget servlet stub that provides access to the SMD ontology web services, and (3) create an html (HyperText Markup Language) file that refers to the OntologyWidget using a simple, well-defined format.
We have developed OntologyWidget, an easy-to-use ontology search and display tool that can be used on any web page by creating a simple html description. OntologyWidget provides a rapid auto-complete search function paired with an interactive tree display. We have developed a web service layer that communicates between the web page interface and a database of ontology terms. We currently store 40 of the ontologies from the OBO website [1], as well as several others. These ontologies are automatically updated on a weekly basis. OntologyWidget can be used in any web-based application to take advantage of the ontologies we provide via web services or any other ontology that is provided elsewhere in the correct format. The full source code for the JavaScript and description of the OntologyWidget is available from http://smd.stanford.edu/ontologyWidget/.

  4. Reporting Items for Updated Clinical Guidelines: Checklist for the Reporting of Updated Guidelines (CheckUp)

    PubMed Central

    Vernooij, Robin W. M.; Alonso-Coello, Pablo; Brouwers, Melissa

    2017-01-01

    Background Scientific knowledge is in constant development. Consequently, regular review to assure the trustworthiness of clinical guidelines is required. However, there is still a lack of preferred reporting items of the updating process in updated clinical guidelines. The present article describes the development process of the Checklist for the Reporting of Updated Guidelines (CheckUp). Methods and Findings We developed an initial list of items based on an overview of research evidence on clinical guideline updating, the Appraisal of Guidelines for Research and Evaluation (AGREE) II Instrument, and the advice of the CheckUp panel (n = 33 professionals). A multistep process was used to refine this list, including an assessment of ten existing updated clinical guidelines, interviews with key informants (response rate: 54.2%; 13/24), a three-round Delphi consensus survey with the CheckUp panel (33 participants), and an external review with clinical guideline methodologists (response rate: 90%; 53/59) and users (response rate: 55.6%; 10/18). CheckUp includes 16 items that address (1) the presentation of an updated guideline, (2) editorial independence, and (3) the methodology of the updating process. In this article, we present the methodology to develop CheckUp and include as a supplementary file an explanation and elaboration document. Conclusions CheckUp can be used to evaluate the completeness of reporting in updated guidelines and as a tool to inform guideline developers about reporting requirements. Editors may request its completion from guideline authors when submitting updated guidelines for publication. Adherence to CheckUp will likely enhance the comprehensiveness and transparency of clinical guideline updating for the benefit of patients and the public, health care professionals, and other relevant stakeholders. PMID:28072838

  5. Capital update factor: a new era approaches.

    PubMed

    Grimaldi, P L

    1993-02-01

    The Health Care Financing Administration (HCFA) has constructed a preliminary model of a new capital update method which is consistent with the framework being developed to refine the update method for PPS operating costs. HCFA's eventual goal is to develop a single update framework for operating and capital costs. Initial results suggest that adopting the new capital update method would reduce capital payments substantially, which might intensify creditor's concerns about extending loans to hospitals.

  6. Semantic Likelihood Models for Bayesian Inference in Human-Robot Interaction

    NASA Astrophysics Data System (ADS)

    Sweet, Nicholas

    Autonomous systems, particularly unmanned aerial systems (UAS), remain limited in autonomous capabilities largely due to a poor understanding of their environment. Current sensors simply do not match human perceptive capabilities, impeding progress towards full autonomy. Recent work has shown the value of humans as sources of information within a human-robot team; in target applications, communicating human-generated 'soft data' to autonomous systems enables higher levels of autonomy through large, efficient information gains. This requires development of a 'human sensor model' that allows soft data fusion through Bayesian inference to update the probabilistic belief representations maintained by autonomous systems. Current human sensor models that capture linguistic inputs as semantic information are limited in their ability to generalize likelihood functions for semantic statements: they may be learned from dense data; they do not exploit the contextual information embedded within groundings; and they often limit human input to restrictive and simplistic interfaces. This work provides mechanisms to synthesize human sensor models from constraints based on easily attainable a priori knowledge, develops compression techniques to capture information-dense semantics, and investigates the problem of capturing and fusing semantic information contained within unstructured natural language. A robotic experimental testbed is also developed to validate the above contributions.
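
    The core fusion step the abstract describes — updating a probabilistic belief with a semantic human report via Bayes' rule — can be sketched on a discrete grid. The grid, prior, and likelihood values below are illustrative assumptions, not the model developed in the work.

```python
# Minimal sketch of soft-data fusion: a human statement is encoded as a
# likelihood over belief states, and Bayes' rule yields the posterior.

def bayes_update(prior, likelihood):
    """Posterior ∝ likelihood × prior, renormalized over the grid."""
    post = [p * l for p, l in zip(prior, likelihood)]
    z = sum(post)
    return [p / z for p in post]

# Belief over 4 discrete cells; the report "the target is near the left
# side" is encoded (hypothetically) as a likelihood favoring cells 0 and 1.
prior = [0.25, 0.25, 0.25, 0.25]
likelihood = [0.6, 0.3, 0.08, 0.02]
posterior = bayes_update(prior, likelihood)
```

    A human sensor model, in this framing, is whatever maps the semantic statement to that likelihood vector.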

  7. A 3D visualization system for molecular structures

    NASA Technical Reports Server (NTRS)

    Green, Terry J.

    1989-01-01

    The properties of molecules derive in part from their structures. Because of the importance of understanding molecular structures, various methodologies, ranging from first principles to empirical techniques, were developed for computing the structure of molecules. For large molecules such as polymer model compounds, the structural information is difficult to comprehend by examining tabulated data. Therefore, a molecular graphics display system, called MOLDS, was developed to help interpret the data. MOLDS is a menu-driven program developed to run on the LADC SNS computer systems. This program can read a data file generated by the modeling programs, or data can be entered using the keyboard. MOLDS has the following capabilities: it draws a 3-D representation of a molecule using a stick, ball-and-stick, or space-filled model from Cartesian coordinates; draws different perspective views of the molecule; rotates the molecule about the X, Y, or Z axis or about an arbitrary line in space; zooms in on a small area of the molecule in order to obtain a better view of a specific region; and makes hard-copy representations of molecules on a graphics printer. In addition, MOLDS can be easily updated and readily adapted to run on most computer systems.
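
    One of the operations listed above, rotating a molecule's Cartesian coordinates about an axis, reduces to a standard rotation matrix. This is a generic sketch of a Z-axis rotation, not MOLDS code; the atom coordinates are invented.

```python
# Hedged sketch: rotate (x, y, z) atom positions about the Z axis.
import math

def rotate_z(points, angle_deg):
    """Apply a Z-axis rotation of angle_deg degrees to each (x, y, z) point."""
    a = math.radians(angle_deg)
    c, s = math.cos(a), math.sin(a)
    return [(c * x - s * y, s * x + c * y, z) for x, y, z in points]

atoms = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.5)]
rotated = rotate_z(atoms, 90)  # (1,0,0) maps onto the +Y axis
```

    Rotation about an arbitrary line in space composes a translation with an axis-angle rotation but follows the same pattern.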

  8. Learning to merge: a new tool for interactive mapping

    NASA Astrophysics Data System (ADS)

    Porter, Reid B.; Lundquist, Sheng; Ruggiero, Christy

    2013-05-01

    The task of turning raw imagery into semantically meaningful maps and overlays is a key area of remote sensing activity. Image analysts, in applications ranging from environmental monitoring to intelligence, use imagery to generate and update maps of terrain, vegetation, road networks, buildings and other relevant features. Often these tasks can be cast as a pixel labeling problem, and several interactive pixel labeling tools have been developed. These tools exploit training data, which is generated by analysts using simple and intuitive paint-program annotation tools, in order to tailor the labeling algorithm for the particular dataset and task. In other cases, the task is best cast as a pixel segmentation problem. Interactive pixel segmentation tools have also been developed, but these tools typically do not learn from training data like the pixel labeling tools do. In this paper we investigate tools for interactive pixel segmentation that also learn from user input. The input has the form of segment merging (or grouping). Merging examples are 1) easily obtained from analysts using vector annotation tools, and 2) more challenging to exploit than traditional labels. We outline the key issues in developing these interactive merging tools, and describe their application to remote sensing.

  9. A high resolution spatial population database of Somalia for disease risk mapping.

    PubMed

    Linard, Catherine; Alegana, Victor A; Noor, Abdisalan M; Snow, Robert W; Tatem, Andrew J

    2010-09-14

    Millions of Somali have been deprived of basic health services due to the unstable political situation of their country. Attempts are being made to reconstruct the health sector, in particular to estimate the extent of infectious disease burden. However, any approach that requires the use of modelled disease rates requires reasonable information on population distribution. In a low-income country such as Somalia, population data are lacking, are of poor quality, or become outdated rapidly. Modelling methods are therefore needed for the production of contemporary and spatially detailed population data. Here land cover information derived from satellite imagery and existing settlement point datasets were used for the spatial reallocation of populations within census units. We used simple and semi-automated methods that can be implemented with free image processing software to produce an easily updatable gridded population dataset at 100 × 100 meters spatial resolution. The 2010 population dataset was matched to administrative population totals projected by the UN. Comparison tests between the new dataset and existing population datasets revealed important differences in population size distributions, and in population at risk of malaria estimates. These differences are particularly important in more densely populated areas and strongly depend on the settlement data used in the modelling approach. The results show that it is possible to produce detailed, contemporary and easily updatable settlement and population distribution datasets of Somalia using existing data. The 2010 population dataset produced is freely available as a product of the AfriPop Project and can be downloaded from: http://www.afripop.org.
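
    The reallocation idea above — distributing a census unit's population total across grid cells according to land-cover-derived weights — can be sketched in a few lines. The weights and totals are invented for illustration; the actual AfriPop modelling is considerably more involved.

```python
# Simplified dasymetric reallocation: split a census unit's population
# across its grid cells in proportion to per-cell weights.

def reallocate(unit_total, cell_weights):
    """Return per-cell populations proportional to cell_weights."""
    w_sum = sum(cell_weights)
    return [unit_total * w / w_sum for w in cell_weights]

# A unit of 10,000 people over 4 cells; built-up land gets higher weight.
weights = [5.0, 3.0, 1.0, 1.0]   # e.g. urban, peri-urban, sparse, sparse
cells = reallocate(10_000, weights)
```

    The per-cell totals always sum back to the administrative total, which is what lets the gridded dataset be matched to UN projections.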

  10. A high resolution spatial population database of Somalia for disease risk mapping

    PubMed Central

    2010-01-01

    Background Millions of Somali have been deprived of basic health services due to the unstable political situation of their country. Attempts are being made to reconstruct the health sector, in particular to estimate the extent of infectious disease burden. However, any approach that requires the use of modelled disease rates requires reasonable information on population distribution. In a low-income country such as Somalia, population data are lacking, are of poor quality, or become outdated rapidly. Modelling methods are therefore needed for the production of contemporary and spatially detailed population data. Results Here land cover information derived from satellite imagery and existing settlement point datasets were used for the spatial reallocation of populations within census units. We used simple and semi-automated methods that can be implemented with free image processing software to produce an easily updatable gridded population dataset at 100 × 100 meters spatial resolution. The 2010 population dataset was matched to administrative population totals projected by the UN. Comparison tests between the new dataset and existing population datasets revealed important differences in population size distributions, and in population at risk of malaria estimates. These differences are particularly important in more densely populated areas and strongly depend on the settlement data used in the modelling approach. Conclusions The results show that it is possible to produce detailed, contemporary and easily updatable settlement and population distribution datasets of Somalia using existing data. The 2010 population dataset produced is freely available as a product of the AfriPop Project and can be downloaded from: http://www.afripop.org. PMID:20840751

  11. OReFiL: an online resource finder for life sciences.

    PubMed

    Yamamoto, Yasunori; Takagi, Toshihisa

    2007-08-06

    Many online resources for the life sciences have been developed and introduced in peer-reviewed papers recently, ranging from databases and web applications to data-analysis software. Some have been introduced in special journal issues or websites with a search function, but others remain scattered throughout the Internet and in the published literature. The searchable resources on these sites are collected and maintained manually and are therefore of higher quality than automatically updated sites, but also require more time and effort. We developed an online resource search system called OReFiL to address these issues. We developed a crawler to gather all of the web pages whose URLs appear in MEDLINE abstracts and full-text papers on the BioMed Central open-access journals. The URLs were extracted using regular expressions and rules based on our heuristic knowledge. We then indexed the online resources to facilitate their retrieval and comparison by researchers. Because every online resource has at least one PubMed ID, we can easily acquire its summary with Medical Subject Headings (MeSH) terms and confirm its credibility through reference to the corresponding PubMed entry. In addition, because OReFiL automatically extracts URLs and updates the index, minimal time and effort is needed to maintain the system. We developed OReFiL, a search system for online life science resources, which is freely available. The system's distinctive features include the ability to return up-to-date query-relevant online resources introduced in peer-reviewed papers; the ability to search using free words, MeSH terms, or author names; easy verification of each hit following links to the corresponding PubMed entry or to papers citing the URL through the search systems of BioMed Central, Scirus, HighWire Press, or Google Scholar; and quick confirmation of the existence of an online resource web page.
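
    The crawler's first step, extracting URLs from abstract text with regular expressions, can be sketched as follows. The pattern and the trailing-period cleanup are simple assumptions, not OReFiL's actual heuristic rules.

```python
# Illustrative URL extraction from free text with a regular expression.
import re

URL_RE = re.compile(r"https?://[^\s)\]>,]+")

def extract_urls(text: str) -> list[str]:
    """Return URLs found in free text, with sentence-final periods stripped."""
    return [u.rstrip(".") for u in URL_RE.findall(text)]

abstract = ("The tool is available at http://example.org/tool. "
            "See also https://example.com/docs/index.html for details.")
urls = extract_urls(abstract)
```

    Each extracted URL would then be paired with its PubMed ID so the resource's summary and MeSH terms can be retrieved.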

  12. OReFiL: an online resource finder for life sciences

    PubMed Central

    Yamamoto, Yasunori; Takagi, Toshihisa

    2007-01-01

    Background Many online resources for the life sciences have been developed and introduced in peer-reviewed papers recently, ranging from databases and web applications to data-analysis software. Some have been introduced in special journal issues or websites with a search function, but others remain scattered throughout the Internet and in the published literature. The searchable resources on these sites are collected and maintained manually and are therefore of higher quality than automatically updated sites, but also require more time and effort. Description We developed an online resource search system called OReFiL to address these issues. We developed a crawler to gather all of the web pages whose URLs appear in MEDLINE abstracts and full-text papers on the BioMed Central open-access journals. The URLs were extracted using regular expressions and rules based on our heuristic knowledge. We then indexed the online resources to facilitate their retrieval and comparison by researchers. Because every online resource has at least one PubMed ID, we can easily acquire its summary with Medical Subject Headings (MeSH) terms and confirm its credibility through reference to the corresponding PubMed entry. In addition, because OReFiL automatically extracts URLs and updates the index, minimal time and effort is needed to maintain the system. Conclusion We developed OReFiL, a search system for online life science resources, which is freely available. The system's distinctive features include the ability to return up-to-date query-relevant online resources introduced in peer-reviewed papers; the ability to search using free words, MeSH terms, or author names; easy verification of each hit following links to the corresponding PubMed entry or to papers citing the URL through the search systems of BioMed Central, Scirus, HighWire Press, or Google Scholar; and quick confirmation of the existence of an online resource web page. PMID:17683589

  13. Design of affordable and ruggedized biomedical devices using virtual instrumentation.

    PubMed

    Mathern, Ryan Michael; Schopman, Sarah; Kalchthaler, Kyle; Mehta, Khanjan; Butler, Peter

    2013-05-01

    This paper presents the designs of four low-cost and ruggedized biomedical devices, including a blood pressure monitor, thermometer, weighing scale and spirometer, designed for the East African context. The design constraints included a mass-production price point of $10, accuracy and precision comparable to commercial devices and ruggedness to function effectively in the harsh environment of East Africa. The blood pressure device, thermometer and weighing scale were field-tested in Kenya and each recorded data within 6% error of the measurements from commercial devices and withstood the adverse climate and rough handling. The spirometer functioned according to specifications, but a re-design is needed to improve operability and usability by patients. This article demonstrates the feasibility of designing and commercializing virtual instrumentation-based biomedical devices in resource-constrained environments through context-driven design. The next steps for the devices include designing them such that they can be more easily manufactured, use standardized materials, are easily calibrated in the field and have more user-friendly software programs that can be updated remotely.

  14. Simultaneous travel time tomography for updating both velocity and reflector geometry in triangular/tetrahedral cell model

    NASA Astrophysics Data System (ADS)

    Bai, Chao-ying; He, Lei-yu; Li, Xing-wang; Sun, Jia-yu

    2018-05-01

    To conduct forward modeling and simultaneous inversion in a complex geological model, including an irregular topography (or irregular reflector or velocity anomaly), in this paper we combine our previous multiphase arrival tracking method (referred to as the triangular shortest-path method, TSPM) in a triangular (2D) or tetrahedral (3D) cell model with a linearized inversion solver (a damped minimum norm and constrained least squares problem solved using the conjugate gradient method, referred to as DMNCLS-CG) to formulate a simultaneous travel time inversion method for updating both velocity and reflector geometry using multiphase arrival times. In the triangular/tetrahedral cells, we deduce the partial derivative of the velocity variation with respect to the depth change of the reflector. The numerical simulation results show that the computational accuracy can be tuned to a high precision in forward modeling, and that the irregular velocity anomaly and reflector geometry can be accurately captured in the simultaneous inversion, because triangular/tetrahedral cells can easily stitch the irregular topography or a subsurface interface.
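
    The damped least-squares step at the heart of such a linearized inversion can be written compactly: minimize ||Ax − b||² + λ||x||², which leads to the normal equations (AᵀA + λI)x = Aᵀb. The tiny matrices below are illustrative stand-ins for the real tomographic sensitivity kernels, and a direct solve replaces the conjugate-gradient iteration used at scale.

```python
# Sketch of the damped minimum-norm least-squares solution used in
# linearized travel time inversion (here solved directly, not by CG).
import numpy as np

def damped_lsq(A, b, lam):
    """Damped least-squares solution of A x ≈ b with damping factor lam."""
    n = A.shape[1]
    lhs = A.T @ A + lam * np.eye(n)   # normal-equation matrix
    rhs = A.T @ b
    return np.linalg.solve(lhs, rhs)

A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])  # toy sensitivity matrix
b = np.array([1.0, 2.0, 3.0])                       # toy travel time residuals
x = damped_lsq(A, b, lam=0.0)
```

    In the simultaneous scheme, x stacks both velocity perturbations and reflector-depth perturbations, with the partial derivatives mentioned above filling the corresponding columns of A.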

  15. Update on Smoking Cessation: E-Cigarettes, Emerging Tobacco Products Trends, and New Technology-Based Interventions.

    PubMed

    Das, Smita; Tonelli, Makenzie; Ziedonis, Douglas

    2016-05-01

    Tobacco use disorders (TUDs) continue to be overly represented in patients treated in mental health and addiction treatment settings. It is the most common substance use disorder (SUD) and the leading cause of health disparities and increased morbidity/mortality amongst individuals with a psychiatric disorder. There are seven Food and Drug Administration (FDA) approved medications and excellent evidence-based psychosocial treatment interventions to use in TUD treatment. In the past few years, access to and use of other tobacco or nicotine emerging products are on the rise, including the highly publicized electronic cigarette (e-cigarette). There has also been a proliferation of technology-based interventions to support standard TUD treatment, including mobile apps and web-based interventions. These tools are easily accessed 24/7 to support outpatient treatment. This update will review the emerging products and counter-measure intervention technologies, including how clinicians can integrate these tools and other community-based resources into their practice.

  16. Multiphase Interface Tracking with Fast Semi-Lagrangian Contouring.

    PubMed

    Li, Xiaosheng; He, Xiaowei; Liu, Xuehui; Zhang, Jian J; Liu, Baoquan; Wu, Enhua

    2016-08-01

    We propose a semi-Lagrangian method for multiphase interface tracking. In contrast to previous methods, our method maintains an explicit polygonal mesh, which is reconstructed from an unsigned distance function and an indicator function, to track the interface of an arbitrary number of phases. The surface mesh is reconstructed at each step using an efficient multiphase polygonization procedure with precomputed stencils, while the distance and indicator functions are updated with accurate semi-Lagrangian path tracing from the meshes of the last step. Furthermore, we provide an adaptive data structure, the multiphase distance tree, to accelerate the updating of both the distance function and the indicator function. In addition, the adaptive structure also enables us to contour the distance tree accurately with simple bisection techniques. The major advantage of our method is that it can easily handle topological changes without ambiguities and preserve both the sharp features and the volume well. We evaluate its efficiency, accuracy and robustness with several examples in the results section.

  17. Update on the management of Helicobacter pylori infection. Position paper from the Catalan Society of Digestology.

    PubMed

    Sánchez Delgado, Jordi; García-Iglesias, Pilar; Titó, Llúcia; Puig, Ignasi; Planella, Montse; Gené, Emili; Saló, Joan; Martínez-Cerezo, Francesc; Molina-Infante, Javier; Gisbert, Javier P; Calvet, Xavier

    2018-04-01

    More than 30 years after its discovery, Helicobacter pylori (H. pylori) infection remains the most common cause of gastric and duodenal diseases. H. pylori is the leading cause of chronic gastritis, peptic ulcer, gastric MALT lymphoma and gastric adenocarcinoma. Several consensuses have recently been published on the management of H. pylori infection. The general guidelines of the Spanish consensus, the Toronto Consensus and the Maastricht V Consensus of 2016 are similar but concrete recommendations can vary significantly. In addition, the recommendations of some of these consensuses are decidedly complex. This position paper from the Catalan Society of Digestology is an update of evidence-based recommendations on the management and treatment of H. pylori infection. The aim of this document is to review this information in order to make recommendations for routine clinical practice that are simple, specific and easily applied to our setting. Copyright © 2018 Elsevier España, S.L.U. All rights reserved.

  18. Simultaneous travel time tomography for updating both velocity and reflector geometry in triangular/tetrahedral cell model

    NASA Astrophysics Data System (ADS)

    Bai, Chao-ying; He, Lei-yu; Li, Xing-wang; Sun, Jia-yu

    2017-12-01

    To conduct forward modeling and simultaneous inversion in a complex geological model, including an irregular topography (or irregular reflector or velocity anomaly), in this paper we combine our previous multiphase arrival tracking method (referred to as the triangular shortest-path method, TSPM) in a triangular (2D) or tetrahedral (3D) cell model with a linearized inversion solver (a damped minimum norm and constrained least squares problem solved using the conjugate gradient method, referred to as DMNCLS-CG) to formulate a simultaneous travel time inversion method for updating both velocity and reflector geometry using multiphase arrival times. In the triangular/tetrahedral cells, we deduce the partial derivative of the velocity variation with respect to the depth change of the reflector. The numerical simulation results show that the computational accuracy can be tuned to a high precision in forward modeling, and that the irregular velocity anomaly and reflector geometry can be accurately captured in the simultaneous inversion, because triangular/tetrahedral cells can easily stitch the irregular topography or a subsurface interface.

  19. Data update in a land information network

    NASA Astrophysics Data System (ADS)

    Mullin, Robin C.

    1988-01-01

    The on-going update of data exchanged in a land information network is examined. In the past, major developments have been undertaken to enable the exchange of data between land information systems. A model of a land information network and the data update process have been developed. Based on these, a functional description of the database and software to perform data updating is presented. A prototype of the data update process was implemented using the ARC/INFO geographic information system. This was used to test four approaches to data updating, i.e., bulk, block, incremental, and alert updates. A bulk update is performed by replacing a complete file with an updated file. A block update requires that the data set be partitioned into blocks; when an update occurs, only the blocks which are affected need to be transferred. An incremental update approach records each feature which is added or deleted and transmits only the features needed to update the copy of the file. An alert is a marker indicating that an update has occurred. It can be placed in a file to warn a user working in an area containing markers that updated data are available. The four approaches have been tested using a cadastral data set.
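
    The incremental approach described above can be sketched as replaying an ordered log of added and deleted features against a local copy. The feature IDs and log format here are invented for illustration, not the ARC/INFO prototype.

```python
# Toy incremental update: the provider transmits only the features that were
# added or deleted, and the consumer replays them against its local copy.

def apply_incremental(local: dict, log: list) -> dict:
    """Replay an ordered log of ('add', id, feature) / ('delete', id) entries."""
    data = dict(local)   # leave the caller's copy untouched
    for entry in log:
        if entry[0] == "add":
            _, fid, feature = entry
            data[fid] = feature
        elif entry[0] == "delete":
            _, fid = entry
            data.pop(fid, None)
    return data

parcels = {"p1": "lot A", "p2": "lot B"}
log = [("delete", "p2"), ("add", "p3", "lot C")]
updated = apply_incremental(parcels, log)
```

    A bulk update, by contrast, would ship the whole `parcels` file; the incremental log wins whenever changes are sparse relative to the dataset.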

  20. Age correction in monitoring audiometry: method to update OSHA age-correction tables to include older workers

    PubMed Central

    Dobie, Robert A; Wojcik, Nancy C

    2015-01-01

    Objectives The US Occupational Safety and Health Administration (OSHA) Noise Standard provides the option for employers to apply age corrections to employee audiograms to consider the contribution of ageing when determining whether a standard threshold shift has occurred. Current OSHA age-correction tables are based on 40-year-old data, with small samples and an upper age limit of 60 years. By comparison, recent data (1999–2006) show that hearing thresholds in the US population have improved. Because hearing thresholds have improved, and because older people are increasingly represented in noisy occupations, the OSHA tables no longer represent the current US workforce. This paper presents 2 options for updating the age-correction tables and extending values to age 75 years using recent population-based hearing survey data from the US National Health and Nutrition Examination Survey (NHANES). Both options provide scientifically derived age-correction values that can be easily adopted by OSHA to expand their regulatory guidance to include older workers. Methods Regression analysis was used to derive new age-correction values using audiometric data from the 1999–2006 US NHANES. Using the NHANES median better-ear thresholds fit to simple polynomial equations, new age-correction values were generated for both men and women for ages 20–75 years. Results The new age-correction values are presented as 2 options. The preferred option is to replace the current OSHA tables with the values derived from the NHANES median better-ear thresholds for ages 20–75 years. The alternative option is to retain the current OSHA age-correction values up to age 60 years and use the NHANES-based values for ages 61–75 years. Conclusions Recent NHANES data offer a simple solution to the need for updated, population-based, age-correction tables for OSHA.
The options presented here provide scientifically valid and relevant age-correction values which can be easily adopted by OSHA to expand their regulatory guidance to include older workers. PMID:26169804
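
    The regression step described above (median better-ear thresholds fit to simple polynomial equations over ages 20–75) can be sketched with an ordinary polynomial fit. The sample thresholds below are hypothetical, made up only to make the sketch runnable; real values would come from the 1999–2006 NHANES audiometric data:

```python
import numpy as np

ages = np.arange(20, 76)
# Hypothetical median better-ear thresholds (dB HL) that rise with age;
# the real curve would be estimated from the NHANES survey data.
thresholds = 0.005 * (ages - 20) ** 2 + 2.0

coef = np.polyfit(ages, thresholds, deg=2)  # simple polynomial fit

def age_correction(age_now, age_baseline, coef):
    """Expected age-related threshold shift between two ages under the
    fitted polynomial (the OSHA-style age correction)."""
    return np.polyval(coef, age_now) - np.polyval(coef, age_baseline)
```

    An employer-style correction subtracts the expected shift between the baseline and current audiogram ages before judging whether a standard threshold shift has occurred.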

  1. Age correction in monitoring audiometry: method to update OSHA age-correction tables to include older workers.

    PubMed

    Dobie, Robert A; Wojcik, Nancy C

    2015-07-13

    The US Occupational Safety and Health Administration (OSHA) Noise Standard provides the option for employers to apply age corrections to employee audiograms to consider the contribution of ageing when determining whether a standard threshold shift has occurred. Current OSHA age-correction tables are based on 40-year-old data, with small samples and an upper age limit of 60 years. By comparison, recent data (1999-2006) show that hearing thresholds in the US population have improved. Because hearing thresholds have improved, and because older people are increasingly represented in noisy occupations, the OSHA tables no longer represent the current US workforce. This paper presents 2 options for updating the age-correction tables and extending values to age 75 years using recent population-based hearing survey data from the US National Health and Nutrition Examination Survey (NHANES). Both options provide scientifically derived age-correction values that can be easily adopted by OSHA to expand their regulatory guidance to include older workers. Regression analysis was used to derive new age-correction values using audiometric data from the 1999-2006 US NHANES. Using the NHANES median better-ear thresholds fit to simple polynomial equations, new age-correction values were generated for both men and women for ages 20-75 years. The new age-correction values are presented as 2 options. The preferred option is to replace the current OSHA tables with the values derived from the NHANES median better-ear thresholds for ages 20-75 years. The alternative option is to retain the current OSHA age-correction values up to age 60 years and use the NHANES-based values for ages 61-75 years. Recent NHANES data offer a simple solution to the need for updated, population-based, age-correction tables for OSHA.
The options presented here provide scientifically valid and relevant age-correction values which can be easily adopted by OSHA to expand their regulatory guidance to include older workers. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  2. Computational statistics using the Bayesian Inference Engine

    NASA Astrophysics Data System (ADS)

    Weinberg, Martin D.

    2013-09-01

    This paper introduces the Bayesian Inference Engine (BIE), a general parallel, optimized software package for parameter inference and model selection. This package is motivated by the analysis needs of modern astronomical surveys and the need to organize and reuse expensive derived data. The BIE is the first platform for computational statistics designed explicitly to enable Bayesian update and model comparison for astronomical problems. Bayesian update is based on the representation of high-dimensional posterior distributions using metric-ball-tree based kernel density estimation. Among its algorithmic offerings, the BIE emphasizes hybrid tempered Markov chain Monte Carlo schemes that robustly sample multimodal posterior distributions in high-dimensional parameter spaces. Moreover, the BIE implements a full persistence or serialization system that stores the full byte-level image of the running inference and previously characterized posterior distributions for later use. Two new algorithms to compute the marginal likelihood from the posterior distribution, developed for and implemented in the BIE, enable model comparison for complex models and data sets. Finally, the BIE was designed to be a collaborative platform for applying Bayesian methodology to astronomy. It includes an extensible, object-oriented framework that implements every aspect of the Bayesian inference. By providing a variety of statistical algorithms for all phases of the inference problem, a scientist may explore a variety of approaches with a single model and data implementation. Additional technical details and downloads are available from http://www.astro.umass.edu/bie. The BIE is distributed under the GNU General Public License.
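
    The tempered MCMC sampling emphasized above can be illustrated with a minimal parallel-tempering Metropolis sketch on a bimodal toy posterior. The target density, temperature ladder, and step size are illustrative assumptions, not the BIE's API:

```python
import numpy as np

rng = np.random.default_rng(0)

def log_post(x):
    # Bimodal toy posterior: equal mixture of two unit Gaussians at +/-3.
    return np.logaddexp(-0.5 * (x - 3.0) ** 2, -0.5 * (x + 3.0) ** 2)

temps = [1.0, 2.0, 4.0]          # temperature ladder; T=1 is the target
chains = [0.0 for _ in temps]

samples = []
for step in range(20000):
    # Metropolis update within each tempered chain.
    for i, T in enumerate(temps):
        prop = chains[i] + rng.normal(0.0, 1.0)
        if np.log(rng.random()) < (log_post(prop) - log_post(chains[i])) / T:
            chains[i] = prop
    # Propose swapping states between a random pair of adjacent temperatures.
    i = rng.integers(len(temps) - 1)
    log_a = (1 / temps[i] - 1 / temps[i + 1]) * (
        log_post(chains[i + 1]) - log_post(chains[i]))
    if np.log(rng.random()) < log_a:
        chains[i], chains[i + 1] = chains[i + 1], chains[i]
    samples.append(chains[0])    # keep only the T=1 chain
```

    The hotter chains see a flattened posterior and hop between modes freely; accepted swaps carry those crossings down to the T=1 chain, which is why tempering samples multimodal posteriors more robustly than a single chain.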

  3. Obs4MIPS: Satellite Observations for Model Evaluation

    NASA Astrophysics Data System (ADS)

    Ferraro, R.; Waliser, D. E.; Gleckler, P. J.

    2017-12-01

    This poster will review the current status of the obs4MIPs project, whose purpose is to provide a limited collection of well-established and documented datasets for comparison with Earth system models (https://www.earthsystemcog.org/projects/obs4mips/). These datasets have been reformatted to correspond with the CMIP5 model output requirements, and include technical documentation specifically targeted for their use in model output evaluation. The project holdings now exceed 120 datasets with observations that directly correspond to CMIP5 model output variables, with new additions in response to the CMIP6 experiments. With the growth in climate model output data volume, it is increasingly difficult to bring the model output and the observations together to do evaluations. The positioning of the obs4MIPs datasets within the Earth System Grid Federation (ESGF) allows for the use of currently available and planned online tools within the ESGF to perform analysis using model output and observational datasets without necessarily downloading everything to a local workstation. This past year, obs4MIPs has updated its submission guidelines to closely align with changes in the CMIP6 experiments, and is implementing additional indicators and ancillary data to allow users to more easily determine the efficacy of an obs4MIPs dataset for specific evaluation purposes. This poster will present the new guidelines and indicators, and update the list of current obs4MIPs holdings and their connection to the ESGF evaluation and analysis tools currently available, and being developed for the CMIP6 experiments.

  4. Physician access to drug profiles to reduce adverse reactions

    NASA Astrophysics Data System (ADS)

    Yasnoff, William A.; Tomkins, Edward L.; Dunn, Louise M.

    1995-10-01

    Adverse drug reactions (ADRs) are a major source of preventable morbidity and mortality, especially among the elderly, who use more drugs and are more sensitive to them. The insurance industry has recently addressed this problem through the implementation of drug interaction alerts to pharmacists in conjunction with immediate online claims adjudication for almost 60% of prescriptions (expected to reach 90% within 5 years). These alerts are based on stored patient drug profiles maintained by pharmacy benefit managers (PBMs) which are updated whenever prescriptions are filled. While these alerts are very helpful, the pharmacist does not prescribe, resulting in time-consuming and costly delays to contact the physician and remedy potential interactions. We have developed and demonstrated the feasibility of the PINPOINT (Pharmaceutical Information Network for prevention of interactions) system for making the drug profile and interaction information easily available to the physician before the prescription is written. We plan to test the cost-effectiveness of the system in a prospective controlled clinical trial.

  5. Certification of a hybrid parameter model of the fully flexible Shuttle Remote Manipulator System

    NASA Technical Reports Server (NTRS)

    Barhorst, Alan A.

    1995-01-01

    The development of high fidelity models of mechanical systems with flexible components is in flux. Many working models of these devices assume the elastic motion is small and can be superimposed on the overall rigid body motion. A drawback of this type of modeling technique is that the linear modal model of the device must be regenerated if the elastic motion departs sufficiently from the base rigid motion. An advantage of this type of modeling is that it uses NASTRAN modal data, which is the NASA standard means of modal information exchange. A disadvantage of the linear modeling is that it fails to accurately represent large motion of the system unless constant modal updates are performed. In this study, which is a continuation of a project started last year, the drawback of the currently used modal snapshot modeling technique is addressed in a rigorous fashion by novel and easily applied means.

  6. PDBe: towards reusable data delivery infrastructure at protein data bank in Europe.

    PubMed

    Mir, Saqib; Alhroub, Younes; Anyango, Stephen; Armstrong, David R; Berrisford, John M; Clark, Alice R; Conroy, Matthew J; Dana, Jose M; Deshpande, Mandar; Gupta, Deepti; Gutmanas, Aleksandras; Haslam, Pauline; Mak, Lora; Mukhopadhyay, Abhik; Nadzirin, Nurul; Paysan-Lafosse, Typhaine; Sehnal, David; Sen, Sanchayita; Smart, Oliver S; Varadi, Mihaly; Kleywegt, Gerard J; Velankar, Sameer

    2018-01-04

    The Protein Data Bank in Europe (PDBe, pdbe.org) is actively engaged in the deposition, annotation, remediation, enrichment and dissemination of macromolecular structure data. This paper describes new developments and improvements at PDBe addressing three challenging areas: data enrichment, data dissemination and functional reusability. New features of the PDBe Web site are discussed, including a context dependent menu providing links to raw experimental data and improved presentation of structures solved by hybrid methods. The paper also summarizes the features of the LiteMol suite, which is a set of services enabling fast and interactive 3D visualization of structures, with associated experimental maps, annotations and quality assessment information. We introduce a library of Web components which can be easily reused to port data and functionality available at PDBe to other services. We also introduce updates to the SIFTS resource which maps PDB data to other bioinformatics resources, and the PDBe REST API. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  7. Image-based tracking and sensor resource management for UAVs in an urban environment

    NASA Astrophysics Data System (ADS)

    Samant, Ashwin; Chang, K. C.

    2010-04-01

    Coordination and deployment of multiple unmanned air vehicles (UAVs) requires substantial human resources to carry out a successful mission. The complexity of such a surveillance mission is significantly increased in an urban environment, where targets can easily escape from a UAV's field of view (FOV) due to intervening buildings and line-of-sight obstructions. In the proposed methodology, we focus on the control and coordination of multiple UAVs with gimbaled video sensors onboard for tracking multiple targets in an urban environment. We developed optimal path planning algorithms with emphasis on dynamic target prioritization and persistent target updates. The command center is responsible for target prioritization and autonomous control of multiple UAVs, enabling a single operator to monitor and control a team of UAVs from a remote location. The results are obtained using extensive 3D simulations in Google Earth using Tangent plus Lyapunov vector field guidance for target tracking.

  8. MyWEST: my Web Extraction Software Tool for effective mining of annotations from web-based databanks.

    PubMed

    Masseroli, Marco; Stella, Andrea; Meani, Natalia; Alcalay, Myriam; Pinciroli, Francesco

    2004-12-12

    High-throughput technologies create the necessity to mine large amounts of gene annotations from diverse databanks, and to integrate the resulting data. Most databanks can be interrogated only via Web, for a single gene at a time, and query results are generally available only in the HTML format. Although some databanks provide batch retrieval of data via FTP, this requires expertise and resources for locally reimplementing the databank. We developed MyWEST, a tool aimed at researchers without extensive informatics skills or resources, which exploits user-defined templates to easily mine selected annotations from different Web-interfaced databanks, and aggregates and structures results in an automatically updated database. Using microarray results from a model system of retinoic acid-induced differentiation, MyWEST effectively gathered relevant annotations from various biomolecular databanks, highlighted significant biological characteristics and supported a global approach to the understanding of complex cellular mechanisms. MyWEST is freely available for non-profit use at http://www.medinfopoli.polimi.it/MyWEST/
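
    The template-driven mining MyWEST performs can be sketched as applying user-defined patterns to a databank's HTML result page. The template format, field names, and sample page below are assumptions for illustration, not MyWEST's actual syntax:

```python
import re

# A user-defined template: annotation name -> regex capturing its value
# from a databank's HTML result page (pattern and page are hypothetical).
template = {
    "symbol":   r"<td>Symbol</td>\s*<td>([^<]+)</td>",
    "function": r"<td>Function</td>\s*<td>([^<]+)</td>",
}

def mine(html, template):
    """Extract the annotations named in the template from one HTML page."""
    out = {}
    for field, pattern in template.items():
        m = re.search(pattern, html)
        out[field] = m.group(1).strip() if m else None
    return out

page = ("<tr><td>Symbol</td><td>RARB</td></tr>"
        "<tr><td>Function</td><td>receptor</td></tr>")
```

    Repeating such an extraction for each gene in a list, and storing the rows in a local database, gives the aggregated, periodically refreshable annotation store the abstract describes.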

  9. Computational Intelligence for Medical Imaging Simulations.

    PubMed

    Chang, Victor

    2017-11-25

    This paper describes how to simulate medical imaging with computational intelligence, exploring areas that cannot easily be reached by traditional methods, including gene and protein simulations related to cancer development and immunity. The paper presents simulations and virtual inspections of BIRC3, BIRC6, CCL4, KLKB1 and CYP2A6 with their outputs and explanations, as well as brain segment intensity due to dancing. Our proposed MapReduce framework with the fusion algorithm can simulate medical imaging. The concept is very similar to digital surface theories that simulate how biological units join to form larger units, until the entire biological subject is formed. The M-Fusion and M-Update functions of the fusion algorithm achieve good performance, processing and visualizing up to 40 GB of data within 600 s. We conclude that computational intelligence can provide effective and efficient healthcare research through simulation and visualization.

  10. Just a tweet away.

    PubMed

    Gamble, Kate Huvane

    2009-05-01

    Hospitals and health systems are utilizing Web 2.0 tools to improve staff communication, recruit for research, facilitate networking and build the hospital's brand. A number of hospitals are reporting that tools like YouTube (for Webcasts) can significantly increase traffic to the hospital's site. Mobile CIOs can stay in touch with IT staffs from the road by sending and receiving Twitter updates. Social media can break down hierarchical boundaries by making C-suite executives more easily accessible to others in the organization. Sites like LinkedIn and Plaxo can be a valuable tool for CIOs looking to fill positions or network with peers.

  11. Discovering Astronomy: An Astro 101 e-book

    NASA Astrophysics Data System (ADS)

    Shawl, Stephen J.; Byrd, Gene; Deustua, Susana E.; LoPresto, Michael C.

    2016-01-01

    Discovering Astronomy, now available in its 6th edition as an eText, has many advantages and features for your students. We have partnered with etextink.com and WebAssign.net to produce an affordable set of cost-saving options for your students. Also available is the Discovering Astronomy Activity Manual, which provides students with an active-learning experience. Our etext is device independent and thus accessible through any web browser. Americans with Disabilities Act compatibility provides access for all students. Hotlinks to outside sites provide further information for interested students. Lecture demonstration videos of important concepts, made specifically for this new edition, are embedded within the text as appropriate. Students can highlight text, take notes, and bookmark locations within the text. Important terms are linked to the glossary. Search capabilities allow students to easily find what they want. Instructors can interact with their students directly through the etext once the class roster has been provided. For example, instructors can embed assignments into their students' etext and add their own notes and updates, which are immediately visible to their students. Updates can be quickly made by us as new findings become available. For example, updates from New Horizons were added at the time of the closest approach to Pluto, and an update on the recent announcement of current water on Mars was added the day of the announcement. We will present results of our own experience with college and high school students' use of Discovering Astronomy in online courses. Details of the book, a sample chapter, and other information are available at discoveringastronomy.weebly.com.

  12. 10 CFR 72.70 - Safety analysis report updating.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... original FSAR or, as appropriate, the last update to the FSAR under this section. The update shall include... for an ISFSI or MRS shall update periodically, as provided in paragraphs (b) and (c) of this section... applicant commitments developed during the license approval and/or hearing process. (b) Each update shall...

  13. 10 CFR 72.70 - Safety analysis report updating.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... original FSAR or, as appropriate, the last update to the FSAR under this section. The update shall include... for an ISFSI or MRS shall update periodically, as provided in paragraphs (b) and (c) of this section... applicant commitments developed during the license approval and/or hearing process. (b) Each update shall...

  14. 10 CFR 72.70 - Safety analysis report updating.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... original FSAR or, as appropriate, the last update to the FSAR under this section. The update shall include... for an ISFSI or MRS shall update periodically, as provided in paragraphs (b) and (c) of this section... applicant commitments developed during the license approval and/or hearing process. (b) Each update shall...

  15. 10 CFR 72.70 - Safety analysis report updating.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... original FSAR or, as appropriate, the last update to the FSAR under this section. The update shall include... for an ISFSI or MRS shall update periodically, as provided in paragraphs (b) and (c) of this section... applicant commitments developed during the license approval and/or hearing process. (b) Each update shall...

  16. 24 CFR 50.36 - Updating of environmental reviews.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 1 2011-04-01 2011-04-01 false Updating of environmental reviews... Urban Development PROTECTION AND ENHANCEMENT OF ENVIRONMENTAL QUALITY Environmental Assessments and Related Reviews § 50.36 Updating of environmental reviews. The environmental review must be re-evaluated...

  17. 24 CFR 50.36 - Updating of environmental reviews.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 1 2010-04-01 2010-04-01 false Updating of environmental reviews... Urban Development PROTECTION AND ENHANCEMENT OF ENVIRONMENTAL QUALITY Environmental Assessments and Related Reviews § 50.36 Updating of environmental reviews. The environmental review must be re-evaluated...

  18. Developing research in partnership with Aboriginal communities - strategies for improving recruitment and retention.

    PubMed

    Rae, K; Weatherall, L; Hollebone, K; Apen, K; McLean, M; Blackwell, C; Eades, S; Boulton, J; Lumbers, E; Smith, R

    2013-01-01

    Australian Aboriginal communities in urban, rural and remote areas are continuing to suffer high rates of perinatal mortality and morbidity that will impact on the future health of the community. It has been well documented that Aboriginal women have extreme distrust of mainstream pregnancy-related health care and suggested that late entry into antenatal care is as high as 50% in the Aboriginal population. Although medical and midwifery staff have long discussed strategies to improve uptake of antenatal health care for Aboriginal women, researchers in many areas have found the recruitment of Aboriginal people into scientific studies almost impossible. This article seeks to share the strategies that have been developed over a period of time by the authors that have proved useful for recruitment and retention into research. It is anticipated that these strategies would also apply for health practitioners in maintaining their patients for clinical care management. Although each research location (regional, rural and remote) has had to spend time determining what approach is best for meeting the research outcomes, many of these suggestions become applicable to clinicians seeking to develop better connections with Aboriginal patients in their clinics. With the management of ongoing chronic health conditions for Aboriginal people a priority in 'Closing the Gap', a number of these suggestions could easily be implemented by clinicians. Remembering that each community has specific needs that must be addressed, priorities for assistance for that community will be easily identifiable after community consultation (eg transport, or ability to access medical testing). Opportunities for the use of new social media (eg Facebook) as communication tools for researchers and clinicians will have increasing applicability as further software updates are created. 
With open and trusting dialogues between researchers, clinicians and Aboriginal communities, we can go a long way towards understanding the needs of individual communities and working in partnerships to close the gap.

  19. Female sterilization: update on clinical efficacy, side effects and contraindications.

    PubMed

    Gizzo, Salvatore; Bertocco, Anna; Saccardi, Carlo; Di Gangi, Stefania; Litta, Pietro Salvatore; D'antona, Donato; Nardelli, Giovanni Battista

    2014-10-01

    The aim of this review is to compare studies concerning female sterilization in order to define the most suitable approach and device for each patient considering timing, safety, cost-effectiveness, failure rate, complication rate and patient satisfaction. A systematic literature search was conducted in electronic databases MEDLINE-EMBASE-Sciencedirect and Cochrane Library between 2000 and 2012. All original descriptions, case reports, retrospective and review articles on tubal sterilization methods have been considered. Outcome measures were effectiveness, tolerability, procedure complications and female satisfaction. The ideal female sterilization system should be a simple, safe, highly efficient, easily learned, inexpensive, one-time procedure without negative side-effects. Nowadays, the transcervical approach is associated with minimal postoperative pain, allowing short hospitalization and fast resumption of daily activities. Laparoscopic and laparotomic approaches are considered second choices; particularly in developing countries, transcervical hysteroscopic methods are expected to spread increasingly within gynaecological clinical practice. Safety issues, hospital stay, costs and surgeons' experience are important factors in decision-making of the method for female sterilization. Hysteroscopic devices should be preferred when possible. The counselling time remains a fundamental step in choice. The decision concerning method depends on the setting, the surgeon's experience, the country's economical development and the woman's preference.

  20. Micromechanics Analysis Code Post-Processing (MACPOST) User Guide. 1.0

    NASA Technical Reports Server (NTRS)

    Goldberg, Robert K.; Comiskey, Michele D.; Bednarcyk, Brett A.

    1999-01-01

    As advanced composite materials have gained wider usage, the need for analytical models and computer codes to predict the thermomechanical deformation response of these materials has increased significantly. Recently, a micromechanics technique called the generalized method of cells (GMC) has been developed, which has the capability to fulfill this goal. To provide a framework for GMC, the Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) has been developed. As MAC/GMC has been updated, significant improvements have been made to the post-processing capabilities of the code. Through the MACPOST program, which operates directly within the MSC/PATRAN graphical pre- and post-processing package, a direct link between the analysis capabilities of MAC/GMC and the post-processing capabilities of MSC/PATRAN has been established. MACPOST has simplified the production, printing, and exportation of results for unit cells analyzed by MAC/GMC. MACPOST allows different micro-level quantities to be plotted quickly and easily in contour plots. In addition, meaningful data for X-Y plots can be examined. MACPOST thus serves as an important analysis and visualization tool for the macro- and micro-level data generated by MAC/GMC. This report serves as the user's manual for the MACPOST program.

  1. A Probabilistic Approach to Model Update

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Reaves, Mercedes C.; Voracek, David F.

    2001-01-01

    Finite element models are often developed for load validation, structural certification, response predictions, and to study alternate design concepts. On rare occasions, models developed with a nominal set of parameters agree with experimental data without the need to update parameter values. Today, model updating is generally heuristic and often performed by a skilled analyst with in-depth understanding of the model assumptions. Parameter uncertainties play a key role in understanding the model update problem and therefore probabilistic analysis tools, developed for reliability and risk analysis, may be used to incorporate uncertainty in the analysis. In this work, probability analysis (PA) tools are used to aid the parameter update task using experimental data and some basic knowledge of potential error sources. Discussed here is the first application of PA tools to update parameters of a finite element model for a composite wing structure. Static deflection data at six locations are used to update five parameters. It is shown that while prediction of individual response values may not be matched identically, the system response is significantly improved with moderate changes in parameter values.
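
    The update step described above (five parameters adjusted against six static deflection measurements) can be sketched as a linearized least-squares correction. The sensitivity matrix and residuals below are hypothetical; in practice the sensitivities would come from perturbing the finite element model:

```python
import numpy as np

# Hypothetical sensitivities of the six deflection measurements to the
# five model parameters (e.g., from finite differences on the FE model).
S = np.array([[1.0, 0.2, 0.0, 0.1, 0.0],
              [0.3, 1.1, 0.2, 0.0, 0.1],
              [0.0, 0.4, 0.9, 0.3, 0.0],
              [0.2, 0.0, 0.5, 1.2, 0.2],
              [0.1, 0.1, 0.0, 0.4, 1.0],
              [0.0, 0.3, 0.1, 0.0, 0.6]])

# Hypothetical measured-minus-predicted static deflections.
residual = np.array([0.12, -0.05, 0.08, 0.02, -0.10, 0.04])

# Moderate parameter change that minimizes the deflection mismatch.
dp, *_ = np.linalg.lstsq(S, residual, rcond=None)
```

    A probabilistic treatment, as in the paper, would additionally weight this solve by the parameter and measurement uncertainties rather than treating all entries equally.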

  2. 2017 State of Wind Development in the United States by Region

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oteri, Frank A; Baranowski, Ruth E; Baring-Gould, Edward I

    This document summarizes the status and drivers for U.S. wind energy development during 2017. Regional Resource Center (RRC) leaders provided a report of wind energy development in their regions, which was combined with findings from National Renewable Energy Laboratory (NREL) researchers to provide an account of the state of the regions, as well as updates on developments in individual states. NREL researchers and state partners added updates for all states that are not directly supported by an RRC. Accounts for each region include updates on renewable portfolio standards, workforce development, manufacturing and economic development, and individual state updates for installed wind capacity, ongoing policy developments, planned projects and their status, transmission progress reports, etc. This report also highlights the efforts of the RRCs to engage stakeholders in their individual regions.

  3. Improving the representation of peptide-like inhibitor and antibiotic molecules in the Protein Data Bank

    PubMed Central

    Dutta, Shuchismita; Dimitropoulos, Dimitris; Feng, Zukang; Persikova, Irina; Sen, Sanchayita; Shao, Chenghua; Westbrook, John; Young, Jasmine; Zhuravleva, Marina A; Kleywegt, Gerard J; Berman, Helen M

    2014-01-01

    With the accumulation of a large number and variety of molecules in the Protein Data Bank (PDB) comes the need on occasion to review and improve their representation. The Worldwide PDB (wwPDB) partners have periodically updated various aspects of structural data representation to improve the integrity and consistency of the archive. The remediation effort described here was focused on improving the representation of peptide-like inhibitor and antibiotic molecules so that they can be easily identified and analyzed. Peptide-like inhibitors or antibiotics were identified in over 1000 PDB entries, systematically reviewed and represented either as peptides with polymer sequence or as single components. For the majority of the single-component molecules, their peptide-like composition was captured in a new representation, called the subcomponent sequence. A novel concept called “group” was developed for representing complex peptide-like antibiotics and inhibitors that are composed of multiple polymer and nonpolymer components. In addition, a reference dictionary was developed with detailed information about these peptide-like molecules to aid in their annotation, identification and analysis. Based on the experience gained in this remediation, guidelines, procedures, and tools were developed to annotate new depositions containing peptide-like inhibitors and antibiotics accurately and consistently. © 2013 Wiley Periodicals, Inc. Biopolymers 101: 659–668, 2014. PMID:24173824

  4. Updates to the Demographic and Spatial Allocation Models to ...

    EPA Pesticide Factsheets

    EPA announced the availability of the final report, Updates to the Demographic and Spatial Allocation Models to Produce Integrated Climate and Land Use Scenarios (ICLUS) (Version 2). This update furthered land change modeling by providing nationwide housing development scenarios up to 2100. This newest version includes updated population and land use data sets and addresses limitations identified in ICLUS v1 in both the migration and spatial allocation models. The companion user guide (Final Report) describes the development of ICLUS v2 and the updates that were made to the original data sets and the demographic and spatial allocation models. The GIS tool enables users to run SERGoM with the population projections developed for the ICLUS project and allows users to modify the spatial allocation of housing density across the landscape.

  5. [Cardiovascular disease prevention in adults with type 2 diabetes mellitus according to the recent statement from the American Heart Association/American Diabetes Association].

    PubMed

    Avogaro, Angelo

    2016-03-01

    There is a clear epidemiologic association between glycemic control and cardiovascular disease. There is strong evidence of a microvascular benefit from lowering glycated hemoglobin <7%, while acknowledging the lack of proven macrovascular benefits. It is therefore relevant, in all diabetic patients, to control all major cardiovascular risk factors such as obesity, hypertension, and dyslipidemia. These risk factors, which are easily measurable, account for 90% of the risk of acute myocardial infarction. In this review, the update on prevention of cardiovascular disease in adults with type 2 diabetes mellitus from the American Heart Association and the American Diabetes Association is discussed and commented on.

  6. GSE data management system programmers/users' manual

    NASA Technical Reports Server (NTRS)

    Schlagheck, R. A.; Dolerhie, B. D., Jr.; Ghiglieri, F. J.

    1974-01-01

    The GSE data management system is a computerized program that provides a central storage source for key data associated with the mechanical ground support equipment (MGSE). Eight major sort modes can be requested by the user. Attributes that are printed automatically with each sort include the GSE end item number, description, class code, functional code, fluid media, use location, design responsibility, weight, cost, quantity, dimensions, and applicable documents. Multiple subsorts are available for the class code, functional code, fluid media, use location, design responsibility, and applicable document categories. These sorts and how to use them are described. The program and GSE data bank may be easily updated and expanded.
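    The major-sort/subsort behavior described above amounts to a multi-key sort over a table of equipment records. A minimal sketch follows; the field names and values are illustrative stand-ins, not the actual MGSE schema.

    ```python
    # Hypothetical GSE records (field names invented for illustration).
    records = [
        {"item_no": "G-103", "class_code": "B", "functional_code": 2, "weight": 120},
        {"item_no": "G-101", "class_code": "A", "functional_code": 3, "weight": 450},
        {"item_no": "G-102", "class_code": "A", "functional_code": 1, "weight": 80},
    ]

    def sort_records(records, *keys):
        """Return records ordered by the given attribute names (major key first)."""
        return sorted(records, key=lambda r: tuple(r[k] for k in keys))

    # Major sort on class code with a subsort on functional code:
    by_class = sort_records(records, "class_code", "functional_code")
    print([r["item_no"] for r in by_class])  # ['G-102', 'G-101', 'G-103']
    ```

    Because the records are plain dictionaries, adding a new attribute or a new sort mode requires no change to the sorting routine, which mirrors the report's point that the data bank is easily updated and expanded.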

  7. Utilizing online resources for taxonomy: a cybercatalog of Afrotropical apiocerid flies (Insecta: Diptera: Apioceridae).

    PubMed

    Dikow, Torsten; Agosti, Donat

    2015-01-01

    A cybercatalog to the Apioceridae (apiocerid flies) of the Afrotropical Region is provided. Each taxon entry includes links to open-access, online repositories such as ZooBank, BHL/BioStor/BLR, Plazi, GBIF, Morphbank, EoL, and a research web-site to access taxonomic information, digitized literature, morphological descriptions, specimen occurrence data, and images. Cybercatalogs such as the one presented here, which take advantage of the growing number of online repositories and of linked data and which can be easily updated, will need to become the future of taxonomic catalogs. Comments on the deposition of the holotype of Apiocera braunsi Melander, 1907 are made.

  8. Operational Support for Instrument Stability through ODI-PPA Metadata Visualization and Analysis

    NASA Astrophysics Data System (ADS)

    Young, M. D.; Hayashi, S.; Gopu, A.; Kotulla, R.; Harbeck, D.; Liu, W.

    2015-09-01

    Over long time scales, quality assurance metrics taken from calibration and calibrated data products can aid observatory operations in quantifying the performance and stability of the instrument, and can identify potential areas of concern or guide troubleshooting and engineering efforts. Such methods traditionally require manually written SQL queries, assuming the requisite metadata has even been ingested into a database. With the ODI-PPA system, QA metadata has been harvested and indexed for all data products produced over the life of the instrument. In this paper we will describe how, utilizing the industry-standard Highcharts JavaScript charting package with a customized AngularJS-driven user interface, we have made the process of visualizing the long-term behavior of these QA metadata simple and easily replicated. Operators can easily craft a custom query using the powerful and flexible ODI-PPA search interface and visualize the associated metadata in a variety of ways. These customized visualizations can be bookmarked, shared, or embedded externally, and will be dynamically updated as new data products enter the system, enabling operators to monitor the long-term health of their instrument with ease.

  9. Update on Vaccine-Derived Polioviruses - Worldwide, January 2015-May 2016.

    PubMed

    Jorba, Jaume; Diop, Ousmane M; Iber, Jane; Sutter, Roland W; Wassilak, Steven G; Burns, Cara C

    2016-08-05

    In 1988, the World Health Assembly resolved to eradicate poliomyelitis worldwide (1). One of the main tools used in polio eradication efforts has been the live, attenuated, oral poliovirus vaccine (OPV) (2), an inexpensive vaccine easily administered by trained volunteers. OPV might require several doses to induce immunity, but provides long-term protection against paralytic disease. Through effective use of OPV, the Global Polio Eradication Initiative (GPEI) has brought wild polioviruses to the threshold of eradication (1). However, OPV use, particularly in areas with low routine vaccination coverage, is associated with the emergence of genetically divergent vaccine-derived polioviruses (VDPVs) whose genetic drift from the parental OPV strains indicates prolonged replication or circulation (3). VDPVs can emerge among immunologically normal vaccine recipients and their contacts as well as among persons with primary immunodeficiencies (PIDs). Immunodeficiency-associated VDPVs (iVDPVs) can replicate for years in some persons with PIDs. In addition, circulating vaccine-derived polioviruses (cVDPVs) (3) can emerge in areas with low OPV coverage and can cause outbreaks of paralytic polio. This report updates previous summaries regarding VDPVs (4).

  10. Addressing the Big-Earth-Data Variety Challenge with the Hierarchical Triangular Mesh

    NASA Technical Reports Server (NTRS)

    Rilee, Michael L.; Kuo, Kwo-Sen; Clune, Thomas; Oloso, Amidu; Brown, Paul G.; Yu, Honfeng

    2016-01-01

    We have implemented an updated Hierarchical Triangular Mesh (HTM) as the basis for a unified data model and an indexing scheme for geoscience data, to address the variety challenge of Big Earth Data. We observe that, in the absence of variety, the volume challenge of Big Data is relatively easily addressed with parallel processing. The more important challenge in achieving optimal value with a Big Data solution for Earth Science (ES) data analysis, however, is being able to achieve good scalability with variety. With HTM unifying at least the three popular data models used by current ES data products, i.e., Grid, Swath, and Point, data preparation time for integrative analysis of diverse datasets can be drastically reduced and better variety scaling can be achieved. In addition, since HTM is also an indexing scheme, when it is used to index all ES datasets, data placement alignment (or co-location) on the shared-nothing architecture, on which most Big Data systems are based, is guaranteed and better performance is ensured. Moreover, our updated HTM encoding turns most geospatial set operations into integer interval operations, gaining further performance advantages.
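    The claim that set operations reduce to integer interval operations can be illustrated with a toy sketch: if each spatial region is encoded as a sorted list of disjoint index intervals, intersection becomes a linear merge of the two lists. The encoding below is invented for illustration and is not the actual HTM scheme.

    ```python
    def intersect_intervals(a, b):
        """Intersect two sorted lists of disjoint, half-open integer intervals."""
        out, i, j = [], 0, 0
        while i < len(a) and j < len(b):
            lo = max(a[i][0], b[j][0])
            hi = min(a[i][1], b[j][1])
            if lo < hi:
                out.append((lo, hi))
            # Advance whichever interval ends first.
            if a[i][1] < b[j][1]:
                i += 1
            else:
                j += 1
        return out

    # Two "regions" represented as index intervals:
    region_a = [(0, 10), (20, 30)]
    region_b = [(5, 25)]
    print(intersect_intervals(region_a, region_b))  # [(5, 10), (20, 25)]
    ```

    Because both inputs are sorted, the merge runs in time linear in the number of intervals, which is the performance advantage the abstract alludes to: no geometric computation is needed once regions are encoded as index ranges.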

  11. Event Boundaries in Perception Affect Memory Encoding and Updating

    PubMed Central

    Swallow, Khena M.; Zacks, Jeffrey M.; Abrams, Richard A.

    2010-01-01

    Memory for naturalistic events over short delays is important for visual scene processing, reading comprehension, and social interaction. The research presented here examined relations between how an ongoing activity is perceptually segmented into events and how those events are remembered a few seconds later. In several studies participants watched movie clips that presented objects in the context of goal-directed activities. Five seconds after an object was presented, the clip paused for a recognition test. Performance on the recognition test depended on the occurrence of perceptual event boundaries. Objects that were present when an event boundary occurred were better recognized than other objects, suggesting that event boundaries structure the contents of memory. This effect was strongest when an object’s type was tested, but was also observed for objects’ perceptual features. Memory also depended on whether an event boundary occurred between presentation and test; this variable produced complex interactive effects that suggested that the contents of memory are updated at event boundaries. These data indicate that perceptual event boundaries have immediate consequences for what, when, and how easily information can be remembered. PMID:19397382

  12. Diverse strategy-learning styles promote cooperation in evolutionary spatial prisoner's dilemma game

    NASA Astrophysics Data System (ADS)

    Liu, Run-Ran; Jia, Chun-Xiao; Rong, Zhihai

    2015-11-01

    Observational learning and practice learning are two important learning styles that play important roles in our information acquisition. In this paper, we study a spatial evolutionary prisoner's dilemma game in which players can choose either the observational learning rule or the practice learning rule when updating their strategies. In the proposed model, a parameter p controls the probability that a player chooses the observational learning rule. We find that there exists an optimal value of p leading to the highest cooperation level, which indicates that cooperation is promoted by the two learning rules acting together, and that a single learning rule alone does not favor the promotion of cooperation. By analyzing the dynamical behavior of the system, we find that the observational learning rule helps players residing in cooperative clusters to more easily recognize the bad outcome of mutual defection. However, a too-high observational learning probability prevents players from forming compact cooperative clusters. Our results highlight the importance of the strategy-updating rule, and of the observational learning rule in particular, for the evolution of cooperation.
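    The mixing of two update rules via a probability p can be sketched as follows. This is a minimal toy simulation on a ring lattice, not the paper's exact model: the payoff values, the aspiration threshold, and the "practice learning" stand-in (switch strategy when payoff falls below an aspiration level) are all assumptions made for illustration.

    ```python
    import random

    def play_round(strats, neighbors, T=1.3, R=1.0, P=0.1, S=0.0):
        """Accumulate prisoner's-dilemma payoffs; True means cooperate."""
        pay = [0.0] * len(strats)
        for i, nbrs in enumerate(neighbors):
            for j in nbrs:
                if strats[i] and strats[j]:
                    pay[i] += R       # mutual cooperation
                elif strats[i]:
                    pay[i] += S       # exploited cooperator
                elif strats[j]:
                    pay[i] += T       # successful defector
                else:
                    pay[i] += P       # mutual defection
        return pay

    def update(strats, pay, neighbors, p):
        """With probability p imitate the best-paid neighbor (observational
        learning); otherwise switch strategy if payoff fell below an
        aspiration level (a crude stand-in for practice learning)."""
        new = list(strats)
        for i, nbrs in enumerate(neighbors):
            if random.random() < p:
                best = max(nbrs, key=lambda j: pay[j])
                if pay[best] > pay[i]:
                    new[i] = strats[best]
            elif pay[i] < len(nbrs) * 0.5:
                new[i] = not strats[i]
        return new

    random.seed(1)
    n = 20
    neighbors = [((i - 1) % n, (i + 1) % n) for i in range(n)]  # ring lattice
    strats = [random.random() < 0.5 for _ in range(n)]
    for _ in range(50):
        pay = play_round(strats, neighbors)
        strats = update(strats, pay, neighbors, p=0.5)
    print(sum(strats), "cooperators of", n)
    ```

    Sweeping p from 0 to 1 in such a simulation is how one would look for the optimal mixing level the abstract reports.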

  13. Two Cultures in Modern Science and Technology: For Safety and Validity Does Medicine Have to Update?

    PubMed

    Becker, Robert E

    2016-01-11

    Two different scientific cultures go unreconciled in modern medicine. Each culture accepts that scientific knowledge and technologies are vulnerable to and easily invalidated by methods and conditions of acquisition, interpretation, and application. How these vulnerabilities are addressed separates the 2 cultures and potentially explains medicine's difficulties eradicating errors. A traditional culture, dominant in medicine, leaves error control in the hands of individual and group investigators and practitioners. A competing modern scientific culture accepts errors as inevitable, pernicious, and pervasive sources of adverse events throughout medical research and patient care too malignant for individuals or groups to control. Error risks to the validity of scientific knowledge and safety in patient care require systemwide programming able to support a culture in medicine grounded in tested, continually updated, widely promulgated, and uniformly implemented standards of practice for research and patient care. Experiences from successes in other sciences and industries strongly support the need for leadership from the Institute of Medicine's recommended Center for Patient Safety within the Federal Executive branch of government.

  14. Personal computer security: part 1. Firewalls, antivirus software, and Internet security suites.

    PubMed

    Caruso, Ronald D

    2003-01-01

    Personal computer (PC) security in the era of the Health Insurance Portability and Accountability Act of 1996 (HIPAA) involves two interrelated elements: safeguarding the basic computer system itself and protecting the information it contains and transmits, including personal files. HIPAA regulations have toughened the requirements for securing patient information, requiring every radiologist with such data to take further precautions. Security starts with physically securing the computer. Account passwords and a password-protected screen saver should also be set up. A modern antivirus program can easily be installed and configured. File scanning and updating of virus definitions are simple processes that can largely be automated and should be performed at least weekly. A software firewall is also essential for protection from outside intrusion, and an inexpensive hardware firewall can provide yet another layer of protection. An Internet security suite yields additional safety. Regular updating of the security features of installed programs is important. Obtaining a moderate degree of PC safety and security is somewhat inconvenient but is necessary and well worth the effort. Copyright RSNA, 2003

  15. Potential of SENTINEL-1A for Nation-Wide Routine Updates of Active Landslide Maps

    NASA Astrophysics Data System (ADS)

    Lazecky, M.; Canaslan Comut, F.; Nikolaeva, E.; Bakon, M.; Papco, J.; Ruiz-Armenteros, A. M.; Qin, Y.; de Sousa, J. J. M.; Ondrejka, P.

    2016-06-01

    Slope deformation is a typical geohazard that causes extensive economic damage in mountainous regions. As such, unstable slopes are usually intensively monitored, commonly by national geological or emergency services. The resulting landslide susceptibility maps, or landslide inventories, offer an overview of areas affected by previously activated landslides as well as of slopes currently known to be unstable. Existing slope instabilities can easily transform into landslides under various triggering factors, such as intensive rainfall or a melting snow cover. In these inventories, the majority of the existing landslide-affected slopes are marked as either stable or active, after continuous investigative work by experts in geology. In this paper we demonstrate the applicability of Sentinel-1A satellite SAR interferometry (InSAR) to assist in identifying slope movement activity and in using this information to update national landslide inventories. This can be done reliably in semi-arid regions or on sparsely vegetated slopes. We perform several analyses based on multitemporal InSAR techniques applied to Sentinel-1A data over selected areas prone to landslides.

  16. Updating flood maps efficiently using existing hydraulic models, very-high-accuracy elevation data, and a geographic information system; a pilot study on the Nisqually River, Washington

    USGS Publications Warehouse

    Jones, Joseph L.; Haluska, Tana L.; Kresch, David L.

    2001-01-01

    A method of updating flood inundation maps at a fraction of the expense of using traditional methods was piloted in Washington State as part of the U.S. Geological Survey Urban Geologic and Hydrologic Hazards Initiative. Large savings in expense may be achieved by building upon previous Flood Insurance Studies and automating the process of flood delineation with a Geographic Information System (GIS); increases in accuracy and detail result from the use of very-high-accuracy elevation data and automated delineation; and the resulting digital data sets contain valuable ancillary information such as flood depth, as well as greatly facilitating map storage and utility. The method consists of creating stage-discharge relations from the archived output of the existing hydraulic model, using these relations to create updated flood stages for recalculated flood discharges, and using a GIS to automate the map generation process. Many of the effective flood maps were created in the late 1970s and early 1980s, and suffer from a number of well-recognized deficiencies such as out-of-date or inaccurate estimates of discharges for selected recurrence intervals, changes in basin characteristics, and relatively low quality elevation data used for flood delineation. FEMA estimates that 45 percent of effective maps are over 10 years old (FEMA, 1997). Consequently, Congress has mandated the updating and periodic review of existing maps, which have cost the Nation almost 3 billion (1997) dollars. The need to update maps and the cost of doing so were the primary motivations for piloting a more cost-effective and efficient updating method. New technologies such as Geographic Information Systems and LIDAR (Light Detection and Ranging) elevation mapping are key to improving the efficiency of flood map updating, but they also improve the accuracy, detail, and usefulness of the resulting digital flood maps.
GISs produce digital maps without manual estimation of inundated areas between cross sections, and can generate working maps across a broad range of scales, for any selected area, and overlaid with easily updated cultural features. Local governments are aggressively collecting very-high-accuracy elevation data for numerous reasons; this not only lowers the cost and increases accuracy of flood maps, but also inherently boosts the level of community involvement in the mapping process. These elevation data are also ideal for hydraulic modeling, should an existing model be judged inadequate.
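    The core reuse step described above, creating stage-discharge relations from archived model output and applying them to recalculated discharges, amounts to interpolation at each cross section. A minimal sketch follows; the discharges and stages are made-up numbers for one hypothetical cross section, not data from the Nisqually River study.

    ```python
    # Hypothetical archived hydraulic-model output at one cross section:
    # modeled discharges (cfs) and the water-surface stages (ft) they produced.
    discharges = [5000.0, 10000.0, 20000.0, 40000.0]
    stages = [410.2, 412.8, 416.1, 420.5]

    def stage_for(q, qs=discharges, ss=stages):
        """Linearly interpolate the stage for a recalculated discharge q."""
        if not qs[0] <= q <= qs[-1]:
            raise ValueError("discharge outside the modeled range")
        for (q0, s0), (q1, s1) in zip(zip(qs, ss), zip(qs[1:], ss[1:])):
            if q <= q1:
                return s0 + (s1 - s0) * (q - q0) / (q1 - q0)

    # An updated 100-year discharge of 15,000 cfs falls between archived runs:
    print(round(stage_for(15000.0), 2))  # 414.45
    ```

    In the workflow the paper describes, the interpolated stage at each cross section would then be intersected with the high-accuracy elevation surface in the GIS to delineate the updated inundation boundary automatically.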

  17. Development of the Updating Executive Function: From 7-Year-Olds to Young Adults

    ERIC Educational Resources Information Center

    Carriedo, Nuria; Corral, Antonio; Montoro, Pedro R.; Herrero, Laura; Rucián, Mercedes

    2016-01-01

    Updating information in working memory (WM) is a critical executive function responsible both for continuously replacing outdated information with new relevant data and to suppress or inhibit content that is no longer relevant according to task demands. The goal of the present research is twofold: First, we aimed to study updating development in…

  18. Formulation of consumables management models: Consumables flight planning worksheet update. [space shuttles

    NASA Technical Reports Server (NTRS)

    Newman, C. M.

    1977-01-01

    The updated consumables flight planning worksheet (CFPWS) is documented. The update includes: (1) additional consumables: ECLSS ammonia, APU propellant, HYD water; (2) additional on orbit activity for development flight instrumentation (DFI); (3) updated use factors for all consumables; and (4) sources and derivations of the use factors.

  19. Reformulating Non-Monotonic Theories for Inference and Updating

    NASA Technical Reports Server (NTRS)

    Grosof, Benjamin N.

    1992-01-01

    We aim to help build programs that do large-scale, expressive non-monotonic reasoning (NMR): especially, 'learning agents' that store, and revise, a body of conclusions while continually acquiring new, possibly defeasible, premise beliefs. Currently available procedures for forward inference and belief revision are exhaustive, and thus impractical: they compute the entire non-monotonic theory, then re-compute from scratch upon updating with new axioms. These methods are thus badly intractable. In most theories of interest, even backward reasoning is combinatoric (at least NP-hard). Here, we give theoretical results for prioritized circumscription that show how to reformulate default theories so as to make forward inference be selective, as well as concurrent; and to restrict belief revision to a part of the theory. We elaborate a detailed divide-and-conquer strategy. We develop concepts of structure in NM theories, by showing how to reformulate them in a particular fashion: to be conjunctively decomposed into a collection of smaller 'part' theories. We identify two well-behaved special cases that are easily recognized in terms of syntactic properties: disjoint appearances of predicates, and disjoint appearances of individuals (terms). As part of this, we also definitionally reformulate the global axioms, one by one, in addition to applying decomposition. We identify a broad class of prioritized default theories, generalizing default inheritance, for which our results especially bear fruit. For this asocially monadic class, decomposition permits reasoning to be localized to individuals (ground terms), and reduced to propositional. Our reformulation methods are implementable in polynomial time, and apply to several other NM formalisms beyond circumscription.

  20. Fish Is Food - The FAO’s Fish Price Index

    PubMed Central

    Tveterås, Sigbjørn; Asche, Frank; Bellemare, Marc F.; Smith, Martin D.; Guttormsen, Atle G.; Lem, Audun; Lien, Kristin; Vannuccini, Stefania

    2012-01-01

    World food prices hit an all-time high in February 2011 and are still almost two and a half times those of 2000. Although three billion people worldwide use seafood as a key source of animal protein, the Food and Agriculture Organization (FAO) of the United Nations–which compiles prices for other major food categories–has not tracked seafood prices. We fill this gap by developing an index of global seafood prices that can help to understand food crises and may assist in averting them. The fish price index (FPI) relies on trade statistics because seafood is heavily traded internationally, exposing non-traded seafood to price competition from imports and exports. Easily updated trade data can thus proxy for domestic seafood prices that are difficult to observe in many regions and costly to update with global coverage. Calculations of the extent of price competition in different countries support the plausibility of reliance on trade data. Overall, the FPI shows less volatility and fewer price spikes than other food price indices including oils, cereals, and dairy. The FPI generally reflects seafood scarcity, but it can also be separated into indices by production technology, fish species, or region. Splitting FPI into capture fisheries and aquaculture suggests increased scarcity of capture fishery resources in recent years, but also growth in aquaculture that is keeping pace with demand. Regionally, seafood price volatility varies, and some prices are negatively correlated. These patterns hint that regional supply shocks are consequential for seafood prices in spite of the high degree of seafood tradability. PMID:22590598
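    A minimal sketch of the general approach to a trade-based price index: compute unit values (trade value divided by quantity) per category and aggregate them with fixed base-period quantity weights. This illustrates a standard Laspeyres-style index on synthetic numbers; it is not the FAO's actual FPI methodology, and the species and figures are invented.

    ```python
    # Hypothetical trade records: (value in $, quantity in tonnes) per species.
    base = {"salmon": (300.0, 100.0), "shrimp": (200.0, 50.0)}
    current = {"salmon": (390.0, 100.0), "shrimp": (250.0, 50.0)}

    def unit_value(rec):
        """Price proxy: trade value divided by traded quantity."""
        value, qty = rec
        return value / qty

    def laspeyres_index(base, current):
        """Base-quantity-weighted price index (base period = 100)."""
        num = sum(unit_value(current[k]) * base[k][1] for k in base)
        den = sum(unit_value(base[k]) * base[k][1] for k in base)
        return 100.0 * num / den

    print(round(laspeyres_index(base, current), 1))  # 128.0
    ```

    Because the inputs are just trade values and quantities, such an index is easily updated whenever new trade statistics are released, which is the practical advantage the abstract emphasizes over collecting domestic prices directly.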

  1. Fish is food--the FAO's fish price index.

    PubMed

    Tveterås, Sigbjørn; Asche, Frank; Bellemare, Marc F; Smith, Martin D; Guttormsen, Atle G; Lem, Audun; Lien, Kristin; Vannuccini, Stefania

    2012-01-01

    World food prices hit an all-time high in February 2011 and are still almost two and a half times those of 2000. Although three billion people worldwide use seafood as a key source of animal protein, the Food and Agriculture Organization (FAO) of the United Nations-which compiles prices for other major food categories-has not tracked seafood prices. We fill this gap by developing an index of global seafood prices that can help to understand food crises and may assist in averting them. The fish price index (FPI) relies on trade statistics because seafood is heavily traded internationally, exposing non-traded seafood to price competition from imports and exports. Easily updated trade data can thus proxy for domestic seafood prices that are difficult to observe in many regions and costly to update with global coverage. Calculations of the extent of price competition in different countries support the plausibility of reliance on trade data. Overall, the FPI shows less volatility and fewer price spikes than other food price indices including oils, cereals, and dairy. The FPI generally reflects seafood scarcity, but it can also be separated into indices by production technology, fish species, or region. Splitting FPI into capture fisheries and aquaculture suggests increased scarcity of capture fishery resources in recent years, but also growth in aquaculture that is keeping pace with demand. Regionally, seafood price volatility varies, and some prices are negatively correlated. These patterns hint that regional supply shocks are consequential for seafood prices in spite of the high degree of seafood tradability.

  2. Fast Updating National Geo-Spatial Databases with High Resolution Imagery: China's Methodology and Experience

    NASA Astrophysics Data System (ADS)

    Chen, J.; Wang, D.; Zhao, R. L.; Zhang, H.; Liao, A.; Jiu, J.

    2014-04-01

    Geospatial databases are an irreplaceable national treasure of immense importance. Their up-to-dateness, that is, their consistency with respect to the real world, plays a critical role in their value and applications. The continuous updating of map databases at the 1:50,000 scale is a massive and difficult task for large countries covering several million square kilometers. This paper presents the research and technological development supporting national map updating at the 1:50,000 scale in China, including the development of updating models and methods, production tools and systems for large-scale and rapid updating, and the design and implementation of a continuous updating workflow. Many data sources had to be used and integrated into a high-accuracy, quality-checked product, which in turn required up-to-date techniques of image matching, semantic integration, generalization, database management, and conflict resolution. Specific software tools and packages were designed and developed to support large-scale updating production with high-resolution imagery and large-scale data generalization, covering map generalization, GIS-supported change interpretation from imagery, DEM interpolation, image-matching-based orthophoto generation, and data control at different levels. A national 1:50,000 database updating strategy and its production workflow were designed, including a full-coverage updating pattern characterized by all-element topographic data modeling, change detection in all related areas, and whole-process data quality control; a series of technical production specifications; and a network of updating production units in different geographic places across the country.

  3. Model-Based Engineering Design for Trade Space Exploration throughout the Design Cycle

    NASA Technical Reports Server (NTRS)

    Lamassoure, Elisabeth S.; Wall, Stephen D.; Easter, Robert W.

    2004-01-01

    This paper presents ongoing work to standardize model-based system engineering as a complement to point design development in the conceptual design phase of deep space missions. It summarizes two first steps towards practical application of this capability within the framework of concurrent engineering design teams and their customers. The first step is standard generation of system sensitivities models as the output of concurrent engineering design sessions, representing the local trade space around a point design. A review of the chosen model development process, and the results of three case study examples, demonstrate that a simple update to the concurrent engineering design process can easily capture sensitivities to key requirements. It can serve as a valuable tool to analyze design drivers and uncover breakpoints in the design. The second step is development of rough-order- of-magnitude, broad-range-of-validity design models for rapid exploration of the trade space, before selection of a point design. At least one case study demonstrated the feasibility to generate such models in a concurrent engineering session. The experiment indicated that such a capability could yield valid system-level conclusions for a trade space composed of understood elements. Ongoing efforts are assessing the practicality of developing end-to-end system-level design models for use before even convening the first concurrent engineering session, starting with modeling an end-to-end Mars architecture.

  4. A review of statistical updating methods for clinical prediction models.

    PubMed

    Su, Ting-Li; Jaki, Thomas; Hickey, Graeme L; Buchan, Iain; Sperrin, Matthew

    2018-01-01

    A clinical prediction model is a tool for predicting healthcare outcomes, usually within a specific population and context. A common approach is to develop a new clinical prediction model for each population and context; however, this wastes potentially useful historical information. A better approach is to update or incorporate the existing clinical prediction models already developed for use in similar contexts or populations. In addition, clinical prediction models commonly become miscalibrated over time, and need replacing or updating. In this article, we review a range of approaches for re-using and updating clinical prediction models; these fall into three main categories: simple coefficient updating, combining multiple previous clinical prediction models in a meta-model, and dynamic updating of models. We evaluated the performance (discrimination and calibration) of the different strategies using data on mortality following cardiac surgery in the United Kingdom. We found that no single strategy performed sufficiently well to be used to the exclusion of the others. In conclusion, useful tools exist for updating existing clinical prediction models to a new population or context, and these should be implemented rather than developing a new clinical prediction model from scratch, using a breadth of complementary statistical methods.
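    The simplest coefficient-updating strategy, recalibrating only the intercept so that mean predicted risk matches the observed event rate in the new population (often called "calibration-in-the-large"), can be sketched without any modeling library. The linear predictors and outcomes below are synthetic; this is an illustration of the idea, not the study's actual analysis.

    ```python
    import math

    def predict(lp):
        """Logistic model: risk from a linear predictor."""
        return 1.0 / (1.0 + math.exp(-lp))

    # Old model's linear predictors on the new population, plus observed outcomes.
    lps = [-2.0, -1.0, 0.0, 1.0, -1.5, 0.5]
    outcomes = [0, 0, 1, 1, 0, 1]

    def update_intercept(lps, outcomes, steps=200, lr=0.5):
        """Fit a single intercept correction by gradient descent on log-loss,
        keeping the old model's coefficients fixed."""
        a = 0.0
        n = len(lps)
        for _ in range(steps):
            grad = sum(predict(lp + a) - y for lp, y in zip(lps, outcomes)) / n
            a -= lr * grad
        return a

    a = update_intercept(lps, outcomes)
    # After recalibration, mean predicted risk matches the observed event rate:
    mean_pred = sum(predict(lp + a) for lp in lps) / len(lps)
    print(round(mean_pred, 3))  # 0.5
    ```

    Fuller coefficient-updating strategies additionally rescale the linear predictor (a calibration slope) or re-estimate selected coefficients, trading more flexibility against the risk of overfitting the new sample.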

  5. The Development of Time-Based Prospective Memory in Childhood: The Role of Working Memory Updating

    ERIC Educational Resources Information Center

    Voigt, Babett; Mahy, Caitlin E. V.; Ellis, Judi; Schnitzspahn, Katharina; Krause, Ivonne; Altgassen, Mareike; Kliegel, Matthias

    2014-01-01

    This large-scale study examined the development of time-based prospective memory (PM) across childhood and the roles that working memory updating and time monitoring play in driving age effects in PM performance. One hundred and ninety-seven children aged 5 to 14 years completed a time-based PM task where working memory updating load was…

  6. Guidance for updating clinical practice guidelines: a systematic review of methodological handbooks.

    PubMed

    Vernooij, Robin W M; Sanabria, Andrea Juliana; Solà, Ivan; Alonso-Coello, Pablo; Martínez García, Laura

    2014-01-02

    Updating clinical practice guidelines (CPGs) is a crucial process for maintaining the validity of recommendations. Methodological handbooks should provide guidance on both developing and updating CPGs. However, little is known about the updating guidance provided by these handbooks. We conducted a systematic review to identify and describe the updating guidance provided by CPG methodological handbooks; we included any handbook that provides updating guidance for CPGs. We searched in the Guidelines International Network library, US National Guidelines Clearinghouse and MEDLINE (PubMed) from 1966 to September 2013. Two authors independently selected the handbooks and extracted the data. We used descriptive statistics to analyze the extracted data and conducted a narrative synthesis. We included 35 handbooks. Most handbooks (97.1%) focus mainly on developing CPGs, including variable degrees of information about updating. Guidance on identifying new evidence and on the methodology of assessing the need for an update is described in 11 (31.4%) and eight handbooks (22.8%), respectively. The period of time between two updates is described in 25 handbooks (71.4%), two to three years being the most frequent (40.0%). The majority of handbooks do not provide guidance for the literature search, evidence selection, assessment, synthesis, and external review of the updating process. Guidance for updating CPGs is poorly described in methodological handbooks. This guidance should be more rigorous and explicit. This could lead to a more optimal updating process, and, ultimately, to valid and trustworthy guidelines.

  7. The development of executive functions and early mathematics: a dynamic relationship.

    PubMed

    Van der Ven, Sanne H G; Kroesbergen, Evelyn H; Boom, Jan; Leseman, Paul P M

    2012-03-01

    The relationship between executive functions and mathematical skills has been studied extensively, but results are inconclusive, and how this relationship evolves longitudinally is largely unknown. The aim was to investigate the factor structure of executive functions in inhibition, shifting, and updating; the longitudinal development of executive functions and mathematics; and the relation between them. A total of 211 children (7-8 years old) from 10 schools in the Netherlands participated; children were followed in grades 1 and 2 of primary education. Executive functions and mathematics were measured four times. The test battery contained multiple tasks for each executive function: Animal Stroop, local global, and Simon task for inhibition; Animal Shifting, Trail Making Test in Colours, and Sorting Task for shifting; and Digit Span Backwards, Odd One Out, and Keep Track for updating. The factor structure of executive functions was assessed and relations with mathematics were investigated using growth modelling. Confirmatory factor analysis (CFA) showed that inhibition and shifting could not be distinguished from each other. Updating was a separate factor, and its development was strongly related to mathematical development, while inhibition and shifting did not predict mathematics in the presence of the updating factor. The strong relationship between updating and mathematics suggests that updating skills play a key role in the maths learning process. This makes updating a promising target for future intervention studies. ©2011 The British Psychological Society.

  8. Global Location-Based Access to Web Applications Using Atom-Based Automatic Update

    NASA Astrophysics Data System (ADS)

    Singh, Kulwinder; Park, Dong-Won

    We propose an architecture that enables people to enquire by voice, using regular phones, about information available in directory services. We implement a Virtual User Agent (VUA) which mediates between the human user and a business directory service. The system enables the user to search for the nearest clinic, gas station by price, motel by price, food/coffee, banks/ATMs, etc., and to fix an appointment, or to automatically establish a call between the user and the business party if the user prefers. The user also has the option to receive appointment confirmation by phone, SMS, or e-mail. The VUA is accessible to anyone, anywhere, at any time through a toll-free DID (Direct Inward Dialing) number. We use the Euclidean formula for distance measurement, since shorter geodesic distances (along the Earth's surface) correspond to shorter Euclidean distances (measured along a straight line through the Earth). Our proposed architecture uses the Atom XML syndication format for data integration, VoiceXML for creating the voice user interface (VUI), and CCXML for controlling the call components. We also provide an efficient algorithm for parsing the Atom feeds that supply data to the system. Moreover, we describe a cost-effective way to provide global access to the VUA based on Asterisk (an open-source IP-PBX), and we outline how the system can be integrated with GPS to locate the user's coordinates and thereby sharpen the system's responses. Additionally, the system has a mechanism for validating the phone numbers in its database, and it automatically updates those numbers and other information, such as daily gas and motel prices, through an Atom-based feed. Current commercial directory services (e.g., 411) have no facility for updating their listings automatically, which is why callers often receive out-of-date phone numbers or other information.
Our system can be integrated very easily with an existing web infrastructure, making the wealth of Web information easily available to the user by phone. Such a system could be deployed as an extension to 911 and 411 services to share the workload with human operators. This paper presents the underlying principles, architecture, and features of our proposed system, together with an example of a real-world deployment. The source code and documentation are available for commercial production.
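    The distance reasoning above — ranking candidates by straight-line chord length through the Earth, which is monotone in great-circle distance — can be sketched as follows. This is an illustration only; the function names and the nearest-candidate helper are not part of the described system:

```python
import math

EARTH_RADIUS_KM = 6371.0

def chord_distance_km(lat1, lon1, lat2, lon2):
    """Straight-line (chord) distance through the Earth between two
    lat/lon points given in degrees. It is monotone in great-circle
    distance, so it ranks candidates identically for nearest search."""
    phi1, lam1, phi2, lam2 = map(math.radians, (lat1, lon1, lat2, lon2))
    # Convert both points to 3-D Cartesian coordinates on the unit sphere.
    x1, y1, z1 = math.cos(phi1) * math.cos(lam1), math.cos(phi1) * math.sin(lam1), math.sin(phi1)
    x2, y2, z2 = math.cos(phi2) * math.cos(lam2), math.cos(phi2) * math.sin(lam2), math.sin(phi2)
    return EARTH_RADIUS_KM * math.sqrt((x1 - x2) ** 2 + (y1 - y2) ** 2 + (z1 - z2) ** 2)

def nearest(user, candidates):
    """Return the (name, lat, lon) candidate closest to the user (lat, lon)."""
    return min(candidates, key=lambda c: chord_distance_km(user[0], user[1], c[1], c[2]))
```

    Because ranking, not absolute distance, is what the nearest-business query needs, the chord avoids the trigonometry of a full great-circle (haversine) computation.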

  9. The updating of clinical practice guidelines: insights from an international survey

    PubMed Central

    2011-01-01

    Background Clinical practice guidelines (CPGs) have become increasingly popular, and the methodology to develop guidelines has evolved enormously. However, little attention has been given to the updating process, in contrast to the appraisal of the available literature. We conducted an international survey to identify current practices in CPG updating and explored the need to standardize and improve the methods. Methods We developed a questionnaire (28 items) based on a review of the existing literature about guideline updating and expert comments. We carried out the survey between March and July 2009, and it was sent by email to 106 institutions: 69 members of the Guidelines International Network who declared that they developed CPGs; 30 institutions included in the U.S. National Guideline Clearinghouse database that published more than 20 CPGs; and 7 institutions selected by an expert committee. Results Forty-four institutions answered the questionnaire (42% response rate). In the final analysis, 39 completed questionnaires were included. Thirty-six institutions (92%) reported that they update their guidelines. Thirty-one institutions (86%) have a formal procedure for updating their guidelines, and 19 (53%) have a formal procedure for deciding when a guideline becomes out of date. Institutions describe the process as moderately rigorous (36%) or acknowledge that it could certainly be more rigorous (36%). Twenty-two institutions (61%) alert guideline users on their website when a guideline is older than three to five years or when there is a risk of being outdated. Twenty-five institutions (64%) support the concept of "living guidelines," which are continuously monitored and updated. Eighteen institutions (46%) have plans to design a protocol to improve their guideline-updating process, and 21 (54%) are willing to share resources with other organizations. 
Conclusions Our study is the first to describe the process of updating CPGs among prominent guideline institutions across the world, providing a comprehensive picture of guideline updating. There is an urgent need to develop rigorous international standards for this process and to minimize duplication of effort internationally. PMID:21914177

  10. Ontology for the asexual development and anatomy of the colonial chordate Botryllus schlosseri.

    PubMed

    Manni, Lucia; Gasparini, Fabio; Hotta, Kohji; Ishizuka, Katherine J; Ricci, Lorenzo; Tiozzo, Stefano; Voskoboynik, Ayelet; Dauga, Delphine

    2014-01-01

    Ontologies provide an important resource for integrating information. In developmental biology and comparative anatomy studies, species ontologies are used to formalize and annotate data related to anatomical structures, their lineage, and their timing of development. Here, we have constructed the first ontology for the anatomy and asexual development (blastogenesis) of a bilaterian, the colonial tunicate Botryllus schlosseri. Tunicates such as Botryllus schlosseri are non-vertebrate chordates and the only chordate taxon whose species reproduce both sexually and asexually. Their tadpole larval stage possesses structures characteristic of all chordates, i.e. a notochord, a dorsal neural tube, and gill slits. Larvae settle and metamorphose into individuals that are either solitary or colonial. The latter reproduce both sexually and asexually, and these two reproductive modes lead to essentially the same adult body plan. The Botryllus schlosseri Ontology of Development and Anatomy (BODA) will facilitate comparison between the two types of development. BODA follows the rules defined by the Open Biomedical Ontologies Foundry. It is based on studies that investigate the anatomy, blastogenesis, and regeneration of this organism. BODA's features allow users to easily search for and identify anatomical structures in the colony, to define the developmental stage, and to follow the morphogenetic events of a tissue and/or organ of interest throughout asexual development. We invite the scientific community to use this resource as a reference for the anatomy and developmental ontology of B. schlosseri and encourage recommendations for updates and improvements.

  11. Methods to Develop the Eye-tem Bank to Measure Ophthalmic Quality of Life.

    PubMed

    Khadka, Jyoti; Fenwick, Eva; Lamoureux, Ecosse; Pesudovs, Konrad

    2016-12-01

    There is an increasing demand for high-standard, comprehensive, and reliable patient-reported outcome (PRO) instruments across all disciplines of health care, including ophthalmology and optometry. Over the past two decades, a plethora of PRO instruments have been developed to assess the impact of eye diseases and their treatments. Despite this large number of instruments, significant shortcomings remain in the measurement of ophthalmic quality of life (QoL). Most PRO instruments are short forms designed for clinical use, which limits their content coverage and means they often poorly target any study population other than the one for which they were developed. Moreover, existing instruments are static, paper-and-pencil based, and not easily updated, leading to outdated and irrelevant item content; and scores obtained from different PRO instruments may not be directly comparable. These shortcomings can be addressed by item banking implemented with computer-adaptive testing (CAT). We therefore designed a multicenter project (the Eye-tem Bank project) to develop and validate such PROs for the comprehensive measurement of ophthalmic QoL in eye diseases. Development of the Eye-tem Bank follows four phases: Phase I, content development; Phase II, pilot testing and item calibration; Phase III, validation; and Phase IV, evaluation. The project will deliver technologically advanced, comprehensive QoL PROs in the form of item banks implemented via a CAT system for eye diseases. Here, we present a detailed methodological framework for this project.

  12. Ontology for the Asexual Development and Anatomy of the Colonial Chordate Botryllus schlosseri

    PubMed Central

    Manni, Lucia; Gasparini, Fabio; Hotta, Kohji; Ishizuka, Katherine J.; Ricci, Lorenzo; Tiozzo, Stefano; Voskoboynik, Ayelet; Dauga, Delphine

    2014-01-01

    Ontologies provide an important resource for integrating information. In developmental biology and comparative anatomy studies, species ontologies are used to formalize and annotate data related to anatomical structures, their lineage, and their timing of development. Here, we have constructed the first ontology for the anatomy and asexual development (blastogenesis) of a bilaterian, the colonial tunicate Botryllus schlosseri. Tunicates such as Botryllus schlosseri are non-vertebrate chordates and the only chordate taxon whose species reproduce both sexually and asexually. Their tadpole larval stage possesses structures characteristic of all chordates, i.e. a notochord, a dorsal neural tube, and gill slits. Larvae settle and metamorphose into individuals that are either solitary or colonial. The latter reproduce both sexually and asexually, and these two reproductive modes lead to essentially the same adult body plan. The Botryllus schlosseri Ontology of Development and Anatomy (BODA) will facilitate comparison between the two types of development. BODA follows the rules defined by the Open Biomedical Ontologies Foundry. It is based on studies that investigate the anatomy, blastogenesis, and regeneration of this organism. BODA's features allow users to easily search for and identify anatomical structures in the colony, to define the developmental stage, and to follow the morphogenetic events of a tissue and/or organ of interest throughout asexual development. We invite the scientific community to use this resource as a reference for the anatomy and developmental ontology of B. schlosseri and encourage recommendations for updates and improvements. PMID:24789338

  13. A Real-time 3D Visualization of Global MHD Simulation for Space Weather Forecasting

    NASA Astrophysics Data System (ADS)

    Murata, K.; Matsuoka, D.; Kubo, T.; Shimazu, H.; Tanaka, T.; Fujita, S.; Watari, S.; Miyachi, H.; Yamamoto, K.; Kimura, E.; Ishikura, S.

    2006-12-01

    Recently, many satellites for communication networks and scientific observation have been launched into the vicinity of the Earth (geo-space). The electromagnetic (EM) environments around these spacecraft are continuously influenced by the solar wind blowing from the Sun and by induced electromagnetic fields, which occasionally cause trouble or damage, such as electrification and interference, to the spacecraft. Forecasting the geo-space EM environment is therefore as important as ordinary ground weather forecasting. Owing to the remarkable recent progress of supercomputer technologies, numerical simulations have become powerful research methods in solar-terrestrial physics. For space weather forecasting, NICT (National Institute of Information and Communications Technology) has developed a real-time global MHD simulation system of solar wind-magnetosphere-ionosphere coupling, which runs on an SX-6 supercomputer. Real-time solar wind parameters from the ACE spacecraft, received every minute, are adopted as boundary conditions for the simulation. Simulation results (2-D plots) are updated every minute on a NICT website. However, 3-D visualization of the simulation results is indispensable for forecasting space weather more accurately. In the present study, we develop a real-time 3-D website for the global MHD simulations. Three-dimensional visualizations of the simulation results are updated every 20 minutes in three formats: (1) streamlines of magnetic field lines; (2) isosurfaces of temperature in the magnetosphere; and (3) isolines of conductivity and an orthogonal plane of potential in the ionosphere. We also implemented a 3-D viewer application, an ActiveX component for the Internet Explorer browser built on AVS/Express. Numerical data are saved in HDF5-format files every minute.
Users can easily search, retrieve and plot past simulation results (3D visualization data and numerical data) by using the STARS (Solar-terrestrial data Analysis and Reference System). The STARS is a data analysis system for satellite and ground-based observation data for solar-terrestrial physics.

  14. PyChimera: use UCSF Chimera modules in any Python 2.7 project.

    PubMed

    Rodríguez-Guerra Pedregal, Jaime; Maréchal, Jean-Didier

    2018-05-15

    UCSF Chimera is a powerful visualization tool with a remarkable presence in the computational chemistry and structural biology communities. Since it is built on a C++ core wrapped in a Python 2.7 environment, one might expect to be able to import UCSF Chimera's arsenal of resources easily into custom scripts or software projects. Nonetheless, because of the isolation of the platform, this is not readily possible unless the script is executed within UCSF Chimera itself. UCSF ChimeraX, the successor to the original Chimera, partially solves the problem, but major upgrades are still needed before this updated version offers all of UCSF Chimera's features. PyChimera has been developed to overcome these limitations and provide access to the UCSF Chimera codebase from any Python 2.7 interpreter, including interactive programming with tools such as IPython and Jupyter Notebooks, making it easier to use with additional third-party software. PyChimera is LGPL-licensed and available at https://github.com/insilichem/pychimera. jaime.rodriguezguerra@uab.cat or jeandidier.marechal@uab.cat. Supplementary data are available at Bioinformatics online.

  15. The QKD network: model and routing scheme

    NASA Astrophysics Data System (ADS)

    Yang, Chao; Zhang, Hongqi; Su, Jinhai

    2017-11-01

    Quantum key distribution (QKD) technology can establish unconditionally secure keys between two communicating parties. Although the technology has some inherent constraints, such as distance and point-to-point operation, building a QKD network from multiple point-to-point QKD devices can overcome them. Given the current level of the technology, a trust-relaying QKD network is the first choice for building a practical QKD network. However, previous research has not addressed routing on the trust-relaying QKD network in detail. This paper focuses on the routing issues: it builds a model of the trust-relaying QKD network that makes the network easier to analyse and understand, and proposes a dynamic routing scheme for it. Following the design of dynamic routing schemes in classical networks, the proposed scheme consists of three components: a Hello protocol that shares network topology information, a routing algorithm that selects a set of suitable paths and establishes the routing table, and a link-state update mechanism that keeps the routing table up to date. Experiments and evaluation demonstrate the validity and effectiveness of the proposed routing scheme.
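    The paper's routing algorithm is not reproduced here, but the path-selection component of such a scheme can be sketched as a shortest-path search over the relaying topology. The inverse-key-rate link cost below is our assumption for illustration, not the paper's metric:

```python
import heapq

def best_relay_path(links, src, dst):
    """Dijkstra search over an undirected trust-relaying topology.

    `links` maps (node_a, node_b) -> current secret-key rate; we use
    1/rate as the link cost, so paths whose links hold plentiful key
    material are preferred. Returns (total_cost, [path]) or None if
    dst is unreachable."""
    graph = {}
    for (a, b), rate in links.items():
        cost = 1.0 / rate
        graph.setdefault(a, []).append((b, cost))
        graph.setdefault(b, []).append((a, cost))
    heap = [(0.0, src, [src])]
    seen = set()
    while heap:
        cost, node, path = heapq.heappop(heap)
        if node == dst:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, c in graph.get(node, []):
            if nxt not in seen:
                heapq.heappush(heap, (cost + c, nxt, path + [nxt]))
    return None
```

    In a dynamic scheme, the link-state update mechanism would refresh the rates in `links` as key pools are consumed and replenished, after which paths are simply recomputed.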

  16. Phantom study and accuracy evaluation of an image-to-world registration approach used with electro-magnetic tracking system for neurosurgery

    NASA Astrophysics Data System (ADS)

    Li, Senhu; Sarment, David

    2015-12-01

    Minimally invasive neurosurgery needs intraoperative imaging updates and a highly efficient image guidance system to facilitate the procedure. This work introduces an automatic image-guided system used with a compact, mobile intraoperative CT imager. We designed a tracking frame that can easily be attached to a commercially available skull clamp. Because the geometry of the fiducials and the tracking sensor on this rigid frame, fabricated by high-precision 3D printing, is known, an accurate, fully automatic registration method could be developed in a simple and inexpensive way; the known geometry also helps in estimating errors from fiducial localization, in image space through image processing and in patient space through calibration of the tracking frame. Our phantom study shows a fiducial registration error of 0.348+/-0.028 mm, compared with a manual registration error of 1.976+/-0.778 mm. The system provides robust and accurate image-to-patient registration without interrupting the routine surgical workflow or requiring any user interaction during neurosurgery.
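    The paper does not publish its registration code; the standard point-based rigid registration underlying this kind of system (the SVD/Kabsch least-squares solution, plus the fiducial registration error metric) can be sketched as:

```python
import numpy as np

def rigid_register(fixed, moving):
    """Least-squares rigid transform (rotation R, translation t) that maps
    the `moving` fiducials onto the `fixed` ones, via the SVD (Kabsch)
    method on centered point sets."""
    fixed, moving = np.asarray(fixed, float), np.asarray(moving, float)
    cf, cm = fixed.mean(axis=0), moving.mean(axis=0)
    H = (moving - cm).T @ (fixed - cf)          # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    # Correct an improper solution so R is a rotation, not a reflection.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cf - R @ cm
    return R, t

def fiducial_registration_error(fixed, moving, R, t):
    """RMS distance between the transformed moving fiducials and the fixed ones."""
    resid = (np.asarray(moving) @ R.T + t) - np.asarray(fixed)
    return float(np.sqrt((resid ** 2).sum(axis=1).mean()))
```

    With the frame's fiducial geometry known a priori, both point sets can be produced automatically (by image processing on the CT and by the tracker), which is what removes the manual point-picking step.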

  17. PAQ: Persistent Adaptive Query Middleware for Dynamic Environments

    NASA Astrophysics Data System (ADS)

    Rajamani, Vasanth; Julien, Christine; Payton, Jamie; Roman, Gruia-Catalin

    Pervasive computing applications often entail continuous monitoring tasks, issuing persistent queries that return continuously updated views of the operational environment. We present PAQ, a middleware that supports these applications' needs by approximating a persistent query as a sequence of one-time queries. PAQ introduces an integration strategy abstraction that allows composition of one-time query responses into streams representing sophisticated spatio-temporal phenomena of interest. A distinguishing feature of our middleware is the recognition that the suitability of a persistent query's result is a function of the application's tolerance for accuracy weighed against the associated overhead costs. In PAQ, programmers can specify an inquiry strategy that dictates how information is gathered. Since network dynamics affect the suitability of a particular inquiry strategy, PAQ associates an introspection strategy with each persistent query that evaluates the quality of the query's results. The result of introspection can trigger application-defined adaptation strategies that alter the nature of the query. PAQ's simple API makes adaptive querying systems straightforward to develop. We present the key abstractions, describe their implementations, and demonstrate the middleware's usefulness through application examples and evaluation.
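    The core abstraction — a persistent query approximated by repeated one-time queries whose responses an application-supplied integration strategy composes into a view, with optional introspection triggering adaptation — can be illustrated with a toy sketch. None of these names reflect PAQ's actual API:

```python
import statistics

class PersistentQuery:
    """Toy illustration of the PAQ idea: a persistent query realized as a
    sequence of one-time queries. `integrate` composes the response history
    into the current view; `introspect` (optional) judges result quality and,
    when it fails, triggers a simple adaptation (dropping stale history)."""

    def __init__(self, one_time_query, integrate, introspect=None):
        self.one_time_query = one_time_query   # callable: () -> snapshot
        self.integrate = integrate             # callable: (history) -> view
        self.introspect = introspect           # callable: (history) -> bool
        self.history = []

    def step(self):
        self.history.append(self.one_time_query())
        view = self.integrate(self.history)
        if self.introspect and not self.introspect(self.history):
            self.history = self.history[-1:]   # adaptation: keep latest only
        return view

# Example: a running mean over simulated sensor snapshots.
readings = iter([20.0, 21.0, 19.0])
pq = PersistentQuery(lambda: next(readings),
                     integrate=lambda h: statistics.mean(h))
```

    In PAQ proper, the inquiry strategy additionally controls how each one-time query gathers data from the network, which is where the accuracy-versus-overhead trade-off is expressed.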

  18. Ambulatory position and orientation tracking fusing magnetic and inertial sensing.

    PubMed

    Roetenberg, Daniel; Slycke, Per J; Veltink, Peter H

    2007-05-01

    This paper presents the design and testing of a portable magnetic system combined with miniature inertial sensors for ambulatory 6-degrees-of-freedom (DOF) human motion tracking. The magnetic system consists of three orthogonal coils (the source), fixed to the body, and 3-D magnetic sensors, fixed to remote body segments, which measure the fields generated by the source. Based on the measured signals, a processor calculates the relative positions and orientations between source and sensor. Magnetic actuation requires a substantial amount of energy, which limits the update rate attainable with a set of batteries; moreover, the magnetic field can easily be disturbed by ferromagnetic materials or other sources. Inertial sensors can be sampled at high rates, require little energy, and do not suffer from magnetic interference; however, accelerometers and gyroscopes can only measure changes in position and orientation and suffer from integration drift. By combining measurements from both systems in a complementary Kalman filter structure, an optimal solution for position and orientation estimates is obtained: the magnetic system provides 6-DOF measurements at a relatively low update rate, while the inertial sensors track the changes in position and orientation between the magnetic updates. The implemented system was tested against a lab-bound camera tracking system for several functional body movements. The accuracy was about 5 mm for position and 3 degrees for orientation measurements. Errors were larger during movements with high velocities, due to relative movement between source and sensor within one cycle of magnetic actuation.
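    The fusion idea — high-rate inertial prediction that drifts, corrected by low-rate absolute magnetic measurements — can be conveyed with a one-dimensional toy. The real system is a full 6-DOF complementary Kalman filter; the scalar model, the gain, and all numbers below are invented for illustration:

```python
def fuse(gyro_rates, mag_angles, dt, gain=0.2):
    """Toy 1-D complementary fusion. `gyro_rates` are high-rate angular-rate
    samples (rad/s); `mag_angles` is the same length, holding an absolute
    angle (rad) where a low-rate magnetic update occurred and None elsewhere.
    Integration predicts the angle (and accumulates drift); each magnetic
    update nudges the estimate toward the absolute measurement."""
    angle = 0.0
    out = []
    for rate, mag in zip(gyro_rates, mag_angles):
        angle += rate * dt                  # inertial prediction (drifts)
        if mag is not None:                 # magnetic correction step
            angle += gain * (mag - angle)
        out.append(angle)
    return out
```

    With a biased gyro and no corrections the estimate drifts without bound; with sparse magnetic updates the drift is held to a bounded residual set by the gain, which mirrors the role of the Kalman update in the paper's filter.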

  19. An Interactive Map Viewer for the Urban Geology of Ottawa (Canada): an Example of Web Publishing

    NASA Astrophysics Data System (ADS)

    Giroux, D.; Bélanger, R.

    2003-04-01

    Developed by the Terrain Sciences Division (TSD) of the Geological Survey of Canada (GSC), an interactive map viewer, called GEOSERV (www.geoserv.org), is now available on the Internet. The purpose of this viewer is to provide engineers, planners, decision makers, and the general public with the geoscience information required for sound regional planning in densely populated areas, such as Canada's national capital, Ottawa (Ontario). Urban geology studies rely on diverse branches of earth sciences such as hydrology, engineering geology, geochemistry, stratigraphy, and geomorphology in order to build a three-dimensional model of the character of the land and to explain the geological processes involved in the dynamic equilibrium of the local environment. Over the past few years, TSD has compiled geoscientific information derived from various sources such as borehole logs, geological maps, hydrological reports and digital elevation models, compiled it in digital format and stored it in georeferenced databases in the form of point, linear, and polygonal data. This information constitutes the geoscience knowledge base which is then processed by Geographic Information Systems (GIS) to integrate the various sources of information and produce derived graphics, maps and models describing the geological infrastructure and response of the geological environment to human activities. Urban Geology of Canada's National Capital Area is a pilot project aiming at developing approaches, methodologies and standards that can be applied to other major urban centres of the country, while providing the geoscience knowledge required for sound regional planning and environmental protection of the National Capital Area. 
    Based on ArcIMS, an application developed by ESRI (Environmental Systems Research Institute), the TSD has customized this web application to give free access to geoscience information for the Ottawa/Outaouais (Ontario/Québec) area, including geological history, a subsurface database, stratigraphy, bedrock, surficial, and hydrogeology maps, and a few others. At present, each layer of geospatial information in TSD's interactive map viewer is connected to simple independent flat files (i.e. shapefiles), but it is also possible to connect GEOSERV to other types of (relational) databases (e.g. Microsoft SQL Server, Oracle). Frequent updating of shapefiles can be cumbersome when new records are added, since the updated shapefiles must be completely rebuilt; new attributes, however, can be added to existing shapefiles easily. At present, the updating process cannot be done on the fly; we must stop and restart the affected MapService whenever one of its shapefiles changes. The public can access seventeen MapServices that provide interactive tools to query, zoom, pan, select, and so on, and to print the map displayed on their monitor. The map viewer is lightweight, using only HTML and JavaScript, so end users do not have to download and install any plug-ins. A free CD and a companion web site were also developed to give access to complementary information, such as high-resolution raster maps and reports. Some of the datasets are available free of charge, online.

  20. Numerical model updating technique for structures using firefly algorithm

    NASA Astrophysics Data System (ADS)

    Sai Kubair, K.; Mohan, S. C.

    2018-03-01

    Numerical model updating is a technique for updating existing numerical models of structures in civil, mechanical, automotive, marine, aerospace engineering, etc., so that they closely match experimental data obtained from real or prototype test structures. The present work develops the numerical model in MATLAB, with mathematical equations that define the experimental model, and uses the firefly algorithm as the optimization tool. In the updating process, a response parameter of the structure is chosen to correlate the numerical model with the experimental results; the updating variables can be material properties of the model, geometrical properties, or both. To verify the proposed technique, a cantilever beam is analyzed for its tip deflection and a space frame for its natural frequencies, and both models are updated with the respective response values obtained from experiments. The numerical results after updating show that the experimental and numerical models can be brought into close agreement.
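    The paper's MATLAB implementation is not reproduced here; a minimal sketch of the firefly search applied to a toy one-parameter updating problem (matching a cantilever's analytical tip deflection to a "measured" value, with invented numbers) looks like this. It follows Yang's standard firefly formulation, not the paper's exact settings:

```python
import math
import random

def firefly_minimize(objective, bounds, n=15, iters=60,
                     beta0=1.0, gamma=0.01, alpha=0.1, seed=0):
    """Minimal 1-D firefly algorithm: each firefly is a candidate parameter
    value, brightness is the negative objective, and dimmer fireflies move
    toward brighter ones with attractiveness beta0*exp(-gamma*r^2) plus a
    shrinking random step."""
    rng = random.Random(seed)
    lo, hi = bounds
    xs = [rng.uniform(lo, hi) for _ in range(n)]
    for _ in range(iters):
        fit = [objective(x) for x in xs]
        for i in range(n):
            for j in range(n):
                if fit[j] < fit[i]:                      # firefly j is brighter
                    beta = beta0 * math.exp(-gamma * (xs[i] - xs[j]) ** 2)
                    xs[i] += beta * (xs[j] - xs[i]) + alpha * (rng.random() - 0.5)
                    xs[i] = min(max(xs[i], lo), hi)      # stay inside bounds
                    fit[i] = objective(xs[i])
        alpha *= 0.97                                    # damp the random walk
    return min(xs, key=objective)

# Toy updating problem: tune Young's modulus E (in GPa) so the analytical
# tip deflection P*L^3 / (3*E*I) matches a "measured" value (numbers invented).
P, L, I, measured = 1000.0, 2.0, 8e-6, 0.012
tip = lambda e_gpa: P * L ** 3 / (3.0 * (e_gpa * 1e9) * I)
E_updated = firefly_minimize(lambda e: abs(tip(e) - measured), (1.0, 100.0))
```

    In the paper the objective is instead the discrepancy between the FE model's response (tip deflection or natural frequencies) and the experimental values, but the search loop has the same shape.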

  1. Robust master-slave synchronization for general uncertain delayed dynamical model based on adaptive control scheme.

    PubMed

    Wang, Tianbo; Zhou, Wuneng; Zhao, Shouwei; Yu, Weiqin

    2014-03-01

    In this paper, the robust exponential synchronization problem for a class of uncertain delayed master-slave dynamical systems is investigated using the adaptive control method. Unlike some existing master-slave models, the considered master-slave system includes bounded unmodeled dynamics. To compensate for the effect of the unmodeled dynamics and effectively achieve synchronization, a novel adaptive controller with simple update laws is proposed. Moreover, the results are given in terms of LMIs, which can easily be solved with the LMI Toolbox in MATLAB. A numerical example is given to illustrate the effectiveness of the method. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.
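    The paper's controller is derived via LMIs; the flavor of a simple adaptive update law compensating a bounded disturbance can be conveyed with a scalar toy simulation. This is our illustration, not the paper's design:

```python
import math

def simulate_sync(T=20.0, dt=0.001, gamma=10.0):
    """Scalar master-slave synchronization error obeying
    e' = -e + d(t) - k*e, where d(t) is a bounded unmodeled disturbance
    and the feedback gain k grows via the simple adaptive update law
    k' = gamma * e**2 until the error is driven into a small
    neighborhood of zero. Forward-Euler integration."""
    e, k, t = 1.0, 0.0, 0.0
    while t < T:
        d = 0.5 * math.sin(3.0 * t)   # bounded unmodeled dynamics
        e += dt * (-e + d - k * e)    # error dynamics under control u = -k*e
        k += dt * gamma * e * e       # adaptive gain update law
        t += dt
    return e, k

e_final, k_final = simulate_sync()
```

    The gain stops growing once the error is small, so k settles at a finite value while |e| is held near zero despite the persistent disturbance, which is the qualitative behavior the adaptive scheme guarantees.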

  2. USGS Coal Desorption Equipment and a Spreadsheet for Analysis of Lost and Total Gas from Canister Desorption Measurements

    USGS Publications Warehouse

    Barker, Charles E.; Dallegge, Todd A.; Clark, Arthur C.

    2002-01-01

    We have updated a simple polyvinyl chloride plastic canister design by adding internal headspace temperature measurement, and redesigned it so that it is made mostly from off-the-shelf components for ease of construction. Using self-closing quick connects, this basic canister is mated to a zero-head manometer to make a simple coalbed methane desorption system that is easily transported in small aircraft to remote localities. The equipment is used to gather timed measurements of pressure, volume, and temperature that are corrected to standard temperature and pressure (STP) and graphically analyzed using an Excel-based spreadsheet. Used together, these elements form an effective, practical canister desorption method.
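    The spreadsheet's core correction, reducing each timed pressure-volume-temperature reading to standard conditions via the combined gas law, can be sketched as follows. The STP convention shown (101.325 kPa, 0 °C) is an assumption; conventions vary, so adjust to match the one in use:

```python
def volume_at_stp(v_measured_cc, pressure_kpa, temp_c,
                  stp_pressure_kpa=101.325, stp_temp_c=0.0):
    """Correct a desorbed-gas volume reading to standard pressure and
    temperature using the combined gas law V2 = V1 * (P1/P2) * (T2/T1),
    with temperatures converted to kelvin."""
    t1 = temp_c + 273.15
    t2 = stp_temp_c + 273.15
    return v_measured_cc * (pressure_kpa / stp_pressure_kpa) * (t2 / t1)
```

    Applying this to every timed reading puts all canister measurements on a common basis before the lost-gas and total-gas graphical analysis.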

  3. Malassezia (Pityrosporum) Folliculitis

    PubMed Central

    Rubenstein, Richard M.

    2014-01-01

    Malassezia (Pityrosporum) folliculitis is a fungal acneiform condition commonly misdiagnosed as acne vulgaris. Although often associated with common acne, this condition may persist for years without complete resolution with typical acne medications. Malassezia folliculitis results from overgrowth of yeast present in the normal cutaneous flora. Eruptions may be associated with conditions altering this flora, such as immunosuppression and antibiotic use. The most common presentation is monomorphic papules and pustules, often on the chest, back, posterior arms, and face. Oral antifungals are the most effective treatment and result in rapid improvement. The association with acne vulgaris may require combinations of both antifungal and acne medications. This article reviews and updates readers on this not uncommon, but easily missed, condition. PMID:24688625

  4. Exploring first-order phase transitions with population annealing

    NASA Astrophysics Data System (ADS)

    Barash, Lev Yu.; Weigel, Martin; Shchur, Lev N.; Janke, Wolfhard

    2017-03-01

    Population annealing is a hybrid of sequential and Markov chain Monte Carlo methods geared towards the efficient parallel simulation of systems with complex free-energy landscapes. Systems with first-order phase transitions are among the problems in computational physics that are difficult to tackle with standard methods such as local-update simulations in the canonical ensemble, for example with the Metropolis algorithm. It is hence interesting to see whether such transitions can be more easily studied using population annealing. We report here our preliminary observations from population annealing runs for the two-dimensional Potts model with q > 4, where it undergoes a first-order transition.
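    Population annealing's reweight-resample-equilibrate cycle can be sketched on a toy continuous system; a double-well energy stands in for the Potts model here, and all parameters are illustrative rather than taken from the study:

```python
import math
import random

def population_annealing(energy, propose, beta_schedule, pop_size=100, sweeps=5, seed=1):
    """Skeleton of the population-annealing cycle: at each inverse-temperature
    step, reweight replicas by exp(-dbeta * E), resample the population in
    proportion to those weights, then equilibrate each survivor with a few
    Metropolis updates at the new temperature."""
    rng = random.Random(seed)
    pop = [propose(None, rng) for _ in range(pop_size)]   # initial states
    beta_prev = 0.0
    for beta in beta_schedule:
        dbeta = beta - beta_prev
        weights = [math.exp(-dbeta * energy(x)) for x in pop]
        # Resample proportionally to the Boltzmann reweighting factors.
        pop = rng.choices(pop, weights=weights, k=pop_size)
        # Local Metropolis updates at the new temperature.
        for i in range(pop_size):
            x = pop[i]
            for _ in range(sweeps):
                y = propose(x, rng)
                if rng.random() < math.exp(-beta * max(0.0, energy(y) - energy(x))):
                    x = y
            pop[i] = x
        beta_prev = beta
    return pop

# Toy system: one real variable with a double-well energy (illustrative only).
energy = lambda x: (x * x - 1.0) ** 2
propose = lambda x, rng: rng.uniform(-2, 2) if x is None else x + rng.gauss(0, 0.3)
final = population_annealing(energy, propose, [0.5, 1, 2, 5, 10])
```

    For a first-order transition the interesting diagnostic is how the population weight redistributes between coexisting phases across the transition temperature, which the resampling step handles without the long tunneling times that plague purely local updates.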

  5. Safeguards and Security by Design (SSBD) for Small Modular Reactors (SMRs) through a Common Global Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Badwan, Faris M.; Demuth, Scott Francis; Miller, Michael Conrad

    Small Modular Reactors (SMRs) with power levels significantly below those of the currently standard 1000- to 1600-MWe reactors have been proposed as a potential game changer for future nuclear power. SMRs may offer a simpler, more standardized, and safer modular design by using factory-built and easily transportable components. Additionally, SMRs may be more easily built and operated in isolated locations, and may require a smaller initial capital investment and shorter construction times. Because many SMR designs are still conceptual and consequently not yet fixed, designers have a unique opportunity to incorporate updated design-basis threats and emergency preparedness requirements, and to fully integrate safety, physical security, and safeguards/material control and accounting (MC&A) designs. Integrating safety, physical security, and safeguards is often referred to as integrating the 3Ss, and early consideration of safeguards and security in the design is often referred to as safeguards and security by design (SSBD). This paper describes U.S./Russian collaborative efforts toward developing an internationally accepted common approach for implementing SSBD/3Ss for SMRs based upon domestic requirements and international guidance and requirements. These collaborative efforts originated with the Nuclear Energy and Nuclear Security working group established under the U.S.-Russia Bilateral Presidential Commission during the 2009 Presidential Summit. Initial efforts have focused on reviews of U.S. and Russian domestic requirements for security and MC&A, IAEA guidance for security and MC&A, and IAEA requirements for international safeguards. Additionally, example SMR design features that can enhance proliferation resistance and physical security have been collected from past work and are reported here. The development of a U.S./Russian common approach for SSBD/3Ss should aid the designer of SMRs located anywhere in the world.
More specifically, the application of this approach may lead to more proliferation-resistant and physically secure design features for SMRs.

  6. Italian Present-day Stress Indicators: IPSI Database

    NASA Astrophysics Data System (ADS)

    Mariucci, M. T.; Montone, P.

    2017-12-01

    In Italy, since the 1990s, research concerning the contemporary stress field has been carried out at the Istituto Nazionale di Geofisica e Vulcanologia (INGV) through local- and regional-scale studies. Over the years many data have been analysed and collected; they are now organized and available for easy end-use online. The IPSI (Italian Present-day Stress Indicators) database is the first geo-referenced repository of information on the crustal present-day stress field maintained at INGV, with the web application database and website developed by Gabriele Tarabusi. The data consist of horizontal stress orientations analysed and compiled in a standardized format and quality-ranked for reliability and comparability on a global scale with other databases. Our first database release includes 855 data records updated to December 2015. Here we present an updated version, to be released in 2018 after the entry of new earthquake data up to December 2017. The IPSI web site (http://ipsi.rm.ingv.it/) provides a standard map viewer and lets users easily choose which data (by category and/or quality) to plot. The main information for each element (type, quality, orientation) can be viewed simply by hovering over its symbol, and the full information appears on clicking the element. Basic information on the different data types, the tectonic regime assignment, and the quality-ranking method is available in pop-up windows. Data records can be downloaded in several common formats; moreover, a file can be downloaded for direct use with SHINE, a web-based application for interpolating stress orientations (http://shine.rm.ingv.it). IPSI is mainly conceived for those interested in studying the Italian peninsula and its surroundings, although the Italian data are also part of the World Stress Map (http://www.world-stress-map.org/), as evidenced by the many links that redirect to that database for details on standard practices in this field.

  7. Modules for Experiments in Stellar Astrophysics (MESA): Planets, Oscillations, Rotation, and Massive Stars

    NASA Astrophysics Data System (ADS)

    Paxton, Bill; Cantiello, Matteo; Arras, Phil; Bildsten, Lars; Brown, Edward F.; Dotter, Aaron; Mankovich, Christopher; Montgomery, M. H.; Stello, Dennis; Timmes, F. X.; Townsend, Richard

    2013-09-01

    We substantially update the capabilities of the open source software package Modules for Experiments in Stellar Astrophysics (MESA), and its one-dimensional stellar evolution module, MESA star. Improvements in MESA star's ability to model the evolution of giant planets now extend its applicability down to masses as low as one-tenth that of Jupiter. The dramatic improvement in asteroseismology enabled by the space-based Kepler and CoRoT missions motivates our full coupling of the ADIPLS adiabatic pulsation code with MESA star. This also motivates a numerical recasting of the Ledoux criterion that is more easily implemented when many nuclei are present at non-negligible abundances. This impacts the way in which MESA star calculates semi-convective and thermohaline mixing. We exhibit the evolution of 3-8 M ⊙ stars through the end of core He burning, the onset of He thermal pulses, and arrival on the white dwarf cooling sequence. We implement diffusion of angular momentum and chemical abundances that enable calculations of rotating-star models, which we compare thoroughly with earlier work. We introduce a new treatment of radiation-dominated envelopes that allows the uninterrupted evolution of massive stars to core collapse. This enables the generation of new sets of supernovae, long gamma-ray burst, and pair-instability progenitor models. We substantially modify the way in which MESA star solves the fully coupled stellar structure and composition equations, and we show how this has improved the scaling of MESA's calculational speed on multi-core processors. Updates to the modules for equation of state, opacity, nuclear reaction rates, and atmospheric boundary conditions are also provided. We describe the MESA Software Development Kit that packages all the required components needed to form a unified, maintained, and well-validated build environment for MESA. We also highlight a few tools developed by the community for rapid visualization of MESA star results.

  8. Using PIDs to Support the Full Research Data Publishing Lifecycle

    NASA Astrophysics Data System (ADS)

    Waard, A. D.

    2016-12-01

    Persistent identifiers can help support scientific research, track scientific impact and let researchers achieve recognition for their work. We discuss a number of ways in which Elsevier utilizes PIDs to support the scholarly lifecycle. To improve the process of storing and sharing data, Mendeley Data (http://data.mendeley.com) makes use of persistent identifiers to support the dynamic nature of data and software, tracking and recording the provenance and versioning of datasets. This system now allows the comparison of different versions of a dataset, to see precisely what was changed during a versioning update. To present research data in context for the reader, we include PIDs in research articles as hyperlinks: https://www.elsevier.com/books-and-journals/content-innovation/data-base-linking. In some cases, PIDs fetch data files from repositories, which allows the embedding of visualizations, e.g. with PANGAEA and PubChem: https://www.elsevier.com/books-and-journals/content-innovation/protein-viewer; https://www.elsevier.com/books-and-journals/content-innovation/pubchem. To normalize referenced data elements, the Resource Identification Initiative - which we developed together with members of the Force11 RRID group - introduces a unified standard for resource identifiers (RRIDs) that can easily be interpreted by both humans and text mining tools (https://www.force11.org/group/resource-identification-initiative/update-resource-identification-initiative), as can be seen in our Antibody Data app: https://www.elsevier.com/books-and-journals/content-innovation/antibody-data. To enable better citation practices and support a robust metrics system for sharing research data, we helped develop, and are early adopters of, the outputs of the Force11 Data Citation Principles and Implementation groups (https://www.force11.org/group/dcip). Lastly, through our work with the Research Data Alliance Publishing Data Services group, we helped create a set of guidelines (http://www.scholix.org/guidelines) and a demonstrator service (http://dliservice.research-infrastructures.eu/#/) for a linked-data network connecting datasets, articles, and individuals, all of which rely on robust PIDs.

  9. Alternatives to relational databases in precision medicine: Comparison of NoSQL approaches for big data storage using supercomputers

    NASA Astrophysics Data System (ADS)

    Velazquez, Enrique Israel

    Improvements in medical and genomic technologies have dramatically increased the production of electronic data over the last decade. As a result, data management is rapidly becoming a major determinant, and an urgent challenge, for the development of Precision Medicine. Although successful data management is achievable using Relational Database Management Systems (RDBMS), exponential data growth is a significant contributor to failure scenarios. Growing amounts of data can also be observed in other sectors, such as economics and business, which, together with the previous facts, suggests that alternative database approaches (NoSQL) may soon be required for efficient storage and management of big databases. However, this hypothesis has been difficult to test in the Precision Medicine field, since alternative database architectures are complex to assess and means to integrate heterogeneous electronic health records (EHR) with dynamic genomic data are not easily available. In this dissertation, we present a novel set of experiments for identifying NoSQL database approaches that enable effective data storage and management in Precision Medicine, using patients' clinical and genomic information from The Cancer Genome Atlas (TCGA). The first experiment draws on performance and scalability from biologically meaningful queries with differing complexity and database sizes. The second experiment measures performance and scalability in database updates without schema changes. The third experiment assesses performance and scalability in database updates with schema modifications due to dynamic data. We have identified two NoSQL approaches, based on Cassandra and Redis, which seem to be the ideal database management systems for our Precision Medicine queries in terms of performance and scalability. We present these NoSQL approaches and show how they can be used to manage clinical and genomic big data. Our research is relevant to public health, since we focus on one of the main challenges to the development of Precision Medicine and, consequently, investigate a potential solution to the progressively increasing demands on health care.
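The schema-flexibility contrast the experiments probe can be sketched in a few lines (a toy illustration with invented field names, not the dissertation's actual Cassandra/Redis benchmark): a fixed relational schema rejects records carrying new genomic fields until the schema is migrated, while a schema-less key-value layout absorbs them directly.

```python
# Relational-style table: every row must match a fixed column set.
COLUMNS = ("patient_id", "diagnosis", "gene", "expression")

def insert_relational(table, row):
    if set(row) != set(COLUMNS):
        raise ValueError("schema change required before insert")
    table.append(row)

# Key-value style (Redis/Cassandra-like): each record carries its own
# fields, so new genomic data types need no schema migration.
def insert_kv(store, key, record):
    store[key] = record  # arbitrary fields accepted

table, store = [], {}
insert_kv(store, "patient:42", {"diagnosis": "BRCA", "methylation": 0.73})
try:
    insert_relational(table, {"patient_id": 42, "methylation": 0.73})
except ValueError as err:
    print(err)  # the relational path rejects the new field
```

This is the kind of update-with-schema-change workload the third experiment stresses: the key-value path never blocks on a migration.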

  10. An Integrated Approach to Functional Engineering: An Engineering Database for Harness, Avionics and Software

    NASA Astrophysics Data System (ADS)

    Piras, Annamaria; Malucchi, Giovanni

    2012-08-01

    In the design and development phase of a new program, one of the critical aspects is the integration of all the functional requirements of the system and the control of overall consistency between the identified needs on one side and the available resources on the other, especially when neither the required needs nor the available resources are yet consolidated but evolve as the program matures. The Integrated Engineering Harness Avionics and Software database (IDEHAS) is a tool developed to support this process within the Avionics and Software disciplines through the different phases of the program. The tool is designed to allow an incremental build-up of the avionics and software systems, from the description of the high-level architectural data (available in the early stages of the program), to the definition of pin-to-pin connectivity information (typically consolidated in the design finalization stages), and finally to the construction and validation of the detailed telemetry parameters and commands to be used in the test phases and in the Mission Control Centre. The key feature of this approach and of the associated tool is that it allows the definition and the maintenance/update of all these data in a single, consistent environment. On one side, a system-level and concurrent approach requires the ability to easily integrate and update the best data available since the early stages of a program, in order to improve confidence in consistency and to control the design information. On the other side, the amount of information of different typologies and the cross-relationships among the data imply highly consolidated structures requiring many checks to guarantee data consistency, with negative effects on simplicity and flexibility, often limiting the attention given to special needs and to the interfaces with other disciplines.

  11. Pre-industrial and recent (1970-2010) atmospheric deposition of sulfate and mercury in snow on southern Baffin Island, Arctic Canada.

    PubMed

    Zdanowicz, Christian; Kruemmel, Eva; Lean, David; Poulain, Alexandre; Kinnard, Christophe; Yumvihoze, Emmanuel; Chen, JiuBin; Hintelmann, Holger

    2015-03-15

    Sulfate (SO4(2-)) and mercury (Hg) are airborne pollutants transported to the Arctic where they can affect properties of the atmosphere and the health of marine or terrestrial ecosystems. Detecting trends in Arctic Hg pollution is challenging because of the short period of direct observations, particularly of actual deposition. Here, we present an updated proxy record of atmospheric SO4(2-) and a new 40-year record of total Hg (THg) and monomethyl Hg (MeHg) deposition developed from a firn core (P2010) drilled from Penny Ice Cap, Baffin Island, Canada. The updated P2010 record shows stable mean SO4(2-) levels over the past 40 years, which is inconsistent with observations of declining atmospheric SO4(2-) or snow acidity in the Arctic during the same period. A sharp THg enhancement in the P2010 core ca 1991 is tentatively attributed to the fallout from the eruption of the Icelandic volcano Hekla. Although MeHg accumulation on Penny Ice Cap had remained constant since 1970, THg accumulation increased after the 1980s. This increase is not easily explained by changes in snow accumulation, marine aerosol inputs or air mass trajectories; however, a causal link may exist with the declining sea-ice cover conditions in the Baffin Bay sector. The ratio of THg accumulation between pre-industrial times (reconstructed from archived ice cores) and the modern industrial era is estimated at between 4- and 16-fold, which is consistent with estimates from Arctic lake sediment cores. The new P2010 THg record is the first of its kind developed from the Baffin Island region of the eastern Canadian Arctic and one of very few such records presently available in the Arctic. As such, it may help to bridge the knowledge gap linking direct observation of gaseous Hg in the Arctic atmosphere and actual net deposition and accumulation in various terrestrial media. Copyright © 2014 Elsevier B.V. All rights reserved.

  12. pyam: Python Implementation of YaM

    NASA Technical Reports Server (NTRS)

    Myint, Steven; Jain, Abhinandan

    2012-01-01

    pyam is a software development framework with tools for facilitating the rapid development of software in a concurrent software development environment. pyam provides solutions for development challenges associated with software reuse, managing multiple software configurations, developing software product lines, and multiple platform development and build management. pyam uses release-early, release-often development cycles to allow developers to integrate their changes incrementally into the system on a continual basis. It facilitates the creation and merging of branches to support the isolated development of immature software to avoid impacting the stability of the development effort. It uses modules and packages to organize and share software across multiple software products, and uses the concepts of link and work modules to reduce sandbox setup times even when the code-base is large. One side benefit is the enforcement of strong module-level encapsulation of a module's functionality and interface. This increases design transparency, system stability, and software reuse. pyam is written in Python and is organized as a set of utilities on top of the open source SVN software version control package. All development software is organized into a collection of modules. pyam packages are defined as sub-collections of the available modules. Developers can set up private sandboxes for module/package development. All module/package development takes place on private SVN branches. High-level pyam commands support the setup, update, and release of modules and packages. Released and pre-built versions of modules are available to developers. Developers can tailor the source/link module mix for their sandboxes so that new sandboxes (even large ones) can be built up easily and quickly by pointing to pre-existing module releases. All inter-module interfaces are publicly exported via links. A minimal, but uniform, convention is used for building modules.
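The link/work-module idea can be sketched as follows (module names and release paths are invented for illustration; this is not pyam's real API): a sandbox maps each module either to a local source checkout, which must be built, or to a pre-built release, which is reused as-is, so even large sandboxes come up quickly.

```python
# Hypothetical pre-built releases available to all developers.
RELEASES = {"Dynamics": "/releases/Dynamics/R3",
            "Graphics": "/releases/Graphics/R7"}

def resolve_sandbox(work_modules, link_modules):
    """Map each module to a local source dir (work) or a release (link)."""
    layout = {m: f"./src/{m}" for m in work_modules}       # built locally
    layout.update({m: RELEASES[m] for m in link_modules})  # reused as-is
    return layout

layout = resolve_sandbox(work_modules=["Control"],
                         link_modules=["Dynamics", "Graphics"])
# Only "Control" needs compiling; the other modules point at releases.
```

Tailoring the work/link mix is exactly the sandbox-setup trade-off the abstract describes.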

  13. 76 FR 16785 - Meeting for Software Developers on the Technical Specifications for Common Formats for Patient...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-25

    ... Software Developers on the Technical Specifications for Common Formats for Patient Safety Data Collection... designed as an interactive forum where PSOs and software developers can provide input on these technical... updated event descriptions, forms, and technical specifications for software developers. As an update to...

  14. Updates to the Demographic and Spatial Allocation Models to ...

    EPA Pesticide Factsheets

    EPA announced the availability of the draft report, Updates to the Demographic and Spatial Allocation Models to Produce Integrated Climate and Land Use Scenarios (ICLUS), for a 30-day public comment period. The ICLUS version 2 (v2) modeling tool furthered land change modeling by providing nationwide housing development scenarios up to 2100. ICLUS v2 includes updated population and land use data sets and addresses limitations identified in ICLUS v1 in both the migration and spatial allocation models. The companion user guide describes the development of ICLUS v2 and the updates made to the original data sets and the demographic and spatial allocation models. [2017 UPDATE] Get the latest version of ICLUS and stay up to date by signing up to the ICLUS mailing list. The GIS tool enables users to run SERGoM with the population projections developed for the ICLUS project and allows users to modify the spatial allocation of housing density across the landscape.

  15. Data management system for USGS/USEPA urban hydrology studies program

    USGS Publications Warehouse

    Doyle, W.H.; Lorens, J.A.

    1982-01-01

    A data management system was developed to store, update, and retrieve data collected in urban stormwater studies jointly conducted by the U.S. Geological Survey and U.S. Environmental Protection Agency in 11 cities in the United States. The data management system is used to retrieve and combine data from USGS data files for use in rainfall, runoff, and water-quality models and for data computations such as storm loads. The system is based on the data management aspect of the Statistical Analysis System (SAS) and was used to create all the data files in the data base. SAS is used for storage and retrieval of basin physiography, land-use, and environmental practices inventory data. Also, storm-event water-quality characteristics are stored in the data base. The advantages of using SAS to create and manage a data base are many; among them, it is simple and easy to use, contains a comprehensive statistical package, and can be used to modify files very easily. Data base system development has progressed rapidly during the last two decades, and the data management system concepts used in this study reflect the advances made in computer technology during this era. Urban stormwater data are, however, just one application for which the system can be used. (USGS)

  16. Kennedy Space Center Director Update

    NASA Image and Video Library

    2014-03-06

    CAPE CANAVERAL, Fla. - Community leaders, business executives, educators, and state and local government leaders were updated on NASA Kennedy Space Center programs and accomplishments during Center Director Bob Cabana’s Center Director Update at the Debus Center at the Kennedy Space Center Visitor Complex in Florida. Attendees mingled and visited various displays, including Ground Systems Development and Operations Program and Education Office displays. Attendees talked with Cabana and other senior Kennedy managers and visited displays featuring updates on Kennedy programs and projects, including International Space Station, Commercial Crew, Ground System Development and Operations, Launch Services, Center Planning and Development, Technology, KSC Swamp Works and NASA Education. The morning concluded with a tour of the new Space Shuttle Atlantis exhibit at the visitor complex. For more information, visit http://www.nasa.gov/kennedy. Photo credit: NASA/Daniel Casper

  17. Kennedy Space Center Director Update

    NASA Image and Video Library

    2014-03-06

    CAPE CANAVERAL, Fla. - Community leaders, business executives, educators, and state and local government leaders were updated on NASA Kennedy Space Center programs and accomplishments during Center Director Bob Cabana’s Center Director Update at the Debus Center at the Kennedy Space Center Visitor Complex in Florida. An attendee talks with engineers Jason Hopkins and Lisa Lutz, at the Ground Systems Development and Operations display. Attendees talked with Cabana and other senior Kennedy managers and visited displays featuring updates on Kennedy programs and projects, including International Space Station, Commercial Crew, Ground System Development and Operations, Launch Services, Center Planning and Development, Technology, KSC Swamp Works and NASA Education. The morning concluded with a tour of the new Space Shuttle Atlantis exhibit at the visitor complex. For more information, visit http://www.nasa.gov/kennedy. Photo credit: NASA/Daniel Casper

  18. Improving Spacecraft Data Visualization Using Splunk

    NASA Technical Reports Server (NTRS)

    Conte, Matthew

    2012-01-01

    EPOXI, like all spacecraft missions, receives large volumes of telemetry data from its spacecraft, DIF. It is extremely important for this data to be updated quickly and presented in a readable manner so that the flight team can monitor the status of the spacecraft. Existing DMD pages for monitoring spacecraft telemetry, while functional, are limited and do not take advantage of modern search technology. For instance, they display only current data points from instruments on the spacecraft and have limited graphing capabilities, making it difficult to see historical data. The DMD pages have fixed refresh rates, so the team must often wait several minutes to see the most recent data, even after it is received on the ground. The pages are also rigid and require an investment of time and money to update. To more easily organize and visualize spacecraft telemetry, the EPOXI team has begun experimenting with Splunk, a commercially available data-mining system. Splunk can take data received from the spacecraft's different data channels, often in different formats, and index all the data into a common format. Splunk allows flight team members to search through the different data formats from a single interface and to filter results by time range and data field, making it quick and easy to find specific spacecraft events. Furthermore, Splunk provides functions to create custom interfaces that help team members visualize the data in charts and graphs showing how the health of the spacecraft has changed over time. One of the goals of my internship with my mentor, Victor Hwang, was to develop new Splunk interfaces to replace the DMD pages and give the spacecraft team access to historical data and visualizations that were previously unavailable. The specific requirements of these pages are discussed in the next section.
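The indexing idea described above can be sketched in a few lines (channel names and field layouts here are hypothetical, not EPOXI's real telemetry formats): records arriving in different channel formats are recast into one common shape, so a single filter can span all of them.

```python
def normalize(channel, raw):
    """Convert a channel-specific record into a common indexed format."""
    if channel == "csv":                       # e.g. "100.5,battery_v,28.1"
        ts, field, value = raw.split(",")
        return {"time": float(ts), "field": field, "value": float(value)}
    if channel == "kv":                        # e.g. "t=101.0 f=battery_v v=28.0"
        pairs = dict(p.split("=") for p in raw.split())
        return {"time": float(pairs["t"]), "field": pairs["f"],
                "value": float(pairs["v"])}
    raise ValueError(f"unknown channel {channel!r}")

index = [normalize("csv", "100.5,battery_v,28.1"),
         normalize("kv", "t=101.0 f=battery_v v=28.0")]
# One query now spans both source formats:
recent = [r for r in index if r["field"] == "battery_v" and r["time"] > 100.0]
```

The payoff is the same as in the abstract: time-range and field filters work uniformly regardless of the original channel format.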

  19. Simultaneous state-parameter estimation supports the evaluation of data assimilation performance and measurement design for soil-water-atmosphere-plant system

    NASA Astrophysics Data System (ADS)

    Hu, Shun; Shi, Liangsheng; Zha, Yuanyuan; Williams, Mathew; Lin, Lin

    2017-12-01

    Improvements to agricultural water and crop management require detailed information on crop and soil states and their evolution. Data assimilation provides an attractive way of obtaining this information by integrating measurements with a model in a sequential manner. However, data assimilation for the soil-water-atmosphere-plant (SWAP) system still lacks comprehensive exploration, owing to the large number of variables and parameters in the system. In this study, simultaneous state-parameter estimation using an ensemble Kalman filter (EnKF) was employed to evaluate data assimilation performance and provide advice on measurement design for the SWAP system. The results demonstrated that a proper selection of the state vector is critical for effective data assimilation. In particular, updating the development stage avoided the negative effect of "phenological shift", which was caused by contrasting phenological stages in different ensemble members. The simultaneous state-parameter estimation (SSPE) assimilation strategy outperformed the updating-state-only (USO) strategy because of its ability to alleviate the inconsistency between model variables and parameters. However, the performance of the SSPE strategy could deteriorate with an increasing number of uncertain parameters, as a result of soil stratification and limited knowledge of crop parameters. In addition to the most easily available surface soil moisture (SSM) and leaf area index (LAI) measurements, deep soil moisture, grain yield or other auxiliary data were required to provide sufficient constraints on parameter estimation and to assure data assimilation performance. This study provides insight into the response of soil moisture and grain yield to data assimilation in the SWAP system and is helpful for soil moisture movement and crop growth modeling and for measurement design in practice.
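A minimal sketch of the joint state-parameter EnKF update the study employs (toy numbers, not the paper's SWAP configuration): the uncertain parameter is appended to the state vector, so a single Kalman update corrects both through their sampled covariance with the observed state.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200                                        # ensemble size
param = rng.normal(1.00, 0.20, N)              # uncertain model parameter
state = 0.30 * param + rng.normal(0, 0.02, N)  # model links state to parameter
ens = np.vstack([state, param])                # augmented [state; parameter]

H = np.array([[1.0, 0.0]])                     # observe the state only
R = 0.01 ** 2                                  # observation-error variance
y = 0.36                                       # e.g. a soil-moisture reading

A = ens - ens.mean(axis=1, keepdims=True)
P = A @ A.T / (N - 1)                          # sample covariance
K = P @ H.T / (H @ P @ H.T + R)                # Kalman gain (2x1)
obs = y + rng.normal(0, np.sqrt(R), N)         # perturbed observations
ens = ens + K @ (obs - H @ ens)                # updates state AND parameter
```

Because the parameter is never observed directly, it is corrected only through its ensemble correlation with the state, which is why the abstract finds the approach degrading as more weakly constrained parameters are added.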

  20. The `WikiGuidelines' smartphone application: Bridging the gaps in availability of evidence-based smartphone mental health applications.

    PubMed

    Zhang, Melvyn W B; Ho, Roger C M; Mcintyre, Roger S

    2016-07-27

    Over the past decade, there have been massive advances in technology. These advances have significantly transformed various aspects of healthcare. The advent of E-health and its influence on healthcare practice also imply a paradigm shift in the way healthcare professionals work. Conventionally, healthcare professionals would have to refer to books and journals for updates in treatment algorithms, but with the advent of technology they can access this information via the web or various smartphone applications on the go. In the field of psychiatry, one of the most common mental health disorders to date, with significant morbidity and mortality, is major depressive disorder. Routinely, clinicians and healthcare professionals are advised to refer to standard guidelines to guide their treatment options. Given the high prevalence of conditions like major depressive disorder, it is thus important that the guidelines clinicians and healthcare professionals refer to are constantly kept up to date, so that patients can benefit from the latest evidence-based therapy and treatment. A review of the current literature highlights that, whilst there are a multitude of smartphone applications designed for mental health care, a previous systematic review has highlighted a paucity of evidence-based applications. More importantly, the current literature on the provision of treatment information to healthcare professionals and patients is limited to web-based interventions. It is the aim of this technical note to highlight the methodology the authors have conceptualized in implementing an evidence-based mental health guideline application, known as the `WikiGuidelines' smartphone application. The authors hope to illustrate the algorithms behind the development of the application and how it can be easily updated by the guidelines working group.

  1. ASP-G: an ASP-based method for finding attractors in genetic regulatory networks

    PubMed Central

    Mushthofa, Mushthofa; Torres, Gustavo; Van de Peer, Yves; Marchal, Kathleen; De Cock, Martine

    2014-01-01

    Motivation: Boolean network models are suitable to simulate GRNs in the absence of detailed kinetic information. However, reducing the biological reality implies making assumptions on how genes interact (interaction rules) and how their state is updated during the simulation (update scheme). The exact choice of the assumptions largely determines the outcome of the simulations. In most cases, however, the biologically correct assumptions are unknown. An ideal simulation thus implies testing different rules and schemes to determine those that best capture an observed biological phenomenon. This is not trivial because most current methods to simulate Boolean network models of GRNs and to compute their attractors impose specific assumptions that cannot be easily altered, as they are built into the system. Results: To allow for a more flexible simulation framework, we developed ASP-G. We show the correctness of ASP-G in simulating Boolean network models and obtaining attractors under different assumptions by successfully recapitulating the detection of attractors of previously published studies. We also provide an example of how performing simulation of network models under different settings helps determine the assumptions under which a certain conclusion holds. The main added value of ASP-G is in its modularity and declarativity, making it more flexible and less error-prone than traditional approaches. The declarative nature of ASP-G comes at the expense of being slower than the more dedicated systems but still achieves a good efficiency with respect to computational time. Availability and implementation: The source code of ASP-G is available at http://bioinformatics.intec.ugent.be/kmarchal/Supplementary_Information_Musthofa_2014/asp-g.zip. Contact: Kathleen.Marchal@UGent.be or Martine.DeCock@UGent.be Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25028722
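The sensitivity to the update scheme that motivates ASP-G can be seen in a toy two-gene network (an illustration only, not ASP-G itself): the same interaction rules yield a cyclic attractor under synchronous updating but reach a fixed point under asynchronous updating.

```python
def rules(state):
    a, b = state
    return (not b, not a)              # mutual repression: a <- NOT b, b <- NOT a

def synchronous(state):
    return rules(state)                # both genes update simultaneously

def async_update(state, i):
    new = list(state)
    new[i] = rules(state)[i]           # only gene i updates
    return tuple(new)

# Synchronous updating from (ON, ON) gives a period-2 cycle attractor...
s1 = synchronous((True, True))         # (False, False)
s2 = synchronous(s1)                   # back to (True, True)

# ...while one asynchronous step from the same start lands on a fixed point.
fixed = async_update((True, True), 0)  # (False, True), stable under the rules
```

Which behaviour is "biologically correct" depends on assumptions about gene timing, which is exactly the choice ASP-G leaves open to the modeller.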

  2. LitPathExplorer: a confidence-based visual text analytics tool for exploring literature-enriched pathway models.

    PubMed

    Soto, Axel J; Zerva, Chrysoula; Batista-Navarro, Riza; Ananiadou, Sophia

    2018-04-15

    Pathway models are valuable resources that help us understand the various mechanisms underpinning complex biological processes. Their curation is typically carried out through manual inspection of published scientific literature to find information relevant to a model, which is a laborious and knowledge-intensive task. Furthermore, models curated manually cannot be easily updated and maintained with new evidence extracted from the literature without automated support. We have developed LitPathExplorer, a visual text analytics tool that integrates advanced text mining, semi-supervised learning and interactive visualization, to facilitate the exploration and analysis of pathway models using statements (i.e. events) extracted automatically from the literature and organized according to levels of confidence. LitPathExplorer supports pathway modellers and curators alike by: (i) extracting events from the literature that corroborate existing models with evidence; (ii) discovering new events which can update models; and (iii) providing a confidence value for each event that is automatically computed based on linguistic features and article metadata. Our evaluation of event extraction showed a precision of 89% and a recall of 71%. Evaluation of our confidence measure, when used for ranking sampled events, showed an average precision ranging between 61 and 73%, which can be improved to 95% when the user is involved in the semi-supervised learning process. Qualitative evaluation using pair analytics based on the feedback of three domain experts confirmed the utility of our tool within the context of pathway model exploration. LitPathExplorer is available at http://nactem.ac.uk/LitPathExplorer_BI/. sophia.ananiadou@manchester.ac.uk. Supplementary data are available at Bioinformatics online.

  3. Mouse IDGenes: a reference database for genetic interactions in the developing mouse brain

    PubMed Central

    Matthes, Michaela; Preusse, Martin; Zhang, Jingzhong; Schechter, Julia; Mayer, Daniela; Lentes, Bernd; Theis, Fabian; Prakash, Nilima; Wurst, Wolfgang; Trümbach, Dietrich

    2014-01-01

    The study of developmental processes in the mouse and other vertebrates includes the understanding of patterning along the anterior–posterior, dorsal–ventral and medial–lateral axis. Specifically, neural development is also of great clinical relevance because several human neuropsychiatric disorders such as schizophrenia, autism disorders or drug addiction and also brain malformations are thought to have neurodevelopmental origins, i.e. pathogenesis initiates during childhood and adolescence. Impacts during early neurodevelopment might also predispose to late-onset neurodegenerative disorders, such as Parkinson’s disease. The neural tube develops from its precursor tissue, the neural plate, in a patterning process that is determined by compartmentalization into morphogenetic units, the action of local signaling centers and a well-defined and locally restricted expression of genes and their interactions. While public databases provide gene expression data with spatio-temporal resolution, they usually neglect the genetic interactions that govern neural development. Here, we introduce Mouse IDGenes, a reference database for genetic interactions in the developing mouse brain. The database is highly curated and offers detailed information about gene expressions and the genetic interactions at the developing mid-/hindbrain boundary. To showcase the predictive power of interaction data, we infer new Wnt/β-catenin target genes by machine learning and validate one of them experimentally. The database is updated regularly. Moreover, it can easily be extended by the research community. Mouse IDGenes will contribute as an important resource to the research on mouse brain development, not exclusively by offering data retrieval, but also by allowing data input. Database URL: http://mouseidgenes.helmholtz-muenchen.de. PMID:25145340

  4. Mouse IDGenes: a reference database for genetic interactions in the developing mouse brain.

    PubMed

    Matthes, Michaela; Preusse, Martin; Zhang, Jingzhong; Schechter, Julia; Mayer, Daniela; Lentes, Bernd; Theis, Fabian; Prakash, Nilima; Wurst, Wolfgang; Trümbach, Dietrich

    2014-01-01

    The study of developmental processes in the mouse and other vertebrates includes the understanding of patterning along the anterior-posterior, dorsal-ventral and medial-lateral axis. Specifically, neural development is also of great clinical relevance because several human neuropsychiatric disorders such as schizophrenia, autism disorders or drug addiction and also brain malformations are thought to have neurodevelopmental origins, i.e. pathogenesis initiates during childhood and adolescence. Impacts during early neurodevelopment might also predispose to late-onset neurodegenerative disorders, such as Parkinson's disease. The neural tube develops from its precursor tissue, the neural plate, in a patterning process that is determined by compartmentalization into morphogenetic units, the action of local signaling centers and a well-defined and locally restricted expression of genes and their interactions. While public databases provide gene expression data with spatio-temporal resolution, they usually neglect the genetic interactions that govern neural development. Here, we introduce Mouse IDGenes, a reference database for genetic interactions in the developing mouse brain. The database is highly curated and offers detailed information about gene expressions and the genetic interactions at the developing mid-/hindbrain boundary. To showcase the predictive power of interaction data, we infer new Wnt/β-catenin target genes by machine learning and validate one of them experimentally. The database is updated regularly. Moreover, it can easily be extended by the research community. Mouse IDGenes will contribute as an important resource to the research on mouse brain development, not exclusively by offering data retrieval, but also by allowing data input. http://mouseidgenes.helmholtz-muenchen.de. © The Author(s) 2014. Published by Oxford University Press.

  5. A Markov model for planning and permitting offshore wind energy: A case study of radio-tracked terns in the Gulf of Maine, USA.

    PubMed

    Cranmer, Alexana; Smetzer, Jennifer R; Welch, Linda; Baker, Erin

    2017-05-15

    Quantifying and managing the potential adverse wildlife impacts of offshore wind energy is critical for developing offshore wind energy in a sustainable and timely manner, but poses a significant challenge, particularly for small marine birds that are difficult to monitor. We developed a discrete-time Markov model of seabird movement around a colony site parameterized by automated radio telemetry data from common terns (Sterna hirundo) and Arctic terns (S. paradisaea), and derived impact functions that estimate the probability of collision fatality as a function of the distance and bearing of wind turbines from a colony. Our purpose was to develop and demonstrate a new, flexible tool that can be used for specific management and wind-energy planning applications when adequate data are available, rather than inform wind-energy development at this site. We demonstrate how the tool can be used 1) in marine spatial planning exercises to quantitatively identify setback distances under development scenarios given a risk threshold, 2) to examine the ecological and technical trade-offs of development alternatives to facilitate negotiation between objectives, and 3) in the U.S. National Environmental Policy Act (NEPA) process to estimate collision fatality under alternative scenarios. We discuss model limitations and data needs, and highlight opportunities for future model extension and development. We present a highly flexible tool for wind energy planning that can be easily extended to other central place foragers and data sources, and can be updated and improved as new monitoring data arises. Copyright © 2017 Elsevier Ltd. All rights reserved.
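    The collision-risk calculation described above can be illustrated with a generic absorbing Markov chain. The sketch below is not the authors' telemetry-parameterized model: the states and transition probabilities are hypothetical, and a real application would discretize space around the colony and fit the transition matrix from tracking data.

```python
def absorption_probability(P, target, absorbing, tol=1e-12, max_iter=10000):
    """Probability of eventually being absorbed in `target` from each state.

    `P` maps state -> {next_state: probability}; `absorbing` lists all
    absorbing states (including `target`). Solved by value iteration on
    p[s] = sum_t P[s][t] * p[t], which converges for an absorbing chain.
    """
    p = {s: (1.0 if s == target else 0.0) for s in P}
    for _ in range(max_iter):
        delta = 0.0
        for s in P:
            if s in absorbing:
                continue
            new = sum(prob * p[t] for t, prob in P[s].items())
            delta = max(delta, abs(new - p[s]))
            p[s] = new
        if delta < tol:
            break
    return p

# Hypothetical 3-state movement chain: a bird near the colony either stays
# ("near"), collides with a turbine ("collide"), or leaves the area safely
# ("depart"); the last two states are absorbing.
P = {
    "near":    {"near": 0.5, "collide": 0.2, "depart": 0.3},
    "collide": {"collide": 1.0},
    "depart":  {"depart": 1.0},
}
p = absorption_probability(P, "collide", {"collide", "depart"})
# analytically, p["near"] = 0.2 / (1 - 0.5) = 0.4
```

    Deriving an impact function as in the abstract would amount to recomputing this absorption probability while moving the turbine state to different distances and bearings from the colony.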

  6. U.S. EPA'S RESEARCH TO UPDATE GUIDANCE FOR QUANTIFYING LANDFILL GAS EMISSIONS

    EPA Science Inventory

    Landfill emissions, if left uncontrolled, contribute to air toxics, climate change, tropospheric ozone, and urban smog. EPA's Office of Research and Development is conducting research to help update EPA's landfill gas emission factors. The last update to EPA's landfill gas emiss...

  7. Development of the updating executive function: From 7-year-olds to young adults.

    PubMed

    Carriedo, Nuria; Corral, Antonio; Montoro, Pedro R; Herrero, Laura; Rucián, Mercedes

    2016-04-01

    Updating information in working memory (WM) is a critical executive function responsible both for continuously replacing outdated information with new relevant data and for suppressing or inhibiting content that is no longer relevant according to task demands. The goal of the present research is twofold: First, we aimed to study updating development in 548 participants of 4 different age ranges--7-, 11-, and 15-year-olds and young adults--using the updating task devised by R. De Beni and P. Palladino (2004), which allows differentiating maintenance and inhibition processes. Second, we attempted to determine the relation between these processes across development as well as the differentiation among different types of inhibition processes tapped by this task. Results showed an improvement in memory performance with age along with an upgrading of inhibitory efficiency. However, whereas memory performance showed a progressive increase until the age of 15 years followed by stabilization, inhibition showed a continuous progressive increase until young adulthood. Importantly, results showed that development of the different inhibitory mechanisms does not progress equally. All the groups committed more errors related to inefficient suppression mechanisms in WM than errors related to control of long-term memory interference. Principal component analysis showed that updating implies different subprocesses: active maintenance/suppression of information in WM and control of proactive interference. Developmental trajectories showed that the maintenance/suppression component continues to develop far beyond adolescence but that proactive interference control is responsible for variations in updating across development. (c) 2016 APA, all rights reserved.

  8. GEOCAB Portal: A gateway for discovering and accessing capacity building resources in Earth Observation

    NASA Astrophysics Data System (ADS)

    Desconnets, Jean-Christophe; Giuliani, Gregory; Guigoz, Yaniss; Lacroix, Pierre; Mlisa, Andiswa; Noort, Mark; Ray, Nicolas; Searby, Nancy D.

    2017-02-01

    The discovery of and access to capacity building resources are often essential for conducting environmental projects based on Earth Observation (EO) resources, whether these are EO products, methodological tools, techniques, organizations that impart training in these techniques, or even projects that have shown practical achievements. Recognizing this opportunity and need, the European Commission, through two FP7 projects, teamed up with the Group on Earth Observations (GEO) and the Committee on Earth Observation Satellites (CEOS). The Global Earth Observation CApacity Building (GEOCAB) portal aims at compiling all current capacity building efforts on the use of EO data for societal benefits into an easily updateable and user-friendly portal. GEOCAB offers a faceted search to improve the user discovery experience, together with a fully interactive world map of all inventoried projects and activities. This paper focuses on the conceptual framework used to implement the underlying platform. An ISO 19115 metadata model and an associated terminological repository are the core elements that provide a semantic search application and an interoperable discovery service. The organization and contribution of different user communities to ensure the management and updating of GEOCAB's content are also addressed.

  9. The power of PowerPoint.

    PubMed

    Niamtu, J

    2001-08-01

    Carousel slide presentations have been used for academic and clinical presentations since the late 1950s. However, advances in computer technology have caused a paradigm shift, and digital presentations are quickly becoming the standard for clinical presentations. The advantages of digital presentations include cost savings; portability; easy updating capability; Internet access; multimedia functions, such as animation, pictures, video, and sound; and customization to augment audience interest and attention. Microsoft PowerPoint has emerged as the most popular digital presentation software and is currently used by many practitioners with and without significant computer expertise. The user-friendly platform of PowerPoint enables even the novice presenter to incorporate digital presentations into his or her profession. PowerPoint offers many advanced options that, with a minimal investment of time, can be used to create more interactive and professional presentations for lectures, patient education, and marketing. Examples of advanced PowerPoint applications are presented in a stepwise manner to unveil the full power of PowerPoint. By incorporating these techniques, medical practitioners can easily personalize, customize, and enhance their PowerPoint presentations. Complications, pitfalls, and caveats are discussed to help presenters avoid misadventures in digital presentations. Relevant Web sites are listed for further updating, customizing, and communicating PowerPoint techniques.

  10. Carcinogenicity of chromium and chemoprevention: a brief update

    PubMed Central

    Gu, Yuanliang; Song, Xin; Zhao, Jinshun

    2017-01-01

    Chromium has two main valence states: hexavalent chromium (Cr[VI]) and trivalent chromium (Cr[III]). Cr(VI), a well-established human carcinogen, can enter cells by way of a sulfate/phosphate anion-transport system, and then be reduced to lower-valence intermediates consisting of pentavalent chromium (Cr[V]), tetravalent chromium (Cr[IV]) or Cr(III) via cellular reductants. These intermediates may directly or indirectly result in DNA damage or DNA–protein cross-links. Although Cr(III) complexes cannot pass easily through cell membranes, they have the ability to accumulate around cells to induce cell-surface morphological alteration and result in cell-membrane lipid injuries via disruption of cellular functions and integrity, and finally to cause DNA damage. In recent years, more research, including in vitro, in vivo, and epidemiological studies, has been conducted to evaluate the genotoxicity/carcinogenicity induced by Cr(VI) and/or Cr(III) compounds. At the same time, various therapeutic agents, especially antioxidants, have been explored through in vitro and in vivo studies for preventing chromium-induced genotoxicity/carcinogenesis. This review aims to provide a brief update on the carcinogenicity of Cr(VI) and Cr(III) and chemoprevention with different antioxidants. PMID:28860815

  11. A recursive technique for adaptive vector quantization

    NASA Technical Reports Server (NTRS)

    Lindsay, Robert A.

    1989-01-01

    Vector Quantization (VQ) is fast becoming an accepted, if not preferred, method for image compression. VQ performs well when compressing all types of imagery including Video, Electro-Optical (EO), Infrared (IR), Synthetic Aperture Radar (SAR), Multi-Spectral (MS), and digital map data. The only requirement is to change the codebook to switch the compressor from one image sensor to another. There are several approaches for designing codebooks for a vector quantizer. Adaptive Vector Quantization is a procedure that simultaneously designs codebooks as the data is being encoded or quantized. This is done by computing the centroid as a recursive moving average, where the centroids move after every vector is encoded. For a fixed set of vectors, this recursive calculation yields the same centroid as a conventional batch computation. This method of centroid calculation can be easily combined with VQ encoding techniques. The defined quantizer changes after every encoded vector by recursively updating the minimum-distance centroid selected by the encoder. Since the quantizer is changing definition or state after every encoded vector, the decoder must now receive updates to the codebook. This is done as side information by multiplexing bits into the compressed source data.
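    The recursive moving-average update described above can be sketched in a few lines. This is a toy illustration tracking a single codeword, not the full adaptive VQ encoder; the running-mean identity it relies on is standard.

```python
def update_centroid(centroid, count, vec):
    """Recursive moving average: after one more vector is encoded, move the
    centroid toward it by 1/(count+1) of the difference. For a fixed set of
    vectors this reproduces the batch mean exactly."""
    count += 1
    centroid = [c + (v - c) / count for c, v in zip(centroid, vec)]
    return centroid, count

# Stream a few 2-D vectors through the update.
centroid, n = [0.0, 0.0], 0
for vec in [[1.0, 2.0], [3.0, 4.0], [5.0, 0.0]]:
    centroid, n = update_centroid(centroid, n, vec)
# centroid now equals the batch mean of the three vectors, [3.0, 2.0]
```

    In a full adaptive VQ, this update would be applied to the minimum-distance codeword after each encoded vector, with the decoder applying the same update from the multiplexed side information.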

  12. Effect of display update interval, update type, and background on perception of aircraft separation on a cockpit display of traffic information

    NASA Technical Reports Server (NTRS)

    Jago, S.; Baty, D.; Oconnor, S.; Palmer, E.

    1981-01-01

    The concept of a cockpit display of traffic information (CDTI) includes the integration of air traffic, navigation, and other pertinent information in a single electronic display in the cockpit. Concise display symbology was developed for use in later full-mission simulator evaluations of the CDTI concept. Experimental variables included the update interval of the aircraft motion, the update type (that is, whether or not the two aircraft were updated at the same interval), the background (grid pattern or no background), and the encounter type (straight or curved). Only the type of encounter affected performance.

  13. 75 FR 62501 - Senior Executive Service Performance Review Board: Update

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-12

    ... AGENCY FOR INTERNATIONAL DEVELOPMENT Senior Executive Service Performance Review Board: Update... Development, Office of Inspector General's Senior Executive Service Performance Review Board. DATES: September... reference-- USAID OIG Senior Executive Service (SES) Performance Review Board). SUPPLEMENTARY INFORMATION: 5...

  14. Safe Surgery Trainer

    DTIC Science & Technology

    2014-11-15

    design, testing, and development. b) Prototype Development – Continue developing SST software, game-flow, and mechanics. Continue developing art...refined learning objectives into measurement outlines. Update IRB submissions, edit usability game play study, and update I/ITSEC IRB. Provide case...minimal or near zero. 9) Related Activities a) Presenting at the Design of Learning Games Community Workshop, at I/ITSEC, Wednesday, Dec 3rd

  15. Inter-firm Networks, Organizational Learning and Knowledge Updating: An Empirical Study

    NASA Astrophysics Data System (ADS)

    Zhang, Su-rong; Wang, Wen-ping

    In the era of the knowledge-based economy, in which information technology develops rapidly, the rate of knowledge updating has become a critical factor for enterprises in gaining competitive advantage. We build an interactional theoretical model among inter-firm networks, organizational learning and knowledge updating, and demonstrate it with an empirical study. The results show that inter-firm networks and organizational learning are the sources of knowledge updating.

  16. Country Update: Israel 2005

    ERIC Educational Resources Information Center

    Marar, Marianne Maurice

    2005-01-01

    Country Updates is a new section of "Intercultural Education." Starting in "Intercultural Education," Volume 16 No. 5, this column will focus on recent developments during the last two to three years in the field of intercultural education in one particular country. These updates can include recent policy decisions, the main…

  17. Improving Safety on the International Space Station: Transitioning to Electronic Emergency Procedure Books on the International Space Station

    NASA Technical Reports Server (NTRS)

    Carter-Journet, Katrina; Clahoun, Jessica; Morrow, Jason; Duncan, Gary

    2012-01-01

    The National Aeronautics and Space Administration (NASA) originally designed the International Space Station (ISS) to operate until 2015, but has extended operations until at least 2020. As part of this very dynamic Program, there is an effort underway to simplify the certification of Commercial-off-the-Shelf (COTS) hardware. This change in paradigm allows the ISS Program to take advantage of technologically savvy and commercially available hardware, such as the iPad. The iPad, a line of tablet computers designed and marketed by Apple Inc., was chosen to support this endeavor. The iPad is functional, portable, and can be easily accessed in an emergency situation. The iPad Electronic Flight Bag (EFB), currently approved for use in flight by the Federal Aviation Administration (FAA), is a fraction of the cost of a traditional Class 2 EFB. In addition, the iPad's ability to use electronic aeronautical data in lieu of paper en route charts and approach plates can cut the annual cost of paper data in half for commercial airlines. ISS may be able to benefit from this type of trade since one of the most important factors considered is information management. Emergency procedures onboard the ISS are currently available to the crew in paper form. Updates to the emergency books can either be launched on an upcoming visiting vehicle, such as a Russian Soyuz flight, or printed using the onboard ISS printer. In both cases, it is costly to update hardcopy procedures. A new operations concept was proposed to allow for the use of a tablet system that would provide a flexible platform to support space station crew operations. The purpose of the system would be to provide the crew the ability to view and maintain operational data, such as emergency procedures, while also allowing Mission Control Houston to update the procedures. The ISS Program is currently evaluating the safety risks associated with the use of iPads versus paper.
Paper products can contribute to the flammability risk and require manual updates that take time away from research tasks. The ISS program has recently purchased three iPads for the astronauts and the certification has been approved. The crew is currently using the iPads onboard. The results of this analysis could be used to discern whether the iPad is a viable option for use in emergencies by assessing the risk posture through the development of a quantitative probabilistic risk assessment (PRA).

  18. Leveraging High Resolution Topography for Education and Outreach: Updates to OpenTopography to make EarthScope and Other Lidar Datasets more Prominent in Geoscience Education

    NASA Astrophysics Data System (ADS)

    Kleber, E.; Crosby, C. J.; Arrowsmith, R.; Robinson, S.; Haddad, D. E.

    2013-12-01

    The use of Light Detection and Ranging (lidar) derived topography has become an indispensable tool in Earth science research. The collection of high-resolution lidar topography from an airborne or terrestrial platform allows landscapes and landforms to be represented at sub-meter resolution and in three dimensions. In addition to its high value for scientific research, lidar-derived topography has tremendous potential as a tool for Earth science education. Recent science education initiatives and a community call for access to research-level data make the time ripe to expose lidar data and derived data products as a teaching tool. High-resolution topographic data fosters several Disciplinary Core Ideas (DCIs) of the Next Generation Science Standards (NGSS, 2013), presents Big Ideas of the new community-driven Earth Science Literacy Initiative (ESLI, 2009), and addresses a number of National Science Education Standards (NSES, 1996) and Benchmarks for Science Literacy (AAAS, 1993) for undergraduate physical and environmental Earth science classes. The spatial context of lidar data complements concepts like visualization, place-based learning, inquiry-based teaching and active learning essential to teaching in the geosciences. As official host to EarthScope lidar datasets for tectonically active areas in the western United States, the NSF-funded OpenTopography facility provides user-friendly access to a wealth of data that is easily incorporated into Earth science educational materials. OpenTopography (www.opentopography.org), in collaboration with EarthScope, has developed education and outreach activities to foster teacher, student and researcher utilization of lidar data. These educational resources use lidar data coupled with free tools such as Google Earth to provide a means for students and the interested public to visualize and explore Earth's surface in an interactive manner not possible with most other remotely sensed imagery. 
The education section of the OpenTopography portal has recently been strengthened with the addition of several new resources and the re-organization of existing content for easy discovery. New resources include a detailed frequently asked questions (FAQ) section, updated 'How-to' videos for downloading data from OpenTopography and additional webpages aimed at students, educators and researchers leveraging existing and updated resources from OpenTopography, EarthScope and other organizations. In addition, the OpenLandform catalog, an online collection of classic geologic landforms depicted in lidar, has been updated to include additional tectonic landforms from EarthScope lidar datasets.

  19. Monitoring road safety development at regional level: A case study in the ASEAN region.

    PubMed

    Chen, Faan; Wang, Jianjun; Wu, Jiaorong; Chen, Xiaohong; Zegras, P Christopher

    2017-09-01

    Persistent monitoring of progress, evaluating the results of interventions and recalibrating to achieve continuous improvement over time is widely recognized as crucial to the successful development of road safety. In the ASEAN (Association of Southeast Asian Nations) region there is a lack of well-resourced teams of multidisciplinary safety professionals and specialists in individual countries who are able to carry out this work effectively. In this context, not only must the monitoring framework be effective, it must also be easy to use and adapt. This paper provides a case study that can be easily reproduced, based on an updated and refined Road Safety Development Index (RSDI) computed by means of the RSR (rank-sum ratio)-based model, for monitoring and reporting road safety development at the regional level. The case study focused on the road safety achievements of eleven Southeast Asian countries, identifying areas of poor performance, potential problems and delays. These countries are finally grouped into several classes based on an overview of their progress and achievements regarding road safety. The results allow policymakers to better understand their own road safety progress toward their desired impact; more importantly, they enable necessary interventions to be made in a quick and timely manner, keeping action plans on schedule when things are not progressing as desired. This avoids 'reinventing the wheel' and trial-and-error approaches to road safety, making the implementation of action plans more effective. Copyright © 2017 Elsevier Ltd. All rights reserved.
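    The rank-sum ratio (RSR) aggregation named above can be sketched in a minimal form. This is a simplified illustration assuming higher scores are better; a real RSDI calculation would also handle tied ranks, indicator weights and direction of each indicator, which are omitted here.

```python
def rank_sum_ratio(scores):
    """Rank-sum ratio for n units evaluated on m indicators.

    `scores` is a list of per-unit indicator lists (higher = better).
    Each indicator is ranked 1..n across units, and
    RSR_i = (sum of unit i's ranks) / (m * n), a value in (0, 1].
    """
    n, m = len(scores), len(scores[0])
    ranks = [[0] * m for _ in range(n)]
    for j in range(m):
        order = sorted(range(n), key=lambda i: scores[i][j])
        for r, i in enumerate(order, start=1):
            ranks[i][j] = r            # worst unit gets rank 1, best gets n
    return [sum(row) / (m * n) for row in ranks]

# Three hypothetical countries scored on two road-safety indicators.
rsr = rank_sum_ratio([[0.9, 0.8], [0.5, 0.6], [0.2, 0.1]])
# rsr ranks the first country best (1.0) and the third worst (1/3)
```

    Grouping countries into classes, as the case study does, would then amount to thresholding or clustering these RSR values.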

  20. Kennedy Space Center Director Update

    NASA Image and Video Library

    2014-03-06

    CAPE CANAVERAL, Fla. - Community leaders, business executives, educators, and state and local government leaders were updated on NASA Kennedy Space Center programs and accomplishments during KSC Center Director Bob Cabana’s Center Director Update at the Debus Center at the Kennedy Space Center Visitor Complex in Florida. Attendees talked with Cabana and other senior Kennedy managers and visited displays featuring updates on Kennedy programs and projects, including International Space Station, Commercial Crew, Ground System Development and Operations, Launch Services, Center Planning and Development, Technology, KSC Swamp Works and NASA Education. The morning concluded with a tour of the new Space Shuttle Atlantis exhibit at the visitor complex. For more information, visit http://www.nasa.gov/kennedy. Photo credit: NASA/Daniel Casper

  2. 78 FR 53773 - Select Updates for Non-Clinical Engineering Tests and Recommended Labeling for Intravascular...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-30

    ...] Select Updates for Non-Clinical Engineering Tests and Recommended Labeling for Intravascular Stents and... Engineering Tests and Recommended Labeling for Intravascular Stents and Associated Delivery Systems.'' FDA has developed this guidance to inform the coronary and peripheral stent industry about selected updates to FDA's...

  3. 10 CFR 474.5 - Review and Update

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 3 2014-01-01 2014-01-01 false Review and Update 474.5 Section 474.5 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION ELECTRIC AND HYBRID VEHICLE RESEARCH, DEVELOPMENT, AND DEMONSTRATION PROGRAM; PETROLEUM-EQUIVALENT FUEL ECONOMY CALCULATION § 474.5 Review and Update The Department will review part 474...

  4. 10 CFR 474.5 - Review and Update

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 3 2012-01-01 2012-01-01 false Review and Update 474.5 Section 474.5 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION ELECTRIC AND HYBRID VEHICLE RESEARCH, DEVELOPMENT, AND DEMONSTRATION PROGRAM; PETROLEUM-EQUIVALENT FUEL ECONOMY CALCULATION § 474.5 Review and Update The Department will review part 474...

  5. 10 CFR 474.5 - Review and Update

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 3 2013-01-01 2013-01-01 false Review and Update 474.5 Section 474.5 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION ELECTRIC AND HYBRID VEHICLE RESEARCH, DEVELOPMENT, AND DEMONSTRATION PROGRAM; PETROLEUM-EQUIVALENT FUEL ECONOMY CALCULATION § 474.5 Review and Update The Department will review part 474...

  6. 10 CFR 474.5 - Review and Update

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 3 2011-01-01 2011-01-01 false Review and Update 474.5 Section 474.5 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION ELECTRIC AND HYBRID VEHICLE RESEARCH, DEVELOPMENT, AND DEMONSTRATION PROGRAM; PETROLEUM-EQUIVALENT FUEL ECONOMY CALCULATION § 474.5 Review and Update The Department will review Part 474...

  7. 10 CFR 474.5 - Review and Update

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 3 2010-01-01 2010-01-01 false Review and Update 474.5 Section 474.5 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION ELECTRIC AND HYBRID VEHICLE RESEARCH, DEVELOPMENT, AND DEMONSTRATION PROGRAM; PETROLEUM-EQUIVALENT FUEL ECONOMY CALCULATION § 474.5 Review and Update The Department will review Part 474...

  8. 78 FR 22540 - Notice of Public Meeting/Webinar: EPA Method Development Update on Drinking Water Testing Methods...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-16

    ...: EPA Method Development Update on Drinking Water Testing Methods for Contaminant Candidate List... Division will describe methods currently in development for many CCL contaminants, with an expectation that several of these methods will support future cycles of the Unregulated Contaminant Monitoring Rule (UCMR...

  9. Spatio-Semantic Comparison of Large 3d City Models in Citygml Using a Graph Database

    NASA Astrophysics Data System (ADS)

    Nguyen, S. H.; Yao, Z.; Kolbe, T. H.

    2017-10-01

    A city may have multiple CityGML documents recorded at different times or surveyed by different users. To analyse the city's evolution over a given period of time, as well as to update or edit the city model without negating modifications made by other users, it is of utmost importance to first compare, detect and locate spatio-semantic changes between CityGML datasets. This is however difficult due to the fact that CityGML elements belong to a complex hierarchical structure containing multi-level deep associations, which can essentially be considered a graph. Moreover, CityGML allows multiple syntactic ways to define an object, leading to syntactic ambiguities in the exchange format. Furthermore, CityGML is capable of including not only 3D urban objects' graphical appearances but also their semantic properties. Since, to date, no known algorithm is capable of detecting spatio-semantic changes in CityGML documents, a frequent approach is to replace the older models completely with the newer ones, which not only costs computational resources but also loses track of collaborative and chronological changes. Thus, this research proposes an approach capable of comparing two arbitrarily large CityGML documents on both the semantic and the geometric level. Detected deviations are then attached to their respective sources and can easily be retrieved on demand. As a result, updating a 3D city model using this approach is much more efficient, as only real changes are committed. To achieve this, the research employs a graph database as the main data structure for storing and processing CityGML datasets in three major steps: mapping, matching and updating. The mapping process transforms input CityGML documents into respective graph representations. The matching process compares these graphs and attaches edit operations on the fly. Found changes can then be executed using the Web Feature Service (WFS), the standard interface for updating geographical features across the web.
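    The "only real changes are committed" idea can be illustrated with a toy diff keyed on stable object ids. This sketch flattens each city object into an attribute dictionary; the actual approach matches full CityGML hierarchies inside a graph database, and the object and attribute names below are hypothetical.

```python
def diff_city_models(old, new):
    """Toy spatio-semantic diff of two city models, each a dict mapping
    object id -> {attribute: value} (geometry flattened into attributes).
    Returns edit operations instead of replacing the whole model."""
    ops = []
    for oid in old.keys() - new.keys():
        ops.append(("delete", oid, None))
    for oid in new.keys() - old.keys():
        ops.append(("insert", oid, new[oid]))
    for oid in old.keys() & new.keys():
        changed = {k: new[oid].get(k)
                   for k in set(old[oid]) | set(new[oid])
                   if old[oid].get(k) != new[oid].get(k)}
        if changed:
            ops.append(("update", oid, changed))
    return sorted(ops, key=lambda t: (t[0], t[1]))

old = {"bldg1": {"height": 10.0, "roofType": "flat"},
       "bldg2": {"height": 7.5}}
new = {"bldg1": {"height": 12.0, "roofType": "flat"},
       "bldg3": {"height": 4.0}}
ops = diff_city_models(old, new)
# bldg1 is updated, bldg2 deleted, bldg3 inserted; unchanged data is untouched
```

    In the paper's pipeline these edit operations would be attached to the graph during matching and later executed through a WFS transaction rather than applied to plain dictionaries.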

  10. 75 FR 53643 - Examination Guidelines Update: Developments in the Obviousness Inquiry After KSR v. Teleflex

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-01

    ...The United States Patent and Trademark Office (USPTO or Office) is issuing an update (2010 KSR Guidelines Update) to its obviousness guidelines for its personnel to be used when applying the law of obviousness under 35 U.S.C. 103. This 2010 KSR Guidelines Update highlights case law developments on obviousness under 35 U.S.C. 103 since the 2007 decision by the United States Supreme Court (Supreme Court) in KSR Int'l Co. v. Teleflex Inc. These guidelines are intended to be used by Office personnel in conjunction with the guidance in the Manual of Patent Examining Procedure when applying the law of obviousness under 35 U.S.C. 103. Members of the public are invited to provide comments on the 2010 KSR Guidelines Update. The Office is especially interested in receiving suggestions of recent decisional law in the field of obviousness that would have particular value as teaching tools.

  11. An Online Risk Monitor System (ORMS) to Increase Safety and Security Levels in Industry

    NASA Astrophysics Data System (ADS)

    Zubair, M.; Rahman, Khalil Ur; Hassan, Mehmood Ul

    2013-12-01

    The main idea of this research is to develop an Online Risk Monitor System (ORMS) based on Living Probabilistic Safety Assessment (LPSA). The article highlights the essential features and functions of ORMS. The basic models and modules, such as the Reliability Data Update Model (RDUM), running-time update, redundant-system unavailability update, Engineered Safety Features (ESF) unavailability update and general system update, are described in this study. ORMS not only provides quantitative analysis but also highlights qualitative aspects of risk measures. ORMS is capable of automatically updating the online risk models and reliability parameters of equipment. ORMS can support the decision-making processes of operators and managers in nuclear power plants.
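    The abstract does not specify the mathematics behind the Reliability Data Update Model; a common approach in living PSA is Bayesian updating of a component failure rate under a conjugate gamma prior, sketched below with hypothetical numbers rather than ORMS's actual model.

```python
def update_failure_rate(prior_alpha, prior_beta, failures, hours):
    """Bayesian update of a constant failure rate (per hour) with a conjugate
    gamma prior: alpha counts pseudo-failures, beta counts pseudo-hours of
    observation. Posterior mean = (alpha + failures) / (beta + hours)."""
    post_alpha = prior_alpha + failures
    post_beta = prior_beta + hours
    return post_alpha, post_beta, post_alpha / post_beta

# Generic prior of 1 failure per 1000 h, then 2 failures observed in 500 h.
a, b, rate = update_failure_rate(1.0, 1000.0, 2, 500.0)
# posterior mean rate = 3 / 1500 = 0.002 per hour
```

    An online monitor can apply such an update automatically as new operating hours and failure counts arrive, then feed the refreshed parameters back into the risk model.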

  12. Geometric database maintenance using CCTV cameras and overlay graphics

    NASA Astrophysics Data System (ADS)

    Oxenberg, Sheldon C.; Landell, B. Patrick; Kan, Edwin

    1988-01-01

    An interactive graphics system using closed circuit television (CCTV) cameras for remote verification and maintenance of a geometric world model database has been demonstrated in GE's telerobotics testbed. The database provides geometric models and locations of objects viewed by CCTV cameras and manipulated by telerobots. To update the database, an operator uses the interactive graphics system to superimpose a wireframe line drawing of an object with known dimensions on a live video scene containing that object. The methodology used is multipoint positioning to easily superimpose a wireframe graphic on the CCTV image of an object in the work scene. An enhanced version of GE's interactive graphics system will provide the object designation function for the operator control station of the Jet Propulsion Laboratory's telerobot demonstration system.

  13. TRAPPIST-1 Compared to Jovian Moons and Inner Solar System - Updated Feb. 2018

    NASA Image and Video Library

    2018-02-05

    All seven planets discovered in orbit around the red dwarf star TRAPPIST-1 could easily fit inside the orbit of Mercury, the innermost planet of our solar system. In fact, they would have room to spare. TRAPPIST-1 also is only a fraction of the size of our Sun; it isn't much larger than Jupiter. So, the TRAPPIST-1 system's proportions look more like Jupiter and its moons than those of our solar system. The seven planets of TRAPPIST-1 are all Earth-sized and terrestrial. TRAPPIST-1 is an ultra-cool dwarf star in the constellation Aquarius, and its planets orbit very close to it. https://photojournal.jpl.nasa.gov/catalog/PIA22096

  14. Recent Updates to the CFD General Notation System (CGNS)

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.; Wedan, Bruce; Hauser, Thomas; Poinot, Marc

    2012-01-01

    The CFD General Notation System (CGNS) - a general, portable, and extensible standard for the storage and retrieval of computational fluid dynamics (CFD) analysis data - has been in existence for more than a decade (Version 1.0 was released in May 1998). Both structured and unstructured CFD data are covered by the standard, and CGNS can be easily extended to cover any sort of data imaginable, while retaining backward compatibility with existing CGNS data files and software. Although originally designed for CFD, it is readily extendable to any field of computational analysis. In early 2011, CGNS Version 3.1 was released, which added significant capabilities. This paper describes these recent enhancements and highlights the continued usefulness of the CGNS methodology.
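
The extensibility described above comes from CGNS's data model: a tree of typed nodes, each carrying a name, a SIDS label (such as `Zone_t`), optional data, and children, so new node types can be added without disturbing existing readers. A toy in-memory sketch of that model (class and method names are invented for illustration; real CGNS trees live in ADF or HDF5 containers accessed through the CGNS mid-level library):

```python
class CgnsNode:
    """Minimal sketch of CGNS's tree-of-nodes data model (illustrative;
    not the actual CGNS library API)."""

    def __init__(self, name, label, data=None):
        self.name, self.label, self.data = name, label, data
        self.children = []

    def add(self, child):
        """Attach a child node and return it, for chaining."""
        self.children.append(child)
        return child

    def find(self, path):
        """Look up a descendant by a '/'-separated path of node names."""
        node = self
        for part in path.strip("/").split("/"):
            node = next(c for c in node.children if c.name == part)
        return node
```

A reader that ignores unfamiliar labels simply skips those subtrees, which is how backward compatibility is preserved as the standard grows.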

  15. Complex amplitude reconstruction by iterative amplitude-phase retrieval algorithm with reference

    NASA Astrophysics Data System (ADS)

    Shen, Cheng; Guo, Cheng; Tan, Jiubin; Liu, Shutian; Liu, Zhengjun

    2018-06-01

    Multi-image iterative phase retrieval methods have been successfully applied in many research fields due to their simple but efficient implementation. However, there is a mismatch between the measurement of the first long imaging distance and the sequential interval. In this paper, an amplitude-phase retrieval algorithm with reference is put forward without additional measurements or a priori knowledge. It eliminates the need to measure the first imaging distance. With a designed update formula, it significantly increases the convergence speed and the reconstruction fidelity, especially in phase retrieval. Its superiority over the original amplitude-phase retrieval (APR) method is validated by numerical analysis and experiments. Furthermore, it provides a conceptual design of a compact holographic image sensor, which can achieve numerical refocusing easily.
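
The paper's APR-with-reference update formula is not reproduced in the abstract; the family of methods it improves on descends from the classic Gerchberg-Saxton two-plane iteration, sketched below with a far-field (Fourier) propagator standing in for free-space propagation. This is a generic sketch, not the authors' algorithm:

```python
import numpy as np

def gerchberg_saxton(source_amp, target_amp, iterations=200, seed=0):
    """Classic two-plane iterative phase retrieval (Gerchberg-Saxton).

    Alternates between the object plane and the Fourier (measurement)
    plane, enforcing the known amplitude in each while keeping the
    current phase estimate."""
    rng = np.random.default_rng(seed)
    phase = rng.uniform(0.0, 2.0 * np.pi, source_amp.shape)
    field = source_amp * np.exp(1j * phase)
    for _ in range(iterations):
        far = np.fft.fft2(field)                       # propagate forward
        far = target_amp * np.exp(1j * np.angle(far))  # enforce measured amplitude
        field = np.fft.ifft2(far)                      # propagate back
        field = source_amp * np.exp(1j * np.angle(field))  # enforce object amplitude
    return np.angle(field)  # recovered object-plane phase
```

Reference-assisted variants such as the one proposed here modify the amplitude constraints and the update formula to speed up convergence and remove the distance measurement.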

  16. Novel conformal technique to reduce staircasing artifacts at material boundaries for FDTD modeling of the bioheat equation.

    PubMed

    Neufeld, E; Chavannes, N; Samaras, T; Kuster, N

    2007-08-07

    The modeling of thermal effects, often based on the Pennes Bioheat Equation, is becoming increasingly popular. The FDTD technique commonly used in this context suffers considerably from staircasing errors at boundaries. A new conformal technique is proposed that can easily be integrated into existing implementations without requiring a special update scheme. It scales fluxes at interfaces with factors derived from the local surface normal. The new scheme is validated using an analytical solution, and an error analysis is performed to understand its behavior. The new scheme behaves considerably better than the standard scheme. Furthermore, in contrast to the standard scheme, more accurate solutions can be obtained with it by increasing the grid resolution.
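
For context, the standard explicit finite-difference update of the Pennes bioheat equation that such a conformal technique plugs into can be sketched in 1D as below. The material constants are illustrative tissue-like values, not from the paper, and the paper's contribution (scaling interface fluxes by factors from the local surface normal) is not shown:

```python
import numpy as np

# Illustrative tissue-like constants (hypothetical, not from the paper)
k = 0.5        # thermal conductivity, W/(m K)
rho_c = 3.6e6  # volumetric heat capacity rho*c, J/(m^3 K)
perf = 2000.0  # blood perfusion term rho_b*c_b*w, W/(m^3 K)
T_art = 37.0   # arterial blood temperature, deg C
Q = 500.0      # metabolic heat source, W/m^3

def pennes_step(T, dx, dt):
    """One explicit finite-difference update of the 1D Pennes bioheat
    equation, with fixed (Dirichlet) temperatures at both ends."""
    lap = (T[:-2] - 2.0 * T[1:-1] + T[2:]) / dx**2      # discrete Laplacian
    Tn = T.copy()
    Tn[1:-1] += dt / rho_c * (k * lap + perf * (T_art - T[1:-1]) + Q)
    return Tn
```

Explicit stability requires dt <= rho_c*dx**2/(2*k); with dx = 1 mm and the constants above that bound is about 3.6 s.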

  17. 75 FR 57274 - Financial Management and Assurance; Government Auditing Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-20

    ... contained in the 2010 Exposure Draft update GAGAS to reflect major developments in the accountability and audit profession and emphasize specific considerations applicable to the government environment. In addition, this proposed revision modernizes GAGAS, with updates to reflect major developments in the...

  18. Updating National Topographic Data Base Using Change Detection Methods

    NASA Astrophysics Data System (ADS)

    Keinan, E.; Felus, Y. A.; Tal, Y.; Zilberstien, O.; Elihai, Y.

    2016-06-01

    The traditional method for updating a topographic database on a national scale is a complex process that requires human resources, time and the development of specialized procedures. In many National Mapping and Cadaster Agencies (NMCA), the updating cycle takes a few years. Today, reality is dynamic and changes occur every day; users therefore expect the existing database to portray the current situation. Global mapping projects based on community volunteers, such as OSM, update their databases every day through crowdsourcing. In order to fulfil users' requirements for rapid updating, a new methodology that maps major areas of interest while preserving associated decoding information should be developed. Until recently, automated processes did not yield satisfactory results, and a typical process involved comparing images from different periods. The success rates in identifying objects were low, and most detections were accompanied by a high percentage of false alarms. As a result, the automatic process required significant editorial work that made it uneconomical. In recent years, developments in mapping technologies, advances in image processing algorithms and computer vision, together with the advent of digital aerial cameras with a NIR band and Very High Resolution satellites, have allowed the implementation of a cost-effective automated process. The automatic process is based on high-resolution Digital Surface Model analysis, Multi Spectral (MS) classification, MS segmentation, object analysis and shape-forming algorithms. This article reviews the results of a novel change detection methodology as a first step for updating the NTDB at the Survey of Israel.
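
The abstract outlines the pipeline (DSM analysis, MS classification, segmentation, object analysis) without formulas. Its first stage, flagging candidate changes from the height difference between two epochs, can be sketched as below; the threshold values and the crude neighbor filter are illustrative stand-ins for the authors' object-analysis steps, not their actual parameters:

```python
import numpy as np

def dsm_change_mask(dsm_old, dsm_new, height_thresh=2.5, min_pixels=4):
    """Flag candidate change areas by thresholding the height difference
    between two co-registered digital surface models (DSMs).

    height_thresh (m) and min_pixels are illustrative tuning values."""
    diff = dsm_new - dsm_old
    mask = np.abs(diff) > height_thresh
    # Crude false-alarm suppression: keep a pixel only if its 3x3
    # neighborhood (including itself) contains at least min_pixels hits.
    padded = np.pad(mask.astype(int), 1)
    h, w = mask.shape
    neigh = sum(padded[i:i + h, j:j + w] for i in range(3) for j in range(3))
    return mask & (neigh >= min_pixels)
```

In a real pipeline this mask would then be refined with multispectral classification and shape analysis to reject vegetation growth and other false alarms.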

  19. Updating Positive and Negative Stimuli in Working Memory in Depression

    ERIC Educational Resources Information Center

    Levens, Sara M.; Gotlib, Ian H.

    2010-01-01

    Difficulties in the ability to update stimuli in working memory (WM) may underlie the problems with regulating emotions that lead to the development and perpetuation of mood disorders such as depression. To examine the ability to update affective material in WM, the authors had diagnosed depressed and never-disordered control participants perform…

  20. Kennedy Space Center Director Update

    NASA Image and Video Library

    2014-03-06

    CAPE CANAVERAL, Fla. - Community leaders, business executives, educators, and state and local government leaders were updated on NASA Kennedy Space Center programs and accomplishments during Center Director Bob Cabana’s Center Director Update at the Debus Center at the Kennedy Space Center Visitor Complex in Florida. From left, Scott Thurston, Kennedy deputy of the spacecraft office of the Commercial Crew Program, talks with Scott Colloredo, director of the Center Planning and Development Directorate. Attendees talked with Cabana and other senior Kennedy managers and visited displays featuring updates on Kennedy programs and projects, including International Space Station, Commercial Crew, Ground System Development and Operations, Launch Services, Center Planning and Development, Technology, KSC Swamp Works and NASA Education. The morning concluded with a tour of the new Space Shuttle Atlantis exhibit at the visitor complex. For more information, visit http://www.nasa.gov/kennedy. Photo credit: NASA/Daniel Casper

  1. Kennedy Space Center Director Update

    NASA Image and Video Library

    2014-03-06

    CAPE CANAVERAL, Fla. - Community leaders, business executives, educators, and state and local government leaders were updated on NASA Kennedy Space Center programs and accomplishments during Center Director Bob Cabana’s Center Director Update at the Debus Center at the Kennedy Space Center Visitor Complex in Florida. Attendees talk with Trey Carlson, Kennedy Master Planner, at the Center Planning and Development Directorate, or CPDD, display. In the background is Mario Busacca, chief of CPDD’s Spaceport Planning Office. Attendees talked with Cabana and other senior Kennedy managers and visited displays featuring updates on Kennedy programs and projects, including International Space Station, Commercial Crew, Ground System Development and Operations, Launch Services, Center Planning and Development, Technology, KSC Swamp Works and NASA Education. The morning concluded with a tour of the new Space Shuttle Atlantis exhibit at the visitor complex. For more information, visit http://www.nasa.gov/kennedy. Photo credit: NASA/Daniel Casper

  2. Comparison of international guideline programs to evaluate and update the Dutch program for clinical guideline development in physical therapy

    PubMed Central

    Van der Wees, Philip J; Hendriks, Erik JM; Custers, Jan WH; Burgers, Jako S; Dekker, Joost; de Bie, Rob A

    2007-01-01

    Background Clinical guidelines are considered important instruments to improve quality in health care. Since 1998 the Royal Dutch Society for Physical Therapy (KNGF) has produced evidence-based clinical guidelines, based on a standardized program. New developments in the field of guideline research raised the need to evaluate and update the KNGF guideline program. The purpose of this study is to compare different guideline development programs and review the KNGF guideline program for physical therapy in the Netherlands, in order to update the program. Method Six international guideline development programs were selected, and the 23 criteria of the AGREE Instrument were used to evaluate the guideline programs. Information about the programs was retrieved from published handbooks of the organizations. The Dutch program for guideline development in physical therapy was also evaluated using the AGREE criteria. Further comparison of the six guideline programs was carried out using the following elements of the guideline development processes: Structure and organization; Preparation and initiation; Development; Validation; Dissemination and implementation; Evaluation and update. Results Compliance of the guideline programs with the AGREE criteria was high. Four programs addressed 22 AGREE criteria, and two programs addressed 20 AGREE criteria. The previous Dutch program for guideline development in physical therapy fell short on the AGREE criteria, meeting only 13. Further comparison showed that all guideline programs perform systematic literature searches to identify the available evidence. Recommendations are formulated and graded, based on evidence and other relevant factors. It is not clear how decisions in the development process are made. In particular, the process of translating evidence into practice recommendations can be improved.
Conclusion As a result of international developments and consensus, the described processes for developing clinical practice guidelines have much in common. The AGREE criteria are a common basis for the development of guidelines, although it is not clear how final decisions are made. A detailed comparison of the different guideline programs was used to update the Dutch program. As a result, the updated KNGF program complied with 22 AGREE criteria. International discussion is continuing and will be used for further improvement of the program. PMID:18036215

  3. Comparison of international guideline programs to evaluate and update the Dutch program for clinical guideline development in physical therapy.

    PubMed

    Van der Wees, Philip J; Hendriks, Erik J M; Custers, Jan W H; Burgers, Jako S; Dekker, Joost; de Bie, Rob A

    2007-11-23

    Clinical guidelines are considered important instruments to improve quality in health care. Since 1998 the Royal Dutch Society for Physical Therapy (KNGF) has produced evidence-based clinical guidelines, based on a standardized program. New developments in the field of guideline research raised the need to evaluate and update the KNGF guideline program. The purpose of this study is to compare different guideline development programs and review the KNGF guideline program for physical therapy in the Netherlands, in order to update the program. Six international guideline development programs were selected, and the 23 criteria of the AGREE Instrument were used to evaluate the guideline programs. Information about the programs was retrieved from published handbooks of the organizations. The Dutch program for guideline development in physical therapy was also evaluated using the AGREE criteria. Further comparison of the six guideline programs was carried out using the following elements of the guideline development processes: Structure and organization; Preparation and initiation; Development; Validation; Dissemination and implementation; Evaluation and update. Compliance of the guideline programs with the AGREE criteria was high. Four programs addressed 22 AGREE criteria, and two programs addressed 20 AGREE criteria. The previous Dutch program for guideline development in physical therapy fell short on the AGREE criteria, meeting only 13. Further comparison showed that all guideline programs perform systematic literature searches to identify the available evidence. Recommendations are formulated and graded, based on evidence and other relevant factors. It is not clear how decisions in the development process are made. In particular, the process of translating evidence into practice recommendations can be improved.
As a result of international developments and consensus, the described processes for developing clinical practice guidelines have much in common. The AGREE criteria are a common basis for the development of guidelines, although it is not clear how final decisions are made. A detailed comparison of the different guideline programs was used to update the Dutch program. As a result, the updated KNGF program complied with 22 AGREE criteria. International discussion is continuing and will be used for further improvement of the program.

  4. Updating contextualized clinical practice guidelines on stroke rehabilitation and low back pain management using a novel assessment framework that standardizes decisions.

    PubMed

    Gambito, Ephraim D V; Gonzalez-Suarez, Consuelo B; Grimmer, Karen A; Valdecañas, Carolina M; Dizon, Janine Margarita R; Beredo, Ma Eulalia J; Zamora, Marcelle Theresa G

    2015-11-04

    Clinical practice guidelines need to be regularly updated with current literature in order to remain relevant. This paper reports on the approach taken by the Philippine Academy of Rehabilitation Medicine (PARM). This dovetails with its writing guide, which underpinned its foundational work in contextualizing guidelines for stroke and low back pain (LBP) in 2011. Working groups of Filipino rehabilitation physicians and allied health practitioners met to reconsider and modify, where indicated, the 'typical' Filipino patient care pathways established in the foundation guidelines. New clinical guidelines on stroke and low back pain which had been published internationally in the last 3 years were identified using a search of electronic databases. The methodological quality of each guideline was assessed using the iCAHE Guideline Quality Checklist, and only those guidelines which provided full text references, evidence hierarchy and quality appraisal of the included literature were included in the PARM update. Each of the PARM-endorsed recommendations was then reviewed, in light of new literature presented in the included clinical guidelines. A novel standard updating approach was developed based on the criteria reported by Johnston et al. (Int J Technol Assess Health Care 19(4):646-655, 2003) and then modified to incorporate wording from the foundational PARM writing guide. The new updating tool was debated, pilot-tested and agreed upon by the PARM working groups, before being applied to the guideline updating process. Ten new guidelines on stroke and eleven for low back pain were identified. Guideline quality scores were moderate to good; however, not all guidelines comprehensively linked the evidence body underpinning recommendations with the literature. Consequently, only five stroke and four low back pain guidelines were included. 
The modified PARM updating guide was applied by all working groups to ensure standardization of the wording of updated recommendations and the underpinning evidence bases. The updating tool provides a simple, standard and novel approach that incorporates evidence hierarchy and quality, and wordings of recommendations. It could be used efficiently by other guideline updaters particularly in developing countries, where resources for guideline development and updates are limited. When many people are involved in guideline writing, there is always the possibility of 'slippage' in use of wording and interpretation of evidence. The PARM updating tool provides a mechanism for maintaining a standard process for guideline updating processes that can be followed by clinicians with basic training in evidence-based practice principles.

  5. Waste Information Management System-2012 - 12114

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Upadhyay, H.; Quintero, W.; Shoffner, P.

    2012-07-01

    The Waste Information Management System (WIMS)-2012 was updated to support the Department of Energy (DOE) accelerated cleanup program. The schedule compression required close coordination and a comprehensive review and prioritization of the barriers that impeded treatment and disposition of the waste streams at each site. Many issues related to waste treatment and disposal were potential critical path issues under the accelerated schedule. In order to facilitate accelerated cleanup initiatives, waste managers at DOE field sites and at DOE Headquarters in Washington, D.C., needed timely waste forecast and transportation information regarding the volumes and types of radioactive waste that would be generated by DOE sites over the next 40 years. Each local DOE site historically collected, organized, and displayed waste forecast information in separate and unique systems. In order for interested parties to understand and view the complete DOE complex-wide picture, the radioactive waste and shipment information of each DOE site needed to be entered into a common application. The WIMS application was therefore created to serve as a common application to improve stakeholder comprehension and improve DOE radioactive waste treatment and disposal planning and scheduling. WIMS allows identification of total forecasted waste volumes, material classes, disposition sites, choke points, technological or regulatory barriers to treatment and disposal, along with forecasted waste transportation information by rail, truck and inter-modal shipments. The Applied Research Center (ARC) at Florida International University (FIU) in Miami, Florida, developed and deployed the web-based forecast and transportation system and is responsible for updating the radioactive waste forecast and transportation data on a regular basis to ensure the long-term viability and value of this system. 
WIMS continues to successfully accomplish the goals and objectives set forth by DOE for this project. It has replaced the historic process of each DOE site gathering, organizing, and reporting its waste forecast information using different databases and display technologies. In addition, WIMS meets DOE's objective to have the complex-wide waste forecast and transportation information available to all stakeholders and the public in one easy-to-navigate system. The enhancements to WIMS made since its initial deployment include the addition of new DOE sites and facilities, updated waste and transportation information, and the ability to easily display and print customized waste forecasts, disposition maps, GIS maps and transportation information. The system also allows users to customize and generate reports over the web. These reports can be exported to various formats, such as Adobe PDF, Microsoft Excel, and Microsoft Word, and downloaded to the user's computer. Future enhancements will include database/application migration to the next level. A new data import interface will be developed to integrate 2012-13 forecast waste streams. In addition, the application is updated on a continuous basis based on DOE feedback. (authors)

  6. 76 FR 9339 - Biomass Research and Development Technical Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-17

    ... DEPARTMENT OF ENERGY Office of Energy Efficiency and Renewable Energy Biomass Research and... Biomass Research and Development Technical Advisory Committee under Section 9008(d) of the Food.... Tentative Agenda: Agenda will include the following: Update on USDA Biomass R&D Activities. Update on DOE...

  7. 76 FR 63614 - Biomass Research and Development Technical Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-13

    ... DEPARTMENT OF ENERGY Energy Efficiency and Renewable Energy Biomass Research and Development...: Notice of open meeting. SUMMARY: This notice announces an open meeting of the Biomass Research and... Update on USDA Biomass R&D Activities; Update on DOE Biomass R&D Activities; Presentation on Current...

  8. 77 FR 26276 - Biomass Research and Development Technical Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-03

    ... DEPARTMENT OF ENERGY Energy Efficiency and Renewable Energy Biomass Research and Development...: Notice of open meeting. SUMMARY: This notice announces an open meeting of the Biomass Research and... include the following: Update on USDA Biomass R&D Activities Update on DOE Biomass R&D Activities...

  9. Industry Panel Curriculum Project. Final Report.

    ERIC Educational Resources Information Center

    Kenneke, Larry J.

    This project was conducted to develop and disseminate updated curriculum guides for nine selected cluster areas of vocational education programs in Oregon; only five were developed. The updates were based on recommendations made by industry review panels (IRPs). Specialists from the Oregon Department of Education and Oregon State University…

  10. Development of rehabilitation training support system for occupational therapy of upper limb motor function

    NASA Astrophysics Data System (ADS)

    Morita, Yoshifumi; Hirose, Akinori; Uno, Takashi; Uchid, Masaki; Ukai, Hiroyuki; Matsui, Nobuyuki

    2007-12-01

    In this paper we propose a new rehabilitation training support system for upper limbs. The proposed system enables therapists to quantitatively evaluate the therapeutic effect on upper limb motor function during training, to easily change the resistance load of training and to easily develop new training programs suited to the subjects. For this purpose we develop control algorithms for the training programs on the 3D force display robot. The 3D force display robot has a parallel link mechanism with three motors. A control algorithm simulating sanding training is developed for the 3D force display robot. Moreover, a teaching/training function algorithm is developed, which enables therapists to easily create training trajectories suited to a subject's condition. The effectiveness of the developed control algorithms is verified by experiments.

  11. Nutrient estimation from an FFQ developed for a black Zimbabwean population

    PubMed Central

    Merchant, Anwar T; Dehghan, Mahshid; Chifamba, Jephat; Terera, Getrude; Yusuf, Salim

    2005-01-01

    Background There is little information in the literature on methods of food composition database development to calculate nutrient intake from food frequency questionnaire (FFQ) data. The aim of this study is to describe the development of an FFQ and a food composition table to calculate nutrient intake in a Black Zimbabwean population. Methods Trained interviewers collected 24-hour dietary recalls (24 hr DR) from high and low income families in urban and rural Zimbabwe. Based on these data and input from local experts we developed an FFQ, containing a list of frequently consumed foods, standard portion sizes, and categories of consumption frequency. We created a food composition table of the foods found in the FFQ so that we could compute nutrient intake. We used the USDA nutrient database as the main resource because it is relatively complete, updated, and easily accessible. To choose the food item in the USDA nutrient database that most closely matched the nutrient content of the local food we referred to a local food composition table. Results Almost all the participants ate sadza (maize porridge) at least 5 times a week, and about half had matemba (fish) and caterpillar more than once a month. Nutrient estimates obtained from the FFQ data by using the USDA and Zimbabwean food composition tables were similar for total energy intake (intraclass correlation (ICC) = 0.99) and carbohydrate (ICC = 0.99), but different for vitamin A (ICC = 0.53) and total folate (ICC = 0.68). Conclusion We have described a standardized process of FFQ and food composition database development for a Black Zimbabwean population. PMID:16351722
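
The agreement statistic quoted above, the intraclass correlation, can be computed for two nutrient estimates per food item with the one-way random-effects form ICC(1,1). A sketch assuming exactly two "raters" (here, the two food composition tables); the abstract does not state which ICC variant the authors used:

```python
import numpy as np

def icc_oneway(x, y):
    """One-way random-effects intraclass correlation ICC(1,1) for two
    measurements per subject, e.g. nutrient values from two food
    composition tables (illustrative; the paper's exact variant is
    not specified in the abstract)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n, k = len(x), 2
    means = (x + y) / 2.0                 # per-subject means
    grand = means.mean()
    msb = k * np.sum((means - grand) ** 2) / (n - 1)          # between-subject MS
    msw = np.sum((x - means) ** 2 + (y - means) ** 2) / (n * (k - 1))  # within-subject MS
    return (msb - msw) / (msb + (k - 1) * msw)
```

Identical estimates give ICC = 1; independent disagreement between the tables pulls the value toward (or below) zero, which is why vitamin A (ICC = 0.53) signals a poor match between databases.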

  12. Development and Psychometric Analysis of a Nurses’ Attitudes and Skills Safety Scale: Initial Results

    PubMed Central

    Armstrong, Gail E.; Dietrich, Mary; Norman, Linda; Barnsteiner, Jane; Mion, Lorraine

    2016-01-01

    Health care organizations have incorporated updated safety principles in the analysis of errors and in norms and standards. Yet no research exists that assesses bedside nurses’ perceived skills or attitudes toward updated safety concepts. The aims of this study were to develop a scale assessing nurses’ perceived skills and attitudes toward updated safety concepts, determine content validity, and examine internal consistency of the scale and subscales. Understanding nurses’ perceived skills and attitudes about safety concepts can be used in targeting strategies to enhance their safety practices. PMID:27479518

  13. Development and Psychometric Analysis of a Nurses' Attitudes and Skills Safety Scale: Initial Results.

    PubMed

    Armstrong, Gail E; Dietrich, Mary; Norman, Linda; Barnsteiner, Jane; Mion, Lorraine

    Health care organizations have incorporated updated safety principles in the analysis of errors and in norms and standards. Yet no research exists that assesses bedside nurses' perceived skills or attitudes toward updated safety concepts. The aims of this study were to develop a scale assessing nurses' perceived skills and attitudes toward updated safety concepts, determine content validity, and examine internal consistency of the scale and subscales. Understanding nurses' perceived skills and attitudes about safety concepts can be used in targeting strategies to enhance their safety practices.

  14. Metabolic Biosynthesis of Potato (Solanum tuberosum l.) Antioxidants and Implications for Human Health.

    PubMed

    Lovat, Christie; Nassar, Atef M K; Kubow, Stan; Li, Xiu-Qing; Donnelly, Danielle J

    2016-10-25

    Potato (Solanum tuberosum L.) is common, affordable, readily stored, easily prepared for consumption, and nutritious. For these reasons, potato has become one of the top five crops consumed worldwide. Consequently, it is important to understand its contribution to both our daily and long-term health. Potato is one of the most important sources of antioxidants in the human diet. As such, it supports the antioxidant defense network in our bodies that reduces cellular and tissue toxicities that result from free radical-induced protein, lipid, carbohydrate, and DNA damage. In this way, potato antioxidants may reduce the risk for cancers, cardiovascular diseases, diabetes, and even radiation damage. A better understanding of these components of potato is needed by the food industry, health professionals, and consumers. This review provides referenced summaries of all of the antioxidant groups present in potato tubers and updated schematics including genetic regulation for the major antioxidant biosynthesis pathways. This review complements current knowledge on the role of potato in human health. We hope it will provide impetus toward breeding efforts to develop cultivars with increased antioxidant capacity as 'functional foods' and encourage potato consumers and processors to work toward preservation of antioxidant capacity in cooked potato and potato products.

  15. Metazen – metadata capture for metagenomes

    PubMed Central

    2014-01-01

    Background As the impact and prevalence of large-scale metagenomic surveys grow, so does the acute need for more complete and standards compliant metadata. Metadata (data describing data) provides an essential complement to experimental data, helping to answer questions about its source, mode of collection, and reliability. Metadata collection and interpretation have become vital to the genomics and metagenomics communities, but considerable challenges remain, including exchange, curation, and distribution. Currently, tools are available for capturing basic field metadata during sampling, and for storing, updating and viewing it. Unfortunately, these tools are not specifically designed for metagenomic surveys; in particular, they lack the appropriate metadata collection templates, a centralized storage repository, and a unique ID linking system that can be used to easily port complete and compatible metagenomic metadata into widely used assembly and sequence analysis tools. Results Metazen was developed as a comprehensive framework designed to enable metadata capture for metagenomic sequencing projects. Specifically, Metazen provides a rapid, easy-to-use portal to encourage early deposition of project and sample metadata. Conclusions Metazen is an interactive tool that aids users in recording their metadata in a complete and valid format. A defined set of mandatory fields captures vital information, while the option to add fields provides flexibility. PMID:25780508

  16. Metazen - metadata capture for metagenomes.

    PubMed

    Bischof, Jared; Harrison, Travis; Paczian, Tobias; Glass, Elizabeth; Wilke, Andreas; Meyer, Folker

    2014-01-01

    As the impact and prevalence of large-scale metagenomic surveys grow, so does the acute need for more complete and standards compliant metadata. Metadata (data describing data) provides an essential complement to experimental data, helping to answer questions about its source, mode of collection, and reliability. Metadata collection and interpretation have become vital to the genomics and metagenomics communities, but considerable challenges remain, including exchange, curation, and distribution. Currently, tools are available for capturing basic field metadata during sampling, and for storing, updating and viewing it. Unfortunately, these tools are not specifically designed for metagenomic surveys; in particular, they lack the appropriate metadata collection templates, a centralized storage repository, and a unique ID linking system that can be used to easily port complete and compatible metagenomic metadata into widely used assembly and sequence analysis tools. Metazen was developed as a comprehensive framework designed to enable metadata capture for metagenomic sequencing projects. Specifically, Metazen provides a rapid, easy-to-use portal to encourage early deposition of project and sample metadata. Metazen is an interactive tool that aids users in recording their metadata in a complete and valid format. A defined set of mandatory fields captures vital information, while the option to add fields provides flexibility.

  17. Full Eulerian simulations of biconcave neo-Hookean particles in a Poiseuille flow

    NASA Astrophysics Data System (ADS)

    Sugiyama, Kazuyasu; Ii, Satoshi; Takeuchi, Shintaro; Takagi, Shu; Matsumoto, Yoichiro

    2010-03-01

    For a given initial configuration of a multi-component geometry represented by voxel-based data on a fixed Cartesian mesh, a full Eulerian finite difference method facilitates the solution of dynamic interaction problems between a Newtonian fluid and a hyperelastic material. The solid volume fraction and the left Cauchy-Green deformation tensor are temporally updated on the Eulerian frame, respectively, to distinguish the fluid and solid phases and to describe the solid deformation. The simulation method is applied to two- and three-dimensional motions of two biconcave neo-Hookean particles in a Poiseuille flow. As in the numerical study of red blood cell motion in a circular pipe (Gong et al. in J Biomech Eng 131:074504, 2009), in which Skalak's constitutive laws for the membrane are considered, the deformation and the relative position and orientation of a pair of particles are strongly dependent upon the initial configuration. The increase in the apparent viscosity is dependent upon the developed arrangement of the particles. The present Eulerian approach is demonstrated to have the potential to be easily extended to larger systems involving a large number of particles of complicated geometries.
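
The key Eulerian ingredient, temporally updating an indicator field on a fixed grid rather than tracking an interface mesh, can be illustrated with first-order upwind advection of the solid volume fraction in 1D. This is only a sketch of the update idea with invented names; the paper's scheme is multi-dimensional and also transports the left Cauchy-Green tensor:

```python
import numpy as np

def advect_fraction(phi, u, dx, dt):
    """First-order upwind update of a solid volume fraction phi carried by
    a uniform velocity u on a fixed (Eulerian) 1D grid, periodic BCs.
    Conservative and monotone for CFL = |u|*dt/dx <= 1."""
    flux = u * phi  # upwind face fluxes
    if u >= 0:
        dphi = flux - np.roll(flux, 1)
    else:
        dphi = np.roll(flux, -1) - flux
    return phi - dt / dx * dphi
```

Because the scheme is conservative and monotone, the total solid volume is preserved and phi stays within [0, 1], which is essential when the fraction field is also used to blend fluid and solid stresses.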

  18. Sparsity-based image monitoring of crystal size distribution during crystallization

    NASA Astrophysics Data System (ADS)

    Liu, Tao; Huo, Yan; Ma, Cai Y.; Wang, Xue Z.

    2017-07-01

    To facilitate monitoring crystal size distribution (CSD) during a crystallization process by using an in-situ imaging system, a sparsity-based image analysis method is proposed for real-time implementation. To cope with image degradation arising from in-situ measurement subject to particle motion, solution turbulence, and uneven illumination background in the crystallizer, sparse representation of a real-time captured crystal image is developed based on using an in-situ image dictionary established in advance, such that the noise components in the captured image can be efficiently removed. Subsequently, the edges of a crystal shape in a captured image are determined in terms of the salience information defined from the denoised crystal images. These edges are used to derive a blur kernel for reconstruction of a denoised image. A non-blind deconvolution algorithm is given for the real-time reconstruction. Consequently, image segmentation can be easily performed for evaluation of CSD. The crystal image dictionary and blur kernels are timely updated in terms of the imaging conditions to improve the restoration efficiency. An experimental study on the cooling crystallization of α-type L-glutamic acid (LGA) is shown to demonstrate the effectiveness and merit of the proposed method.
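A hedged sketch of the final segmentation-to-CSD step: once a denoised binary image is available, 4-connected component labeling yields per-crystal pixel areas. The labeling approach and the tiny test image are illustrative, not the paper's algorithm:

```python
from collections import deque, Counter

def crystal_size_distribution(binary):
    """Label 4-connected foreground regions in a binary image (list of
    lists of 0/1) and return a Counter mapping region size in pixels to
    the number of crystals of that size."""
    rows, cols = len(binary), len(binary[0])
    seen = [[False] * cols for _ in range(rows)]
    sizes = []
    for r in range(rows):
        for c in range(cols):
            if binary[r][c] and not seen[r][c]:
                q, size = deque([(r, c)]), 0
                seen[r][c] = True
                while q:  # breadth-first flood fill of one crystal
                    y, x = q.popleft()
                    size += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and binary[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            q.append((ny, nx))
                sizes.append(size)
    return Counter(sizes)

img = [[1, 1, 0, 0],
       [1, 0, 0, 1],
       [0, 0, 1, 1]]
# one 3-pixel crystal top-left, one 3-pixel crystal bottom-right
assert crystal_size_distribution(img) == Counter({3: 2})
```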

  19. Implementation of the US EPA (United States Environmental Protection Agency) Regional Oxidant Modeling System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Novak, J.H.

    1984-05-01

    Model design, implementation, and quality assurance procedures can have a significant impact on the effectiveness and long-term utility of any modeling approach. The Regional Oxidant Modeling System (ROMS) is exceptionally complex because it treats all chemical and physical processes thought to affect ozone concentration on a regional scale. Thus, to effectively illustrate useful design and implementation techniques, this paper describes the general modeling framework which forms the basis of the ROMS. This framework is flexible enough to allow straightforward update or replacement of the chemical kinetics mechanism and/or any theoretical formulations of the physical processes. Use of the Jackson Structured Programming (JSP) method to implement this modeling framework has not only increased programmer productivity and the quality of the resulting programs, but also has provided standardized program design, dynamic documentation, and easily maintainable and transportable code. A summary of the JSP method is presented to encourage modelers to pursue this technique in their own model development efforts. In addition, since data preparation is such an integral part of a successful modeling system, the ROMS processor network is described with emphasis on the internal quality control techniques.
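The swap-in-place flexibility described for the chemical kinetics mechanism can be sketched as a simple registry pattern; mechanism names and rate terms below are invented for illustration:

```python
# Sketch of the plug-in idea: the kinetics mechanism is looked up by name,
# so an updated mechanism can be registered and substituted without
# touching the transport code. Names and coefficients are illustrative.
MECHANISMS = {}

def register(name):
    def wrap(fn):
        MECHANISMS[name] = fn
        return fn
    return wrap

@register("mechanism_v1")
def mech_v1(o3, nox):
    return o3 + 0.10 * nox  # toy ozone production term

@register("mechanism_v2")
def mech_v2(o3, nox):
    return o3 + 0.12 * nox  # "updated" chemistry, same interface

def step_ozone(o3, nox, mechanism="mechanism_v1"):
    return MECHANISMS[mechanism](o3, nox)

assert abs(step_ozone(40.0, 10.0) - 41.0) < 1e-9
assert abs(step_ozone(40.0, 10.0, mechanism="mechanism_v2") - 41.2) < 1e-9
```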

  20. E-learning for grass-roots emergency public health personnel: Preliminary lessons from a national program in China.

    PubMed

    Xu, Wangquan; Jiang, Qicheng; Qin, Xia; Fang, Guixia; Hu, Zhi

    2016-07-19

    In China, grass-roots emergency public health personnel have relatively limited emergency response capabilities and they are constantly required to update their professional knowledge and skills due to recurring and new public health emergencies. However, professional training, a principal solution to this problem, is inadequate because of limitations in manpower and financial resources at grass-roots public health agencies. In order to provide a cost-effective and easily expandable way for grass-roots personnel to acquire knowledge and skills, the National Health Planning Commission of China developed an emergency response information platform and provided trial access to this platform in Anhui and Heilongjiang provinces in China. E-learning was one of the modules of the platform and this paper has focused on an e-learning pilot program. Results indicated that e-learning had satisfactorily improved the knowledge and ability of grass-roots emergency public health personnel, and the program provided an opportunity to gain experience in e-course design and implementing e-learning. Issues such as the lack of personalized e-courses and the difficulty of evaluating the effectiveness of e-learning are topics for further study.

  1. Enzymatic Processes in Marine Biotechnology.

    PubMed

    Trincone, Antonio

    2017-03-25

    In previous review articles the attention of the biocatalytically oriented scientific community towards the marine environment as a source of biocatalysts focused on the habitat-related properties of marine enzymes. Updates have already appeared in the literature, including marine examples of oxidoreductases, hydrolases, transferases, isomerases, ligases, and lyases ready for food and pharmaceutical applications. Here a new approach for searching the literature and presenting a more refined analysis is adopted with respect to previous surveys, centering the attention on the enzymatic process rather than on a single novel activity. Fields of applications are easily individuated: (i) the biorefinery value-chain, where the provision of biomass is one of the most important aspects, with aquaculture as the prominent sector; (ii) the food industry, where the interest in the marine domain is similarly developed to deal with the enzymatic procedures adopted in food manipulation; (iii) the selective and easy extraction/modification of structurally complex marine molecules, where enzymatic treatments are a recognized tool to improve efficiency and selectivity; and (iv) marine biomarkers and derived applications (bioremediation) in pollution monitoring are also included in that these studies could be of high significance for the appreciation of marine bioprocesses.

  2. W3MAMCAT: a world wide web based tool for mammillary and catenary compartmental modeling and expert system distinguishability.

    PubMed

    Russell, Solomon; Distefano, Joseph J

    2006-07-01

    W(3)MAMCAT is a new web-based and interactive system for building and quantifying the parameters or parameter ranges of n-compartment mammillary and catenary model structures, with input and output in the first compartment, from unstructured multiexponential (sum-of-n-exponentials) models. It handles unidentifiable as well as identifiable models and, as such, provides finite parameter interval solutions for unidentifiable models, whereas direct parameter search programs typically do not. It also tutorially develops the theory of model distinguishability for same order mammillary versus catenary models, as did its desktop application predecessor MAMCAT+. This includes expert system analysis for distinguishing mammillary from catenary structures, given input and output in similarly numbered compartments. W(3)MAMCAT provides for universal deployment via the internet and enhanced application error checking. It uses supported Microsoft technologies to form an extensible application framework for maintaining a stable and easily updatable application. Most important, anybody, anywhere, is welcome to access it using Internet Explorer 6.0 over the internet for their teaching or research needs. It is available on the Biocybernetics Laboratory website at UCLA: www.biocyb.cs.ucla.edu.
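A minimal sketch of the model class W(3)MAMCAT quantifies: a two-compartment mammillary model with input and output in compartment 1, integrated here by forward Euler with illustrative rate constants (not the tool's actual identification machinery):

```python
def mammillary2(q1, q2, kel, k12, k21, dt, steps):
    """Forward-Euler simulation of a 2-compartment mammillary model with
    elimination (kel) from the central compartment and exchange rates
    k12 (central -> peripheral) and k21 (peripheral -> central)."""
    traj = [(q1, q2)]
    for _ in range(steps):
        dq1 = -(kel + k12) * q1 + k21 * q2
        dq2 = k12 * q1 - k21 * q2
        q1, q2 = q1 + dt * dq1, q2 + dt * dq2
        traj.append((q1, q2))
    return traj

# unit bolus into the central compartment; rate constants are illustrative
traj = mammillary2(1.0, 0.0, kel=0.1, k12=0.3, k21=0.2, dt=0.01, steps=1000)
total = [a + b for a, b in traj]
# drug leaves only through elimination, so total mass decays monotonically
assert all(total[i] >= total[i + 1] for i in range(len(total) - 1))
```

The observable output q1(t) of such a model is a sum of two exponentials, which is why an unstructured sum-of-exponentials fit is the natural starting point for the parameter (or parameter-interval) computations the tool performs.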

  3. Enzymatic Processes in Marine Biotechnology

    PubMed Central

    Trincone, Antonio

    2017-01-01

    In previous review articles the attention of the biocatalytically oriented scientific community towards the marine environment as a source of biocatalysts focused on the habitat-related properties of marine enzymes. Updates have already appeared in the literature, including marine examples of oxidoreductases, hydrolases, transferases, isomerases, ligases, and lyases ready for food and pharmaceutical applications. Here a new approach for searching the literature and presenting a more refined analysis is adopted with respect to previous surveys, centering the attention on the enzymatic process rather than on a single novel activity. Fields of applications are easily individuated: (i) the biorefinery value-chain, where the provision of biomass is one of the most important aspects, with aquaculture as the prominent sector; (ii) the food industry, where the interest in the marine domain is similarly developed to deal with the enzymatic procedures adopted in food manipulation; (iii) the selective and easy extraction/modification of structurally complex marine molecules, where enzymatic treatments are a recognized tool to improve efficiency and selectivity; and (iv) marine biomarkers and derived applications (bioremediation) in pollution monitoring are also included in that these studies could be of high significance for the appreciation of marine bioprocesses. PMID:28346336

  4. 75 FR 61702 - Fisheries of the South Atlantic and Gulf of Mexico; Southeast Data, Assessment and Review (SEDAR...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-06

    ... of SEDAR spiny lobster update assessment review. SUMMARY: SEDAR will hold a meeting of the spiny lobster update assessment review panel. The meeting will be held in Key West, FL. See SUPPLEMENTARY... review of the updated spiny lobster assessment. They will develop stock status and fishing level...

  5. Background Information Document for Updating AP42 Section 2.4 for Estimating Emissions from Municipal Solid Waste Landfills

    EPA Science Inventory

    This revised draft document was prepared for U.S. EPA's Office of Research and Development, and describes the data analysis undertaken to update the Municipal Solid Waste (MSW) Landfill section of AP-42. This 2008 update includes the addition of data from 62 landfill gas emission...

  6. Imputation and Model-Based Updating Techniques for Annual Forest Inventories

    Treesearch

    Ronald E. McRoberts

    2001-01-01

    The USDA Forest Service is developing an annual inventory system to establish the capability of producing annual estimates of timber volume and related variables. The inventory system features measurement of an annual sample of field plots with options for updating data for plots measured in previous years. One imputation and two model-based updating techniques are...

  7. Fundamentals of Collection Development and Management. Second Edition

    ERIC Educational Resources Information Center

    Johnson, Peggy

    2009-01-01

    In this fully updated revision, expert instructor and librarian Peggy Johnson addresses the art of controlling and updating your library's collection. Each chapter offers complete coverage of one aspect of collection development, including suggestions for further reading and a narrative case study exploring the issue. Johnson also integrates…

  8. Update of NDL’s list of key foods based on the 2007-2008 WWEIA-NHANES

    USDA-ARS?s Scientific Manuscript database

    The Nutrient Data Laboratory is responsible for developing authoritative nutrient databases that contain a wide range of food composition values of the nation's food supply. This requires updating and revising the USDA Nutrient Database for Standard Reference (SR) and developing various special int...

  9. Teaching Adaptability of Object-Oriented Programming Language Curriculum

    ERIC Educational Resources Information Center

    Zhu, Xiao-dong

    2012-01-01

    The evolution of object-oriented programming languages includes updates of the languages themselves, updates of their development environments, and the reform of new languages built upon old ones. In this paper, the evolution of object-oriented programming languages is analyzed in terms of their characteristics and development. The notion of adaptive teaching upon…

  10. Mass Deacidification: An Update on Possibilities and Limitations.

    ERIC Educational Resources Information Center

    Porck, Henk J.

    This report provides an update of the possibilities and limitations of currently available mass deacidification methods, focusing on the major developments in research and application of the main operational systems. This study is intended primarily to support the development of a well-considered preservation policy by librarians and archivists,…

  11. Utilizing Flight Data to Update Aeroelastic Stability Estimates

    NASA Technical Reports Server (NTRS)

    Lind, Rick; Brenner, Marty

    1997-01-01

    Stability analysis of high performance aircraft must account for errors in the system model. A method for computing flutter margins that incorporates flight data has been developed using robust stability theory. This paper considers applying this method to update flutter margins during a post-flight or on-line analysis. Areas of modeling uncertainty that arise when using flight data with this method are investigated. The amount of conservatism in the resulting flutter margins depends on the flight data sets used to update the model. Post-flight updates of flutter margins for an F/A-18 are presented along with a simulation of on-line updates during a flight test.

  12. Monitoring urban land cover change by updating the national land cover database impervious surface products

    USGS Publications Warehouse

    Xian, George Z.; Homer, Collin G.

    2009-01-01

    The U.S. Geological Survey (USGS) National Land Cover Database (NLCD) 2001 is widely used as a baseline for national land cover and impervious conditions. To ensure timely and relevant data, it is important to update this base to a more recent time period. A prototype method was developed to update the land cover and impervious surface by individual Landsat path and row. This method updates NLCD 2001 to a nominal date of 2006 by using both Landsat imagery and data from NLCD 2001 as the baseline. Pairs of Landsat scenes in the same season from both 2001 and 2006 were acquired according to satellite paths and rows and normalized to allow calculation of change vectors between the two dates. Conservative thresholds based on Anderson Level I land cover classes were used to segregate the change vectors and determine areas of change and no-change. Once change areas had been identified, impervious surface was estimated for areas of change by sampling from NLCD 2001 in unchanged areas. Methods were developed and tested across five Landsat path/row study sites that contain a variety of metropolitan areas. Results from the five study areas show that the vast majority of impervious surface changes associated with urban developments were accurately captured and updated. The approach optimizes mapping efficiency and can provide users a flexible method to generate updated impervious surface at national and regional scales.
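The change-vector step can be sketched as follows, assuming co-registered, normalized pixels and a single illustrative threshold (the actual NLCD method applies conservative, Anderson Level I class-specific thresholds):

```python
def change_mask(img_t1, img_t2, threshold):
    """Per-pixel change-vector magnitude between two co-registered,
    normalized multiband images (each pixel a tuple of band values);
    pixels whose magnitude exceeds the threshold are flagged as changed."""
    mask = []
    for p1, p2 in zip(img_t1, img_t2):
        mag = sum((a - b) ** 2 for a, b in zip(p1, p2)) ** 0.5
        mask.append(mag > threshold)
    return mask

t2001 = [(0.10, 0.40, 0.30), (0.20, 0.35, 0.25)]  # illustrative reflectances
t2006 = [(0.11, 0.39, 0.31), (0.45, 0.15, 0.60)]  # 2nd pixel: new development
mask = change_mask(t2001, t2006, threshold=0.1)
assert mask == [False, True]  # only the second pixel is flagged as changed
```

Impervious-surface values would then be re-estimated only within the flagged pixels, with the 2001 baseline carried forward elsewhere.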

  13. Extraordinary Matter: Visualizing Space Plasmas and Particles

    NASA Astrophysics Data System (ADS)

    Barbier, B.; Bartolone, L. M.; Christian, E. R.; Eastman, T. E.; Lewis, E.; Thieman, J. R.

    2009-12-01

    Atoms and sub-atomic particles play a crucial role in the dynamics of our universe, but these particles and the space plasmas comprised of such particles are often overlooked in popular scientific and educational resources. Even the most basic particle and plasma physics principles are generally unfamiliar to non-scientists. Educators and public communicators need assistance in explaining these concepts that cannot be easily demonstrated in the everyday world. Active visuals are a highly effective aid to understanding, but resources of this type are currently few in number and difficult to find, and most do not provide suitable context for audience comprehension. To address this need, our team of space science educators and scientists from NASA Goddard Space Flight Center and the Adler Planetarium are in the process of developing an online multimedia reference library of resources such as animations, visualizations, interactivities, videos, etc. This website, Extraordinary Matter: Visualizing Space Plasmas and Particles, is designed to assist educators with explaining these concepts that cannot be easily demonstrated in the everyday world. The site will target primarily grades 9-14 and the equivalent in informal education and public outreach. Each ready-to-use product will be accompanied by a supporting explanation at a reading level matching the educational level of the concept. It will also have information on relevant STEM education standards, date of development, credits, restrictions on use, and possibly related products, links, and suggested uses. These products are intended to stand alone, making them adaptable to the widest range of uses, including scientist presentations, museum displays, educational websites and CDs, teacher professional development, and classroom use. Our team has surveyed the potential user community for their specific needs, gaps, and priorities. 
Referencing STEM educational standards, we are accumulating and enhancing the best available existing materials, and we have concurrently begun the development of new products to fill remaining gaps. We are focusing initially on the simplest concepts and gradually moving on to the more complex, because simpler concepts apply to a wider range of space science, from heliophysics and astrophysics to technology and human exploration. Visitors to the poster will have the opportunity to provide input and sign up to receive periodic email updates on the status of the website.

  14. A model-updating procedure to simulate piezoelectric transducers accurately.

    PubMed

    Piranda, B; Ballandras, S; Steichen, W; Hecart, B

    2001-09-01

    The use of numerical calculations based on finite element methods (FEM) has yielded significant improvements in the simulation and design of the piezoelectric transducers utilized in acoustic imaging. However, the ultimate precision of such models is directly controlled by the accuracy of material characterization. The present work is dedicated to the development of a model-updating technique adapted to piezoelectric transducers. The updating process is applied using the experimental admittance of a given structure for which a finite element analysis is performed. The mathematical developments are reported and then applied to update the entries of a FEM model of a two-layer structure (a PbZrTi (PZT) ridge glued on a backing) for which measurements were available. The efficiency of the proposed approach is demonstrated, yielding the definition of a new set of constants well adapted to predicting the structure's response accurately. An improvement of the proposed approach, consisting of updating the material coefficients not only against the admittance but also against the impedance data, is finally discussed.
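The updating principle — choosing material constants that minimize the misfit between measured and simulated admittance — can be sketched with a coarse grid search on a synthetic one-parameter model (a toy stand-in, not the paper's FEM-based procedure):

```python
def update_constant(measured, model, candidates):
    """Pick the material constant whose simulated response best matches
    the measured admittance curve in a least-squares sense."""
    def misfit(c):
        return sum((model(f, c) - y) ** 2 for f, y in measured)
    return min(candidates, key=misfit)

# toy "admittance" model: a resonance peak whose height scales with c
model = lambda f, c: c / (1.0 + (f - 5.0) ** 2)
true_c = 2.0
measured = [(f, model(f, true_c)) for f in range(11)]  # noise-free synthetic data
best = update_constant(measured, model, [c / 10 for c in range(10, 31)])
assert abs(best - true_c) < 1e-9  # grid search recovers the true constant
```

In practice a gradient-based or Gauss-Newton update over several coupled constants replaces this one-dimensional scan.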

  15. Kennedy Space Center Director Update

    NASA Image and Video Library

    2014-03-06

    CAPE CANAVERAL, Fla. - Community leaders, business executives, educators, and state and local government leaders were updated on NASA Kennedy Space Center programs and accomplishments during Center Director Bob Cabana’s Center Director Update at the Debus Center at the Kennedy Space Center Visitor Complex in Florida. Rob Mueller, senior technologist, talks with attendees at the Swamp Works display. Attendees talked with Cabana and other senior Kennedy managers and visited displays featuring updates on Kennedy programs and projects, including International Space Station, Commercial Crew, Ground System Development and Operations, Launch Services, Center Planning and Development, Technology, KSC Swamp Works and NASA Education. The morning concluded with a tour of the new Space Shuttle Atlantis exhibit at the visitor complex. For more information, visit http://www.nasa.gov/kennedy. Photo credit: NASA/Daniel Casper

  16. Kennedy Space Center Director Update

    NASA Image and Video Library

    2014-03-06

    CAPE CANAVERAL, Fla. - Community leaders, business executives, educators, and state and local government leaders were updated on NASA Kennedy Space Center programs and accomplishments during Center Director Bob Cabana’s Center Director Update at the Debus Center at the Kennedy Space Center Visitor Complex in Florida. An attendee talks with Scott Thurston, Kennedy deputy of the spacecraft office at the Commercial Crew Program display. Attendees talked with Cabana and other senior Kennedy managers and visited displays featuring updates on Kennedy programs and projects, including International Space Station, Commercial Crew, Ground System Development and Operations, Launch Services, Center Planning and Development, Technology, KSC Swamp Works and NASA Education. The morning concluded with a tour of the new Space Shuttle Atlantis exhibit at the visitor complex. For more information, visit http://www.nasa.gov/kennedy. Photo credit: NASA/Daniel Casper

  17. Kennedy Space Center Director Update

    NASA Image and Video Library

    2014-03-06

    CAPE CANAVERAL, Fla. - Community leaders, business executives, educators, and state and local government leaders were updated on NASA Kennedy Space Center programs and accomplishments during Center Director Bob Cabana’s Center Director Update at the Debus Center at the Kennedy Space Center Visitor Complex in Florida. Rob Mueller, a senior technologist, talks to an attendee about Kennedy’s Swamp Works Laboratory. Attendees talked with Cabana and other senior Kennedy managers and visited displays featuring updates on Kennedy programs and projects, including International Space Station, Commercial Crew, Ground System Development and Operations, Launch Services, Center Planning and Development, Technology, KSC Swamp Works and NASA Education. The morning concluded with a tour of the new Space Shuttle Atlantis exhibit at the visitor complex. For more information, visit http://www.nasa.gov/kennedy. Photo credit: NASA/Daniel Casper

  18. TOUGH2Biot - A simulator for coupled thermal-hydrodynamic-mechanical processes in subsurface flow systems: Application to CO2 geological storage and geothermal development

    NASA Astrophysics Data System (ADS)

    Lei, Hongwu; Xu, Tianfu; Jin, Guangrong

    2015-04-01

    Coupled thermal-hydrodynamic-mechanical (THM) processes have become increasingly important in studying the issues affecting subsurface flow systems, such as CO2 sequestration in deep saline aquifers and geothermal development. In this study, a mechanical module based on the extended Biot consolidation model was developed and incorporated into the well-established thermal-hydrodynamic simulator TOUGH2, resulting in an integrated numerical THM simulation program, TOUGH2Biot. A finite element method was employed to discretize space for the rock mechanical calculation, and the Mohr-Coulomb failure criterion was used to determine whether the rock undergoes shear-slip failure. Mechanics is partly coupled with the thermal-hydrodynamic processes and gives feedback to flow through stress-dependent porosity and permeability. TOUGH2Biot was verified against analytical solutions for 1D Terzaghi consolidation and cooling-induced subsidence. TOUGH2Biot was applied to evaluate the thermal, hydrodynamic, and mechanical responses of CO2 geological sequestration at the Ordos CCS Demonstration Project, China, and geothermal exploitation at The Geysers geothermal field, California. The results demonstrate that TOUGH2Biot is capable of analyzing changes in pressure, temperature, displacement, and stress, as well as potential shear-slip failure caused by large-scale man-made underground activity in subsurface flow systems. TOUGH2Biot can also be easily extended to complex coupled-process problems in fractured media and conveniently updated to parallel versions on different platforms to take advantage of high-performance computing.
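The 1D Terzaghi consolidation benchmark mentioned above has a classical series solution for the average degree of consolidation, which is easy to sketch and check:

```python
import math

def degree_of_consolidation(Tv, terms=2000):
    """Average degree of consolidation U(Tv) from Terzaghi's 1D theory:
    U = 1 - sum_m 2/M^2 * exp(-M^2 * Tv), with M = pi*(2m+1)/2 and
    Tv the dimensionless time factor."""
    s = 0.0
    for m in range(terms):
        M = math.pi * (2 * m + 1) / 2.0
        s += 2.0 / M ** 2 * math.exp(-M ** 2 * Tv)
    return 1.0 - s

assert degree_of_consolidation(0.0) < 1e-3          # no consolidation at Tv = 0
assert 0.92 < degree_of_consolidation(1.0) < 0.94   # ~93% at Tv = 1
```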

  19. Harmonize Pipeline and Archiving System: PESSTO@IA2 Use Case

    NASA Astrophysics Data System (ADS)

    Smareglia, R.; Knapic, C.; Molinaro, M.; Young, D.; Valenti, S.

    2013-10-01

    The Italian Astronomical Archives Center (IA2) is a research infrastructure project that aims at coordinating different national and international initiatives to improve the quality of astrophysical data services. IA2 is now also involved in the PESSTO (Public ESO Spectroscopic Survey of Transient Objects) collaboration, developing a complete archiving system to store calibrated post-processed data (including sensitive intermediate products), a user interface to access private data, and Virtual Observatory (VO) compliant web services to access public fast-reduction data via VO tools. The archive system shall rely on the PESSTO Marshall to provide file data and its associated metadata output by the PESSTO data-reduction pipeline. To harmonize the object repository, data handling, and archiving system, new tools are under development. These systems must interact closely without increasing the complexity of any single task, in order to improve the performance of the whole system, and must have robust logic so that all operations are performed in coordination with the other PESSTO tools. MySQL replication technology and triggers are used for the synchronization of new data in an efficient, fault-tolerant manner. A general-purpose library is under development to manage data from raw observations to final calibrated ones, open to the overriding of different sources, formats, management fields, and storage and publication policies. Configurations for all the systems are stored in a dedicated schema (no configuration files), but can be easily updated by a planned Archiving System Configuration Interface (ASCI).

  20. Minimum Information for Reporting Next Generation Sequence Genotyping (MIRING): Guidelines for Reporting HLA and KIR Genotyping via Next Generation Sequencing

    PubMed Central

    Mack, Steven J.; Milius, Robert P.; Gifford, Benjamin D.; Sauter, Jürgen; Hofmann, Jan; Osoegawa, Kazutoyo; Robinson, James; Groeneweg, Mathijs; Turenchalk, Gregory S.; Adai, Alex; Holcomb, Cherie; Rozemuller, Erik H.; Penning, Maarten T.; Heuer, Michael L.; Wang, Chunlin; Salit, Marc L.; Schmidt, Alexander H.; Parham, Peter R.; Müller, Carlheinz; Hague, Tim; Fischer, Gottfried; Fernandez-Viña, Marcelo; Hollenbach, Jill A.; Norman, Paul J.; Maiers, Martin

    2015-01-01

    The development of next-generation sequencing (NGS) technologies for HLA and KIR genotyping is rapidly advancing knowledge of genetic variation of these highly polymorphic loci. NGS genotyping is poised to replace older methods for clinical use, but standard methods for reporting and exchanging these new, high quality genotype data are needed. The Immunogenomic NGS Consortium, a broad collaboration of histocompatibility and immunogenetics clinicians, researchers, instrument manufacturers and software developers, has developed the Minimum Information for Reporting Immunogenomic NGS Genotyping (MIRING) reporting guidelines. MIRING is a checklist that specifies the content of NGS genotyping results as well as a set of messaging guidelines for reporting the results. A MIRING message includes five categories of structured information – message annotation, reference context, full genotype, consensus sequence and novel polymorphism – and references to three categories of accessory information – NGS platform documentation, read processing documentation and primary data. These eight categories of information ensure the long-term portability and broad application of this NGS data for all current histocompatibility and immunogenetics use cases. In addition, MIRING can be extended to allow the reporting of genotype data generated using pre-NGS technologies. Because genotyping results reported using MIRING are easily updated in accordance with reference and nomenclature databases, MIRING represents a bold departure from previous methods of reporting HLA and KIR genotyping results, which have provided static and less-portable data. More information about MIRING can be found online at miring.immunogenomics.org. PMID:26407912
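A hedged sketch of a MIRING-style completeness check; the key names below paraphrase the abstract's five structured and three accessory categories and are not an official schema:

```python
# Category names paraphrase the abstract; the real MIRING messaging
# guidelines define the authoritative structure.
STRUCTURED = ["message_annotation", "reference_context", "full_genotype",
              "consensus_sequence", "novel_polymorphism"]
ACCESSORY = ["platform_documentation", "read_processing_documentation",
             "primary_data"]

def missing_categories(message):
    """Return the required MIRING categories absent or empty in a report."""
    return sorted(k for k in STRUCTURED + ACCESSORY if not message.get(k))

msg = {k: "present" for k in STRUCTURED + ACCESSORY}
assert missing_categories(msg) == []       # a complete message passes
del msg["primary_data"]
assert missing_categories(msg) == ["primary_data"]  # gaps are reported
```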

  1. The Veterans Affairs Cardiac Risk Score: Recalibrating the Atherosclerotic Cardiovascular Disease Score for Applied Use.

    PubMed

    Sussman, Jeremy B; Wiitala, Wyndy L; Zawistowski, Matthew; Hofer, Timothy P; Bentley, Douglas; Hayward, Rodney A

    2017-09-01

    Accurately estimating cardiovascular risk is fundamental to good decision-making in cardiovascular disease (CVD) prevention, but risk scores developed in one population often perform poorly in dissimilar populations. We sought to examine whether a large integrated health system can use its electronic health data to better predict individual patients' risk of developing CVD. We created a cohort of all patients ages 45-80 who used Department of Veterans Affairs (VA) ambulatory care services in 2006 with no history of CVD or heart failure and no use of loop diuretics. Our outcome variable was new-onset CVD in 2007-2011. We then developed a series of recalibrated scores, including a fully refit "VA Risk Score-CVD (VARS-CVD)." We tested the different scores using standard measures of prediction quality. For the 1,512,092 patients in the study, the atherosclerotic cardiovascular disease (ASCVD) risk score had similar discrimination to the VARS-CVD (c-statistic of 0.66 in men and 0.73 in women), but the ASCVD model had poor calibration, predicting 63% more events than observed. Calibration was excellent in the fully recalibrated VARS-CVD tool, but the simpler techniques tested proved less reliable. We found that local electronic health record data can be used to estimate CVD risk better than an established risk score based on research populations. Recalibration improved estimates dramatically, and the type of recalibration was important. Such tools can also easily be integrated into a health system's electronic health record and can be more readily updated.
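One of the simpler recalibration techniques alluded to — shifting the score's intercept on the log-odds scale until the mean predicted risk matches the observed event rate — can be sketched as follows; the data are synthetic and this is not the VARS-CVD procedure:

```python
import math

def logit(p):
    return math.log(p / (1 - p))

def expit(x):
    return 1 / (1 + math.exp(-x))

def recalibrate_intercept(pred, observed_rate, tol=1e-10):
    """Bisection for the log-odds shift that makes the mean predicted
    risk equal to the observed event rate (intercept-only recalibration)."""
    lo, hi = -10.0, 10.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        mean = sum(expit(logit(p) + mid) for p in pred) / len(pred)
        if mean > observed_rate:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

# a score that predicts 63% more events than observed, as in the abstract
pred = [0.10, 0.20, 0.05, 0.30]
target = sum(pred) / len(pred) / 1.63
delta = recalibrate_intercept(pred, target)
mean_after = sum(expit(logit(p) + delta) for p in pred) / len(pred)
assert abs(mean_after - target) < 1e-6
assert delta < 0  # over-prediction is corrected by shifting risks downward
```

A fully refit score, by contrast, re-estimates every coefficient, which is why it calibrates better but requires the full local dataset.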

  2. "TPSX: Thermal Protection System Expert and Material Property Database"

    NASA Technical Reports Server (NTRS)

    Squire, Thomas H.; Milos, Frank S.; Rasky, Daniel J. (Technical Monitor)

    1997-01-01

    The Thermal Protection Branch at NASA Ames Research Center has developed a computer program for storing, organizing, and accessing information about thermal protection materials. The program, called Thermal Protection Systems Expert and Material Property Database, or TPSX, is available for the Microsoft Windows operating system. An "on-line" version is also accessible on the World Wide Web. TPSX is designed to be a high-quality source for TPS material properties presented in a convenient, easily accessible form for use by engineers and researchers in the field of high-speed vehicle design. Data can be displayed and printed in several formats. An information window displays a brief description of the material with properties at standard pressure and temperature. A spreadsheet window displays complete, detailed property information. Properties which are a function of temperature and/or pressure can be displayed as graphs. In any display the data can be converted from English to SI units with the click of a button. Two material databases included with TPSX are: 1) materials used and/or developed by the Thermal Protection Branch at NASA Ames Research Center, and 2) a database compiled by NASA Johnson Space Center (JSC). The Ames database contains over 60 advanced TPS materials including flexible blankets, rigid ceramic tiles, and ultra-high temperature ceramics. The JSC database contains over 130 insulative and structural materials. The Ames database is periodically updated and expanded as required to include newly developed materials and material property refinements.
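The one-click English-to-SI conversion can be sketched as a factor table; the conversion factors below are standard, while the material property row is invented:

```python
# Standard English -> SI conversion factors; the sample value is made up
# and is not from the TPSX databases.
TO_SI = {
    "lb/ft^3":    ("kg/m^3", 16.018463),   # density
    "Btu/(lb.F)": ("J/(kg.K)", 4186.8),    # specific heat
    "psi":        ("Pa", 6894.757),        # pressure
}

def to_si(value, unit):
    """Convert a property value from an English unit to its SI equivalent."""
    si_unit, factor = TO_SI[unit]
    return value * factor, si_unit

value, unit = to_si(9.0, "lb/ft^3")  # e.g. a hypothetical rigid tile density
assert unit == "kg/m^3"
assert abs(value - 144.166) < 0.01
```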

  3. Open Source Seismic Software in NOAA's Next Generation Tsunami Warning System

    NASA Astrophysics Data System (ADS)

    Hellman, S. B.; Baker, B. I.; Hagerty, M. T.; Leifer, J. M.; Lisowski, S.; Thies, D. A.; Donnelly, B. K.; Griffith, F. P.

    2014-12-01

    The Tsunami Information Technology Modernization (TIM) project, spearheaded by the National Oceanic and Atmospheric Administration (NOAA), will update the United States' Tsunami Warning System software currently employed at the Pacific Tsunami Warning Center (Ewa Beach, Hawaii) and the National Tsunami Warning Center (Palmer, Alaska). This entirely open source software project will integrate various seismic processing utilities with the National Weather Service Weather Forecast Office's core software, AWIPS2. For the real-time and near real-time seismic processing aspects of this project, NOAA has elected to integrate the open source portions of GFZ's SeisComP 3 (SC3) processing system into AWIPS2. To provide better tsunami threat assessments, we are developing open source tools for magnitude estimation (e.g., moment magnitude, energy magnitude, surface wave magnitude), detection of slow earthquakes with the Theta discriminant, moment tensor inversions (e.g., W-phase and teleseismic body waves), finite fault inversions, and array processing. With our reliance on common data formats such as QuakeML and seismic-community-standard messaging systems, all new facilities introduced into AWIPS2 and SC3 will be available as stand-alone tools or can be easily integrated into other real-time seismic monitoring systems such as Earthworm, Antelope, etc. Additionally, we have developed a template-based design paradigm so that a developer or scientist can efficiently create upgrades, replacements, and/or new metrics for the seismic data processing with only a cursory knowledge of the underlying SC3.
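Of the magnitude estimates mentioned, the moment magnitude follows directly from the scalar seismic moment via the standard IASPEI relation Mw = (2/3)(log10 M0 - 9.1), with M0 in N·m. A small stand-alone sketch (the formula is the community standard; the code is illustrative, not part of the TIM software):

```python
import math

def moment_magnitude(m0_newton_meters):
    """Moment magnitude from the scalar seismic moment M0 (in N*m),
    using the standard IASPEI relation Mw = (2/3) * (log10(M0) - 9.1)."""
    return (2.0 / 3.0) * (math.log10(m0_newton_meters) - 9.1)
```

For example, an earthquake with M0 = 1e18 N·m has Mw near 5.9.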

  4. The Colleges and the Courts: The Developing Law of the Student and the College. 1976 Updating Supplement.

    ERIC Educational Resources Information Center

    Chambers, M. M.

    Constant progress and change in the law of higher education have necessitated this update of the original 1972 edition of this document (the original appeared with the same title) and its first updated supplement. Approximately 60 court decisions are cited that deal with students and their relationship to their colleges. Specific areas discussed…

  5. General Separations Area (GSA) Groundwater Flow Model Update: Hydrostratigraphic Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bagwell, L.; Bennett, P.; Flach, G.

    2017-02-21

    This document describes the assembly, selection, and interpretation of hydrostratigraphic data for input to an updated groundwater flow model for the General Separations Area (GSA; Figure 1) at the Department of Energy’s (DOE) Savannah River Site (SRS). This report is one of several discrete but interrelated tasks that support development of an updated groundwater model (Bagwell and Flach, 2016).

  6. Proposal for an Update of the Definition and Scope of Behavioral Medicine.

    PubMed

    Dekker, Joost; Stauder, Adrienne; Penedo, Frank J

    2017-02-01

    We aim to provide an update of the definition and scope of behavioral medicine in the Charter of ISBM, as the present version was developed more than 25 years ago. We identify issues which need clarification or updating. This leads us to propose an update of the definition and scope of behavioral medicine. Issues in need of clarification or updating include the scope of behavioral medicine (biobehavioral mechanisms, clinical diagnosis and intervention, and prevention and health promotion); research as an essential characteristic of all three areas of behavioral medicine; the application of behavioral medicine; the terminology of behavioral medicine as a multidisciplinary field; and the relationship and distinction between behavioral medicine, mental health, health psychology, and psychosomatic medicine. We propose the following updated definition and scope of behavioral medicine: "Behavioral medicine can be defined as the multidisciplinary field concerned with the development and integration of biomedical and behavioral knowledge relevant to health and disease, and the application of this knowledge to prevention, health promotion, diagnosis, treatment, rehabilitation, and care. The scope of behavioral medicine extends from biobehavioral mechanisms (i.e., the interaction of biomedical processes with psychological, social, societal, cultural, and environmental processes), to clinical diagnosis and intervention, and to public health."

  7. Integration of stereotactic ultrasonic data into an interactive image-guided neurosurgical system

    NASA Astrophysics Data System (ADS)

    Shima, Daniel W.; Galloway, Robert L., Jr.

    1998-06-01

    Stereotactic ultrasound can be incorporated into an interactive, image-guided neurosurgical system by using an optical position sensor to define the location of an intraoperative scanner in physical space. A C program has been developed that communicates with the Optotrak(TM) system developed by Northern Digital Inc. to optically track the three-dimensional position and orientation of a fan-shaped area (i.e., a virtual B-mode ultrasound fan beam) defined with respect to a hand-held probe. Volumes of CT and MR head scans from the same patient are registered to a location in physical space using a point-based technique. The coordinates of the virtual fan beam in physical space are continuously calculated and updated on the fly. During each program loop, the CT and MR data volumes are reformatted along the same plane and displayed as two fan-shaped images that correspond to the current physical-space location of the virtual fan beam. When the reformatted preoperative tomographic images are eventually paired with a real-time intraoperative ultrasound image, a neurosurgeon will be able to use the unique information of each imaging modality (e.g., the high resolution and tissue contrast of CT and MR and the real-time functionality of ultrasound) in a complementary manner to identify structures in the brain more easily and to guide surgical procedures more effectively.
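The core geometric operation here, mapping points of the tracked fan beam from probe coordinates into physical space, is a rigid-body transform. A simplified sketch with a single-axis rotation (the real system tracks a full 3-D pose; the function name and parameterization are illustrative):

```python
import math

def transform_points(points, theta_z, translation):
    """Map probe-frame (x, y, z) points into physical space with a rigid
    transform: rotate about the z axis by theta_z, then translate. This is
    a one-axis stand-in for the full 3-D pose an optical position sensor
    reports for the hand-held probe."""
    c, s = math.cos(theta_z), math.sin(theta_z)
    tx, ty, tz = translation
    out = []
    for x, y, z in points:
        out.append((c * x - s * y + tx, s * x + c * y + ty, z + tz))
    return out
```

In the tracked system, this transform would be recomputed every program loop from the sensor's latest pose and applied to the fan-beam corner points before reformatting the image volumes.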

  8. The production and uses of Beauveria bassiana as a microbial insecticide.

    PubMed

    Mascarin, Gabriel Moura; Jaronski, Stefan T

    2016-11-01

    Among invertebrate fungal pathogens, Beauveria bassiana has assumed a key role in management of numerous arthropod agricultural, veterinary and forestry pests. Beauveria is typically deployed in one or more inundative applications of large numbers of aerial conidia in dry or liquid formulations, in a chemical paradigm. Mass production is mainly practiced by solid-state fermentation to yield hydrophobic aerial conidia, which remain the principal active ingredient of mycoinsecticides. More robust and cost-effective fermentation and formulation downstream platforms are imperative for its overall commercialization by industry. Hence, where economics allow, submerged liquid fermentation provides an alternative method to produce effective and stable propagules that can be easily formulated as dry, stable preparations. Formulation also continues to be a bottleneck in the development of stable and effective commercial Beauveria-mycoinsecticides in many countries, although good commercial formulations do exist. Future research on improving fermentation and formulation technologies coupled with the selection of multi-stress tolerant and virulent strains is needed to catalyze the widespread acceptance and usefulness of this fungus as a cost-effective mycoinsecticide. The role of Beauveria as one tool among many in integrated pest management, rather than a stand-alone management approach, needs to be better developed across the range of crop systems. Here, we provide an overview of mass-production and formulation strategies, an updated list of registered commercial products, major biocontrol programs, and ecological aspects affecting the use of Beauveria as a mycoinsecticide.

  9. Establishing an In-House Wind Maintenance Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2011-12-01

    An update to the 2008 guidebook "Establishing an In-House Wind Maintenance Program," which was developed to support utilities in developing O&M strategies. This update includes significant contributions from utilities and other stakeholders around the country, representing all perspectives, regardless of whether they own wind turbines or projects.

  10. Professional Development for Adult Education Instructors. State Policy Update.

    ERIC Educational Resources Information Center

    Tolbert, Michelle

    This State Policy Update provides background on professional development (PD) in adult education. Section 2 describes survey methods used to document how states funded and designed their PD systems. Section 3 reviews data collected by the survey of state PD systems, highlighting PD activities in Kentucky, New York, Oregon, and Tennessee. It…

  11. 1999 Leak Detection and Monitoring and Mitigation Strategy Update

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    OHL, P.C.

    This document is a complete revision of WHC-SD-WM-ES-378, Rev 1. This update includes recent developments in Leak Detection, Leak Monitoring, and Leak Mitigation technologies, as well as recent developments in single-shell tank retrieval technologies. In addition, a single-shell tank retrieval release protection strategy is presented.

  12. Update on the development of cotton gin PM10 emission factors for EPA's AP-42

    USDA-ARS?s Scientific Manuscript database

    A cotton ginning industry-supported project was initiated in 2008 to update the U.S. Environmental Protection Agency’s (EPA) Compilation of Air Pollution Emission Factors (AP-42) to include PM10 emission factors. This study develops emission factors from the PM10 emission factor data collected from ...

  13. PASMet: a web-based platform for prediction, modelling and analyses of metabolic systems

    PubMed Central

    Sriyudthsak, Kansuporn; Mejia, Ramon Francisco; Arita, Masanori; Hirai, Masami Yokota

    2016-01-01

    PASMet (Prediction, Analysis and Simulation of Metabolic networks) is a web-based platform for proposing and verifying mathematical models to understand the dynamics of metabolism. The advantages of PASMet include user-friendliness and accessibility, which enable biologists and biochemists to easily perform mathematical modelling. PASMet offers a series of user-functions to handle the time-series data of metabolite concentrations. The functions are organised into four steps: (i) Prediction of a probable metabolic pathway and its regulation; (ii) Construction of mathematical models; (iii) Simulation of metabolic behaviours; and (iv) Analysis of metabolic system characteristics. Each function contains various statistical and mathematical methods that can be used independently. Users who may not have enough knowledge of computing or programming can easily and quickly analyse their local data without software downloads, updates or installations. Users only need to upload their files in comma-separated values (CSV) format or enter their model equations directly into the website. Once the time-series data or mathematical equations are uploaded, PASMet automatically performs the computation on the server side. Then, users can interactively view their results and directly download them to their local computers. PASMet is freely available with no login requirement at http://pasmet.riken.jp/ from major web browsers on Windows, Mac and Linux operating systems. PMID:27174940
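The simulation step (iii) in a platform like this boils down to numerically integrating the model's rate equations. A toy forward-Euler sketch for a two-step mass-action pathway A -> B -> C (the rate laws, parameters, and function name are illustrative, not PASMet's own methods):

```python
def simulate_pathway(k1, k2, a0, dt=0.001, t_end=5.0):
    """Forward-Euler integration of the mass-action pathway
    A -(k1)-> B -(k2)-> C, starting from concentration a0 of A.
    Returns the final concentrations (a, b, c)."""
    a, b, c = a0, 0.0, 0.0
    for _ in range(int(t_end / dt)):
        da = -k1 * a            # A is consumed
        db = k1 * a - k2 * b    # B is produced from A, consumed into C
        dc = k2 * b             # C accumulates
        a += da * dt
        b += db * dt
        c += dc * dt
    return a, b, c
```

Because the rates sum to zero, total mass a + b + c is conserved throughout the simulation, which is a useful sanity check on any such kinetic model.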

  14. Discriminative clustering on manifold for adaptive transductive classification.

    PubMed

    Zhang, Zhao; Jia, Lei; Zhang, Min; Li, Bing; Zhang, Li; Li, Fanzhang

    2017-10-01

    In this paper, we mainly propose a novel adaptive transductive label propagation approach by joint discriminative clustering on manifolds for representing and classifying high-dimensional data. Our framework seamlessly combines the unsupervised manifold learning, discriminative clustering and adaptive classification into a unified model. Also, our method incorporates the adaptive graph weight construction with label propagation. Specifically, our method is capable of propagating label information using adaptive weights over low-dimensional manifold features, which is different from most existing studies that usually predict the labels and construct the weights in the original Euclidean space. For transductive classification by our formulation, we first perform the joint discriminative K-means clustering and manifold learning to capture the low-dimensional nonlinear manifolds. Then, we construct the adaptive weights over the learnt manifold features, where the adaptive weights are calculated through performing the joint minimization of the reconstruction errors over features and soft labels so that the graph weights can be joint-optimal for data representation and classification. Using the adaptive weights, we can easily estimate the unknown labels of samples. After that, our method returns the updated weights for further updating the manifold features. Extensive simulations on image classification and segmentation show that our proposed algorithm can deliver the state-of-the-art performance on several public datasets. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Assembly: a resource for assembled genomes at NCBI

    PubMed Central

    Kitts, Paul A.; Church, Deanna M.; Thibaud-Nissen, Françoise; Choi, Jinna; Hem, Vichet; Sapojnikov, Victor; Smith, Robert G.; Tatusova, Tatiana; Xiang, Charlie; Zherikov, Andrey; DiCuccio, Michael; Murphy, Terence D.; Pruitt, Kim D.; Kimchi, Avi

    2016-01-01

    The NCBI Assembly database (www.ncbi.nlm.nih.gov/assembly/) provides stable accessioning and data tracking for genome assembly data. The model underlying the database can accommodate a range of assembly structures, including sets of unordered contig or scaffold sequences, bacterial genomes consisting of a single complete chromosome, or complex structures such as a human genome with modeled allelic variation. The database provides an assembly accession and version to unambiguously identify the set of sequences that make up a particular version of an assembly, and tracks changes to updated genome assemblies. The Assembly database reports metadata such as assembly names, simple statistical reports of the assembly (number of contigs and scaffolds, contiguity metrics such as contig N50, total sequence length and total gap length) as well as the assembly update history. The Assembly database also tracks the relationship between an assembly submitted to the International Nucleotide Sequence Database Consortium (INSDC) and the assembly represented in the NCBI RefSeq project. Users can find assemblies of interest by querying the Assembly Resource directly or by browsing available assemblies for a particular organism. Links in the Assembly Resource allow users to easily download sequence and annotations for current versions of genome assemblies from the NCBI genomes FTP site. PMID:26578580
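The contig N50 contiguity metric reported above has a simple, standard computation: sort contig lengths in decreasing order and find the length at which the running total first covers half the assembly. A sketch:

```python
def contig_n50(lengths):
    """Contig N50: the length L such that contigs of length >= L together
    cover at least half of the total assembly length."""
    total = sum(lengths)
    running = 0
    for length in sorted(lengths, reverse=True):
        running += length
        if running * 2 >= total:
            return length
    return 0  # empty assembly
```

For example, an assembly with contigs of 80, 70, and 50 bp (total 200 bp) has N50 = 70, since 80 + 70 is the first cumulative sum reaching 100 bp.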

  16. Updated French guidelines for diagnosis and management of pelvic inflammatory disease.

    PubMed

    Brun, Jean-Luc; Graesslin, Olivier; Fauconnier, Arnaud; Verdon, Renaud; Agostini, Aubert; Bourret, Antoine; Derniaux, Emilie; Garbin, Olivier; Huchon, Cyrille; Lamy, Catherine; Quentin, Roland; Judlin, Philippe

    2016-08-01

    Pelvic inflammatory disease (PID) is commonly encountered in clinical practice. To provide up-to-date guidelines on management of PID. An initial search of the Cochrane database, PubMed, and Embase was performed using keywords related to PID to identify reports in any language published between January 1990 and January 2012, with an update in May 2015. All identified reports relevant to the areas of focus were included. A level of evidence based on the quality of the data available was applied for each area of focus and used for the guidelines. PID must be suspected when spontaneous pelvic pain is associated with induced adnexal or uterine pain (grade C). Pelvic ultrasonography is necessary to exclude tubo-ovarian abscess (grade B). Microbiological diagnosis requires vaginal and endocervical sampling for molecular and bacteriological analysis (grade B). First-line treatment for uncomplicated PID combines ofloxacin and metronidazole for 14 days (grade B). Treatment of tubo-ovarian abscess is based on drainage if the collection measures more than 3 cm (grade B), with combined ceftriaxone, metronidazole, and doxycycline for 14-21 days. Current management of PID requires easily reproducible investigations and treatment, and thus can be applied worldwide. Copyright © 2016 International Federation of Gynecology and Obstetrics. Published by Elsevier Ireland Ltd. All rights reserved.

  17. Quality of Web Information About Palliative Care on Websites from the United States and Japan: Comparative Evaluation Study

    PubMed Central

    Tanabe, Kouichi; Fujiwara, Kaho; Ogura, Hana; Yasuda, Hatsuna; Goto, Nobuyuki

    2018-01-01

    Background Patients and their families are able to obtain information about palliative care from websites easily nowadays. However, there are concerns on the accuracy of information on the Web and how up to date it is. Objective The objective of this study was to elucidate problematic points of medical information about palliative care obtained from websites, and to compare the quality of the information between Japanese and US websites. Methods We searched Google Japan and Google USA for websites relating to palliative care. We then evaluated the top 50 websites from each search using the DISCERN and LIDA instruments. Results We found that Japanese websites were given a lower evaluation of reliability than US websites. In 3 LIDA instrument subcategories—engagability (P<.001), currency (P=.001), and content production procedure (P<.001)—US websites scored significantly higher and had large effect sizes. Conclusions Our results suggest that Japanese websites have problems with the frequency with which they are updated, their update procedures and policies, and the scrutiny process the evidence must undergo. Additionally, there was a weak association between search ranking and reliability, and simultaneously we found that reliability could not be assessed by search ranking alone. PMID:29615388

  18. ATM over hybrid fiber-coaxial cable networks: practical issues in deploying residential ATM services

    NASA Astrophysics Data System (ADS)

    Laubach, Mark

    1996-11-01

    Residential broadband access network technology based on asynchronous transfer mode (ATM) will soon reach commercial availability. The capabilities provided by ATM access networks promise integrated-services bandwidth well in excess of that provided by traditional twisted-pair copper public telephone networks. Bringing ATM to the side of the home places quality-of-service capability closest to the subscriber, allowing immediate support for Internet services and traditional voice telephony. Other services, such as desktop video teleconferencing and enhanced server-based application support, can be added as part of the future evolution of the network. Additionally, advanced subscriber home networks can be supported easily. This paper presents an updated summary of the standardization efforts for the ATM-over-HFC definition work currently taking place in the ATM Forum's residential broadband working group, and of the standards progress in the IEEE 802.14 cable TV media access control and physical protocol working group. This update is fundamental for establishing the foundation for delivering ATM-based integrated services via a cable TV network. An economic model for deploying multi-tiered services is presented, showing that a single-tier service is insufficient for a viable cable operator business. Finally, an ATM-based system lends itself well to various deployment scenarios for synchronous optical networks (SONET).

  19. Energy Savings Potential and RD&D Opportunities for Commercial Building Appliances (2015 Update)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goetzler, William; Guernsey, Matt; Foley, Kevin

    The Department of Energy commissioned a technology characterization and assessment of appliances used in commercial buildings for cooking, cleaning, water heating, and other end-uses. The primary objectives of this study were to document the energy consumed by commercial appliances and identify research, development, and demonstration opportunities to improve energy efficiency in each end-use. This report serves as an update to a 2009 report of the same name by incorporating updated data and sources where possible and updating the available technology options that provide opportunities for efficiency improvements.

  20. Static and Dynamic Model Update of an Inflatable/Rigidizable Torus Structure

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Reaves, Mercedes C.

    2006-01-01

    The present work addresses the development of an experimental and computational procedure for validating finite element models. A torus structure, part of an inflatable/rigidizable Hexapod, is used to demonstrate the approach. Because of fabrication, materials, and geometric uncertainties, a statistical approach combined with optimization is used to modify key model parameters. Static test results are used to update stiffness parameters, and dynamic test results are used to update the mass distribution. Updated parameters are computed using gradient and non-gradient based optimization algorithms. Results show significant improvements in model predictions after parameters are updated. Lessons learned in the areas of test procedures, modeling approaches, and uncertainty quantification are presented.
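The static-test parameter update can be illustrated in its simplest one-parameter form: a least-squares fit of a stiffness k to measured force-displacement pairs under the linear model F = k * x. This is only the scalar analogue of the multi-parameter optimization described above (the function name and data are illustrative):

```python
def update_stiffness(forces, displacements):
    """Least-squares update of a single stiffness parameter k for the
    linear static model F = k * x, given paired test measurements.
    The closed-form minimizer is k = sum(F_i * x_i) / sum(x_i^2)."""
    num = sum(f * x for f, x in zip(forces, displacements))
    den = sum(x * x for x in displacements)
    return num / den
```

In a full finite element update, k would be replaced by a vector of uncertain model parameters and the closed form by an iterative optimizer, but the principle of minimizing the test/prediction discrepancy is the same.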

  1. LIPS database with LIPService: a microscopic image database of intracellular structures in Arabidopsis guard cells.

    PubMed

    Higaki, Takumi; Kutsuna, Natsumaro; Hasezawa, Seiichiro

    2013-05-16

    Intracellular configuration is an important feature of cell status. Recent advances in microscopic imaging techniques allow us to easily obtain a large number of microscopic images of intracellular structures. In this circumstance, automated microscopic image recognition techniques are of extreme importance to future phenomics/visible screening approaches. However, there was no benchmark microscopic image dataset for intracellular organelles in a specified plant cell type. We previously established the Live Images of Plant Stomata (LIPS) database, a publicly available collection of optical-section images of various intracellular structures of plant guard cells, as a model system of environmental signal perception and transduction. Here we report recent updates to the LIPS database and the establishment of a database table, LIPService. We updated the LIPS dataset and established a new interface named LIPService to promote efficient inspection of intracellular structure configurations. Cell nuclei, microtubules, actin microfilaments, mitochondria, chloroplasts, endoplasmic reticulum, peroxisomes, endosomes, Golgi bodies, and vacuoles can be filtered using probe names or morphometric parameters such as stomatal aperture. In addition to the serial optical sectional images of the original LIPS database, new volume-rendering data for easy web browsing of three-dimensional intracellular structures have been released to allow easy inspection of their configurations or relationships with cell status/morphology. We also demonstrated the utility of the new LIPS image database for automated organelle recognition of images from another plant cell image database with image clustering analyses. The updated LIPS database provides a benchmark image dataset for representative intracellular structures in Arabidopsis guard cells. The newly released LIPService allows users to inspect the relationship between organellar three-dimensional configurations and morphometrical parameters.

  2. A technique for routinely updating the ITU-R database using radio occultation electron density profiles

    NASA Astrophysics Data System (ADS)

    Brunini, Claudio; Azpilicueta, Francisco; Nava, Bruno

    2013-09-01

    Well-credited and widely used ionospheric models, such as the International Reference Ionosphere (IRI) or NeQuick, describe the variation of the electron density with height by means of a piecewise profile tied to the F2-peak parameters: the electron density, NmF2, and the height, hmF2. Accurate values of these parameters are crucial for retrieving reliable electron density estimations from those models. When direct measurements of these parameters are not available, the models compute the parameters using the so-called ITU-R database, which was established in the early 1960s. This paper presents a technique aimed at routinely updating the ITU-R database using radio occultation electron density profiles derived from GPS measurements gathered from low Earth orbit satellites. Before being used, these radio occultation profiles are validated by fitting an electron density model to them. A re-weighted Least Squares algorithm is used for down-weighting unreliable measurements (occasionally, entire profiles) and for retrieving NmF2 and hmF2 values—together with their error estimates—from the profiles. These values are used to update the database monthly; the database consists of two sets of ITU-R-like coefficients that could easily be implemented in the IRI or NeQuick models. The technique was tested with radio occultation electron density profiles that are delivered to the community by the COSMIC/FORMOSAT-3 mission team. Tests were performed for solstice and equinox seasons in high and low solar activity conditions. The global mean error of the resulting maps—estimated by the Least Squares technique—corresponds to about 7% of the estimated value for the F2-peak electron density, and to between 2.0 and 5.6 km (2%) for the height.
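The re-weighted Least Squares down-weighting can be sketched in its simplest setting, a single location parameter: weighting each residual by 1/|r| drives the estimate toward the median and progressively suppresses outliers. This one-parameter sketch only illustrates the down-weighting idea; the actual algorithm fits a full electron density model to each profile:

```python
def irls_mean(values, n_iter=20, eps=1e-6):
    """Iteratively re-weighted least squares for a location parameter:
    each observation gets weight 1 / max(|residual|, eps), so points far
    from the current estimate are down-weighted. With this weighting the
    iteration converges toward the L1 minimizer (the median)."""
    mu = sum(values) / len(values)          # start from the plain mean
    for _ in range(n_iter):
        w = [1.0 / max(abs(v - mu), eps) for v in values]
        mu = sum(wi * v for wi, v in zip(w, values)) / sum(w)
    return mu
```

With one gross outlier in the data, the plain mean is pulled far off while the re-weighted estimate stays near the bulk of the measurements.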

  3. Tanning lamps: health effects and reclassification by the Food and Drug Administration.

    PubMed

    Ernst, Alexander; Grimm, Amanda; Lim, Henry W

    2015-01-01

    Tanning lamps have long been considered a class I medical device under regulation by the Food and Drug Administration (FDA). A growing body of research has repeatedly documented the association between elective indoor tanning and several negative health consequences. These accepted findings have prompted action by the FDA to officially reclassify tanning lamps as a class II medical device. The main purpose of this review is to update practitioners on the current state of tanning lamp classification and highlight the practical implications of this recent change. This information can be used by clinicians to easily reference this important action, and empower patients with a better understanding of the risks associated with indoor tanning. Copyright © 2014 American Academy of Dermatology, Inc. Published by Elsevier Inc. All rights reserved.

  4. The Small Aircraft Transportation System Project: An Update

    NASA Technical Reports Server (NTRS)

    Kemmerly, Guy T.

    2006-01-01

    To all peoples in all parts of the world throughout history, the ability to move about easily is a fundamental element of freedom. The American people have charged NASA to increase their freedom and that of their children, knowing that their quality of life will improve as our nation's transportation systems improve. In pursuit of this safe, reliable, and affordable personalized air transportation option, NASA established the Small Aircraft Transportation System (SATS) Project in 2000. As the name suggests, personalized air transportation would be built on smaller aircraft than those used by the airlines. Of course, smaller aircraft can operate from smaller airports, and 96% of the American population is within thirty miles of a high-quality, underutilized community airport, as are the vast majority of their customers, family members, and favorite vacation destinations.

  5. Phenotypic Identification of Actinomyces and Related Species Isolated from Human Sources

    PubMed Central

    Sarkonen, Nanna; Könönen, Eija; Summanen, Paula; Könönen, Mauno; Jousimies-Somer, Hannele

    2001-01-01

    Recent advancements in chemotaxonomic and molecular biology-based identification methods have clarified the taxonomy of the genus Actinomyces and have led to the recognition of several new Actinomyces and related species. Actinomyces-like gram-positive rods have increasingly been isolated from various clinical specimens. Thus, an easily accessible scheme for reliable differentiation at the species level is needed in clinical and oral microbiology laboratories, where bacterial identification is mainly based on conventional biochemical methods. In the present study we designed a two-step protocol that consists of a flowchart that describes rapid, cost-efficient tests for preliminary identification of Actinomyces and closely related species and an updated more comprehensive scheme that also uses fermentation reactions for accurate differentiation of Actinomyces and closely related species. PMID:11682514

  6. No programming required. Mobile PCs can help physicians work more efficiently, especially when the application is designed to fit the practice.

    PubMed

    Campbell, J

    2000-09-01

    The Jacobson Medical Group (JMG) of San Antonio needed a way to effectively and efficiently coordinate referral information between its hospitalist physicians and specialists. JMG decided to replace paper-based binders with something more convenient and easily updated. The organization chose to implement a mobile solution that would provide its physicians with convenient access to a database of information via a hand-held computer. The hand-held solution provides physicians with full demographic profiles of primary caregivers for each area where the group operates. The database includes multiple profiles based on different healthcare plans, along with details about preferred and authorized specialists. JMG adopted a user-friendly solution that the hospitalists and specialists would embrace and actually use.

  7. Sparse nonnegative matrix factorization with ℓ0-constraints

    PubMed Central

    Peharz, Robert; Pernkopf, Franz

    2012-01-01

    Although nonnegative matrix factorization (NMF) favors a sparse and part-based representation of nonnegative data, there is no guarantee for this behavior. Several authors proposed NMF methods which enforce sparseness by constraining or penalizing the ℓ1-norm of the factor matrices. On the other hand, little work has been done using a more natural sparseness measure, the ℓ0-pseudo-norm. In this paper, we propose a framework for approximate NMF which constrains the ℓ0-norm of the basis matrix, or the coefficient matrix, respectively. For this purpose, techniques for unconstrained NMF can be easily incorporated, such as multiplicative update rules, or the alternating nonnegative least-squares scheme. In experiments we demonstrate the benefits of our methods, which compare favorably to, or outperform, existing approaches. PMID:22505792
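One way to combine multiplicative updates with an ℓ0 constraint is "update, then project": after each pair of standard multiplicative updates, every column of the coefficient matrix H is projected onto its k largest entries, with the rest set to zero. The following is a simplified sketch of that idea under those assumptions, not the paper's exact algorithm (all names are illustrative):

```python
def matmul(a, b):
    """Dense matrix product for lists of lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def transpose(a):
    return [list(row) for row in zip(*a)]

def nmf_l0(v, w0, h0, k_nonzero, n_iter=100, eps=1e-9):
    """Approximate NMF (V ~ W H) with an l0-style constraint on H:
    after each pair of standard multiplicative updates, each column of H
    keeps only its k_nonzero largest entries (ties may keep more)."""
    w = [row[:] for row in w0]
    h = [row[:] for row in h0]
    rank, n = len(h), len(h[0])
    for _ in range(n_iter):
        # H <- H * (W^T V) / (W^T W H), elementwise.
        wt = transpose(w)
        num, den = matmul(wt, v), matmul(matmul(wt, w), h)
        h = [[h[i][j] * num[i][j] / (den[i][j] + eps) for j in range(n)]
             for i in range(rank)]
        # l0 projection: zero all entries below the k-th largest per column.
        for j in range(n):
            thresh = sorted((h[i][j] for i in range(rank)),
                            reverse=True)[k_nonzero - 1]
            for i in range(rank):
                if h[i][j] < thresh:
                    h[i][j] = 0.0
        # W <- W * (V H^T) / (W H H^T), elementwise.
        ht = transpose(h)
        num, den = matmul(v, ht), matmul(w, matmul(h, ht))
        w = [[w[i][j] * num[i][j] / (den[i][j] + eps) for j in range(rank)]
             for i in range(len(v))]
    return w, h
```

Because multiplicative updates preserve zeros, entries removed by the projection stay zero in later iterations, so the column sparsity constraint holds at convergence.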

  8. The versatile subepithelial connective tissue graft: a literature update.

    PubMed

    Karthikeyan, B V; Khanna, Divya; Chowdhary, Kamedh Yashawant; Prabhuji, M Lv

    2016-01-01

    Harmony between hard and soft tissue morphologies is essential for form, function, and a good esthetic outlook. Replacement grafts for correction of soft tissue defects around the teeth have become important to periodontal plastic and implant surgical procedures. Among a multitude of surgical techniques and graft materials reported in literature, the subepithelial connective tissue graft (SCTG) has gained wide popularity and acceptance. The purpose of this article is to acquaint clinicians with the current understanding of the versatile SCTG. Key factors associated with graft harvesting as well as applications, limitations, and complications of SCTGs are discussed. This connective tissue graft has shown excellent short- and long-term stability, is easily available, and is economical to use. The SCTG should be considered as an alternative in all periodontal reconstruction surgeries.

  9. Enhancing the Remote Variable Operations in NPSS/CCDK

    NASA Technical Reports Server (NTRS)

    Sang, Janche; Follen, Gregory; Kim, Chan; Lopez, Isaac; Townsend, Scott

    2001-01-01

    Many scientific applications in aerodynamics and solid mechanics are written in Fortran. Retrofitting these legacy Fortran codes with distributed objects can increase code reusability. The remote variable scheme provided in NPSS/CCDK helps programmers easily migrate Fortran codes toward a client-server platform. This scheme gives the client the capability of accessing variables at the server site. In this paper, we review and enhance the remote variable scheme by using the operator overloading features of C++. The enhancement enables NPSS programmers to use remote variables in much the same way as traditional variables. The remote variable scheme adopts a lazy update approach and a prefetch method. The design strategies and implementation techniques are described in detail. Preliminary performance evaluation shows that communication overhead can be greatly reduced.
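    The combination described here — operator overloading plus lazy update — can be illustrated with a toy proxy object. The sketch below is a Python analogue (the actual NPSS/CCDK scheme is implemented with C++ operator overloading); the `RemoteVar` class, its methods, and the dict standing in for the server are all illustrative assumptions, not the library's API:

```python
class RemoteVar:
    """Toy analogue of a remote variable: operator overloading lets a
    client use a server-side variable like a local one; writes are
    buffered locally (lazy update) and sent to the server on flush()."""

    def __init__(self, server, name):
        self._server = server      # a dict stands in for the remote server
        self._name = name
        self._cache = None         # locally cached (prefetched) value
        self._dirty = False        # is there a pending local write?

    def get(self):
        if self._cache is None:    # fetch once, then reuse the cache
            self._cache = self._server[self._name]
        return self._cache

    def set(self, value):
        self._cache = value        # lazy update: defer the round trip
        self._dirty = True

    def flush(self):
        if self._dirty:            # one message instead of one per write
            self._server[self._name] = self._cache
            self._dirty = False

    # overloaded operators make the remote variable read like a local one
    def __add__(self, other):
        return self.get() + other

    def __iadd__(self, other):
        self.set(self.get() + other)
        return self

server = {"mach": 0.8}             # pretend server-side variable store
m = RemoteVar(server, "mach")
m += 0.1                           # buffered locally; no server traffic yet
m.flush()                          # now the server sees the new value
print(server["mach"])
```

    The point of the lazy update is visible in the usage: repeated `+=` operations touch only the local cache, and a single `flush()` carries the net result to the server, which is how communication overhead gets reduced.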

  10. A System for Supporting Development and Update of the International Classification of Health Interventions (ICHI).

    PubMed

    Donada, Marc; Della Mea, Vincenzo; Cumerlato, Megan; Rankin, Nicole; Madden, Richard

    2018-01-01

    The International Classification of Health Interventions (ICHI) is a member of the WHO Family of International Classifications, being developed to provide a common tool for reporting and analysing health interventions for statistical purposes. A web-based platform for classification development and update has been specifically developed to support the initial development step and then, after final approval, the continuous revision and update of the classification. The platform provides features for classification editing, versioning, comment management and URI identifiers. During the last 12 months it has been used for developing the ICHI Beta version, replacing the previous process based on the exchange of Excel files. As of November 2017, 90 users had provided input to the development of the classification, resulting in 2913 comments and 2971 changes in the classification since June 2017. Further work includes the development of a URI API for machine-to-machine communication, following the model established for ICD-11.

  11. Annual update of data for estimating ESALs.

    DOT National Transportation Integrated Search

    2006-10-01

    A revised procedure for estimating equivalent single axle loads (ESALs) was developed in 1985. This procedure used weight, classification, and traffic volume data collected by the Transportation Cabinet's Division of Planning. Annual updates of data...

  12. DOE interpretations Guide to OSH standards. Update to the Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1994-03-31

    Reflecting Secretary O'Leary's focus on occupational safety and health, the Office of Occupational Safety is pleased to provide you with the latest update to the DOE Interpretations Guide to OSH Standards. This Guide was developed in cooperation with the Occupational Safety and Health Administration, which continued its support during this last revision by facilitating access to the interpretations found on the OSHA Computerized Information System (OCIS). This March 31, 1994 update contains 123 formal interpretation letters written by OSHA. As a result of the unique requests received by the 1-800 Response Line, this update also contains 38 interpretations developed by DOE. This new occupational safety and health information adds still more important guidance to the four-volume reference set that you presently have in your possession.

  13. Kennedy Space Center Director Update

    NASA Image and Video Library

    2014-03-06

    CAPE CANAVERAL, Fla. - Community leaders, business executives, educators, and state and local government leaders were updated on NASA Kennedy Space Center programs and accomplishments during Center Director Bob Cabana’s Center Director Update at the Debus Center at the Kennedy Space Center Visitor Complex in Florida. At left, Susan Fernandez from the Office of Senator Marco Rubio talks with another attendee near the Education display. Attendees talked with Cabana and other senior Kennedy managers and visited displays featuring updates on Kennedy programs and projects, including International Space Station, Commercial Crew, Ground System Development and Operations, Launch Services, Center Planning and Development, Technology, KSC Swamp Works and NASA Education. The morning concluded with a tour of the new Space Shuttle Atlantis exhibit at the visitor complex. For more information, visit http://www.nasa.gov/kennedy. Photo credit: NASA/Daniel Casper

  14. Kennedy Space Center Director Update

    NASA Image and Video Library

    2014-03-06

    CAPE CANAVERAL, Fla. - Community leaders, business executives, educators, and state and local government leaders were updated on NASA Kennedy Space Center programs and accomplishments during Center Director Bob Cabana’s Center Director Update at the Debus Center at the Kennedy Space Center Visitor Complex in Florida. An attendee talks with Trent Smith, program manager, and Tammy Belk, a program specialist, at the ISS Ground Processing and Research Office display. Attendees talked with Cabana and other senior Kennedy managers and visited displays featuring updates on Kennedy programs and projects, including International Space Station, Commercial Crew, Ground System Development and Operations, Launch Services, Center Planning and Development, Technology, KSC Swamp Works and NASA Education. The morning concluded with a tour of the new Space Shuttle Atlantis exhibit at the visitor complex. For more information, visit http://www.nasa.gov/kennedy. Photo credit: NASA/Daniel Casper

  15. DOE interpretations Guide to OSH standards. Update to the Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1994-03-31

    Reflecting Secretary O'Leary's focus on occupational safety and health, the Office of Occupational Safety is pleased to provide you with the latest update to the DOE Interpretations Guide to OSH Standards. This Guide was developed in cooperation with the Occupational Safety and Health Administration, which continued its support during this last revision by facilitating access to the interpretations found on the OSHA Computerized Information System (OCIS). This March 31, 1994 update contains 123 formal interpretation letters written by OSHA. As a result of the unique requests received by the 1-800 Response Line, this update also contains 38 interpretations developed by DOE. This new occupational safety and health information adds still more important guidance to the four-volume reference set that you presently have in your possession.

  16. Aqua/Aura Updated Inclination Adjust Maneuver Performance Prediction Model

    NASA Technical Reports Server (NTRS)

    Boone, Spencer

    2017-01-01

    This presentation will discuss the updated Inclination Adjust Maneuver (IAM) performance prediction model that was developed for Aqua and Aura following the 2017 IAM series. This updated model uses statistical regression methods to identify potential long-term trends in maneuver parameters, yielding improved predictions when re-planning past maneuvers. The presentation has been reviewed and approved by Eric Moyer, ESMO Deputy Project Manager.

  17. Update on Research and Application of Problem-Based Learning in Medical Science Education

    ERIC Educational Resources Information Center

    Fan, Chuifeng; Jiang, Biying; Shi, Xiuying; Wang, Enhua; Li, Qingchang

    2018-01-01

    Problem-based learning (PBL) is a unique form of pedagogy dedicated to developing students' self-learning and clinical practice skills. After several decades of development, although applications vary, PBL has been recognized all over the world and implemented by many medical schools. This review summarizes and updates the application and study of…

  18. Update on the development of cotton gin PM2.5 emission factors for EPA's AP-42

    USDA-ARS?s Scientific Manuscript database

    A cotton ginning industry-supported project was initiated in 2008 to update the U.S. Environmental Protection Agency’s (EPA) Compilation of Air Pollution Emission Factors (AP-42) to include PM2.5 emission factors. This study develops emission factors from the PM2.5 emission factor data collected fro...

  19. Planned updates and refinements to the central valley hydrologic model, with an emphasis on improving the simulation of land subsidence in the San Joaquin Valley

    USGS Publications Warehouse

    Faunt, C.C.; Hanson, R.T.; Martin, P.; Schmid, W.

    2011-01-01

    California's Central Valley has been one of the most productive agricultural regions in the world for more than 50 years. To better understand the groundwater availability in the valley, the U.S. Geological Survey (USGS) developed the Central Valley hydrologic model (CVHM). Because of recent water-level declines and renewed subsidence, the CVHM is being updated to better simulate the geohydrologic system. The CVHM updates and refinements can be grouped into two general categories: (1) model code changes and (2) data updates. The CVHM updates and refinements will require that the model be recalibrated. The updated CVHM will provide a detailed transient analysis of changes in groundwater availability and flow paths in relation to climatic variability, urbanization, stream flow, and changes in irrigated agricultural practices and crops. The updated CVHM is particularly focused on more accurately simulating the locations and magnitudes of land subsidence. The intent of the updated CVHM is to help scientists better understand the availability and sustainability of water resources and the interaction of groundwater levels with land subsidence. © 2011 ASCE.

  20. Planned updates and refinements to the Central Valley hydrologic model with an emphasis on improving the simulation of land subsidence in the San Joaquin Valley

    USGS Publications Warehouse

    Faunt, Claudia C.; Hanson, Randall T.; Martin, Peter; Schmid, Wolfgang

    2011-01-01

    California's Central Valley has been one of the most productive agricultural regions in the world for more than 50 years. To better understand the groundwater availability in the valley, the U.S. Geological Survey (USGS) developed the Central Valley hydrologic model (CVHM). Because of recent water-level declines and renewed subsidence, the CVHM is being updated to better simulate the geohydrologic system. The CVHM updates and refinements can be grouped into two general categories: (1) model code changes and (2) data updates. The CVHM updates and refinements will require that the model be recalibrated. The updated CVHM will provide a detailed transient analysis of changes in groundwater availability and flow paths in relation to climatic variability, urbanization, stream flow, and changes in irrigated agricultural practices and crops. The updated CVHM is particularly focused on more accurately simulating the locations and magnitudes of land subsidence. The intent of the updated CVHM is to help scientists better understand the availability and sustainability of water resources and the interaction of groundwater levels with land subsidence.

  1. Updates to Enhanced Geothermal System Resource Potential Estimate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Augustine, Chad

    The deep EGS electricity generation resource potential estimate maintained by the National Renewable Energy Laboratory was updated using the most recent temperature-at-depth maps available from the Southern Methodist University Geothermal Laboratory. The previous study dates back to 2011 and was developed using the original temperature-at-depth maps showcased in the 2006 MIT Future of Geothermal Energy report. The methodology used to update the deep EGS resource potential is the same as in the previous study and is summarized in the paper. The updated deep EGS resource potential estimate was calculated for depths between 3 and 7 km and is binned in 25 degrees C increments. The updated deep EGS electricity generation resource potential estimate is 4,349 GWe. A comparison of the estimates from the previous and updated studies shows a net increase of 117 GWe in the 3-7 km depth range, due mainly to increases in the underlying temperature-at-depth estimates from the updated maps.

  2. Update to Enhanced Geothermal System Resource Potential Estimate: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Augustine, Chad

    2016-10-01

    The deep EGS electricity generation resource potential estimate maintained by the National Renewable Energy Laboratory was updated using the most recent temperature-at-depth maps available from the Southern Methodist University Geothermal Laboratory. The previous study dates back to 2011 and was developed using the original temperature-at-depth maps showcased in the 2006 MIT Future of Geothermal Energy report. The methodology used to update the deep EGS resource potential is the same as in the previous study and is summarized in the paper. The updated deep EGS resource potential estimate was calculated for depths between 3 and 7 km and is binned in 25 degrees C increments. The updated deep EGS electricity generation resource potential estimate is 4,349 GWe. A comparison of the estimates from the previous and updated studies shows a net increase of 117 GWe in the 3-7 km depth range, due mainly to increases in the underlying temperature-at-depth estimates from the updated maps.
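    The binning step of a volumetric resource assessment like the one described can be illustrated with a toy calculation: synthetic grid-cell temperatures are binned into 25 °C increments and a heat-in-place quantity is summed per bin. Every constant below (heat capacity, cell volume, rejection temperature) and the synthetic temperature field are illustrative assumptions, not values or methods from the NREL study:

```python
import numpy as np

# Assumed demo constants (not from the NREL study)
rho_c = 2.55e6       # volumetric heat capacity of rock, J/(m^3 K)
t_reject = 25.0      # rejection temperature, degC
cell_vol = 1.0e9     # toy cell volume: 1 km^3 in m^3

rng = np.random.default_rng(0)
temps = rng.uniform(150.0, 350.0, size=1000)   # synthetic cell temperatures, degC

edges = np.arange(150.0, 375.0, 25.0)          # 25 degC bin edges: 150, 175, ..., 350
heat_j = rho_c * cell_vol * (temps - t_reject) # thermal energy in place per cell
energy, _ = np.histogram(temps, bins=edges, weights=heat_j)
print(np.round(energy / 1e18, 1))              # EJ per 25 degC temperature bin
```

    Converting the binned thermal energy to electricity generation potential (GWe) would additionally require recovery factors and temperature-dependent conversion efficiencies, which are omitted here.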

  3. Annual update of data for estimating ESALs : draft.

    DOT National Transportation Integrated Search

    2008-10-01

    A revised procedure for estimating equivalent single axle loads (ESALs) was developed in 1985. This procedure used weight, classification, and traffic volume data collected by the Transportation Cabinet's Division of Planning. Annual updates of data...

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sassani, David; Price, Laura L.; Rechard, Robert P.

    This report provides an update to Sassani et al. (2016) and includes: (1) an updated set of inputs (Section 2.3) on various additional waste forms (WF) covering both DOE-managed spent nuclear fuel (SNF) and DOE-managed high-level waste (HLW) for use in the inventory represented in the geologic disposal safety analyses (GDSA); (2) summaries of evaluations initiated to refine specific characteristics of particular WF for future use (Section 2.4); (3) updated development status of the Online Waste Library (OWL) database (Section 3.1.2) and an updated user guide to OWL (Section 3.1.3); and (4) status updates (Section 3.2) for the OWL inventory content, data entry checking process, and external OWL BETA testing initiated in fiscal year 2017.

  5. Radiative forcing of climate

    NASA Technical Reports Server (NTRS)

    Ramaswamy, V.; Shine, Keith; Leovy, Conway; Wang, Wei-Chyung; Rodhe, Henning; Wuebbles, Donald J.; Ding, M.; Lelieveld, Joseph; Edmonds, Jae A.; Mccormick, M. Patrick

    1991-01-01

    An update of the scientific discussions presented in Chapter 2 of the Intergovernmental Panel on Climate Change (IPCC) report is presented. The update discusses the atmospheric radiative and chemical species of significance for climate change. There are two major objectives of the present update. The first is an extension of the discussion on the Global Warming Potentials (GWP's), including a reevaluation in view of the updates in the lifetimes of the radiatively active species. The second important objective is to underscore major developments in the radiative forcing of climate due to the observed stratospheric ozone losses occurring between 1979 and 1990.

  6. Progress on Updating the 1961-1990 National Solar Radiation Database

    NASA Technical Reports Server (NTRS)

    Renne, D.; Wilcox, S.; Marion, B.; George, R.; Myers, D.

    2003-01-01

    The 1961-1990 National Solar Radiation Data Base (NSRDB) provides a 30-year climate summary and solar characterization of 239 locations throughout the United States. Over the past several years, the National Renewable Energy Laboratory (NREL) has received numerous inquiries from a range of constituents as to whether an update of the database to include the 1990s will be developed. However, there are formidable challenges to creating an update of the serially complete station-specific database for the 1971-2000 period. During the 1990s, the National Weather Service changed its observational procedures from a human-based to an automated system, resulting in the loss of important input variables to the model used to complete the 1961-1990 NSRDB. As a result, alternative techniques are required for an update that covers the 1990s. This paper examines several alternative approaches for creating this update and describes preliminary NREL plans for implementing the update.

  7. The Updating of Geospatial Base Data

    NASA Astrophysics Data System (ADS)

    Alrajhi, Muhamad N.; Konecny, Gottfried

    2018-04-01

    Topographic mapping issues concern area coverage at different scales and the age of the maps. The age of a map is determined by the system of updating. The United Nations initiative on Global Geospatial Information Management (UN-GGIM) has attempted to track the global map coverage at various scale ranges, which has greatly improved in recent decades. However, the poor state of updating of base maps is still a global problem. In Saudi Arabia, large-scale mapping is carried out for all urban, suburban and rural areas by aerial surveys. Updating is carried out by remapping every 5 to 10 years. Due to rapid urban development this is not satisfactory, but faster update methods are foreseen through the use of high-resolution satellite imagery and improved object-oriented geodatabase structures, which will permit various survey technologies to be used to update the photogrammetrically established geodatabases. The long-term goal is to create a geodata infrastructure such as exists in Great Britain or Germany.

  8. Update on Supersonic Jet Noise Research at NASA

    NASA Technical Reports Server (NTRS)

    Henderson, Brenda

    2010-01-01

    An update on jet noise research conducted in the Fundamental Aeronautics and Integrated Systems Research Programs was presented. Highlighted research projects included those focused on the development of prediction tools, diagnostic tools, and noise reduction concepts.

  9. Transportation infrastructure : highway pavement design guide is outdated

    DOT National Transportation Integrated Search

    1997-11-01

    The Federal Highway Administration has worked cooperatively with the American Association of State Highway and Transportation Officials in developing and updating the pavement design guide. The current guide is slated to be updated by the year ...

  10. Motor carrier industry profile : an update 2004-2005.

    DOT National Transportation Integrated Search

    2005-08-01

    This report updates the Motor Carrier Industry Profile: 2001-2003 and Stock Market Performance of Publicly Traded Trucking Sector Stocks by Industry Segment, 2000-2004 to reflect more recent developments of particular significance to the industry a...

  11. MOOCs: Massive Open Online Courses. An Update of EUA's First Paper (January 2013). EUA Occasional Papers

    ERIC Educational Resources Information Center

    Gaebel, Michael

    2014-01-01

    With the rapid development of Massive Open Online Courses (MOOCs), the European University Association (EUA) published an occasional paper in January 2013 on MOOCs for discussion at the EUA Council, and for information for EUA membership. The present paper aims to provide an update on these developments, particularly as they concern European higher…

  12. Development of interim oak assessment guidelines for the silvah decision-support system

    Treesearch

    Patrick H. Brose

    2007-01-01

    Updates to the SILVAH decision-support system make it more applicable to the mixed oak forests of Pennsylvania and other mid-Atlantic states. This update required establishing interim inventory guidelines for assessing the competitive ability of advance oak regeneration. This assessment was complicated by oak’s growth strategy, emphasizing root development in lieu of...

  13. Specifications of Standards in Systems and Synthetic Biology.

    PubMed

    Schreiber, Falk; Bader, Gary D; Golebiewski, Martin; Hucka, Michael; Kormeier, Benjamin; Le Novère, Nicolas; Myers, Chris; Nickerson, David; Sommer, Björn; Waltemath, Dagmar; Weise, Stephan

    2015-09-04

    Standards shape our everyday life. From nuts and bolts to electronic devices and technological processes, standardised products and processes are all around us. Standards have technological and economic benefits, such as making information exchange, production, and services more efficient. However, novel, innovative areas often either lack proper standards, or documents about standards in these areas are not available from a centralised platform or formal body (such as the International Standardisation Organisation). Systems and synthetic biology is a relatively novel area, and it is only in the last decade that the standardisation of data, information, and models related to systems and synthetic biology has become a community-wide effort. Several open standards have been established and are under continuous development as a community initiative. COMBINE, the 'COmputational Modeling in BIology' NEtwork, has been established as an umbrella initiative to coordinate and promote the development of the various community standards and formats for computational models. There are two yearly meetings: HARMONY (Hackathons on Resources for Modeling in Biology), hackathon-type meetings with a focus on developing support for the standards, and COMBINE forums, workshop-style events with oral presentations, discussions, posters, and breakout sessions for further developing the standards. For more information see http://co.mbine.org/. So far, the different standards have been published and made accessible through the standards' web pages or preprint services. The aim of this special issue is to provide a single, easily accessible and citable platform for the publication of standards in systems and synthetic biology. This special issue is intended to serve as a central access point to standards and related initiatives in systems and synthetic biology; it will be published annually to provide an opportunity for standard development groups to communicate updated specifications.

  14. Strong Motion Seismograph Based On MEMS Accelerometer

    NASA Astrophysics Data System (ADS)

    Teng, Y.; Hu, X.

    2013-12-01

    The MEMS strong-motion seismograph we developed uses a modular design for its software and hardware, so it can fit various needs in different application situations. The hardware consists of a MEMS accelerometer, a control processor system, a data-storage system, a wired real-time data transmission system over an IP network, a wireless data transmission module using 3G broadband, a GPS calibration module, and a power supply built around a large-volume lithium battery. The sensor is a three-axis, 14-bit high-resolution MEMS accelerometer with digital output. Its noise level is about 99 μg/√Hz, its full scale is dynamically selectable from ±2 g to ±8 g, and its output data rate ranges from 1.56 Hz to 800 Hz. Its maximum current consumption is merely 165 μA, and the device is small enough to come in a 3 mm × 3 mm × 1 mm QFN package. Furthermore, both low-pass-filtered and high-pass-filtered data are accessible, which minimizes the data analysis required for earthquake signal detection, so data post-processing can be simplified. The control system uses a 32-bit, low-power embedded ARM9 processor (S3C2440) based on the Linux operating system, clocked at 400 MHz, with 64 MB of SDRAM main memory and 256 MB of flash memory; an external high-capacity SD card can easily be added. The system thus meets the requirements for data acquisition, processing, transmission, and storage. Both wired and wireless networks support remote real-time monitoring, data transmission, system maintenance, status monitoring, and software updates. Linux was embedded and a multi-layer design concept was used: the middle layer contains the sensor hardware driver, data acquisition, and earthquake triggering code. The hardware driver consists of an I2C-bus interface driver, an IO driver, and an asynchronous notification driver.
The application layer mainly comprises an earthquake-parameter module, a local database management module, a data transmission module, remote monitoring, an FTP service, and so on, and uses multi-threaded processing. The whole strong-motion seismograph is encapsulated in a small aluminum box measuring 80 mm × 120 mm × 55 mm, and the internal battery can run continuously for more than 24 hours. The instrument has a remote software update function and can meet the following needs: (a) automatically picking earthquake events and saving the data in event files and hourly files, for monitoring strong earthquakes, explosions, and bridge and building health; (b) automatically calculating earthquake parameters and transferring them over the 3G wireless broadband network. This kind of seismograph is low cost and easy to install; units can be concentrated in urban regions or areas needing special care, and a ground-motion-parameter quick-report sensor network can be set up so that when a large earthquake occurs, a high-resolution shake map can easily be produced for emergency rescue. (c) By loading P-wave detection program modules, it can be used for earthquake early warning for large earthquakes. (d) It can easily form a high-density seismic monitoring network with remote control and modern intelligent earthquake sensors.
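    Automatic event picking of the kind described above is commonly done with an STA/LTA (short-term average over long-term average) trigger. The sketch below is a generic illustration of that technique applied to a synthetic accelerometer trace, not the instrument's actual firmware; the sampling rate, window lengths, and threshold are arbitrary assumptions:

```python
import numpy as np

def causal_mean(e, n):
    """Trailing moving average of e over n samples (shorter at the start)."""
    c = np.cumsum(np.insert(e, 0, 0.0))
    out = np.empty_like(e)
    out[n - 1:] = (c[n:] - c[:-n]) / n
    out[:n - 1] = c[1:n] / np.arange(1, n)
    return out

def sta_lta_trigger(x, fs, sta_win=0.5, lta_win=10.0, thresh=4.0):
    """Generic STA/LTA event picker: trigger when the short-term average
    signal energy exceeds `thresh` times the long-term average."""
    e = x ** 2
    ns, nl = int(sta_win * fs), int(lta_win * fs)
    ratio = causal_mean(e, ns) / np.maximum(causal_mean(e, nl), 1e-20)
    hits = np.flatnonzero(ratio[nl:] > thresh) + nl   # skip LTA warm-up
    return hits[0] / fs if hits.size else None

fs = 200.0                                  # Hz, within the sensor's 1.56-800 Hz range
t = np.arange(0.0, 30.0, 1.0 / fs)
rng = np.random.default_rng(0)
x = 0.001 * rng.standard_normal(t.size)     # background noise (g)
x[t >= 20.0] += 0.05 * np.sin(2 * np.pi * 5.0 * t[t >= 20.0])  # synthetic event at t = 20 s
onset = sta_lta_trigger(x, fs)              # picks the onset shortly after t = 20 s
print(onset)
```

    The trailing (causal) windows matter: the long-term average must reflect pre-event noise only, which is what makes the ratio spike at the onset.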

  15. Developing clinical practice guidelines: reviewing, reporting, and publishing guidelines; updating guidelines; and the emerging issues of enhancing guideline implementability and accounting for comorbid conditions in guideline development

    PubMed Central

    2012-01-01

    Clinical practice guidelines are one of the foundations of efforts to improve health care. In 1999, we authored a paper about methods to develop guidelines. Since it was published, guideline development has progressed both in its methods and in its necessary procedures, and the context for guideline development has changed with the emergence of guideline clearing houses and large-scale guideline production organisations (such as the UK National Institute for Health and Clinical Excellence). It therefore seems timely to update and extend our earlier paper in a series of three articles. In this third paper we discuss the issues of reviewing, reporting, and publishing guidelines; updating guidelines; and the two emerging issues of enhancing guideline implementability and how guideline developers should approach patients who will be the subject of guidelines but who have comorbid conditions. PMID:22762242

  16. Magnetic Moments in the Past: developing archaeomagnetic dating in the UK

    NASA Astrophysics Data System (ADS)

    Outram, Zoe; Batt, Catherine M.; Linford, Paul

    2010-05-01

    Magnetic studies of archaeological materials have a long history of development in the UK and the data produced by these studies is a key component of global models of the geomagnetic field. However, archaeomagnetic dating is not a widely used dating technique in UK archaeology, despite the potential to produce archaeologically significant information that directly relates to human activity. This often means that opportunities to improve our understanding of the past geomagnetic field are lost, because archaeologists are unaware of the potential of the method. This presentation discusses a project by the University of Bradford, UK and English Heritage to demonstrate and communicate the potential of archaeomagnetic dating of archaeological materials for routine use within the UK. The aims of the project were achieved through the production of a website and a database for all current and past archaeomagnetic studies carried out in the UK. The website provides archaeologists with the information required to consider the use of archaeomagnetic dating; including a general introduction to the technique, the features that can be sampled, the precision that can be expected from the dates and how much it costs. In addition, all archaeomagnetic studies carried out in the UK have been collated into a database, allowing similar studies to be identified on the basis of the location of the sites, the archaeological period and type of feature sampled. This clearly demonstrates how effective archaeomagnetic dating has been in different archaeological situations. The locations of the sites have been mapped using Google Earth so that studies carried out in a particular region, or from a specific time period can be easily identified. The database supports the continued development of archaeomagnetic dating in the UK, as the data required to construct the secular variation curves can be extracted easily. 
This allows the curves to be regularly updated as new magnetic measurements are produced. The information collated within the database will also be added to global databases such as MagIC, contributing to the improvement of the global models of the geomagnetic field. This project demonstrates the benefits that the presentation of clear, accessible information and increased communication with archaeologists can have on the study of the geomagnetic field. It is also hoped that similar approaches will be introduced on a wider geographical scale in the future.

  17. On Using Exponential Parameter Estimators with an Adaptive Controller

    NASA Technical Reports Server (NTRS)

    Patre, Parag; Joshi, Suresh M.

    2011-01-01

    Typical adaptive controllers are restricted to using a specific update law to generate parameter estimates. This paper investigates the possibility of using any exponential parameter estimator with an adaptive controller such that the system tracks a desired trajectory. The goal is to provide flexibility in choosing any update law suitable for a given application. The development relies on a previously developed concept of controller/update law modularity in the adaptive control literature, and the use of a converse Lyapunov-like theorem. Stability analysis is presented to derive gain conditions under which this is possible, and inferences are made about the tracking error performance. The development is based on a class of Euler-Lagrange systems that are used to model various engineering systems including space robots and manipulators.
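    The flavor of an exponentially convergent parameter estimator can be conveyed with a scalar toy example. The sketch below discretizes a standard gradient update law for the model y = θx under a persistently exciting regressor; it is a generic textbook illustration, not the modular estimator/controller construction of the paper, and all numbers are arbitrary demo choices:

```python
import math

# Gradient estimator for the unknown theta in y = theta * x.
# With a persistently exciting x(t), the parameter error decays
# exponentially: d/dt(theta_err) = -gamma * x(t)^2 * theta_err.
theta = 2.5          # "unknown" true parameter (assumed for the demo)
theta_hat = 0.0      # initial estimate
gamma, dt = 5.0, 1e-3
for k in range(20000):                    # simulate 20 s with Euler steps
    t = k * dt
    x = 1.0 + 0.5 * math.sin(t)          # persistently exciting regressor
    y = theta * x                         # measured output
    err = y - theta_hat * x               # prediction error
    theta_hat += gamma * x * err * dt     # gradient update law
print(round(theta_hat, 4))
```

    Because x(t) stays bounded away from zero here, the error dynamics contract at every step; with a regressor that vanishes over intervals, convergence would only be guaranteed under a persistence-of-excitation condition.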

  18. Development and promotion of a national website to improve dissemination of information related to the prevention of mother-to-child HIV transmission (PMTCT) in Tanzania.

    PubMed

    Stephan, Gudila; Hoyt, Mary Jo; Storm, Deborah S; Shirima, Sylvia; Matiko, Charles; Matechi, Emmanuel

    2015-10-22

    Websites that address national public health issues provide an important mechanism to improve health education and services in resource-limited countries. This article describes the development, promotion and initial evaluation of a national website to increase access to information and resources about prevention of mother-to-child transmission of HIV (PMTCT) among healthcare workers and PMTCT stakeholders in Tanzania. A participatory approach, involving the Tanzania Ministry of Health and Social Welfare (MOHSW) and key PMTCT stakeholders, was used to develop and manage the online PMTCT National Resource Center (NRC), http://pmtct.or.tz/ . The website was created with content management system software that does not require advanced computer skills and facilitates content updates and site management. The PMTCT NRC hosts regularly updated PMTCT-related news, resources and publications. Website implementation, access and performance were evaluated over two years using Google Analytics data about visits, page views, downloads, bounce rates and location of visitors, supplemented by anecdotal feedback. Following its launch in July 2013, the PMTCT NRC website received a total of 28,400 visits, with 66,463 page views, over 2 years; 30 % of visits were from returning visitors. During year 1, visits increased by 80 % from the first to the second 6-month period and then declined slightly (9-11 %) but remained stable in Year 2. Monthly visits spiked by about 70 % during October 2013 and January 2014 in response to the release and promotion of revised national PMTCT guidelines and training manuals. The majority of visitors came from primarily urban areas in Tanzania (50 %) and from other African countries (16 %). By year 2, over one-third of visitors used mobile devices to access the site.
The successfully implemented PMTCT NRC website provides centralized, easily accessed information designed to address the needs of clinicians, educators and program partners in Tanzania. Ongoing involvement of the MOHSW and key stakeholders is essential to ensure the website's growth, effectiveness and sustainability. Additional efforts are needed to expand use of the PMTCT NRC throughout the country. Future evaluations should examine the role of the website in supporting implementation of national PMTCT guidelines and services in Tanzania.

  19. Magnitude Representation and Working Memory Updating in Children With Arithmetic and Reading Comprehension Disabilities.

    PubMed

    Pelegrina, Santiago; Capodieci, Agnese; Carretti, Barbara; Cornoldi, Cesare

    2015-01-01

It has been argued that children with learning disabilities (LD) encounter severe problems in working memory (WM) tasks, especially when they need to update information stored in their WM. It is not clear, however, to what extent this is due to a generally poor updating ability or to a difficulty specific to the domain to be processed. To examine this issue, two groups of children with arithmetic or reading comprehension LD and a group of typically developing children (9 to 10 years old) were assessed using two updating tasks that required them to select the smallest numbers or objects presented. The results showed that children with an arithmetic disability failed in the number updating task, but not in the object updating task. The opposite was true for the group with poor reading comprehension, whose performance was worse in the object than in the number updating task. It may be concluded that the problem of WM updating in children with LD is also due to a poor representation of the material to be updated. In addition, our findings suggest that the mental representation of the size of objects relates to the semantic representation of the objects' properties and differs from the quantitative representation of numbers. © Hammill Institute on Disabilities 2014.

  20. Updating the Vision for Marine Education.

    ERIC Educational Resources Information Center

    Klemm, E. Barbara

    1988-01-01

    Discusses the need to update the content, philosophical stance, and pedagogy of marine education to reflect recent advances in these areas. Cites some developments in oceanography and ocean engineering. Proposes ways teachers can learn about and utilize this knowledge. (RT)

  1. Guidelines and Recommendations to Accommodate Older Drivers and Pedestrians

    DOT National Transportation Integrated Search

    2001-05-01

    This project updated, revised, and expanded the scope of the "Older Driver Highway Design Handbook" published by the Federal Highway Administration (FHWA) in 1998. Development of the updated Handbook (FHWA-RD-01-103) was complemented by a technology ...

  2. Update LADOTD policy on pile driving vibration management.

    DOT National Transportation Integrated Search

    2012-02-01

    The main objective of this project was to update the current Louisiana Department of Transportation and Development (LADOTD) policy on pile driving vibration risk management with a focus on how to determine an appropriate vibration monitoring area. T...

  3. Update to a guide to standardized highway lighting pole hardware.

    DOT National Transportation Integrated Search

    2013-03-01

This report describes the development of an updated Online Guide to Luminaire Supports. The Guide is a web-based content management system for luminaire support systems that allows full viewing, submission, management, and reporting services to i...

  4. Guidelines and Recommendations to Accommodate Older Drivers and Pedestrians

    DOT National Transportation Integrated Search

    2001-10-01

    This project updated, revised, and expanded the scope of the "Older Driver Highway Design Handbook" published by the Federal Highway Administration (FHWA) in 1998. Development of the updated Handbook (FHWA-RD-01-103) was complemented by a technology ...

  5. Sector Review: Workshops I and II. Pengian kebijakan, Subsektor Pendidikan, SD dan SMP. East Java Province, West Java Province, South Sulawes Province. Educational Policy and Planning Project.

    ERIC Educational Resources Information Center

    Florida State Univ., Tallahassee. Learning Systems Inst.

    This publication contains the first two of three training workshop manuals designed to be used in conducting an update of the Indonesian Education and Human Resources Sector Assessment. Workshop I covers the basic concepts, skills, and methods needed to design subsector updates and develop a draft plan for update activities. Workshops II and III…

  6. Human development of the ability to learn from bad news.

    PubMed

    Moutsiana, Christina; Garrett, Neil; Clarke, Richard C; Lotto, R Beau; Blakemore, Sarah-Jayne; Sharot, Tali

    2013-10-08

    Humans show a natural tendency to discount bad news while incorporating good news into beliefs (the "good news-bad news effect"), an effect that may help explain seemingly irrational risk taking. Understanding how this bias develops with age is important because adolescents are prone to engage in risky behavior; thus, educating them about danger is crucial. We reveal a striking valence-dependent asymmetry in how belief updating develops with age. In the ages tested (9-26 y), younger age was associated with inaccurate updating of beliefs in response to undesirable information regarding vulnerability. In contrast, the ability to update beliefs accurately in response to desirable information remained relatively stable with age. This asymmetry was mediated by adequate computational use of positive but not negative estimation errors to alter beliefs. The results are important for understanding how belief formation develops and might help explain why adolescents do not respond adequately to warnings.
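The valence-dependent asymmetry described above can be pictured as a learning-rate model in which desirable and undesirable estimation errors are weighted differently (a minimal sketch of the general idea, not the authors' analysis; the function name and the rates `alpha_good` and `alpha_bad` are hypothetical):

```python
def update_belief(prior, evidence, alpha_good=0.6, alpha_bad=0.2):
    """Revise a risk estimate toward new evidence with valence-dependent rates.

    'Good news' (evidence below the prior risk estimate) is weighted more
    heavily than 'bad news' (evidence above it); the rates are illustrative.
    """
    error = evidence - prior
    alpha = alpha_bad if error > 0 else alpha_good
    return prior + alpha * error

# A 40% prior risk estimate revised after desirable vs. undesirable news:
revised_good = update_belief(40.0, 20.0)  # large shift toward good news
revised_bad = update_belief(40.0, 60.0)   # small shift toward bad news
```

Shrinking `alpha_bad` relative to `alpha_good` reproduces the reported pattern: beliefs track desirable information closely while discounting undesirable information.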

  7. Category identification of changed land-use polygons in an integrated image processing/geographic information system

    NASA Technical Reports Server (NTRS)

    Westmoreland, Sally; Stow, Douglas A.

    1992-01-01

A framework is proposed for analyzing ancillary data and developing procedures for incorporating ancillary data to aid interactive identification of land-use categories in land-use updates. The procedures were developed for use within an integrated image processing/geographic information system (GIS) that permits simultaneous display of digital image data with the vector land-use data to be updated. With such systems and procedures, automated techniques are integrated with visual-based manual interpretation to exploit the capabilities of both. The procedural framework developed was applied as part of a case study to update a portion of the land-use layer in a regional scale GIS. About 75 percent of the area in the study site that experienced a change in land use was correctly labeled into 19 categories using the combination of automated and visual interpretation procedures developed in the study.

  8. Human development of the ability to learn from bad news

    PubMed Central

    Moutsiana, Christina; Garrett, Neil; Clarke, Richard C.; Lotto, R. Beau; Blakemore, Sarah-Jayne; Sharot, Tali

    2013-01-01

    Humans show a natural tendency to discount bad news while incorporating good news into beliefs (the “good news–bad news effect”), an effect that may help explain seemingly irrational risk taking. Understanding how this bias develops with age is important because adolescents are prone to engage in risky behavior; thus, educating them about danger is crucial. We reveal a striking valence-dependent asymmetry in how belief updating develops with age. In the ages tested (9–26 y), younger age was associated with inaccurate updating of beliefs in response to undesirable information regarding vulnerability. In contrast, the ability to update beliefs accurately in response to desirable information remained relatively stable with age. This asymmetry was mediated by adequate computational use of positive but not negative estimation errors to alter beliefs. The results are important for understanding how belief formation develops and might help explain why adolescents do not respond adequately to warnings. PMID:24019466

  9. ECOD: new developments in the evolutionary classification of domains

    PubMed Central

    Schaeffer, R. Dustin; Liao, Yuxing; Cheng, Hua; Grishin, Nick V.

    2017-01-01

Evolutionary Classification Of protein Domains (ECOD) (http://prodata.swmed.edu/ecod) comprehensively classifies proteins with known spatial structures maintained by the Protein Data Bank (PDB) into evolutionary groups of protein domains. ECOD relies on a combination of automatic and manual weekly updates to achieve its high accuracy and coverage with a short update cycle. ECOD classifies the approximately 120 000 depositions of the PDB into more than 500 000 domains in ∼3400 homologous groups. We show the performance of the weekly update pipeline since the release of ECOD, describe improvements to the ECOD website and available search options, and discuss novel structures and homologous groups that have been classified in the recent updates. Finally, we discuss the future directions of ECOD and further improvements planned for the hierarchy and update process. PMID:27899594

  10. PLOT3D Export Tool for Tecplot

    NASA Technical Reports Server (NTRS)

    Alter, Stephen

    2010-01-01

    The PLOT3D export tool for Tecplot solves the problem of modified data being impossible to output for use by another computational science solver. The PLOT3D Exporter add-on enables the use of the most commonly available visualization tools to engineers for output of a standard format. The exportation of PLOT3D data from Tecplot has far reaching effects because it allows for grid and solution manipulation within a graphical user interface (GUI) that is easily customized with macro language-based and user-developed GUIs. The add-on also enables the use of Tecplot as an interpolation tool for solution conversion between different grids of different types. This one add-on enhances the functionality of Tecplot so significantly, it offers the ability to incorporate Tecplot into a general suite of tools for computational science applications as a 3D graphics engine for visualization of all data. Within the PLOT3D Export Add-on are several functions that enhance the operations and effectiveness of the add-on. Unlike Tecplot output functions, the PLOT3D Export Add-on enables the use of the zone selection dialog in Tecplot to choose which zones are to be written by offering three distinct options - output of active, inactive, or all zones (grid blocks). As the user modifies the zones to output with the zone selection dialog, the zones to be written are similarly updated. This enables the use of Tecplot to create multiple configurations of a geometry being analyzed. For example, if an aircraft is loaded with multiple deflections of flaps, by activating and deactivating different zones for a specific flap setting, new specific configurations of that aircraft can be easily generated by only writing out specific zones. Thus, if ten flap settings are loaded into Tecplot, the PLOT3D Export software can output ten different configurations, one for each flap setting.
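The three output options can be pictured as a simple filter over zone records (an illustrative sketch only; the actual add-on works through Tecplot's zone-selection dialog, and this function name and data layout are hypothetical):

```python
def zones_to_write(zones, mode="active"):
    """Pick which grid blocks to export: 'active', 'inactive', or 'all'.

    Each zone is a dict with an 'active' flag, standing in for the state of
    Tecplot's zone selection dialog.
    """
    if mode == "all":
        return list(zones)
    want_active = (mode == "active")
    return [z for z in zones if z["active"] == want_active]

# Writing only the zones for one flap setting of a multi-configuration model:
zones = [{"name": "fuselage", "active": True},
         {"name": "flap_30deg", "active": True},
         {"name": "flap_10deg", "active": False}]
selected = zones_to_write(zones, "active")
```

Toggling which zones are active and exporting again is how the ten flap settings in the example above would yield ten separate configuration files.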

  11. A multi-source data assimilation framework for flood forecasting: Accounting for runoff routing lags

    NASA Astrophysics Data System (ADS)

    Meng, S.; Xie, X.

    2015-12-01

In flood forecasting practice, model performance is usually degraded by various sources of uncertainty, including uncertainties in input data, model parameters, model structures and output observations. Data assimilation is a useful methodology for reducing uncertainties in flood forecasting. For short-term flood forecasting, an accurate estimate of the initial soil moisture condition improves forecasting performance. The time delay of runoff routing is another important factor affecting forecasting performance. Moreover, observations of hydrological variables (including ground observations and satellite observations) are becoming easily available, so the reliability of short-term flood forecasting could be improved by assimilating multi-source data. The objective of this study is to develop a multi-source data assimilation framework for real-time flood forecasting. In this framework, the first step assimilates up-layer soil moisture observations to update the model state and generated runoff using the ensemble Kalman filter (EnKF), and the second step assimilates discharge observations to update the model state and runoff within a fixed time window using the ensemble Kalman smoother (EnKS). The smoothing technique is adopted to account for the runoff routing lag. Assimilating both soil moisture and discharge observations in this way is expected to improve flood forecasting. To isolate the effectiveness of this dual-step assimilation framework, we designed a dual-EnKF algorithm in which the observed soil moisture and discharge are assimilated separately without accounting for the runoff routing lag. The results show that the multi-source data assimilation framework can effectively improve flood forecasting, especially when the runoff routing has a distinct time lag. Thus, this new data assimilation framework holds great potential for operational flood forecasting by merging observations from ground measurements and remote sensing retrievals.
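The first step of such a framework rests on the EnKF analysis equation; a minimal perturbed-observations sketch for a scalar observation is shown below (an illustration of the general EnKF update, not the authors' implementation; the function name, gain formula for a scalar observation, and all numbers are assumptions):

```python
import numpy as np

def enkf_update(ensemble, obs, obs_var, h):
    """One EnKF analysis step (perturbed-observations form, scalar observation).

    ensemble : (n_members, n_state) array of model states
    h        : maps a state vector to the scalar observed quantity
    """
    rng = np.random.default_rng(42)
    hx = np.apply_along_axis(h, 1, ensemble)           # predicted observations
    x_mean, hx_mean = ensemble.mean(axis=0), hx.mean()
    # Cross-covariance of state with predicted observation, innovation variance
    p_xy = (ensemble - x_mean).T @ (hx - hx_mean) / (len(hx) - 1)
    p_yy = hx.var(ddof=1) + obs_var
    gain = p_xy / p_yy                                 # Kalman gain, (n_state,)
    perturbed_obs = obs + rng.normal(0.0, np.sqrt(obs_var), len(hx))
    return ensemble + np.outer(perturbed_obs - hx, gain)

# Assimilating one up-layer soil moisture observation into a 100-member ensemble:
prior = np.random.default_rng(1).normal(0.3, 0.05, size=(100, 1))
posterior = enkf_update(prior, 0.25, 1e-4, lambda x: x[0])
```

The EnKS second step would apply the same gain calculation to states across a fixed lag window, which is how a routing delay between rainfall and discharge can be accounted for.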

  12. The Arctic Observing Viewer (AOV): Visualization, Data Discovery, Strategic Assessment, and Decision Support for Arctic Observing

    NASA Astrophysics Data System (ADS)

    Kassin, A.; Cody, R. P.; Barba, M.; Escarzaga, S. M.; Villarreal, S.; Manley, W. F.; Gaylord, A. G.; Habermann, T.; Kozimor, J.; Score, R.; Tweedie, C. E.

    2017-12-01

    To better assess progress in Arctic Observing made by U.S. SEARCH, NSF AON, SAON, and related initiatives, an updated version of the Arctic Observing Viewer (AOV; http://ArcticObservingViewer.org) has been released. This web mapping application and information system conveys the who, what, where, and when of "data collection sites" - the precise locations of monitoring assets, observing platforms, and wherever repeat marine or terrestrial measurements have been taken. Over 13,000 sites across the circumarctic are documented including a range of boreholes, ship tracks, buoys, towers, sampling stations, sensor networks, vegetation plots, stream gauges, ice cores, observatories, and more. Contributing partners are the U.S. NSF, NOAA, the NSF Arctic Data Center, ADIwg, AOOS, a2dc, CAFF, GINA, IASOA, INTERACT, NASA ABoVE, and USGS, among others. Users can visualize, navigate, select, search, draw, print, view details, and follow links to obtain a comprehensive perspective of environmental monitoring efforts. We continue to develop, populate, and enhance AOV. Recent updates include: a vastly improved Search tool with free text queries, autocomplete, and filters; faster performance; a new clustering visualization; heat maps to highlight concentrated research; and 3-D represented data to more easily identify trends. AOV is founded on principles of interoperability, such that agencies and organizations can use the AOV Viewer and web services for their own purposes. In this way, AOV complements other distributed yet interoperable cyber resources and helps science planners, funding agencies, investigators, data specialists, and others to: assess status, identify overlap, fill gaps, optimize sampling design, refine network performance, clarify directions, access data, coordinate logistics, and collaborate to meet Arctic Observing goals. 
AOV is a companion application to the Arctic Research Mapping Application (armap.org), which is focused on general project information at a coarser level of granularity.

  13. DRUMS: Disk Repository with Update Management and Select option for high throughput sequencing data.

    PubMed

    Nettling, Martin; Thieme, Nils; Both, Andreas; Grosse, Ivo

    2014-02-04

New technologies for analyzing biological samples, like next generation sequencing, are producing a growing amount of data together with quality scores. Moreover, software tools (e.g., for mapping sequence reads), calculating transcription factor binding probabilities, estimating epigenetic modification enriched regions or determining single nucleotide polymorphisms increase this amount of position-specific DNA-related data even further. Hence, requesting data becomes challenging and expensive and is often implemented using specialised hardware. In addition, picking specific data as fast as possible becomes increasingly important in many fields of science. The general problem of handling big data sets was addressed by developing specialized databases like HBase, HyperTable or Cassandra. However, these database solutions also require specialized or distributed hardware, leading to expensive investments. To the best of our knowledge, there is no database capable of (i) storing billions of position-specific DNA-related records, (ii) performing fast and resource-saving requests, and (iii) running on single standard computer hardware. Here, we present DRUMS (Disk Repository with Update Management and Select option), satisfying demands (i)-(iii). It tackles the weaknesses of traditional databases while handling position-specific DNA-related data in an efficient manner. DRUMS is capable of storing up to billions of records. Moreover, it is optimized for single lookups and range requests, which are needed constantly for computations in bioinformatics. To validate the power of DRUMS, we compare it to the widely used MySQL database. The test setting considers two biological data sets, using standard desktop hardware as the test environment. DRUMS outperforms MySQL in writing and reading records by a factor of two up to a factor of 10000. Furthermore, it can work with significantly larger data sets. Our work focuses on mid-sized data sets of up to several billion records without requiring cluster technology. Storing position-specific data is a general problem, and the concept we present here is a generalized approach; hence, it can easily be applied to other fields of bioinformatics.
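The access pattern being optimized — position-keyed records answered by single lookups and range requests — can be sketched in memory with sorted keys and binary search (a toy illustration under assumed names; this is not the DRUMS API, which manages records on disk):

```python
import bisect

class PositionStore:
    """Toy position-keyed store: records kept sorted by (chromosome, position),
    so a range request is two binary searches plus a contiguous scan."""

    def __init__(self):
        self._keys, self._values = [], []

    def insert(self, chrom, pos, value):
        key = (chrom, pos)
        i = bisect.bisect_left(self._keys, key)
        self._keys.insert(i, key)
        self._values.insert(i, value)

    def range(self, chrom, start, end):
        """All values with position in [start, end] on one chromosome."""
        lo = bisect.bisect_left(self._keys, (chrom, start))
        hi = bisect.bisect_right(self._keys, (chrom, end))
        return self._values[lo:hi]

# Storing per-position scores and requesting a region:
store = PositionStore()
for pos, score in [(100, 0.9), (50, 0.1), (75, 0.5)]:
    store.insert("chr1", pos, score)
region = store.range("chr1", 60, 110)
```

Keeping records in key order is what turns a range request into a cheap contiguous read, the same locality argument that applies to an on-disk layout.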

  14. Update on vaccine-derived polioviruses - worldwide, July 2012-December 2013.

    PubMed

    Diop, Ousmane M; Burns, Cara C; Wassilak, Steven G; Kew, Olen M

    2014-03-21

    In 1988, the World Health Assembly resolved to eradicate poliomyelitis worldwide. One of the main tools used in polio eradication efforts has been live, attenuated oral poliovirus vaccine (OPV), an inexpensive vaccine easily administered by trained volunteers. OPV might require several doses to induce immunity, but then it provides long-term protection against paralytic disease through durable humoral immunity. Rare cases of vaccine-associated paralytic poliomyelitis can occur among immunologically normal OPV recipients, their contacts, and persons who are immunodeficient. In addition, vaccine-derived polioviruses (VDPVs) can emerge in areas with low OPV coverage to cause polio outbreaks and can replicate for years in persons who have primary, B-cell immunodeficiencies. This report updates previous surveillance summaries and describes VDPVs detected worldwide during July 2012-December 2013. Those include a new circulating VDPV (cVDPV) outbreak identified in Pakistan in 2012, with spread to Afghanistan; an outbreak in Afghanistan previously identified in 2009 that continued into 2013; a new outbreak in Chad that spread to Cameroon, Niger, and northeastern Nigeria; and an outbreak that began in Somalia in 2008 that continued and spread to Kenya in 2013. A large outbreak in Nigeria that was identified in 2005 was nearly stopped by the end of 2013. Additionally, 10 newly identified persons in eight countries were found to excrete immunodeficiency-associated VDPVs (iVDPVs), and VDPVs were found among immunocompetent persons and environmental samples in 13 countries. 
Because the majority of VDPV isolates are type 2, the World Health Organization has developed a plan for coordinated worldwide replacement of trivalent OPV (tOPV) with bivalent OPV (bOPV; types 1 and 3) by 2016, preceded by introduction of at least 1 dose of inactivated poliovirus vaccine (IPV) containing all three poliovirus serotypes into routine immunization schedules worldwide to ensure high population immunity to all polioviruses.

  15. FY2016 Update on ILAW Glass Testing for Disposal at IDF

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, E. E.; Swanberg, D. J.; Muller, Isabelle S.

    2017-04-12

This status report provides an FY2016 update on work performed to collect information on the corrosion behavior of LAW glasses to support the IDF PA. Since 2003, in addition to developing the baseline operating envelope for the WTP, VSL has developed a wide range of LAW formulations that achieve considerably higher waste loadings than the WTP baseline formulations.

  16. CSTT Update: Fuel Quality Analyzer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brosha, Eric L.; Lujan, Roger W.; Mukundan, Rangachary

    These are slides from a presentation. The following topics are covered: project background (scope and approach), developing the prototype (timeline), update on intellectual property, analyzer comparisons (improving humidification, stabilizing the baseline, applying clean-up strategy, impact of ionomer content and improving clean-up), proposed operating mode, considerations for testing in real-world conditions (Gen 1 analyzer electronics development, testing partner identified, field trial planning), summary, and future work.

  17. A model for helicopter guidance on spiral trajectories

    NASA Technical Reports Server (NTRS)

    Mendenhall, S.; Slater, G. L.

    1980-01-01

A point mass model is developed for helicopter guidance on spiral trajectories. A fully coupled set of state equations is developed, and perturbation equations suitable for 3-D and 4-D guidance are derived and shown to be amenable to conventional state variable feedback methods. The control variables are chosen to be the magnitude and orientation of the net rotor thrust. Using these variables, reference controls for nonlevel accelerating trajectories are easily determined. The effects of constant wind are shown to require significant feedforward correction to some of the reference controls and to the time. Although not easily measured themselves, the control variables chosen are shown to be easily related to the physical variables available in the cockpit.

  18. Environmental Control Systems for Exploration Missions One and Two

    NASA Technical Reports Server (NTRS)

    Falcone, Mark A.

    2017-01-01

In preparing for Exploration Missions One and Two (EM-1 & EM-2), the Ground Systems Development and Operations Program must make significant updates to nearly all facilities. This is being done to accommodate the Space Launch System, which upon completion will be the largest rocket ever built. Facilitating the launch of such a rocket requires an updated Vehicle Assembly Building, an upgraded Launchpad, a Payload Processing Facility, and more. In this project, Environmental Control Systems across several facilities were involved, with a focus on the Mobile Launcher and Launchpad. Parts were ordered, analysis models were updated, design drawings were updated, and more.

  19. Timing considerations of Helmet Mounted Display performance

    NASA Technical Reports Server (NTRS)

    Tharp, Gregory; Liu, Andrew; French, Lloyd; Lai, Steve; Stark, Lawrence

    1992-01-01

    The Helmet Mounted Display (HMD) system developed in our lab should be a useful teleoperator systems display if it increases operator performance of the desired task; it can, however, introduce degradation in performance due to display update rate constraints and communication delays. Display update rates are slowed by communication bandwidth and/or computational power limitations. We used simulated 3D tracking and pick-and-place tasks to characterize performance levels for a range of update rates. Initial experiments with 3D tracking indicate that performance levels plateau at an update rate between 10 and 20 Hz. We have found that using the HMD with delay decreases performance as delay increases.

  20. A land cover change detection and classification protocol for updating Alaska NLCD 2001 to 2011

    USGS Publications Warehouse

    Jin, Suming; Yang, Limin; Zhu, Zhe; Homer, Collin G.

    2017-01-01

    Monitoring and mapping land cover changes are important ways to support evaluation of the status and transition of ecosystems. The Alaska National Land Cover Database (NLCD) 2001 was the first 30-m resolution baseline land cover product of the entire state derived from circa 2001 Landsat imagery and geospatial ancillary data. We developed a comprehensive approach named AKUP11 to update Alaska NLCD from 2001 to 2011 and provide a 10-year cyclical update of the state's land cover and land cover changes. Our method is designed to characterize the main land cover changes associated with different drivers, including the conversion of forests to shrub and grassland primarily as a result of wildland fire and forest harvest, the vegetation successional processes after disturbance, and changes of surface water extent and glacier ice/snow associated with weather and climate changes. For natural vegetated areas, a component named AKUP11-VEG was developed for updating the land cover that involves four major steps: 1) identify the disturbed and successional areas using Landsat images and ancillary datasets; 2) update the land cover status for these areas using a SKILL model (System of Knowledge-based Integrated-trajectory Land cover Labeling); 3) perform decision tree classification; and 4) develop a final land cover and land cover change product through the postprocessing modeling. For water and ice/snow areas, another component named AKUP11-WIS was developed for initial land cover change detection, removal of the terrain shadow effects, and exclusion of ephemeral snow changes using a 3-year MODIS snow extent dataset from 2010 to 2012. The overall approach was tested in three pilot study areas in Alaska, with each area consisting of four Landsat image footprints. The results from the pilot study show that the overall accuracy in detecting change and no-change is 90% and the overall accuracy of the updated land cover label for 2011 is 86%. 
The method provided a robust, consistent, and efficient means for capturing major disturbance events and updating land cover for Alaska. The method has subsequently been applied to generate the land cover and land cover change products for the entire state of Alaska.

  1. Nastran level 16 theoretical manual updates for aeroelastic analysis of bladed discs

    NASA Technical Reports Server (NTRS)

    Elchuri, V.; Smith, G. C. C.

    1980-01-01

A computer program based on state-of-the-art compressor and structural technologies applied to bladed, shrouded discs was developed and made operational in NASTRAN Level 16 for aeroelastic analyses, including modes and flutter. Theoretical manual updates are included.

  2. Guidelines and recommendations to accommodate older driver and pedestrians

    DOT National Transportation Integrated Search

    2001-05-01

    This project updated, revised, and expanded the scope of the Older Driver Highway Design Handbook published by FHWA in 1998. Development of the updated Handbook (FHWA-RD-01-103) was complemented by a technology transfer initiative to make practitione...

  3. A Secure and Efficient Audit Mechanism for Dynamic Shared Data in Cloud Storage

    PubMed Central

    2014-01-01

With the popularization of cloud services, multiple users easily share and update their data through cloud storage. To ensure data integrity and consistency in cloud storage, audit mechanisms have been proposed. However, existing approaches have security vulnerabilities and require considerable computational overhead. This paper proposes a secure and efficient audit mechanism for dynamic shared data in cloud storage. The proposed scheme prevents a malicious cloud service provider from deceiving an auditor. Moreover, it devises a new index table management method and reduces the auditing cost by employing less complex operations. We prove the resistance against some attacks and show lower computation cost and shorter auditing time when compared with conventional approaches. The results show that the proposed scheme is secure and efficient for cloud storage services managing dynamic shared data. PMID:24959630

  4. Adaptive Grouping Cloud Model Shuffled Frog Leaping Algorithm for Solving Continuous Optimization Problems

    PubMed Central

    Liu, Haorui; Yi, Fengyan; Yang, Heli

    2016-01-01

The shuffled frog leaping algorithm (SFLA) easily falls into a local optimum when solving multi-optimum function optimization problems, which impacts accuracy and convergence speed. This paper therefore presents a grouped SFLA for solving continuous optimization problems, combining it with the cloud model's strength in transforming between qualitative and quantitative representations. The algorithm divides the definition domain into several groups and gives each group a set of frogs. The frogs of each region search within their memeplex, and during the search the algorithm uses an "elite strategy" to update the location information of existing elite frogs through the cloud model algorithm. This method narrows the search space and can effectively escape local optima; thus convergence speed and accuracy can be significantly improved. The results of computer simulation confirm this conclusion. PMID:26819584
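The grouping idea can be sketched as follows (a simplified illustration: the domain is split into sub-intervals, each with its own frogs, and the worst frog of each group leaps toward the group's best; the paper's cloud-model elite update is replaced here by a plain leap rule, and all names and constants are assumptions):

```python
import random

def grouped_sfla(f, bounds, n_groups=4, frogs_per_group=10, iters=200, seed=0):
    """Minimize f on an interval with a grouped frog-leaping search."""
    rng = random.Random(seed)
    lo, hi = bounds
    width = (hi - lo) / n_groups
    # Each group owns one sub-interval of the definition domain
    groups = [[lo + g * width + rng.random() * width
               for _ in range(frogs_per_group)] for g in range(n_groups)]
    for _ in range(iters):
        for frogs in groups:
            frogs.sort(key=f)                  # best frog first
            best, worst = frogs[0], frogs[-1]
            leap = worst + rng.random() * (best - worst)
            if f(leap) < f(worst):             # accept only improvements
                frogs[-1] = leap
    return min((x for g in groups for x in g), key=f)

# Minimizing a simple quadratic; the group covering x = 2 homes in on it:
best_x = grouped_sfla(lambda x: (x - 2.0) ** 2, (-10.0, 10.0))
```

Because each group searches only its own sub-interval, a group stuck near one local optimum cannot drag the others with it, which is the situation the grouping is meant to improve.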

  5. Stringency of workplace air contaminant exposure limits: a case study of OSHA risk management.

    PubMed

    Hakes, J K

    1999-12-01

    Political context may play a large role in influencing the efficiency of environmental and health regulations. This case study uses data from a 1989 update of the Occupational Safety and Health Administration (OSHA) Permissible Exposure Limits (PELs) program to determine the relative effects of legislative mandates, costly acquisition of information by the agency, and pressure applied by special interest groups upon exposure standards. The empirical analysis suggests that federal agencies successfully thwart legislative attempts to limit agency discretion, and that agencies exercise bounded rationality by placing greater emphasis on more easily obtained information. The 1989 PELs were less significantly related to more costly information, contained "safety factors" for chemicals presenting relatively more ambiguous risks, and the proposed standard stringencies showed evidence of being influenced by vying industry and labor interests.

  6. The B-dot Earth Average Magnetic Field

    NASA Technical Reports Server (NTRS)

    Capo-Lugo, Pedro A.; Rakoczy, John; Sanders, Devon

    2013-01-01

The average Earth's magnetic field is solved with complex mathematical models based on a mean square integral. Depending on the selection of the Earth magnetic model, the average Earth's magnetic field can have different solutions. This paper presents a simple technique that takes advantage of the damping effects of the b-dot controller and is not dependent on the Earth magnetic model; it does, however, depend on the satellite's magnetic torquers, which are not taken into consideration in the known mathematical models. The solution of this new technique can be implemented so easily that the flight software can be updated during flight, giving the control system current gains for the magnetic torquers. Finally, this technique is verified and validated using flight data from a satellite that has been in orbit for three years.
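The b-dot law itself fits in a few lines: the commanded magnetic dipole opposes the measured rate of change of the body-frame field, m = -k dB/dt, which damps the tumble rate regardless of which Earth field model is assumed (a generic sketch of the standard law, not the paper's flight code; the gain value and function name are illustrative):

```python
import numpy as np

def bdot_dipole(b_now, b_prev, dt, k=1.0e4):
    """Commanded magnetic dipole m = -k * dB/dt (finite-difference b-dot law).

    b_now, b_prev : body-frame magnetometer readings (tesla); dt in seconds.
    The gain k would be tuned to the satellite's magnetic torquers.
    """
    b_dot = (np.asarray(b_now) - np.asarray(b_prev)) / dt
    return -k * b_dot

# A field component growing along x implies rotation; command an opposing dipole:
m = bdot_dipole([1.1e-5, 0.0, 0.0], [1.0e-5, 0.0, 0.0], dt=1.0)
```

Because the only tunable quantity is the gain k, updating the controller in flight software amounts to changing one constant, consistent with the in-flight gain updates described above.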

  7. An English Vocabulary Learning System Based on Fuzzy Theory and Memory Cycle

    NASA Astrophysics Data System (ADS)

    Wang, Tzone I.; Chiu, Ti Kai; Huang, Liang Jun; Fu, Ru Xuan; Hsieh, Tung-Cheng

    This paper proposes an English Vocabulary Learning System based on the Fuzzy Theory and the Memory Cycle Theory to help learners memorize vocabulary easily. By using fuzzy inferences and personal memory cycles, it is possible to find the article that best suits a learner. After reading an article, a quiz helps the learner consolidate the vocabulary in that article. Earlier research used only explicit responses (e.g., quiz results) to update the memory cycles of newly learned vocabulary; this paper proposes a methodology that also implicitly modifies the memory cycles of already-learned words. Through intensive reading of the recommended articles, a learner learns new words quickly and reviews learned words implicitly as well, so that the learner's vocabulary ability improves efficiently.
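    The explicit/implicit distinction drawn above can be sketched as two update rules. This is only an illustrative spaced-repetition-style model; the paper's actual fuzzy-inference formulas are not reproduced here, and the function names and factors are hypothetical.

    ```python
    # Hypothetical memory-cycle updates (illustrative, not the paper's formulas).
    def explicit_update(interval_days, recalled, growth=2.0, floor=1.0):
        """Explicit update from a quiz: lengthen the review interval on
        successful recall, reset it on failure."""
        return interval_days * growth if recalled else floor

    def implicit_update(interval_days, seen_in_reading, boost=1.2):
        """Implicit strengthening when a learned word reappears in a
        recommended article, with no quiz involved."""
        return interval_days * boost if seen_in_reading else interval_days
    ```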

  8. The Focusing Optics X-ray Solar Imager: Second Flight and Recent Results

    NASA Astrophysics Data System (ADS)

    Christe, S.; Krucker, S.; Glesener, L.; Ishikawa, S. N.; Ramsey, B.; Buitrago Casas, J. C.; Foster, N.

    2014-12-01

    Solar flares accelerate particles to high energies through acceleration mechanisms that are not currently understood. Hard X-rays are the most direct diagnostic of flare-accelerated electrons. However, past and current hard X-ray observations lack the sensitivity and dynamic range necessary to observe the faint signature of accelerated electrons in the acceleration region, the solar corona. These limitations can be easily overcome through the use of HXR focusing optics coupled with solid-state pixelated detectors. We present recent updates on the FOXSI sounding rocket program. During its first flight, FOXSI imaged a microflare with simultaneous observations by RHESSI. We present recent imaging analysis of the FOXSI observations and a detailed comparison with RHESSI. New detector calibration results are also presented and, time permitting, preliminary results from the second launch of FOXSI, scheduled for December 2014.

  9. A secure and efficient audit mechanism for dynamic shared data in cloud storage.

    PubMed

    Kwon, Ohmin; Koo, Dongyoung; Shin, Yongjoo; Yoon, Hyunsoo

    2014-01-01

    With the popularization of cloud services, multiple users can easily share and update their data through cloud storage. Audit mechanisms have been proposed to ensure data integrity and consistency in cloud storage. However, existing approaches have security vulnerabilities and incur substantial computational overhead. This paper proposes a secure and efficient audit mechanism for dynamic shared data in cloud storage. The proposed scheme prevents a malicious cloud service provider from deceiving an auditor. Moreover, it devises a new index table management method and reduces the auditing cost by employing less complex operations. We prove resistance against several attacks and show lower computation cost and shorter auditing time when compared with conventional approaches. The results show that the proposed scheme is secure and efficient for cloud storage services managing dynamic shared data.

  10. BUPRENORPHINE ABUSE IN INDIA : AN UPDATE

    PubMed Central

    Sharma, Yogesh; Mattoo, S.K.

    1999-01-01

    This study reviews the available Indian literature on buprenorphine abuse. Buprenorphine was introduced in 1986; the abuse, first noticed in 1987, increased rapidly till 1994, and then decreased gradually. Initiated through other addicts and medical practitioners, the abuse was mostly as a cheap, easily and legally available substitute for opioids. The typical young adult male abuser used an intravenous cocktail with diazepam, pheneramine or promethazine for a better kick. The withdrawal syndrome was typical of the opioids and without an expected delayed onset. Complications of pseudoaneurysm and recurrent koro in repeated withdrawal were reported. Buprenorphine as a detoxifying agent for opioids reportedly gave better symptom control in the first week but high rates of dependence induction were reported. The Indian data tends to caution against the Western enthusiasm to use buprenorphine for detoxification or maintenance of opioid abusers. PMID:21455379

  11. A Data Handling System for Modern and Future Fermilab Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Illingworth, R. A.

    2014-01-01

    Current and future Fermilab experiments such as Minerva, NOνA, and MicroBoone are now using an improved version of the Fermilab SAM data handling system. SAM was originally used by the CDF and D0 experiments for Run II of the Fermilab Tevatron to provide file metadata and location cataloguing, uploading of new files to tape storage, dataset management, file transfers between global processing sites, and processing history tracking. However, SAM was heavily tailored to the Run II environment and required complex and hard to deploy client software, which made it hard to adapt to new experiments. The Fermilab Computing Sector has progressively updated SAM to use modern, standardized technologies in order to more easily deploy it for current and upcoming Fermilab experiments, and to support the data preservation efforts of the Run II experiments.

  12. Bioterrorism web site resources for infectious disease clinicians and epidemiologists.

    PubMed

    Ferguson, Natalie E; Steele, Lynn; Crawford, Carol Y; Huebner, Nathan L; Fonseka, Jamila C; Bonander, Jason C; Kuehnert, Matthew J

    2003-06-01

    Finding bioterrorism-related information on the World Wide Web can be laborious. We hope to help readers find such information more easily by summarizing essential information in a consistent framework. A panel of 7 Centers for Disease Control and Prevention reviewers identified Web sites and evaluated them for sponsorship, mission, content usefulness, online ease of use, and adherence to commonly accepted quality criteria. Of >100 potential sites identified, 81 were chosen for target content of interest, and 43 were selected for inclusion. The results were classified into general purpose/portal sites; biological agent information; laboratory, infection control, epidemiology, and mental health information; and emergency contact sources, news and updates, event preparedness resources, information for first-responder settings, clinical and public education materials, and research resources. Agents covered included anthrax, smallpox, plague, botulism, tularemia, and viral hemorrhagic fever.

  13. Kennedy Space Center Director Update

    NASA Image and Video Library

    2014-03-06

    CAPE CANAVERAL, Fla. – NASA Kennedy Space Center Director Robert Cabana welcomes community leaders, business executives, educators, community organizers, and state and local government leaders to the Kennedy Space Center Visitor Complex Debus Center for the Kennedy Space Center Director Update. Attendees talked with Cabana and other senior Kennedy managers and visited displays featuring updates on Kennedy programs and projects, including International Space Station, Commercial Crew, Ground System Development and Operations, Launch Services, Center Planning and Development, Technology, KSC Swamp Works and NASA Education. The morning concluded with a tour of the new Space Shuttle Atlantis exhibit at the visitor complex. For more information, visit http://www.nasa.gov/kennedy. Photo credit: NASA/Daniel Casper

  14. NXE pellicle: development update

    NASA Astrophysics Data System (ADS)

    Brouns, Derk; Bendiksen, Aage; Broman, Par; Casimiri, Eric; Colsters, Paul; de Graaf, Dennis; Harrold, Hilary; Hennus, Piet; Janssen, Paul; Kramer, Ronald; Kruizinga, Matthias; Kuntzel, Henk; Lafarre, Raymond; Mancuso, Andrea; Ockwell, David; Smith, Daniel; van de Weg, David; Wiley, Jim

    2016-09-01

    ASML introduced the NXE pellicle concept, a removable pellicle solution that is compatible with current and future patterned mask inspection methods. We will present results of how we have taken the idea from concept to a demonstrated solution enabling the use of EUV pellicle by the industry for high volume manufacturing. We will update on the development of the next generation of pellicle films with higher power capability. Further, we will provide an update on top level requirements for pellicles and external interface requirements needed to support NXE pellicle adoption at a mask shop. Finally, we will present ASML's pellicle handling equipment to enable pellicle use at mask shops and our NXE pellicle roadmap outlining future improvements.

  15. Protecting the Geyser Basins of Yellowstone National Park: Toward a New National Policy for a Vulnerable Environmental Resource

    NASA Astrophysics Data System (ADS)

    Barrick, Kenneth A.

    2010-01-01

    Geyser basins provide high value recreation, scientific, economic and national heritage benefits. Geysers are globally rare, in part, because development activities have quenched about 260 of the natural endowment. Today, more than half of the world’s remaining geysers are located in Yellowstone National Park, northwest Wyoming, USA. However, the hydrothermal reservoirs that supply Yellowstone’s geysers extend well beyond the Park borders, and onto two “Known Geothermal Resource Areas”—Island Park to the west and Corwin Springs on the north. Geysers are sensitive geologic features that are easily quenched by nearby geothermal wells. Therefore, the potential for geothermal energy development adjacent to Yellowstone poses a threat to the sustainability of about 500 geysers and 10,000 hydrothermal features. The purpose here is to propose that Yellowstone be protected by a “Geyser Protection Area” (GPA) extending in a 120-km radius from Old Faithful Geyser. The GPA concept would prohibit geothermal and large-scale groundwater wells, and thereby protect the water and heat supply of the hydrothermal reservoirs that support Yellowstone’s geyser basins and important hot springs. Proactive federal leadership, including buyouts of private groundwater development rights, can assist in navigating the GPA through the greater Yellowstone area’s “wicked” public policy environment. Moreover, the potential impacts on geyser basins from intrusive research sampling techniques are considered in order to facilitate the updating of national park research regulations to a precautionary standard. The GPA model can provide the basis for protecting the world’s few remaining geyser basins.

  16. Onco-STS: a web-based laboratory information management system for sample and analysis tracking in oncogenomic experiments.

    PubMed

    Gavrielides, Mike; Furney, Simon J; Yates, Tim; Miller, Crispin J; Marais, Richard

    2014-01-01

    Whole genomes, whole exomes and transcriptomes of tumour samples are sequenced routinely to identify the drivers of cancer. The systematic sequencing and analysis of tumour samples, as well as other oncogenomic experiments, necessitates the tracking of relevant sample information throughout the investigative process. These meta-data of the sequencing and analysis procedures include information about the samples and projects as well as the sequencing centres, platforms, data locations, results locations, alignments, analysis specifications and further information relevant to the experiments. The current work presents a sample tracking system for oncogenomic studies (Onco-STS) to store these data and make them easily accessible to the researchers who work with the samples. The system is a web application, which includes a database and a front-end web page that allows the remote access, submission and updating of the sample data in the database. The web application development framework Grails was used for the development and implementation of the system. The resulting Onco-STS solution is efficient, secure and easy to use and is intended to replace the manual data handling of text records. Onco-STS allows simultaneous remote access to the system, making collaboration among researchers more effective. The system stores both information on the samples in oncogenomic studies and details of the analyses conducted on the resulting data. Onco-STS is based on open-source software, is easy to develop and can be modified according to a research group's needs. Hence it is suitable for laboratories that do not require a commercial system.

  17. Trust and Society: Suggestions for Further Development of Niklas Luhmann's Theory of Trust.

    PubMed

    Morgner, Christian

    2018-05-01

    This paper addresses an apparent gap in the work of Niklas Luhmann. While the issue of trust continues to receive widespread attention in the social sciences, Luhmann's interest in this topic declined following the development of his systems theory. It is argued that this decline does not reflect any diminished relevance of trust for systems theory, but rather that the architectural remodeling of theory cannot easily be applied to the issue of trust. Here, the issue of trust is reconceptualized as a connection medium. This entails a reconstruction of Luhmann's early theory of trust, especially with regard to function and social positioning. In this context, trust can in turn be linked to the concept of medium in Luhmann's late work. As a connection medium, trust mediates between the different levels of sociality-interaction, organization, and society. These theoretical considerations are employed to develop a more applied framework for empirical research, with a brief case study from southern Italy. From this perspective, the idea of trust as society's glue is seen to be overly simplistic. The common ethical understanding that more trust leads to a better society is also questioned on the grounds that social cooperation can also lead to social sclerosis. Finally, risk and trust are shown to accommodate the formation of different cultures of trust. The paper shows how Luhmann's updated version of trust can inspire current research and enhance our understanding of how trust operates in contemporary society. © 2018 Canadian Sociological Association/La Société canadienne de sociologie.

  18. GRAPPA 2015 Research and Education Project Reports.

    PubMed

    Mease, Philip J; Helliwell, Philip S; Boehncke, Wolf-Henning; Coates, Laura C; FitzGerald, Oliver; Gladman, Dafna D; Deodhar, Atul A; Callis Duffin, Kristina

    2016-05-01

    At the 2015 annual meeting of the Group for Research and Assessment of Psoriasis and Psoriatic Arthritis (GRAPPA), attendees were presented with brief updates on several ongoing initiatives, including educational projects. Updates were presented on the treatment recommendations project, the development of simple criteria to identify inflammatory musculoskeletal disease, new patient/physician Delphi exercises, and BIODAM (identifying biomarkers that predict progressive structural joint damage). The publication committee also gave a report. Herein we summarize those project updates.

  19. Metazen – metadata capture for metagenomes

    DOE PAGES

    Bischof, Jared; Harrison, Travis; Paczian, Tobias; ...

    2014-12-08

    Background: As the impact and prevalence of large-scale metagenomic surveys grow, so does the acute need for more complete and standards compliant metadata. Metadata (data describing data) provides an essential complement to experimental data, helping to answer questions about its source, mode of collection, and reliability. Metadata collection and interpretation have become vital to the genomics and metagenomics communities, but considerable challenges remain, including exchange, curation, and distribution. Currently, tools are available for capturing basic field metadata during sampling, and for storing, updating and viewing it. These tools are not specifically designed for metagenomic surveys; in particular, they lack the appropriate metadata collection templates, a centralized storage repository, and a unique ID linking system that can be used to easily port complete and compatible metagenomic metadata into widely used assembly and sequence analysis tools. Results: Metazen was developed as a comprehensive framework designed to enable metadata capture for metagenomic sequencing projects. Specifically, Metazen provides a rapid, easy-to-use portal to encourage early deposition of project and sample metadata. Conclusion: Metazen is an interactive tool that aids users in recording their metadata in a complete and valid format. A defined set of mandatory fields captures vital information, while the option to add fields provides flexibility.

  20. Metazen – metadata capture for metagenomes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bischof, Jared; Harrison, Travis; Paczian, Tobias

    Background: As the impact and prevalence of large-scale metagenomic surveys grow, so does the acute need for more complete and standards compliant metadata. Metadata (data describing data) provides an essential complement to experimental data, helping to answer questions about its source, mode of collection, and reliability. Metadata collection and interpretation have become vital to the genomics and metagenomics communities, but considerable challenges remain, including exchange, curation, and distribution. Currently, tools are available for capturing basic field metadata during sampling, and for storing, updating and viewing it. These tools are not specifically designed for metagenomic surveys; in particular, they lack the appropriate metadata collection templates, a centralized storage repository, and a unique ID linking system that can be used to easily port complete and compatible metagenomic metadata into widely used assembly and sequence analysis tools. Results: Metazen was developed as a comprehensive framework designed to enable metadata capture for metagenomic sequencing projects. Specifically, Metazen provides a rapid, easy-to-use portal to encourage early deposition of project and sample metadata. Conclusion: Metazen is an interactive tool that aids users in recording their metadata in a complete and valid format. A defined set of mandatory fields captures vital information, while the option to add fields provides flexibility.

  1. Effectiveness of a digital platform for sharing knowledge on headache management: a two-year experience.

    PubMed

    Raieli, Vincenzo; Correnti, E; Sandullo, A; Romano, M; Marchese, F; Loiacono, C; Brighina, Filippo

    It is crucial that all headache specialists receive adequate training. Considering the unsatisfactory results obtained with standard updating courses and the growing need for continuing professional education, a digital platform was developed as a training tool. The platform has been active since 1 October 2014 and is readily accessible to doctors by free registration. Users have access to all the material available on the platform, which includes scientific articles, e-books, presentations and images, and can directly share their own material and clinical cases. At the time of this study, the platform had 37 users. In the second year following its launch, 316 files were downloaded and five discussions, attracting 22 contributions, were started. Fifteen of the 37 members did not perform any action on the platform. In total, 74 files were uploaded in the second year of activity, but 90% of the contributions came from a very small group of users. There were no significant differences in use of the platform between members of the Italian Society for the Study of Headache and other specialists. Even though the platform appears to be an easily accessible, interactive and inexpensive instrument, the higher number of downloads than uploads suggests that it is used passively.

  2. Modeling Gas and Gas Hydrate Accumulation in Marine Sediments Using a K-Nearest Neighbor Machine-Learning Technique

    NASA Astrophysics Data System (ADS)

    Wood, W. T.; Runyan, T. E.; Palmsten, M.; Dale, J.; Crawford, C.

    2016-12-01

    Natural gas (primarily methane) and gas hydrate accumulations require certain biogeochemical as well as physical conditions, some of which are poorly sampled and/or poorly understood. We exploit recent advances in the prediction of seafloor porosity and heat flux via machine-learning techniques (e.g., random forests and Bayesian networks) to predict the occurrence of gas, and subsequently gas hydrate, in marine sediments. The prediction (in practice, guided interpolation) of key parameters in this study uses a K-nearest-neighbor (KNN) technique. KNN requires only minimal pre-processing of the data and predictors, and minimal run-time input, so the results are almost entirely data-driven. Specifically, we use new estimates of sedimentation rate and sediment type, along with recently derived compaction modeling, to estimate profiles of porosity and age. We combine the compaction results with seafloor heat flux to estimate temperature as a function of depth and geologic age, which, together with estimates of organic carbon and models of methanogenesis, yields limits on the production of methane. Results include geospatial predictions of gas (and gas hydrate) accumulations, with quantitative estimates of uncertainty. The Generic Earth Modeling System (GEMS) we have developed to derive the machine-learning estimates is modular and easily updated with new algorithms or data.
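    As a rough illustration of KNN-based guided interpolation (the predictor coordinates and values below are hypothetical, not the authors' grids), a prediction at a new site can be taken as a distance-weighted average of the k most similar sampled sites in predictor space:

    ```python
    import math

    # Illustrative KNN interpolation sketch: predict a target parameter
    # (e.g., porosity) at a query point from the k nearest sampled sites.
    def knn_predict(samples, query, k=3):
        """samples: list of (predictor_vector, value).
        Returns the inverse-distance-weighted mean of the k nearest values."""
        nearest = sorted((math.dist(x, query), v) for x, v in samples)[:k]
        weights = [1.0 / (d + 1e-12) for d, _ in nearest]  # avoid divide-by-zero
        return sum(w * v for w, (_, v) in zip(weights, nearest)) / sum(weights)
    ```

    Because the weights are purely data-driven, a query coinciding with a sampled site simply recovers that site's value, which matches the "guided interpolation" character described in the abstract.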

  3. Unified Models of Turbulence and Nonlinear Wave Evolution in the Extended Solar Corona and Solar Wind

    NASA Technical Reports Server (NTRS)

    Cranmer, Steven R.; Wagner, William (Technical Monitor)

    2004-01-01

    The PI (Cranmer) and Co-I (A. van Ballegooijen) made substantial progress toward the goal of producing a unified model of the basic physical processes responsible for solar wind acceleration. The approach outlined in the original proposal comprised two complementary pieces: (1) to further investigate individual physical processes under realistic coronal and solar wind conditions, and (2) to extract the dominant physical effects from simulations and apply them to a 1D model of plasma heating and acceleration. The accomplishments in Year 2 are divided into these two categories: 1a. Focused Study of Kinetic Magnetohydrodynamic (MHD) Turbulence; 1b. Focused Study of Non-WKB Alfven Wave Reflection; and 2. The Unified Model Code. We have continued the development of the computational model of a time-steady open flux tube in the extended corona. The proton-electron Monte Carlo model is being tested, and collisionless wave-particle interactions are being included. In order to better understand how to easily incorporate various kinds of wave-particle processes into the code, the PI performed a detailed study of the so-called "Ito Calculus", i.e., the mathematical theory of how to update the positions of particles in a probabilistic manner when their motions are governed by diffusion in velocity space.
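    The probabilistic particle update described above is typically realized as a stochastic (Euler-Maruyama) step. The sketch below is illustrative only, with placeholder drift and diffusion coefficients rather than the report's actual kinetic terms:

    ```python
    import math
    import random

    # Euler-Maruyama step for velocity-space diffusion (illustrative; the
    # drift A(v) and diffusion coefficient D(v) are placeholders, not the
    # report's wave-particle interaction terms).
    def ito_step(v, drift, diffusion, dt, rng=random):
        """One update dv = A(v) dt + sqrt(2 D(v) dt) dW for a single particle."""
        return v + drift(v) * dt + math.sqrt(2.0 * diffusion(v) * dt) * rng.gauss(0.0, 1.0)
    ```

    With the diffusion coefficient set to zero the step reduces to plain explicit integration of the drift, which is a convenient sanity check when wiring new wave-particle processes into such a Monte Carlo code.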

  4. A splitting integration scheme for the SPH simulation of concentrated particle suspensions

    NASA Astrophysics Data System (ADS)

    Bian, Xin; Ellero, Marco

    2014-01-01

    Simulating nearly contacting solid particles in suspension is a challenging task due to the diverging behavior of short-range lubrication forces, which pose a serious time-step limitation for explicit integration schemes. This general difficulty limits severely the total duration of simulations of concentrated suspensions. Inspired by the ideas developed in [S. Litvinov, M. Ellero, X.Y. Hu, N.A. Adams, J. Comput. Phys. 229 (2010) 5457-5464] for the simulation of highly dissipative fluids, we propose in this work a splitting integration scheme for the direct simulation of solid particles suspended in a Newtonian liquid. The scheme separates the contributions of different forces acting on the solid particles. In particular, intermediate- and long-range multi-body hydrodynamic forces, which are computed from the discretization of the Navier-Stokes equations using the smoothed particle hydrodynamics (SPH) method, are taken into account using an explicit integration; for short-range lubrication forces, velocities of pairwise interacting solid particles are updated implicitly by sweeping over all the neighboring pairs iteratively, until convergence in the solution is obtained. By using the splitting integration, simulations can be run stably and efficiently up to very large solid particle concentrations. Moreover, the proposed scheme is not limited to the SPH method presented here, but can be easily applied to other simulation techniques employed for particulate suspensions.
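    The splitting idea can be sketched in one dimension (an illustrative toy, not the paper's SPH implementation): hydrodynamic accelerations are applied explicitly, and the stiff pairwise lubrication drag is then solved implicitly by Gauss-Seidel-style sweeps over neighboring pairs until the velocities converge.

    ```python
    # Toy 1D splitting step (illustrative; coefficients are hypothetical).
    def splitting_step(v, hydro_acc, pairs, c, dt, tol=1e-12, max_sweeps=500):
        """Explicit kick from hydrodynamic accelerations, then backward-Euler
        pairwise drag F_i = -c*(v_i - v_j) solved by iterative sweeps."""
        v0 = [vi + ai * dt for vi, ai in zip(v, hydro_acc)]  # explicit part
        v, a = list(v0), c * dt
        for _ in range(max_sweeps):  # implicit part: sweep pairs to convergence
            delta = 0.0
            for i, j in pairs:
                ni = (v0[i] + a * v[j]) / (1.0 + a)  # local implicit solve
                nj = (v0[j] + a * ni) / (1.0 + a)
                delta = max(delta, abs(ni - v[i]), abs(nj - v[j]))
                v[i], v[j] = ni, nj
            if delta < tol:
                break
        return v
    ```

    Because the drag enters implicitly, the step stays stable even when c*dt is large, which is the point of the splitting: the diverging lubrication forces no longer dictate the global time step. For a single pair the converged result matches the exact backward-Euler solution, in which the relative velocity is damped by a factor 1/(1 + 2*c*dt), and total momentum is conserved.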

  5. An evaluation of immunization education resources by family medicine residency directors.

    PubMed

    Nowalk, Mary Patricia; Zimmerman, Richard K; Middleton, Donald B; Sherwood, Roger A; Ko, Feng-Shou; Kimmel, Sanford R; Troy, Judith A

    2007-01-01

    Immunization is a rapidly evolving field, and teachers of family medicine are responsible for ensuring that they and their students are knowledgeable about the latest vaccine recommendations. A survey was mailed to 456 family medicine residency directors across the United States to obtain their evaluation of immunization resources developed by the Society of Teachers of Family Medicine's Group on Immunization Education. Frequencies, measures of central tendency, and differences between responses from 2001 to 2005 were analyzed. Directors of 261 (57%) family medicine residencies responded, with >80% reporting satisfaction with immunization teaching resources. The popularity of bound resources decreased from 2001 to 2005, while immunization Web sites increased in importance. The journal supplement, "Vaccines Across the Lifespan, 2005" was less frequently read in 2005 than its predecessor published in 2001, but quality ratings remained high. Use of the Web site, www.ImmunizationEd.org, and the Shots software for both desktop and handheld computers has increased since their creation. Electronic immunization teaching resources are increasingly popular among family medicine residencies. As the field continues to change, the use of electronic resources is expected to continue, since they are easily updated and, in the case of www.ImmunizationEd.org and Shots software, are available free of charge.

  6. Regionalization of response routine parameters

    NASA Astrophysics Data System (ADS)

    Tøfte, Lena S.; Sultan, Yisak A.

    2013-04-01

    When distributed hydrological models are to be calibrated or updated, having fewer calibration parameters is a considerable advantage. Building on the work of Kirchner, among others, we have developed a simple non-threshold response model for drainage in natural catchments, to be used in the gridded hydrological model ENKI. The new response model takes only the hydrograph into account; it has one state and two parameters, and is adapted to catchments dominated by terrain drainage. The method is based on the assumption that during periods when precipitation, evaporation and snowmelt are negligible, the discharge is entirely determined by the amount of stored water. The catchment can then be characterized as a simple first-order nonlinear dynamical system, whose governing equations can be found directly from measured stream flow fluctuations. This means that the response of the catchment can be modelled using hydrograph data from which all periods with rain, snowmelt or evaporation are left out, fitting the remaining series to a two- or three-parameter equation. A large number of discharge series from catchments in different regions of Norway are analyzed, and parameters are found for all the series. By combining the computed parameters with known catchment characteristics, we try to regionalize the parameters. The parameters of the response routine can then easily be found for ungauged catchments as well, from maps or databases.
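    In the spirit of the approach described above (a sketch under simplifying assumptions, not the authors' ENKI code), the two response parameters can be estimated from recession limbs alone by assuming -dQ/dt = a*Q**b and fitting in log-log space:

    ```python
    import math

    # Illustrative two-parameter recession fit: during rain-, snowmelt- and
    # evaporation-free periods, assume -dQ/dt = a * Q**b and estimate (a, b)
    # by ordinary least squares on log(-dQ/dt) vs log(Q). Data are synthetic.
    def fit_recession(q, dt=1.0):
        xs, ys = [], []
        for q0, q1 in zip(q, q[1:]):
            dq = (q1 - q0) / dt
            if dq < 0:                          # recession limbs only
                xs.append(math.log(0.5 * (q0 + q1)))  # midpoint discharge
                ys.append(math.log(-dq))
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
        a = math.exp(my - b * mx)
        return a, b
    ```

    Applied to a synthetic recession generated from a known power law, the fit recovers the prescribed exponent and coefficient, which is the kind of per-catchment parameter estimate that could then be regionalized against catchment characteristics.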

  7. A simple web-based tool to compare freshwater fish data collected using AFS standard methods

    USGS Publications Warehouse

    Bonar, Scott A.; Mercado-Silva, Norman; Rahr, Matt; Torrey, Yuta T.; Cate, Averill

    2016-01-01

    The American Fisheries Society (AFS) recently published Standard Methods for Sampling North American Freshwater Fishes. Enlisting the expertise of 284 scientists from 107 organizations throughout Canada, Mexico, and the United States, this text was developed to facilitate comparisons of fish data across regions or time. Here we describe a user-friendly web tool that automates among-sample comparisons in individual fish condition, population length-frequency distributions, and catch per unit effort (CPUE) data collected using AFS standard methods. Currently, the web tool (1) provides instantaneous summaries of almost 4,000 data sets of condition, length frequency, and CPUE of common freshwater fishes collected using standard gears in 43 states and provinces; (2) is easily appended with new standardized field data to update subsequent queries and summaries; (3) compares fish data from a particular water body with continent, ecoregion, and state data summaries; and (4) provides additional information about AFS standard fish sampling including benefits, ongoing validation studies, and opportunities to comment on specific methods. The web tool—programmed in a PHP-based Drupal framework—was supported by several AFS Sections, agencies, and universities and is freely available from the AFS website and fisheriesstandardsampling.org. With widespread use, the online tool could become an important resource for fisheries biologists.

  8. BioInfra.Prot: A comprehensive proteomics workflow including data standardization, protein inference, expression analysis and data publication.

    PubMed

    Turewicz, Michael; Kohl, Michael; Ahrens, Maike; Mayer, Gerhard; Uszkoreit, Julian; Naboulsi, Wael; Bracht, Thilo; Megger, Dominik A; Sitek, Barbara; Marcus, Katrin; Eisenacher, Martin

    2017-11-10

    The analysis of high-throughput mass spectrometry-based proteomics data must address the specific challenges of this technology. To this end, the comprehensive proteomics workflow offered by the de.NBI service center BioInfra.Prot provides indispensable components for the computational and statistical analysis of this kind of data. These components include tools and methods for spectrum identification and protein inference, protein quantification, expression analysis as well as data standardization and data publication. All particular methods of the workflow which address these tasks are state-of-the-art or cutting edge. As has been shown in previous publications, each of these methods is adequate to solve its specific task and gives competitive results. However, the methods included in the workflow are continuously reviewed, updated and improved to adapt to new scientific developments. All of these particular components and methods are available as stand-alone BioInfra.Prot services or as a complete workflow. Since BioInfra.Prot provides manifold fast communication channels to get access to all components of the workflow (e.g., via the BioInfra.Prot ticket system: bioinfraprot@rub.de) users can easily benefit from this service and get support by experts. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  9. Periodicity of extinction: A 1988 update

    NASA Technical Reports Server (NTRS)

    Sepkoski, J. John, Jr.

    1988-01-01

    The hypothesis that events of mass extinction recur periodically at approximately 26-Myr intervals is an empirical claim based on analysis of data from the fossil record. The hypothesis has become closely linked with catastrophism because several events in the periodic series are associated with evidence of extraterrestrial impacts, and terrestrial forcing mechanisms with long, periodic recurrences are not easily conceived. Astronomical mechanisms that have been hypothesized include undetected solar companions and solar oscillation about the galactic plane, which would induce comet showers and result in impacts on Earth at regular intervals. Because these mechanisms are speculative, they have been the subject of considerable controversy, as has the hypothesis of periodicity of extinction itself. In response to criticisms and uncertainties, a database was developed on times of extinction of marine animal genera. A time series is given and analyzed with 49 sample points for the per-genus extinction rate from the Late Permian to the Recent. An unexpected pattern in the data is the uniformity of magnitude of many of the periodic extinction events. Observations suggest that the sequence of extinction events might be the result of two sets of mechanisms: a periodic forcing that normally induces only moderate amounts of extinction, and independent incidents or catastrophes that, when coincident with the periodic forcing, amplify its signal and produce major mass extinctions.

  10. The UCERF3 grand inversion: Solving for the long‐term rate of ruptures in a fault system

    USGS Publications Warehouse

    Page, Morgan T.; Field, Edward H.; Milner, Kevin; Powers, Peter M.

    2014-01-01

    We present implementation details, testing, and results from a new inversion‐based methodology, known colloquially as the “grand inversion,” developed for the Uniform California Earthquake Rupture Forecast (UCERF3). We employ a parallel simulated annealing algorithm to solve for the long‐term rate of all ruptures that extend through the seismogenic thickness on major mapped faults in California while simultaneously satisfying available slip‐rate, paleoseismic event‐rate, and magnitude‐distribution constraints. The inversion methodology enables the relaxation of fault segmentation and allows for the incorporation of multifault ruptures, which are needed to remove magnitude‐distribution misfits that were present in the previous model, UCERF2. The grand inversion is more objective than past methodologies, as it eliminates the need to prescriptively assign rupture rates. It also provides a means to easily update the model as new data become available. In addition to UCERF3 model results, we present verification of the grand inversion, including sensitivity tests, tuning of equation set weights, convergence metrics, and a synthetic test. These tests demonstrate that while individual rupture rates are poorly resolved by the data, integrated quantities such as magnitude–frequency distributions and, most importantly, hazard metrics, are much more robust.
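
    The abstract describes solving for the long-term rates of all ruptures with a parallel simulated annealing algorithm under linear slip-rate, event-rate, and magnitude-distribution constraints. As a toy illustration of that idea only (not the UCERF3 implementation: the constraint matrix, data vector, step size, and cooling schedule below are all invented for the sketch), a serial annealer for nonnegative rates could look like this:

```python
import math
import random

def anneal(A, d, n_iter=20000, t0=1.0, step=0.1, seed=0):
    """Simulated annealing for nonnegative rates x minimizing ||A x - d||^2.
    Each row of A encodes one linear data constraint; d holds the observed values."""
    rng = random.Random(seed)
    n = len(A[0])
    x = [0.0] * n  # start with all rupture rates at zero

    def energy(v):
        # sum of squared misfits over all constraint rows
        return sum((sum(a * vi for a, vi in zip(row, v)) - di) ** 2
                   for row, di in zip(A, d))

    e = energy(x)
    for k in range(n_iter):
        temp = t0 * (1.0 - k / n_iter)          # linear cooling schedule
        j = rng.randrange(n)                    # perturb one rate at a time
        trial = x[:]
        trial[j] = max(0.0, trial[j] + rng.gauss(0.0, step))  # keep rates nonnegative
        e_trial = energy(trial)
        # always accept downhill moves; accept uphill moves with Boltzmann probability
        if e_trial < e or rng.random() < math.exp((e - e_trial) / max(temp, 1e-12)):
            x, e = trial, e_trial
    return x, e
```

    In the real problem the individual rates are poorly constrained, which matches the abstract's observation that only integrated quantities (magnitude-frequency distributions, hazard metrics) are robust.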

  11. Early and intermediate age-related macular degeneration: update and clinical review.

    PubMed

    García-Layana, Alfredo; Cabrera-López, Francisco; García-Arumí, José; Arias-Barquet, Lluís; Ruiz-Moreno, José M

    2017-01-01

    Age-related macular degeneration (AMD) is the leading cause of irreversible central vision loss in developed countries. With the aging of the population, AMD will become an increasingly important and prevalent disease worldwide. It is a complex disease whose etiology is associated with both genetic and environmental risk factors. The extensive decline in quality of life and the progressive need for daily living assistance among those most severely affected by AMD highlight the essential role of preventive strategies, particularly advising patients to quit smoking. In addition, maintaining a healthy diet, controlling other risk factors (such as hypertension, obesity, and atherosclerosis), and using nutritional supplements (antioxidants) are recommended. Genetic testing may be especially important in patients with a family history of AMD. Recently established unifying criteria for the clinical classification of AMD, which define no apparent aging changes; normal aging changes; and early, intermediate, and late AMD stages, are of value in predicting the risk of AMD progression and in establishing recommendations for the diagnosis, therapeutic approach, and follow-up of patients. The present review focuses on early and intermediate AMD and describes the clinical characteristics and ophthalmological findings of these stages, together with algorithms for the diagnosis and management of patients that are easily applicable in daily clinical practice.

  12. Early and intermediate age-related macular degeneration: update and clinical review

    PubMed Central

    García-Layana, Alfredo; Cabrera-López, Francisco; García-Arumí, José; Arias-Barquet, Lluís; Ruiz-Moreno, José M

    2017-01-01

    Age-related macular degeneration (AMD) is the leading cause of irreversible central vision loss in developed countries. With the aging of the population, AMD will become an increasingly important and prevalent disease worldwide. It is a complex disease whose etiology is associated with both genetic and environmental risk factors. The extensive decline in quality of life and the progressive need for daily living assistance among those most severely affected by AMD highlight the essential role of preventive strategies, particularly advising patients to quit smoking. In addition, maintaining a healthy diet, controlling other risk factors (such as hypertension, obesity, and atherosclerosis), and using nutritional supplements (antioxidants) are recommended. Genetic testing may be especially important in patients with a family history of AMD. Recently established unifying criteria for the clinical classification of AMD, which define no apparent aging changes; normal aging changes; and early, intermediate, and late AMD stages, are of value in predicting the risk of AMD progression and in establishing recommendations for the diagnosis, therapeutic approach, and follow-up of patients. The present review focuses on early and intermediate AMD and describes the clinical characteristics and ophthalmological findings of these stages, together with algorithms for the diagnosis and management of patients that are easily applicable in daily clinical practice. PMID:29042759

  13. Progress in the clinical development and utilization of vision prostheses: an update

    PubMed Central

    Brandli, Alice; Luu, Chi D; Guymer, Robyn H; Ayton, Lauren N

    2016-01-01

    Vision prostheses, or “bionic eyes”, are implantable medical bionic devices with the potential to restore rudimentary sight to people with profound vision loss or blindness. In the past two decades, this field has rapidly progressed, and there are now two commercially available retinal prostheses in the US and Europe, and a number of next-generation devices in development. This review provides an update on the development of these devices and a discussion on the future directions for the field. PMID:28539798

  14. An interval model updating strategy using interval response surface models

    NASA Astrophysics Data System (ADS)

    Fang, Sheng-En; Zhang, Qiu-Hu; Ren, Wei-Xin

    2015-08-01

    Stochastic model updating provides an effective way of handling uncertainties existing in real-world structures. In general, probabilistic theories, fuzzy mathematics or interval analyses are involved in the solution of inverse problems. However in practice, probability distributions or membership functions of structural parameters are often unavailable due to insufficient information of a structure. At this moment an interval model updating procedure shows its superiority in the aspect of problem simplification since only the upper and lower bounds of parameters and responses are sought. To this end, this study develops a new concept of interval response surface models for the purpose of efficiently implementing the interval model updating procedure. The frequent interval overestimation due to the use of interval arithmetic can be maximally avoided leading to accurate estimation of parameter intervals. Meanwhile, the establishment of an interval inverse problem is highly simplified, accompanied by a saving of computational costs. By this means a relatively simple and cost-efficient interval updating process can be achieved. Lastly, the feasibility and reliability of the developed method have been verified against a numerical mass-spring system and also against a set of experimentally tested steel plates.
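
    The central point, that direct interval arithmetic overestimates response bounds whenever a parameter appears more than once (the dependency problem), which the interval response surface models are designed to avoid, can be seen in a toy example. The response function and the sampling-based bound below are illustrative only, not the paper's formulation:

```python
def interval_mul(a, b):
    """Naive interval product [a]*[b]; ignores any dependency between operands."""
    products = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(products), max(products))

def response_bounds(f, lo, hi, n=1000):
    """Bound a response by sampling the response function directly,
    avoiding the overestimation of repeated interval arithmetic."""
    vals = [f(lo + (hi - lo) * i / n) for i in range(n + 1)]
    return (min(vals), max(vals))

f = lambda x: x * (10 - x)        # toy "response" with a repeated parameter x
naive = interval_mul((1, 2), (10 - 2, 10 - 1))  # treats x and (10 - x) as independent
tight = response_bounds(f, 1, 2)
# naive -> (8, 18): overestimated; tight -> (9.0, 16.0): the true response interval
```

    Bounding the response through a surrogate of the full input-output map, rather than propagating intervals operation by operation, is what keeps the estimated parameter intervals from being inflated.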

  15. Cyclebase 3.0: a multi-organism database on cell-cycle regulation and phenotypes.

    PubMed

    Santos, Alberto; Wernersson, Rasmus; Jensen, Lars Juhl

    2015-01-01

    The eukaryotic cell division cycle is a highly regulated process that consists of a complex series of events and involves thousands of proteins. Researchers have studied the regulation of the cell cycle in several organisms, employing a wide range of high-throughput technologies, such as microarray-based mRNA expression profiling and quantitative proteomics. Due to its complexity, the cell cycle can also fail or otherwise change in many different ways if important genes are knocked out, which has been studied in several microscopy-based knockdown screens. The data from these many large-scale efforts are not easily accessed, analyzed and combined due to their inherent heterogeneity. To address this, we have created Cyclebase--available at http://www.cyclebase.org--an online database that allows users to easily visualize and download results from genome-wide cell-cycle-related experiments. In Cyclebase version 3.0, we have updated the content of the database to reflect changes to genome annotation, added new mRNA and protein expression data, and integrated cell-cycle phenotype information from high-content screens and model-organism databases. The new version of Cyclebase also features a new web interface, designed around an overview figure that summarizes all the cell-cycle-related data for a gene. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.

  16. Updating and improving methodology for prioritizing highway project locations on the strategic intermodal system : [summary].

    DOT National Transportation Integrated Search

    2016-05-01

    Florida International University researchers examined the existing performance measures and the project prioritization method in the CMP and updated them to better reflect the current conditions and strategic goals of FDOT. They also developed visual...

  17. UPDATE ON EPA'S URBAN WATERSHED MANAGEMENT BRANCH MODELING ACTIVITIES

    EPA Science Inventory

    This paper provides the Stormwater Management Model (SWMM) user community with a description of the Environmental Protection Agency (EPA's) Office of Research and Development (ORD) approach to urban watershed modeling research and provides an update on current ORD SWMM-related pr...

  18. A Bottom-Up Geospatial Data Update Mechanism for Spatial Data Infrastructure Updating

    NASA Astrophysics Data System (ADS)

    Tian, W.; Zhu, X.; Liu, Y.

    2012-08-01

    Currently, the top-down spatial data update mechanism has made great progress and is widely applied in many spatial data infrastructures (SDIs). However, this mechanism still has some issues. For example, the update schedule is tied to the professional department's project cycle, which is usually too long for end-users; moving data from collection to publication costs the professional department too much time and effort; and the geospatial information does not provide sufficiently detailed attributes. Thus, finding an effective way to deal with these problems has become a pressing need. Emerging Internet technologies, 3S techniques, and the spread of geographic information knowledge among the public have promoted the rapid development of volunteered geospatial information (VGI). VGI is a current research hotspot, attracting many researchers to study its data quality, credibility, accuracy, sustainability, social benefit, applications, and so on. In addition, a few scholars have also examined the value of VGI for supporting SDI updating. On that basis, this paper presents a bottom-up update mechanism from VGI to SDI, which includes the processes of matching homonymous elements between VGI and SDI vector data, change detection, SDI spatial database updating, and publication of new data products to end-users. The feasibility of the proposed updating cycle is then discussed in depth: it can detect changed elements in time and shorten the update period, provide more accurate geometry and attribute data for the spatial data infrastructure, and support update propagation.
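
    The proposed cycle starts by matching homonymous elements between the VGI and SDI layers and then detecting changes. A minimal sketch of those first two stages for point features, assuming purely name-based matching and a positional tolerance (both heavy simplifications of real conflation, which also compares attributes and topology):

```python
from math import hypot

def match_and_detect(sdi, vgi, tol=10.0):
    """Match homonymous point features between SDI and VGI layers by name,
    then flag geometry changes beyond a positional tolerance (map units)."""
    sdi_by_name = {f["name"]: f for f in sdi}
    vgi_by_name = {f["name"]: f for f in vgi}
    added = [n for n in vgi_by_name if n not in sdi_by_name]
    removed = [n for n in sdi_by_name if n not in vgi_by_name]
    # homonymous elements present in both layers: compare geometries
    changed = [n for n in sdi_by_name.keys() & vgi_by_name.keys()
               if hypot(sdi_by_name[n]["x"] - vgi_by_name[n]["x"],
                        sdi_by_name[n]["y"] - vgi_by_name[n]["y"]) > tol]
    return {"added": added, "removed": removed, "changed": changed}
```

    The output of such a change-detection step is what would drive the subsequent SDI database update and the publication of a new data product.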

  19. Acinetobacter Species Infections among Navy and Marine Corps Beneficiaries: 2012 Annual Report

    DTIC Science & Technology

    2013-11-18

    Acinetobacter species are associated with a large number of infections, have the ability to easily acquire resistance determinants, and quickly develop resistance to multiple antibiotics, leaving few, if any, treatment options. Extensively drug-resistant (XDR) organisms accounted for 1.3% of DON cases. For non-MDR cases in the DON, providers most commonly prescribed trimethoprim/sulfamethoxazole.

  20. Quality of Web Information About Palliative Care on Websites from the United States and Japan: Comparative Evaluation Study.

    PubMed

    Tanabe, Kouichi; Fujiwara, Kaho; Ogura, Hana; Yasuda, Hatsuna; Goto, Nobuyuki; Ohtsu, Fumiko

    2018-04-03

    Patients and their families can easily obtain information about palliative care from websites nowadays. However, there are concerns about the accuracy of information on the Web and how up to date it is. The objective of this study was to elucidate problematic points of medical information about palliative care obtained from websites, and to compare the quality of the information between Japanese and US websites. We searched Google Japan and Google USA for websites relating to palliative care. We then evaluated the top 50 websites from each search using the DISCERN and LIDA instruments. We found that Japanese websites were given a lower evaluation of reliability than US websites. In three LIDA subcategories (engagability, P<.001; currency, P=.001; and content production procedure, P<.001), US websites scored significantly higher, with large effect sizes. Our results suggest that Japanese websites have problems with the frequency with which they are updated, their update procedures and policies, and the scrutiny process the evidence must undergo. Additionally, there was a weak association between search ranking and reliability; at the same time, we found that reliability could not be assessed by search ranking alone. ©Kouichi Tanabe, Kaho Fujiwara, Hana Ogura, Hatsuna Yasuda, Nobuyuki Goto, Fumiko Ohtsu. Originally published in the Interactive Journal of Medical Research (http://www.i-jmr.org/), 03.04.2018.

  1. Parametric design criteria of an updated thermoradiative cell operating at optimal states

    NASA Astrophysics Data System (ADS)

    Zhang, Xin; Peng, Wanli; Lin, Jian; Chen, Xiaohang; Chen, Jincan

    2017-11-01

    An updated model of the thermoradiative cell (TRC) with sub-band gap and non-radiative losses is proposed, which can efficiently harvest moderate-temperature heat energy and convert a part of the heat into electricity. It is found that when the TRC is operated between a heat source at 800 K and the environment at 300 K, its maximum power output density and efficiency can attain 1490 W m^-2 and 27.2%, respectively. Moreover, the effects of some key parameters, including the band gap and output voltage, on the performance of the TRC are discussed. The optimal working regions of the power density, efficiency, band gap, and output voltage are determined. The maximum efficiency and power output density of the TRC operated at different temperatures are calculated and compared with those of thermophotovoltaic cells (TPVCs) and thermionic energy converters (TECs). It is revealed that the maximum efficiency of the TRC operated in the moderate-temperature range is much higher than that of the TEC or the TPVC, and that the maximum power output density of the TRC is larger than that of the TEC but smaller than that of the TPVC. In particular, the TRC is more easily manufactured than the near-field TPVC, which requires a nanoscale vacuum gap. The results obtained will be helpful for engineers to choose semiconductor materials, design and manufacture TRCs, and control operating conditions.

  2. Integrate genome-based assessment of safety for probiotic strains: Bacillus coagulans GBI-30, 6086 as a case study.

    PubMed

    Salvetti, Elisa; Orrù, Luigi; Capozzi, Vittorio; Martina, Alessia; Lamontanara, Antonella; Keller, David; Cash, Howard; Felis, Giovanna E; Cattivelli, Luigi; Torriani, Sandra; Spano, Giuseppe

    2016-05-01

    Probiotics are microorganisms that confer beneficial effects on the host; nevertheless, before being allowed for human consumption, their safety must be verified with accurate protocols. In the genomic era, such procedures should take into account the genomic-based approaches. This study aims at assessing the safety traits of Bacillus coagulans GBI-30, 6086 integrating the most updated genomics-based procedures and conventional phenotypic assays. Special attention was paid to putative virulence factors (VF), antibiotic resistance (AR) genes and genes encoding enzymes responsible for harmful metabolites (i.e. biogenic amines, BAs). This probiotic strain was phenotypically resistant to streptomycin and kanamycin, although the genome analysis suggested that the AR-related genes were not easily transferrable to other bacteria, and no other genes with potential safety risks, such as those related to VF or BA production, were retrieved. Furthermore, no unstable elements that could potentially lead to genomic rearrangements were detected. Moreover, a workflow is proposed to allow the proper taxonomic identification of a microbial strain and the accurate evaluation of risk-related gene traits, combining whole genome sequencing analysis with updated bioinformatics tools and standard phenotypic assays. The workflow presented can be generalized as a guideline for the safety investigation of novel probiotic strains to help stakeholders (from scientists to manufacturers and consumers) to meet regulatory requirements and avoid misleading information.

  3. Using SCADA Data, Field Studies, and Real-Time Modeling to ...

    EPA Pesticide Factsheets

    EPA has been providing technical assistance to the City of Flint and the State of Michigan in response to the drinking water lead contamination incident. Responders quickly recognized the need for a water distribution system hydraulic model to provide insight on flow patterns and water quality as well as to evaluate changes being made to the system operation to enhance corrosion control and improve chlorine residuals. EPA partnered with the City of Flint and the Michigan Department of Environmental Quality to update and calibrate an existing hydraulic model. The City provided SCADA data, GIS data, customer billing data, valve status data, design diagrams, and information on operations. Team members visited all facilities and updated pump and valve types, sizes, settings, elevations, and pump discharge curves. Several technologies were used to support this work including the EPANET-RTX based Polaris real-time modeling software, WaterGEMS, ArcGIS, EPANET, and RTX:LINK. Field studies were conducted to collect pressure and flow data from more than 25 locations throughout the distribution system. An assessment of the model performance compared model predictions for flow, pressure, and tank levels to SCADA and field data, resulting in error measurements for each data stream over the time period analyzed. Now, the calibrated model can be used with a known confidence in its performance to evaluate hydraulic and water quality problems, and the model can be easily

  4. The Lightwave programme and roadshow: an overview and update

    NASA Astrophysics Data System (ADS)

    Wong, Nicholas H. L.; Posner, Matthew T.; John, Pearl V.

    2015-10-01

    While optics and photonics are exciting disciplines with much research, industrial, and economic potential in the 21st century, this appreciation is only shared by a limited number of science, technology, engineering, and mathematics (STEM) experts, and there is a recognized STEM skills shortage. To widen the pool of talent, it is essential to expose students to optics and photonics throughout their education and particularly starting at a young age. The Lightwave programme, consisting of an interactive collection of photonics demonstrations and experiments targeted for primary school students, was thus created to facilitate this endeavor. The programme is run by doctoral students forming a team of "Lightwave ambassadors". All the demonstrations that comprise Lightwave can be easily integrated into a physics curriculum, enabling educators to generate more student interest and enhance the image of science through an interactive pedagogy. We provide a description of the programme at its initial inception, and report on the recent additions and updates that have brought about its success, moving from a purely outreach driven focus to engaging pupils with our own research. We also discuss our approach to ensuring that our team of ambassadors are from diverse backgrounds and use both male and female students as role models. Finally, we reflect on how evaluation methods to obtain feedback from our activities are key to Lightwave's sustainability and in improving the perception of optics and photonics.

  5. An Update on Candida tropicalis Based on Basic and Clinical Approaches

    PubMed Central

    Zuza-Alves, Diana L.; Silva-Rocha, Walicyranison P.; Chaves, Guilherme M.

    2017-01-01

    Candida tropicalis has emerged as one of the most important Candida species. It has been widely considered the second most virulent Candida species, preceded only by C. albicans. Moreover, this species has been recognized as a very strong biofilm producer, surpassing C. albicans in most studies. In addition, it produces a wide range of other virulence factors, including: adhesion to buccal epithelial and endothelial cells; the secretion of lytic enzymes, such as proteinases, phospholipases, and hemolysins; bud-to-hyphae transition (also called morphogenesis); and the phenomenon called phenotypic switching. This species is very closely related to C. albicans and has been easily identified with both phenotypic and molecular methods. In addition, no cryptic sibling species have yet been described in the literature, in contrast to some other medically important Candida species. C. tropicalis is a clinically relevant species and may be the second or third etiological agent of candidemia, specifically in Latin American countries and Asia. Antifungal resistance to the azoles, polyenes, and echinocandins has already been described. Apart from all these characteristics, C. tropicalis has been considered an osmotolerant microorganism, and this ability to survive at high salt concentrations may be important for fungal persistence in saline environments. This physiological characteristic makes the species suitable for use in biotechnology processes. Here we describe an update of C. tropicalis, focusing on all the previously mentioned subjects. PMID:29081766

  6. Novel approach to improve the attitude update rate of a star tracker.

    PubMed

    Zhang, Shuo; Xing, Fei; Sun, Ting; You, Zheng; Wei, Minsong

    2018-03-05

    The star tracker is widely used in attitude control systems of spacecraft for attitude measurement. The attitude update rate of a star tracker is important to guarantee the attitude control performance. In this paper, we propose a novel approach to improve the attitude update rate of a star tracker. The electronic Rolling Shutter (RS) imaging mode of the complementary metal-oxide semiconductor (CMOS) image sensor in the star tracker is applied to acquire star images in which the star spots are exposed with row-to-row time offsets, thereby reflecting the rotation of star tracker at different times. The attitude estimation method with a single star spot is developed to realize the multiple attitude updates by a star image, so as to reach a high update rate. The simulation and experiment are performed to verify the proposed approaches. The test results demonstrate that the proposed approach is effective and the attitude update rate of a star tracker is increased significantly.
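
    The principle behind the higher update rate is that in rolling-shutter mode each image row is exposed at a slightly different time, so each star spot carries its own timestamp and can contribute its own attitude sample. A small sketch of that per-row timing (the frame start, row period, and row indices are made-up numbers, not the paper's sensor parameters):

```python
def spot_timestamps(frame_start, row_period, spot_rows):
    """Rolling-shutter timing: each star spot's row is read out at a different
    time, so each spot yields an attitude sample at its own timestamp."""
    return [frame_start + r * row_period for r in sorted(spot_rows)]

# Three star spots spread across the frame give three attitude samples per
# frame instead of one sample per global-shutter exposure.
times = spot_timestamps(0.0, 1e-4, {800, 100, 400})
```

    With a single-spot attitude estimation method, each of these timestamps becomes an attitude update, which is how one frame can produce multiple updates.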

  7. OGEE v2: an update of the online gene essentiality database with special focus on differentially essential genes in human cancer cell lines.

    PubMed

    Chen, Wei-Hua; Lu, Guanting; Chen, Xiao; Zhao, Xing-Ming; Bork, Peer

    2017-01-04

    OGEE is an Online GEne Essentiality database. To enhance our understanding of the essentiality of genes, in OGEE we collected experimentally tested essential and non-essential genes, as well as associated gene properties known to contribute to gene essentiality. We focus on large-scale experiments, and complement our data with text-mining results. We organized tested genes into data sets according to their sources, and tagged those with variable essentiality statuses across data sets as conditionally essential genes, intending to highlight the complex interplay between gene functions and environments/experimental perturbations. Developments since the last public release include increased numbers of species and gene essentiality data sets, inclusion of non-coding essential sequences and genes with intermediate essentiality statuses. In addition, we included 16 essentiality data sets from cancer cell lines, corresponding to 9 human cancers; with OGEE, users can easily explore the shared and differentially essential genes within and between cancer types. These genes, especially those derived from cell lines that are similar to tumor samples, could reveal the oncogenic drivers, paralogous gene expression pattern and chromosomal structure of the corresponding cancer types, and can be further screened to identify targets for cancer therapy and/or new drug development. OGEE is freely available at http://ogee.medgenius.info. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  8. An Improved Neutron Transport Algorithm for HZETRN2006

    NASA Astrophysics Data System (ADS)

    Slaba, Tony

    NASA's new space exploration initiative includes plans for long term human presence in space thereby placing new emphasis on space radiation analyses. In particular, a systematic effort of verification, validation and uncertainty quantification of the tools commonly used for radiation analysis for vehicle design and mission planning has begun. In this paper, the numerical error associated with energy discretization in HZETRN2006 is addressed; large errors in the low-energy portion of the neutron fluence spectrum are produced due to a numerical truncation error in the transport algorithm. It is shown that the truncation error results from the narrow energy domain of the neutron elastic spectral distributions, and that an extremely fine energy grid is required in order to adequately resolve the problem under the current formulation. Since adding a sufficient number of energy points will render the code computationally inefficient, we revisit the light-ion transport theory developed for HZETRN2006 and focus on neutron elastic interactions. The new approach that is developed numerically integrates with adequate resolution in the energy domain without affecting the run-time of the code and is easily incorporated into the current code. Efforts were also made to optimize the computational efficiency of the light-ion propagator; a brief discussion of the efforts is given along with run-time comparisons between the original and updated codes. Convergence testing is then completed by running the code for various environments and shielding materials with many different energy grids to ensure stability of the proposed method.
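
    The truncation problem described, a narrow elastic spectral distribution that falls between the points of a coarse energy grid, can be reproduced with a toy quadrature. The Gaussian spectrum, grid sizes, and peak position below are illustrative only, not the HZETRN spectral distributions:

```python
import math

def narrow_spectrum(E, E0=1.01, width=1e-3):
    """Toy elastic spectral distribution: a sharp unit-area peak near E0."""
    return math.exp(-((E - E0) / width) ** 2) / (width * math.sqrt(math.pi))

def trapz(f, a, b, n):
    """Composite trapezoidal rule on a uniform grid of n intervals."""
    h = (b - a) / n
    return h * (f(a) / 2 + sum(f(a + i * h) for i in range(1, n)) + f(b) / 2)

# Grid spacing 0.04 is far wider than the peak, so the peak is missed entirely.
coarse = trapz(narrow_spectrum, 0.0, 2.0, 50)
# A grid fine enough to resolve the peak recovers the unit integral.
fine = trapz(narrow_spectrum, 0.0, 2.0, 200000)
```

    Since refining the global grid this much is computationally impractical in a transport code, the updated algorithm instead integrates the narrow elastic terms with adequate local resolution, which is the approach the abstract describes.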

  9. A Spatial Data Infrastructure for Environmental Noise Data in Europe.

    PubMed

    Abramic, Andrej; Kotsev, Alexander; Cetl, Vlado; Kephalopoulos, Stylianos; Paviotti, Marco

    2017-07-06

    Access to high quality data is essential in order to better understand the environmental and health impact of noise in an increasingly urbanised world. This paper analyses how recent developments of spatial data infrastructures in Europe can significantly improve the utilization of data and streamline reporting on a pan-European scale. The Infrastructure for Spatial Information in the European Community (INSPIRE), and Environmental Noise Directive (END) described in this manuscript provide principles for data management that, once applied, would lead to a better understanding of the state of environmental noise. Furthermore, shared, harmonised and easily discoverable environmental spatial data, required by the INSPIRE, would also support the data collection needed for the assessment and development of strategic noise maps. Action plans designed by the EU Member States to reduce noise and mitigate related effects can be shared to the public through already established nodes of the European spatial data infrastructure. Finally, data flows regarding reporting on the state of environment and END implementation to the European level can benefit by applying a decentralised e-reporting service oriented infrastructure. This would allow reported data to be maintained, frequently updated and enable pooling of information from/to other relevant and interrelated domains such as air quality, transportation, human health, population, marine environment or biodiversity. We describe those processes and provide a use case in which noise data from two neighbouring European countries are mapped to common data specifications, defined by INSPIRE, thus ensuring interoperability and harmonisation.

  10. A WBAN System for Ambulatory Monitoring of Physical Activity and Health Status: Applications and Challenges.

    PubMed

    Jovanov, E; Milenkovic, A; Otto, C; De Groen, P; Johnson, B; Warren, S; Taibi, G

    2005-01-01

    Recent technological advances in sensors, low-power integrated circuits, and wireless communications have enabled the design of low-cost, miniature, lightweight, intelligent physiological sensor platforms that can be seamlessly integrated into a body area network for health monitoring. Wireless body area networks (WBANs) promise unobtrusive ambulatory health monitoring for extended periods of time and near real-time updates of patients' medical records through the Internet. A number of innovative systems for health monitoring have recently been proposed. However, they typically rely on custom communication protocols and hardware designs, lacking generality and flexibility. The lack of standard platforms, system software support, and standards makes these systems expensive. Bulky sensors, high price, and frequent battery changes are all likely to limit user compliance. To address some of these challenges, we prototyped a WBAN utilizing a common off-the-shelf wireless sensor platform with a ZigBee-compliant radio interface and an ultra low-power microcontroller. The standard platform interfaces to custom sensor boards that are equipped with accelerometers for motion monitoring and a bioamplifier for electrocardiogram or electromyogram monitoring. Software modules for on-board processing, communication, and network synchronization have been developed using the TinyOS operating system. Although the initial WBAN prototype targets ambulatory monitoring of user activity, the developed sensors can easily be adapted to monitor other physiological parameters. In this paper, we discuss initial results, implementation challenges, and the need for standardization in this dynamic and promising research field.

  11. Graphic-based musculoskeletal model for biomechanical analyses and animation.

    PubMed

    Chao, Edmund Y S

    2003-04-01

    The ability to combine physiology and engineering analyses with computer sciences has opened the door to the possibility of creating the 'Virtual Human' reality. This paper presents a broad foundation for a full-featured biomechanical simulator for the human musculoskeletal system physiology. This simulation technology unites the expertise in biomechanical analysis and graphic modeling to investigate joint and connective tissue mechanics at the structural level and to visualize the results in both static and animated forms together with the model. Adaptable anatomical models including prosthetic implants and fracture fixation devices and a robust computational infrastructure for static, kinematic, kinetic, and stress analyses under varying boundary and loading conditions are incorporated on a common platform, the VIMS (Virtual Interactive Musculoskeletal System). Within this software system, a manageable database containing long bone dimensions, connective tissue material properties and a library of skeletal joint system functional activities and loading conditions are also available and they can easily be modified, updated and expanded. Application software is also available to allow end-users to perform biomechanical analyses interactively. This paper details the design, capabilities, and features of the VIMS development at Johns Hopkins University, an effort possible only through academic and commercial collaborations. Examples using these models and the computational algorithms in a virtual laboratory environment are used to demonstrate the utility of this unique database and simulation technology. This integrated system will impact on medical education, basic research, device development and application, and clinical patient care related to musculoskeletal diseases, trauma, and rehabilitation.

  12. New FINESSE Faculty Institutes for NASA Earth and Space Science Education

    NASA Astrophysics Data System (ADS)

    Slater, Timothy F.; Slater, Stephanie; Marshall, Sunette Sophia; Stork, Debra; Pomeroy, J. Richard R

    2014-06-01

    In a systematic effort to improve the preparation of future science teachers, scholars coordinated by the CAPER Center for Astronomy & Physics Education Research are providing a series of high-quality, 2-day professional development workshops, with year-round follow-up support, for college and university professors who prepare future science teachers to work with highly diverse student populations. These workshops focus on reforming and revitalizing the undergraduate science teaching methods courses and Earth and Space science content courses that future teachers most often take, so that these courses reflect contemporary pedagogies and data-rich, problem-based learning approaches steeped in authentic scientific inquiry, which consistently demonstrate effectiveness with diverse students. Participants themselves conduct data-rich science research projects during the institutes using highly regarded, proven models of inquiry. In addition, the Institute allocates significant time to illustrating best practices for working with diverse students. Moreover, participants leave with a well-formulated action plan to reform their courses targeting future teachers, to include more data-rich scientific inquiry lessons, and to focus more sharply on improving science education for a wide diversity of students. Through these workshops, faculty use a backwards-faded-scaffolding approach to build inquiry into a deeper understanding of science, using existing online data to develop and research astronomy questions: progressing from creating a valid and easily testable question, to simple data analysis, to arriving at a conclusion, and finally to presenting and supporting that conclusion in the classroom. An updated schedule is available at FINESSEProgram.org

  13. Model documentation for relations between continuous real-time and discrete water-quality constituents in the North Fork Ninnescah River upstream from Cheney Reservoir, south-central Kansas, 1999--2009

    USGS Publications Warehouse

    Stone, Mandy L.; Graham, Jennifer L.; Gatotho, Jackline W.

    2013-01-01

    Cheney Reservoir in south-central Kansas is one of the primary sources of water for the city of Wichita. The North Fork Ninnescah River is the largest contributing tributary to Cheney Reservoir. The U.S. Geological Survey has operated a continuous real-time water-quality monitoring station since 1998 on the North Fork Ninnescah River. Continuously measured water-quality physical properties include streamflow, specific conductance, pH, water temperature, dissolved oxygen, and turbidity. Discrete water-quality samples were collected during 1999 through 2009 and analyzed for sediment, nutrients, bacteria, and other water-quality constituents. Regression models were developed to establish relations between discretely sampled constituent concentrations and continuously measured physical properties to estimate concentrations of those constituents of interest that are not easily measured in real time because of limitations in sensor technology and fiscal constraints. Regression models were published in 2006 that were based on a different dataset collected during 1997 through 2003. This report updates those models using discrete and continuous data collected during January 1999 through December 2009. Models also were developed for five new constituents, including additional nutrient species and indicator bacteria. The water-quality information in this report is important to the city of Wichita because it allows the concentrations of many potential pollutants of interest, including nutrients and sediment, to be estimated in real time and characterized over conditions and time scales that would not be possible otherwise.
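Surrogate regressions of this kind relate a continuously measured property (such as turbidity) to a discretely sampled constituent (such as suspended-sediment concentration), often in log space. The report's actual model forms and coefficients are not reproduced here; this is a minimal sketch of the idea with hypothetical data.

```python
import math

def fit_linear(xs, ys):
    """Ordinary least-squares fit y = a + b*x (closed form, one predictor)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# Hypothetical paired observations: log10(turbidity) vs. log10(sediment concentration).
log_turb = [0.5, 1.0, 1.5, 2.0]
log_ssc = [1.1, 1.9, 3.1, 3.9]
a, b = fit_linear(log_turb, log_ssc)

# Estimate sediment concentration from a new real-time turbidity reading (40 units):
est = 10 ** (a + b * math.log10(40.0))
```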

  14. Limit states and reliability-based pipeline design. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zimmerman, T.J.E.; Chen, Q.; Pandey, M.D.

    1997-06-01

    This report provides the results of a study to develop limit states design (LSD) procedures for pipelines. Limit states design, also known as load and resistance factor design (LRFD), provides a unified approach to dealing with all relevant failure modes and combinations of concern. It explicitly accounts for the uncertainties that naturally occur in the determination of the loads which act on a pipeline and in the resistance of the pipe to failure. The load and resistance factors used are based on reliability considerations; however, the designer is not faced with carrying out probabilistic calculations. This work is done during development and periodic updating of the LSD document. This report provides background information concerning limit states and reliability-based design (Section 2), gives the limit states design procedures that were developed (Section 3), and provides results of the reliability analyses that were undertaken in order to partially calibrate the LSD method (Section 4). An appendix contains LSD design examples in order to demonstrate use of the method. Section 3, Limit States Design, has been written in the format of a recommended practice. It has been structured so that, in future, it can easily be converted to a limit states design code format. Throughout the report, figures and tables are given at the end of each section, with the exception of Section 3, where, to facilitate understanding of the LSD method, they have been included with the text.
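The core LRFD check described in the abstract is that factored resistance must not be exceeded by the sum of factored load effects. The factors below are illustrative placeholders, not the report's calibrated values.

```python
def lsd_check(resistance, res_factor, loads, load_factors):
    """Limit states design check: factored resistance must meet or exceed the
    sum of factored load effects. All numbers here are illustrative, not the
    reliability-calibrated factors developed in the report.
    """
    demand = sum(g * q for g, q in zip(load_factors, loads))
    return res_factor * resistance >= demand, demand

ok, demand = lsd_check(resistance=500.0, res_factor=0.9,
                       loads=[200.0, 120.0], load_factors=[1.25, 1.5])
# demand = 1.25*200 + 1.5*120 = 430.0; factored resistance 0.9*500 = 450.0, so the check passes
```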

  15. Predicate Argument Structure Analysis for Use Case Description Modeling

    NASA Astrophysics Data System (ADS)

    Takeuchi, Hironori; Nakamura, Taiga; Yamaguchi, Takahira

    In a large software system development project, many documents are prepared and updated frequently. In such a situation, support is needed for looking through these documents easily to identify inconsistencies and to maintain traceability. In this research, we focus on the requirements documents such as use cases and consider how to create models from the use case descriptions in unformatted text. In the model construction, we propose a few semantic constraints based on the features of the use cases and use them for a predicate argument structure analysis to assign semantic labels to actors and actions. With this approach, we show that we can assign semantic labels without enhancing any existing general lexical resources such as case frame dictionaries and design a less language-dependent model construction architecture. By using the constructed model, we consider a system for quality analysis of the use cases and automated test case generation to keep the traceability between document sets. We evaluated the reuse of the existing use cases and generated test case steps automatically with the proposed prototype system from real-world use cases in the development of a system using a packaged application. Based on the evaluation, we show how to construct models with high precision from English and Japanese use case data. Also, we could generate good test cases for about 90% of the real use cases through the manual improvement of the descriptions based on the feedback from the quality analysis system.
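The paper assigns semantic labels such as actor and action to use-case text via predicate argument structure analysis with semantic constraints. The toy function below only matches a known-actor lexicon and takes the following word as the action; it is a drastically simplified stand-in meant to illustrate what "labeling actors and actions in unformatted text" produces, not the authors' method.

```python
def label_use_case_step(step, known_actors):
    """Assign naive 'actor'/'action' labels to one use-case step.

    Toy stand-in for predicate argument structure analysis: find a known
    actor in the sentence and treat the next word as its action.
    """
    words = step.rstrip(".").split()
    for i in range(len(words)):
        for actor in known_actors:
            actor_words = actor.split()
            if [w.lower() for w in words[i:i + len(actor_words)]] == actor_words:
                action_idx = i + len(actor_words)
                action = words[action_idx] if action_idx < len(words) else None
                return {"actor": actor, "action": action}
    return {"actor": None, "action": None}

print(label_use_case_step("The user submits the order form.", ["user", "system"]))
# {'actor': 'user', 'action': 'submits'}
```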

  16. Nursing leadership succession planning in Veterans Health Administration: creating a useful database.

    PubMed

    Weiss, Lizabeth M; Drake, Audrey

    2007-01-01

    An electronic database was developed for succession planning and placement of nursing leaders who are interested in, and ready, willing, and able to accept, an assignment in a nursing leadership position. The tool is a 1-page form used to identify candidates for nursing leadership assignments. This tool has been deployed nationally, with access to the database restricted to nurse executives at every Veterans Health Administration facility for the purpose of entering the names of developed nurse leaders ready for a leadership assignment. The tool is easily accessed through the Veterans Health Administration Office of Nursing Service, and limiting access to the nurse executive group ensures that the candidates identified are qualified. The form captures the candidate's demographic information and certifications/credentials. The completed form is entered into a database from which a report can be generated, resulting in a listing of potential candidates to contact to supplement a local or Veterans Integrated Service Network-wide position announcement. The data forms can be sorted by position, area of clinical or functional experience, training programs completed, and geographic preference. Forms can be edited, updated, added, or deleted in the system as the need is identified. This tool gives facilities with limited internal candidates a resource of Department of Veterans Affairs-prepared staff from which to seek additional candidates. It also provides a way for interested candidates to be considered for positions outside of their local geographic area.
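The sorting and filtering workflow the record describes (by position, experience, and geographic preference) can be sketched as a simple query over candidate records. The field names and sample data below are invented for illustration; the actual form's fields are not published in this abstract.

```python
def find_candidates(records, position=None, region=None):
    """Filter succession-planning records by assignment sought and geographic
    preference, then sort by name. Field names are hypothetical.
    """
    hits = [r for r in records
            if (position is None or position in r["positions"])
            and (region is None or region in r["regions"])]
    return sorted(hits, key=lambda r: r["name"])

records = [
    {"name": "A. Smith", "positions": {"nurse executive"}, "regions": {"VISN 8"}},
    {"name": "B. Jones", "positions": {"nurse manager"}, "regions": {"VISN 8", "VISN 2"}},
]
print(find_candidates(records, position="nurse executive", region="VISN 8"))
```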

  17. Updating the 2001 National Land Cover Database Impervious Surface Products to 2006 using Landsat imagery change detection methods

    USGS Publications Warehouse

    Xian, George; Homer, Collin G.

    2010-01-01

    A prototype method was developed to update the U.S. Geological Survey (USGS) National Land Cover Database (NLCD) 2001 to a nominal date of 2006. NLCD 2001 is widely used as a baseline for national land cover and impervious cover conditions. To enable the updating of this database in an optimal manner, methods are designed to be accomplished by individual Landsat scene. Using conservative change thresholds based on land cover classes, areas of change and no-change were segregated from change vectors calculated from normalized Landsat scenes from 2001 and 2006. By sampling from NLCD 2001 impervious surface in unchanged areas, impervious surface predictions were estimated for changed areas within an urban extent defined by a companion land cover classification. Methods were developed and tested for national application across six study sites containing a variety of urban impervious surface. Results show the vast majority of impervious surface change associated with urban development was captured, with overall RMSE from 6.86 to 13.12% for these areas. Changes of urban development density were also evaluated by characterizing the categories of change by percentile for impervious surface. This prototype method provides a relatively low cost, flexible approach to generate updated impervious surface using NLCD 2001 as the baseline.
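The change vectors mentioned above measure per-pixel spectral difference between the two normalized Landsat dates; pixels whose change magnitude exceeds a threshold are flagged. The NLCD method uses conservative, land-cover-class-specific thresholds, so the single global threshold in this sketch is a simplification.

```python
def change_mask(bands_t1, bands_t2, threshold):
    """Per-pixel change-vector magnitude between two co-registered, normalized
    acquisitions; pixels whose spectral change exceeds the threshold are
    flagged as potential change. (The NLCD method applies class-specific
    conservative thresholds; one global value is used here for brevity.)
    """
    mask = []
    for p1, p2 in zip(bands_t1, bands_t2):
        magnitude = sum((a - b) ** 2 for a, b in zip(p1, p2)) ** 0.5
        mask.append(magnitude > threshold)
    return mask

# Two pixels x three bands (toy reflectance values):
t1 = [(0.10, 0.20, 0.30), (0.40, 0.35, 0.20)]
t2 = [(0.11, 0.19, 0.31), (0.10, 0.60, 0.55)]
print(change_mask(t1, t2, threshold=0.05))  # [False, True]
```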

  18. Development of reversible jump Markov Chain Monte Carlo algorithm in the Bayesian mixture modeling for microarray data in Indonesia

    NASA Astrophysics Data System (ADS)

    Astuti, Ani Budi; Iriawan, Nur; Irhamah, Kuswanto, Heri

    2017-12-01

    Bayesian mixture modeling requires identifying the most appropriate number of mixture components, so that the resulting mixture model fits the data (a data-driven concept). Reversible Jump Markov Chain Monte Carlo (RJMCMC) combines the reversible jump (RJ) concept with the Markov Chain Monte Carlo (MCMC) concept and is used by researchers to solve the problem of identifying the number of mixture components when that number is not known with certainty. In its application, RJMCMC uses the birth/death and split-merge concepts through six types of moves: w updating, θ updating, z updating, hyperparameter β updating, split-merge of components, and birth/death of empty components. The RJMCMC algorithm needs to be adapted to the case under study. The purpose of this study is to assess the performance of the developed RJMCMC algorithm in identifying the unknown number of mixture components in Bayesian mixture modeling for microarray data in Indonesia. The results show that the developed RJMCMC algorithm is able to properly identify the number of mixture components in the Bayesian normal mixture model for the Indonesian microarray data, for which the number of components is not known in advance.
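Every RJMCMC move listed above (w, θ, z, and hyperparameter updates, split-merge, birth/death) evaluates the mixture likelihood inside its acceptance ratio. The sketch below shows only that shared building block for a univariate normal mixture, with toy data; it is not the authors' algorithm, which adds the full transdimensional move machinery on top.

```python
import math

def mixture_loglik(data, weights, means, sds):
    """Log-likelihood of a univariate Gaussian mixture, the quantity that
    enters the acceptance ratio of every RJMCMC move."""
    total = 0.0
    for x in data:
        dens = sum(w * math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))
                   for w, m, s in zip(weights, means, sds))
        total += math.log(dens)
    return total

# Two well-separated components fit bimodal toy data better than one broad one,
# which is the kind of comparison that drives birth/death and split-merge moves:
data = [-2.1, -1.9, -2.0, 2.0, 1.9, 2.1]
two = mixture_loglik(data, [0.5, 0.5], [-2.0, 2.0], [0.2, 0.2])
one = mixture_loglik(data, [1.0], [0.0], [2.0])
print(two > one)  # True
```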

  19. Update in cardiology: vascular risk and cardiac rehabilitation.

    PubMed

    Galve, Enrique; Alegría, Eduardo; Cordero, Alberto; Fácila, Lorenzo; Fernández de Bobadilla, Jaime; Lluís-Ganella, Carla; Mazón, Pilar; de Pablo Zarzosa, Carmen; González-Juanatey, José Ramón

    2014-03-01

    Cardiovascular disease develops in a slow and subclinical manner over decades, only to manifest suddenly and unexpectedly. The role of prevention is crucial, both before and after clinical appearance, and there is ample evidence of the effectiveness and usefulness of the early detection of at-risk individuals and lifestyle modifications or pharmacological approaches. However, these approaches require time, perseverance, and continuous development. The present article reviews the developments in 2013 in epidemiological aspects related to prevention, includes relevant contributions in areas such as diet, weight control methods (obesity is now considered a disease), and physical activity recommendations (with warnings about the risk of strenuous exercise), deals with habit-related psychosocial factors such as smoking, provides an update on emerging issues such as genetics, addresses the links between cardiovascular disease and other pathologies such as kidney disease, summarizes the contributions of new, updated guidelines (3 of which have recently been released on topics of considerable clinical importance: hypertension, diabetes mellitus, and chronic kidney disease), analyzes the pharmacological advances (largely mediocre except for promising lipid-related results), and finishes by outlining developments in the oft-neglected field of cardiac rehabilitation. This article provides a briefing on controversial issues, presents interesting and somewhat surprising developments, updates established knowledge with undoubted application in clinical practice, and sheds light on potential future contributions. Copyright © 2013 Sociedad Española de Cardiología. Published by Elsevier Espana. All rights reserved.

  20. The Greek National Observatory of Forest Fires (NOFFi)

    NASA Astrophysics Data System (ADS)

    Tompoulidou, Maria; Stefanidou, Alexandra; Grigoriadis, Dionysios; Dragozi, Eleni; Stavrakoudis, Dimitris; Gitas, Ioannis Z.

    2016-08-01

    Efficient forest fire management is a key element for alleviating the catastrophic impacts of wildfires. Overall, the effective response to fire events necessitates adequate planning and preparedness before the start of the fire season, as well as quantifying the environmental impacts in case of wildfires. Moreover, the estimation of fire danger provides crucial information required for the optimal allocation and distribution of the available resources. The Greek National Observatory of Forest Fires (NOFFi)—established by the Greek Forestry Service in collaboration with the Laboratory of Forest Management and Remote Sensing of the Aristotle University of Thessaloniki and the International Balkan Center—aims to develop a series of modern products and services for supporting the efficient forest fire prevention management in Greece and the Balkan region, as well as to stimulate the development of transnational fire prevention and impacts mitigation policies. More specifically, NOFFi provides three main fire-related products and services: a) a remote sensing-based fuel type mapping methodology, b) a semi-automatic burned area mapping service, and c) a dynamically updatable fire danger index providing mid- to long-term predictions. The fuel type mapping methodology was developed and applied across the country, following an object-oriented approach and using Landsat 8 OLI satellite imagery. The results showcase the effectiveness of the generated methodology in obtaining highly accurate fuel type maps on a national level. The burned area mapping methodology was developed as a semi-automatic object-based classification process, carefully crafted to minimize user interaction and, hence, be easily applicable on a near real-time operational level as well as for mapping historical events. 
NOFFi's products can be visualized through the interactive Fire Forest portal, which allows the involvement and awareness of the relevant stakeholders via the Public Participation GIS (PPGIS) tool.

  1. Planning Tool for Strategic Evaluation of Facility Plans - 13570

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Magoulas, Virginia; Cercy, Michael; Hall, Irin

    2013-07-01

    Savannah River National Laboratory (SRNL) has developed a strategic planning tool for the evaluation of the utilization of its unique resources for processing and research and development of nuclear materials. The Planning Tool is a strategic level tool for assessing multiple missions that could be conducted utilizing the SRNL facilities and showcasing the plan. Traditional approaches using standard scheduling tools and laying out a strategy on paper tended to be labor intensive and offered either a limited or cluttered view for visualizing and communicating results. A tool that can assess the process throughput, duration, and utilization of the facility was needed. SRNL teamed with Newport News Shipbuilding (NNS), a division of Huntington Ingalls Industries, to create the next generation Planning Tool. The goal of this collaboration was to create a simulation based tool that allows for quick evaluation of strategies with respect to new or changing missions, and clearly communicates results to the decision makers. This tool has been built upon a mature modeling and simulation software previously developed by NNS. The Planning Tool provides a forum for capturing dependencies, constraints, activity flows, and variable factors. It is also a platform for quickly evaluating multiple mission scenarios, dynamically adding/updating scenarios, generating multiple views for evaluating/communicating results, and understanding where there are areas of risks and opportunities with respect to capacity. The Planning Tool that has been developed is useful in that it presents a clear visual plan for the missions at the Savannah River Site (SRS). It not only assists in communicating the plans to SRS corporate management, but also allows the area stakeholders a visual look at the future plans for SRS. The design of this tool makes it easily deployable to other facility and mission planning endeavors. (authors)

  2. HydroApps: An R package for statistical simulation to use in regional analysis

    NASA Astrophysics Data System (ADS)

    Ganora, D.

    2013-12-01

    The HydroApps package is a new R extension initially developed to support the use of a recent model for flood frequency estimation developed for applications in Northwestern Italy; it also contains some general tools for regional analyses and can be easily extended to include other statistical models. The package is currently at an experimental level of development. HydroApps is a corollary of the SSEM project for regional flood frequency analysis, although it was developed independently to support various instances of regional analyses. Its aim is to provide a basis for interplay between statistical simulation and practical operational use. In particular, the main module of the package deals with building the confidence bands of flood frequency curves expressed by means of their L-moments. Other functions include pre-processing and visualization of hydrologic time series and analysis of the optimal design flood under uncertainty, as well as tools useful in water resources management for the estimation of flow duration curves and their sensitivity to water withdrawals. Particular attention is devoted to code granularity, i.e. the level of detail and aggregation of the code: greater detail means more low-level functions, which provides more flexibility but reduces ease of use in practice. A balance between detail and simplicity is necessary and can be achieved with appropriate wrapping functions and specific help pages for each working block. From a more general viewpoint, the package does not yet have a user-friendly interface, but it runs on multiple operating systems and is easy to update, like many other open-source projects. The HydroApps functions and their features are reported in order to share ideas and materials that improve the 'technological' and information transfer between the scientific community and end users such as policy makers.
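One of the water-management tools mentioned above is the flow duration curve, which pairs each observed discharge with its exceedance probability. HydroApps is an R package; the Python sketch below only mirrors the idea, using the Weibull plotting position i/(n+1) as one common convention.

```python
def flow_duration_curve(flows):
    """Empirical flow duration curve: each discharge paired with its
    exceedance probability via the Weibull plotting position i/(n+1).
    (Illustrative only; HydroApps itself is an R package.)
    """
    ordered = sorted(flows, reverse=True)
    n = len(ordered)
    return [(q, (i + 1) / (n + 1)) for i, q in enumerate(ordered)]

daily_flows = [12.0, 3.5, 7.8, 1.2, 20.4, 5.0, 9.9, 2.2, 15.1]
fdc = flow_duration_curve(daily_flows)
# The highest flow has the lowest exceedance probability:
print(fdc[0])  # (20.4, 0.1)
```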

  3. Ontology to relational database transformation for web application development and maintenance

    NASA Astrophysics Data System (ADS)

    Mahmudi, Kamal; Inggriani Liem, M. M.; Akbar, Saiful

    2018-03-01

    Ontology is used as knowledge representation while a database is used as a facts recorder in a KMS (Knowledge Management System). In most applications, data are managed in a database system, updated through the application, and then transformed into knowledge as needed. Once a domain conceptor defines the knowledge in the ontology, the application and database can be generated from the ontology. Most existing frameworks generate the application from its database. In this research, the ontology is used for generating the application. As the data are updated through the application, a mechanism is designed to trigger an update to the ontology so that the application can be rebuilt based on the newest ontology. With this approach, a knowledge engineer has full flexibility to renew the application based on the latest ontology without depending on a software developer. In many cases, the concept needs to be updated when the data change. The framework was built and tested in a Spring Java environment. A case study was conducted to prove the concept.
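A common core of ontology-to-relational transformation is mapping classes to tables, datatype properties to columns, and object properties to foreign keys. The sketch below illustrates only that direction of transformation with a toy ontology description; the paper's actual mapping rules and generated application code are richer.

```python
def ontology_to_ddl(classes):
    """Generate CREATE TABLE statements from a toy ontology description:
    each class becomes a table, datatype properties become columns, and
    object properties become foreign keys. Illustrative mapping only.
    """
    ddl = []
    for name, spec in classes.items():
        cols = ["id INTEGER PRIMARY KEY"]
        cols += [f"{p} {t}" for p, t in spec.get("datatype", {}).items()]
        cols += [f"{p}_id INTEGER REFERENCES {target}(id)"
                 for p, target in spec.get("object", {}).items()]
        ddl.append(f"CREATE TABLE {name} ({', '.join(cols)});")
    return ddl

ontology = {
    "Author": {"datatype": {"name": "TEXT"}},
    "Book": {"datatype": {"title": "TEXT"}, "object": {"writtenBy": "Author"}},
}
for stmt in ontology_to_ddl(ontology):
    print(stmt)
```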

  4. How Documentalists Update SIMBAD

    NASA Astrophysics Data System (ADS)

    Buga, M.; Bot, C.; Brouty, M.; Bruneau, C.; Brunet, C.; Cambresy, L.; Eisele, A.; Genova, F.; Lesteven, S.; Loup, C.; Neuville, M.; Oberto, A.; Ochsenbein, F.; Perret, E.; Siebert, A.; Son, E.; Vannier, P.; Vollmer, B.; Vonflie, P.; Wenger, M.; Woelfel, F.

    2015-04-01

    The Strasbourg astronomical Data Center (CDS) was created in 1972 and has had a major role in astronomy for more than forty years. CDS develops a service called SIMBAD that provides basic data, cross-identifications, bibliography, and measurements for astronomical objects outside the solar system. It brings the scientific community added value through content that is updated daily by a team of documentalists working in close collaboration with astronomers and IT specialists. We explain how the CDS staff updates SIMBAD with object citations in the main astronomical journals, as well as with astronomical data and measurements. We also explain how the identification is made between the objects found in the literature and those already existing in SIMBAD. We show the steps followed by the documentalist team to update the database using different tools developed at CDS, such as the sky visualizer Aladin and the large catalogue and survey database VizieR. As a direct result of this teamwork, SIMBAD integrates almost 10,000 bibliographic references per year. The service receives more than 400,000 queries per day.

  5. Parallel updating and weighting of multiple spatial maps for visual stability during whole body motion

    PubMed Central

    Medendorp, W. P.

    2015-01-01

    It is known that the brain uses multiple reference frames to code spatial information, including eye-centered and body-centered frames. When we move our body in space, these internal representations are no longer in register with external space, unless they are actively updated. Whether the brain updates multiple spatial representations in parallel, or whether it restricts its updating mechanisms to a single reference frame from which other representations are constructed, remains an open question. We developed an optimal integration model to simulate the updating of visual space across body motion in multiple or single reference frames. To test this model, we designed an experiment in which participants had to remember the location of a briefly presented target while being translated sideways. The behavioral responses were in agreement with a model that uses a combination of eye- and body-centered representations, weighted according to the reliability in which the target location is stored and updated in each reference frame. Our findings suggest that the brain simultaneously updates multiple spatial representations across body motion. Because both representations are kept in sync, they can be optimally combined to provide a more precise estimate of visual locations in space than based on single-frame updating mechanisms. PMID:26490289
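The optimal integration model described above weights each reference frame by the reliability (inverse variance) with which the target location is stored and updated in it. The numbers below are illustrative, but the weighting rule is the standard inverse-variance combination the abstract appeals to.

```python
def combine_estimates(x_eye, var_eye, x_body, var_body):
    """Reliability-weighted (inverse-variance) combination of an eye-centered
    and a body-centered estimate of target location: each frame is weighted
    by its precision, and the combined variance is smaller than either alone.
    """
    w_eye = (1 / var_eye) / (1 / var_eye + 1 / var_body)
    w_body = 1 - w_eye
    x = w_eye * x_eye + w_body * x_body
    var = 1 / (1 / var_eye + 1 / var_body)
    return x, var

# Equally reliable frames: the combined estimate is their mean, with half the variance.
x, var = combine_estimates(x_eye=10.0, var_eye=4.0, x_body=14.0, var_body=4.0)
print(x, var)  # 12.0 2.0
```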

  6. Finite element modelling and updating of a lively footbridge: The complete process

    NASA Astrophysics Data System (ADS)

    Živanović, Stana; Pavic, Aleksandar; Reynolds, Paul

    2007-03-01

    The finite element (FE) model updating technology was originally developed in the aerospace and mechanical engineering disciplines to automatically update numerical models of structures to match their experimentally measured counterparts. The process of updating identifies the drawbacks in the FE modelling and the updated FE model could be used to produce more reliable results in further dynamic analysis. In the last decade, the updating technology has been introduced into civil structural engineering. It can serve as an advanced tool for getting reliable modal properties of large structures. The updating process has four key phases: initial FE modelling, modal testing, manual model tuning and automatic updating (conducted using specialist software). However, the published literature does not connect these phases well, although this is crucial when implementing the updating technology. This paper therefore aims to clarify the importance of this linking and to describe the complete model updating process as applicable in civil structural engineering. The complete process consisting of the four phases is outlined and brief theory is presented as appropriate. Then, the procedure is implemented on a lively steel box girder footbridge. It was found that even a very detailed initial FE model underestimated the natural frequencies of all seven experimentally identified modes of vibration, with the maximum error being almost 30%. Manual FE model tuning by trial and error found that flexible supports in the longitudinal direction should be introduced at the girder ends to improve correlation between the measured and FE-calculated modes. This significantly reduced the maximum frequency error to only 4%. It was demonstrated that only then could the FE model be automatically updated in a meaningful way. The automatic updating was successfully conducted by updating 22 uncertain structural parameters. Finally, a physical interpretation of all parameter changes is discussed.
This interpretation is often missing in the published literature. It was found that the composite slabs were less stiff than originally assumed and that the asphalt layer contributed considerably to the deck stiffness.
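Automatic updating adjusts uncertain structural parameters until computed modal properties match measurements. The one-parameter analogue below tunes a single stiffness so a single-degree-of-freedom model reproduces a measured natural frequency; the actual study updated 22 parameters against seven modes with specialist software, so this is only a conceptual sketch.

```python
import math

def updated_stiffness(measured_freq_hz, mass, lo=1.0, hi=1e9, tol=1e-6):
    """One-parameter analogue of automatic model updating: bisect on a
    stiffness k so the model's natural frequency f = sqrt(k/m)/(2*pi)
    matches the measured one.
    """
    def freq(k):
        return math.sqrt(k / mass) / (2 * math.pi)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if freq(mid) < measured_freq_hz:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical measurement: 2 Hz fundamental mode for a 1000 kg effective mass.
k = updated_stiffness(measured_freq_hz=2.0, mass=1000.0)
# The updated model reproduces the measured frequency: sqrt(k/m)/(2*pi) ≈ 2 Hz.
```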

  7. Kennedy Space Center Director Update

    NASA Image and Video Library

    2014-03-06

    CAPE CANAVERAL, Fla. - NASA Kennedy Space Center Director Robert Cabana, second from right, welcomes community leaders, business executives, educators, community organizers, and state and local government leaders to the Kennedy Space Center Visitor Complex Debus Center for the Kennedy Space Center Director Update. At far right is Brevard County District 1 Commissioner Robin Fisher. Attendees talked with Cabana and other senior Kennedy managers and visited displays featuring updates on Kennedy programs and projects, including International Space Station, Commercial Crew, Ground System Development and Operations, Launch Services, Center Planning and Development, Technology, KSC Swamp Works and NASA Education. The morning concluded with a tour of the new Space Shuttle Atlantis exhibit at the visitor complex. For more information, visit http://www.nasa.gov/kennedy. Photo credit: NASA/Daniel Casper

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    Reflecting Secretary O'Leary's focus on occupational safety and health, the Office of Occupational Safety is pleased to provide you with the latest update to the DOE Interpretations Guide to OSH Standards. This Guide was developed in cooperation with the Occupational Safety and Health Administration, which continued its support during this last revision by facilitating access to the interpretations found on the OSHA Computerized Information System (OCIS). This March 31, 1994 update contains 123 formal interpretation letters written by OSHA. As a result of the unique requests received by the 1-800 Response Line, this update also contains 38 interpretations developed by DOE. This new occupational safety and health information adds still more important guidance to the four volume reference set that you presently have in your possession.

  9. OOSTethys - Open Source Software for the Global Earth Observing Systems of Systems

    NASA Astrophysics Data System (ADS)

    Bridger, E.; Bermudez, L. E.; Maskey, M.; Rueda, C.; Babin, B. L.; Blair, R.

    2009-12-01

    An open source software project is much more than just picking the right license, hosting modular code and providing effective documentation. Success in advancing in an open collaborative way requires that the process match the expected code functionality to the developer's personal expertise and organizational needs as well as having an enthusiastic and responsive core lead group. We will present the lessons learned from OOSTethys, which is a community of software developers and marine scientists who develop open source tools, in multiple languages, to integrate ocean observing systems into an Integrated Ocean Observing System (IOOS). OOSTethys' goal is to dramatically reduce the time it takes to install, adopt and update standards-compliant web services. OOSTethys has developed servers, clients and a registry. Open source PERL, PYTHON, JAVA and ASP tool kits and reference implementations are helping the marine community publish near real-time observation data in interoperable standard formats. In some cases, publishing an Open Geospatial Consortium (OGC) Sensor Observation Service (SOS) from NetCDF files, a database, or even CSV text files could take only minutes, depending on the skills of the developer. OOSTethys is also developing an OGC standard registry, Catalog Service for Web (CSW). This open source CSW registry was implemented to easily register and discover SOSs using ISO 19139 service metadata. A web interface layer over the CSW registry simplifies the registration process by harvesting metadata describing the observations and sensors from the “GetCapabilities” response of SOS. OPENIOOS is the web client, developed in PERL to visualize the sensors in the SOS services. 
While the number of OOSTethys software developers is small, currently about 10 around the world, the number of OOSTethys toolkit implementers is larger and growing and the ease of use has played a large role in spreading the use of interoperable standards compliant web services widely in the marine community.
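The registry layer described above harvests metadata from an SOS "GetCapabilities" response. The sketch below parses offering names out of a trimmed, hypothetical SOS 1.0 capabilities document; real responses carry many more sections and namespaces than this fragment.

```python
import xml.etree.ElementTree as ET

# A trimmed, hypothetical SOS 1.0 GetCapabilities response (real documents
# are far larger and fully namespaced).
CAPS = """<Capabilities xmlns:sos="http://www.opengis.net/sos/1.0">
  <Contents>
    <sos:ObservationOffering gml:id="temp" xmlns:gml="http://www.opengis.net/gml">
      <name>Sea surface temperature</name>
    </sos:ObservationOffering>
    <sos:ObservationOffering gml:id="sal" xmlns:gml="http://www.opengis.net/gml">
      <name>Salinity</name>
    </sos:ObservationOffering>
  </Contents>
</Capabilities>"""

def offering_names(caps_xml):
    """Harvest observation-offering names from a GetCapabilities document,
    the kind of metadata a registry layer collects when registering an SOS.
    """
    root = ET.fromstring(caps_xml)
    sos_tag = "{http://www.opengis.net/sos/1.0}ObservationOffering"
    return [off.findtext("name") for off in root.iter(sos_tag)]

print(offering_names(CAPS))  # ['Sea surface temperature', 'Salinity']
```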

  10. Updated optical read/write memory system components

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The fabrication of an updated block data composer and holographic storage array for a breadboard holographic read/write memory system is described. System considerations such as transform optics and controlled aberration lens design are described along with the block data composer, photoplastic recording materials, and material development.

  11. Update to the USDA-ARS fixed-wing spray nozzle models

    USDA-ARS?s Scientific Manuscript database

    The current USDA ARS Aerial Spray Nozzle Models were updated to reflect new standardized measurement methods and systems, as well as to increase the operational spray pressure, aircraft airspeed, and nozzle orientation angle limits. The new models were developed using both Central Composite Design...

  12. Updating Working Memory and Arithmetical Attainment in School

    ERIC Educational Resources Information Center

    Iuculano, Teresa; Moro, Raffaella; Butterworth, Brian

    2011-01-01

    Here we wished to determine how the sub-components of Working Memory (Phonological-Loop and Central Executive) influence children's arithmetical development. Specifically, we aimed at distinguishing between Working Memory inhibition and updating processes within the Central Executive, and the domain-specificity (words and numbers) of both…

  13. RAEGE Project Update: Yebes Observatory Broadband Receiver Ready for VGOS

    NASA Astrophysics Data System (ADS)

    IGN Yebes Observatory staff

    2016-12-01

    An update of the deployment and activities of the Spanish/Portuguese RAEGE project ("Atlantic Network of Geodynamical and Space Stations") is presented. While regular observations with the Yebes radio telescope are ongoing, technological development of receivers for VGOS is progressing at the Yebes laboratories.

  14. What's New: Update on GASB and Accounting Standards.

    ERIC Educational Resources Information Center

    Marrone, Robert S.; Scharle, Robert E.

    1996-01-01

    Updates the Governmental Accounting Standards Board (GASB) statements, which pronounce upon and provide guidance in accounting and financial reporting for state and local governmental entities. Describes the development of GASB's governmental finance-reporting model project and identifies five components of internal control. One figure and two…

  15. 78 FR 72139 - Reporting and Recordkeeping Requirements Under OMB Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-02

    ...'s 7(a) loan program, are necessary to conform to recent updates to SBA's Standard Operating Procedures (SOP), Lender and Development Company Loan Programs, designated as SOP 50 10 5(F). The update resulted in changes related to franchise eligibility, character determinations, credit standards, and...

  16. On the number of different dynamics in Boolean networks with deterministic update schedules.

    PubMed

    Aracena, J; Demongeot, J; Fanchon, E; Montalva, M

    2013-04-01

    Deterministic Boolean networks are a type of discrete dynamical system widely used in the modeling of genetic networks. The dynamics of such systems are characterized by the local activation functions and the update schedule, i.e., the order in which the nodes are updated. In this paper, we address the problem of determining the different dynamics of a Boolean network when the update schedule is changed. We begin by proving that the problem of the existence of a pair of update schedules with different dynamics is NP-complete. However, we show that certain structural properties of the interaction digraph are sufficient to guarantee distinct dynamics of a network. In [1] the authors define equivalence classes with the property that all the update schedules of a given class yield the same dynamics. In order to determine the dynamics associated with a network, we develop an algorithm to efficiently enumerate these equivalence classes by selecting a representative update schedule for each class with a minimum number of blocks. Finally, we run this algorithm on the well-known Arabidopsis thaliana network to determine the full spectrum of its different dynamics. Copyright © 2013 Elsevier Inc. All rights reserved.
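
    The schedule-dependence described above is easy to reproduce on a toy network. The sketch below (a hypothetical 3-node network, not the Arabidopsis model from the paper) runs the same initial state under a fully synchronous schedule and under a sequential one-node-per-block schedule, and reaches different attractors:

```python
# Local activation functions of a hypothetical 3-node Boolean network.
def f0(s): return s[1] or s[2]
def f1(s): return s[0] and not s[2]
def f2(s): return not s[1]

FUNCS = [f0, f1, f2]

def step(state, schedule):
    """One pass of a deterministic update schedule: a list of blocks
    (tuples of node indices); nodes within a block update synchronously,
    blocks are applied in order."""
    s = list(state)
    for block in schedule:
        new = {i: int(FUNCS[i](s)) for i in block}  # evaluate block together
        for i, v in new.items():
            s[i] = v
    return tuple(s)

def attractor(state, schedule, max_iter=64):
    """Iterate until a state repeats; the returned state lies on the attractor."""
    seen = set()
    while state not in seen and len(seen) <= max_iter:
        seen.add(state)
        state = step(state, schedule)
    return state

parallel   = [(0, 1, 2)]          # fully synchronous schedule
sequential = [(0,), (1,), (2,)]   # one node at a time, in index order

start = (1, 0, 0)
print(attractor(start, parallel), attractor(start, sequential))
```

    Here the synchronous schedule settles into a 2-cycle while the sequential schedule reaches a fixed point, illustrating why the update schedule matters as much as the wiring of the network.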

  17. Citation Discovery Tools for Conducting Adaptive Meta-analyses to Update Systematic Reviews.

    PubMed

    Bae, Jong-Myon; Kim, Eun Hee

    2016-03-01

    The systematic review (SR) is a research methodology that aims to synthesize related evidence. Updating previously conducted SRs is necessary when new evidence has been produced, but no consensus has yet emerged on the appropriate update methodology. The authors have developed a new SR update method called 'adaptive meta-analysis' (AMA) using the 'cited by', 'similar articles', and 'related articles' citation discovery tools in the PubMed and Scopus databases. This study evaluates the usefulness of these citation discovery tools for updating SRs. Lists were constructed by applying the citation discovery tools in the two databases to the articles analyzed by a published SR. The degree of overlap between the lists and distribution of excluded results were evaluated. The articles ultimately selected for the SR update meta-analysis were found in the lists obtained from the 'cited by' and 'similar' tools in PubMed. Most of the selected articles appeared in both the 'cited by' lists in Scopus and PubMed. The Scopus 'related' tool did not identify the appropriate articles. The AMA, which involves using both citation discovery tools in PubMed, and optionally, the 'related' tool in Scopus, was found to be useful for updating an SR.
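
    For readers who want to automate the 'cited by' and 'similar articles' lookups that the AMA relies on, PubMed exposes both through the NCBI E-utilities ELink endpoint. A minimal sketch that only builds the query URLs (the PMIDs are arbitrary placeholders; the link names are the ones NCBI documents for these two tools):

```python
from urllib.parse import urlencode

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/elink.fcgi"

# ELink link names for PubMed's two citation discovery tools.
LINKNAMES = {
    "cited_by": "pubmed_pubmed_citedin",
    "similar":  "pubmed_pubmed",
}

def discovery_url(pmids, tool):
    """Build an ELink query retrieving the 'cited by' or 'similar
    articles' list for the PMIDs of an SR's included studies."""
    params = {
        "dbfrom": "pubmed",
        "linkname": LINKNAMES[tool],
        "id": ",".join(str(p) for p in pmids),
        "retmode": "json",
    }
    return EUTILS + "?" + urlencode(params)

url = discovery_url([26819335, 24194658], "cited_by")
print(url)
```

    Fetching each URL returns the candidate-article lists whose overlap the AMA then screens for the update meta-analysis.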

  18. Who's Who? Memory updating and character reference in children's narratives.

    PubMed

    Whitely, Cristy; Colozzo, Paola

    2013-10-01

    The capacity to update and monitor the contents of working memory is an executive function presumed to play a critical role in language processing. The current study used an individual differences approach to consider the relationship between memory updating and accurate reference to story characters in the narratives of typically developing children. English-speaking children from kindergarten to grade 2 (N = 63; mean age = 7.0 years) completed updating tasks, short-term memory tasks, and narrative productions. The authors used multiple regression to test whether updating accounted for independent variability in referential adequacy. The capacity to update working memory was related to adequate character reference beyond the effects of age and of short-term memory capacity, with the strongest relationship emerging for maintaining reference over multiple utterances. This individual differences study is the first to show a link between updating and performance in a discourse production task for young school-age children. The findings contribute to the growing body of research investigating the role of working memory in shaping language production. This study invites extension to children of different ages and language abilities as well as to other language production tasks.

  19. Building VoiceXML-Based Applications

    DTIC Science & Technology

    2002-01-01

    basketball games. The Busline systems were primarily developed using an early implementation of VoiceXML; the NBA Update Line was developed using VoiceXML...traveling in and out of Pittsburgh's university neighborhood. The second project is the NBA Update Line, which provides callers with real-time information on NBA ... NBA UPDATE LINE The target user of this system is a fairly knowledgeable basketball fan; the system must therefore be able to provide detailed

  20. Leveling the Playing Field: China’s Development of Advanced Energy Weapons

    DTIC Science & Technology

    2012-05-02

    Master of Military Studies research paper, 02-05-2012; dates covered: September 2011 - April 2012. ...weapons in a surprise attack scenario to counter superior U.S. capabilities and technology. This paper will update and review current and developing...

  1. Avionics Tether Operations Control

    NASA Technical Reports Server (NTRS)

    Glaese, John R.

    2001-01-01

    The activities described in this Final Report were authorized and performed under Purchase Order Number H32835D, issued as part of NASA contract number NAS8-00114. The period of performance of this PO was from March 1 to September 30, 2001. The primary work activity was the continued development and updating of the tether dynamic simulation tools GTOSS (Generalized Tethered Object System Simulation) and TSSIM (Tethered Satellite System Simulation), and the use of these and other tools in the analysis of various tether dynamics problems. Several updated versions of GTOSS were delivered during the period of performance by the author of the simulation, David Lang of Lang Associates. These updates mainly involved updated documentation and an updated coordinate system definition conforming to the J2000 standard. This Final Report is organized by the months in which the activities described were performed. The following sections review the Statement of Work (SOW) and the activities performed to satisfy it.

  2. Updated U.S. population standard for the Veterans RAND 12-item Health Survey (VR-12).

    PubMed

    Selim, Alfredo J; Rogers, William; Fleishman, John A; Qian, Shirley X; Fincke, Benjamin G; Rothendler, James A; Kazis, Lewis E

    2009-02-01

    The purpose of this project was to develop an updated U.S. population standard for the Veterans RAND 12-item Health Survey (VR-12). We used a well-defined and nationally representative sample of the U.S. population from 52,425 responses to the Medical Expenditure Panel Survey (MEPS) collected between 2000 and 2002. We applied modified regression estimates to update the non-proprietary 1990 scoring algorithms. We applied the updated standard to the Medicare Health Outcomes Survey (HOS) to compute the VR-12 physical (PCS(MEPS standard)) and mental (MCS(MEPS standard)) component summaries based on the MEPS. We compared these scores to the PCS and MCS based on the 1990 U.S. population standard. Using the updated U.S. population standard, the average VR-12 PCS(MEPS standard) and MCS(MEPS standard) scores in the Medicare HOS were 39.82 (standard deviation [SD] = 12.2) and 50.08 (SD = 11.4), respectively. For the same Medicare HOS, the average PCS and MCS scores based on the 1990 standard were 1.40 points higher and 0.99 points lower, respectively, than the VR-12 PCS and MCS. Changes in the U.S. population between 1990 and today make the old standard obsolete for the VR-12; the updated standard developed here is widely available to serve as a contemporary standard for future health-related quality of life (HRQoL) assessments.
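
    Norm-based scoring of this kind rescales raw component scores so that the chosen reference population has mean 50 and SD 10; re-norming against an updated reference therefore shifts every individual score. A minimal sketch with made-up reference means and SDs (not the actual MEPS or 1990 parameters):

```python
import numpy as np

def norm_based_scores(raw, ref_mean, ref_sd, target_mean=50.0, target_sd=10.0):
    """Rescale raw summary scores so the reference population has
    mean 50 and SD 10 (the convention used for PCS/MCS summaries)."""
    z = (np.asarray(raw, dtype=float) - ref_mean) / ref_sd  # z-score vs. reference
    return target_mean + target_sd * z

# Hypothetical numbers: scoring the same raw values against an updated
# reference population instead of the old one shifts the results.
raw = np.array([35.0, 45.0, 55.0])
old = norm_based_scores(raw, ref_mean=50.0, ref_sd=10.0)   # old reference
new = norm_based_scores(raw, ref_mean=48.5, ref_sd=11.0)   # updated reference
print(old, new)
```

    With the old reference at exactly mean 50 / SD 10 the transformation is the identity, which makes the shift introduced by the updated reference easy to see.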

  3. Development of an updated tensile neck injury criterion.

    PubMed

    Parr, Jeffrey C; Miller, Michael E; Schubert Kabban, Christine M; Pellettiere, Joseph A; Perry, Chris E

    2014-10-01

    Ejection neck safety remains a concern in military aviation with the growing use of helmet-mounted displays (HMDs) worn for entire mission durations. The original USAF tensile neck injury criterion proposed by Carter et al. (4) is updated, and an injury protection limit for tensile loading is presented to evaluate escape system and HMD safety. An existing tensile neck injury criterion was updated through the addition of newer post-mortem human subject (PMHS) tensile loading and injury data and the application of survival analysis to account for censoring in these data. The updated risk function was constructed with a combined human subject (N = 208) and PMHS (N = 22) data set. An updated AIS 3+ tensile neck injury criterion is proposed based upon human and PMHS data. This limit is significantly more conservative than the criterion proposed by Carter in 2000, yielding a 5% risk of AIS 3+ injury at a force of 1136 N, compared to a corresponding force of 1559 N. The inclusion of recent PMHS data into the original tensile neck injury criterion results in an injury protection limit that is significantly more conservative, as the recent PMHS data are substantially less censored than the PMHS data included in the earlier criterion. The updated tensile risk function developed in this work is consistent with the tensile risk function published by the Federal Aviation Administration and used as the basis for its neck injury criterion for side-facing aircraft seats.
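
    A risk function of this type is commonly modeled as a logistic curve in applied force, and a protection limit is the force at which the curve crosses the chosen risk level (here 5%). A sketch with purely illustrative parameters, not the published fit:

```python
import math

def injury_risk(force_n, a, b):
    """Logistic risk model: probability of AIS 3+ neck injury at a
    given tensile force in newtons (a, b are hypothetical parameters)."""
    return 1.0 / (1.0 + math.exp(-(a + b * force_n)))

def force_at_risk(p, a, b):
    """Invert the risk model: force yielding injury probability p."""
    return (math.log(p / (1.0 - p)) - a) / b

a, b = -8.0, 0.0045   # illustrative values only, NOT the published criterion
limit = force_at_risk(0.05, a, b)
print(round(limit, 1))
```

    A more conservative risk function (one that rises at lower forces) crosses the 5% level earlier, which is exactly how adding less-censored PMHS data lowers the protection limit.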

  4. Key science issues in the central and eastern United States for the next version of the USGS National Seismic Hazard Maps

    USGS Publications Warehouse

    Peterson, M.D.; Mueller, C.S.

    2011-01-01

    The USGS National Seismic Hazard Maps are updated about every six years by incorporating newly vetted science on earthquakes and ground motions. The 2008 hazard maps for the central and eastern United States region (CEUS) were updated by using revised New Madrid and Charleston source models, an updated seismicity catalog and an estimate of magnitude uncertainties, a distribution of maximum magnitudes, and several new ground-motion prediction equations. The new models resulted in significant ground-motion changes at 5 Hz and 1 Hz spectral acceleration with 5% damping compared to the 2002 version of the hazard maps. The 2008 maps have now been incorporated into the 2009 NEHRP Recommended Provisions, the 2010 ASCE-7 Standard, and the 2012 International Building Code. The USGS is now planning the next update of the seismic hazard maps, which will be provided to the code committees in December 2013. Science issues that will be considered for introduction into the CEUS maps include: 1) updated recurrence models for New Madrid sources, including new geodetic models and magnitude estimates; 2) new earthquake sources and techniques considered in the 2010 model developed by the nuclear industry; 3) new NGA-East ground-motion models (currently under development); and 4) updated earthquake catalogs. We will hold a regional workshop in late 2011 or early 2012 to discuss these and other issues that will affect the seismic hazard evaluation in the CEUS.

  5. HCV Health Policy Developments in Response to the National Viral Hepatitis Action Plan: A Brief Update.

    PubMed

    Guo, Yuqi; Sims, Omar T

    2017-02-17

    Hepatitis C virus (HCV) kills 366,000 people worldwide and 17,000 people in the United States each year. In 2011, the U.S. Department of Health and Human Services (HHS) published a national viral hepatitis action plan to control and combat HCV in the United States. This article provides a brief update of HCV health policy developments that have emerged since publication of HHS's national viral hepatitis action plan and concludes with a discussion of the public health impact of these recent HCV health policy developments.

  6. UPDATE ON PEC ACTIVITIES INCLUDING NEW EVALUATION CRITERIA, THE APPLICATION COMPLETENESS CHECKLIST, AND STATUS OF THE WEBSITE DEVELOPMENT

    EPA Science Inventory

    US EPA's Pathogen Equivalency Committee (PEC) has updated the evaluation criteria it uses to make recommendations of equivalency (to processes acceptable under 40CFR503) on innovative or alternative sludge pathogen reduction processes. These criteria will be presented along with ...

  7. From Neurons to Neighborhoods: An Update--Workshop Summary

    ERIC Educational Resources Information Center

    Olson, Steve

    2012-01-01

    "From Neurons to Neighborhoods: An Update: Workshop Summary" is based on the original study "From Neurons to Neighborhoods: Early Childhood Development," which was released in October of 2000. From the time of the original publication's release, much has occurred to cause a fundamental reexamination of the nation's…

  8. Austin Community College Benchmarking Update.

    ERIC Educational Resources Information Center

    Austin Community Coll., TX. Office of Institutional Effectiveness.

    Austin Community College contracted with MGT of America, Inc. in spring 1999 to develop a peer and benchmark (best) practices analysis on key indicators. These indicators were updated in spring 2002 using data from eight Texas community colleges and four non-Texas institutions that represent large, comprehensive, urban community colleges, similar…

  9. Update on Validity of Required Competencies for Worksite Health Professionals

    ERIC Educational Resources Information Center

    Becker, Craig; Rager, Robin C.; Wright, Fred Egbert

    2013-01-01

    Background: To improve global health, the workforce capacity of health promotion professionals must be strengthened through the provision of competencies necessary to deliver effective programs. Purpose: This study provides an updated analysis of the validity of the worksite health promotion (WHP) professional competencies developed in 2000 by the…

  10. An Updated Measure for Assessing Subtle Rape Myths

    ERIC Educational Resources Information Center

    McMahon, Sarah; Farmer, G. Lawrence

    2011-01-01

    Social workers responsible for developing rape prevention programs on college campuses must have valid evaluation instruments. This article presents the challenges encountered by the authors when they attempted to keep rape myth measures relevant to student populations by updating the language to reflect the subtleties involved with rape myths.…

  11. The Deficit Profile of Working Memory, Inhibition, and Updating in Chinese Children with Reading Difficulties

    ERIC Educational Resources Information Center

    Peng, Peng; Sha, Tao; Li, Beilei

    2013-01-01

    This study investigated executive function deficits among Chinese children with reading difficulties. Verbal and numerical measures of working memory, inhibition, updating, and processing speed were examined among children with only reading difficulties (RD), children with reading and mathematics difficulties (RDMD), and typically developing peers…

  12. Political Education: National Policy Comes of Age. The Updated Edition

    ERIC Educational Resources Information Center

    Cross, Christopher T.

    2010-01-01

    Political insider Christopher Cross has updated his critically acclaimed book to reflect recent education policy developments, including the impact of the Obama administration and "Race to the Top" as well as the controversy over NCLB's reauthorization. Featuring a new introduction and the addition of postscripts for key chapters, this…

  13. Validating and updating a prediction rule for serious bacterial infection in patients with fever without source.

    PubMed

    Bleeker, S E; Derksen-Lubsen, G; Grobbee, D E; Donders, A R T; Moons, K G M; Moll, H A

    2007-01-01

    To externally validate and update a previously developed rule for predicting the presence of serious bacterial infection in children with fever without apparent source. Patients aged 1-36 months presenting with fever without source were prospectively enrolled. Serious bacterial infection included bacterial meningitis, sepsis, bacteraemia, pneumonia, urinary tract infection, bacterial gastroenteritis, and osteomyelitis/ethmoiditis. The generalizability of the original rule was determined. Subsequently, the prediction rule was updated using all available data on the patients with fever without source (1996-1998 and 2000-2001, n = 381) using multivariable logistic regression. The generalizability of the rule appeared insufficient in the new patients (n = 150). In the updated rule, independent predictors from history and examination were duration of fever, vomiting, ill clinical appearance, chest-wall retractions, and poor peripheral circulation (ROC area (95% CI): 0.69 (0.63-0.75)). Additional independent predictors from the laboratory were serum white blood cell count, C-reactive protein, and ≥70 white blood cells in urinalysis (ROC area (95% CI): 0.83 (0.78-0.88)). A previously developed rule for predicting the presence of serious bacterial infection in children with fever without apparent source was updated. Its clinical score can be used as a first screening tool. Additional laboratory testing may further specify the individual risk estimate (range: 4-54%).
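
    The ROC areas reported above can be computed without any statistics package: the AUC equals the rank-based Mann-Whitney statistic, i.e. the probability that a randomly chosen case scores higher on the rule than a randomly chosen control. A minimal sketch with hypothetical clinical scores:

```python
import numpy as np

def roc_auc(y_true, score):
    """Rank-based AUC (Mann-Whitney): probability that a random case
    scores higher than a random control, with ties counted as 1/2."""
    y = np.asarray(y_true)
    s = np.asarray(score, dtype=float)
    order = np.argsort(s)
    ranks = np.empty(len(s))
    ranks[order] = np.arange(1, len(s) + 1)
    for v in np.unique(s):              # average ranks within ties
        m = s == v
        ranks[m] = ranks[m].mean()
    n1 = y.sum()
    n0 = len(y) - n1
    return (ranks[y == 1].sum() - n1 * (n1 + 1) / 2) / (n1 * n0)

# Hypothetical data: higher score = more predictors of serious
# bacterial infection present; y = 1 marks confirmed infection.
y     = [0, 0, 1, 0, 1, 1, 0, 1]
score = [1, 2, 3, 1, 4, 2, 0, 5]
print(round(roc_auc(y, score), 3))
```

    Validating an updated rule on new patients amounts to recomputing this statistic on the external cohort, which is how a drop in generalizability (such as that found here) becomes visible.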

  14. Theater Nuclear Force Survivability, Security and Safety Instrumentation. Volume I. Engineering Development Phase - Fiscal Year 1980.

    DTIC Science & Technology

    1980-12-31

    development and acquisition program. It is generally agreed that the measures of merit in system acquisition programs are costs, schedule, and achievement...very few system acquisitions have successfully achieved their predicted measures of merit. The reasons for the poor record have been attributed to a...and Logistics -- The instrumentation must be easily maintained and easily transported to remote test sites in CONUS and Europe.

  15. Conceptual Development of a National Volcanic Hazard Model for New Zealand

    NASA Astrophysics Data System (ADS)

    Stirling, Mark; Bebbington, Mark; Brenna, Marco; Cronin, Shane; Christophersen, Annemarie; Deligne, Natalia; Hurst, Tony; Jolly, Art; Jolly, Gill; Kennedy, Ben; Kereszturi, Gabor; Lindsay, Jan; Neall, Vince; Procter, Jonathan; Rhoades, David; Scott, Brad; Shane, Phil; Smith, Ian; Smith, Richard; Wang, Ting; White, James D. L.; Wilson, Colin J. N.; Wilson, Tom

    2017-06-01

    We provide a synthesis of a workshop held in February 2016 to define the goals, challenges and next steps for developing a national probabilistic volcanic hazard model for New Zealand. The workshop involved volcanologists, statisticians, and hazards scientists from GNS Science, Massey University, University of Otago, Victoria University of Wellington, University of Auckland, and University of Canterbury. We also outline key activities that will develop the model components, define procedures for periodic update of the model, and effectively articulate the model to end-users and stakeholders. The development of a National Volcanic Hazard Model is a formidable task that will require long-term stability in terms of team effort, collaboration and resources. Development of the model in stages or editions that are modular will make the process a manageable one that progressively incorporates additional volcanic hazards over time, and additional functionalities (e.g. short-term forecasting). The first edition is likely to be limited to updating and incorporating existing ashfall hazard models, with the other hazards associated with lahar, pyroclastic density currents, lava flow, ballistics, debris avalanche, and gases/aerosols being considered in subsequent updates.

  16. Self-learning Monte Carlo method and cumulative update in fermion systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Junwei; Shen, Huitao; Qi, Yang

    2017-06-07

    In this study, we develop the self-learning Monte Carlo (SLMC) method, a general-purpose numerical method recently introduced to simulate many-body systems, for studying interacting fermion systems. Our method uses a highly efficient update algorithm, which we design and dub “cumulative update”, to generate new candidate configurations in the Markov chain based on a self-learned bosonic effective model. From a general analysis and a numerical study of the double exchange model as an example, we find that the SLMC with cumulative update drastically reduces the computational cost of the simulation while remaining statistically exact. Remarkably, its computational complexity is far less than that of the conventional algorithm with local updates.
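
    The statistical exactness of SLMC comes from correcting proposals drawn from the cheap effective model with a Metropolis-Hastings acceptance step. A toy sketch over four discrete states (illustrative weights, not the double exchange model):

```python
import random

random.seed(1)

# "True" model weights (expensive to evaluate in a real fermion
# simulation) and a self-learned effective model that only
# approximates them.
target    = [4.0, 2.0, 1.0, 1.0]
effective = [3.0, 3.0, 1.0, 1.0]

def sample_from(weights):
    """Draw a state index with probability proportional to its weight."""
    r = random.random() * sum(weights)
    for i, w in enumerate(weights):
        r -= w
        if r <= 0:
            return i
    return len(weights) - 1

def slmc_chain(n_steps, state=0):
    """Global proposals from the effective model, corrected by a
    Metropolis-Hastings acceptance so the chain stays exact."""
    counts = [0] * len(target)
    for _ in range(n_steps):
        prop = sample_from(effective)     # cheap global proposal
        ratio = (target[prop] * effective[state]) / (target[state] * effective[prop])
        if random.random() < min(1.0, ratio):
            state = prop
        counts[state] += 1
    return [c / n_steps for c in counts]

freqs = slmc_chain(200_000)
print([round(f, 2) for f in freqs])
```

    Even though the effective model is only approximate, the acceptance ratio guarantees the chain samples the true target distribution (here 0.5, 0.25, 0.125, 0.125); the quality of the learned model affects only the acceptance rate, not correctness.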

  17. An updated Lagrangian particle hydrodynamics (ULPH) for Newtonian fluids

    NASA Astrophysics Data System (ADS)

    Tu, Qingsong; Li, Shaofan

    2017-11-01

    In this work, we have developed an updated Lagrangian particle hydrodynamics (ULPH) for Newtonian fluids. Unlike smoothed particle hydrodynamics, the non-local particle hydrodynamics formulation proposed here is consistent and convergent. Unlike state-based peridynamics, the discrete particle dynamics proposed here has no internal material bonds between particles, and it is not formulated with respect to an initial or fixed referential configuration. Specifically, we have shown that (1) the non-local updated Lagrangian particle hydrodynamics formulation converges to the conventional local fluid mechanics formulation; (2) it can capture arbitrary flow discontinuities without any changes in the formulation; and (3) the proposed non-local particle hydrodynamics is computationally efficient and robust.

  18. Choosing Important Health Outcomes for Comparative Effectiveness Research: An Updated Review and Identification of Gaps.

    PubMed

    Gorst, Sarah L; Gargon, Elizabeth; Clarke, Mike; Smith, Valerie; Williamson, Paula R

    2016-01-01

    The COMET (Core Outcome Measures in Effectiveness Trials) Initiative promotes the development and application of core outcome sets (COS), including relevant studies in an online database. In order to keep the database current, an annual search of the literature is undertaken. This study aimed to update a previous systematic review, in order to identify any further studies where a COS has been developed. Furthermore, no prioritization for COS development has previously been undertaken, therefore this study also aimed to identify COS relevant to the world's most prevalent health conditions. The methods used in this updated review followed the same approach used in the original review and the previous update. A survey was also sent to the corresponding authors of COS identified for inclusion in this review, to ascertain what lessons they had learnt from developing their COS. Additionally, the COMET database was searched to identify COS that might be relevant to the conditions with the highest global prevalence. Twenty-five reports relating to 22 new studies were eligible for inclusion in the review. Further improvements were identified in relation to the description of the scope of the COS, use of the Delphi technique, and the inclusion of patient participants within the development process. Additionally, 33 published and ongoing COS were identified for 13 of the world's most prevalent conditions. The development of a reporting guideline and minimum standards should contribute towards future improvements in development and reporting of COS. This study has also described a first approach to identifying gaps in existing COS, and to priority setting in this area. Important gaps have been identified, on the basis of global burden of disease, and the development and application of COS in these areas should be considered a priority.

  19. Intelligent E-Learning Systems: Automatic Construction of Ontologies

    NASA Astrophysics Data System (ADS)

    Peso, Jesús del; de Arriaga, Fernando

    2008-05-01

    In recent years a new generation of Intelligent E-Learning Systems (ILS) has emerged with enhanced functionality due mainly to influences from Distributed Artificial Intelligence, the use of cognitive modelling, the extensive use of the Internet, and new educational ideas such as student-centered education and Knowledge Management. The automatic construction of ontologies provides a means of automatically updating the knowledge bases of the respective ILS and of increasing their interoperability and communication, since they share the same ontology. The paper presents a new approach, able to produce ontologies from a small number of documents, such as those obtained from the Internet, without the assistance of large corpora, using simple syntactic rules and some semantic information. The method is independent of the natural language used. The use of a multi-agent system increases the flexibility and capability of the method. Although the method can easily be improved, the results obtained so far are promising.

  20. Ordering structured populations in multiplayer cooperation games

    PubMed Central

    Peña, Jorge; Wu, Bin; Traulsen, Arne

    2016-01-01

    Spatial structure greatly affects the evolution of cooperation. While in two-player games the condition for cooperation to evolve depends on a single structure coefficient, in multiplayer games the condition might depend on several structure coefficients, making it difficult to compare different population structures. We propose a solution to this issue by introducing two simple ways of ordering population structures: the containment order and the volume order. If population structure A is greater than population structure B in the containment or the volume order, then A can be considered a stronger promoter of cooperation than B. We provide conditions for establishing the containment order, give general results on the volume order, and illustrate our theory by comparing different models of spatial games and associated update rules. Our results hold for a large class of population structures and can be easily applied to specific cases once the structure coefficients have been calculated or estimated. PMID:26819335

  1. Membrane Protein Production in E. coli Lysates in Presence of Preassembled Nanodiscs.

    PubMed

    Rues, Ralf-Bernhardt; Gräwe, Alexander; Henrich, Erik; Bernhard, Frank

    2017-01-01

    Cell-free expression allows membrane proteins to be synthesized in completely new formats that can relatively easily be customized for particular applications. Amphiphilic superstructures such as micelles, lipomicelles, or nanodiscs can be provided as nano-devices for the solubilization of membrane proteins. Defined empty bilayers in the form of nanodiscs offer native-like environments for membrane proteins, supporting functional folding, proper oligomeric assembly, and stability. Even very difficult and detergent-sensitive membrane proteins can be addressed by combining nanodisc technology with efficient cell-free expression systems, as direct co-translational insertion of nascent membrane proteins into supplied preassembled nanodiscs is possible. This chapter provides updated protocols for the synthesis of membrane proteins in the presence of preassembled nanodiscs, suitable for emerging applications such as screening of lipid effects on membrane protein function and the modulation of oligomeric complex formation.

  2. An iPod treatment of amblyopia: an updated binocular approach.

    PubMed

    Hess, Robert F; Thompson, B; Black, J M; Machara, G; Zhang, P; Bobier, W R; Cooperstock, J

    2012-02-15

    We describe the successful translation of computerized and space-consuming laboratory equipment for the treatment of suppression to a small handheld iPod device (Apple iPod; Apple Inc., Cupertino, California). A portable and easily obtainable Apple iPod display using current video technology offers an ideal solution for the clinical treatment of suppression. The following describes the iPod device and illustrates how a video game has been adapted to provide the appropriate stimulation to implement our recent antisuppression treatment protocol. One to two hours per day of video game playing under controlled conditions for one to three weeks can improve acuity and restore binocular function, including stereopsis, in adults well beyond the age at which traditional patching is used. This handheld platform provides a convenient and effective means of implementing the newly proposed binocular treatment of amblyopia in the clinic, home, or elsewhere. American Optometric Association.

  3. Parallel transformation of K-SVD solar image denoising algorithm

    NASA Astrophysics Data System (ADS)

    Liang, Youwen; Tian, Yu; Li, Mei

    2017-02-01

    The images obtained by observing the Sun through a large telescope always suffer from noise due to low SNR. The K-SVD denoising algorithm can effectively remove Gaussian white noise. Training dictionaries for sparse representation is a time-consuming task, due to the large size of the data involved and the complexity of the training algorithms. In this paper, the OpenMP parallel programming language is used to transform the serial algorithm into a parallel version, following a data-parallelism model. The biggest change is that multiple atoms, rather than a single atom, are updated simultaneously. The denoising effect and acceleration performance were tested after completion of the parallel algorithm; the speedup of the program is 13.563 using 16 cores. This parallel version can fully utilize multi-core CPU hardware resources, greatly reduces running time, and is easy to port to multi-core platforms.
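
    The data-parallel change described, updating many atoms at once, works because each atom's rank-1 refit can be computed from the same residual snapshot, making the per-atom updates independent and distributable across threads. A numpy sketch of one such batch dictionary update (an illustrative reimplementation, not the authors' OpenMP code):

```python
import numpy as np

rng = np.random.default_rng(0)

def batch_atom_update(Y, D0, X0):
    """Refit every dictionary atom against the SAME residual snapshot
    (D0, X0), so the per-atom rank-1 SVD updates are independent of one
    another and could run in parallel threads."""
    D, X = D0.copy(), X0.copy()
    for k in range(D0.shape[1]):
        omega = np.nonzero(X0[k])[0]      # signals that use atom k
        if omega.size == 0:
            continue
        # error matrix with atom k's old contribution added back in
        E = Y[:, omega] - D0 @ X0[:, omega] + np.outer(D0[:, k], X0[k, omega])
        U, s, Vt = np.linalg.svd(E, full_matrices=False)
        D[:, k] = U[:, 0]                 # new unit-norm atom
        X[k, omega] = s[0] * Vt[0]        # new coefficients for atom k
    return D, X

# Toy problem: 8-dimensional signals, 4 atoms, ~30% dense sparse codes.
Y = rng.normal(size=(8, 40))
D = rng.normal(size=(8, 4)); D /= np.linalg.norm(D, axis=0)
X = rng.normal(size=(4, 40)) * (rng.random((4, 40)) < 0.3)

D2, X2 = batch_atom_update(Y, D, X)
print(np.allclose(np.linalg.norm(D2, axis=0), 1.0))
```

    Because every atom reads from the snapshot rather than from the freshly updated neighbors, this variant trades the strict sequential K-SVD update for parallelism, which is the essence of the speedup reported above.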

  4. A review of the Nearctic genus Zealeuctra Ricker (Plecoptera, Leuctridae), with the description of a new species from the Cumberland Plateau region of eastern North America.

    PubMed

    Grubbs, Scott A; Kondratieff, Boris C; Stark, Bill P; Dewalt, R Edward

    2013-01-01

    The stonefly genus Zealeuctra (Plecoptera: Leuctridae) is endemic to the central and eastern Nearctic regions and presently comprises 10 species. Scanning electron microscopy (SEM) was used to examine and redescribe two important diagnostic features typically used to identify and define the adult male stage: the large, anteriorly-recurved epiproct and the medial cleft of the ninth abdominal tergite. SEM was also employed to depict the posteromedial portion of the female 7th sternum. A new species, Z. ukayodi sp. n., is described from the Cumberland Plateau region of northeastern Alabama and Tennessee. The new species appears superficially similar to Z. talladega Grubbs, but is easily differentiated by characteristics of the male medial cleft. An updated taxonomic key to the males of Zealeuctra is provided.

  5. A review of the Nearctic genus Zealeuctra Ricker (Plecoptera, Leuctridae), with the description of a new species from the Cumberland Plateau region of eastern North America

    PubMed Central

    Grubbs, Scott A.; Kondratieff, Boris C.; Stark, Bill P.; DeWalt, R. Edward

    2013-01-01

    The stonefly genus Zealeuctra (Plecoptera: Leuctridae) is endemic to the central and eastern Nearctic regions and presently comprises 10 species. Scanning electron microscopy (SEM) was used to examine and redescribe two important diagnostic features typically used to identify and define the adult male stage: the large, anteriorly-recurved epiproct and the medial cleft of the ninth abdominal tergite. SEM was also employed to depict the posteromedial portion of the female 7th sternum. A new species, Z. ukayodi sp. n., is described from the Cumberland Plateau region of northeastern Alabama and Tennessee. The new species appears superficially similar to Z. talladega Grubbs, but is easily differentiated by characteristics of the male medial cleft. An updated taxonomic key to the males of Zealeuctra is provided. PMID:24194658

  6. Controllable Edge Feature Sharpening for Dental Applications

    PubMed Central

    2014-01-01

    This paper presents a new approach to sharpen blurred edge features in scanned tooth preparation surfaces generated by structured-light scanners. It aims to efficiently enhance the edge features so that the embedded feature lines can be easily identified in dental CAD systems, and to avoid unnatural oversharpening geometry. We first separate the feature regions using graph-cut segmentation, which does not require a user-defined threshold. Then, we filter the face normal vectors to propagate the geometry from the smooth region to the feature region. In order to control the degree of the sharpness, we propose a feature distance measure which is based on normal tensor voting. Finally, the vertex positions are updated according to the modified face normal vectors. We have applied the approach to scanned tooth preparation models. The results show that the blurred edge features are enhanced without unnatural oversharpening geometry. PMID:24741376

  7. Controllable edge feature sharpening for dental applications.

    PubMed

    Fan, Ran; Jin, Xiaogang

    2014-01-01

    This paper presents a new approach to sharpen blurred edge features in scanned tooth preparation surfaces generated by structured-light scanners. It aims to efficiently enhance the edge features so that the embedded feature lines can be easily identified in dental CAD systems, and to avoid unnatural oversharpening geometry. We first separate the feature regions using graph-cut segmentation, which does not require a user-defined threshold. Then, we filter the face normal vectors to propagate the geometry from the smooth region to the feature region. In order to control the degree of the sharpness, we propose a feature distance measure which is based on normal tensor voting. Finally, the vertex positions are updated according to the modified face normal vectors. We have applied the approach to scanned tooth preparation models. The results show that the blurred edge features are enhanced without unnatural oversharpening geometry.
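    The final step of the pipeline above, updating vertex positions to agree with modified face normals, can be sketched with a standard normal-driven vertex update (a common scheme in normal-filtering mesh processing, not necessarily the authors' exact formulation): each vertex is moved along each incident face's filtered normal to reduce its offset from that face's plane. The mesh, normals, and step size below are illustrative.

    ```python
    import numpy as np

    def update_vertices(verts, faces, n, lam=0.5, iters=20):
        """Move vertices toward agreement with filtered face normals `n`
        (one unit normal per face)."""
        verts = verts.copy()
        for _ in range(iters):
            delta = np.zeros_like(verts)
            count = np.zeros(len(verts))
            for (i, j, k), nf in zip(faces, n):
                c = (verts[i] + verts[j] + verts[k]) / 3.0  # face centroid
                for v in (i, j, k):
                    # Project the vertex's offset from the face plane onto nf.
                    delta[v] += nf * np.dot(nf, c - verts[v])
                    count[v] += 1
            verts += lam * delta / np.maximum(count, 1)[:, None]
        return verts

    # Toy example: a quad with one vertex lifted off the plane; the filtered
    # normals say the surface should be flat, so the update flattens it.
    verts = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0.3]], float)
    faces = [(0, 1, 2), (0, 2, 3)]
    n = np.array([[0, 0, 1], [0, 0, 1]], float)
    out = update_vertices(verts, faces, n)
    ```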

  8. MM Algorithms for Geometric and Signomial Programming

    PubMed Central

    Lange, Kenneth; Zhou, Hua

    2013-01-01

    This paper derives new algorithms for signomial programming, a generalization of geometric programming. The algorithms are based on a generic principle for optimization called the MM algorithm. In this setting, one can apply the geometric-arithmetic mean inequality and a supporting hyperplane inequality to create a surrogate function with parameters separated. Thus, unconstrained signomial programming reduces to a sequence of one-dimensional minimization problems. Simple examples demonstrate that the MM algorithm derived can converge to a boundary point or to one point of a continuum of minimum points. Conditions under which the minimum point is unique or occurs in the interior of parameter space are proved for geometric programming. Convergence to an interior point occurs at a linear rate. Finally, the MM framework easily accommodates equality and inequality constraints of signomial type. For the most important special case, constrained quadratic programming, the MM algorithm involves very simple updates. PMID:24634545

  9. A trust region approach with multivariate Padé model for optimal circuit design

    NASA Astrophysics Data System (ADS)

    Abdel-Malek, Hany L.; Ebid, Shaimaa E. K.; Mohamed, Ahmed S. A.

    2017-11-01

    Since the optimization process requires a significant number of consecutive function evaluations, it is recommended to replace the function by an easily evaluated approximation model during the optimization process. The model suggested in this article is based on a multivariate Padé approximation. This model is constructed using data points of ?, where ? is the number of parameters. The model is updated over a sequence of trust regions. This model avoids the slow convergence of linear models of ? and has features of quadratic models that need interpolation data points of ?. The proposed approach is tested by applying it to several benchmark problems. Yield optimization using such a direct method is applied to some practical circuit examples. Minimax solution leads to a suitable initial point to carry out the yield optimization process. The yield is optimized by the proposed derivative-free method for active and passive filter examples.
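    The trust-region mechanics described above can be illustrated with a generic derivative-free loop, here using an interpolated quadratic surrogate in one dimension rather than the article's multivariate Padé model (the acceptance ratio, step clipping, and radius update are the standard trust-region ingredients; the test function is illustrative).

    ```python
    import numpy as np

    def trust_region_minimize(f, x0, delta=1.0, iters=40):
        """Derivative-free trust-region loop with a quadratic surrogate
        interpolated through three sample points."""
        x = x0
        for _ in range(iters):
            pts = np.array([x - delta, x, x + delta])
            vals = np.array([f(p) for p in pts])
            a, b, c = np.polyfit(pts, vals, 2)      # surrogate a*t^2 + b*t + c
            if a > 0:
                cand = np.clip(-b / (2 * a), x - delta, x + delta)
            else:  # concave model: step to the better trust-region boundary
                cand = pts[0] if vals[0] < vals[2] else pts[2]
            pred = (a * x * x + b * x + c) - (a * cand * cand + b * cand + c)
            actual = f(x) - f(cand)
            rho = actual / pred if pred > 0 else -1.0
            if rho > 0.1:
                x = cand                            # accept the step
            if rho > 0.75:
                delta *= 2.0                        # model trusted: expand
            elif rho < 0.25:
                delta *= 0.5                        # model poor: shrink
        return x

    # Convex test function exp(t) - 2t, whose minimum is at t = ln 2.
    x_min = trust_region_minimize(lambda t: np.exp(t) - 2 * t, x0=3.0)
    ```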

  10. MM Algorithms for Geometric and Signomial Programming.

    PubMed

    Lange, Kenneth; Zhou, Hua

    2014-02-01

    This paper derives new algorithms for signomial programming, a generalization of geometric programming. The algorithms are based on a generic principle for optimization called the MM algorithm. In this setting, one can apply the geometric-arithmetic mean inequality and a supporting hyperplane inequality to create a surrogate function with parameters separated. Thus, unconstrained signomial programming reduces to a sequence of one-dimensional minimization problems. Simple examples demonstrate that the MM algorithm derived can converge to a boundary point or to one point of a continuum of minimum points. Conditions under which the minimum point is unique or occurs in the interior of parameter space are proved for geometric programming. Convergence to an interior point occurs at a linear rate. Finally, the MM framework easily accommodates equality and inequality constraints of signomial type. For the most important special case, constrained quadratic programming, the MM algorithm involves very simple updates.
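    The core MM step, majorizing a coupled monomial with the arithmetic-geometric mean inequality so that the surrogate separates by parameter, can be shown on a toy posynomial f(x1, x2) = x1*x2 + 1/x1 + 1/x2 (this example is ours, not one from the paper). At the current iterate (x1m, x2m), AM-GM gives x1*x2 <= (x2m/x1m)*x1^2/2 + (x1m/x2m)*x2^2/2 with equality at the iterate, so each coordinate update has a closed form.

    ```python
    import numpy as np

    def mm_posynomial(x1, x2, iters=50):
        """MM iteration for f(x1,x2) = x1*x2 + 1/x1 + 1/x2.
        Setting the derivative of the separable surrogate to zero gives
        x1 <- (x1m/x2m)**(1/3) and x2 <- (x2m/x1m)**(1/3)."""
        f = lambda a, b: a * b + 1 / a + 1 / b
        vals = [f(x1, x2)]
        for _ in range(iters):
            # Simultaneous update (RHS evaluated before assignment).
            x1, x2 = (x1 / x2) ** (1 / 3), (x2 / x1) ** (1 / 3)
            vals.append(f(x1, x2))
        return x1, x2, vals

    x1, x2, vals = mm_posynomial(2.0, 0.5)
    ```

    By the MM descent property, each iterate cannot increase f; here the iteration converges to the true minimizer (1, 1) with f = 3.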

  11. The NASA computer aided design and test system

    NASA Technical Reports Server (NTRS)

    Gould, J. M.; Juergensen, K.

    1973-01-01

    A family of computer programs facilitating the design, layout, evaluation, and testing of digital electronic circuitry is described. CADAT (computer aided design and test system) is intended for use by NASA and its contractors and is aimed predominantly at providing cost effective microelectronic subsystems based on custom designed metal oxide semiconductor (MOS) large scale integrated circuits (LSIC's). CADAT software can be easily adopted by installations with a wide variety of computer hardware configurations. Its structure permits ease of update to more powerful component programs and to newly emerging LSIC technologies. The components of the CADAT system are described stressing the interaction of programs rather than detail of coding or algorithms. The CADAT system provides computer aids to derive and document the design intent, includes powerful automatic layout software, permits detailed geometry checks and performance simulation based on mask data, and furnishes test pattern sequences for hardware testing.

  12. Fixed-head star tracker attitude updates on the Hubble Space Telescope

    NASA Technical Reports Server (NTRS)

    Nadelman, Matthew S.; Karl, Jeffrey B.; Hallock, Lou

    1994-01-01

    The Hubble Space Telescope (HST) was launched in April 1990 to begin observing celestial space to the edge of the universe. National Aeronautics and Space Administration (NASA) standard fixed-head star trackers (FHST's) are used operationally onboard the HST to regularly adjust ('update') the spacecraft attitude before the acquisition of guide stars for science observations. During the first 3 months of the mission, the FHST's updated the spacecraft attitude successfully only 85 percent of the time. During the other periods, the trackers were unable to find the selected stars -- either they failed to find any star, or worse, they selected incorrect stars and produced erroneous attitude updates. In July 1990, the HST project office at Goddard Space Flight Center (GSFC) requested that Computer Sciences Corporation (CSC) form an investigative 'tiger' team to examine these FHST update failures. This paper discusses the work of the FHST tiger team, describes the investigations that led the team to identify the sources of the errors, and defines the solutions that were subsequently developed, which ultimately increased the success rate of FHST updates to approximately 98 percent.

  13. Updating beliefs and combining evidence in adaptive forest management under climate change: a case study of Norway spruce (Picea abies L. Karst) in the Black Forest, Germany.

    PubMed

    Yousefpour, Rasoul; Temperli, Christian; Bugmann, Harald; Elkin, Che; Hanewinkel, Marc; Meilby, Henrik; Jacobsen, Jette Bredahl; Thorsen, Bo Jellesmark

    2013-06-15

    We study climate uncertainty and how managers' beliefs about climate change develop and influence their decisions. We develop an approach for updating knowledge and beliefs based on the observation of forest and climate variables and illustrate its application for the adaptive management of an even-aged Norway spruce (Picea abies L. Karst) forest in the Black Forest, Germany. We simulated forest development under a range of climate change scenarios and forest management alternatives. Our analysis used Bayesian updating and Dempster's rule of combination to simulate how observations of climate and forest variables may influence a decision maker's beliefs about climate development and thereby management decisions. While forest managers may be inclined to rely on observed forest variables to infer climate change and impacts, we found that observation of the climate state, e.g., temperature or precipitation, is superior for updating beliefs and supporting decision-making. However, with little conflict among information sources, the strongest evidence would be offered by a combination of at least two informative variables, e.g., temperature and precipitation. The success of adaptive forest management depends on when managers switch to forward-looking management schemes. Thus, robust climate adaptation policies may depend crucially on a better understanding of what factors influence managers' belief in climate change. Copyright © 2013 Elsevier Ltd. All rights reserved.
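    The Bayesian-updating component can be sketched as a standard discrete Bayes update: beliefs over a few climate scenarios are reweighted by the likelihood of an observed climate variable under each scenario. The scenarios, means, and noise level below are hypothetical, not the study's calibrated values.

    ```python
    import numpy as np

    def update_beliefs(prior, means, sigma, obs):
        """Bayes update of scenario beliefs from one observation, assuming the
        observation is normally distributed around the true scenario mean."""
        like = np.exp(-0.5 * ((obs - np.asarray(means)) / sigma) ** 2)
        post = np.asarray(prior) * like
        return post / post.sum()

    # Three hypothetical scenarios: no change, moderate, strong warming,
    # characterized by predicted temperature anomalies (in K).
    prior = [1 / 3, 1 / 3, 1 / 3]
    beliefs = update_beliefs(prior, means=[0.0, 1.0, 2.0], sigma=0.5, obs=1.2)
    ```

    An observed anomaly of 1.2 K shifts most belief mass to the moderate-warming scenario; repeating the update with each new observation implements the sequential belief revision the study simulates.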

  14. Brucellosis update in Libya and regional prospective

    PubMed Central

    Ahmed, Mohamed O; Abouzeed, Yousef M; Bennour, Emad M; van Velkinburgh, Jennifer C

    2015-01-01

    Brucellosis is a global bacterial zoonosis responsible for high morbidity in humans and significant livestock economic losses. While brucellosis remains a public health concern worldwide, its global geographic distribution is variable, largely due to different management schemes; however, paucity of information renders the status of brucellosis unclear and incomplete in many countries, especially those with low income and under-developed infrastructure. This short article summarizes and discusses recent important updates on brucellosis from the North African countries, with a particular brief emphasis on the current status and recent updates in Libya. PMID:25578285

  15. Brucellosis update in Libya and regional prospective.

    PubMed

    Ahmed, Mohamed O; Abouzeed, Yousef M; Bennour, Emad M; van Velkinburgh, Jennifer C

    2015-02-01

    Brucellosis is a global bacterial zoonosis responsible for high morbidity in humans and significant livestock economic losses. While brucellosis remains a public health concern worldwide, its global geographic distribution is variable, largely due to different management schemes; however, paucity of information renders the status of brucellosis unclear and incomplete in many countries, especially those with low income and under-developed infrastructure. This short article summarizes and discusses recent important updates on brucellosis from the North African countries, with a particular brief emphasis on the current status and recent updates in Libya.

  16. Assimilation of CryoSat-2 altimetry to a hydrodynamic model of the Brahmaputra river

    NASA Astrophysics Data System (ADS)

    Schneider, Raphael; Nygaard Godiksen, Peter; Ridler, Marc-Etienne; Madsen, Henrik; Bauer-Gottwein, Peter

    2016-04-01

    Remote sensing provides valuable data for the parameterization and updating of hydrological models, for example water level measurements of inland water bodies from satellite radar altimeters. Satellite altimetry data from repeat-orbit missions such as Envisat, ERS, or Jason have been used in many studies, as have synthetic wide-swath altimetry data of the kind expected from the SWOT mission. This study is one of the first hydrologic applications of altimetry data from a drifting-orbit satellite mission, namely CryoSat-2. CryoSat-2 is equipped with the SIRAL instrument, a new type of radar altimeter similar to SRAL on Sentinel-3. CryoSat-2 SARIn level 2 data are used to improve a 1D hydrodynamic model of the Brahmaputra river basin in South Asia set up in the DHI MIKE 11 software. CryoSat-2 water levels were extracted over river masks derived from Landsat imagery. After discharge calibration, simulated water levels were fitted to the CryoSat-2 data along the Assam valley by adapting cross section shapes and datums. The resulting hydrodynamic model shows an accurate spatio-temporal representation of water levels, which is a prerequisite for real-time model updating by assimilation of CryoSat-2 altimetry or multi-mission data in general. For this task, a data assimilation framework has been developed and linked with the MIKE 11 model. It is a flexible framework that can assimilate water level data which are arbitrarily distributed in time and space. Different types of error models, data assimilation methods, etc., can easily be used and tested. Furthermore, it is possible to update not only the water level of the hydrodynamic model, but also the states of the rainfall-runoff models providing the forcing of the hydrodynamic model. The setup has been used to assimilate CryoSat-2 observations over the Assam valley for the years 2010 to 2013. Different data assimilation methods and localizations were tested, together with different model error representations.
Furthermore, the impact of different filtering and clustering methods and error descriptions of the CryoSat-2 observations was evaluated. Performance improvement in terms of discharge and water level forecast due to the assimilation of satellite altimetry data was then evaluated. The model forecasts were also compared to climatology and persistence forecasts. Using ensemble based filters, the evaluation was done not only based on performance criteria for the central forecast such as root-mean-square error (RMSE) and Nash-Sutcliffe model efficiency (NSE), but also based on sharpness, reliability and continuous ranked probability score (CRPS) of the ensemble of probabilistic forecasts.
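    The abstract mentions ensemble-based filters without naming one; a common choice in such frameworks is the stochastic ensemble Kalman filter, sketched here for a single directly observed water level (identity observation operator). The numbers are synthetic stand-ins, not Brahmaputra data.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def enkf_update(ens, obs, obs_err):
        """Stochastic EnKF analysis step for a scalar observed state:
        each ensemble member is nudged toward a perturbed observation."""
        n = ens.shape[0]
        anomalies = ens - ens.mean()
        var_f = anomalies @ anomalies / (n - 1)       # forecast variance
        k = var_f / (var_f + obs_err ** 2)            # Kalman gain
        perturbed = obs + obs_err * rng.standard_normal(n)
        return ens + k * (perturbed - ens)

    ens = 4.0 + 0.5 * rng.standard_normal(100)   # forecast water levels (m)
    analysis = enkf_update(ens, obs=4.6, obs_err=0.1)
    ```

    The analysis ensemble shifts toward the observation and its spread contracts, which is the mechanism by which altimetry observations sharpen the probabilistic water level and discharge forecasts evaluated in the study.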

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Chin-Cheng, E-mail: chen.ccc@gmail.com; Chang, Chang; Mah, Dennis

    Purpose: The spot characteristics for proton pencil beam scanning (PBS) were measured and analyzed over a 16 month period, which included one major site configuration update and six cyclotron interventions. The results provide a reference to establish the quality assurance (QA) frequency and tolerance for proton pencil beam scanning. Methods: A simple treatment plan was generated to produce an asymmetric 9-spot pattern distributed throughout a field of 16 × 18 cm for each of 18 proton energies (100.0–226.0 MeV). The delivered fluence distribution in air was measured using a phosphor screen based CCD camera at three planes perpendicular to the beam line axis (x-ray imaging isocenter and up/down stream 15.0 cm). The measured fluence distributions for each energy were analyzed using in-house programs which calculated the spot sizes and positional deviations of the Gaussian shaped spots. Results: Compared to the spot characteristic data installed into the treatment planning system, the 16-month averaged deviations of the measured spot sizes at the isocenter plane were 2.30% and 1.38% in the IEC gantry x and y directions, respectively. The maximum deviation was 12.87% while the minimum deviation was 0.003%, both at the upstream plane. After the collinearity of the proton and x-ray imaging system isocenters was optimized, the positional deviations of the spots were all within 1.5 mm for all three planes. During the site configuration update, spot positions were found to deviate by 6 mm until the tuning parameters file was properly restored. Conclusions: For this beam delivery system, it is recommended to perform a spot size and position check at least monthly and any time after a database update or cyclotron intervention occurs. A spot size deviation tolerance of <15% can be easily met with this delivery system. Deviations of spot positions were <2 mm at any plane up/down stream 15 cm from the isocenter.

  18. Technical Note: Spot characteristic stability for proton pencil beam scanning.

    PubMed

    Chen, Chin-Cheng; Chang, Chang; Moyers, Michael F; Gao, Mingcheng; Mah, Dennis

    2016-02-01

    The spot characteristics for proton pencil beam scanning (PBS) were measured and analyzed over a 16 month period, which included one major site configuration update and six cyclotron interventions. The results provide a reference to establish the quality assurance (QA) frequency and tolerance for proton pencil beam scanning. A simple treatment plan was generated to produce an asymmetric 9-spot pattern distributed throughout a field of 16 × 18 cm for each of 18 proton energies (100.0-226.0 MeV). The delivered fluence distribution in air was measured using a phosphor screen based CCD camera at three planes perpendicular to the beam line axis (x-ray imaging isocenter and up/down stream 15.0 cm). The measured fluence distributions for each energy were analyzed using in-house programs which calculated the spot sizes and positional deviations of the Gaussian shaped spots. Compared to the spot characteristic data installed into the treatment planning system, the 16-month averaged deviations of the measured spot sizes at the isocenter plane were 2.30% and 1.38% in the IEC gantry x and y directions, respectively. The maximum deviation was 12.87% while the minimum deviation was 0.003%, both at the upstream plane. After the collinearity of the proton and x-ray imaging system isocenters was optimized, the positional deviations of the spots were all within 1.5 mm for all three planes. During the site configuration update, spot positions were found to deviate by 6 mm until the tuning parameters file was properly restored. For this beam delivery system, it is recommended to perform a spot size and position check at least monthly and any time after a database update or cyclotron intervention occurs. A spot size deviation tolerance of <15% can be easily met with this delivery system. Deviations of spot positions were <2 mm at any plane up/down stream 15 cm from the isocenter.
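    The in-house analysis programs are not described in detail; a minimal stand-in for extracting a Gaussian spot's position and size from a CCD fluence image is the moment-based estimate below (centroid and RMS width in pixels). The image and spot parameters are synthetic.

    ```python
    import numpy as np

    def spot_stats(img):
        """Centroid and RMS width (pixels) of a spot image via image moments."""
        y, x = np.indices(img.shape)
        total = img.sum()
        cx, cy = (x * img).sum() / total, (y * img).sum() / total
        sx = np.sqrt(((x - cx) ** 2 * img).sum() / total)
        sy = np.sqrt(((y - cy) ** 2 * img).sum() / total)
        return cx, cy, sx, sy

    # Synthetic Gaussian spot: center (30.0, 42.0), sigma 5.0 pixels.
    y, x = np.indices((80, 80))
    img = np.exp(-((x - 30.0) ** 2 + (y - 42.0) ** 2) / (2 * 5.0 ** 2))
    cx, cy, sx, sy = spot_stats(img)
    ```

    Comparing the recovered (cx, cy, sx, sy) per energy and plane against the commissioning baseline gives exactly the size and position deviations tracked in this QA program; for noisy measured images a 2D Gaussian fit is the more robust alternative.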

  19. Computerized Workstation for Tsunami Hazard Monitoring

    NASA Astrophysics Data System (ADS)

    Lavrentiev-Jr, Mikhail; Marchuk, Andrey; Romanenko, Alexey; Simonov, Konstantin; Titov, Vasiliy

    2010-05-01

    We present the general structure and functionality of the proposed Computerized Workstation for Tsunami Hazard Monitoring (CWTHM). The tool allows interactive monitoring of hazards, tsunami risk assessment, and mitigation at all stages, from the period of strong tsunamigenic earthquake preparation to inundation of the defended coastal areas. CWTHM is a software-hardware complex with a set of software applications, optimized to achieve best performance on the hardware platforms in use. The complex is calibrated for selected tsunami source zone(s) and coastal zone(s) to be defended. The number of zones (both source and coastal) is determined, or restricted, by available hardware resources. The presented complex performs monitoring of the selected tsunami source zones via the Internet. The authors developed original algorithms that enable automatic detection of the preparation zone of a strong underwater earthquake. For the detected zone, the event time, magnitude, and spatial location of the tsunami source are evaluated by analyzing the energy of the seismic precursors (foreshocks). All of the above parameters are updated after each foreshock. Once a preparing event is detected, several scenarios are forecast for the wave amplitude parameters as well as the inundation zone. Estimates include the lowest and highest wave amplitudes and the smallest and largest inundation zones. In addition, the most probable case is calculated. In the case of multiple defended coastal zones, forecasts and estimates can be computed in parallel. Each time the simulated model wave reaches deep-ocean buoys or a tide gauge, the expected values of the wave parameters and inundation zones are updated using historical event information and pre-calculated scenarios. The Method of Splitting Tsunami (MOST) software package is used for mathematical simulation. The authors suggest code acceleration for deep water wave propagation. 
As a result, performance is 15 times faster than the original version of MOST. The performance gain is achieved through compiler options, the use of optimized libraries, and OpenMP parallel technology. Moreover, it is possible to achieve a 100-fold code acceleration by using modern graphics processing units (GPUs). Parallel evaluation of inundation zones for multiple coastal zones is also available. All computer codes can be easily built under MS Windows and the Unix OS family. Although the software is virtually platform independent, the greatest performance gain is achieved when using the recommended hardware components. When a seismic event occurs, all relevant parameters are updated with seismic data and wave propagation monitoring is enabled. As soon as the wave passes each deep-ocean tsunameter, the parameters of the initial displacement at the source are updated from direct calculations based on original algorithms. For better source reconstruction, a combination of two methods is used: an optimal linear combination of unit sources from a precalculated database, and direct numerical inversion along the wave ray between the real source and particular measurement buoys. A specific dissipation parameter along the wave ray is also taken into account. During the entire wave propagation process, the expected wave parameters and inundation zone characteristics are updated with all available information. If the recommended hardware components are used, monitoring results are available in real time. The suggested version of CWTHM has been tested by analyzing seismic precursors (foreshocks) and the measured tsunami waves in the North Pacific for the Central Kuril tsunamigenic earthquake of November 15, 2006.
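    The "optimal unit source linear combination" step amounts to a least-squares fit: find the weights on precomputed unit-source waveforms that best reproduce the measured buoy records. This sketch uses random stand-in waveforms rather than real MOST unit-source solutions.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Columns of G: precomputed unit-source waveforms sampled at the buoys
    # (time samples from all buoys stacked into one vector per source).
    G = rng.standard_normal((200, 5))
    w_true = np.array([0.0, 2.0, 0.5, 0.0, 1.0])       # true source weights
    d = G @ w_true + 0.01 * rng.standard_normal(200)   # noisy buoy records

    # Least-squares combination of unit sources best matching the records.
    w_est, *_ = np.linalg.lstsq(G, d, rcond=None)
    ```

    In practice the weights would be re-estimated each time a new tsunameter record arrives, which is how the source parameters are progressively refined during wave propagation.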

  20. Development of a Scaling Technique for Sociometric Data.

    ERIC Educational Resources Information Center

    Peper, John B.; Chansky, Norman M.

    This study explored the stability and interjudge agreements of a sociometric scaling device to which children could easily respond, which teachers could easily administer and score, and which provided scores that researchers could use in parametric statistical analyses. Each student was paired with every other member of his class. He voted on each…
