NASA Astrophysics Data System (ADS)
Schaap, Dick M. A.; Maudire, Gilbert
2010-05-01
SeaDataNet is a leading infrastructure in Europe for marine and ocean data management. It actively operates and further develops a pan-European infrastructure for managing, indexing and providing access to ocean and marine data sets and data products, acquired via research cruises and other observational activities, both in situ and by remote sensing. The basis of SeaDataNet is the interconnection of 40 National Oceanographic Data Centres (NODCs) and marine data centres from 35 countries around the European seas into a distributed network of data resources with common standards for metadata, vocabularies, data transport formats, quality control methods and flags, and access. Most of these NODCs operate, or are developing, national networks with other institutes in their countries to ensure national coverage and long-term stewardship of the available data sets. The majority of data managed by SeaDataNet partners concerns physical oceanography, marine chemistry and hydrography, together with a substantial volume of marine biology, geology and geophysics. These data are partly owned by the partner institutes themselves and, to a large extent, by other organisations in their countries. The SeaDataNet infrastructure is implemented with support of the EU via the EU FP6 SeaDataNet project to provide a pan-European data management system adapted both to the fragmented observation system and to the users' need for integrated access to data, metadata, products and services. The SeaDataNet project has a duration of 5 years and started in 2006, but it builds upon earlier data management infrastructure projects undertaken over a period of 20 years by an expanding network of oceanographic data centres from the countries around all European seas. Its predecessor project, Sea-Search, had a strict focus on metadata. SeaDataNet maintains significant interest in the further development of the metadata infrastructure, extending its services with the provision of easy data access and generic data products. Version 1 of its infrastructure upgrade was launched in April 2008 and is now well underway to include all 40 data centres at the V1 level. It comprises the network of 40 interconnected data centres (NODCs) and a central SeaDataNet portal. V1 provides users with a unified and transparent overview of the metadata and controlled access to the large collections of data sets that are managed at these data centres.
The SeaDataNet V1 infrastructure comprises the following middleware services:
• Discovery services = Metadata directories and User interfaces
• Vocabulary services = Common vocabularies and Governance
• Security services = Authentication, Authorization & Accounting
• Delivery services = Requesting and Downloading of data sets
• Viewing services = Mapping of metadata
• Monitoring services = Statistics on system usage and performance, and Registration of data requests and transactions
• Maintenance services = Entry and updating of metadata by data centres
Good progress is also being made with extending the SeaDataNet infrastructure with V2 services:
• Viewing services = Quick views and Visualisation of data and data products
• Product services = Generic and standard products
• Exchange services = Transformation of SeaDataNet portal CDI output to INSPIRE compliance
As a basis for the V1 services, common standards have been defined for metadata and data formats, common vocabularies, quality flags and quality control methods, based on international standards such as ISO 19115, OGC, NetCDF (CF) and ODV, on best practices from IOC and ICES, and following INSPIRE developments. An important objective of the SeaDataNet V1 infrastructure is to provide transparent access to the distributed data sets via a unique user interface and download service. In the SeaDataNet V1 architecture the Common Data Index (CDI) V1 metadata service provides the link between discovery and delivery of data sets. The CDI user interface enables users to gain a detailed insight into the availability and geographical distribution of marine data archived at the connected data centres. It provides sufficient information to allow the user to assess the data relevance. Moreover, the CDI user interface provides the means for downloading data sets in common formats via a transaction mechanism. The SeaDataNet portal provides registered users access to these distributed data sets via the CDI V1 Directory and a shopping basket mechanism. This allows registered users to locate data of interest and submit their data requests. The requests are forwarded automatically from the portal to the relevant SeaDataNet data centres. This process is controlled via the Request Status Manager (RSM) web service at the portal and a Download Manager (DM) Java software module implemented at each of the data centres. The RSM also enables registered users to regularly check the status of their requests and to download data sets once access has been granted. Data centres can follow all transactions for their data sets online and can handle requests which require their consent. The actual delivery of data sets is done between the user and the selected data centre. Very good progress is being made with connecting all SeaDataNet data centres and their data sets to the CDI V1 system. At present the CDI V1 system provides users with functionality to discover and download more than 500,000 data sets, a number which is steadily increasing. The SeaDataNet architecture provides a coherent system of the various V1 services and allows for the inclusion of the V2 services. For the implementation, a range of technical components have been defined and developed. These make use of recent web technologies and also comprise Java components, to provide multi-platform support and syntactic interoperability. To facilitate sharing of resources and interoperability, SeaDataNet has adopted SOAP Web Services technology for various communication tasks.
The SeaDataNet architecture has been designed as a multi-disciplinary system from the beginning. It is able to support a wide variety of data types and to serve several sector communities. SeaDataNet is willing to share its technologies and expertise, to spread and expand its approach, and to build bridges to other well established infrastructures in the marine domain. SeaDataNet has therefore developed a strategy of seeking active cooperation on a national scale with other data-holding organisations via its NODC networks, and on an international scale with other European and international data management initiatives and networks. This is done with the objective of achieving a wider coverage of data sources and overall interoperability between data infrastructures in the marine and ocean domains. Recent examples include the EU FP7 projects Geo-Seas for geology and geophysical data sets, UpgradeBlackSeaScene for a Black Sea data management infrastructure, CaspInfo for a Caspian Sea data management infrastructure, and the EU EMODNET pilot projects for hydrographic, chemical, and biological data sets. All these projects are adopting the SeaDataNet standards and extending its services. Active cooperation also takes place with EuroGOOS and MyOcean in the domain of real-time and delayed-mode metocean monitoring data. SeaDataNet Partners: IFREMER (France), MARIS (Netherlands), HCMR/HNODC (Greece), ULg (Belgium), OGS (Italy), NERC/BODC (UK), BSH/DOD (Germany), SMHI (Sweden), IEO (Spain), RIHMI/WDC (Russia), IOC (International), ENEA (Italy), INGV (Italy), METU (Turkey), CLS (France), AWI (Germany), IMR (Norway), NERI (Denmark), ICES (International), EC-DG JRC (International), MI (Ireland), IHPT (Portugal), RIKZ (Netherlands), RBINS/MUMM (Belgium), VLIZ (Belgium), MRI (Iceland), FIMR (Finland), IMGW (Poland), MSI (Estonia), IAE/UL (Latvia), CMR (Lithuania), SIO/RAS (Russia), MHI/DMIST (Ukraine), IO/BAS (Bulgaria), NIMRD (Romania), TSU (Georgia), INRH (Morocco), IOF (Croatia), PUT (Albania), NIB (Slovenia), UoM (Malta), OC/UCY (Cyprus), IOLR (Israel), NCSR/NCMS (Lebanon), CNR-ISAC (Italy), ISMAL (Algeria), INSTM (Tunisia)
Recommendations for the use of mist nets for inventory and monitoring of bird populations
Ralph, C. John; Dunn, Erica H.; Peach, Will J.; Handel, Colleen M.
2004-01-01
We provide recommendations on the best practices for mist netting for the purposes of monitoring population parameters such as abundance and demography. Studies should be carefully thought out before nets are set up, to ensure that sampling design and estimated sample size will allow study objectives to be met. Station location, number of nets, type of nets, net placement, and schedule of operation should be determined by the goals of the particular project, and we provide guidelines for typical mist-net studies. In the absence of study-specific requirements for novel protocols, commonly used protocols should be used to enable comparison of results among studies. Regardless of the equipment, net layout, or netting schedule selected, it is important for all studies that operations be strictly standardized, and a well-written operation protocol will help in attaining this goal. We provide recommendations for data to be collected on captured birds, and emphasize the need for good training of project personnel.
The management of neuroendocrine tumours: A nutritional viewpoint.
Gallo, Marco; Muscogiuri, Giovanna; Pizza, Genoveffa; Ruggeri, Rosaria Maddalena; Barrea, Luigi; Faggiano, Antongiulio; Colao, Annamaria
2017-11-29
Nutritional status in patients with neuroendocrine tumours (NETs), especially those of gastroenteropancreatic origin, can be deeply affected by excessive production of gastrointestinal hormones, peptides, and amines, which can lead to malabsorption, diarrhoea, steatorrhea, and altered gastrointestinal motility. In addition, the surgical and/or medical management of NETs can lead to alteration of gastrointestinal secretory, motor, and absorptive functions, with both dietary and nutritional consequences. Indeed, disease-related malnutrition is a frequently encountered yet both underrecognized and understudied clinical phenomenon in patients with NETs, with substantial prognostic and socioeconomic consequences. Most of these conditions can be alleviated by a tailored nutritional approach, which also aims to improve the efficacy of cancer treatments. In this setting, skilled nutritionists can play a fundamental role in the multidisciplinary health care team in NET management, and their presence should be recommended. The aim of this review is to provide dietary advice for each specific condition in patients with NETs, underlining the importance of a nutritional approach to treating malnutrition in this setting. Further, we provide preliminary evidence from our own data on the assessment of nutritional status in a single cohort of patients with NETs.
Nematode.net update 2011: addition of data sets and tools featuring next-generation sequencing data
Martin, John; Abubucker, Sahar; Heizer, Esley; Taylor, Christina M.; Mitreva, Makedonka
2012-01-01
Nematode.net (http://nematode.net) has been a publicly available resource for studying nematodes for over a decade. In the past 3 years, we have reorganized Nematode.net to provide more user-friendly navigation through the site, a necessity due to the explosion of data from next-generation sequencing platforms. Organism-centric portals containing dynamically generated data are available for over 56 different nematode species. Next-generation data have been added to the various data-mining portals hosted, including NemaBLAST and NemaBrowse. The NemaPath metabolic pathway viewer builds associations using KOs rather than ECs, to provide more accurate and fine-grained descriptions of proteins. Two new features for data analysis and comparative genomics have been added to the site. NemaSNP enables the user to perform population genetics studies in various nematode populations using next-generation sequencing data. HelmCoP (Helminth Control and Prevention), an independent component of Nematode.net, provides an integrated resource for storage, annotation and comparative genomics of helminth genomes to aid in learning more about nematode genomes, as well as in drug, pesticide, vaccine and drug target discovery. With this update, Nematode.net will continue to realize its original goal of disseminating diverse bioinformatic data sets and providing analysis tools to the broad scientific community in a useful and user-friendly manner. PMID:22139919
Fort, Meredith P; Namba, Lynnette M; Dutcher, Sarah; Copeland, Tracy; Bermingham, Neysa; Fellenz, Chris; Lantz, Deborah; Reusch, John J; Bayliss, Elizabeth A
2017-01-01
Objectives: In response to limited access to specialty care in safety-net settings, an integrated delivery system and three safety-net organizations in the Denver, CO, metropolitan area launched a unique program in 2013. The program offers safety-net providers the option to electronically consult with specialists. Uninsured patients may be seen by specialists in office visits for a defined set of services. This article describes the program, identifies aspects that have worked well and areas that need improvement, and offers lessons learned. Methods: We quantified electronic consultations (e-consults) between safety-net clinicians and specialists, and face-to-face specialist visits between May 2013 and December 2014. We reviewed and categorized all e-consults from November and December 2014. In 2015, we interviewed 21 safety-net clinicians and staff, 12 specialists, and 10 patients, and conducted a thematic analysis to determine factors facilitating and limiting optimal program use. Results: In the first 20 months of the program, safety-net clinicians at 23 clinics made 602 e-consults to specialists, and 81 patients received face-to-face specialist visits. Of 204 primary care clinicians, 103 made e-consults; 65 specialists participated in the program. Aspects facilitating program use were referral case managers’ involvement and the use of clear, concise questions in e-consults. Key recommendations for process improvement were to promote an understanding of the different health care contexts, support provider-to-provider communication, facilitate hand-offs between settings, and clarify program scope. Conclusion: Participants perceived the program as responsive to their needs, yet opportunities exist for continued uptake and expansion. Communitywide efforts to assess and address needs remain important. PMID:28241908
Factors shaping effective utilization of health information technology in urban safety-net clinics.
George, Sheba; Garth, Belinda; Fish, Allison; Baker, Richard
2013-09-01
Urban safety-net clinics are considered prime targets for the adoption of health information technology innovations; however, little is known about their utilization in such safety-net settings. Current scholarship provides limited guidance on the implementation of health information technology into safety-net settings as it typically assumes that adopting institutions have sufficient basic resources. This study addresses this gap by exploring the unique challenges urban resource-poor safety-net clinics must consider when adopting and utilizing health information technology. In-depth interviews (N = 15) were used with key stakeholders (clinic chief executive officers, medical directors, nursing directors, chief financial officers, and information technology directors) from staff at four clinics to explore (a) nonhealth information technology-related clinic needs, (b) how health information technology may provide solutions, and (c) perceptions of and experiences with health information technology. Participants identified several challenges, some of which appear amenable to health information technology solutions. Also identified were requirements for effective utilization of health information technology including physical infrastructural improvements, funding for equipment/training, creation of user groups to share health information technology knowledge/experiences, and specially tailored electronic billing guidelines. We found that despite the potential benefit that can be derived from health information technologies, the unplanned and uninformed introduction of these tools into these settings might actually create more problems than are solved. From these data, we were able to identify a set of factors that should be considered when integrating health information technology into the existing workflows of low-resourced urban safety-net clinics in order to maximize their utilization and enhance the quality of health care in such settings.
Chan, David; Lawrence, Ben; Pavlakis, Nick; Kennecke, Hagen F.; Jackson, Christopher; Law, Calvin; Singh, Simron
2017-01-01
Purpose Neuroendocrine tumors (NETs) are a diverse group of malignancies that pose challenges common to all rare tumors. The Commonwealth Neuroendocrine Tumor Collaboration (CommNETS) was established in 2015 to enhance outcomes for patients with NETs in Canada, Australia, and New Zealand. A modified Delphi process was undertaken involving patients, clinicians, and researchers to identify gaps in NETs research to produce a comprehensive and defensible research action plan. Methods A three-round modified Delphi process was undertaken with larger representation than usual for medical consensus processes. Patient/advocate and health care provider/researcher expert panels undertook Round 1, which canvassed 17 research priorities and 42 potential topics; in Round 2, these priorities were ranked. Round 3 comprised a face-to-face meeting to generate final consensus rankings and formulate the research action plan. Results The Delphi groups consisted of 203 participants in Round 1 (64% health care providers/researchers, 36% patient/advocates; 52% Canadian, 32% Australian, and 17% New Zealander), of whom 132 participated in Round 2. The top eight priorities were biomarker development; peptide receptor radionuclide therapy optimization; trials of new agents in advanced NETs; functional imaging; sequencing therapies for metastatic NETs, including development of validated surrogate end points for studies; pathologic classification; early diagnosis; interventional therapeutics; and curative surgery. Two major areas were ranked significantly higher by patients/advocates: early diagnosis and curative surgery. Six CommNETS working parties were established. Conclusion This modified Delphi process resulted in a well-founded set of research priorities for the newly formed CommNETS collaboration by involving a large, diverse group of stakeholders. This approach to setting a research agenda for a new collaborative group should be adopted to ensure that research plans reflect unmet needs and priorities in the field. PMID:28831446
Segelov, Eva; Chan, David; Lawrence, Ben; Pavlakis, Nick; Kennecke, Hagen F; Jackson, Christopher; Law, Calvin; Singh, Simron
2017-08-01
Neuroendocrine tumors (NETs) are a diverse group of malignancies that pose challenges common to all rare tumors. The Commonwealth Neuroendocrine Tumor Collaboration (CommNETS) was established in 2015 to enhance outcomes for patients with NETs in Canada, Australia, and New Zealand. A modified Delphi process was undertaken involving patients, clinicians, and researchers to identify gaps in NETs research to produce a comprehensive and defensible research action plan. A three-round modified Delphi process was undertaken with larger representation than usual for medical consensus processes. Patient/advocate and health care provider/researcher expert panels undertook Round 1, which canvassed 17 research priorities and 42 potential topics; in Round 2, these priorities were ranked. Round 3 comprised a face-to-face meeting to generate final consensus rankings and formulate the research action plan. The Delphi groups consisted of 203 participants in Round 1 (64% health care providers/researchers, 36% patient/advocates; 52% Canadian, 32% Australian, and 17% New Zealander), of whom 132 participated in Round 2. The top eight priorities were biomarker development; peptide receptor radionuclide therapy optimization; trials of new agents in advanced NETs; functional imaging; sequencing therapies for metastatic NETs, including development of validated surrogate end points for studies; pathologic classification; early diagnosis; interventional therapeutics; and curative surgery. Two major areas were ranked significantly higher by patients/advocates: early diagnosis and curative surgery. Six CommNETS working parties were established. This modified Delphi process resulted in a well-founded set of research priorities for the newly formed CommNETS collaboration by involving a large, diverse group of stakeholders. This approach to setting a research agenda for a new collaborative group should be adopted to ensure that research plans reflect unmet needs and priorities in the field.
NASA Astrophysics Data System (ADS)
Schaap, D. M. A.; Maudire, G.
2009-04-01
SeaDataNet is an Integrated research Infrastructure Initiative (I3) in EU FP6 (2006-2011) to provide the data management system adapted both to the fragmented observation system and to the users' need for integrated access to data, metadata, products and services. SeaDataNet therefore ensures the long-term archiving of the large number of multidisciplinary data (e.g. temperature, salinity, current, sea level, and chemical, physical and biological properties) collected by many different sensors installed on board research vessels and satellites and on the various platforms of the marine observing system. The SeaDataNet project started in 2006, but builds upon earlier data management infrastructure projects, undertaken over a period of 20 years by an expanding network of oceanographic data centres from the countries around all European seas. Its predecessor project, Sea-Search, had a strict focus on metadata. SeaDataNet maintains significant interest in the further development of the metadata infrastructure, but its primary objective is the provision of easy data access and generic data products. SeaDataNet is a distributed infrastructure that provides transnational access to marine data, metadata, products and services through 40 interconnected Trans National Data Access Platforms (TAP) from 35 countries around the Black Sea, Mediterranean, North East Atlantic, North Sea, Baltic and Arctic regions. These include National Oceanographic Data Centres (NODCs) and Satellite Data Centres. Furthermore, the SeaDataNet consortium comprises a number of expert modelling centres, SMEs with IT expertise, and 3 international bodies (ICES, IOC and JRC).
Planning: the SeaDataNet project is delivering and operating the infrastructure in 3 versions:
• Version 0: maintenance and further development of the metadata systems developed by the Sea-Search project, plus the development of a new metadata system for indexing and accessing individual data objects managed by the SeaDataNet data centres. This is known as the Common Data Index (CDI) V0 system.
• Version 1: harmonisation and upgrading of the metadatabases through adoption of the ISO 19115 metadata standard, and provision of transparent data access and download services from all partner data centres through upgrading of the Common Data Index and deployment of a data object delivery service.
• Version 2: adding data product services and OGC-compliant viewing services, and further virtualisation of data access.
SeaDataNet Version 0: the SeaDataNet portal has been set up at http://www.seadatanet.org and provides a platform for all SeaDataNet services and standards as well as background information about the project and its partners. It includes discovery services via the following catalogues: CSR - Cruise Summary Reports of research vessels; EDIOS - locations and details of monitoring stations and networks/programmes; EDMED - high-level inventory of Marine Environmental Data sets collected and managed by research institutes and organisations; EDMERP - Marine Environmental Research Projects; EDMO - Marine Organisations. These catalogues are interrelated, where possible, to facilitate cross searching and context searching, and they connect to the Common Data Index (CDI). The CDI gives detailed insight into the data sets available in the partners' databases and paves the way to direct online data access or direct online requests for data access / data delivery.
The CDI V0 metadatabase contains more than 340,000 individual data entries from 36 CDI partners in 29 countries across Europe, covering a broad scope and range of data held by these organisations. For purposes of standardisation and international exchange, the ISO 19115 metadata standard has been adopted. The CDI format is defined as a dedicated subset of this standard. A CDI XML format supports the exchange between CDI partners and the central CDI manager, and ensures interoperability with other systems and networks. CDI XML entries are generated by participating data centres directly from their databases. CDI partners can make use of dedicated SeaDataNet tools to generate CDI XML files automatically.
Approach for SeaDataNet V1 and V2: the approach for SeaDataNet V1 and V2, which is in line with the INSPIRE Directive, comprises the following services:
• Discovery services = Metadata directories
• Security services = Authentication, Authorization & Accounting (AAA)
• Delivery services = Data access and downloading of data sets
• Viewing services = Visualisation of metadata, data and data products
• Product services = Generic and standard products
• Monitoring services = Statistics on usage and performance of the system
• Maintenance services = Updating of metadata by SeaDataNet partners
The services will be operated over a distributed network of interconnected data centres accessed through a central portal. In addition to service access, the portal will provide information on data management standards, tools and protocols. The architecture has been designed to provide a coherent system based on V1 services, whilst leaving the pathway open for later extension with V2 services. For the implementation, a range of technical components have been defined. Some are already operational, with the remainder in the final stages of development and testing. These make use of recent web technologies, and also comprise Java components, to provide multi-platform support and syntactic interoperability. To facilitate sharing of resources and interoperability, SeaDataNet has adopted SOAP Web Service technology. The SeaDataNet architecture and components have been designed to handle all kinds of oceanographic and marine environmental data, including both in-situ measurements and remote sensing observations. The V1 technical development is ready and the V1 system is now being implemented and adopted by all participating data centres in SeaDataNet.
Interoperability: interoperability is the key to the success of a distributed data management system, and it is achieved in SeaDataNet V1 by:
• using common quality control protocols and a common flag scale
• using controlled vocabularies from a single source, developed under international content governance
• adopting the ISO 19115 metadata standard for all metadata directories
• providing XML validation services to quality control the metadata maintenance, including field content verification based on Schematron
• providing standard metadata entry tools
• using harmonised data transport formats (NetCDF, ODV ASCII and MedAtlas ASCII) for data set delivery
• adopting OGC standards for mapping and viewing services
• using SOAP Web Services in the SeaDataNet architecture
SeaDataNet V1 Delivery Services: an important objective of the V1 system is to provide transparent access to the distributed data sets via a unique user interface at the SeaDataNet portal and a download service. In the SeaDataNet V1 architecture, the Common Data Index (CDI) V1 provides the link between discovery and delivery.
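To make the CDI XML exchange described above concrete, the sketch below generates a minimal CDI-like XML entry from a local database record in Python. The element and field names are hypothetical placeholders, not the actual SeaDataNet CDI (ISO 19115-based) schema, which defines its own elements and controlled vocabularies.

```python
# Minimal sketch of exporting a CDI-like XML entry from a local database record.
# Element and field names are hypothetical placeholders; a real CDI export must
# follow the SeaDataNet CDI (ISO 19115/19139-based) schema.
import xml.etree.ElementTree as ET

def record_to_cdi_xml(record: dict) -> bytes:
    root = ET.Element("cdiRecord")
    ET.SubElement(root, "localIdentifier").text = record["local_id"]
    ET.SubElement(root, "dataCentre").text = record["data_centre"]
    ET.SubElement(root, "parameter").text = record["parameter"]   # e.g. a vocabulary term
    bbox = ET.SubElement(root, "boundingBox")
    for key in ("west", "east", "south", "north"):
        ET.SubElement(bbox, key).text = str(record[key])
    ET.SubElement(root, "startDate").text = record["start_date"]
    return ET.tostring(root, encoding="utf-8", xml_declaration=True)

example = {
    "local_id": "NODC-XX-000123", "data_centre": "Example NODC",
    "parameter": "sea water temperature",
    "west": -5.0, "east": 3.5, "south": 48.0, "north": 52.0,
    "start_date": "2008-04-01",
}
print(record_to_cdi_xml(example).decode())
```

In practice such entries would be produced in bulk from the data centre's catalogue database and validated against the central schema before submission.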
The CDI user interface enables users to gain a detailed insight into the availability and geographical distribution of marine data archived at the connected data centres, and it provides the means for downloading data sets in common formats via a transaction mechanism. The SeaDataNet portal provides registered users access to these distributed data sets via the CDI V1 Directory and a shopping basket mechanism. This allows registered users to locate data of interest and submit their data requests. The requests are forwarded automatically from the portal to the relevant SeaDataNet data centres. This process is controlled via the Request Status Manager (RSM) web service at the portal and a Download Manager (DM) Java software module implemented at each of the data centres. The RSM also enables registered users to regularly check the status of their requests and to download data sets once access has been granted. Data centres can follow all transactions for their data sets online and can handle requests which require their consent. The actual delivery of data sets is done between the user and the selected data centre. The CDI V1 system is now being populated by all participating data centres in SeaDataNet, thereby phasing out CDI V0. SeaDataNet Partners: IFREMER (France), MARIS (Netherlands), HCMR/HNODC (Greece), ULg (Belgium), OGS (Italy), NERC/BODC (UK), BSH/DOD (Germany), SMHI (Sweden), IEO (Spain), RIHMI/WDC (Russia), IOC (International), ENEA (Italy), INGV (Italy), METU (Turkey), CLS (France), AWI (Germany), IMR (Norway), NERI (Denmark), ICES (International), EC-DG JRC (International), MI (Ireland), IHPT (Portugal), RIKZ (Netherlands), RBINS/MUMM (Belgium), VLIZ (Belgium), MRI (Iceland), FIMR (Finland), IMGW (Poland), MSI (Estonia), IAE/UL (Latvia), CMR (Lithuania), SIO/RAS (Russia), MHI/DMIST (Ukraine), IO/BAS (Bulgaria), NIMRD (Romania), TSU (Georgia), INRH (Morocco), IOF (Croatia), PUT (Albania), NIB (Slovenia), UoM (Malta), OC/UCY (Cyprus), IOLR (Israel), NCSR/NCMS (Lebanon), CNR-ISAC (Italy), ISMAL (Algeria), INSTM (Tunisia)
EviNet: a web platform for network enrichment analysis with flexible definition of gene sets.
Jeggari, Ashwini; Alekseenko, Zhanna; Petrov, Iurii; Dias, José M; Ericson, Johan; Alexeyenko, Andrey
2018-06-09
The new web resource EviNet provides an easily run interface to network enrichment analysis for the exploration of novel, experimentally defined gene sets. The major advantages of this analysis are (i) applicability to any genes found in the global network rather than only to those with pathway/ontology term annotations, (ii) the ability to connect genes via different molecular mechanisms rather than within one high-throughput platform, and (iii) statistical power sufficient to detect enrichment of very small sets, down to individual genes. The users' gene sets are either defined prior to upload or derived interactively from an uploaded file by differential expression criteria. The pathways and networks used in the analysis can be chosen from the collection menu. The calculation is typically done within seconds or minutes, and a stable URL is provided immediately. The results are presented in both visual (network graphs) and tabular formats using jQuery libraries. Uploaded data and analysis results are kept in separate project directories not accessible by other users. EviNet is available at https://www.evinet.org/.
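As an illustration of the network-based enrichment idea described above, the Python sketch below counts network edges linking two gene sets and compares that count against a simple node-permutation null. It is a toy example only: EviNet's actual statistic and curated global network are not reproduced here, and the graph and gene names are made up.

```python
# Toy sketch of network enrichment between two gene sets: count network edges
# linking set A to set B and compare against a node-label permutation null.
# This illustrates the idea only; EviNet's actual statistic and curated global
# network are not reproduced here.
import random
import networkx as nx

def crosslinks(graph, set_a, set_b):
    a, b = set(set_a), set(set_b)
    return sum(1 for u, v in graph.edges() if (u in a and v in b) or (u in b and v in a))

def permutation_p(graph, set_a, set_b, n_perm=1000, seed=0):
    rng = random.Random(seed)
    observed = crosslinks(graph, set_a, set_b)
    nodes = list(graph.nodes())
    hits = 0
    for _ in range(n_perm):
        sample = rng.sample(nodes, len(set_a))          # random set of the same size
        if crosslinks(graph, sample, set_b) >= observed:
            hits += 1
    return observed, (hits + 1) / (n_perm + 1)          # observed links, empirical p-value

g = nx.Graph([("G1", "G2"), ("G2", "G3"), ("G3", "G4"), ("G1", "G5"), ("G5", "G6")])
print(permutation_p(g, ["G1", "G2"], ["G3", "G5"]))
```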
NASA Astrophysics Data System (ADS)
Schaap, Dick M. A.; Fichaut, Michele
2014-05-01
The second phase of the SeaDataNet project has been underway since October 2011 and is making good progress. The main objective is to improve operations and to progress towards an efficient data management infrastructure able to handle the diversity and large volume of data collected via research cruises and monitoring activities in European marine waters and global oceans. The SeaDataNet infrastructure comprises a network of interconnected data centres and a central SeaDataNet portal. The portal provides users with a unified and transparent overview of the metadata and controlled access to the large collections of data sets managed by the interconnected data centres, as well as the various SeaDataNet standards and tools. Recently, the first Innovation Cycle was completed, including upgrading of the CDI Data Discovery and Access service to ISO 19139 and making it fully INSPIRE compliant. The extensive SeaDataNet vocabularies have also been upgraded and implemented for all SeaDataNet European metadata directories. SeaDataNet is setting and governing marine data standards, and exploring and establishing interoperability solutions to connect to other e-infrastructures on the basis of standards from ISO (19115, 19139), OGC (WMS, WFS, CS-W and SWE), and OpenSearch. The population of the directories has also increased considerably through cooperation with, and involvement in, associated EU projects and initiatives. SeaDataNet now gives an overview of and access to more than 1.4 million data sets for physical oceanography, chemistry, geology, geophysics, bathymetry and biology from more than 90 connected data centres in 30 countries riparian to European seas. Access to marine data is also a key issue for the implementation of the EU Marine Strategy Framework Directive (MSFD). The EU communication 'Marine Knowledge 2020' underpins the importance of data availability and of harmonising access to marine data from different sources. SeaDataNet qualified itself for leading the data management component of EMODNet (European Marine Observation and Data Network), which is promoted in the EU communication. In the past 4 years EMODNet portals have been initiated for the marine data themes digital bathymetry, chemistry, physical oceanography, geology, biology, and seabed habitat mapping. These portals are now being expanded to all European seas in successor projects, which started mid-2013 under EU DG MARE. EMODNet encourages more data providers to come forward for data sharing and to participate in the process of making complete overviews and homogeneous data products. The EMODNet Bathymetry project is very illustrative of the synergy with SeaDataNet and of the added value of generating public data products. The project develops and publishes Digital Terrain Models (DTM) for the European seas. These are produced from survey and aggregated data sets. The portal provides a versatile DTM viewing service with many relevant map layers and functions for data retrieval. A further refinement is taking place in the new phase. The presentation will give information on the present services of the SeaDataNet infrastructure, highlight key achievements in SeaDataNet II so far, and give further insights into the EMODNet Bathymetry progress.
Sakyo, Yumi; Nakayama, Kazuhiro; Komatsu, Hiroko; Setoyama, Yoko
2009-01-01
People are required to take in and comprehend a massive amount of health information and in turn make serious decisions based on that information. We, at St. Luke's College of Nursing, provide a rich selection of high-quality health information and have set up Nursing Net (The Kango Net; kango is Japanese for nursing). This website provides information for consumers as well as for people interested in the nursing profession. In an attempt to identify the needs of users, this study conducted an analysis of the contents of the total consultation page. Many readers indicated that responses to questions about nursing techniques and symptoms were instrumental in addressing their queries. Based on the results of this study, we conclude that this is an easy-to-access, convenient site for obtaining health information about physical symptoms and nursing techniques.
Deckard, Gloria J; Borkowski, Nancy; Diaz, Deisell; Sanchez, Carlos; Boisette, Serge A
2010-01-01
Designated primary care clinics largely serve low-income and uninsured patients who present a disproportionate number of chronic illnesses and face great difficulty in obtaining the medical care they need, particularly the access to specialty physicians. With limited capacity for providing specialty care, these primary care clinics generally refer patients to safety net hospitals' specialty ambulatory care clinics. A large public safety net health system successfully improved the effectiveness and efficiency of the specialty clinic referral process through application of Lean Six Sigma, an advanced process-improvement methodology and set of tools driven by statistics and engineering concepts.
Providing the Tools for Information Sharing: Net-Centric Enterprise Services
2007-07-01
The Department of Defense (DoD) is establishing a net-centric environment that increasingly leverages shared services and Service-Oriented...transformational program that delivers a set of shared services as part of the DoD’s common infrastructure to enable networked joint force capabilities, improved interoperability, and increased information sharing across mission area services.
BioModels.net Web Services, a free and integrated toolkit for computational modelling software.
Li, Chen; Courtot, Mélanie; Le Novère, Nicolas; Laibe, Camille
2010-05-01
Exchanging and sharing scientific results are essential for researchers in the field of computational modelling. BioModels.net defines agreed-upon standards for model curation. A fundamental one, MIRIAM (Minimum Information Requested in the Annotation of Models), standardises the annotation and curation process of quantitative models in biology. To support this standard, MIRIAM Resources maintains a set of standard data types for annotating models, and provides services for manipulating these annotations. Furthermore, BioModels.net creates controlled vocabularies, such as SBO (Systems Biology Ontology), which strictly indexes, defines and links terms used in Systems Biology. Finally, BioModels Database provides a free, centralised, publicly accessible database for storing, searching and retrieving curated and annotated computational models. Each resource provides a web interface to submit, search, retrieve and display its data. In addition, the BioModels.net team provides a set of Web Services which allows the community to programmatically access the resources. A user is then able to perform remote queries, such as retrieving a model and resolving all its MIRIAM annotations, as well as getting the details of the associated SBO terms. These web services use established standards. Communications rely on SOAP (Simple Object Access Protocol) messages, and the available queries are described in a WSDL (Web Services Description Language) file. Several libraries are provided in order to simplify the development of client software. BioModels.net Web Services take researchers one step further towards simulating and understanding a biological system in its entirety, by allowing them to retrieve biological models in their own tools, combine queries in workflows and efficiently analyse models.
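A minimal sketch of programmatic SOAP access of the kind described above is shown below, using the Python zeep library. The WSDL URL and the operation name are placeholders rather than the actual BioModels.net service definitions; the real endpoint and the available queries should be taken from the published WSDL.

```python
# Minimal SOAP client sketch using the zeep library. The WSDL URL and the
# operation name below are placeholders; consult the published BioModels.net
# WSDL for the actual endpoint and available queries.
from zeep import Client

WSDL_URL = "https://example.org/biomodels/services?wsdl"   # placeholder endpoint

client = Client(WSDL_URL)          # parses the WSDL and builds typed operations
# zeep exposes each WSDL operation under client.service; the operation name used
# here is hypothetical and stands in for whatever query the real WSDL defines.
result = client.service.getModelById("BIOMD0000000001")
print(result)
```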
NASA Astrophysics Data System (ADS)
Schaap, Dick M. A.; Fichaut, Michele
2017-04-01
SeaDataCloud marks the third phase of developing the pan-European SeaDataNet infrastructure for marine and ocean data management. The SeaDataCloud project is funded by the EU and runs for 4 years from 1 November 2016. It follows the successful SeaDataNet II (2011-2015) and SeaDataNet (2006-2011) projects. SeaDataNet has set up and operates a pan-European infrastructure for managing marine and ocean data and is undertaken by National Oceanographic Data Centres (NODCs) and oceanographic data focal points from 34 coastal states in Europe. The infrastructure comprises a network of interconnected data centres and a central SeaDataNet portal. The portal provides users with a harmonised set of metadata directories and controlled access to the large collections of data sets managed by the interconnected data centres. The population of the directories has increased considerably through cooperation with, and involvement in, many associated EU projects and initiatives such as EMODnet. SeaDataNet at present gives an overview of and access to more than 1.9 million data sets for physical oceanography, chemistry, geology, geophysics, bathymetry and biology from more than 100 connected data centres in 34 countries riparian to European seas. SeaDataNet is also active in setting and governing marine data standards, and in exploring and establishing interoperability solutions to connect to other e-infrastructures on the basis of standards from ISO (19115, 19139) and OGC (WMS, WFS, CS-W and SWE). The standards and associated SeaDataNet tools are made available at the SeaDataNet portal for wide uptake by data handling and managing organisations. SeaDataCloud aims to further develop standards, innovate services and products, adopt new technologies, and give more attention to users. Moreover, it involves implementing a cooperation between the SeaDataNet consortium of marine data centres and the EUDAT consortium of e-infrastructure service providers. SeaDataCloud aims to considerably advance services and increase their usage by adopting cloud and High Performance Computing technology. SeaDataCloud will empower researchers with a packaged collection of services and tools, tailored to their specific needs, supporting research and enabling the generation of added-value products from marine and ocean data. Substantial activities will focus on developing added-value services, such as data subsetting, analysis, visualisation, and publishing workflows for users, both regular and advanced, as part of a Virtual Research Environment (VRE). SeaDataCloud targets a number of leading user communities that pose new challenges for upgrading and expanding the SeaDataNet standards and services: science, EMODnet, the Copernicus Marine Environmental Monitoring Service (CMEMS) and EuroGOOS, and international scientific programmes. The presentation will give information on the present services of the SeaDataNet infrastructure, describe the new challenges in SeaDataCloud, and highlight a number of key achievements in SeaDataCloud so far.
NASA Astrophysics Data System (ADS)
Schaap, Dick M. A.; Fichaut, Michele
2015-04-01
The second phase of the SeaDataNet project has been underway since October 2011. The main objective is to improve operations and to progress towards an efficient data management infrastructure able to handle the diversity and large volume of data collected via research cruises and monitoring activities in European marine waters and global oceans. The SeaDataNet infrastructure comprises a network of interconnected data centres and a central SeaDataNet portal. The portal provides users with a unified and transparent overview of the metadata and controlled access to the large collections of data sets managed by the interconnected data centres, as well as the various SeaDataNet standards and tools. SeaDataNet is also setting and governing marine data standards, and exploring and establishing interoperability solutions to connect to other e-infrastructures on the basis of standards from ISO (19115, 19139), OGC (WMS, WFS, CS-W and SWE), and OpenSearch. The population of the directories has increased considerably through cooperation with, and involvement in, associated EU projects and initiatives. SeaDataNet now gives an overview of and access to more than 1.6 million data sets for physical oceanography, chemistry, geology, geophysics, bathymetry and biology from more than 100 connected data centres in 34 countries riparian to European seas. Access to marine data is also a key issue for the implementation of the EU Marine Strategy Framework Directive (MSFD). The EU communication 'Marine Knowledge 2020' underpins the importance of data availability and of harmonising access to marine data from different sources. SeaDataNet qualified itself for an active role in the data management component of EMODnet (European Marine Observation and Data Network), which is promoted in the EU communication. Starting in 2009, EMODnet pilot portals have been initiated for the marine data themes digital bathymetry, chemistry, physical oceanography, geology, biology, and seabed habitat mapping. These portals are being expanded to all European sea regions as part of EMODnet Phase 2, which started mid-2013. EMODnet encourages more data providers to come forward for data sharing and to participate in the process of making complete overviews and homogeneous data products. The EMODnet Bathymetry project is very illustrative of the synergy between SeaDataNet and EMODnet and of the added value of generating public data products. The project develops and publishes Digital Terrain Models (DTM) for the European seas. These are produced from survey and aggregated data sets. The portal provides a versatile DTM viewing service with many relevant map layers and functions for data retrieval. A further refinement is taking place as part of Phase 2. The presentation will highlight key achievements in SeaDataNet II and give further details and views on the new EMODnet Digital Bathymetry for European seas, to be released in early 2015.
Public-domain-software solution to data-access problems for numerical modelers
Jenter, Harry; Signell, Richard
1992-01-01
Unidata's network Common Data Form, netCDF, provides users with an efficient set of software for scientific data storage, retrieval, and manipulation. The netCDF file format is machine-independent, direct-access, self-describing, and in the public domain, thereby alleviating many problems associated with accessing output from large hydrodynamic models. NetCDF has programming interfaces in both the Fortran and C languages, with an interface to C++ planned for future release. NetCDF also has an abstract data type that relieves users of the need to understand details of the binary file structure; data are written and retrieved by an intuitive, user-supplied name rather than by file position. Users are aided further by Unidata's inclusion of the Common Data Language, CDL, a printable text equivalent of the contents of a netCDF file. Unidata provides numerous operators and utilities for processing netCDF files. In addition, a number of public-domain and proprietary netCDF utilities from other sources are available now or will become available later this year. The U.S. Geological Survey has produced and is producing a number of public-domain netCDF utilities.
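The named, self-describing access model described above can be illustrated with the netCDF4 Python package, one of several language interfaces; the C and Fortran interfaces mentioned in the abstract follow the same model. The file, dimension, and variable names below are arbitrary examples.

```python
# Write and read a small netCDF file by variable name using the netCDF4
# Python package. File, dimension and variable names here are arbitrary
# examples; the C and Fortran interfaces follow the same named-variable model.
import numpy as np
from netCDF4 import Dataset

with Dataset("example_model_output.nc", "w") as nc:
    nc.createDimension("time", 4)
    temp = nc.createVariable("water_temperature", "f8", ("time",))
    temp.units = "degC"                      # attributes make the file self-describing
    temp[:] = np.array([11.2, 11.5, 11.9, 12.1])

with Dataset("example_model_output.nc") as nc:
    print(nc.variables["water_temperature"].units)
    print(nc.variables["water_temperature"][:])   # retrieved by name, not file offset
```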
EMODNet Bathymetry - building and providing a high resolution digital bathymetry for European seas
NASA Astrophysics Data System (ADS)
Schaap, D.
2016-12-01
Access to marine data is a key issue for the EU Marine Strategy Framework Directive and the EU Marine Knowledge 2020 agenda, which includes the European Marine Observation and Data Network (EMODnet) initiative. The EMODnet Bathymetry project develops and publishes Digital Terrain Models (DTM) for the European seas. These are produced from survey and aggregated data sets that are indexed with metadata using the Common Data Index (CDI) data discovery and access service and the Sextant data products catalogue service, both adopted from SeaDataNet. SeaDataNet is a network of major oceanographic data centres around the European seas that manage, operate and further develop a pan-European infrastructure for marine and ocean data management. SeaDataNet is also setting and governing marine data standards, and exploring and establishing interoperability solutions to connect to other e-infrastructures on the basis of standards such as ISO and OGC. The SeaDataNet portal provides users with a number of interrelated metadata directories, an extensive range of controlled vocabularies, and the various SeaDataNet standards and tools. SeaDataNet at present gives an overview of and access to more than 1.8 million data sets for physical oceanography, chemistry, geology, geophysics, bathymetry and biology from more than 100 connected data centres in 34 countries riparian to European seas. The latest EMODnet Bathymetry DTM has a resolution of 1/8 by 1/8 arc minute and covers all European sea regions. Use is made of available and gathered surveys, and more than 13,000 surveys have already been indexed by 27 European data providers from 15 countries. Use is also made of composite DTMs generated and maintained by several data providers for their areas of interest; 44 composite DTMs are already included in the Sextant data products catalogue. For areas without coverage, use is made of the latest global DTM of GEBCO, which is a partner in the EMODnet Bathymetry project. In return, GEBCO integrates the EMODnet DTM to achieve an enriched and better result. The catalogue services and the generated EMODnet DTM can be queried and browsed at the dedicated EMODnet Bathymetry portal, which also provides a versatile DTM viewing service with many relevant map layers and functions for data retrieval. The EMODnet DTM is publicly available for download in various formats.
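DTM viewing services of this kind are typically exposed as OGC WMS endpoints. The sketch below uses the Python OWSLib package to list layers and request a map image; the service URL and layer name are placeholders, and the real EMODnet Bathymetry endpoint and layer identifiers would have to be taken from the portal's capabilities document.

```python
# Sketch of querying an OGC WMS viewing service with OWSLib. The service URL
# and layer name are placeholders; the real EMODnet Bathymetry endpoint and
# layer identifiers must be taken from the portal's capabilities document.
from owslib.wms import WebMapService

WMS_URL = "https://example.org/emodnet-bathymetry/wms"     # placeholder endpoint

wms = WebMapService(WMS_URL, version="1.3.0")
print(list(wms.contents))                                  # available layer names

img = wms.getmap(
    layers=["dtm_mean_depth"],                             # hypothetical layer name
    srs="EPSG:4326",
    bbox=(-10.0, 45.0, 5.0, 55.0),                         # lon/lat box over NW Europe
    size=(800, 600),
    format="image/png",
)
with open("bathymetry.png", "wb") as f:
    f.write(img.read())
```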
Precision of channel catfish catch estimates using hoop nets in larger Oklahoma reservoirs
Stewart, David R.; Long, James M.
2012-01-01
Hoop nets are rapidly becoming the preferred gear type used to sample channel catfish Ictalurus punctatus, and many managers have reported that hoop nets effectively sample channel catfish in small impoundments (<200 ha). However, the utility and precision of this approach in larger impoundments have not been tested. We sought to determine how the number of tandem hoop net series affected the catch of channel catfish and the time involved in using 16 tandem hoop net series in larger impoundments (>200 ha). Hoop net series were fished once and set for 3 d; we then used Monte Carlo bootstrapping techniques to estimate the number of net series required to achieve two levels of precision (relative standard errors [RSEs] of 15 and 25) at two levels of confidence (80% and 95%). Sixteen hoop net series were effective at obtaining an RSE of 25 with 80% and 95% confidence in all but one reservoir. Achieving an RSE of 15 was often not feasible with 16 series and required 18-96 hoop net series, depending on the desired level of confidence. We estimated that an hour was needed, on average, to deploy and retrieve three hoop net series, which meant that 16 hoop net series per reservoir could each be set and retrieved within a day. The estimated number of net series needed to achieve an RSE of 25 or 15 was positively associated with the coefficient of variation (CV) of the sample but not with reservoir surface area or relative abundance. Our results suggest that hoop nets are capable of providing reasonably precise estimates of channel catfish relative abundance and that the relationship with the CV of the sample reported herein can be used to determine the sampling effort for a desired level of precision.
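The bootstrap logic behind such sample-size estimates can be sketched as follows: resample the observed catches at a candidate number of net series, compute the relative standard error of the mean for each resample, and find the smallest number of series at which the target RSE is met with the desired confidence. The catch values below are illustrative, not the study's data.

```python
# Bootstrap sketch: estimate how many hoop-net series are needed to reach a
# target relative standard error (RSE) of the mean catch, at a given confidence.
# The catch-per-series data below are illustrative, not the study's values.
import numpy as np

rng = np.random.default_rng(1)

def rse_of_mean(sample):
    return 100 * (sample.std(ddof=1) / np.sqrt(len(sample))) / sample.mean()

def series_needed(catches, target_rse=25, confidence=0.80, n_boot=2000):
    catches = np.asarray(catches, dtype=float)
    for n in range(3, 200):
        rses = [rse_of_mean(rng.choice(catches, size=n, replace=True))
                for _ in range(n_boot)]
        # proportion of bootstrap samples of size n meeting the target RSE
        if np.mean(np.array(rses) <= target_rse) >= confidence:
            return n
    return None

example_catch = [12, 5, 1, 22, 9, 3, 15, 7, 2, 18, 11, 4, 6, 20, 2, 8]  # fish per series
print(series_needed(example_catch, target_rse=25, confidence=0.80))
```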
Lauret, Gert-Jan; Gijsbers, Harm J H; Hendriks, Erik J M; Bartelink, Marie-Louise; de Bie, Rob A; Teijink, Joep A W
2012-01-01
Intermittent claudication (IC) is a manifestation of peripheral arterial occlusive disease (PAOD). Besides cardiovascular risk management, supervised exercise therapy (SET) should be offered to all patients with IC. Outdated guidelines, an insufficient number of specialized physiotherapists (PTs), lack of awareness of the importance of SET by referring physicians, and misguided financial incentives all seriously impede the availability of a structured SET program in The Netherlands. By initiating regional care networks, ClaudicatioNet aims to improve the quality of care for patients with IC. Based on the chronic care model as a conceptual framework, these networks should enhance the access, continuity, and (cost) efficiency of the health care system. With the aid of a national database, health care professionals will be able to benchmark patient results while ClaudicatioNet will be able to monitor quality of care by way of functional and patient reported outcome measures. The success of ClaudicatioNet is dependent on several factors. Vascular surgeons, general practitioners and coordinating central caregivers will need to team up and work in close collaboration with specialized PTs. A substantial task in the upcoming years will be to monitor the quality, volume, and distribution of ClaudicatioNet PTs. Finally, misguided financial incentives within the Dutch health care system need to be tackled. With ClaudicatioNet, integrated care pathways are likely to improve in the upcoming years. This should result in the achievement of optimal quality of care for all patients with IC.
Stern, Rachel J; Fernandez, Alicia; Jacobs, Elizabeth A; Neilands, Torsten B; Weech-Maldonado, Robert; Quan, Judy; Carle, Adam; Seligman, Hilary K
2012-09-01
Providing culturally competent care shows promise as a mechanism to reduce health care inequalities. Until the recent development of the Consumer Assessment of Healthcare Providers and Systems Cultural Competency Item Set (CAHPS-CC), no measures capturing patient-level experiences with culturally competent care had been suitable for broad-scale administration. We performed confirmatory factor analysis and internal consistency reliability analysis of CAHPS-CC among patients with type 2 diabetes (n=600) receiving primary care in safety-net clinics. CAHPS-CC domains were also correlated with global physician ratings. A 7-factor model demonstrated satisfactory fit (χ²(231)=484.34, P<0.0001) with significant factor loadings at P<0.05. Three domains showed excellent reliability: Doctor Communication-Positive Behaviors (α=0.82), Trust (α=0.77), and Doctor Communication-Health Promotion (α=0.72). Four domains showed inadequate reliability, either among Spanish speakers or overall (overall reliabilities listed): Doctor Communication-Negative Behaviors (α=0.54), Equitable Treatment (α=0.69), Doctor Communication-Alternative Medicine (α=0.52), and Shared Decision-Making (α=0.51). CAHPS-CC domains were positively and significantly correlated with global physician rating. Select CAHPS-CC domains are suitable for broad-scale administration among safety-net patients. Those domains may be used to target quality-improvement efforts focused on providing culturally competent care in safety-net settings.
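Internal consistency of the kind reported above (Cronbach's alpha per domain) can be computed from item-level responses as in the sketch below; the response matrix is invented for illustration and is not CAHPS-CC data.

```python
# Compute Cronbach's alpha for one survey domain from a respondents-by-items
# matrix. The response data below are made up for illustration, not CAHPS-CC data.
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array, rows = respondents, columns = items in one domain."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Hypothetical 4-item domain answered by 6 respondents on a 1-4 scale.
responses = np.array([
    [4, 4, 3, 4],
    [3, 3, 3, 2],
    [4, 3, 4, 4],
    [2, 2, 1, 2],
    [3, 4, 3, 3],
    [4, 4, 4, 3],
])
print(round(cronbach_alpha(responses), 2))
```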
Gresenz, Carole Roan; Rogowski, Jeannette; Escarce, José J
2006-03-01
Despite concerted policy efforts, a sizeable percentage of children lack health insurance coverage. This article examines the impact of the health care safety net and health care market structure on the use of health care by uninsured children. We used the Medical Expenditure Panel Survey linked with data from multiple sources to analyze health care utilization among uninsured children. We ran analyses separately for children who lived in rural and urban areas and assessed the effects on utilization of the availability of safety net providers, safety net funding, supply of primary care physicians, health maintenance organization penetration, and the percentage of people who are uninsured, controlling for other factors that influence use. Fewer than half of uninsured children had office-based visits to health care providers during the year, 8% of rural and 10% of urban children visited the emergency department at least once, and just over half of children had medical expenditures or charges during the year. Among uninsured children in rural areas, living closer to a safety net provider and living in an area with a higher supply of primary care physicians were positively associated with higher use and medical expenditures. In urban areas, the supply of primary care physicians and the level of safety net funding were positively associated with uninsured children's medical expenditures, whereas the percentage of the population that was uninsured was negatively associated with use of the emergency department. Uninsured children had low levels of utilization over a range of different health care provider types and settings. The availability of safety net providers in the local area and the safety net's capacity to serve the uninsured influence access to care among children. Possible measures for ensuring access to health care among uninsured children include increasing the density of safety net providers in rural areas, enhancing funding for the safety net, and policies to increase primary care physician supply.
Marchini, Giovanni Scala; Rai, Aayushi; De, Shubha; Sarkissian, Carl; Monga, Manoj
2013-01-01
To test the effect of stone entrapment on laser lithotripsy efficiency. Spherical stone phantoms were created using BegoStone® plaster. Lithotripsy of one stone (1.0 g) per test jar was performed with a Ho:YAG laser (365 µm fiber; 1 minute/trial). Four laser settings were tested: I: 0.8 J, 8 Hz; II: 0.2 J, 50 Hz; III: 0.5 J, 50 Hz; IV: 1.5 J, 40 Hz. Uro-Net (US Endoscopy) deployment was used in 3/9 trials. Post-treatment, stone fragments were strained through a 1 mm sieve; after a 7-day drying period, fragments and unfragmented stone were weighed. Uro-Net nylon mesh and wire frame resistance were tested (laser fired for 30 s). All nets used were evaluated for functionality and strength (compared to 10 new nets). Student's t-test was used to compare the studied parameters; significance was set at p < 0.05. Laser settings I and II caused less damage to the net overall; the mesh and wire frame had the worst injuries with setting IV; setting III had an intermediate outcome; 42% of nets were rendered unusable and excluded from strength analysis. There was no difference in mean strength between used functional nets and unused devices (8.05 vs. 7.45 lbs, respectively; p = 0.14). Setting IV was the most efficient for lithotripsy (1.9 ± 0.6 mg/s; p < 0.001) with or without net stabilization; setting III was superior to I and II only if a net was not used. Laser lithotripsy is not optimized by stone entrapment with a net retrieval device, which may be damaged by high-energy laser settings.
Ceasar, Rachel; Chang, Jamie; Zamora, Kara; Hurstak, Emily; Kushel, Margot; Miaskowski, Christine; Knight, Kelly
2016-01-01
Background Guideline recommendations to reduce prescription opioid misuse among patients with chronic non-cancer pain include the routine use of urine toxicology tests for high-risk patients. Yet little is known about how the implementation of urine toxicology tests among patients with co-occurring chronic non-cancer pain and substance use impacts primary care providers’ management of misuse. In this paper, we present clinicians’ perspectives on the benefits and challenges of implementing urine toxicology tests in the monitoring of opioid misuse and substance use in safety net healthcare settings. Methods We interviewed 23 primary care providers from six safety net healthcare settings whose patients had a diagnosis of co-occurring chronic non-cancer pain and substance use. We transcribed, coded, and analyzed interviews using grounded theory methodology. Results The benefits of implementing urine toxicology tests for primary care providers included less reliance on intuition to assess for misuse and the ability to identify unknown opioid misuse and/or substance use. The challenges of implementing urine toxicology tests included insufficient education and training about how to interpret and implement tests, and a lack of clarity on how and when to act on tests that indicated misuse and/or substance use. Conclusions These data suggest that primary care clinicians’ lack of education and training to interpret and implement urine toxicology tests may impact their management of patient opioid misuse and/or substance use. Clinicians may benefit from additional education and training about the clinical implementation and use of urine toxicology tests. Additional research is needed on how primary care providers’ implementation and use of urine toxicology tests impacts chronic non-cancer pain management in primary care and safety net healthcare settings among patients with co-occurring chronic non-cancer pain and substance use. PMID:26682471
A New Network Modeling Tool for the Ground-based Nuclear Explosion Monitoring Community
NASA Astrophysics Data System (ADS)
Merchant, B. J.; Chael, E. P.; Young, C. J.
2013-12-01
Network simulations have long been used to assess the performance of monitoring networks in detecting events, for such purposes as planning station deployments and evaluating network resilience to outages. The standard tool has been the SAIC-developed NetSim package. With correct parameters, NetSim can produce useful simulations; however, the package has several shortcomings: an older language (FORTRAN), an emphasis on seismic monitoring with limited support for other technologies, limited documentation, and a limited parameter set. Thus, we are developing NetMOD (Network Monitoring for Optimal Detection), a Java-based tool designed to assess the performance of ground-based networks. NetMOD's advantages include: it is coded in a modern, multi-platform language; it exploits modern computing hardware (e.g., multi-core processors); it incorporates monitoring technologies other than seismic; and it includes a well-validated default parameter set for the IMS stations. NetMOD is designed to be extendable through a plugin infrastructure, so new phenomenological models can be added. Development of the Seismic Detection Plugin is being pursued first; seismic location and infrasound and hydroacoustic detection plugins will follow. By making NetMOD an open-release package, we hope to provide a common tool that the monitoring community can use to produce assessments of monitoring networks and to verify assessments made by others.
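As an illustration of the kind of calculation network-performance tools of this sort carry out (a generic sketch, not NetMOD's or NetSim's actual algorithm), the probability that a hypothetical network detects an event can be estimated from assumed per-station detection probabilities, requiring detections at some minimum number of stations and assuming independent stations:

from itertools import combinations

def network_detection_probability(p_station, k_required):
    # Probability that at least k_required of the stations detect the event,
    # assuming independent detections with the given per-station probabilities.
    n = len(p_station)
    prob = 0.0
    for k in range(k_required, n + 1):
        for detecting in combinations(range(n), k):
            p = 1.0
            for i in range(n):
                p *= p_station[i] if i in detecting else (1.0 - p_station[i])
            prob += p
    return prob

# Example: three stations with assumed detection probabilities, two detections required
print(network_detection_probability([0.9, 0.7, 0.5], 2))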
PyPedal, an open source software package for pedigree analysis
USDA-ARS?s Scientific Manuscript database
The open source software package PyPedal (http://pypedal.sourceforge.net/) was first released in 2002, and provided users with a set of simple tools for manipulating pedigrees. Its flexibility has been demonstrated by its use in a number of settings for large and small populations. After substantia...
Relation between SM-covers and SM-decompositions of Petri nets
NASA Astrophysics Data System (ADS)
Karatkevich, Andrei; Wiśniewski, Remigiusz
2015-12-01
The task of finding, for a given Petri net, a set of sequential components that together are able to represent the behavior of the net arises often in the formal analysis of Petri nets and in applications of Petri nets to logical control. Such a task can be met in two different variants: obtaining a Petri net cover or a decomposition. A Petri net cover supposes that a set of subnets of the given net is selected, whereas the sequential nets forming a decomposition may have additional places, which do not belong to the decomposed net. The paper discusses the differences and relations between the two tasks and their results.
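The cover case can be made concrete with a rough, definition-level check (an illustrative Python sketch only; the function and data layout are hypothetical, and practical SM-cover computation is usually based on place invariants rather than this direct test):

def is_sm_cover(all_places, pre, post, components):
    # pre[t] / post[t]: sets of input / output places of transition t.
    # components: candidate place sets, each meant to induce a state-machine subnet.
    # Cover condition checked here: the components jointly contain every place, and
    # every transition touching a component has exactly one input and one output place in it.
    if set().union(*components) != set(all_places):
        return False
    for comp in components:
        for t in pre:
            if (pre[t] | post[t]) & comp:
                if len(pre[t] & comp) != 1 or len(post[t] & comp) != 1:
                    return False
    return True

A decomposition, by contrast, is allowed to add places that are not part of the original net, so a comparable check would compare the synchronized behaviour of the components with that of the net rather than only their structure.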
Stern, RJ; Fernandez, A; Jacobs, EA; Neilands, TB; Weech-Maldonado, R; Quan, J; Carle, A; Seligman, HK
2012-01-01
Background Providing culturally competent care shows promise as a mechanism to reduce healthcare inequalities. Until the recent development of the CAHPS Cultural Competency Item Set (CAHPS-CC), no measures capturing patient-level experiences with culturally competent care have been suitable for broad-scale administration. Methods We performed confirmatory factor analysis and internal consistency reliability analysis of CAHPS-CC among patients with type 2 diabetes (n=600) receiving primary care in safety-net clinics. CAHPS-CC domains were also correlated with global physician ratings. Results A 7-factor model demonstrated satisfactory fit (χ2(231)=484.34, p<.0001) with significant factor loadings at p<.05. Three domains showed excellent reliability – Doctor Communication- Positive Behaviors (α=.82), Trust (α=.77), and Doctor Communication- Health Promotion (α=.72). Four domains showed inadequate reliability either among Spanish speakers or overall (overall reliabilities listed): Doctor Communication- Negative Behaviors (α=.54), Equitable Treatment (α=.69), Doctor Communication- Alternative Medicine (α=.52), and Shared Decision-Making (α=.51). CAHPS-CC domains were positively and significantly correlated with global physician rating. Conclusions Select CAHPS-CC domains are suitable for broad-scale administration among safety-net patients. Those domains may be used to target quality-improvement efforts focused on providing culturally competent care in safety-net settings. PMID:22895231
Portable platforms for setting rocket nets in open-water areas
Cox, R.R.; Afton, A.D.
1994-01-01
Rocket-netting of aquatic birds is generally done from permanent sites that are free of vegetation and debris to allow visibility and unobstructed projection of nets. We developed a technique for setting rocket nets on portable platforms to capture waterfowl in open-water habitats.
Lauret, Gert-Jan; Gijsbers, Harm JH; Hendriks, Erik JM; Bartelink, Marie-Louise; de Bie, Rob A; Teijink, Joep AW
2012-01-01
Introduction: Intermittent claudication (IC) is a manifestation of peripheral arterial occlusive disease (PAOD). Besides cardiovascular risk management, supervised exercise therapy (SET) should be offered to all patients with IC. Outdated guidelines, an insufficient number of specialized physiotherapists (PTs), lack of awareness of the importance of SET by referring physicians, and misguided financial incentives all seriously impede the availability of a structured SET program in The Netherlands. Description of care practice: By initiating regional care networks, ClaudicatioNet aims to improve the quality of care for patients with IC. Based on the chronic care model as a conceptual framework, these networks should enhance the access, continuity, and (cost) efficiency of the health care system. With the aid of a national database, health care professionals will be able to benchmark patient results while ClaudicatioNet will be able to monitor quality of care by way of functional and patient reported outcome measures. Discussion: The success of ClaudicatioNet is dependent on several factors. Vascular surgeons, general practitioners and coordinating central caregivers will need to team up and work in close collaboration with specialized PTs. A substantial task in the upcoming years will be to monitor the quality, volume, and distribution of ClaudicatioNet PTs. Finally, misguided financial incentives within the Dutch health care system need to be tackled. Conclusion: With ClaudicatioNet, integrated care pathways are likely to improve in the upcoming years. This should result in the achievement of optimal quality of care for all patients with IC. PMID:22942648
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bosler, Peter
Stride Search provides a flexible tool for detecting storms or other extreme climate events in high-resolution climate data sets saved on uniform latitude-longitude grids in standard NetCDF format. Users provide the software a quantitative description of a meteorological event they are interested in; the software searches a data set for locations in space and time that meet the user’s description. In its first stage, Stride Search performs a spatial search of the data set at each timestep by dividing a search domain into circular sectors of constant geodesic radius. Data from a NetCDF file are read into memory for each circular search sector. If the data meet or exceed a set of storm identification criteria (defined by the user), a storm is recorded to a linked list. Finally, the linked list is examined, duplicate detections of the same storm are removed, and the results are written to an output file. The first stage’s output file is read by a second program that builds storm tracks. Additional identification criteria may be applied at this stage to further classify storms. Storm tracks are the software’s ultimate output, and routines are provided for formatting that output for various external software libraries for plotting and tabulating data.
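The first-stage sector search can be sketched as follows (illustrative Python only, not the package's actual implementation; a single "field maximum meets a threshold" test stands in for the user-defined storm identification criteria):

import numpy as np

EARTH_RADIUS_KM = 6371.0

def geodesic_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two points given in degrees.
    p1, l1, p2, l2 = map(np.radians, (lat1, lon1, lat2, lon2))
    c = np.sin(p1) * np.sin(p2) + np.cos(p1) * np.cos(p2) * np.cos(l1 - l2)
    return EARTH_RADIUS_KM * np.arccos(np.clip(c, -1.0, 1.0))

def sector_search(lats, lons, field, centers, radius_km, threshold):
    # One timestep: for each circular sector center, record a detection if the
    # field maximum inside the sector meets the threshold.
    detections = []
    for clat, clon in centers:
        mask = np.array([[geodesic_km(clat, clon, la, lo) <= radius_km
                          for lo in lons] for la in lats])
        if mask.any() and field[mask].max() >= threshold:
            detections.append((clat, clon, float(field[mask].max())))
    return detections

Duplicate removal then amounts to discarding, for example, detections whose centers lie within one sector radius of a stronger detection at the same timestep.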
DeepID-Net: Deformable Deep Convolutional Neural Networks for Object Detection.
Ouyang, Wanli; Zeng, Xingyu; Wang, Xiaogang; Qiu, Shi; Luo, Ping; Tian, Yonglong; Li, Hongsheng; Yang, Shuo; Wang, Zhe; Li, Hongyang; Loy, Chen Change; Wang, Kun; Yan, Junjie; Tang, Xiaoou
2016-07-07
In this paper, we propose deformable deep convolutional neural networks for generic object detection. This new deep learning object detection framework has innovations in multiple aspects. In the proposed architecture, a new deformation-constrained pooling (def-pooling) layer models the deformation of object parts with geometric constraints and penalties. A new pre-training strategy is proposed to learn feature representations that are more suitable for the object detection task and generalize well. By varying the net structures and training strategies, and by adding and removing key components in the detection pipeline, a set of highly diverse models is obtained, which significantly improves the effectiveness of model averaging. The proposed approach improves the mean average precision obtained by RCNN [16], which was the state of the art, from 31% to 50.3% on the ILSVRC2014 detection test set. It also outperforms the winner of ILSVRC2014, GoogLeNet, by 6.1%. A detailed component-wise analysis is also provided through extensive experimental evaluation, giving a global view of the deep learning object detection pipeline.
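The intuition behind deformation-constrained pooling can be sketched as taking, at each location, the best nearby part response minus a cost that grows with the displacement (a much-simplified Python illustration; the paper's def-pooling layer learns its deformation penalties and operates inside the network rather than on a fixed score map):

import numpy as np

def def_pooling(score_map, max_shift=2, penalty=0.1):
    # For every location, take the maximum over responses displaced by up to
    # +/- max_shift pixels, penalized quadratically by the displacement.
    s = np.asarray(score_map, dtype=float)
    h, w = s.shape
    out = np.full_like(s, -np.inf)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.full_like(s, -np.inf)
            ys, xs = slice(max(dy, 0), h + min(dy, 0)), slice(max(dx, 0), w + min(dx, 0))
            ys_src, xs_src = slice(max(-dy, 0), h + min(-dy, 0)), slice(max(-dx, 0), w + min(-dx, 0))
            shifted[ys, xs] = s[ys_src, xs_src]
            out = np.maximum(out, shifted - penalty * (dx * dx + dy * dy))
    return out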
Review of FEWS NET Biophysical Monitoring Requirements
NASA Technical Reports Server (NTRS)
Ross, K. W.; Brown, Molly E.; Verdin, J.; Underwood, L. W.
2009-01-01
The Famine Early Warning System Network (FEWS NET) provides monitoring and early warning support to decision makers responsible for responding to famine and food insecurity. FEWS NET transforms satellite remote sensing data into rainfall and vegetation information that can be used by these decision makers. The National Aeronautics and Space Administration has recently funded activities to enhance remote sensing inputs to FEWS NET. To elicit Earth observation requirements, a professional review questionnaire was disseminated to FEWS NET expert end-users; it focused upon operational requirements to determine additional useful remote sensing data and, subsequently, beneficial FEWS NET biophysical supplementary inputs. The review was completed by over 40 experts from around the world, enabling a robust set of professional perspectives to be gathered and analyzed rapidly. Reviewers were asked to evaluate the relative importance of environmental variables and spatio-temporal requirements for Earth science data products, in particular for rainfall and vegetation products. The results showed that spatio-temporal resolution requirements are complex and need to vary according to place, time, and hazard; that high-resolution remote sensing products continue to be in demand; and that rainfall and vegetation products were valued as data that provide actionable food security information.
A generic model for evaluating payor net cost savings from a disease management program.
McKay, Niccie L
2006-01-01
Private and public payors increasingly are turning to disease management programs as a means of improving the quality of care provided and controlling expenditures for individuals with specific medical conditions. This article presents a generic model that can be adapted to evaluate payor net cost savings from a variety of types of disease management programs, with net cost savings taking into account both changes in expenditures resulting from the program and the costs of setting up and operating the program. The model specifies the required data, describes the data collection process, and shows how to calculate the net cost savings in a spreadsheet format. An accompanying hypothetical example illustrates how to use the model.
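The core bookkeeping of such a model reduces to a simple calculation; a minimal sketch with hypothetical inputs (the article's spreadsheet collects these figures in more detail, e.g. by year and by cost category):

def payor_net_savings(expenditures_without, expenditures_with, setup_costs, operating_costs):
    # Net cost savings = reduction in medical expenditures attributable to the
    # program, minus the costs of setting up and operating the program.
    gross_savings = expenditures_without - expenditures_with
    return gross_savings - (setup_costs + operating_costs)

# Example: $1.2M expected expenditures without the program, $1.05M with it,
# $40k setup and $60k operating costs -> $50k net savings to the payor
print(payor_net_savings(1_200_000, 1_050_000, 40_000, 60_000))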
A Mixed-Methods Study of Patient-Provider E-mail Content in a Safety-Net Setting
Mirsky, Jacob B.; Tieu, Lina; Lyles, Courtney; Sarkar, Urmimala
2016-01-01
Objective To explore the content of patient-provider e-mails in a safety-net primary care clinic. Methods We conducted a content analysis using inductive and deductive coding of e-mail exchanges (n=31) collected from January through November of 2013. Participants were English-speaking adult patients with a chronic condition (or their caregivers) cared for at a single publicly-funded general internal medicine clinic and their primary care providers (attending general internist physicians, clinical fellows, internal medicine residents, and nurse practitioners). Results All e-mails were non-urgent. Patients included a medical update in 19% of all e-mails. Patients requested action in 77% of e-mails, and the most common requests overall were for action regarding medications or treatment (29%). Requests for information were less common (45% of e-mails). Patient requests (n=56) were resolved in 84% of e-mail exchanges, resulting in 63 actions. Conclusion Patients in safety-net clinics are capable of safely and effectively using electronic messaging for between-visit communication with providers. Practical Implications Safety-net systems should implement electronic communications tools as soon as possible to increase healthcare access and enhance patient involvement in their care. PMID:26332306
Application of Petri net based analysis techniques to signal transduction pathways.
Sackmann, Andrea; Heiner, Monika; Koch, Ina
2006-11-02
Signal transduction pathways are usually modelled using classical quantitative methods, which are based on ordinary differential equations (ODEs). However, some difficulties are inherent in this approach. On the one hand, the kinetic parameters involved are often unknown and have to be estimated. With increasing size and complexity of signal transduction pathways, the estimation of missing kinetic data is not possible. On the other hand, ODE-based models do not support any explicit insights into possible (signal-) flows within the network. Moreover, a huge amount of qualitative data is available due to high-throughput techniques. In order to get information on the system's behaviour, qualitative analysis techniques have been developed. Applications of the known qualitative analysis methods concern mainly metabolic networks. Petri net theory provides a variety of established analysis techniques, which are also applicable to signal transduction models. In this context special properties have to be considered and new dedicated techniques have to be designed. We apply Petri net theory to model and analyse signal transduction pathways first qualitatively before continuing with quantitative analyses. This paper demonstrates how to systematically build a discrete model, which provably reflects the qualitative biological behaviour without any knowledge of kinetic parameters. The mating pheromone response pathway in Saccharomyces cerevisiae serves as a case study. We propose an approach for model validation of signal transduction pathways based on the network structure only. For this purpose, we introduce the new notion of feasible t-invariants, which represent minimal self-contained subnets being active under a given input situation. Each of these subnets stands for a signal flow in the system. We define maximal common transition sets (MCT-sets), which can be used for t-invariant examination and net decomposition into smallest biologically meaningful functional units. The paper demonstrates how Petri net analysis techniques can promote a deeper understanding of signal transduction pathways. The new concepts of feasible t-invariants and MCT-sets have been proven to be useful for model validation and the interpretation of the biological system behaviour. Whereas MCT-sets provide a decomposition of the net into disjunctive subnets, feasible t-invariants describe subnets, which generally overlap. This work contributes to qualitative modelling and to the analysis of large biological networks by their fully automatic decomposition into biologically meaningful modules.
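In standard Petri net notation (stated here for orientation, not quoted from the paper), a t-invariant of a net with incidence matrix $C \in \mathbb{Z}^{|P| \times |T|}$ is a non-zero vector $x \in \mathbb{N}^{|T|}$ satisfying $C\,x = 0$: firing every transition $t$ exactly $x_t$ times, in some admissible order, reproduces the starting marking. The feasible t-invariants introduced here additionally require the corresponding subnet to be active under the given input situation, and MCT-sets, as usually defined, group transitions that occur in exactly the same set of t-invariants.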
Application of Petri net based analysis techniques to signal transduction pathways
Sackmann, Andrea; Heiner, Monika; Koch, Ina
2006-01-01
Background Signal transduction pathways are usually modelled using classical quantitative methods, which are based on ordinary differential equations (ODEs). However, some difficulties are inherent in this approach. On the one hand, the kinetic parameters involved are often unknown and have to be estimated. With increasing size and complexity of signal transduction pathways, the estimation of missing kinetic data is not possible. On the other hand, ODE-based models do not support any explicit insights into possible (signal-) flows within the network. Moreover, a huge amount of qualitative data is available due to high-throughput techniques. In order to get information on the system's behaviour, qualitative analysis techniques have been developed. Applications of the known qualitative analysis methods concern mainly metabolic networks. Petri net theory provides a variety of established analysis techniques, which are also applicable to signal transduction models. In this context special properties have to be considered and new dedicated techniques have to be designed. Methods We apply Petri net theory to model and analyse signal transduction pathways first qualitatively before continuing with quantitative analyses. This paper demonstrates how to systematically build a discrete model, which provably reflects the qualitative biological behaviour without any knowledge of kinetic parameters. The mating pheromone response pathway in Saccharomyces cerevisiae serves as a case study. Results We propose an approach for model validation of signal transduction pathways based on the network structure only. For this purpose, we introduce the new notion of feasible t-invariants, which represent minimal self-contained subnets being active under a given input situation. Each of these subnets stands for a signal flow in the system. We define maximal common transition sets (MCT-sets), which can be used for t-invariant examination and net decomposition into smallest biologically meaningful functional units. Conclusion The paper demonstrates how Petri net analysis techniques can promote a deeper understanding of signal transduction pathways. The new concepts of feasible t-invariants and MCT-sets have been proven to be useful for model validation and the interpretation of the biological system behaviour. Whereas MCT-sets provide a decomposition of the net into disjunctive subnets, feasible t-invariants describe subnets, which generally overlap. This work contributes to qualitative modelling and to the analysis of large biological networks by their fully automatic decomposition into biologically meaningful modules. PMID:17081284
Electro shield system applications on set gill net as efforts to preserve shark resources
NASA Astrophysics Data System (ADS)
Fitri Aristi, DP; Boesono, H.; Prihantoko, K. E.; Gautama, D. Y.
2018-05-01
Sharks are a kind of ETP biota (Endangered, Threatened, and Protected) and are generally caught as bycatch during fishing operations. In addition, sharks are among the biota that play a role in the life cycle in coastal waters. The Electro Shield System (ESS) is a device with an electromagnetic wave source that sharks can detect and that frightens them away. ESS can be applied to set gill net operations to prevent sharks from getting caught. The objective of the study was to analyze the effect of the ESS on shark catches during set gill net operations. The research method was experimental fishing, conducted in March-May 2017 in the Bangka Belitung Islands, Indonesia. The study was designed by comparing shark catches during set gill net operations without ESS (control) and with ESS at a frequency of 55 Hz. The shark catch using the Electro Shield System (5.26%) was lower than the control (7.80%). Student's t-test analysis (significance level 0.05) indicates that there was a significant difference in shark bycatch between the set gill net without ESS and the set gill net using the ESS. This indicates that the application of ESS in set gill nets can reduce the capture of sharks as bycatch.
Chang, Jamie Suki; Kushel, Margot; Miaskowski, Christine; Ceasar, Rachel; Zamora, Kara; Hurstak, Emily; Knight, Kelly R.
2017-01-01
Background In the US and internationally, providers have adopted guidelines on the management of prescription opioids for chronic non-cancer pain (CNCP). For “high-risk” patients with co-occurring CNCP and a history of substance use, guidelines advise providers to monitor patients using urine toxicology screening tests, develop opioid management plans, and refer patients to substance use treatment. Objective We report primary care provider experiences in the safety net interpreting and implementing guideline recommendations for patients with CNCP and substance use. Methods We interviewed primary care providers who work in the safety net (N=23) on their experiences managing CNCP and substance use. We analyzed interviews using a content analysis method. Results Providers found management plans and urine toxicology screening tests useful for informing patients about clinic expectations of opioid therapy and substance use. However, they described that guideline-based clinic policies had unintended consequences, such as raising barriers to open, honest dialogue about substance use and treatment. While substance use treatment was recommended for “high-risk” patients, providers described lack of integration with and availability of substance use treatment programs. Conclusions Our findings indicate that clinicians in the safety net found guideline-based clinic policies helpful. However, effective implementation was challenged by barriers to open dialogue about substance use and limited linkages with treatment programs. Further research is needed to examine how the context of safety net settings shapes the management and treatment of co-occurring CNCP and substance use. PMID:27754719
Translating Health Services Research into Practice in the Safety Net.
Moore, Susan L; Fischer, Ilana; Havranek, Edward P
2016-02-01
To summarize research relating to health services research translation in the safety net through analysis of the literature and case study of a safety net system. Literature review and key informant interviews at an integrated safety net hospital. This paper describes the results of a comprehensive literature review of translational science literature as applied to health care paired with qualitative analysis of five key informant interviews conducted with senior-level management at Denver Health and Hospital Authority. Results from the literature suggest that implementing innovation may be more difficult in the safety net due to multiple factors, including financial and organizational constraints. Results from key informant interviews confirmed the reality of financial barriers to innovation implementation but also implied that factors, including institutional respect for data, organizational attitudes, and leadership support, could compensate for disadvantages. Translating research into practice is of critical importance to safety net providers, which are under increased pressure to improve patient care and satisfaction. Results suggest that translational research done in the safety net can better illuminate the special challenges of this setting; more such research is needed. © Health Research and Educational Trust.
PetriScape - A plugin for discrete Petri net simulations in Cytoscape.
Almeida, Diogo; Azevedo, Vasco; Silva, Artur; Baumbach, Jan
2016-06-04
Systems biology plays a central role in biological network analysis in the post-genomic era. Cytoscape is the standard bioinformatics tool offering the community an extensible platform for computational analysis of the emerging cellular network together with experimental omics data sets. However, only a few apps/plugins/tools are available for simulating network dynamics in Cytoscape 3. Many approaches of varying complexity exist, but none of them have been integrated into Cytoscape as an app/plugin yet. Here, we introduce PetriScape, the first Petri net simulator for Cytoscape. Although discrete Petri nets are quite simplistic models, they are capable of modeling global network properties and simulating their behaviour. In addition, they are easily understood and well suited to visualization. PetriScape comes with the following main functionalities: (1) import of biological networks in SBML format, (2) conversion into a Petri net, (3) visualization as a Petri net, and (4) simulation and visualization of the token flow in Cytoscape. PetriScape is the first Cytoscape plugin for Petri nets. It allows straightforward Petri net model creation, simulation and visualization with Cytoscape, providing clues about the activity of key components in biological networks.
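A discrete Petri net simulation of the kind PetriScape performs can be sketched in a few lines of Python (illustrative only; PetriScape itself runs inside Cytoscape, and its firing semantics and visualization are richer):

import random

def enabled(marking, pre):
    # Transitions whose input places all carry at least one token.
    return [t for t, inputs in pre.items()
            if all(marking.get(p, 0) >= 1 for p in inputs)]

def simulate(marking, pre, post, steps=20, seed=0):
    # Repeatedly fire one enabled transition at random, consuming a token from
    # each input place and producing one on each output place.
    rng = random.Random(seed)
    marking = dict(marking)
    for _ in range(steps):
        candidates = enabled(marking, pre)
        if not candidates:
            break  # no transition can fire
        t = rng.choice(candidates)
        for p in pre[t]:
            marking[p] -= 1
        for p in post[t]:
            marking[p] = marking.get(p, 0) + 1
    return marking

# Two-place cycle: p1 -> t1 -> p2 -> t2 -> p1
print(simulate({"p1": 1, "p2": 0},
               pre={"t1": ["p1"], "t2": ["p2"]},
               post={"t1": ["p2"], "t2": ["p1"]}))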
PetriScape - A plugin for discrete Petri net simulations in Cytoscape.
Almeida, Diogo; Azevedo, Vasco; Silva, Artur; Baumbach, Jan
2016-03-01
Systems biology plays a central role in biological network analysis in the post-genomic era. Cytoscape is the standard bioinformatics tool offering the community an extensible platform for computational analysis of the emerging cellular network together with experimental omics data sets. However, only a few apps/plugins/tools are available for simulating network dynamics in Cytoscape 3. Many approaches of varying complexity exist, but none of them have been integrated into Cytoscape as an app/plugin yet. Here, we introduce PetriScape, the first Petri net simulator for Cytoscape. Although discrete Petri nets are quite simplistic models, they are capable of modeling global network properties and simulating their behaviour. In addition, they are easily understood and well suited to visualization. PetriScape comes with the following main functionalities: (1) import of biological networks in SBML format, (2) conversion into a Petri net, (3) visualization as a Petri net, and (4) simulation and visualization of the token flow in Cytoscape. PetriScape is the first Cytoscape plugin for Petri nets. It allows straightforward Petri net model creation, simulation and visualization with Cytoscape, providing clues about the activity of key components in biological networks.
Catch of channel catfish with tandem-set hoop nets and gill nets in lentic systems of Nebraska
Richters, Lindsey K.; Pope, Kevin L.
2011-01-01
Twenty-six Nebraska water bodies representing two ecosystem types (small standing waters and large standing waters) were surveyed during 2008 and 2009 with tandem-set hoop nets and experimental gill nets to determine if similar trends existed in catch rates and size structures of channel catfish Ictalurus punctatus captured with these gears. Gear efficiency was assessed as the number of sets (nets) that would be required to capture 100 channel catfish given observed catch per unit effort (CPUE). Efficiency of gill nets was not correlated with efficiency of hoop nets for capturing channel catfish. Small sample sizes prohibited estimation of proportional size distributions in most surveys; in the four surveys for which sample size was sufficient to quantify length-frequency distributions of captured channel catfish, distributions differed between gears. The CPUE of channel catfish did not differ between small and large water bodies for either gear. While catch rates of hoop nets were lower than rates recorded in previous studies, this gear was more efficient than gill nets at capturing channel catfish. However, comparisons of size structure between gears may be problematic.
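The gear-efficiency measure used here is straightforward arithmetic: the number of sets required scales inversely with the observed catch per set. A minimal illustration with hypothetical numbers:

def sets_needed(target_catch, cpue_per_set):
    # Number of net sets required to reach a target catch at the observed CPUE.
    return target_catch / cpue_per_set

# e.g. an observed CPUE of 2.5 channel catfish per set implies 40 sets to reach 100 fish
print(sets_needed(100, 2.5))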
Liang, Shuting; Kegler, Michelle C; Cotter, Megan; Emily, Phillips; Beasley, Derrick; Hermstad, April; Morton, Rentonia; Martinez, Jeremy; Riehman, Kara
2016-08-02
Implementing evidence-based practices (EBPs) to increase cancer screenings in safety net primary care systems has great potential for reducing cancer disparities. Yet there is a gap in understanding the factors and mechanisms that influence EBP implementation within these high-priority systems. Guided by the Consolidated Framework for Implementation Research (CFIR), our study aims to fill this gap with a multiple case study of health care safety net systems that were funded by an American Cancer Society (ACS) grants program to increase breast and colorectal cancer screening rates. The initiative funded 68 safety net systems to increase cancer screening through implementation of evidence-based provider and client-oriented strategies. Data are from a mixed-methods evaluation with nine purposively selected safety net systems. Fifty-two interviews were conducted with project leaders, implementers, and ACS staff. Funded safety net systems were categorized into high-, medium-, and low-performing cases based on the level of EBP implementation. Within- and cross-case analyses were performed to identify CFIR constructs that influenced level of EBP implementation. Of 39 CFIR constructs examined, six distinguished levels of implementation. Two constructs were from the intervention characteristics domain: adaptability and trialability. Three were from the inner setting domain: leadership engagement, tension for change, and access to information and knowledge. Engaging formally appointed internal implementation leaders, from the process domain, also distinguished level of implementation. No constructs from the outer setting or individual characteristics domain differentiated systems by level of implementation. Our study identified a number of influential CFIR constructs and illustrated how they impacted EBP implementation across a variety of safety net systems. Findings may inform future dissemination efforts of EBPs for increasing cancer screening in similar settings. Moreover, our analytic approach is similar to previous case studies using CFIR and hence could facilitate comparisons across studies.
BOREAS RSS-14 Level -3 Gridded Radiometer and Satellite Surface Radiation Images
NASA Technical Reports Server (NTRS)
Hall, Forrest G. (Editor); Nickeson, Jaime (Editor); Hodges, Gary; Smith, Eric A.
2000-01-01
The BOREAS RSS-14 team collected and processed GOES-7 and -8 images of the BOREAS region as part of its effort to characterize the incoming, reflected, and emitted radiation at regional scales. This data set contains surface radiation parameters, such as net radiation and net solar radiation, that have been interpolated from GOES-7 images and AMS data onto the standard BOREAS mapping grid at a resolution of 5 km N-S and E-W. While some parameters are taken directly from the AMS data set, others have been corrected according to calibrations carried out during IFC-2 in 1994. The corrected values as well as the uncorrected values are included. For example, two values of net radiation are provided: an uncorrected value (Rn), and a value that has been corrected according to the calibrations (Rn-COR). The data are provided in binary image format data files. Some of the data files on the BOREAS CD-ROMs have been compressed using the Gzip program. See section 8.2 for details. The data files are available on a CD-ROM (see document number 20010000884), or from the Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC).
Zapata, Carly; Lum, Hillary D; Wistar, Emily; Horton, Claire; Sudore, Rebecca L
2018-02-20
Primary care providers in safety-net settings often do not have time to discuss advance care planning (ACP). Group visits (GV) may be an efficient means to provide ACP education. To assess the feasibility and impact of a video-based website to facilitate GVs to engage diverse adults in ACP. Feasibility pilot among patients who were ≥55 years of age from two primary care clinics in a Northern California safety-net setting. Participants attended two 90-minute GVs and viewed the five steps of the movie version of the PREPARE website ( www.prepareforyourcare.org ) concerning surrogates, values, and discussing wishes in video format. Two clinician facilitators were available to encourage participation. We assessed pre-to-post ACP knowledge, whether participants designated a surrogate or completed an advance directive (AD), and acceptability of GVs and PREPARE materials. We conducted two GVs with 22 participants. Mean age was 64 years (±7), 55% were women, 73% nonwhite, and 55% had limited literacy. Knowledge improved about surrogate designation (46% correct pre vs. 85% post, p = 0.01) and discussing decisions with others (59% vs. 90%, p = 0.01). Surrogate designation increased (48% vs. 85%, p = 0.01) and there was a trend toward AD completion (9% vs. 24%, p = 0.21). Participants rated the GVs and PREPARE materials a mean of 8 (±3.1) on a 10-point acceptability scale. Using the PREPARE movie to facilitate ACP GVs for diverse adults in safety net, primary care settings is feasible and shows potential for increasing ACP engagement.
IntNetLncSim: an integrative network analysis method to infer human lncRNA functional similarity
Hu, Yang; Yang, Haixiu; Zhou, Chen; Sun, Jie; Zhou, Meng
2016-01-01
Increasing evidence has indicated that long non-coding RNAs (lncRNAs) are involved in various biological processes and complex diseases by communicating with mRNAs/miRNAs. Exploiting interactions between lncRNAs and mRNAs/miRNAs to infer lncRNA functional similarity (LFS) is an effective method to explore the function of lncRNAs and predict novel lncRNA-disease associations. In this article, we propose an integrative framework, IntNetLncSim, to infer LFS by modeling the information flow in an integrated network that comprises both lncRNA-related transcriptional and post-transcriptional information. The performance of IntNetLncSim was evaluated by investigating the relationship of LFS with the similarity of lncRNA-related mRNA sets (LmRSets) and miRNA sets (LmiRSets). As a result, LFS by IntNetLncSim was significantly positively correlated with the LmRSet (Pearson correlation γ² = 0.8424) and LmiRSet (Pearson correlation γ² = 0.2601). In particular, the performance of IntNetLncSim is superior to several previous methods. In the case of applying the LFS to identify novel lncRNA-disease relationships, we achieved an area under the ROC curve of 0.7300 for experimentally verified lncRNA-disease associations based on leave-one-out cross-validation. Furthermore, highly ranked lncRNA-disease associations confirmed by literature mining demonstrated the excellent performance of IntNetLncSim. Finally, a web-accessible system was provided for querying LFS and potential lncRNA-disease relationships: http://www.bio-bigdata.com/IntNetLncSim. PMID:27323856
IntNetLncSim: an integrative network analysis method to infer human lncRNA functional similarity.
Cheng, Liang; Shi, Hongbo; Wang, Zhenzhen; Hu, Yang; Yang, Haixiu; Zhou, Chen; Sun, Jie; Zhou, Meng
2016-07-26
Increasing evidence has indicated that long non-coding RNAs (lncRNAs) are involved in various biological processes and complex diseases by communicating with mRNAs/miRNAs. Exploiting interactions between lncRNAs and mRNAs/miRNAs to infer lncRNA functional similarity (LFS) is an effective method to explore the function of lncRNAs and predict novel lncRNA-disease associations. In this article, we propose an integrative framework, IntNetLncSim, to infer LFS by modeling the information flow in an integrated network that comprises both lncRNA-related transcriptional and post-transcriptional information. The performance of IntNetLncSim was evaluated by investigating the relationship of LFS with the similarity of lncRNA-related mRNA sets (LmRSets) and miRNA sets (LmiRSets). As a result, LFS by IntNetLncSim was significantly positively correlated with the LmRSet (Pearson correlation γ² = 0.8424) and LmiRSet (Pearson correlation γ² = 0.2601). In particular, the performance of IntNetLncSim is superior to several previous methods. In the case of applying the LFS to identify novel lncRNA-disease relationships, we achieved an area under the ROC curve of 0.7300 for experimentally verified lncRNA-disease associations based on leave-one-out cross-validation. Furthermore, highly ranked lncRNA-disease associations confirmed by literature mining demonstrated the excellent performance of IntNetLncSim. Finally, a web-accessible system was provided for querying LFS and potential lncRNA-disease relationships: http://www.bio-bigdata.com/IntNetLncSim.
Patient preferences and access to text messaging for health care reminders in a safety-net setting.
Zallman, Leah; Bearse, Adriana; West, Catherine; Bor, David; McCormick, Danny
2017-01-01
Text messaging may be an effective method for providing health care reminders to patients. We aimed to understand patient access to and preferences for receiving health-related reminders via text message among patients receiving care in safety-net hospitals. We conducted face-to-face surveys with 793 patients seeking care in three hospital emergency departments at a large safety-net institution and determined clinical and demographic predictors of preferences for text messaging for health care reminders. 95% of respondents reported having daily access to text messaging. Text messaging was preferred over e-mail, phone, and letters for communication. 78% of respondents wanted to receive appointment reminders, 56% wanted expiring insurance reminders, and 36% wanted reminders to take their medications. We found no clinical predictors but did find some demographic predictors (including age, ethnicity, insurance status, and income) of wanting text message reminders. In our convenience sample of safety-net patients, text messaging is an accessible, acceptable, and patient-preferred modality for receiving health care reminders. Text messaging may be a promising patient-centered approach for providing health care and insurance reminders to patients seeking care at safety-net institutions.
NASA Astrophysics Data System (ADS)
Elliott, E. M.; Bain, D. J.; Divers, M. T.; Crowley, K. J.; Povis, K.; Scardina, A.; Steiner, M.
2012-12-01
We describe a newly funded collaborative NSF initiative, ENERGY-NET (Energy, Environment and Society Learning Network), that brings together the Carnegie Museum of Natural History (CMNH) with the Learning Science and Geoscience research strengths at the University of Pittsburgh. ENERGY-NET aims to create rich opportunities for participatory learning and public education in the arena of energy, the environment, and society using an Earth systems science framework. We build upon a long-established teen docent program at CMNH to form Geoscience Squads composed of underserved teens. The ENERGY-NET team, including museum staff, experts in informal learning sciences, and geoscientists spanning career stages (undergraduates, graduate students, faculty), provides inquiry-based learning experiences guided by Earth systems science principles. Together, the team works with Geoscience Squads to design "Exploration Stations" for use with CMNH visitors that employ an Earth systems science framework to explore the intersecting lenses of energy, the environment, and society. The goals of ENERGY-NET are to: 1) Develop a rich set of experiential learning activities to enhance public knowledge about the complex dynamics between Energy, Environment, and Society for demonstration at CMNH; 2) Expand diversity in the geosciences workforce by mentoring underrepresented teens, providing authentic learning experiences in earth systems science and life skills, and providing networking opportunities with geoscientists; and 3) Institutionalize ENERGY-NET collaborations among geoscience experts, learning researchers, and museum staff to yield long-term improvements in public geoscience education and geoscience workforce recruiting.
Noor, Abdisalan M.; Moloney, Grainne; Borle, Mohamed; Fegan, Greg W.; Shewchuk, Tanya; Snow, Robert W.
2008-01-01
Background There have been resurgent efforts in Africa to estimate the public health impact of malaria control interventions such as insecticide treated nets (ITNs) following substantial investments in scaling-up coverage in the last five years. Little is known, however, on the effectiveness of ITN in areas of Africa that support low transmission. This hinders the accurate estimation of impact of ITN use on disease burden and its cost-effectiveness in low transmission settings. Methods and Principal Findings Using a stratified two-stage cluster sample design, four cross-sectional studies were undertaken between March-June 2007 across three livelihood groups in an area of low intensity malaria transmission in South Central Somalia. Information on bed net use, age, and sex of all participants was recorded. A finger prick blood sample was taken from participants to examine for parasitaemia. Mantel-Haenszel methods were used to measure the effect of net use on parasitaemia, adjusting for livelihood, age, and sex. A total of 10,587 individuals of all ages were seen, of which 10,359 provided full information. Overall net use and parasite prevalence were 12.4% and 15.7% respectively. Age-specific protective effectiveness (PE) of bed nets ranged from 39% among <5 years to 72% among 5–14 years old. Overall PE of bed nets was 54% (95% confidence interval 44%–63%) after adjusting for livelihood, sex, and age. Conclusions and Significance Bed nets confer high protection against parasite infection in South Central Somalia. In such areas where baseline transmission is low, however, the absolute reductions in parasitaemia due to wide-scale net use will be relatively small, raising questions on the cost-effectiveness of covering millions of people living in such settings in Africa with nets. Further understanding of the progress of disease upon infection against the cost of averting its consequent burden in low transmission areas of Africa is therefore required. PMID:18461178
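In this design, the protective effectiveness (PE) is the percentage reduction in infection risk among net users relative to non-users, adjusted here by Mantel-Haenszel stratification. In the usual notation (a standard formulation, stated for clarity rather than quoted from the paper), $\mathrm{PE} = (1 - RR_{MH}) \times 100\%$, where $RR_{MH}$ is the stratum-adjusted risk ratio comparing parasitaemia prevalence in net users with that in non-users; the reported 54% overall PE thus corresponds to an adjusted risk ratio of about 0.46.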
Noor, Abdisalan M; Moloney, Grainne; Borle, Mohamed; Fegan, Greg W; Shewchuk, Tanya; Snow, Robert W
2008-05-07
There have been resurgent efforts in Africa to estimate the public health impact of malaria control interventions such as insecticide treated nets (ITNs) following substantial investments in scaling-up coverage in the last five years. Little is known, however, on the effectiveness of ITN in areas of Africa that support low transmission. This hinders the accurate estimation of impact of ITN use on disease burden and its cost-effectiveness in low transmission settings. Using a stratified two-stage cluster sample design, four cross-sectional studies were undertaken between March-June 2007 across three livelihood groups in an area of low intensity malaria transmission in South Central Somalia. Information on bed net use, age, and sex of all participants was recorded. A finger prick blood sample was taken from participants to examine for parasitaemia. Mantel-Haenszel methods were used to measure the effect of net use on parasitaemia, adjusting for livelihood, age, and sex. A total of 10,587 individuals of all ages were seen, of which 10,359 provided full information. Overall net use and parasite prevalence were 12.4% and 15.7% respectively. Age-specific protective effectiveness (PE) of bed nets ranged from 39% among <5 years to 72% among 5-14 years old. Overall PE of bed nets was 54% (95% confidence interval 44%-63%) after adjusting for livelihood, sex, and age. Bed nets confer high protection against parasite infection in South Central Somalia. In such areas where baseline transmission is low, however, the absolute reductions in parasitaemia due to wide-scale net use will be relatively small, raising questions on the cost-effectiveness of covering millions of people living in such settings in Africa with nets. Further understanding of the progress of disease upon infection against the cost of averting its consequent burden in low transmission areas of Africa is therefore required.
Okumu, Fredros O; Kiware, Samson S; Moore, Sarah J; Killeen, Gerry F
2013-01-16
Indoor residual insecticide spraying (IRS) and long-lasting insecticide treated nets (LLINs) are commonly used together even though evidence that such combinations confer greater protection against malaria than either method alone is inconsistent. A deterministic model of mosquito life cycle processes was adapted to allow parameterization with results from experimental hut trials of various combinations of untreated nets or LLINs (Olyset, PermaNet 2.0, Icon Life nets) with IRS (pirimiphos methyl, lambda cyhalothrin, DDT), in a setting where vector populations are dominated by Anopheles arabiensis, so that community level impact upon malaria transmission at high coverage could be predicted. Intact untreated nets alone provide equivalent personal protection to all three LLINs. Relative to IRS plus untreated nets, community level protection is slightly higher when Olyset or PermaNet 2.0 nets are added onto IRS with pirimiphos methyl or lambda cyhalothrin but not DDT, and when Icon Life nets supplement any of the IRS insecticides. Adding IRS onto any net modestly enhances communal protection when pirimiphos methyl is sprayed, while spraying lambda cyhalothrin enhances protection for untreated nets but not LLINs. Addition of DDT reduces communal protection when added to LLINs. Where transmission is mediated primarily by An. arabiensis, adding IRS to high LLIN coverage provides only modest incremental benefit (e.g. when an organophosphate like pirimiphos methyl is used), but can be redundant (e.g. when a pyrethroid like lambda cyhalothrin is used) or even regressive (e.g. when DDT is used for the IRS). Relative to IRS plus untreated nets, supplementing IRS with LLINs will only modestly improve community protection. Beyond the physical protection that intact nets provide, additional protection against transmission by An. arabiensis conferred by insecticides will be remarkably small, regardless of whether they are delivered as LLINs or IRS. The insecticidal action of LLINs and IRS probably already approaches their absolute limit of potential impact upon this persistent vector, so personal protection of nets should be enhanced by improving the physical integrity and durability. Combining LLINs and non-pyrethroid IRS in residual transmission systems may nevertheless be justified as a means to manage insecticide resistance and prevent potential rebound of not only An. arabiensis, but also more potent, vulnerable and historically important species such as Anopheles gambiae and Anopheles funestus.
Tuncbag, Nurcan; McCallum, Scott; Huang, Shao-shan Carol; Fraenkel, Ernest
2012-01-01
High-throughput technologies including transcriptional profiling, proteomics and reverse genetics screens provide detailed molecular descriptions of cellular responses to perturbations. However, it is difficult to integrate these diverse data to reconstruct biologically meaningful signaling networks. Previously, we have established a framework for integrating transcriptional, proteomic and interactome data by searching for the solution to the prize-collecting Steiner tree problem. Here, we present a web server, SteinerNet, to make this method available in a user-friendly format for a broad range of users with data from any species. At a minimum, a user only needs to provide a set of experimentally detected proteins and/or genes and the server will search for connections among these data from the provided interactomes for yeast, human, mouse, Drosophila melanogaster and Caenorhabditis elegans. More advanced users can upload their own interactome data as well. The server provides interactive visualization of the resulting optimal network and downloadable files detailing the analysis and results. We believe that SteinerNet will be useful for researchers who would like to integrate their high-throughput data for a specific condition or cellular response and to find biologically meaningful pathways. SteinerNet is accessible at http://fraenkel.mit.edu/steinernet. PMID:22638579
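For reference, the prize-collecting Steiner tree problem that SteinerNet solves is usually stated as finding a subtree $T = (V_T, E_T)$ of the interactome that minimizes $\sum_{e \in E_T} c_e + \lambda \sum_{v \notin V_T} p_v$, i.e. the total cost of the edges kept plus a penalty for the prizes (the experimentally detected proteins or genes) that are left out, with $\lambda$ trading off network size against data coverage. This is a generic statement of the optimization problem; the server's exact parameterization and solver may differ.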
Protocols for Handling Messages Between Simulation Computers
NASA Technical Reports Server (NTRS)
Balcerowski, John P.; Dunnam, Milton
2006-01-01
Practical Simulator Network (PSimNet) is a set of data-communication protocols designed especially for use in handling messages between computers that are engaging cooperatively in real-time or nearly-real-time training simulations. In a typical application, computers that provide individualized training at widely dispersed locations would communicate, by use of PSimNet, with a central host computer that would provide a common computational- simulation environment and common data. Originally intended for use in supporting interfaces between training computers and computers that simulate the responses of spacecraft scientific payloads, PSimNet could be especially well suited for a variety of other applications -- for example, group automobile-driver training in a classroom. Another potential application might lie in networking of automobile-diagnostic computers at repair facilities to a central computer that would compile the expertise of numerous technicians and engineers and act as an expert consulting technician.
Komenaka, Ian K; Nodora, Jesse N; Madlensky, Lisa; Winton, Lisa M; Heberer, Meredith A; Schwab, Richard B; Weitzel, Jeffrey N; Martinez, Maria Elena
2016-07-01
Some communities and populations lack access to genetic cancer risk assessment (GCRA) and testing. This is particularly evident in safety-net institutions, which serve a large segment of low-income, uninsured individuals. We describe the experience of a safety-net clinic with limited resources in providing GCRA and BRCA1/2 testing. We compared the proportion and characteristics of high-risk women who were offered and underwent GCRA and genetic testing. We also provide a description of the mutation profile for affected women. All 125 patients who were offered GCRA accepted to undergo GCRA. Of these, 72 % had a breast cancer diagnosis, 70 % were Hispanic, 52.8 % were non-English speakers, and 66 % did not have health insurance. Eighty-four (67 %) were offered genetic testing and 81 (96 %) agreed. Hispanic women, those with no medical insurance, and those with a family history of breast cancer were significantly more likely to undergo testing (p < 0.01). Twelve of 81 (15 %) patients were found to have deleterious mutations, seven BRCA1 and five BRCA2. Our experience shows that it is possible to offer GCRA and genetic testing even in the setting of limited resources for these services. This is important given that a large majority of the low-income women in our study agreed to undergo counseling and testing. Our experience could serve as a model for similar low-resource safety-net health settings.
Bhatavadekar, Neel B; Rozier, R Gary; Konrad, Thomas R
2011-06-01
Access to oral health care among low income populations is a growing problem. The National Health Service Corps (NHSC) might increase the supply of dentists motivated to provide services for this population. To determine if North Carolina dentists who began a service obligation with the NHSC in 1990-1999 continued to provide care for underserved populations and if they differ from non-NHSC alumni primary care dentists who started practice in the state during that same period. All 19 NHSC alumni and 50 comparison dentists were surveyed by mail. NHSC alumni also responded to selected items in a telephone follow-up interview. The two groups were compared using difference of means tests and multivariate contingency tables. National Health Service Corps alumni were more likely to be African-American (38% vs. 10%), work in safety net practices (84% vs. 23%), and see more publicly insured patients (60% vs. 19%) than comparison dentists. Yet their job satisfaction was comparable to that of non-NHSC alumni dentists. Analyses suggested that current practice in safety net settings is affected by dentists' race, altruistic motivations and previous NHSC participation. Conclusion and policy implication: Targeted recruitment of African-American dentists and others wanting to work in underserved communities could amplify the effectiveness of the financial incentive of NHSC loan repayment and induce dentists to remain in 'safety net' settings. © 2011 FDI World Dental Federation.
Koenker, Hannah M; Loll, Dana; Rweyemamu, Datius; Ali, Abdullah S
2013-06-13
Intensive malaria control interventions in the United Republic of Tanzania have contributed to reductions in malaria prevalence. Given that malaria control remains reliant upon continued use of long-lasting insecticidal bed nets (LLINs) even when the threat of malaria has been reduced, this qualitative study sought to understand how changes in perceived risk influence LLIN usage, and to explore in more detail the benefits of net use that are unrelated to malaria. Eleven focus group discussions were conducted in Bukoba Rural district and in Zanzibar Urban West district in late 2011. Participants were males aged 18 and over, females between the ages of 18 and 49, and females at least 50 years old. The perceived risk of malaria had decreased among the respondents, and malaria control interventions were credited for the decline. Participants cited reductions in both the severity of malaria and in their perceived susceptibility to malaria. However, malaria was still considered a significant threat. Participants' conceptualization of risk appeared to be an important consideration for net use. At the same time, comfort and aspects of comfort (getting a good night's sleep, avoiding biting pests) appeared to play a large role in personal decisions to use nets consistently or not. Barriers to comfort (feeling uncomfortable or trapped; perceived difficulty breathing, or itching/rashes) were frequently cited as reasons not to use a net consistently. While it was apparent that participants acknowledged the malaria-prevention benefits of net use, the exploration of the risk and comfort determinants of net use provides a richer understanding of net use behaviours, particularly in a setting where transmission has fallen and yet consistent net use is still crucial to maintaining those gains. Future behaviour change communication campaigns should capitalize on the non-malaria benefits of net use that provide a long-term rationale for consistent use even when the immediate threat of malaria transmission has been reduced.
Framing U-Net via Deep Convolutional Framelets: Application to Sparse-View CT.
Han, Yoseob; Ye, Jong Chul
2018-06-01
X-ray computed tomography (CT) using sparse projection views is a recent approach to reducing the radiation dose. However, due to the insufficient projection views, an analytic reconstruction approach using filtered back projection (FBP) produces severe streaking artifacts. Recently, deep learning approaches using large receptive field neural networks such as U-Net have demonstrated impressive performance for sparse-view CT reconstruction. However, theoretical justification is still lacking. Inspired by the recent theory of deep convolutional framelets, the main goal of this paper is, therefore, to reveal the limitations of U-Net and propose new multi-resolution deep learning schemes. In particular, we show that alternative U-Net variants, such as the dual frame and tight frame U-Nets, satisfy the so-called frame condition, which makes them more effective at recovering high-frequency edges in sparse-view CT. Using extensive experiments with a real patient data set, we demonstrate that the new network architectures provide better reconstruction performance.
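For readers unfamiliar with the term, the frame condition invoked above has a standard statement; the inequality below is the generic form for an analysis operator with atoms \(\varphi_{k}\) and frame bounds A and B, and is given here only as background, not as a reproduction of the paper's specific operators or constants.
\[
  A\,\lVert x\rVert^{2} \;\le\; \sum_{k}\bigl|\langle x,\varphi_{k}\rangle\bigr|^{2} \;\le\; B\,\lVert x\rVert^{2},
  \qquad 0 < A \le B < \infty .
\]
A tight frame is the special case A = B, in which the decomposition preserves signal energy exactly; this is the property that motivates the tight frame U-Net variant.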
Petersson, Ingemar F; Strömbeck, Britta; Andersen, Lene; Cimmino, Marco; Greiff, Rolf; Loza, Estibaliz; Sciré, Carlo; Stamm, Tanja; Stoffer, Michaela; Uhlig, Till; Woolf, Anthony D; Vliet Vlieland, Theodora P M
2014-05-01
Eumusc.net (http://www.eumusc.net) is a European project supported by the EU and the European League Against Rheumatism to improve musculoskeletal care in Europe. To develop patient-centred healthcare quality indicators (HCQIs) for healthcare provision for rheumatoid arthritis (RA) patients. Based on a systematic literature search, existing HCQIs for RA were identified and their contents analysed and categorised with reference to a list of 16 standards of care developed within the eumusc.net project. An international expert panel comprising 14 healthcare providers and two patient representatives added topics and, during repeated Delphi processes by email, ranked the topics and rephrased the suggested HCQIs; the preliminary set was established during a second expert group meeting. After an audit process by rheumatology units (including academic centres) in six countries (The Netherlands, Norway, Romania, Italy, Austria and Sweden), a final version of the HCQIs was established. 56 possible topics for HCQIs were processed, resulting in a final set of HCQIs for RA (n=14) including two for structure (patient information and calculation of composite scores), 11 for process (eg, access to care, assessments, and pharmacological and non-pharmacological treatments) and one for outcome (effect of treatment on disease activity). They included definitions to be used in clinical practice and also by patients. Further, the numerators and the denominators for each HCQI were defined. A set of 14 patient-centred HCQIs for RA was developed to be used in quality improvement and benchmarking in countries across Europe.
NASA Technical Reports Server (NTRS)
Koster, Randal D.; Fekete, Balazs M.; Huffman, George J.; Stackhouse, Paul W.
2006-01-01
The International Satellite Land Surface Climatology Project Initiative 2 (ISLSCP-2) data set provides the data needed to characterize the surface water budget across much of the globe in terms of energy availability (net radiation) and water availability (precipitation) controls. The data, on average, are shown to be consistent with Budyko's decades-old framework, thereby demonstrating the continuing relevance of Budyko's semiempirical relationships. This consistency, however, appears only when a small subset of the data with hydrologically suspicious behavior is removed from the analysis. In general, the precipitation, net radiation, and runoff data also appear consistent in their interannual variability and in the phasing of their seasonal cycles.
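As background for the Budyko framework referenced above, one widely cited closed form of the relationship between the evaporative fraction and the aridity index is shown below; it is offered as an illustration of this semiempirical family of curves, not necessarily the exact formulation used in the ISLSCP-2 analysis.
\[
  \frac{E}{P} \;=\; \left[\phi \,\tanh\!\left(\tfrac{1}{\phi}\right)\left(1 - e^{-\phi}\right)\right]^{1/2},
  \qquad \phi = \frac{R_{n}}{\lambda P},
\]
where E is mean annual evaporation, P is precipitation, R_n is net radiation, \(\lambda\) is the latent heat of vaporization, and \(\phi\) is the dryness (aridity) index; in the energy-limited limit (\(\phi \ll 1\)) nearly all available energy is used for evaporation, while in the water-limited limit (\(\phi \gg 1\)) nearly all precipitation evaporates.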
2013-01-01
Background Indoor residual insecticide spraying (IRS) and long-lasting insecticide treated nets (LLINs) are commonly used together even though evidence that such combinations confer greater protection against malaria than either method alone is inconsistent. Methods A deterministic model of mosquito life cycle processes was adapted to allow parameterization with results from experimental hut trials of various combinations of untreated nets or LLINs (Olyset®, PermaNet 2.0®, Icon Life® nets) with IRS (pirimiphos methyl, lambda cyhalothrin, DDT), in a setting where vector populations are dominated by Anopheles arabiensis, so that community level impact upon malaria transmission at high coverage could be predicted. Results Intact untreated nets alone provide equivalent personal protection to all three LLINs. Relative to IRS plus untreated nets, community level protection is slightly higher when Olyset® or PermaNet 2.0® nets are added onto IRS with pirimiphos methyl or lambda cyhalothrin but not DDT, and when Icon Life® nets supplement any of the IRS insecticides. Adding IRS onto any net modestly enhances communal protection when pirimiphos methyl is sprayed, while spraying lambda cyhalothrin enhances protection for untreated nets but not LLINs. Addition of DDT reduces communal protection when added to LLINs. Conclusions Where transmission is mediated primarily by An. arabiensis, adding IRS to high LLIN coverage provides only modest incremental benefit (e.g. when an organophosphate like pirimiphos methyl is used), but can be redundant (e.g. when a pyrethroid like lambda cyhalothrin is used) or even regressive (e.g. when DDT is used for the IRS). Relative to IRS plus untreated nets, supplementing IRS with LLINs will only modestly improve community protection. Beyond the physical protection that intact nets provide, additional protection against transmission by An. arabiensis conferred by insecticides will be remarkably small, regardless of whether they are delivered as LLINs or IRS. The insecticidal action of LLINs and IRS probably already approaches their absolute limit of potential impact upon this persistent vector, so the personal protection offered by nets should be enhanced by improving their physical integrity and durability. Combining LLINs and non-pyrethroid IRS in residual transmission systems may nevertheless be justified as a means to manage insecticide resistance and prevent potential rebound of not only An. arabiensis, but also more potent, vulnerable and historically important species such as Anopheles gambiae and Anopheles funestus. PMID:23324456
SEMANTIC3D.NET: a New Large-Scale Point Cloud Classification Benchmark
NASA Astrophysics Data System (ADS)
Hackel, T.; Savinov, N.; Ladicky, L.; Wegner, J. D.; Schindler, K.; Pollefeys, M.
2017-05-01
This paper presents a new 3D point cloud classification benchmark data set with over four billion manually labelled points, meant as input for data-hungry (deep) learning methods. We also discuss first submissions to the benchmark that use deep convolutional neural networks (CNNs) as a workhorse, which already show remarkable performance improvements over the state of the art. CNNs have become the de facto standard for many tasks in computer vision and machine learning, like semantic segmentation or object detection in images, but have not yet led to a true breakthrough for 3D point cloud labelling tasks due to a lack of training data. With the massive data set presented in this paper, we aim at closing this data gap to help unleash the full potential of deep learning methods for 3D labelling tasks. Our semantic3D.net data set consists of dense point clouds acquired with static terrestrial laser scanners. It contains 8 semantic classes and covers a wide range of urban outdoor scenes: churches, streets, railroad tracks, squares, villages, soccer fields and castles. We describe our labelling interface and show that our data set provides denser and more complete point clouds, with a much higher overall number of labelled points, than those already available to the research community. We further provide baseline method descriptions and a comparison between methods submitted to our online system. We hope semantic3D.net will pave the way for deep learning methods in 3D point cloud labelling to learn richer, more general 3D representations, and first submissions after only a few months indicate that this might indeed be the case.
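Benchmarks of this kind are usually scored with overall accuracy and per-class intersection-over-union (IoU) on the point labels; the Python sketch below computes those two quantities from flat integer label arrays and is only an illustration of the standard metrics, not the official semantic3D.net evaluation code.

import numpy as np

def evaluate_labels(pred, gt, num_classes=8):
    """Overall accuracy and per-class IoU for integer label arrays of equal length."""
    pred = np.asarray(pred)
    gt = np.asarray(gt)
    overall_acc = float(np.mean(pred == gt))
    ious = []
    for c in range(num_classes):
        tp = np.sum((pred == c) & (gt == c))
        fp = np.sum((pred == c) & (gt != c))
        fn = np.sum((pred != c) & (gt == c))
        denom = tp + fp + fn
        ious.append(tp / denom if denom > 0 else float("nan"))
    return overall_acc, ious

# Tiny example with made-up labels for three classes:
acc, ious = evaluate_labels([0, 1, 2, 2], [0, 1, 2, 1], num_classes=3)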
Patient portal readiness among postpartum patients in a safety net setting.
Wieland, Daryl; Gibeau, Anne; Dewey, Caitlin; Roshto, Melanie; Frankel, Hilary
2017-07-05
Maternity patients interact with the healthcare system over an approximately ten-month interval, requiring multiple visits, acquiring pregnancy-specific education, and sharing health information among providers. Many features of a web-based patient portal could help pregnant women manage their interactions with the healthcare system; however, it is unclear whether pregnant women in safety-net settings have the resources, skills or interest required for portal adoption. In this study of postpartum patients in a safety net hospital, we aimed to: (1) determine if patients have the technical resources and skills to access a portal, (2) gain insight into their interest in health information, and (3) identify the perceived utility of portal features and potential barriers to adoption. We developed a structured questionnaire to collect demographics from postpartum patients and measure use of technology and the internet, self-reported literacy, interest in health information, awareness of portal functions, and perceived barriers to use. The questionnaire was administered in person to women in an inpatient setting. Of the 100 participants surveyed, 95% reported routine internet use and 56% used it to search for health information. Most participants had never heard of a patient portal, yet 92% believed that the portal functions were important. The two most appealing functions were to check results and manage appointments. Most participants in this study have the resources required to access a patient portal, such as a device and familiarity with the internet, and are interested in interacting with a healthcare institution via electronic means. Pregnancy is a critical episode of care where active engagement with the healthcare system can influence outcomes. Healthcare systems and portal developers should consider ways to tailor a portal to address the specific health needs of a maternity population, including those in a safety net setting.
2008-01-01
This report documents the computer program INFIL3.0, which is a grid-based, distributed-parameter, deterministic water-balance watershed model that calculates the temporal and spatial distribution of daily net infiltration of water across the lower boundary of the root zone. The bottom of the root zone is the estimated maximum depth below ground surface affected by evapotranspiration. In many field applications, net infiltration below the bottom of the root zone can be assumed to equal net recharge to an underlying water-table aquifer. The daily water balance simulated by INFIL3.0 includes precipitation as either rain or snow; snowfall accumulation, sublimation, and snowmelt; infiltration into the root zone; evapotranspiration from the root zone; drainage and water-content redistribution within the root-zone profile; surface-water runoff from, and run-on to, adjacent grid cells; and net infiltration across the bottom of the root zone. The water-balance model uses daily climate records of precipitation and air temperature and a spatially distributed representation of drainage-basin characteristics defined by topography, geology, soils, and vegetation to simulate daily net infiltration at all locations, including stream channels with intermittent streamflow in response to runoff from rain and snowmelt. The model does not simulate streamflow originating as ground-water discharge. Drainage-basin characteristics are represented in the model by a set of spatially distributed input variables uniquely assigned to each grid cell of a model grid. The report provides a description of the conceptual model of net infiltration on which the INFIL3.0 computer code is based and a detailed discussion of the methods by which INFIL3.0 simulates the net-infiltration process. The report also includes instructions for preparing input files necessary for an INFIL3.0 simulation, a description of the output files that are created as part of an INFIL3.0 simulation, and a sample problem that illustrates application of the code to a field setting. Brief descriptions of the main program routine and of each of the modules and subroutines of the INFIL3.0 code, as well as definitions of the variables used in each subroutine, are provided in an appendix.
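The daily bookkeeping described above can be illustrated with a deliberately simplified, single-cell bucket model; the function below, including its variable names, thresholds, degree-day snowmelt and linear ET terms, is an invented sketch for orientation only and does not reproduce INFIL3.0's actual algorithms, which include multi-layer root-zone redistribution and cell-to-cell runoff routing.

def daily_water_balance(precip, air_temp, storage, snowpack,
                        field_capacity=100.0,  # mm of root-zone storage above which water drains (hypothetical)
                        melt_rate=3.0,         # mm of melt per degree-day (hypothetical)
                        et_frac=0.02):         # fraction of root-zone storage evapotranspired per day (hypothetical)
    """One day of a toy root-zone water balance; returns (net_infiltration, storage, snowpack) in mm."""
    # Partition precipitation into rain or snow by air temperature.
    if air_temp > 0.0:
        rain = precip
    else:
        rain = 0.0
        snowpack += precip
    # Degree-day snowmelt adds to the water supplied to the root zone.
    melt = min(snowpack, max(air_temp, 0.0) * melt_rate)
    snowpack -= melt
    storage += rain + melt
    # Evapotranspiration removes water from the root zone.
    storage -= storage * et_frac
    # Water in excess of field capacity drains below the root zone: the daily net infiltration.
    # (Surface runoff and run-on between grid cells, which INFIL3.0 simulates, are omitted here.)
    net_infiltration = max(0.0, storage - field_capacity)
    storage -= net_infiltration
    return net_infiltration, storage, snowpack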
Validation results of satellite mock-up capturing experiment using nets
NASA Astrophysics Data System (ADS)
Medina, Alberto; Cercós, Lorenzo; Stefanescu, Raluca M.; Benvenuto, Riccardo; Pesce, Vincenzo; Marcon, Marco; Lavagna, Michèle; González, Iván; Rodríguez López, Nuria; Wormnes, Kjetil
2017-05-01
The PATENDER activity (Net parametric characterization and parabolic flight), funded by the European Space Agency (ESA) via its Clean Space initiative, aimed to validate a simulation tool for designing nets for capturing space debris. This validation has been performed through a set of different experiments under microgravity conditions where a net was launched, capturing and wrapping a satellite mock-up. This paper presents the architecture of the thrown-net dynamics simulator together with the set-up of the deployment experiment and its trajectory reconstruction results on a parabolic flight (Novespace A-310, June 2015). The simulator has been implemented within the Blender framework in order to provide a highly configurable tool, able to reproduce different scenarios for Active Debris Removal missions. The experiment has been performed over thirty parabolas offering around 22 s of zero-g conditions. A flexible meshed fabric structure (the net), ejected from a container and propelled by corner masses (the bullets) arranged around its circumference, was launched at different initial velocities and launch angles using a dedicated pneumatic mechanism (representing the chaser satellite) against a target mock-up (the target satellite). High-speed motion cameras recorded the experiment, allowing 3D reconstruction of the net motion. The net knots were coloured to allow post-processing of the images using colour segmentation, stereo matching and iterative closest point (ICP) for knot tracking. The final objective of the activity was the validation of the net deployment and wrapping simulator using images recorded during the parabolic flight. The high-resolution images acquired have been post-processed to determine the initial conditions accurately and to generate the reference data (position and velocity of all knots of the net along its deployment and wrapping of the target mock-up) for the simulator validation. The simulator has been properly configured according to the parabolic flight scenario, and executed in order to generate the validation data. Both datasets have been compared according to different metrics in order to perform the validation of the PATENDER simulator.
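Knot tracking in the image post-processing relies on iterative closest point (ICP) alignment; the sketch below is a generic, minimal rigid ICP (nearest neighbours from a k-d tree plus a Kabsch/SVD update), given only to illustrate the technique named in the abstract, not the project's actual stereo tracking pipeline.

import numpy as np
from scipy.spatial import cKDTree

def icp(source, target, iterations=20):
    """Rigidly align source (N,3) points to target (M,3) points; returns (R, t, aligned source)."""
    src = np.asarray(source, dtype=float).copy()
    tgt = np.asarray(target, dtype=float)
    tree = cKDTree(tgt)
    R_total, t_total = np.eye(3), np.zeros(3)
    for _ in range(iterations):
        # 1. Correspondences: nearest target point for every source point.
        _, idx = tree.query(src)
        matched = tgt[idx]
        # 2. Best-fit rotation/translation (Kabsch) for the current correspondences.
        src_c, tgt_c = src.mean(axis=0), matched.mean(axis=0)
        H = (src - src_c).T @ (matched - tgt_c)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:          # guard against reflections
            Vt[-1, :] *= -1
            R = Vt.T @ U.T
        t = tgt_c - R @ src_c
        # 3. Apply the update and accumulate the total transform.
        src = (R @ src.T).T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total, src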
Character Sets for PLATO/NovaNET: An Expository Catalog.
ERIC Educational Resources Information Center
Gilpin, John B.
The PLATO and NovaNET computer-based instructional systems use a fixed system character set ("normal font") and an author-definable character set ("alternate font"). The alternate font lets the author construct his own symbols and bitmapped pictures. This expository catalog allows users to determine quickly (1) whether there is…
The Influence of Health Policy and Market Factors on the Hospital Safety Net
Bazzoli, Gloria J; Lindrooth, Richard C; Kang, Ray; Hasnain-Wynia, Romana
2006-01-01
Objective To examine how the financial pressures resulting from the Balanced Budget Act (BBA) of 1997 interacted with private sector pressures to affect indigent care provision. Data Sources/Study Setting American Hospital Association Annual Survey, Area Resource File, InterStudy Health Maintenance Organization files, Current Population Survey, and Bureau of Primary Health Care data. Study Design We distinguished core and voluntary safety net hospitals in our analysis. Core safety net hospitals provide a large share of uncompensated care in their markets and have large indigent care patient mix. Voluntary safety net hospitals provide substantial indigent care but less so than core hospitals. We examined the effect of financial pressure in the initial year of the 1997 BBA on uncompensated care for three hospital groups. Data for 1996–2000 were analyzed using approaches that control for hospital and market heterogeneity. Data Collection/Extraction Methods All urban U.S. general acute care hospitals with complete data for at least 2 years between 1996 and 2000, which totaled 1,693 institutions. Principal Findings Core safety net hospitals reduced their uncompensated care in response to Medicaid financial pressure. Voluntary safety net hospitals also responded in this way but only when faced with the combined forces of Medicaid and private sector payment pressures. Nonsafety net hospitals did not exhibit similar responses. Conclusions Our results are consistent with theories of hospital behavior when institutions face reductions in payment. They raise concern given continuing state budget crises plus the focus of recent federal deficit reduction legislation intended to cut Medicaid expenditures. PMID:16899001
NASA Astrophysics Data System (ADS)
Schaap, Dick M. A.; Fichaut, Michele
2013-04-01
The second phase of the project SeaDataNet started in October 2011 for another 4 years with the aim of upgrading the SeaDataNet infrastructure built during previous years. The numbers of the project are quite impressive: 59 institutions from 35 different countries are involved. In particular, 45 data centers are sharing human and financial resources in a common effort to sustain an operationally robust and state-of-the-art Pan-European infrastructure for providing up-to-date and high-quality access to ocean and marine metadata, data and data products. The main objective of SeaDataNet II is to improve operations and to progress towards an efficient data management infrastructure able to handle the diversity and large volume of data collected via the Pan-European oceanographic fleet and the new observation systems, both in real-time and delayed mode. The infrastructure is based on a semi-distributed system that incorporates and enhances the existing NODC network. SeaDataNet aims at serving users from science, environmental management, policy making, and economic sectors. Better integrated data systems are vital for these users to achieve improved scientific research and results, to support marine environmental and integrated coastal zone management, to establish indicators of Good Environmental Status for sea basins, and to support offshore industry developments, shipping, fisheries, and other economic activities. The recent EU communication "MARINE KNOWLEDGE 2020 - marine data and observation for smart and sustainable growth" states that the creation of marine knowledge begins with observation of the seas and oceans. In addition, directives, policies and science programmes require reporting of the state of the seas and oceans in an integrated pan-European manner: of particular note are INSPIRE, MSFD, WISE-Marine and GMES Marine Core Service. These underpin the importance of a well-functioning marine and ocean data management infrastructure. SeaDataNet is now one of the major players in informatics in oceanography and collaborative relationships have been created with other EU and non-EU projects. In particular, SeaDataNet has recognised roles in the continuous serving of common vocabularies, the provision of tools for data management, as well as giving access to metadata, data sets and data products of importance for society. The SeaDataNet infrastructure comprises a network of interconnected data centres and a central SeaDataNet portal. The portal provides users not only background information about SeaDataNet and the various SeaDataNet standards and tools, but also a unified and transparent overview of the metadata and controlled access to the large collections of data sets, managed by the interconnected data centres. The presentation will give information on the present services of the SeaDataNet infrastructure and highlight a number of key achievements in SeaDataNet II so far.
EMODNet Bathymetry - building and providing a high resolution digital bathymetry for European seas
NASA Astrophysics Data System (ADS)
Schaap, Dick M. A.
2016-04-01
Access to marine data is a key issue for the EU Marine Strategy Framework Directive and the EU Marine Knowledge 2020 agenda and includes the European Marine Observation and Data Network (EMODNet) initiative. EMODNet aims at assembling European marine data, data products and metadata from diverse sources in a uniform way. The EMODNet data infrastructure is developed through a stepwise approach in three major phases. Currently EMODNet is entering its 3rd phase, with operational portals providing access to marine data for bathymetry, geology, physics, chemistry, biology, seabed habitats and human activities, complemented by checkpoint projects analyzing the fitness for purpose of data provision. The EMODNet Bathymetry project develops and publishes Digital Terrain Models (DTM) for the European seas. These are produced from survey and aggregated data sets that are indexed with metadata by adopting the SeaDataNet Common Data Index (CDI) data discovery and access service and the Sextant data products catalogue service. SeaDataNet is a network of major oceanographic data centers around the European seas that manage, operate and further develop a pan-European infrastructure for marine and ocean data management. SeaDataNet is also setting and governing marine data standards, and exploring and establishing interoperability solutions to connect to other e-infrastructures on the basis of standards such as ISO and OGC. The SeaDataNet portal provides users with a number of interrelated meta directories, an extensive range of controlled vocabularies, and the various SeaDataNet standards and tools. SeaDataNet at present gives an overview of and access to more than 1.8 million data sets for physical oceanography, chemistry, geology, geophysics, bathymetry and biology from more than 100 connected data centers from 34 countries riparian to European seas. The latest EMODNet Bathymetry DTM has a resolution of 1/8 arc minute * 1/8 arc minute and covers all European sea regions. Use is made of available and gathered surveys, and already more than 13,000 surveys have been indexed by 27 European data providers from 15 countries, originating from more than 120 organizations. Use is also made of composite DTMs as generated and maintained by several data providers for their areas of interest. Already 44 composite DTMs are included in the Sextant data products catalogue. For areas without coverage, use is made of the latest global DTM from GEBCO, which is a partner in the EMODNet Bathymetry project. In return, GEBCO integrates the EMODNet DTM to achieve an enriched and better result. The catalogue services and the generated EMODNet DTM can be queried and browsed at the dedicated EMODNet Bathymetry portal, which also provides a versatile DTM viewing service with many relevant map layers and functions for data retrieval. Activities are underway for further refinement following user feedback. The EMODNet DTM is publicly available for downloading in various formats. The presentation will highlight key details of the EMODNet Bathymetry project, the recently released EMODNet Digital Bathymetry for all European seas, its portal and its versatile viewer.
Koch, Ina; Schueler, Markus; Heiner, Monika
2005-01-01
To understand biochemical processes caused by, e.g., mutations or deletions in the genome, the knowledge of possible alternative paths between two arbitrary chemical compounds is of increasing interest for biotechnology, pharmacology, medicine, and drug design. With the steadily increasing amount of data from high-throughput experiments, new biochemical networks can be constructed and existing ones can be extended, which results in many large metabolic, signal transduction, and gene regulatory networks. The search for alternative paths within these complex and large networks can provide a huge number of solutions, which cannot be handled manually. Moreover, not all of the alternative paths are generally of interest. Therefore, we have developed and implemented a method that allows us to define constraints to reduce the set of all structurally possible paths to the truly interesting path set. The paper describes the search algorithm and the constraints definition language. We give examples of path searches using this dedicated special language for a Petri net model of the sucrose-to-starch breakdown in the potato tuber.
Koch, Ina; Schüler, Markus; Heiner, Monika
2011-01-01
To understand biochemical processes caused by, e.g., mutations or deletions in the genome, the knowledge of possible alternative paths between two arbitrary chemical compounds is of increasing interest for biotechnology, pharmacology, medicine, and drug design. With the steadily increasing amount of data from high-throughput experiments, new biochemical networks can be constructed and existing ones can be extended, which results in many large metabolic, signal transduction, and gene regulatory networks. The search for alternative paths within these complex and large networks can provide a huge number of solutions, which cannot be handled manually. Moreover, not all of the alternative paths are generally of interest. Therefore, we have developed and implemented a method that allows us to define constraints to reduce the set of all structurally possible paths to the truly interesting path set. The paper describes the search algorithm and the constraints definition language. We give examples of path searches using this dedicated special language for a Petri net model of the sucrose-to-starch breakdown in the potato tuber. http://sanaga.tfh-berlin.de/~stepp/
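The constrained path search idea can be illustrated generically: represent the Petri net as a bipartite directed graph of places and transitions, enumerate simple paths by depth-first search, and keep only those satisfying user-defined predicates. The sketch below does exactly that; the toy net (loosely echoing sucrose breakdown), node names, and constraints are invented for the example, and the real tool's constraint definition language is far richer.

from typing import Dict, List, Callable, Iterable

# A Petri net as a bipartite digraph: places and transitions are node names,
# and arcs map each node to its successors (all names here are hypothetical).
arcs: Dict[str, List[str]] = {
    "sucrose": ["invertase"], "invertase": ["glucose", "fructose"],
    "glucose": ["hexokinase"], "fructose": ["fructokinase"],
    "hexokinase": ["G6P"], "fructokinase": ["F6P"],
    "G6P": ["pgm"], "pgm": ["G1P"], "G1P": [], "F6P": [],
}

def simple_paths(start: str, goal: str) -> Iterable[List[str]]:
    """Enumerate all cycle-free paths from start to goal by depth-first search."""
    stack = [(start, [start])]
    while stack:
        node, path = stack.pop()
        if node == goal:
            yield path
            continue
        for nxt in arcs.get(node, []):
            if nxt not in path:                 # forbid revisiting: simple paths only
                stack.append((nxt, path + [nxt]))

def constrained_paths(start: str, goal: str,
                      constraints: List[Callable[[List[str]], bool]]) -> List[List[str]]:
    """Keep only the paths that satisfy every constraint predicate."""
    return [p for p in simple_paths(start, goal) if all(c(p) for c in constraints)]

# Example constraints: the path must pass through "glucose" and contain at most 8 nodes.
paths = constrained_paths("sucrose", "G1P",
                          [lambda p: "glucose" in p, lambda p: len(p) <= 8])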
2012-01-01
Background Drug safety issues are now recognized as a leading cause of drug withdrawals at various stages of development and at the post-approval stage. Among them, cardiotoxicity remains the main reason, despite the substantial effort put into in vitro and in vivo testing, with the main focus on hERG channel inhibition as the hypothesized surrogate of drug proarrhythmic potency. The large interest in the IKr current has resulted in the development of predictive tools and informative databases describing a drug's susceptibility to interactions with the hERG channel, although there are no similar, publicly available sets of data describing the other ionic currents driven by human cardiomyocyte ionic channels, which are recognized as an overlooked drug safety target. Discussion The aim of this database development and publication was to provide a scientifically useful, easily usable and clearly verifiable set of information describing inhibition data not only for IKr (hERG), but also for other human cardiomyocyte-specific ionic channels (IKs, INa, ICa). Summary The broad range of data (chemical space and in vitro settings) and the easy-to-use interface make tox-database.net a useful tool for interested scientists. Database URL: http://tox-database.net. PMID:22947121
47 CFR 36.605 - Calculation of safety net additive.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 47 Telecommunication 2 2010-10-01 2010-10-01 false Calculation of safety net additive. 36.605... § 36.605 Calculation of safety net additive. (a) “Safety net additive support.” A rural incumbent local exchange carrier shall receive safety net additive support if it satisfies the conditions set forth in...
47 CFR 36.605 - Calculation of safety net additive.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 47 Telecommunication 2 2011-10-01 2011-10-01 false Calculation of safety net additive. 36.605... § 36.605 Calculation of safety net additive. (a) “Safety net additive support.” A rural incumbent local exchange carrier shall receive safety net additive support if it satisfies the conditions set forth in...
Burke, Nancy J
2014-09-20
Approximately 20% of adult cancer patients are eligible to participate in a clinical trial, but only 2.5-9% do so. Accrual is even less for minority and medically underserved populations. As a result, critical life-saving treatments and quality of life services developed from research studies may not address their needs. This study questions the utility of the bioethical concern with therapeutic misconception (TM), a misconception that occurs when research subjects fail to distinguish between clinical research and ordinary treatment, and therefore attribute therapeutic intent to research procedures in the safety net setting. This paper provides ethnographic insight into the ways in which research is discussed and related to standard treatment. In the course of two years of ethnographic fieldwork in a safety net hospital, I conducted clinic observations (n=150 clinic days) and in-depth in-person qualitative interviews with patients (n=37) and providers (n=15). I used standard qualitative methods to organize and code resulting fieldnote and interview data. Findings suggest that TM is limited in relevance for the interdisciplinary context of cancer clinical trial recruitment in the safety net setting. Ethnographic data show the value of the discussions that happen prior to the informed consent, those that introduce the idea of participation in research. These preliminary discussions are elemental especially when recruiting underserved and vulnerable patients for clinical trial participation who are often unfamiliar with medical research and how it relates to medical care. Data also highlight the multiple actors involved in research discussions and the ethics of social justice and patient advocacy they mobilize, suggesting that class, inequality, and dependency influence the forms of ethical engagements in public hospital settings. On the ground ethics of social justice and patient advocacy are more relevant than TM as guiding ethical principles in the context of ongoing cancer disparities and efforts to diversify clinical trial participation.
Assessing Patient Activation among High-Need, High-Cost Patients in Urban Safety Net Care Settings.
Napoles, Tessa M; Burke, Nancy J; Shim, Janet K; Davis, Elizabeth; Moskowitz, David; Yen, Irene H
2017-12-01
We sought to examine the literature using the Patient Activation Measure (PAM) or the Patient Enablement Instrument (PEI) with high-need, high-cost (HNHC) patients receiving care in urban safety net settings. Urban safety net care management programs serve low-income, racially/ethnically diverse patients living with multiple chronic conditions. Although many care management programs track patient progress with the PAM or the PEI, it is not clear whether the PAM or the PEI is an effective and appropriate tool for HNHC patients receiving care in urban safety net settings in the United States. We searched PubMed, EMBASE, Web of Science, and PsycINFO for articles published between 2004 and 2015 that used the PAM and between 1998 and 2015 that used the PEI. The search was limited to English-language articles conducted in the United States and published in peer-reviewed journals. To assess the utility of the PAM and the PEI in urban safety net care settings, we defined a HNHC patient sample as racially/ethnically diverse, low socioeconomic status (SES), and multimorbid. One hundred fourteen articles used the PAM. All articles using the PEI were conducted outside the U.S. and therefore were excluded. Nine PAM studies (8%) included participants similar to those receiving care in urban safety net settings, three of which were longitudinal. Two of the three longitudinal studies reported positive changes following interventions. Our results indicate that research on patient activation is not commonly conducted on racially and ethnically diverse, low SES, and multimorbid patients; therefore, there are few opportunities to assess the appropriateness of the PAM in such populations. Investigators expressed concerns with the potential unreliability and inappropriate nature of the PAM on multimorbid, older, and low-literacy patients. Thus, the PAM may not be able to accurately assess patient progress among HNHC patients receiving care in urban safety net settings. Assessing progress in the urban safety net care setting requires measures that account for the social and structural challenges and competing demands of HNHC patients.
Cornelio-Flores, Oscar; Lemaster, Chelsey; Hernandez, Maria; Fong, Calvin; Resnick, Kirsten; Wardle, Jon; Hanser, Suzanne; Saper, Robert
2017-01-01
Background Little is known about the feasibility of providing massage or music therapy to medical inpatients at urban safety-net hospitals or the impact these treatments may have on patient experience. Objective To determine the feasibility of providing massage and music therapy to medical inpatients and to assess the impact of these interventions on patient experience. Design Single-center 3-arm feasibility randomized controlled trial. Setting Urban academic safety-net hospital. Patients Adult inpatients on the Family Medicine ward. Interventions Massage therapy consisted of a standardized protocol adapted from a previous perioperative study. Music therapy involved a preference assessment, personalized compact disc, music-facilitated coping, singing/playing music, and/or songwriting. Credentialed therapists provided the interventions. Measurements Patient experience was measured with the Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) within 7 days of discharge. We compared the proportion of patients in each study arm reporting “top box” scores for the following a priori HCAHPS domains: pain management, recommendation of hospital, and overall hospital rating. Responses to additional open-ended postdischarge questions were transcribed, coded independently, and analyzed for common themes. Results From July to December 2014, 90 medical inpatients were enrolled; postdischarge data were collected on 68 (76%) medical inpatients. Participants were 70% females, 43% non-Hispanic black, and 23% Hispanic. No differences between groups were observed on HCAHPS. The qualitative analysis found that massage and music therapy were associated with improved overall hospital experience, pain management, and connectedness to the massage or music therapist. Conclusions Providing music and massage therapy in an urban safety-net inpatient setting was feasible. There was no quantitative impact on HCAHPS. Qualitative findings suggest benefits related to an improved hospital experience, pain management, and connectedness to the massage or music therapist. PMID:29085740
Wells, Anjanette A; Palinkas, Lawrence A; Williams, Sha-Lai L; Ell, Kathleen
2015-08-01
Previously published work finds significant benefit from medical and behavioral health team care among safety-net patients with major depression. This qualitative study assessed clinical social worker, psychiatrist and patient navigator strategies to increase depression treatment among low-income minority cancer patients participating in the ADAPt-C clinical depression trial. Patient care retention strategies were elicited through in-depth, semi-structured interviews with nine behavioral health providers. Following a grounded theory approach, concepts from the literature and dropout barriers identified by patients guided the interview prompts. Retention strategies clustered around five dropout barriers: (1) informational, (2) instrumental, (3) provider-patient therapeutic alliance, (4) clinic setting, and (5) depression treatment. All strategies emphasized the importance of communication between providers and patients. Findings suggest that a strong therapeutic alliance and telephone contact facilitate collaborative team provider communication and depression treatment retention among patients in safety-net oncology care systems.
RIPGIS-NET: a GIS tool for riparian groundwater evapotranspiration in MODFLOW.
Ajami, Hoori; Maddock, Thomas; Meixner, Thomas; Hogan, James F; Guertin, D Phillip
2012-01-01
RIPGIS-NET, an Environmental Systems Research Institute (ESRI) ArcGIS 9.2/9.3 custom application, was developed to derive parameters and visualize results of spatially explicit riparian groundwater evapotranspiration (ETg), that is, evapotranspiration from the saturated zone, in groundwater flow models for ecohydrology, riparian ecosystem management, and stream restoration. Specifically, RIPGIS-NET works with riparian evapotranspiration (RIP-ET), a modeling package that works with the MODFLOW groundwater flow model. RIP-ET improves ETg simulations by using a set of eco-physiologically based ETg curves for plant functional subgroups (PFSGs), and separates ground evaporation and plant transpiration processes from the water table. The RIPGIS-NET program was developed in Visual Basic 2005 on the .NET Framework 2.0, and runs in the ArcMap 9.2 and 9.3 applications. RIPGIS-NET, a pre- and post-processor for RIP-ET, incorporates spatial variability of riparian vegetation and land surface elevation into ETg estimation in MODFLOW groundwater models. RIPGIS-NET derives RIP-ET input parameters, including PFSG evapotranspiration curve parameters, fractional coverage areas of each PFSG in a MODFLOW cell, and average surface elevation per riparian vegetation polygon using a digital elevation model. RIPGIS-NET also provides visualization tools for modelers to create head maps and depth to water table (DTWT) maps, and to plot DTWT for a PFSG in a polygon in the Geographic Information System, based on MODFLOW simulation results. © 2011, The Author(s). Ground Water © 2011, National Ground Water Association.
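One of the parameters RIPGIS-NET derives, the fractional coverage of each plant functional subgroup (PFSG) within a MODFLOW cell, amounts to a polygon-overlay area calculation; the sketch below illustrates it with Shapely on an invented vegetation polygon and a hypothetical 100 m grid, which is not the tool's actual ArcGIS/Visual Basic implementation.

from shapely.geometry import Polygon, box

# Hypothetical riparian vegetation polygon mapped for one PFSG (coordinates in metres).
pfsg_polygon = Polygon([(0, 0), (250, 0), (250, 120), (0, 120)])

# Hypothetical MODFLOW grid: 100 m square cells, origin at (0, 0), 2 rows x 3 columns.
cell_size = 100.0
fractions = {}
for row in range(2):
    for col in range(3):
        cell = box(col * cell_size, row * cell_size,
                   (col + 1) * cell_size, (row + 1) * cell_size)
        overlap = cell.intersection(pfsg_polygon).area
        fractions[(row, col)] = overlap / cell.area   # fractional PFSG coverage of the cell

# e.g. fractions[(0, 0)] == 1.0 (fully covered) and fractions[(1, 2)] == 0.1 (partly covered)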
Recent work on network application layer: MioNet, the virtual workplace for small businesses
NASA Astrophysics Data System (ADS)
Hesselink, Lambertus; Rizal, Dharmarus; Bjornson, Eric; Miller, Brian; Chan, Keith
2005-11-01
Small businesses must be extremely efficient and smartly leverage their resources, suppliers, and partners to successfully compete with larger firms. A successful small business requires a set of companies with interlocking business relationships that are dynamic and needs-based. There has been no software solution that creates a secure and flexible way to efficiently connect small business computer-based employees and partners. In this invited paper, we discuss MioNet, a secure and powerful data management platform which may provide millions of small businesses with a virtual workplace and help them to succeed.
Performance Measurement and Target-Setting in California's Safety Net Health Systems.
Hemmat, Shirin; Schillinger, Dean; Lyles, Courtney; Ackerman, Sara; Gourley, Gato; Vittinghoff, Eric; Handley, Margaret; Sarkar, Urmimala
Health policies encourage implementing quality measurement with performance targets. The 2010-2015 California Medicaid waiver mandated quality measurement and reporting. In 2013, California safety net hospitals participating in the waiver set a voluntary performance target (the 90th percentile for Medicare preferred provider organization plans) for mammography screening and cholesterol control in diabetes. They did not reach the target, and the difference-in-differences analysis suggested that there was no difference for mammography (P = .39) and low-density lipoprotein control (P = .11) performance compared to measures for which no statewide quality improvement initiative existed. California's Medicaid waiver was associated with improved performance on a number of metrics, but this performance was not attributable to target setting on specific health conditions. Performance may have improved because of secular trends or systems improvements related to waiver funding. Relying on condition-specific targets to measure performance may underestimate improvements and disadvantage certain health systems. Achieving ambitious targets likely requires sustained fiscal, management, and workforce investments.
Application of Risk within Net Present Value Calculations for Government Projects
NASA Technical Reports Server (NTRS)
Grandl, Paul R.; Youngblood, Alisha D.; Componation, Paul; Gholston, Sampson
2007-01-01
In January 2004, President Bush announced a new vision for space exploration. This included retirement of the current Space Shuttle fleet by 2010 and the development of a new set of launch vehicles. The President's vision did not include significant increases in the NASA budget, so these development programs need to be cost-conscious. Current trade study procedures address factors such as performance, reliability, safety, manufacturing, maintainability, operations, and costs. It would be desirable, however, to have increased insight into the cost factors behind each of the proposed system architectures. This paper reports on a set of component trade studies completed on the upper stage engine for the new launch vehicles. Increased insight into architecture costs was developed by including a Net Present Value (NPV) method and applying a set of associated risks to the base parametric cost data. The use of the NPV method along with the risks was found to add fidelity to the trade study and provide additional information to support the selection of a more robust design architecture.
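The combination of NPV with a set of risks can be illustrated by treating each year's cash flow as uncertain and propagating that uncertainty through the discounting; the cash flows, discount rate, and triangular risk factors below are invented for the example and are not the paper's cost data.

import numpy as np

def npv(rate, cash_flows):
    """Net present value of cash flows indexed by year (year 0 first)."""
    years = np.arange(len(cash_flows))
    return np.sum(np.asarray(cash_flows) / (1.0 + rate) ** years)

rng = np.random.default_rng(0)
discount_rate = 0.07                                  # hypothetical
base = np.array([-250.0, 60.0, 80.0, 90.0, 95.0])     # $M, hypothetical development cost then returns

# Risk: scale each year's cash flow by a triangular factor (pessimistic / most likely / optimistic).
n_trials = 10_000
factors = rng.triangular(0.8, 1.0, 1.1, size=(n_trials, base.size))
npv_samples = np.array([npv(discount_rate, base * f) for f in factors])

print("deterministic NPV:", round(npv(discount_rate, base), 1))
print("mean NPV:", round(npv_samples.mean(), 1),
      " 5th-95th percentile:", np.round(np.percentile(npv_samples, [5, 95]), 1))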
The geospatial data quality REST API for primary biodiversity data
Otegui, Javier; Guralnick, Robert P.
2016-01-01
Summary: We present a REST web service to assess the geospatial quality of primary biodiversity data. It enables access to basic and advanced functions to detect completeness and consistency issues as well as general errors in the provided record or set of records. The API uses JSON for data interchange and efficient parallelization techniques for fast assessments of large datasets. Availability and implementation: The Geospatial Data Quality API is part of the VertNet set of APIs. It can be accessed at http://api-geospatial.vertnet-portal.appspot.com/geospatial and is already implemented in the VertNet data portal for quality reporting. Source code is freely available under GPL license from http://www.github.com/vertnet/api-geospatial. Contact: javier.otegui@gmail.com or rguralnick@flmnh.ufl.edu. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26833340
The geospatial data quality REST API for primary biodiversity data.
Otegui, Javier; Guralnick, Robert P
2016-06-01
We present a REST web service to assess the geospatial quality of primary biodiversity data. It enables access to basic and advanced functions to detect completeness and consistency issues as well as general errors in the provided record or set of records. The API uses JSON for data interchange and efficient parallelization techniques for fast assessments of large datasets. The Geospatial Data Quality API is part of the VertNet set of APIs. It can be accessed at http://api-geospatial.vertnet-portal.appspot.com/geospatial and is already implemented in the VertNet data portal for quality reporting. Source code is freely available under GPL license from http://www.github.com/vertnet/api-geospatial. Contact: javier.otegui@gmail.com or rguralnick@flmnh.ufl.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
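A call to a service of this kind would typically look like the sketch below; the base URL is the one given in the abstract, but the parameter names (decimalLatitude, decimalLongitude, countryCode) are assumptions borrowed from common Darwin Core field names and may not match the API's actual interface, so treat this purely as an illustration of a JSON-over-REST request.

import requests

BASE_URL = "http://api-geospatial.vertnet-portal.appspot.com/geospatial"

# Hypothetical single-record check: the parameter names are illustrative only.
params = {
    "decimalLatitude": -0.4735,
    "decimalLongitude": -76.3785,
    "countryCode": "EC",
}
response = requests.get(BASE_URL, params=params, timeout=30)
response.raise_for_status()
report = response.json()          # the service exchanges data as JSON
print(report)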
Development of Bioinformatics Infrastructure for Genomics Research.
Mulder, Nicola J; Adebiyi, Ezekiel; Adebiyi, Marion; Adeyemi, Seun; Ahmed, Azza; Ahmed, Rehab; Akanle, Bola; Alibi, Mohamed; Armstrong, Don L; Aron, Shaun; Ashano, Efejiro; Baichoo, Shakuntala; Benkahla, Alia; Brown, David K; Chimusa, Emile R; Fadlelmola, Faisal M; Falola, Dare; Fatumo, Segun; Ghedira, Kais; Ghouila, Amel; Hazelhurst, Scott; Isewon, Itunuoluwa; Jung, Segun; Kassim, Samar Kamal; Kayondo, Jonathan K; Mbiyavanga, Mamana; Meintjes, Ayton; Mohammed, Somia; Mosaku, Abayomi; Moussa, Ahmed; Muhammd, Mustafa; Mungloo-Dilmohamud, Zahra; Nashiru, Oyekanmi; Odia, Trust; Okafor, Adaobi; Oladipo, Olaleye; Osamor, Victor; Oyelade, Jellili; Sadki, Khalid; Salifu, Samson Pandam; Soyemi, Jumoke; Panji, Sumir; Radouani, Fouzia; Souiai, Oussama; Tastan Bishop, Özlem
2017-06-01
Although pockets of bioinformatics excellence have developed in Africa, generally, large-scale genomic data analysis has been limited by the availability of expertise and infrastructure. H3ABioNet, a pan-African bioinformatics network, was established to build capacity specifically to enable H3Africa (Human Heredity and Health in Africa) researchers to analyze their data in Africa. Since the inception of the H3Africa initiative, H3ABioNet's role has evolved in response to changing needs from the consortium and the African bioinformatics community. H3ABioNet set out to develop core bioinformatics infrastructure and capacity for genomics research in various aspects of data collection, transfer, storage, and analysis. Various resources have been developed to address genomic data management and analysis needs of H3Africa researchers and other scientific communities on the continent. NetMap was developed and used to build an accurate picture of network performance within Africa and between Africa and the rest of the world, and Globus Online has been rolled out to facilitate data transfer. A participant recruitment database was developed to monitor participant enrollment, and data is being harmonized through the use of ontologies and controlled vocabularies. The standardized metadata will be integrated to provide a search facility for H3Africa data and biospecimens. Because H3Africa projects are generating large-scale genomic data, facilities for analysis and interpretation are critical. H3ABioNet is implementing several data analysis platforms that provide a large range of bioinformatics tools or workflows, such as Galaxy, the Job Management System, and eBiokits. A set of reproducible, portable, and cloud-scalable pipelines to support the multiple H3Africa data types are also being developed and dockerized to enable execution on multiple computing infrastructures. In addition, new tools have been developed for analysis of the uniquely divergent African data and for downstream interpretation of prioritized variants. To provide support for these and other bioinformatics queries, an online bioinformatics helpdesk backed by broad consortium expertise has been established. Further support is provided by means of various modes of bioinformatics training. For the past 4 years, the development of infrastructure support and human capacity through H3ABioNet, have significantly contributed to the establishment of African scientific networks, data analysis facilities, and training programs. Here, we describe the infrastructure and how it has affected genomics and bioinformatics research in Africa. Copyright © 2017 World Heart Federation (Geneva). Published by Elsevier B.V. All rights reserved.
Lee, Terrie M.; Sacks, Laura A.; Swancar, Amy
2014-01-01
The long-term balance between net precipitation and net groundwater exchange that maintains thousands of seepage lakes in Florida’s karst terrain is explored at a representative lake basin and then regionally for the State’s peninsular lake district. The 15-year water budget of Lake Starr includes El Niño Southern Oscillation (ENSO)-related extremes in rainfall, and provides the longest record of Bowen ratio energy-budget (BREB) lake evaporation and lake-groundwater exchanges in the southeastern United States. Negative net precipitation averaging -25 cm/yr at Lake Starr overturns the previously-held conclusion that lakes in this region receive surplus net precipitation. Net groundwater exchange with the lake was positive on average but too small to balance the net precipitation deficit. Groundwater pumping effects and surface-water withdrawals from the lake widened the imbalance. Satellite-based regional estimates of potential evapotranspiration at five large lakes in peninsular Florida compared well with basin-scale evaporation measurements from seven open-water sites that used BREB methods. The regional average lake evaporation estimated for Lake Starr during 1996-2011 was within 5 percent of its measured average, and regional net precipitation agreed within 10 percent. Regional net precipitation to lakes was negative throughout central peninsular Florida and the net precipitation deficit increased by about 20 cm from north to south. Results indicate that seepage lakes farther south on the peninsula receive greater net groundwater inflow than northern lakes and imply that northern lakes are in comparatively leakier hydrogeologic settings. Findings reveal the peninsular lake district to be more vulnerable than was previously realized to drier climate, surface-water withdrawals from lakes, and groundwater pumping effects.
NASA Astrophysics Data System (ADS)
Budde, M. E.; Galu, G.; Funk, C. C.; Verdin, J. P.; Rowland, J.
2014-12-01
The Planning for Resilience in East Africa through Policy, Adaptation, Research, and Economic Development (PREPARED) project is a multi-organizational effort aimed at mainstreaming climate-resilient development planning and program implementation into the East African Community (EAC). The Famine Early Warning Systems Network (FEWS NET) has partnered with the PREPARED project to address three key development challenges for the EAC: 1) increasing resiliency to climate change, 2) managing trans-boundary freshwater biodiversity and conservation, and 3) improving access to drinking water supply and sanitation services. USGS FEWS NET has been instrumental in the development of gridded climate data sets that are the fundamental building blocks for climate change adaptation studies in the region. Tools such as the Geospatial Climate Tool (GeoCLIM) have been developed to interpolate time-series grids of precipitation and temperature values from station observations and associated satellite imagery, elevation data, and other spatially continuous fields. The GeoCLIM tool also allows the identification of anomalies and assessments of both their frequency of occurrence and directional trends. A major effort has been put forth to build the capacities of local and regional institutions to use GeoCLIM to integrate their station data (which is not typically available to the public) into improved national and regional gridded climate data sets. In addition to the improvements and capacity building activities related to geospatial analysis tools, FEWS NET will assist in two other areas: 1) downscaling of climate change scenarios and 2) vulnerability impact assessments. FEWS NET will provide expertise in statistical downscaling of Global Climate Model output fields and work with regional institutions to assess results of other downscaling methods. Completion of a vulnerability impact assessment (VIA) involves the examination of sectoral consequences in identified climate "hot spots". FEWS NET will lead the VIA for the agriculture and food security sector, but will also provide key geospatial layers needed by multiple sectors in the areas of exposure, sensitivity, and adaptive capacity. Project implementation will strengthen regional coordination in policy-making, planning, and response to climate change issues.
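The station-to-grid interpolation at the core of tools like GeoCLIM can be illustrated with a bare-bones inverse-distance weighting of station anomalies onto a background field; the blending scheme, station values, and grid below are invented for illustration and do not reproduce GeoCLIM's actual algorithm.

import numpy as np

def idw_anomaly_grid(stations_xy, station_values, background, grid_x, grid_y, power=2.0):
    """Add inverse-distance-weighted station anomalies (station minus background) to a background grid."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    result = background.astype(float).copy()
    weights_sum = np.zeros_like(result)
    weighted_anom = np.zeros_like(result)
    for (sx, sy), val in zip(stations_xy, station_values):
        # Anomaly of the station relative to the background value at the nearest grid node.
        j = np.argmin(np.abs(grid_x - sx))
        i = np.argmin(np.abs(grid_y - sy))
        anom = val - background[i, j]
        dist = np.hypot(gx - sx, gy - sy) + 1e-6      # avoid division by zero at the station itself
        w = 1.0 / dist ** power
        weights_sum += w
        weighted_anom += w * anom
    return result + weighted_anom / weights_sum

# Hypothetical example: 3 rain gauges adjusting a uniform 50 mm satellite background field.
grid_x = np.linspace(34.0, 36.0, 21)   # longitude
grid_y = np.linspace(-2.0, 0.0, 21)    # latitude
background = np.full((21, 21), 50.0)
adjusted = idw_anomaly_grid([(34.5, -1.5), (35.5, -0.5), (35.0, -1.0)],
                            [62.0, 41.0, 55.0], background, grid_x, grid_y)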
Eo, Taejoon; Jun, Yohan; Kim, Taeseong; Jang, Jinseong; Lee, Ho-Joon; Hwang, Dosik
2018-04-06
To demonstrate accurate MR image reconstruction from undersampled k-space data using cross-domain convolutional neural networks (CNNs). METHODS: Cross-domain CNNs consist of 3 components: (1) a deep CNN operating on the k-space (KCNN), (2) a deep CNN operating on an image domain (ICNN), and (3) interleaved data consistency operations. These components are alternately applied, and each CNN is trained to minimize the loss between the reconstructed and corresponding fully sampled k-spaces. The final reconstructed image is obtained by forward-propagating the undersampled k-space data through the entire network. The performances of K-net (KCNN with inverse Fourier transform), I-net (ICNN with interleaved data consistency), and various combinations of the 2 different networks were tested. The test results indicated that K-net and I-net have different advantages/disadvantages in terms of tissue-structure restoration. Consequently, the combination of K-net and I-net is superior to single-domain CNNs. Three MR data sets, the T2 fluid-attenuated inversion recovery (T2 FLAIR) set from the Alzheimer's Disease Neuroimaging Initiative and 2 data sets acquired at our local institute (T2 FLAIR and T1-weighted), were used to evaluate the performance of 7 conventional reconstruction algorithms and the proposed cross-domain CNNs, hereafter referred to as KIKI-net. KIKI-net outperforms conventional algorithms with mean improvements of 2.29 dB in peak SNR and 0.031 in structure similarity. KIKI-net exhibits superior performance over state-of-the-art conventional algorithms in terms of restoring tissue structures and removing aliasing artifacts. The results demonstrate that KIKI-net is applicable up to a reduction factor of 3 to 4 based on variable-density Cartesian undersampling. © 2018 International Society for Magnetic Resonance in Medicine.
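The interleaved data consistency step at the heart of the cross-domain design can be shown in isolation: after a network has produced an estimate, the measured k-space samples are re-imposed at the sampled locations before the next stage. The NumPy sketch below demonstrates that single operation for single-coil Cartesian sampling; the image, mask, and the zero-filled stand-in for a CNN output are synthetic, and the KCNN/ICNN networks themselves are omitted.

import numpy as np

def data_consistency(image_estimate, measured_kspace, mask):
    """Re-impose measured k-space samples on an image estimate (single-coil, Cartesian sampling)."""
    k_est = np.fft.fft2(image_estimate)
    # Keep the estimate's k-space where nothing was measured, the measurement everywhere else.
    k_dc = np.where(mask, measured_kspace, k_est)
    return np.fft.ifft2(k_dc)

# Synthetic example: a random "image" and a regular Cartesian mask on phase-encode lines.
rng = np.random.default_rng(0)
truth = rng.standard_normal((64, 64))
full_k = np.fft.fft2(truth)
mask = np.zeros((64, 64), dtype=bool)
mask[::3, :] = True                        # keep every third phase-encode line (reduction ~3)
measured = full_k * mask
zero_filled = np.fft.ifft2(measured).real  # stand-in for a CNN's intermediate output
consistent = data_consistency(zero_filled, measured, mask)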
Tools for Atmospheric Radiative Transfer: Streamer and FluxNet. Revised
NASA Technical Reports Server (NTRS)
Key, Jeffrey R.; Schweiger, Axel J.
1998-01-01
Two tools for the solution of radiative transfer problems are presented. Streamer is a highly flexible medium spectral resolution radiative transfer model based on the plane-parallel theory of radiative transfer. Capable of computing either fluxes or radiances, it is suitable for studying radiative processes at the surface or within the atmosphere and for the development of remote-sensing algorithms. FluxNet is a fast neural network-based implementation of Streamer for computing surface fluxes. It allows for a sophisticated treatment of radiative processes in the analysis of large data sets and potential integration into geophysical models where computational efficiency is an issue. Documentation and tools for the development of alternative versions of FluxNet are available. Collectively, Streamer and FluxNet solve a wide variety of problems related to radiative transfer: Streamer provides the detail and sophistication needed to perform basic research on most aspects of complex radiative processes, while the efficiency and simplicity of FluxNet make it ideal for operational use.
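FluxNet's general strategy, emulating a slower physics code with a neural network trained on its input/output pairs, can be illustrated generically. In the sketch below the "slow model" is an invented toy flux function, and the features, ranges, and network size are arbitrary; none of it reflects Streamer's actual physics or FluxNet's architecture.

import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-in for the expensive model: surface flux as a nonlinear function of
# (cosine of solar zenith angle, cloud optical depth, surface albedo) -- purely illustrative.
def slow_model(x):
    mu0, tau, albedo = x[:, 0], x[:, 1], x[:, 2]
    return 1361.0 * mu0 * np.exp(-tau / (4.0 * mu0 + 0.1)) * (1.0 - albedo)

X = np.column_stack([rng.uniform(0.05, 1.0, 20000),
                     rng.uniform(0.0, 30.0, 20000),
                     rng.uniform(0.05, 0.9, 20000)])
y = slow_model(X)

# Train a small multilayer perceptron as the fast surrogate.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
surrogate.fit(X_train, y_train)
print("R^2 on held-out samples:", round(surrogate.score(X_test, y_test), 3))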
Using fuzzy logic to integrate neural networks and knowledge-based systems
NASA Technical Reports Server (NTRS)
Yen, John
1991-01-01
Outlined here is a novel hybrid architecture that uses fuzzy logic to integrate neural networks and knowledge-based systems. The author's approach offers important synergistic benefits to neural nets, approximate reasoning, and symbolic processing. Fuzzy inference rules extend symbolic systems with approximate reasoning capabilities, which are used for integrating and interpreting the outputs of neural networks. The symbolic system captures meta-level information about neural networks and defines its interaction with neural networks through a set of control tasks. Fuzzy action rules provide a robust mechanism for recognizing the situations in which neural networks require certain control actions. The neural nets, on the other hand, offer flexible classification and adaptive learning capabilities, which are crucial for dynamic and noisy environments. By combining neural nets and symbolic systems at their system levels through the use of fuzzy logic, the author's approach alleviates current difficulties in reconciling differences between low-level data processing mechanisms of neural nets and artificial intelligence systems.
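A minimal illustration of the integration idea is sketched below: triangular membership functions turn a neural network's numeric outputs into fuzzy degrees, and a single fuzzy rule (min as AND) interprets them as a control decision. The scores, fuzzy sets, and rule are all invented for the example and are not taken from the paper.

def triangular(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical neural-network outputs: confidence scores for two fault classes.
nn_scores = {"overheat": 0.72, "vibration": 0.35}

# Fuzzy sets defined over the score range [0, 1].
def high(score):
    return triangular(score, 0.5, 1.0, 1.5)

def medium(score):
    return triangular(score, 0.2, 0.5, 0.8)

# Fuzzy rule: IF overheat is HIGH AND vibration is MEDIUM THEN shutdown is WARRANTED.
# min() implements the fuzzy AND; the result is the degree to which the control action applies.
shutdown_degree = min(high(nn_scores["overheat"]), medium(nn_scores["vibration"]))
print(f"degree of 'shutdown warranted': {shutdown_degree:.2f}")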
Targeting Net Zero Energy at Marine Corps Base Hawaii, Kaneohe Bay: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burman, K.; Kandt, A.; Lisell, L.
2012-05-01
This paper summarizes the results of an NREL assessment of Marine Corps Base Hawaii (MCBH), Kaneohe Bay to appraise the potential of achieving net zero energy status through energy efficiency, renewable energy, and hydrogen vehicle integration. In 2008, the U.S. Department of Defense's U.S. Pacific Command partnered with the U.S. Department of Energy's (DOE's) National Renewable Energy Laboratory (NREL) to assess opportunities for increasing energy security through renewable energy and energy efficiency at Hawaii military installations. DOE selected Marine Corps Base Hawaii (MCBH), Kaneohe Bay, to receive technical support for net zero energy assessment and planning funded through the Hawaii Clean Energy Initiative (HCEI). NREL performed a comprehensive assessment to appraise the potential of MCBH Kaneohe Bay to achieve net zero energy status through energy efficiency, renewable energy, and hydrogen vehicle integration. This paper summarizes the results of the assessment and provides energy recommendations. The analysis shows that MCBH Kaneohe Bay has the potential to make significant progress toward becoming a net zero installation. Wind, solar photovoltaics, solar hot water, and hydrogen production were assessed, as well as energy efficiency technologies. Deploying wind turbines is the most cost-effective energy production measure. If the identified energy projects and savings measures are implemented, the base will achieve a 96% site Btu reduction and a 99% source Btu reduction. Using excess wind and solar energy to produce hydrogen for a fleet and fuel cells could significantly reduce energy use and potentially bring MCBH Kaneohe Bay to net zero. Further analysis with an environmental impact and interconnection study will need to be completed. By achieving net zero status, the base will set an example for other military installations, provide environmental benefits, reduce costs, increase energy security, and exceed its energy goals and mandates.
Zhang, Hong-guang; Lu, Jian-gang
2016-02-01
To overcome the problems of significant differences among samples and nonlinearity between the property and spectra of samples in spectral quantitative analysis, a local regression algorithm is proposed in this paper. In this algorithm, the net analyte signal (NAS) method is first used to obtain the net analyte signal of the calibration samples and the unknown samples; the Euclidean distance between the net analyte signal of each unknown sample and the net analyte signals of the calibration samples is then calculated and used as a similarity index. According to this similarity index, a local calibration set is selected individually for each unknown sample. Finally, a local PLS regression model is built on each local calibration set for each unknown sample. The proposed method was applied to a set of near-infrared spectra of meat samples. The results demonstrate that the prediction precision and model complexity of the proposed method are superior to those of the global PLS regression method and a conventional local regression algorithm based on spectral Euclidean distance.
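The selection-and-modelling loop described above can be sketched in a few lines of Python. This is an illustrative reconstruction, not the authors' code: the NAS computation shown (projection onto the orthogonal complement of an interferent subspace) is one common definition, and the neighbourhood size k and the number of PLS components are stand-in tuning parameters.

```python
# Sketch of a NAS-based local PLS regression (illustrative, not the paper's code).
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def net_analyte_signal(X, X_interferents):
    """Project spectra onto the orthogonal complement of the interferent space
    (one common way to compute a net analyte signal; the paper's exact
    procedure may differ)."""
    Z = X_interferents.T                              # columns span the interferent space
    P = np.eye(X.shape[1]) - Z @ np.linalg.pinv(Z)    # orthogonal projector
    return X @ P

def local_pls_predict(X_cal, y_cal, X_unknown, X_interferents, k=30, n_components=5):
    nas_cal = net_analyte_signal(X_cal, X_interferents)
    nas_unk = net_analyte_signal(X_unknown, X_interferents)
    y_pred = np.empty(len(X_unknown))
    for i, nas in enumerate(nas_unk):
        d = np.linalg.norm(nas_cal - nas, axis=1)     # similarity index (NAS distance)
        idx = np.argsort(d)[:k]                       # local calibration set
        pls = PLSRegression(n_components=n_components)
        pls.fit(X_cal[idx], y_cal[idx])               # local model for this sample only
        y_pred[i] = pls.predict(X_unknown[i:i + 1]).ravel()[0]
    return y_pred
```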
Seligman, Hilary K; Fernandez, Alicia; Stern, Rachel J; Weech-Maldonado, Robert; Quan, Judy; Jacobs, Elizabeth A
2012-09-01
The Consumer Assessment of Healthcare Providers and Systems Cultural Competency Item Set assesses patient perceptions of aspects of the cultural competence of their health care. To determine characteristics of patients who identify the care they receive as less culturally competent. Cross-sectional survey consisting of face-to-face interviews. Safety-net population of patients with type 2 diabetes (n=600) receiving ongoing primary care. Participants completed the Consumer Assessment of Healthcare Providers and Systems Cultural Competency and answered questions about their race/ethnicity, sex, age, education, health status, depressive symptoms, insurance coverage, English proficiency, duration of relationship with primary care provider, and comorbidities. In adjusted models, depressive symptoms were significantly associated with poor cultural competency in the Doctor Communication--Positive Behaviors domain [odds ratio (OR) 1.73, 95% confidence interval, 1.11-2.69]. African Americans were less likely than whites to report poor cultural competence in the Doctor Communication--Positive Behaviors domain (OR 0.52, 95% CI, 0.28-0.97). Participants who reported a longer relationship (≥ 3 y) with their primary care provider were less likely to report poor cultural competence in the Doctor Communication--Health Promotion (OR 0.35, 95% CI, 0.21-0.60) and Trust domains (OR 0.4, 95% CI, 0.24-0.67), whereas participants with lower educational attainment were less likely to report poor cultural competence in the Trust domain (OR 0.51, 95% CI, 0.30-0.86). Overall, however, sociodemographic and clinical differences in reports of poor cultural competence were insignificant or inconsistent across the various domains of cultural competence examined. Cultural competence interventions in safety-net settings should be implemented across populations, rather than being narrowly focused on specific sociodemographic or clinical groups.
Swancar, Amy; Lee, T.M.; O'Hare, T. M.
2000-01-01
Lake Starr, a 134-acre seepage lake of multiple-sinkhole origin on the Lake Wales Ridge of central Florida, was the subject of a detailed water-budget study from August 1996 through July 1998. The study monitored the effects of hydrogeologic setting, climate, and ground-water pumping on the water budget and lake stage. The hydrogeologic setting of the Lake Starr basin differs markedly on the two sides of the lake. Ground water from the surficial aquifer system flows into the lake from the northwest side of the basin, and lake water leaks out to the surficial aquifer system on the southeast side of the basin. Lake Starr and the surrounding surficial aquifer system recharge the underlying Upper Floridan aquifer. The rate of recharge to the Upper Floridan aquifer is determined by the integrity of the intermediate confining unit and by the downward head gradient between the two aquifers. On the inflow side of the lake, the intermediate confining unit is more continuous, allowing ground water from the surficial aquifer system to flow laterally into the lake. Beneath the lake and on the southeast side of the basin, breaches in the intermediate confining unit enhance downward flow to the Upper Floridan aquifer, so that water flows both downward and laterally away from the lake through the ground-water flow system in these areas. An accurate water budget, including evaporation measured by the energy-budget method, was used to calculate net ground-water flow to the lake, and to do a preliminary analysis of the relation of net ground-water fluxes to other variables. Water budgets constructed over different timeframes provided insight on processes that affect ground-water interactions with Lake Starr. Weekly estimates of net ground-water flow provided evidence for the occurrence of transient inflows from the nearshore basin, as well as the short-term effects of head in the Upper Floridan aquifer on ground-water exchange with the lake. Monthly water budgets showed the effects of wet and dry seasons, and provided evidence for ground-water inflow generated from the upper basin. Annual water budgets showed how differences in timing of rainfall and pumping stresses affected lake stage and lake ground-water interactions. Lake evaporation measurements made during the study suggest that, on average, annual lake evaporation exceeds annual precipitation in the basin. Rainfall was close to the long-term average of 51.99 inches per year for the 2 years of the study (50.68 and 54.04 inches, respectively). Lake evaporation was 57.08 and 55.88 inches per year for the same 2 years, making net precipitation (rainfall minus evaporation) negative during both years. If net precipitation to seepage lakes in this area is negative over the long term, then the ability to generate net ground-water inflow from the surrounding basin plays an important role in sustaining lake levels. Evaporation exceeded rainfall by a similar amount for both years of the study, but net ground-water flow differed substantially between the 2 years. The basin contributed net ground-water inflow to the lake in both years; however, net ground-water inflow was not sufficient to make up for the negative net precipitation during the first year, and the lake fell 4.9 inches. During the second year, net ground-water inflow exceeded the difference between evaporation and rainfall and the lake rose by 12.7 inches.
The additional net ground-water inflow in the second year was due to both an increase in the amount of gross ground-water inflow and a decrease in lake leakage (ground-water outflow). Ground-water inflow was greater during the second year because more rain fell during the winter, when evaporative losses were low, resulting in greater ground-water recharge. However, decreased lake leakage during this year was probably at least as important as increased ground-water inflow in explaining the difference in net ground-water flow to the lake between the 2 years. Estimates of lake leakage
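As a rough consistency check of the figures quoted above, net ground-water flow can be backed out as the residual of the lake-stage change and net precipitation. This is a simplification that ignores any surface flows and lumps all remaining terms into net ground-water exchange; the numbers are those given in the abstract.

```latex
% Residual form of the lake water budget (inches of lake stage):
\Delta S = (P - E) + G_{\mathrm{net}}
  \;\Rightarrow\;
  G_{\mathrm{net}} = \Delta S - (P - E)
% Year 1:  G_net = -4.9  - (50.68 - 57.08) \approx +1.5  in  (net ground-water inflow)
% Year 2:  G_net = +12.7 - (54.04 - 55.88) \approx +14.5 in  (net ground-water inflow)
```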
Optimising the performance of an outpatient setting.
Sendi, Pedram; Al, Maiwenn J; Battegay, Manuel
2004-01-24
An outpatient setting typically includes experienced and novice resident physicians who are supervised by senior staff physicians. The performance of this kind of outpatient setting, for a given mix of experienced and novice resident physicians, is determined by the number of senior staff physicians available for supervision. The optimum mix of human resources may be determined using discrete-event simulation. An outpatient setting represents a system where concurrency and resource sharing are important. These concepts can be modelled by means of timed Coloured Petri Nets (CPNs), a discrete-event simulation formalism. We determined the optimum mix of resources (i.e. the number of senior staff physicians needed for a given number of experienced and novice resident physicians) to guarantee efficient overall system performance. In an outpatient setting with 10 resident physicians, two staff physicians are required to guarantee a minimum level of system performance (42-52 patients seen per 5-hour period). However, with 3 senior staff physicians, system performance can be improved substantially (49-56 patients per 5-hour period). An additional fourth staff physician does not substantially enhance system performance (50-57 patients per 5-hour period). Coloured Petri Nets provide a flexible environment in which to simulate an outpatient setting and assess the impact of any staffing changes on overall system performance, to promote informed resource allocation decisions.
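A discrete-event model of this kind can also be sketched quickly in Python with SimPy. The sketch below is not the authors' Coloured Petri Net model; the arrival rate, consultation times and supervision times are illustrative assumptions only.

```python
# Minimal discrete-event sketch of an outpatient clinic where residents must
# obtain a senior staff physician for supervision (SimPy, not CPN; all
# parameters are made up).
import random
import simpy

def consultation(env, name, staff):
    yield env.timeout(random.uniform(10, 20))           # resident consultation (min)
    with staff.request() as req:                        # wait for a supervising physician
        yield req
        yield env.timeout(random.uniform(3, 8))         # supervision/review time (min)
    print(f"{name} done at t={env.now:.1f} min")

def arrivals(env, staff):
    for i in range(60):
        env.process(consultation(env, f"patient {i}", staff))
        yield env.timeout(random.expovariate(1 / 5.0))  # roughly one arrival per 5 min

env = simpy.Environment()
senior_staff = simpy.Resource(env, capacity=3)          # try 2, 3 or 4 supervisors
env.process(arrivals(env, senior_staff))
env.run(until=300)                                      # one 5-hour clinic session
```

Re-running the model with `capacity=2`, `3` or `4` mimics the staffing comparison reported above, with throughput read off from how many patients complete within the 5-hour session.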
Genetic and Environmental Pathways in Type 1 Diabetes Complications
2010-09-26
setting? In the event that the project identifies a set of strongly predictive biomarkers an appropriate next step would be to approach TrialNet (see...http://www.diabetestrialnet.org). The TrialNet organization is a multi center study with the goal of identifying subjects for T1D prevention and...intervention trials. Children’s Hospital of Pittsburgh (CHP) is already acting as a clinical center for the TrialNet natural history study (Mahon et al
Remote sensing investigations of wetland biomass and productivity for global biosystems research
NASA Technical Reports Server (NTRS)
Harkisky, M.; Klemas, V.
1983-01-01
Monitoring biomass of wetlands ecosystems can provide information on net primary production and on the chemical and physical status of wetland soils relative to anaerobic microbial transformation of key elements. Multispectral remote sensing techniques successfully estimated macrophytic biomass in wetlands systems. Regression models developed from ground spectral data for predicting Spartina alterniflora biomass over an entire growing season include seasonal variations in biomass density and illumination intensity. An independent set of biomass and spectral data was collected and the standing crop biomass and net primary productivity were estimated. The improved spatial, radiometric and spectral resolution of the LANDSAT-4 Thematic Mapper over the LANDSAT MSS can greatly enhance multispectral techniques for estimating wetlands biomass over large areas. These techniques can provide the biomass data necessary for global ecology studies.
A Holistic Approach to Scoring in Complex Mobile Learning Scenarios
ERIC Educational Resources Information Center
Gebbe, Marcel; Teine, Matthias; Beutner, Marc
2016-01-01
Interactive dialogues are key elements for designing authentic and motivating learning situations, and in combination with learning analysis they provide educators and users with the opportunity to track information related not only to professional competences but to mind-sets as well. This paper offers exemplary insights into the project NetEnquiry that is…
NASA Technical Reports Server (NTRS)
Decker, Arthur J.
2001-01-01
Artificial neural networks have been used for a number of years to process holography-generated characteristic patterns of vibrating structures. This technology depends critically on the selection and conditioning of the training sets. A scaling operation called folding is discussed for conditioning training sets optimally for training feed-forward neural networks to process characteristic fringe patterns. Folding allows feed-forward nets to be trained easily to detect damage-induced vibration-displacement-distribution changes as small as 10 nm. A specific aerospace application of neural-net processing of characteristic patterns is presented to motivate the conditioning and optimization effort.
NASA Astrophysics Data System (ADS)
Liao, Wei-Cheng; Hong, Mingyi; Liu, Ya-Feng; Luo, Zhi-Quan
2014-08-01
In a densely deployed heterogeneous network (HetNet), the number of pico/micro base stations (BSs) can be comparable with the number of users. To reduce the operational overhead of the HetNet, proper identification of the set of serving BSs becomes an important design issue. In this work, we show that by jointly optimizing the transceivers and determining the active set of BSs, high system resource utilization can be achieved with only a small number of BSs. In particular, we provide formulations and efficient algorithms for this joint optimization problem under the following two common design criteria: i) minimization of the total power consumption at the BSs, and ii) maximization of the system spectrum efficiency. In both cases, we introduce a nonsmooth regularizer to facilitate the activation of the most appropriate BSs. We illustrate the efficiency and the efficacy of the proposed algorithms via extensive numerical simulations.
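One common way to write the power-minimization variant with a BS-selecting regularizer is sketched below. The notation is ours and the paper's exact formulation may differ; the point is that the nonsmooth group term prices each base station as a unit, so unneeded BSs are driven to zero and can be switched off.

```latex
% Joint beamforming and BS activation via a group-sparsity regularizer (sketch):
\min_{\{\mathbf{v}_{qi}\}} \;
  \sum_{q \in \mathcal{Q}} \sum_{i} \lVert \mathbf{v}_{qi} \rVert_2^2
  \; + \; \lambda \sum_{q \in \mathcal{Q}}
      \Big( \sum_{i} \lVert \mathbf{v}_{qi} \rVert_2^2 \Big)^{1/2}
\quad \text{s.t.} \quad \mathrm{SINR}_i \ge \gamma_i \;\; \forall i
% v_qi : beamformer from BS q to user i; when the whole group for BS q is
% forced to zero by the nonsmooth term, that BS is removed from the active set.
```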
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gunter, Dan; Lee, Jason; Stoufer, Martin
2003-03-28
The NetLogger Toolkit is designed to monitor, under actual operating conditions, the behavior of all the elements of the application-to-application communication path in order to determine exactly where time is spent within a complex system. Using NetLogger, distributed application components are modified to produce timestamped logs of "interesting" events at all the critical points of the distributed system. Events from each component are correlated, which allows one to characterize the performance of all aspects of the system and network in detail. The NetLogger Toolkit itself consists of four components: an API and library of functions to simplify the generation of application-level event logs, a set of tools for collecting and sorting log files, an event archive system, and a tool for visualization and analysis of the log files. In order to instrument an application to produce event logs, the application developer inserts calls to the NetLogger API at all the critical points in the code, then links the application with the NetLogger library. All the tools in the NetLogger Toolkit share a common log format, and assume the existence of accurate and synchronized system clocks. NetLogger messages can be logged using an easy-to-read text-based format based on the IETF-proposed ULM format, or a binary format that can still be used through the same API but that is several times faster and smaller, with performance comparable to or better than binary message formats such as MPI, XDR, SDDF-Binary, and PBIO. The NetLogger binary format is both highly efficient and self-describing, and thus optimized for the dynamic message construction and parsing of application instrumentation. NetLogger includes an "activation" API that allows NetLogger logging to be turned on, off, or modified by changing an external file. This is useful for activating logging in daemons/services (e.g. a GridFTP server). The NetLogger reliability API provides the ability to specify backup logging locations and to periodically try to reconnect a broken TCP pipe. A typical use for this is to store data on local disk while the network is down. An event archiver can log one or more incoming NetLogger streams to a local disk file (netlogd) or to a mySQL database (netarchd). We have found exploratory, visual analysis of the log event data to be the most useful means of determining the causes of performance anomalies. The NetLogger visualization tool, nlv, has been developed to provide a flexible and interactive graphical representation of system-level and application-level events.
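The instrumentation idea is easy to illustrate with a generic sketch. This is not the NetLogger API; it only shows the kind of timestamped, ULM-style "key=value" text event that critical points of an application would emit, with field and function names made up for the example.

```python
# Generic illustration of timestamped key=value event logging at critical
# points of an application (NOT the NetLogger API; names are illustrative).
import socket
import sys
from datetime import datetime, timezone

def log_event(event, stream=sys.stdout, **fields):
    """Write one timestamped event line; downstream tools can correlate events
    from different hosts as long as the system clocks are synchronized."""
    parts = [
        f"DATE={datetime.now(timezone.utc).strftime('%Y%m%d%H%M%S.%f')}",
        f"HOST={socket.gethostname()}",
        f"PROG={sys.argv[0]}",
        f"EVNT={event}",
    ]
    parts += [f"{k}={v}" for k, v in fields.items()]
    stream.write(" ".join(parts) + "\n")

# Bracket an interesting operation with start/end events:
log_event("transfer.start", FILE="data.bin", SIZE=1048576)
# ... perform the transfer ...
log_event("transfer.end", FILE="data.bin")
```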
History, Principles, and Policies of Observation Medicine.
Ross, Michael A; Granovsky, Michael
2017-08-01
The history of observation medicine has paralleled the rise of emergency medicine over the past 50 years to meet the needs of patients, emergency departments, hospitals, and the US health care system. Just as emergency departments are the safety net of the health system, observation units are the safety net of emergency departments. The growth of observation medicine has been driven by innovations in health care, an ongoing shift of patients from inpatient to outpatient settings, and changes in health policy. These units have been shown to provide better outcomes than traditional care for selected patients. Copyright © 2017 Elsevier Inc. All rights reserved.
Coffin, Phillip O.; Behar, Emily; Rowe, Christopher; Santos, Glenn-Milo; Coffa, Diana; Bald, Matthew; Vittinghoff, Eric
2018-01-01
Background Unintentional overdose involving opioid analgesics is a leading cause of injury-related death in the United States. Objectives To evaluate the feasibility and impact of implementing naloxone prescription to patients prescribed opioids for chronic pain. Design 2-year non-randomized intervention study. Setting 6 safety net primary care clinics in San Francisco. Participants 1985 adults receiving long-term opioids for pain. Intervention Providers and clinic staff were trained and supported in naloxone prescribing. Measurements Outcomes were proportion of patients prescribed naloxone, opioid-related emergency department (ED) visits, and prescribed opioid dose based on chart review. Results 38.2% of 1,985 patients on long-term opioids were prescribed naloxone. Patients on higher doses of opioids and with a past 12-month opioid-related emergency department (ED) visit were independently more likely to be prescribed naloxone. Patients who received a naloxone prescription had 47% fewer opioid-related ED visits per month six months after the receipt of the prescription (IRR=0.53, 95%CI=0.34–0.83, P=0.005) and 63% fewer visits after one year (IRR=0.37, 95%CI=0.22–0.64, P<0.001), compared to patients who did not receive naloxone. There was no net change over time in opioid dose among those who received naloxone compared to those who did not (IRR 1.03, 95% CI 0.91–1.27, P = 0.61). Limitations Results are observational and may not be generalizable beyond safety net settings. Conclusion Naloxone can be co-prescribed to primary care patients prescribed opioids for pain. When advised to offer naloxone to all patients on opioids, providers may prioritize those with established risk factors. Providing naloxone in primary care settings may have ancillary benefits such as reducing opioid-related adverse events. Funding Source National Institutes of Health grant R21DA036776 PMID:27366987
NASA Net Zero Energy Buildings Roadmap
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pless, S.; Scheib, J.; Torcellini, P.
In preparation for the time-phased net zero energy requirement for new federal buildings starting in 2020, set forth in Executive Order 13514, NASA requested that the National Renewable Energy Laboratory (NREL) develop a roadmap for NASA's compliance. NASA detailed a Statement of Work that requested information on strategic, organizational, and tactical aspects of net zero energy buildings. In response, this document presents a high-level approach to net zero energy planning, design, construction, and operations, based on NREL's first-hand experience procuring net zero energy construction, and on NREL and other industry research on net zero energy feasibility. The strategic approach to net zero energy starts with an interpretation of the executive order language relating to net zero energy. Specifically, this roadmap defines a net zero energy acquisition process as one that sets an aggressive energy use intensity goal for the building in project planning, meets the reduced demand goal through energy efficiency strategies and technologies, and then adds renewable energy in a prioritized manner, using building-associated, emission-free sources first, to offset the annual energy use required at the building; the net zero energy process extends through the life of the building, requiring a balance of energy use and production in each calendar year.
BOREAS TGB-1/TGB-3 NEE Data over the NSA Fen
NASA Technical Reports Server (NTRS)
Bellisario, Lianne; Hall, Forrest G. (Editor); Conrad, Sara K. (Editor); Moore, Tim R.
2000-01-01
The BOReal Ecosystem-Atmosphere Study Trace Gas Biogeochemistry (BOREAS TGB-1) and TGB-3 teams collected several data sets that contributed to understanding the measured trace gas fluxes over sites in the Northern Study Area (NSA). This data set contains Net Ecosystem Exchange of CO2 (NEE) measurements collected with chambers at the NSA fen in 1994 and 1996. Gas samples were extracted approximately every 7 days from chambers and analyzed at the NSA lab facility. The data are provided in tabular ASCII files.
Hu, Jialu; Kehr, Birte; Reinert, Knut
2014-02-15
Owing to recent advancements in high-throughput technologies, protein-protein interaction networks of more and more species become available in public databases. The question of how to identify functionally conserved proteins across species attracts a lot of attention in computational biology. Network alignments provide a systematic way to solve this problem. However, most existing alignment tools encounter limitations in tackling this problem. Therefore, the demand for faster and more efficient alignment tools is growing. We present a fast and accurate algorithm, NetCoffee, which finds a global alignment of multiple protein-protein interaction networks. NetCoffee searches for a global alignment by maximizing a target function using simulated annealing on a set of weighted bipartite graphs that are constructed using a triplet approach similar to T-Coffee. To assess its performance, NetCoffee was applied to four real datasets. Our results suggest that NetCoffee remedies several limitations of previous algorithms, outperforms all existing alignment tools in terms of speed and nevertheless identifies biologically meaningful alignments. The source code and data are freely available for download under the GNU GPL v3 license at https://code.google.com/p/netcoffee/.
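The search strategy is standard simulated annealing over candidate alignments. The skeleton below is a generic sketch of that kind of score maximization, not NetCoffee's implementation; the scoring function and neighbour move are placeholders supplied by the caller.

```python
# Bare-bones simulated-annealing skeleton for maximizing an alignment score
# (generic sketch; NetCoffee's actual moves and target function differ).
import math
import random

def simulated_annealing(initial, score, neighbour, t0=1.0, t_min=1e-4, alpha=0.995):
    current, best = initial, initial
    t = t0
    while t > t_min:
        candidate = neighbour(current)                 # e.g. swap two matched protein pairs
        delta = score(candidate) - score(current)
        if delta >= 0 or random.random() < math.exp(delta / t):
            current = candidate                        # accept improvements, sometimes worsenings
            if score(current) > score(best):
                best = current
        t *= alpha                                     # geometric cooling schedule
    return best
```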
Elastic SCAD as a novel penalization method for SVM classification tasks in high-dimensional data.
Becker, Natalia; Toedt, Grischa; Lichter, Peter; Benner, Axel
2011-05-09
Classification and variable selection play an important role in knowledge discovery in high-dimensional data. Although Support Vector Machine (SVM) algorithms are among the most powerful classification and prediction methods with a wide range of scientific applications, the SVM does not include automatic feature selection and therefore a number of feature selection procedures have been developed. Regularisation approaches extend SVM to a feature selection method in a flexible way using penalty functions like LASSO, SCAD and Elastic Net. We propose a novel penalty function for SVM classification tasks, Elastic SCAD, a combination of SCAD and ridge penalties which overcomes the limitations of each penalty alone. Since SVM models are extremely sensitive to the choice of tuning parameters, we adopted an interval search algorithm, which in comparison to a fixed grid search finds rapidly and more precisely a global optimal solution. Feature selection methods with combined penalties (Elastic Net and Elastic SCAD SVMs) are more robust to a change of the model complexity than methods using single penalties. Our simulation study showed that Elastic SCAD SVM outperformed LASSO (L1) and SCAD SVMs. Moreover, Elastic SCAD SVM provided sparser classifiers in terms of median number of features selected than Elastic Net SVM and often better predicted than Elastic Net in terms of misclassification error. Finally, we applied the penalization methods described above on four publicly available breast cancer data sets. Elastic SCAD SVM was the only method providing robust classifiers in sparse and non-sparse situations. The proposed Elastic SCAD SVM algorithm provides the advantages of the SCAD penalty and at the same time avoids sparsity limitations for non-sparse data. We were first to demonstrate that the integration of the interval search algorithm and penalized SVM classification techniques provides fast solutions on the optimization of tuning parameters. The penalized SVM classification algorithms as well as fixed grid and interval search for finding appropriate tuning parameters were implemented in our freely available R package 'penalizedSVM'. We conclude that the Elastic SCAD SVM is a flexible and robust tool for classification and feature selection tasks for high-dimensional data such as microarray data sets.
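The combined penalty itself is simple to state: the SCAD part (Fan and Li's penalty) encourages sparsity while the added ridge term handles correlated, non-sparse feature sets. The authors' implementation is the R package 'penalizedSVM'; the NumPy sketch below only evaluates the penalty for illustration, with lambda1, lambda2 and a as tuning parameters.

```python
# Sketch of the Elastic SCAD penalty (SCAD + ridge); illustration only.
import numpy as np

def scad(beta, lam, a=3.7):
    """Elementwise SCAD penalty of Fan & Li (2001)."""
    b = np.abs(beta)
    return np.where(
        b <= lam,
        lam * b,
        np.where(
            b <= a * lam,
            (2 * a * lam * b - b**2 - lam**2) / (2 * (a - 1)),
            lam**2 * (a + 1) / 2,
        ),
    )

def elastic_scad(beta, lam1, lam2, a=3.7):
    """SCAD part encourages sparsity; the ridge part keeps the penalty
    well-behaved when many correlated features carry signal."""
    return scad(beta, lam1, a).sum() + lam2 * np.sum(beta**2)
```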
Elastic SCAD as a novel penalization method for SVM classification tasks in high-dimensional data
2011-01-01
Background Classification and variable selection play an important role in knowledge discovery in high-dimensional data. Although Support Vector Machine (SVM) algorithms are among the most powerful classification and prediction methods with a wide range of scientific applications, the SVM does not include automatic feature selection and therefore a number of feature selection procedures have been developed. Regularisation approaches extend SVM to a feature selection method in a flexible way using penalty functions like LASSO, SCAD and Elastic Net. We propose a novel penalty function for SVM classification tasks, Elastic SCAD, a combination of SCAD and ridge penalties which overcomes the limitations of each penalty alone. Since SVM models are extremely sensitive to the choice of tuning parameters, we adopted an interval search algorithm, which in comparison to a fixed grid search finds rapidly and more precisely a global optimal solution. Results Feature selection methods with combined penalties (Elastic Net and Elastic SCAD SVMs) are more robust to a change of the model complexity than methods using single penalties. Our simulation study showed that Elastic SCAD SVM outperformed LASSO (L1) and SCAD SVMs. Moreover, Elastic SCAD SVM provided sparser classifiers in terms of median number of features selected than Elastic Net SVM and often better predicted than Elastic Net in terms of misclassification error. Finally, we applied the penalization methods described above on four publicly available breast cancer data sets. Elastic SCAD SVM was the only method providing robust classifiers in sparse and non-sparse situations. Conclusions The proposed Elastic SCAD SVM algorithm provides the advantages of the SCAD penalty and at the same time avoids sparsity limitations for non-sparse data. We were first to demonstrate that the integration of the interval search algorithm and penalized SVM classification techniques provides fast solutions on the optimization of tuning parameters. The penalized SVM classification algorithms as well as fixed grid and interval search for finding appropriate tuning parameters were implemented in our freely available R package 'penalizedSVM'. We conclude that the Elastic SCAD SVM is a flexible and robust tool for classification and feature selection tasks for high-dimensional data such as microarray data sets. PMID:21554689
IDIS Small Bodies and Dust Node
NASA Astrophysics Data System (ADS)
de Sanctis, M. C.; Capria, M. T.; Carraro, F.; Fonte, S.; Giacomini, L.; Turrini, D.
2009-04-01
The EuroPlaNet information service provides access to lists of researchers, laboratories and data archives relevant to many aspects of planetary and space physics. Information can be accessed through the EuroPlaNet website or, for advanced searches, via web-services available at the different thematic nodes. The goal of IDIS is to provide easy-to-use access to resources like people, laboratories, modeling activities and data archives related to planetary sciences. The development of IDIS is an international effort started under the European Commission's 6th Framework Programme, which will expand its capabilities during the 7th Framework Programme, as part of the Capacities Specific Programme/Research Infrastructures. IDIS is complemented by a set of other EuroPlaNet web-services maintained under the responsibility of separate institutions. Each activity maintains its own web-portal with cross-links pointing to the other elements of EuroPlaNet. General access is provided via the EuroPlaNet Homepage. IDIS is not a repository of original data but rather supports the access to various data sources. The final goal of IDIS is to provide Virtual Observatory tools for the access to data from laboratory measurements and ground- and space-based observations to modeling results, allowing the combination of data sources as divergent as feasible. IDIS is built around four scientific nodes located in different European countries. Each node deals with a subset of the disciplines related to planetary sciences and, working in cooperation with international experts in these fields, provides a wealth of information to the international planetary science community. The EuroPlaNet IDIS thematic node "Small Bodies and Dust Node" is hosted by the Istituto di Fisica dello Spazio Interplanetario and is established in close cooperation with the Istituto di Astrofisica Spaziale. Both these institutes are part of the Istituto Nazionale di Astrofisica (INAF). The IDIS Small Bodies and Dust Node aims at becoming a focus point in the fields of Solar System's minor bodies and interplanetary dust by providing the community with a central, user-friendly resource and service inventory and contact point. The main aim of the Small Bodies and Dust Node will be to: • support collaborative work in the field of Small Bodies and Dust • provide information about databases and scientific tools in this field • establish a scientific information management system • define and develop Science Cases regarding IDIS
Practical Approaches for Achieving Integrated Behavioral Health Care in Primary Care Settings
Ratzliff, Anna; Phillips, Kathryn E.; Sugarman, Jonathan R.; Unützer, Jürgen; Wagner, Edward H.
2016-01-01
Behavioral health problems are common, yet most patients do not receive effective treatment in primary care settings. Despite availability of effective models for integrating behavioral health care in primary care settings, uptake has been slow. The Behavioral Health Integration Implementation Guide provides practical guidance for adapting and implementing effective integrated behavioral health care into patient-centered medical homes. The authors gathered input from stakeholders involved in behavioral health integration efforts: safety net providers, subject matter experts in primary care and behavioral health, a behavioral health patient and peer specialist, and state and national policy makers. Stakeholder input informed development of the Behavioral Health Integration Implementation Guide and the GROW Pathway Planning Worksheet. The Behavioral Health Integration Implementation Guide is model neutral and allows organizations to take meaningful steps toward providing integrated care that achieves access and accountability. PMID:26698163
Practical Approaches for Achieving Integrated Behavioral Health Care in Primary Care Settings.
Ratzliff, Anna; Phillips, Kathryn E; Sugarman, Jonathan R; Unützer, Jürgen; Wagner, Edward H
Behavioral health problems are common, yet most patients do not receive effective treatment in primary care settings. Despite availability of effective models for integrating behavioral health care in primary care settings, uptake has been slow. The Behavioral Health Integration Implementation Guide provides practical guidance for adapting and implementing effective integrated behavioral health care into patient-centered medical homes. The authors gathered input from stakeholders involved in behavioral health integration efforts: safety net providers, subject matter experts in primary care and behavioral health, a behavioral health patient and peer specialist, and state and national policy makers. Stakeholder input informed development of the Behavioral Health Integration Implementation Guide and the GROW Pathway Planning Worksheet. The Behavioral Health Integration Implementation Guide is model neutral and allows organizations to take meaningful steps toward providing integrated care that achieves access and accountability.
How 3 rural safety net clinics integrate care for patients: a qualitative case study.
Derrett, Sarah; Gunter, Kathryn E; Nocon, Robert S; Quinn, Michael T; Coleman, Katie; Daniel, Donna M; Wagner, Edward H; Chin, Marshall H
2014-11-01
Integrated care focuses on care coordination and patient centeredness. Integrated care supports continuity of care over time, with care that is coordinated within and between settings and is responsive to patients' needs. Currently, little is known about care integration for rural patients. To examine challenges to care integration in rural safety net clinics and strategies to address these challenges. Qualitative case study. Thirty-six providers and staff from 3 rural clinics in the Safety Net Medical Home Initiative. Interviews were analyzed using the framework method with themes organized within 3 constructs: Team Coordination and Empanelment, External Coordination and Partnerships, and Patient-centered and Community-centered Care. Participants described challenges common to safety net clinics, including limited access to specialists for Medicaid and uninsured patients, difficulty communicating with external providers, and payment models with limited support for care integration activities. Rurality compounded these challenges. Respondents reported benefits of empanelment and team-based care, and leveraged local resources to support care for patients. Rural clinics diversified roles within teams, shared responsibility for patient care, and colocated providers, as strategies to support care integration. Care integration was supported by 2 fundamental changes to organize and deliver care to patients-(1) empanelment with a designated group of patients being cared for by a provider; and (2) a multidisciplinary team able to address rural issues. New funding and organizational initiatives of the Affordable Care Act may help to further improve care integration, although additional solutions may be necessary to address particular needs of rural communities.
Neural net applied to anthropological material: a methodical study on the human nasal skeleton.
Prescher, Andreas; Meyers, Anne; Gerf von Keyserlingk, Diedrich
2005-07-01
A new information processing method, an artificial neural net, was applied to characterise the variability of anthropological features of the human nasal skeleton. The aim was to find different types of nasal skeletons. A neural net with 15 × 15 nodes was trained with 17 standard anthropological parameters taken from 184 skulls of the Aachen collection. The trained neural net delivers its classification in a two-dimensional map. Different types of noses were locally separated within the map. Rare and frequent types may be distinguished after one passage of the complete collection through the net. Statistical descriptive analysis, hierarchical cluster analysis, and discriminant analysis were applied to the same data set. These parallel applications allowed comparison of the new approach with the more traditional ones. In general the classification by the neural net corresponds with those of cluster analysis and discriminant analysis. However, it goes beyond these classifications because of the possibility of differentiating the types in multi-dimensional dependencies. Furthermore, places in the map are kept blank for intermediate forms, which may be theoretically expected but were not included in the training set. In conclusion, the application of a neural network is a suitable method for investigating large collections of biological material. The resulting classification may be helpful in anatomy and anthropology as well as in forensic medicine. It may be used to characterise the peculiarity of a whole set as well as to find particular cases within the set.
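The description (a two-dimensional grid of nodes that groups similar cases and leaves blank nodes for unobserved intermediates) matches a Kohonen self-organizing map, although the abstract does not name the implementation. A comparable analysis can be sketched with the MiniSom library; the random input matrix below stands in for the 17 standardized skull measurements and is not the Aachen data.

```python
# Sketch of a 15 x 15 self-organizing map analysis with MiniSom (illustrative
# data; the paper does not state which SOM implementation was used).
import numpy as np
from minisom import MiniSom

X = np.random.rand(184, 17)                 # 184 skulls x 17 standardized parameters
som = MiniSom(15, 15, 17, sigma=2.0, learning_rate=0.5, random_seed=0)
som.random_weights_init(X)
som.train_random(X, num_iteration=10000)

# Each skull maps to its best-matching node; nearby nodes host similar
# nasal-skeleton types, and unused nodes correspond to unseen intermediates.
positions = np.array([som.winner(x) for x in X])
print(np.unique(positions, axis=0).shape[0], "occupied map nodes")
```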
Foster, N.R.; Kennedy, G.W.; Munawar, M.; Edsall, T.; Leach, J.
1995-01-01
In August 1987, the Michigan Department of Natural Resources (MDNR), with the help and co-sponsorship of Walleyes for Iosco County, constructed Tawas artificial reef to improve recreational fishing in Tawas Bay. Post-construction assessment in October 1987 by the MDNR found twice as many adult lake trout in a gill net set on the reef as in a similar net set off the reef, indicating that lake trout already had begun to investigate this new habitat. Similar netting efforts in October 1989 caught three times as many adults on the reef as off it, even though the on-reef net was set for less than one third as long a period. Using a remotely operated vehicle (ROV), we detected prespawning aggregations of lake trout on the reef in fall 1989, and MDNR biologists set emergent fry traps on the reef in April-May 1990-1991. These fry traps captured several newly emerged lake trout and lake whitefish fry, demonstrating that eggs of both species had hatched successfully. Gill netting in 1992-1993 by U.S. Fish and Wildlife Service biologists netted large numbers of ripe lake trout in late October and ripe lake whitefish in early to mid-November. The purpose of this paper is to describe the relative quantities of eggs deposited and the spatial patterns of egg deposition by lake trout and lake whitefish at Tawas artificial reef during 1990-1993.
eWaterCycle visualisation. combining the strength of NetCDF and Web Map Service: ncWMS
NASA Astrophysics Data System (ADS)
Hut, R.; van Meersbergen, M.; Drost, N.; Van De Giesen, N.
2016-12-01
As a result of the eWaterCycle global hydrological forecast we have created Cesium-ncWMS, a web application based on ncWMS and Cesium. ncWMS is a server-side application capable of reading any NetCDF file written using the Climate and Forecasting (CF) conventions and making the data available as a Web Map Service (WMS). ncWMS automatically determines the available variables in a file and creates maps colored according to the map data and a user-selected color scale. Cesium is a Javascript 3D virtual globe library. It uses WebGL for rendering, which makes it very fast, and it is capable of displaying a wide variety of data types such as vectors, 3D models, and 2D maps. The forecast results are automatically uploaded to our web server running ncWMS. In turn, the web application can be used to change the settings for color maps and displayed data. The server uses the settings provided by the web application, together with the data in NetCDF, to provide WMS image tiles, time series data and legend graphics to the Cesium-ncWMS web application. The user can simultaneously zoom in to the very high resolution forecast results anywhere in the world, and get time series data for any point on the globe. The Cesium-ncWMS visualisation combines a global overview with locally relevant information in any browser. See the visualisation live at forecast.ewatercycle.org
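Because ncWMS speaks the standard WMS protocol, any client can fetch map tiles with an ordinary GetMap request. The sketch below shows such a request in Python; the host, layer name, style and color-scale parameters are placeholders, not the actual eWaterCycle endpoint.

```python
# Example of a standard WMS GetMap request of the kind an ncWMS server answers
# (placeholder host, layer and styling; not the eWaterCycle service).
import requests

params = {
    "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
    "LAYERS": "discharge",                  # a CF variable exposed by the server
    "STYLES": "default-scalar/rainbow",     # palette selection (server-dependent)
    "CRS": "CRS:84", "BBOX": "-180,-90,180,90",
    "WIDTH": 1024, "HEIGHT": 512,
    "FORMAT": "image/png",
    "TIME": "2016-12-01T00:00:00Z",
}
tile = requests.get("http://example.org/ncWMS/wms", params=params, timeout=30)
with open("tile.png", "wb") as f:
    f.write(tile.content)                   # one colored map tile for the chosen time
```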
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-19
... protect the waterways, waterway users, and vessels from hazards associated with intensive fish sampling... sampling efforts will include the setting of nets throughout this portion of the Chicago Sanitary and Ship Canal. The purpose of this sampling is to provide essential information in connection with efforts to...
Falcaro, Milena; Carpenter, James R
2017-06-01
Population-based net survival by tumour stage at diagnosis is a key measure in cancer surveillance. Unfortunately, data on tumour stage are often missing for a non-negligible proportion of patients and the mechanism giving rise to the missingness is usually anything but completely at random. In this setting, restricting analysis to the subset of complete records gives typically biased results. Multiple imputation is a promising practical approach to the issues raised by the missing data, but its use in conjunction with the Pohar-Perme method for estimating net survival has not been formally evaluated. We performed a resampling study using colorectal cancer population-based registry data to evaluate the ability of multiple imputation, used along with the Pohar-Perme method, to deliver unbiased estimates of stage-specific net survival and recover missing stage information. We created 1000 independent data sets, each containing 5000 patients. Stage data were then made missing at random under two scenarios (30% and 50% missingness). Complete records analysis showed substantial bias and poor confidence interval coverage. Across both scenarios our multiple imputation strategy virtually eliminated the bias and greatly improved confidence interval coverage. In the presence of missing stage data complete records analysis often gives severely biased results. We showed that combining multiple imputation with the Pohar-Perme estimator provides a valid practical approach for the estimation of stage-specific colorectal cancer net survival. As usual, when the percentage of missing data is high the results should be interpreted cautiously and sensitivity analyses are recommended. Copyright © 2017 Elsevier Ltd. All rights reserved.
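The pooling step of this workflow is easy to sketch: after imputing stage m times and estimating stage-specific net survival (for example with the Pohar-Perme estimator, as in the paper) in each completed data set, the m estimates are combined with Rubin's rules. The arrays below are made-up placeholders for the per-imputation estimates and their variances; the authors' analysis would use dedicated imputation and net-survival software.

```python
# Rubin's rules pooling of m per-imputation net survival estimates (sketch;
# the estimates and variances below are fabricated placeholders).
import numpy as np

m = 20
estimates = np.random.normal(0.62, 0.01, size=m)      # e.g. 5-year net survival per imputation
variances = np.full(m, 0.0004)                        # within-imputation variances

q_bar = estimates.mean()                              # pooled point estimate
w_bar = variances.mean()                              # average within-imputation variance
b = estimates.var(ddof=1)                             # between-imputation variance
total_var = w_bar + (1 + 1 / m) * b                   # Rubin's rules total variance
print(f"pooled net survival: {q_bar:.3f} (SE {np.sqrt(total_var):.3f})")
```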
Role of Cost on Failure to Access Prescribed Pharmaceuticals: The Case of Statins.
McRae, Ian; van Gool, Kees; Hall, Jane; Yen, Laurann
2017-10-01
In Australia, as in many other Western countries, patient surveys suggest that the costs of medicines lead patients to defer or avoid filling prescriptions. The Australian Pharmaceutical Benefits Scheme provides approved prescription medicines at subsidised prices with relatively low patient co-payments. The Pharmaceutical Benefits Scheme defines patient co-payment levels per script depending on whether patients are "concessional" (holding prescribed pension or other government concession cards) or "general", and whether they have reached a safety net defined by total out-of-pocket costs for Pharmaceutical Benefits Scheme-approved medicines. The purpose of this study was to explore the impact of costs on adherence to statins in this relatively low-cost environment. Using data from a large-scale survey of older Australians in the state of New South Wales linked to administrative data from the national medical and pharmaceutical insurance schemes, we explore the relationships between adherence to medication regimes for statins and out-of-pocket costs of prescribed pharmaceuticals, income, other health costs, and a wide set of demographic and socio-economic control variables using both descriptive analysis and logistic regressions. Within the general non-safety net group, which has the highest co-payment, those with the lowest incomes have the lowest adherence, suggesting that the general safety net threshold may be set at a level that forms a major barrier to statin adherence. This is reinforced by over 75% of those who were not adherent before reaching the safety net threshold becoming adherent after reaching the safety net with its lower co-payments. The main financial determinant of adherence is the concessional/general and safety net category of the patient, which means the main determinant is the level of co-payment.
Combining Costs and Benefits of Animal Activities to Assess Net Yield Outcomes in Apple Orchards
Luck, Gary W.
2016-01-01
Diverse animal communities influence ecosystem function in agroecosystems through positive and negative plant-animal interactions. Yet, past research has largely failed to examine multiple interactions that can have opposing impacts on agricultural production in a given context. We collected data on arthropod communities and yield quality and quantity parameters (fruit set, yield loss and net outcomes) in three major apple-growing regions in south-eastern Australia. We quantified the net yield outcome (accounting for positive and negative interactions) of multiple animal activities (pollination, fruit damage, biological control) across the entire growing season on netted branches, which excluded vertebrate predators of arthropods, and open branches. Net outcome was calculated as the number of undamaged fruit at harvest as a proportion of the number of blossoms (i.e., potential fruit yield). Vertebrate exclusion resulted in lower levels of fruit set and higher levels of arthropod damage to apples, but did not affect net outcomes. Yield quality and quantity parameters (fruit set, yield loss, net outcomes) were not directly associated with arthropod functional groups. Model variance and significant differences between the ratio of pest to beneficial arthropods between regions indicated that complex relationships between environmental factors and multiple animal interactions have a combined effect on yield. Our results show that focusing on a single crop stage, species group or ecosystem function/service can overlook important complexity in ecological processes within the system. Accounting for this complexity and quantifying the net outcome of ecological interactions within the system, is more informative for research and management of biodiversity and ecosystem services in agricultural landscapes. PMID:27391022
Combining Costs and Benefits of Animal Activities to Assess Net Yield Outcomes in Apple Orchards.
Saunders, Manu E; Luck, Gary W
2016-01-01
Diverse animal communities influence ecosystem function in agroecosystems through positive and negative plant-animal interactions. Yet, past research has largely failed to examine multiple interactions that can have opposing impacts on agricultural production in a given context. We collected data on arthropod communities and yield quality and quantity parameters (fruit set, yield loss and net outcomes) in three major apple-growing regions in south-eastern Australia. We quantified the net yield outcome (accounting for positive and negative interactions) of multiple animal activities (pollination, fruit damage, biological control) across the entire growing season on netted branches, which excluded vertebrate predators of arthropods, and open branches. Net outcome was calculated as the number of undamaged fruit at harvest as a proportion of the number of blossoms (i.e., potential fruit yield). Vertebrate exclusion resulted in lower levels of fruit set and higher levels of arthropod damage to apples, but did not affect net outcomes. Yield quality and quantity parameters (fruit set, yield loss, net outcomes) were not directly associated with arthropod functional groups. Model variance and significant differences between the ratio of pest to beneficial arthropods between regions indicated that complex relationships between environmental factors and multiple animal interactions have a combined effect on yield. Our results show that focusing on a single crop stage, species group or ecosystem function/service can overlook important complexity in ecological processes within the system. Accounting for this complexity and quantifying the net outcome of ecological interactions within the system, is more informative for research and management of biodiversity and ecosystem services in agricultural landscapes.
Seligman, Hilary K.; Fernandez, Alicia; Stern, Rachel J.; Weech-Maldonado, Robert; Quan, Judy; Jacobs, Elizabeth A.
2012-01-01
Background The Consumer Assessment of Healthcare Providers and Systems Cultural Competency (CAHPS-CC) Item Set assesses patient perceptions of aspects of the cultural competence of their health care. Objective To determine characteristics of patients who identify the care they receive as less culturally competent Research Design Cross-sectional survey consisting of face-to-face interviews Subjects Safety-net population of patients with type 2 diabetes (n=600) receiving ongoing primary care Measures Participants completed the CAHPS-CC and answered questions about their race/ethnicity, gender, age, education, health status, depressive symptoms, insurance coverage, English proficiency, duration of relationship with primary care provider, and co-morbidities. Results In adjusted models, depressive symptoms were significantly associated with poor cultural competency in the Doctor Communication – Positive Behaviors domain (OR 1.73, 95%CI 1.11, 2.69). African-Americans were less likely than Whites to report poor cultural competence in the Doctor Communication – Positive Behaviors domain (OR 0.52, 0.28–0.97). Participants who reported a longer relationship (≥3 years) with their primary care provider were less likely to report poor cultural competence in the Doctor Communication – Health Promotion (OR 0.35, 0.21–0.60) and Trust domains (OR 0.4, 0.24–0.67), while participants with lower educational attainment were less likely to report poor cultural competence in the Trust domain (OR 0.51, 0.30–0.86). Overall, however, sociodemographic and clinical differences in reports of poor cultural competence were insignificant or inconsistent across the various domains of cultural competence examined. Conclusions Cultural competence interventions in safety-net settings should be implemented across populations, rather than being narrowly focused on specific sociodemographic or clinical groups. PMID:22895232
NCWin — A Component Object Model (COM) for processing and visualizing NetCDF data
Liu, Jinxun; Chen, J.M.; Price, D.T.; Liu, S.
2005-01-01
NetCDF (Network Common Data Form) is a data sharing protocol and library that is commonly used in large-scale atmospheric and environmental data archiving and modeling. The NetCDF tool described here, named NCWin and coded with Borland C++ Builder, was built as a standard executable as well as a COM (component object model) for the Microsoft Windows environment. COM is a powerful technology that enhances the reuse of applications (as components). Environmental model developers from different modeling environments, such as Python, JAVA, VISUAL FORTRAN, VISUAL BASIC, VISUAL C++, and DELPHI, can reuse NCWin in their models to read, write and visualize NetCDF data. Some Windows applications, such as ArcGIS and Microsoft PowerPoint, can also call NCWin within the application. NCWin has three major components: 1) The data conversion part is designed to convert binary raw data to and from NetCDF data. It can process six data types (unsigned char, signed char, short, int, float, double) and three spatial data formats (BIP, BIL, BSQ); 2) The visualization part is designed for displaying grid map series (playing forward or backward) with simple map legend, and displaying temporal trend curves for data on individual map pixels; and 3) The modeling interface is designed for environmental model development by which a set of integrated NetCDF functions is provided for processing NetCDF data. To demonstrate that the NCWin can easily extend the functions of some current GIS software and the Office applications, examples of calling NCWin within ArcGIS and MS PowerPoint for showing NetCDF map animations are given.
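NCWin itself is a Windows COM component, but the basic read-and-inspect operation it wraps looks much the same in any NetCDF binding. For comparison, the sketch below uses the netCDF4 Python library; the file name, variable name and grid layout are placeholders.

```python
# Basic NetCDF read/inspect with the netCDF4 Python library (placeholder file
# and variable names; shown only for comparison with the COM interface above).
from netCDF4 import Dataset

with Dataset("npp_monthly.nc") as ds:        # open an existing NetCDF file
    print(list(ds.dimensions), list(ds.variables))
    npp = ds.variables["npp"]                # assume a (time, lat, lon) grid
    first_map = npp[0, :, :]                 # one time slice as a masked array
    print(getattr(npp, "units", "unknown units"), float(first_map.mean()))
```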
PERSEUS QC: preparing statistic data sets
NASA Astrophysics Data System (ADS)
Belokopytov, Vladimir; Khaliulin, Alexey; Ingerov, Andrey; Zhuk, Elena; Gertman, Isaac; Zodiatis, George; Nikolaidis, Marios; Nikolaidis, Andreas; Stylianou, Stavros
2017-09-01
The Desktop Oceanographic Data Processing Module was developed for visual analysis of interdisciplinary cruise measurements. The program provides the possibility of data selection based on different criteria, map plotting, sea horizontal sections, and sea depth vertical profiles. The data selection in the area of interest can be specified according to a set of different physical and chemical parameters complemented by additional parameters, such as the cruise number, ship name, and time period. The visual analysis of a set of vertical profiles in the selected area makes it possible to determine the quality of the data, their location and the time of the in-situ measurements, and to exclude any questionable data from the statistical analysis. For each selected set of profiles, the average vertical profile, the minimal and maximal values of the parameter under examination and the root mean square (r.m.s.) are estimated. These estimates are compared with the parameter ranges, set for each sub-region by the MEDAR/MEDATLAS-II and SeaDataNet2 projects. In the framework of the PERSEUS project, certain parameters which lacked a range were calculated from scratch, while some of the previously used ranges were re-defined using more comprehensive data sets based on the SeaDataNet2, SESAME and PERSEUS projects. In some cases we have used additional sub-regions to redefine the ranges more precisely. The recalculated ranges are used to improve the PERSEUS Data Quality Control.
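The per-region statistics described above reduce to a few array operations. The sketch below illustrates them with NumPy; the array shapes, the synthetic profiles and the range limits are illustrative stand-ins for a real selection of casts and an adopted sub-regional range.

```python
# Mean/min/max/r.m.s. of a set of vertical profiles plus range-based flagging
# (illustrative data; real ranges come from the project's sub-regional tables).
import numpy as np

profiles = np.random.normal(15.0, 2.0, size=(200, 50))   # 200 casts x 50 depth levels
mean_profile = np.nanmean(profiles, axis=0)
min_profile = np.nanmin(profiles, axis=0)
max_profile = np.nanmax(profiles, axis=0)
rms = np.sqrt(np.nanmean((profiles - mean_profile) ** 2, axis=0))

range_min, range_max = 5.0, 30.0                          # adopted range, e.g. temperature in deg C
suspect = (profiles < range_min) | (profiles > range_max) # flag questionable values
print("suspect values:", int(suspect.sum()))
```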
Improvements to the National Transport Code Collaboration Data Server
NASA Astrophysics Data System (ADS)
Alexander, David A.
2001-10-01
The data server of the National Transport Code Collaboration Project provides a universal network interface to interpolated or raw transport data accessible by a universal set of names. Data can be acquired from a local copy of the International Multi-Tokamak (ITER) profile database as well as from TRANSP trees of MDSplus data systems on the net. Data is provided to the user's network client via a CORBA interface, thus providing stateful data server instances, which have the advantage of remembering the desired interpolation, data set, etc. This paper will review the status and discuss the recent improvements made to the data server, such as the modularization of the data server and the addition of HDF5 and MDSplus data file writing capability.
Radom, Marcin; Rybarczyk, Agnieszka; Szawulak, Bartlomiej; Andrzejewski, Hubert; Chabelski, Piotr; Kozak, Adam; Formanowicz, Piotr
2017-12-01
Model development and its analysis is a fundamental step in systems biology. The theory of Petri nets offers a tool for such a task. With the rapid development of computer science, a variety of tools for Petri nets have emerged, offering various analytical algorithms. This leads to the problem of having to use different programs to analyse a single model: many file formats and different representations of results make the analysis much harder. Especially for larger nets, the ability to visualize the results in a proper form is a huge help in understanding their significance. We present a new tool for Petri net development and analysis called Holmes. Our program contains algorithms for model analysis based on different types of Petri nets, e.g. an invariant generator, Maximum Common Transitions (MCT) set and cluster modules, simulation algorithms and knockout analysis tools. A very important feature is the ability to visualize the results of almost all analytical modules. The integration of such modules into one graphical environment allows a researcher to fully devote his or her time to model building and analysis. Available at http://www.cs.put.poznan.pl/mradom/Holmes/holmes.html. piotr@cs.put.poznan.pl. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
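One of the analyses such tools automate, invariant generation, has a compact linear-algebra core: T-invariants are non-negative integer solutions of C x = 0, where C is the places-by-transitions incidence matrix. The sketch below computes a rational nullspace basis for a toy net; it is only a starting point, since dedicated tools use algorithms (e.g. the Farkas method) to obtain minimal non-negative invariants, and the example net is made up.

```python
# T-invariants of a toy Petri net via the nullspace of its incidence matrix
# (illustrative; real invariant generators compute minimal non-negative sets).
from sympy import Matrix

# 3 places x 4 transitions incidence matrix of a small cyclic net
C = Matrix([
    [-1,  0,  1,  0],
    [ 1, -1,  0,  0],
    [ 0,  1, -1,  0],
])
for v in C.nullspace():
    # each vector says how often each transition fires in a state-preserving cycle
    print(v.T)
```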
Metabolic responses to the seated calf press exercise performed against inertial resistance.
Caruso, John F; Herron, Jacquelyn C; Hernandez, Daniel A; Porter, Aaron; Schweickert, Torrey; Manning, Tommy F
2005-11-01
Future in-flight strength training devices may use inertial resistance to abate mass and strength losses to muscle groups such as the triceps surae, which incurs pronounced deficits from space travel. Yet little data exist regarding physiological outcomes to triceps surae exercise performed against inertial resistance. Two sets of subjects were employed to note either blood lactate (La-) or net caloric cost responses to seated calf presses done on an inertial resistance ergometer. Both sets of subjects performed 3 identical 3-set 10-repetition workouts. Blood La- measurements were made pre- and 5 min post-exercise. During workouts, breath-by-breath O2 uptake values were also recorded to help determine the net caloric cost of exercise. Compared to pre-exercise (mean +/- SEM) blood La- (2.01 +/- 0.08 mmol x L(-1)) values, post-exercise (4.73 +/- 0.24 mmol x L(-1)) measurements showed a significant increase. Delta (post/pre differences) La- correlated significantly (r = 0.31-0.34) to several workout performance measures. Net caloric cost averaged 52.82 +/- 3.26 kcals for workouts; multivariate regression showed a subject's height, body mass, and body surface area described the variance associated with energy expenditure. Workouts evoked minimal energy expenditure, though anaerobic glycolysis likely played a major role in ATP resynthesis. Metabolic and exercise performance measures were likely influenced by series elastic element involvement of the triceps surae-Achilles tendon complex. Ergometer calf presses provided a high-intensity workout stimulus with a minimal metabolic cost.
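A net caloric cost of the kind reported here is typically derived by integrating exercise oxygen uptake above the resting baseline and converting litres of O2 to kilocalories at roughly 5 kcal per litre (a common approximation for mixed substrate use). The sketch below illustrates that arithmetic; all numbers are fabricated and are not the study's measurements.

```python
# Rough illustration of converting breath-by-breath VO2 to a net caloric cost
# (all values are made up; ~5 kcal per litre O2 is a standard approximation).
import numpy as np

dt = 5.0                                            # sampling interval (s)
t = np.arange(0, 600, dt)                           # a 10-minute bout
vo2 = 0.35 + 0.9 * ((t > 60) & (t < 540))           # L/min: rest ~0.35, exercise ~1.25
net_litres = np.sum(np.clip(vo2 - 0.35, 0, None)) * dt / 60.0   # O2 above rest, in litres
net_kcal = net_litres * 5.0
print(f"net O2: {net_litres:.1f} L, net cost: {net_kcal:.0f} kcal")
```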
An interactive environment for the analysis of large Earth observation and model data sets
NASA Technical Reports Server (NTRS)
Bowman, Kenneth P.; Walsh, John E.; Wilhelmson, Robert B.
1994-01-01
Envision is an interactive environment that provides researchers in the earth sciences convenient ways to manage, browse, and visualize large observed or model data sets. Its main features are support for the netCDF and HDF file formats, an easy to use X/Motif user interface, a client-server configuration, and portability to many UNIX workstations. The Envision package also provides new ways to view and change metadata in a set of data files. It permits a scientist to conveniently and efficiently manage large data sets consisting of many data files. It also provides links to popular visualization tools so that data can be quickly browsed. Envision is a public domain package, freely available to the scientific community. Envision software (binaries and source code) and documentation can be obtained from either of these servers: ftp://vista.atmos.uiuc.edu/pub/envision/ and ftp://csrp.tamu.edu/pub/envision/. Detailed descriptions of Envision capabilities and operations can be found in the User's Guide and Reference Manuals distributed with Envision software.
Helminth.net: expansions to Nematode.net and an introduction to Trematode.net
Martin, John; Rosa, Bruce A.; Ozersky, Philip; Hallsworth-Pepin, Kymberlie; Zhang, Xu; Bhonagiri-Palsikar, Veena; Tyagi, Rahul; Wang, Qi; Choi, Young-Jun; Gao, Xin; McNulty, Samantha N.; Brindley, Paul J.; Mitreva, Makedonka
2015-01-01
Helminth.net (http://www.helminth.net) is the new moniker for a collection of databases: Nematode.net and Trematode.net. Within this collection we provide services and resources for parasitic roundworms (nematodes) and flatworms (trematodes), collectively known as helminths. For over a decade we have provided resources for studying nematodes via our veteran site Nematode.net (http://nematode.net). In this article, (i) we provide an update on the expansions of Nematode.net that hosts omics data from 84 species and provides advanced search tools to the broad scientific community so that data can be mined in a useful and user-friendly manner and (ii) we introduce Trematode.net, a site dedicated to the dissemination of data from flukes, flatworm parasites of the class Trematoda, phylum Platyhelminthes. Trematode.net is an independent component of Helminth.net and currently hosts data from 16 species, with information ranging from genomic, functional genomic data, enzymatic pathway utilization to microbiome changes associated with helminth infections. The databases’ interface, with a sophisticated query engine as a backbone, is intended to allow users to search for multi-factorial combinations of species’ omics properties. This report describes updates to Nematode.net since its last description in NAR, 2012, and also introduces and presents its new sibling site, Trematode.net. PMID:25392426
Non-crystallographic nets: characterization and first steps towards a classification.
Moreira de Oliveira, Montauban; Eon, Jean Guillaume
2014-05-01
Non-crystallographic (NC) nets are periodic nets characterized by the existence of non-trivial bounded automorphisms. Such automorphisms cannot be associated with any crystallographic symmetry in realizations of the net by crystal structures. It is shown that bounded automorphisms of finite order form a normal subgroup F(N) of the automorphism group of NC nets (N, T). As a consequence, NC nets are unstable nets (they display vertex collisions in any barycentric representation) and, conversely, stable nets are crystallographic nets. The labelled quotient graphs of NC nets are characterized by the existence of an equivoltage partition (a partition of the vertex set that preserves label vectors over edges between cells). A classification of NC nets is proposed on the basis of (i) their relationship to the crystallographic net with a homeomorphic barycentric representation and (ii) the structure of the subgroup F(N).
FLUXCOM - Overview and First Synthesis
NASA Astrophysics Data System (ADS)
Jung, M.; Ichii, K.; Tramontana, G.; Camps-Valls, G.; Schwalm, C. R.; Papale, D.; Reichstein, M.; Gans, F.; Weber, U.
2015-12-01
We present a community effort that aims to generate an ensemble of global gridded flux products by upscaling FLUXNET data using an array of different machine learning methods, including regression/model tree ensembles, neural networks, and kernel machines. We produced products for gross primary production, terrestrial ecosystem respiration, net ecosystem exchange, latent heat, sensible heat, and net radiation for two experimental protocols: (1) high spatial resolution (5 arc-minute) at an 8-daily time step, using only remote-sensing-based inputs for the MODIS era; and (2) 30-year records at daily time step and 0.5-degree spatial resolution, incorporating meteorological driver data. Within each set-up, all machine learning methods were trained with the same input data for carbon and energy fluxes respectively. Sets of input driver variables were derived using an extensive formal variable selection exercise. The extrapolation performance of the approaches is assessed with a fully internally consistent cross-validation. We perform cross-consistency checks of the gridded flux products with independent data streams from atmospheric inversions (NEE), sun-induced fluorescence (GPP), catchment water balances (LE, H), satellite products (Rn), and process models. We analyze the uncertainties of the gridded flux products and for example provide a breakdown of the uncertainty of mean annual GPP originating from different machine learning methods, different climate input data sets, and different flux partitioning methods. The FLUXCOM archive will provide an unprecedented source of information for water, energy, and carbon cycle studies.
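The general set-up, training several machine learning methods on the same driver variables and comparing them by cross-validation, can be sketched as follows. The data, features, and target below are synthetic placeholders, and the code is a conceptual illustration with scikit-learn, not FLUXCOM code.

```python
# Conceptual sketch of a FLUXCOM-style set-up: several machine learning methods
# trained on the same drivers and assessed by cross-validation (synthetic data).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))                                   # e.g. remote-sensing and meteorological drivers
y = 2.0 * X[:, 0] - X[:, 1] + rng.normal(scale=0.3, size=500)   # e.g. a GPP-like target

models = {
    "tree ensemble": RandomForestRegressor(n_estimators=200, random_state=0),
    "neural network": MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0),
    "kernel machine": KernelRidge(alpha=1.0, kernel="rbf"),
}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"{name}: mean cross-validated R^2 = {scores.mean():.2f}")
```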
A standardized sampling protocol for channel catfish in prairie streams
Vokoun, Jason C.; Rabeni, Charles F.
2001-01-01
Three alternative gears—an AC electrofishing raft, bankpoles, and a 15-hoop-net set—were used in a standardized manner to sample channel catfish Ictalurus punctatus in three prairie streams of varying size in three seasons. We compared these gears as to time required per sample, size selectivity, mean catch per unit effort (CPUE) among months, mean CPUE within months, effect of fluctuating stream stage, and sensitivity to population size. According to these comparisons, the 15-hoop-net set used during stable water levels in October had the most desirable characteristics. Using our catch data, we estimated the precision of CPUE and size structure by varying sample sizes for the 15-hoop-net set. We recommend that 11–15 repetitions of the 15-hoop-net set be used for most management activities. This standardized basic unit of effort will increase the precision of estimates and allow better comparisons among samples as well as increased confidence in management decisions.
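To illustrate how the precision of mean CPUE relates to the number of net sets, the hedged sketch below bootstraps the coefficient of variation of the mean from synthetic catch data. It is only a conceptual illustration of the sample-size reasoning, not the authors' data or analysis.

```python
# Illustrative bootstrap of CPUE precision versus number of hoop-net sets
# (synthetic Poisson catch data; not the study's data or code).
import numpy as np

rng = np.random.default_rng(1)
observed_cpue = rng.poisson(lam=4.0, size=60).astype(float)   # hypothetical catch per net set

for n_sets in (5, 11, 15, 25):
    boot_means = [rng.choice(observed_cpue, size=n_sets, replace=True).mean()
                  for _ in range(2000)]
    cv = np.std(boot_means) / np.mean(boot_means)
    print(f"{n_sets:2d} net sets: CV of mean CPUE ≈ {cv:.2f}")
```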
NASA Technical Reports Server (NTRS)
Decker, Arthur J. (Inventor)
2006-01-01
An artificial neural network is disclosed that processes holography-generated characteristic fringe patterns of vibrating structures along with finite-element models. The present invention provides a folding operation for conditioning training sets so as to optimally train feed-forward neural networks to process characteristic fringe patterns. The folding operation increases the sensitivity of the feed-forward network for detecting changes in the characteristic pattern. The folding routine manipulates input pixels so that they are scaled according to their location in an intensity range rather than their position in the characteristic pattern.
Code of Federal Regulations, 2011 CFR
2011-04-01
...(g)-3T Section 1.904(g)-3T Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY....904(g)-3T Ordering rules for the allocation of net operating losses, net capital losses, U.S. source... domestic losses. The rules must be applied in the order set forth in paragraphs (b) through (g) of this...
Code of Federal Regulations, 2012 CFR
2012-04-01
...(g)-3T Section 1.904(g)-3T Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY....904(g)-3T Ordering rules for the allocation of net operating losses, net capital losses, U.S. source... domestic losses. The rules must be applied in the order set forth in paragraphs (b) through (g) of this...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-21
... stake at one or both ends of the nets). All comments received will become part of the public record and... one of the above methods to ensure that we receive, document, and consider them. Comments sent by any... gill nets (i.e., passive gill net sets deployed with an anchor or stake at one or both ends of the nets...
Wei Ren; Hanqin Tian; Bo Tao; Art Chappelka; Ge Sun; et al
2011-01-01
Aim: We investigated how ozone pollution and climate change/variability have interactively affected net primary productivity (NPP) and net carbon exchange (NCE) across China's forest ecosystem in the past half century. Location: Continental China. Methods: Using the dynamic land ecosystem model (DLEM) in conjunction with 10-km-resolution gridded historical data sets (...
Enabling Grid Computing resources within the KM3NeT computing model
NASA Astrophysics Data System (ADS)
Filippidis, Christos
2016-04-01
KM3NeT is a future European deep-sea research infrastructure hosting a new generation neutrino detectors that - located at the bottom of the Mediterranean Sea - will open a new window on the universe and answer fundamental questions both in particle physics and astrophysics. International collaborative scientific experiments, like KM3NeT, are generating datasets which are increasing exponentially in both complexity and volume, making their analysis, archival, and sharing one of the grand challenges of the 21st century. These experiments, in their majority, adopt computing models consisting of different Tiers with several computing centres and providing a specific set of services for the different steps of data processing such as detector calibration, simulation and data filtering, reconstruction and analysis. The computing requirements are extremely demanding and, usually, span from serial to multi-parallel or GPU-optimized jobs. The collaborative nature of these experiments demands very frequent WAN data transfers and data sharing among individuals and groups. In order to support the aforementioned demanding computing requirements we enabled Grid Computing resources, operated by EGI, within the KM3NeT computing model. In this study we describe our first advances in this field and the method for the KM3NeT users to utilize the EGI computing resources in a simulation-driven use-case.
Schickedanz, Adam; Huang, David; Lopez, Andrea; Cheung, Edna; Lyles, C R; Bodenheimer, Tom; Sarkar, Urmimala
2013-07-01
Electronic and internet-based tools for patient-provider communication are becoming the standard of care, but disparities exist in their adoption among patients. The reasons for these disparities are unclear, and few studies have examined the potential of communication technologies to benefit vulnerable patient populations. Objective: to characterize access to, interest in, and attitudes toward internet-based communication in an ethnically, economically, and linguistically diverse group of patients from a large urban safety net clinic network. Design: observational, cross-sectional study. Participants: adult patients (≥ 18 years) in six resource-limited community clinics in the San Francisco Department of Public Health (SFDPH). Main measures: current email use, interest in communicating electronically with health care professionals, barriers to and facilitators of electronic health-related communication, and demographic data, all self-reported via survey. Results: sixty percent of patients used email, 71% were interested in using electronic communication with health care providers, and 19% reported currently using email informally with these providers for health care. Those already using any email were more likely to express interest in using it for health matters. Most patients agreed electronic communication would improve clinic efficiency and overall communication with clinicians. A significant majority of safety net patients currently use email, text messaging, and the internet, and they expressed an interest in using these tools for electronic communication with their medical providers. This interest is currently unmet within safety net clinics that do not offer a patient portal or secure messaging. Tools such as email encounters and electronic patient portals should be implemented and supported to a greater extent in resource-poor settings, but this will require tailoring these tools to patients' language, literacy level, and experience with communication technology.
2013-01-01
Background: Knowledge of the interactions between mosquitoes and humans, and how vector control interventions affect them, is sparse. A study exploring host-seeking behaviour at a human-occupied bed net, a key event in such interactions, is reported here. Methods: Host-seeking female Anopheles gambiae activity was studied using a human-baited ‘sticky-net’ (a bed net without insecticide, coated with non-setting adhesive) to trap mosquitoes. The numbers and distribution of mosquitoes captured on each surface of the bed net were recorded and analysed using non-parametric statistical methods and random effects regression analysis. To confirm sticky-net reliability, the experiment was repeated using a pitched sticky-net (tilted sides converging at apex, i.e., neither horizontal nor vertical). The capture efficiencies of horizontal and vertical sticky surfaces were compared, and the potential repellency of the adhesive was investigated. Results: In a semi-field experiment, more mosquitoes were caught on the top (74-87%) than on the sides of the net (p < 0.001). In laboratory experiments, more mosquitoes were caught on the top than on the sides in human-baited tests (p < 0.001), significantly different to unbaited controls (p < 0.001) where most mosquitoes were on the sides (p = 0.047). In both experiments, approximately 70% of mosquitoes captured on the top surface were clustered within a 90 × 90 cm (or lesser) area directly above the head and chest (p < 0.001). In pitched net tests, similar clustering occurred over the sleeper’s head and chest in baited tests only (p < 0.001). Capture rates at horizontal and vertical surfaces were not significantly different and the sticky-net was not repellent. Conclusion: This study demonstrated that An. gambiae activity occurs predominantly within a limited area of the top surface of bed nets. The results provide support for the two-in-one bed net design for managing pyrethroid-resistant vector populations. Further exploration of vector behaviour at the bed net interface could contribute to additional improvements in insecticide-treated bed net design or the development of novel vector control tools. PMID:23902661
Hydratools, a MATLAB® based data processing package for Sontek Hydra data
Martini, M.; Lightsom, F.L.; Sherwood, C.R.; Xu, Jie; Lacy, J.R.; Ramsey, A.; Horwitz, R.
2005-01-01
The U.S. Geological Survey (USGS) has developed a set of MATLAB tools to process and convert data collected by Sontek Hydra instruments to netCDF, which is a format used by the USGS to process and archive oceanographic time-series data. The USGS makes high-resolution current measurements within 1.5 meters of the bottom. These data are used in combination with other instrument data from sediment transport studies to develop sediment transport models. Instrument manufacturers provide software which outputs unique binary data formats. Multiple data formats are cumbersome. The USGS solution is to translate data streams into a common data format: netCDF. The Hydratools toolbox is written to create netCDF format files following EPIC conventions, complete with embedded metadata. Data are accepted from both the ADV and the PCADP. The toolbox will detect and remove bad data, substitute other sources of heading and tilt measurements if necessary, apply ambiguity corrections, calculate statistics, return information about data quality, and organize metadata. Standardized processing and archiving makes these data more easily and routinely accessible locally and over the Internet. In addition, documentation of the techniques used in the toolbox provides a baseline reference for others utilizing the data.
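The common-format idea, a time series written to netCDF with the metadata embedded alongside the data, can be sketched as below. The attribute names and values are illustrative only, not the exact EPIC conventions, and the code is Python rather than the MATLAB of the Hydratools toolbox.

```python
# Sketch of writing a near-bottom current time series to netCDF with embedded
# metadata, in the spirit of Hydratools' common-format output (attribute names
# are illustrative, not the exact EPIC convention; file name is hypothetical).
import numpy as np
from netCDF4 import Dataset

time = np.arange(0, 3600, 60)                        # seconds since start of burst (hypothetical)
speed = 0.25 + 0.02 * np.random.randn(time.size)     # near-bottom current speed, m/s

with Dataset("adv_timeseries.nc", "w") as nc:
    nc.institution = "example institution"           # global metadata embedded in the file
    nc.instrument_type = "Sontek ADV"
    nc.createDimension("time", time.size)

    t = nc.createVariable("time", "f8", ("time",))
    t.units = "seconds since 2005-01-01 00:00:00"
    t[:] = time

    v = nc.createVariable("current_speed", "f4", ("time",))
    v.units = "m s-1"
    v.long_name = "near-bottom current speed"
    v[:] = speed
```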
Biodiversity offsets and the challenge of achieving no net loss.
Gardner, Toby A; VON Hase, Amrei; Brownlie, Susie; Ekstrom, Jonathan M M; Pilgrim, John D; Savy, Conrad E; Stephens, R T Theo; Treweek, Jo; Ussher, Graham T; Ward, Gerri; Ten Kate, Kerry
2013-12-01
Businesses, governments, and financial institutions are increasingly adopting a policy of no net loss of biodiversity for development activities. The goal of no net loss is intended to help relieve tension between conservation and development by enabling economic gains to be achieved without concomitant biodiversity losses. Biodiversity offsets represent a necessary component of a much broader mitigation strategy for achieving no net loss following prior application of avoidance, minimization, and remediation measures. However, doubts have been raised about the appropriate use of biodiversity offsets. We examined what no net loss means as a desirable conservation outcome and reviewed the conditions that determine whether, and under what circumstances, biodiversity offsets can help achieve such a goal. We propose a conceptual framework to replace the often ad hoc approaches evident in many biodiversity offset initiatives. The relevance of biodiversity offsets to no net loss rests on 2 fundamental premises. First, offsets are rarely adequate for achieving no net loss of biodiversity alone. Second, some development effects may be too difficult or risky, or even impossible, to offset. To help to deliver no net loss through biodiversity offsets, biodiversity gains must be comparable to losses, be in addition to conservation gains that may have occurred in absence of the offset, and be lasting and protected from risk of failure. Adherence to these conditions requires consideration of the wider landscape context of development and offset activities, timing of offset delivery, measurement of biodiversity, accounting procedures and rule sets used to calculate biodiversity losses and gains and guide offset design, and approaches to managing risk. Adoption of this framework will strengthen the potential for offsets to provide an ecologically defensible mechanism that can help reconcile conservation and development. Balances de Biodiversidad y el Reto de No Obtener Pérdida Neta. © 2013 Society for Conservation Biology.
Community Intercomparison Suite (CIS) v1.4.0: a tool for intercomparing models and observations
NASA Astrophysics Data System (ADS)
Watson-Parris, Duncan; Schutgens, Nick; Cook, Nicholas; Kipling, Zak; Kershaw, Philip; Gryspeerdt, Edward; Lawrence, Bryan; Stier, Philip
2016-09-01
The Community Intercomparison Suite (CIS) is an easy-to-use command-line tool which has been developed to allow the straightforward intercomparison of remote sensing, in situ and model data. While there are a number of tools available for working with climate model data, the large diversity of sources (and formats) of remote sensing and in situ measurements necessitated a novel software solution. Developed by a professional software company, CIS supports a large number of gridded and ungridded data sources "out-of-the-box", including climate model output in NetCDF or the UK Met Office pp file format, CloudSat, CALIOP (Cloud-Aerosol Lidar with Orthogonal Polarization), MODIS (MODerate resolution Imaging Spectroradiometer), Cloud and Aerosol CCI (Climate Change Initiative) level 2 satellite data and a number of in situ aircraft and ground station data sets. The open-source architecture also supports user-defined plugins to allow many other sources to be easily added. Many of the key operations required when comparing heterogeneous data sets are provided by CIS, including subsetting, aggregating, collocating and plotting the data. Output data are written to CF-compliant NetCDF files to ensure interoperability with other tools and systems. The latest documentation, including a user manual and installation instructions, can be found on our website (http://cistools.net). Here, we describe the need which this tool fulfils, followed by descriptions of its main functionality (as at version 1.4.0) and plugin architecture which make it unique in the field.
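To make the "collocating" step concrete, the sketch below performs a generic nearest-neighbour collocation of ungridded observations onto model grid points with a k-d tree. This illustrates the concept only; it is not the CIS implementation or its command-line interface, and it uses plain lat/lon Euclidean distance rather than proper spherical geometry.

```python
# Conceptual nearest-neighbour collocation of ungridded observations onto model
# grid points (a generic illustration, not CIS code; distances are Euclidean in
# lat/lon, which is a simplification).
import numpy as np
from scipy.spatial import cKDTree

# Hypothetical 5-degree model grid and three observation sites.
model_lat, model_lon = np.meshgrid(np.arange(-90, 91, 5), np.arange(-180, 180, 5), indexing="ij")
grid_points = np.column_stack([model_lat.ravel(), model_lon.ravel()])

obs_points = np.array([[51.5, -0.1], [40.7, -74.0], [-33.9, 151.2]])   # e.g. ground stations
obs_values = np.array([0.12, 0.25, 0.08])                              # e.g. aerosol optical depth

tree = cKDTree(grid_points)
_, nearest = tree.query(obs_points)    # index of the closest grid point per observation

for (lat, lon), value, idx in zip(obs_points, obs_values, nearest):
    glat, glon = grid_points[idx]
    print(f"obs ({lat:6.1f}, {lon:7.1f}) -> grid ({glat:6.1f}, {glon:7.1f}), value {value}")
```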
2013-01-01
Background: Connectivity map (cMap) is a recently developed dataset and algorithm for uncovering and understanding the treatment effect of small molecules on different cancer cell lines. It is widely used, but challenges remain for accurate predictions. Method: Here, we propose BRCA-MoNet, a network of drug mode of action (MoA) specific to breast cancer, which is constructed based on the cMap dataset. A drug signature selection algorithm fitting the characteristic of cMap data, a quality control scheme as well as a novel query algorithm based on BRCA-MoNet are developed for more effective prediction of drug effects. Result: BRCA-MoNet was applied to three independent data sets obtained from the GEO database: Estradiol-treated MCF7 cell line, BMS-754807-treated MCF7 cell line, and a breast cancer patient microarray dataset. In the first case, BRCA-MoNet could identify drug MoAs likely to share the same and the reverse treatment effect. In the second case, the result demonstrated the potential of BRCA-MoNet to reposition drugs and predict treatment effects for drugs not in cMap data. In the third case, a possible procedure of personalized drug selection is showcased. Conclusions: The results clearly demonstrated that the proposed BRCA-MoNet approach can provide increased prediction power to cMap and thus will be useful for identification of new therapeutic candidates. Website: the web-based application can be accessed at http://compgenomics.utsa.edu/BRCAMoNet/ PMID:24564956
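A generic way to score a query expression signature against drug signatures is rank correlation: similar mode of action gives a strongly positive score, a reversing effect gives a strongly negative one. The sketch below illustrates this idea with synthetic signatures; it is not the BRCA-MoNet query algorithm.

```python
# Generic illustration of connectivity-style scoring by rank correlation
# (synthetic signatures; not the BRCA-MoNet algorithm or cMap data).
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(2)
n_genes = 200

# Hypothetical signatures: log fold-changes over the same gene set.
query_signature = rng.normal(size=n_genes)
drug_signatures = {
    "drug_A": query_signature + rng.normal(scale=0.5, size=n_genes),    # similar MoA
    "drug_B": -query_signature + rng.normal(scale=0.5, size=n_genes),   # reversing effect
    "drug_C": rng.normal(size=n_genes),                                 # unrelated
}

for drug, signature in drug_signatures.items():
    rho, _ = spearmanr(query_signature, signature)
    print(f"{drug}: connectivity score (Spearman rho) = {rho:+.2f}")
```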
A new variant of Petri net controlled grammars
NASA Astrophysics Data System (ADS)
Jan, Nurhidaya Mohamad; Turaev, Sherzod; Fong, Wan Heng; Sarmin, Nor Haniza
2015-10-01
A Petri net controlled grammar is a context-free grammar paired with a Petri net, where the successful derivations of the grammar can be simulated using the occurrence sequences of the net. In this paper, we introduce a new variant of Petri net controlled grammars, called a place-labeled Petri net controlled grammar, which is a context-free grammar equipped with a Petri net and a function which maps places of the net to productions of the grammar. The language consists of all terminal strings that can be obtained by parallelly applying multisets of the rules which are the images of the sets of the input places of transitions in a successful occurrence sequence of the Petri net. We study the effect of the different labeling strategies on the computational power and establish lower and upper bounds for the generative capacity of place-labeled Petri net controlled grammars.
Modelling of current loads on aquaculture net cages
NASA Astrophysics Data System (ADS)
Kristiansen, Trygve; Faltinsen, Odd M.
2012-10-01
In this paper we propose and discuss a screen-type force model for the viscous hydrodynamic load on nets. The screen model assumes that the net is divided into a number of flat net panels, or screens. It may thus be applied to any kind of net geometry. In this paper we focus on circular net cages for fish farms. The net structure itself is modelled by an existing truss model. The net shape is solved for in a time-stepping procedure that involves solving a linear system of equations for the unknown tensions at each time step. We present comparisons to experiments with circular net cages in steady current, and discuss the sensitivity of the numerical results to a set of chosen parameters. Satisfactory agreement between experimental and numerical prediction of drag and lift as a function of the solidity ratio of the net and the current velocity is documented.
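The basic screen decomposition, splitting the net into flat panels and summing a solidity-dependent drag contribution per panel, can be sketched as follows. The drag-coefficient formula and all numbers are illustrative placeholders, not the calibrated model of the paper.

```python
# Conceptual screen-type load estimate: the net is split into flat panels and a
# viscous drag force is accumulated per panel from a solidity-dependent drag
# coefficient.  Coefficients and geometry are illustrative only, not the paper's model.
import numpy as np

rho = 1025.0          # sea-water density, kg/m^3
U = 0.5               # current speed, m/s
solidity = 0.25       # net solidity ratio Sn (twine area / total panel area)

def drag_coefficient(sn):
    # Simple illustrative dependence on solidity (placeholder, not the paper's fit).
    return 0.04 + 1.2 * sn + 5.0 * sn**2

# Hypothetical panels around a circular cage: (area in m^2, angle between panel normal and inflow).
panels = [(2.0, np.deg2rad(a)) for a in (0, 30, 60, 90, 120, 150, 180)]

total_drag = 0.0
for area, angle in panels:
    u_n = U * abs(np.cos(angle))    # only the inflow component normal to the panel contributes here
    total_drag += 0.5 * rho * drag_coefficient(solidity) * area * u_n**2

print(f"Estimated total drag on the panels: {total_drag:.1f} N")
```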
Crux: Rapid Open Source Protein Tandem Mass Spectrometry Analysis
2015-01-01
Efficiently and accurately analyzing big protein tandem mass spectrometry data sets requires robust software that incorporates state-of-the-art computational, machine learning, and statistical methods. The Crux mass spectrometry analysis software toolkit (http://cruxtoolkit.sourceforge.net) is an open source project that aims to provide users with a cross-platform suite of analysis tools for interpreting protein mass spectrometry data. PMID:25182276
ATLANTIC BIRDS: a data set of bird species from the Brazilian Atlantic Forest.
Hasui, Érica; Metzger, Jean Paul; Pimentel, Rafael G; Silveira, Luís Fábio; Bovo, Alex A D A; Martensen, Alexandre C; Uezu, Alexandre; Regolin, André L; Bispo de Oliveira, Arthur Â; Gatto, Cassiano A F R; Duca, Charles; Andretti, Christian B; Banks-Leite, Cristina; Luz, Daniela; Mariz, Daniele; Alexandrino, Eduardo R; de Barros, Fabio M; Martello, Felipe; Pereira, Iolanda M D S; da Silva, José N; Ferraz, Katia M P M D B; Naka, Luciano N; Dos Anjos, Luiz; Efe, Márcio A; Pizo, Marco Aurélio; Pichorim, Mauro; Gonçalves, Maycon Sanyvan S; Cordeiro, Paulo Henrique Chaves; Dias, Rafael A; Muylaert, Renata D L; Rodrigues, Rodolpho C; da Costa, Thiago V V; Cavarzere, Vagner; Tonetti, Vinicius R; Silva, Wesley R; Jenkins, Clinton N; Galetti, Mauro; Ribeiro, Milton C
2018-02-01
South America holds 30% of the world's avifauna, with the Atlantic Forest representing one of the richest regions of the Neotropics. Here we have compiled a data set on Brazilian Atlantic Forest bird occurrence (150,423) and abundance samples (N = 832 bird species; 33,119 bird individuals) using multiple methods, including qualitative surveys, mist nets, point counts, and line transects. We used four main sources of data: museum collections, on-line databases, literature sources, and unpublished reports. The data set comprises 4,122 localities and data from 1815 to 2017. Most studies were conducted in the Florestas de Interior (1,510 localities) and Serra do Mar (1,280 localities) biogeographic sub-regions. Considering the three main quantitative methods (mist net, point count, and line transect), we compiled abundance data for 745 species in 576 communities. In the data set, the most frequent species were Basileuterus culicivorus, Cyclaris gujanensis, and Conophaga lineata. There were 71 singletons, such as Lipaugus conditus and Calyptura cristata. We suggest that this small number of records reinforces the critical situation of these taxa in the Atlantic Forest. The information provided in this data set can be used for macroecological studies and to foster conservation strategies in this biodiversity hotspot. No copyright restrictions are associated with the data set. Please cite this Data Paper if data are used in publications and teaching events. © 2017 by the Ecological Society of America.
Andrulis, Dennis P; Siddiqui, Nadia J
2011-10-01
The Affordable Care Act of 2010 creates both opportunities and risks for safety-net providers in caring for low-income, diverse patients. New funding for health centers; support for coordinated, patient-centered care; and expansion of the primary care workforce are some of the opportunities that potentially strengthen the safety net. However, declining payments to safety-net hospitals, existing financial hardships, and shifts in the health care marketplace may intensify competition, thwart the ability to innovate, and endanger the financial viability of safety-net providers. Support of state and local governments, as well as philanthropies, will be crucial to helping safety-net providers transition to the new health care environment and to preventing the unintended erosion of the safety net for racially and ethnically diverse populations.
Ma, Chifeng; Chen, Hung-I; Flores, Mario; Huang, Yufei; Chen, Yidong
2013-01-01
Connectivity map (cMap) is a recently developed dataset and algorithm for uncovering and understanding the treatment effect of small molecules on different cancer cell lines. It is widely used, but challenges remain for accurate predictions. Here, we propose BRCA-MoNet, a network of drug mode of action (MoA) specific to breast cancer, which is constructed based on the cMap dataset. A drug signature selection algorithm fitting the characteristic of cMap data, a quality control scheme as well as a novel query algorithm based on BRCA-MoNet are developed for more effective prediction of drug effects. BRCA-MoNet was applied to three independent data sets obtained from the GEO database: Estradiol-treated MCF7 cell line, BMS-754807-treated MCF7 cell line, and a breast cancer patient microarray dataset. In the first case, BRCA-MoNet could identify drug MoAs likely to share the same and the reverse treatment effect. In the second case, the result demonstrated the potential of BRCA-MoNet to reposition drugs and predict treatment effects for drugs not in cMap data. In the third case, a possible procedure of personalized drug selection is showcased. The results clearly demonstrated that the proposed BRCA-MoNet approach can provide increased prediction power to cMap and thus will be useful for identification of new therapeutic candidates.
Directory of ICT Resources for Teaching and Learning of Science, Mathematics and Language
ERIC Educational Resources Information Center
Abdon, Buenafe, Comp.; Henly, John, Comp.; Jeffrey, Marilyn, Comp.
2006-01-01
The UNESCO SchoolNet project, "Strengthening ICT in Schools and SchoolNet Project in ASEAN Setting", was initiated to assist teachers to integrate ICT into teaching and to facilitate participation of teachers and students in the Asia-Pacific region in SchoolNet telecollaboration activities. The project was launched in July 2003 and…
Willging, Cathleen E; Waitzkin, Howard; Nicdao, Ethel
2008-09-01
Few accounts document the rural context of mental health safety net institutions (SNIs), especially as they respond to changing public policies. Embedded in wider processes of welfare state restructuring, privatization has transformed state Medicaid systems nationwide. We carried out an ethnographic study in two rural, culturally distinct regions of New Mexico to assess the effects of Medicaid managed care (MMC) and the implications for future reform. After 160 interviews and participant observation at SNIs, we analyzed data through iterative coding procedures. SNIs responded to MMC by nonparticipation, partnering, downsizing, and tapping into alternative funding sources. Numerous barriers impaired access under MMC: service fragmentation, transportation, lack of cultural and linguistic competency, Medicaid enrollment, stigma, and immigration status. By privatizing Medicaid and contracting with for-profit managed care organizations, the state placed additional responsibilities on "disciplined" providers and clients. Managed care models might compromise the rural mental health safety net unless the serious gaps and limitations are addressed in existing services and funding.
Options for accounting carbon sequestration in German forests
Krug, Joachim; Koehl, Michael; Riedel, Thomas; Bormann, Kristin; Rueter, Sebastian; Elsasser, Peter
2009-01-01
Background: The Accra climate change talks held from 21–27 August 2008 in Accra, Ghana, were part of an ongoing series of meetings leading up to the Copenhagen meeting in December 2009. During the meeting a set of options for accounting carbon sequestration in forestry on a post-2012 framework was presented. The options include gross-net and net-net accounting and approaches for establishing baselines. Results: This article demonstrates the embedded consequences of Accra Accounting Options for the case study of German national GHG accounting. It presents the most current assessment of sequestration rates by forest management for the period 1990–2007, provides an outlook of future emissions and removals (up to the year 2042) as related to three different management scenarios, and shows that implementation of some Accra options may reverse sources to sinks, or sinks to sources. Conclusion: The results of the study highlight the importance of elaborating an accounting system that would prioritize the climate convention goals, not national preferences. PMID:19650896
Yukich, Joshua O; Briët, Olivier J T; Ahorlu, Collins K; Nardini, Peter; Keating, Joseph
2017-08-07
Long-lasting insecticidal nets (LLINs) are one of the main interventions recommended by the World Health Organization for malaria vector control. LLINs are ineffective if they are not being used. Following the completion of a cluster-randomized crossover trial conducted in rural Greater Accra, in which participants were provided with the 'Bɔkɔɔ System' (a set of solar-powered net fan and light consoles with a solar panel and battery) or alternative household water filters, all trial participants were invited to participate in a Becker-DeGroot-Marschak auction to determine the mean willingness to pay (WTP) for the fan and light consoles and to estimate the demand curve for the units. Results demonstrated a mean WTP of approximately 55 Cedis (~13 USD). Demand results suggested that, at a price which would support full manufacturing cost recovery, a majority of households in the area would be willing to purchase at least one such unit.
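The Becker-DeGroot-Marschak mechanism and the demand curve implied by stated willingness-to-pay values can be illustrated with the short sketch below. The bids are synthetic, not the study's data; only the mechanism itself is taken as given.

```python
# Sketch of a Becker-DeGroot-Marschak (BDM) auction round and an implied demand
# curve from stated willingness-to-pay values (synthetic bids, not study data).
import numpy as np

rng = np.random.default_rng(3)
stated_wtp = rng.gamma(shape=4.0, scale=14.0, size=300)   # hypothetical bids in Cedis

# BDM rule: a random price is drawn; a participant buys (at that price)
# only if their stated WTP is at least the drawn price.
random_price = rng.uniform(0, 150)
buyers = stated_wtp >= random_price
print(f"drawn price {random_price:.0f} Cedis -> {buyers.mean():.0%} of participants purchase")

# Demand curve: share of households willing to buy at each candidate price.
for price in (20, 55, 90, 120):
    share = (stated_wtp >= price).mean()
    print(f"price {price:3d} Cedis: {share:.0%} would purchase")
```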
Options for accounting carbon sequestration in German forests.
Krug, Joachim; Koehl, Michael; Riedel, Thomas; Bormann, Kristin; Rueter, Sebastian; Elsasser, Peter
2009-08-03
The Accra climate change talks held from 21-27 August 2008 in Accra, Ghana, were part of an ongoing series of meetings leading up to the Copenhagen meeting in December 2009. During the meeting a set of options for accounting carbon sequestration in forestry on a post-2012 framework was presented. The options include gross-net and net-net accounting and approaches for establishing baselines. This article demonstrates the embedded consequences of Accra Accounting Options for the case study of German national GHG accounting. It presents the most current assessment of sequestration rates by forest management for the period 1990 - 2007, provides an outlook of future emissions and removals (up to the year 2042) as related to three different management scenarios, and shows that implementation of some Accra options may reverse sources to sinks, or sinks to sources. The results of the study highlight the importance of elaborating an accounting system that would prioritize the climate convention goals, not national preferences.
Accidental bait: do deceased fish increase freshwater turtle bycatch in commercial fyke nets?
Larocque, Sarah M; Watson, Paige; Blouin-Demers, Gabriel; Cooke, Steven J
2012-07-01
Bycatch of turtles in passive inland fyke net fisheries has been poorly studied, yet bycatch is an important conservation issue given the decline in many freshwater turtle populations. Delayed maturity and low natural adult mortality make turtles particularly susceptible to population declines when faced with additional anthropogenic adult mortality such as bycatch. When turtles are captured in fyke nets, the prolonged submergence can lead to stress and subsequent drowning. Fish die within infrequently checked passive fishing nets and dead fish are a potential food source for many freshwater turtles. Dead fish could thus act as attractants and increase turtle captures in fishing nets. We investigated the attraction of turtles to decomposing fish within fyke nets in eastern Ontario. We set fyke nets with either 1 kg of one-day or five-day decomposed fish, or no decomposed fish in the cod-end of the net. Decomposing fish did not alter the capture rate of turtles or fish, nor did it alter the species composition of the catch. Thus, reducing fish mortality in nets using shorter soak times is unlikely to alter turtle bycatch rates since turtles were not attracted by the dead fish. Interestingly, turtle bycatch rates increased as water temperatures did. Water temperature also influences turtle mortality by affecting the duration turtles can remain submerged. We thus suggest that submerged nets either not be set or be set with reduced soak times in warm water conditions (e.g., >20 °C), as turtles tend to be captured more frequently and cannot withstand prolonged submergence.
Ficklin, Stephen P; Feltus, Frank Alex
2013-01-01
Many traits of biological and agronomic significance in plants are controlled in a complex manner where multiple genes and environmental signals affect the expression of the phenotype. In Oryza sativa (rice), thousands of quantitative genetic signals have been mapped to the rice genome. In parallel, thousands of gene expression profiles have been generated across many experimental conditions. Through the discovery of networks with real gene co-expression relationships, it is possible to identify co-localized genetic and gene expression signals that implicate complex genotype-phenotype relationships. In this work, we used a knowledge-independent, systems genetics approach to discover a high-quality set of co-expression networks, termed Gene Interaction Layers (GILs). Twenty-two GILs were constructed from 1,306 Affymetrix microarray rice expression profiles that were pre-clustered to allow for improved capture of gene co-expression relationships. Functional genomic and genetic data, including over 8,000 QTLs and 766 phenotype-tagged SNPs (p-value ≤ 0.001) from genome-wide association studies, both covering over 230 different rice traits, were integrated with the GILs. An online systems genetics data-mining resource, the GeneNet Engine, was constructed to enable dynamic discovery of gene sets (i.e. network modules) that overlap with genetic traits. GeneNet Engine does not provide the exact set of genes underlying a given complex trait, but through the evidence of gene-marker correspondence, co-expression, and functional enrichment, site visitors can identify genes with potential shared causality for a trait which could then be used for experimental validation. A set of 2 million SNPs was incorporated into the database and serves as a potential set of testable biomarkers for genes in modules that overlap with genetic traits. Herein, we describe two modules found using GeneNet Engine, one with significant overlap with the trait amylose content and another with significant overlap with blast disease resistance.
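One common way to judge whether a co-expression module "overlaps" a genetic trait is a hypergeometric enrichment test on the genes falling under a QTL interval. The sketch below illustrates that calculation with hypothetical counts; it is not GeneNet Engine code.

```python
# Illustrative hypergeometric test for overlap between a co-expression module
# and the genes under a QTL interval (hypothetical counts; not GeneNet Engine code).
from scipy.stats import hypergeom

genome_genes = 40000        # total annotated genes
qtl_genes = 350             # genes falling under the QTL interval
module_genes = 120          # genes in the co-expression module
overlap = 18                # module genes that also lie under the QTL

# P(X >= overlap) when drawing module_genes genes at random from the genome.
p_value = hypergeom.sf(overlap - 1, genome_genes, qtl_genes, module_genes)
print(f"overlap = {overlap}, enrichment p-value = {p_value:.2e}")
```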
Ficklin, Stephen P.; Feltus, Frank Alex
2013-01-01
Many traits of biological and agronomic significance in plants are controlled in a complex manner where multiple genes and environmental signals affect the expression of the phenotype. In Oryza sativa (rice), thousands of quantitative genetic signals have been mapped to the rice genome. In parallel, thousands of gene expression profiles have been generated across many experimental conditions. Through the discovery of networks with real gene co-expression relationships, it is possible to identify co-localized genetic and gene expression signals that implicate complex genotype-phenotype relationships. In this work, we used a knowledge-independent, systems genetics approach to discover a high-quality set of co-expression networks, termed Gene Interaction Layers (GILs). Twenty-two GILs were constructed from 1,306 Affymetrix microarray rice expression profiles that were pre-clustered to allow for improved capture of gene co-expression relationships. Functional genomic and genetic data, including over 8,000 QTLs and 766 phenotype-tagged SNPs (p-value ≤ 0.001) from genome-wide association studies, both covering over 230 different rice traits, were integrated with the GILs. An online systems genetics data-mining resource, the GeneNet Engine, was constructed to enable dynamic discovery of gene sets (i.e. network modules) that overlap with genetic traits. GeneNet Engine does not provide the exact set of genes underlying a given complex trait, but through the evidence of gene-marker correspondence, co-expression, and functional enrichment, site visitors can identify genes with potential shared causality for a trait which could then be used for experimental validation. A set of 2 million SNPs was incorporated into the database and serves as a potential set of testable biomarkers for genes in modules that overlap with genetic traits. Herein, we describe two modules found using GeneNet Engine, one with significant overlap with the trait amylose content and another with significant overlap with blast disease resistance. PMID:23874666
Sowa-Staszczak, Anna; Lenda-Tracz, Wioletta; Tomaszuk, Monika; Głowa, Bogusław; Hubalewska-Dydejczyk, Alicja
2013-01-01
Somatostatin receptor scintigraphy (SRS) is a useful tool in the assessment of GEP-NET (gastroenteropancreatic neuroendocrine tumor) patients. The choice of appropriate image reconstruction parameters is crucial for the interpretation of these images. The aim of the study was to investigate how the GEP-NET lesion signal-to-noise ratio (TCS/TCB) depends on different reconstruction settings of the Flash 3D software (Siemens). SRS results of 76 randomly selected patients with confirmed GEP-NET were analyzed. For SPECT studies the data were acquired using standard clinical settings 3-4 h after the injection of 740 MBq 99mTc-[EDDA/HYNIC] octreotate. To obtain final images, the OSEM 3D Flash reconstruction with different settings and FBP reconstruction were used. First, the TCS/TCB ratio in voxels was analyzed for different combinations of the number of subsets and the number of iterations of the OSEM 3D Flash reconstruction. Secondly, the same ratio was analyzed for different parameters of the Gaussian filter (with FWHM set to 2-4 times the pixel size). The influence of scatter correction on the TCS/TCB ratio was also investigated. With an increasing number of subsets and iterations, the TCS/TCB ratio increased. With increasing Gaussian filter FWHM, the TCS/TCB ratio decreased. The use of scatter correction slightly decreased the values of this ratio. The OSEM algorithm provides a markedly better reconstruction of the SRS SPECT study than the FBP technique. A high number of subsets improves image quality (images are smoother), and an increasing number of iterations gives better contrast and sharper outlines of lesions and organs. The choice of reconstruction parameters is a compromise between the qualitative appearance of the image and its quantitative accuracy, and should not be modified when comparing multiple studies of the same patient.
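For context, the textbook form of the OSEM update that such iteration/subset settings control is given below. This is the standard formulation, not the vendor-specific Flash 3D variant.

```latex
% Standard OSEM update (textbook form, not the vendor-specific Flash 3D variant):
% for subset S_b at sub-iteration b, each voxel estimate f_j is updated as
\[
  f_j^{(b+1)} \;=\; \frac{f_j^{(b)}}{\sum_{i \in S_b} a_{ij}}
  \sum_{i \in S_b} a_{ij} \, \frac{p_i}{\sum_{k} a_{ik} f_k^{(b)}}
\]
% where p_i are the measured projection counts and a_{ij} is the system matrix
% (probability that an emission in voxel j is detected in projection bin i).
% More subsets and iterations sharpen contrast but amplify noise, which is why the
% study weighs iterations and subsets against the Gaussian post-filter FWHM.
```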
NETS - A NEURAL NETWORK DEVELOPMENT TOOL, VERSION 3.0 (MACINTOSH VERSION)
NASA Technical Reports Server (NTRS)
Phillips, T. A.
1994-01-01
NETS, A Tool for the Development and Evaluation of Neural Networks, provides a simulation of Neural Network algorithms plus an environment for developing such algorithms. Neural Networks are a class of systems modeled after the human brain. Artificial Neural Networks are formed from hundreds or thousands of simulated neurons, connected to each other in a manner similar to brain neurons. Problems which involve pattern matching readily fit the class of problems which NETS is designed to solve. NETS uses the back propagation learning method for all of the networks which it creates. The nodes of a network are usually grouped together into clumps called layers. Generally, a network will have an input layer through which the various environment stimuli are presented to the network, and an output layer for determining the network's response. The number of nodes in these two layers is usually tied to some features of the problem being solved. Other layers, which form intermediate stops between the input and output layers, are called hidden layers. NETS allows the user to customize the patterns of connections between layers of a network. NETS also provides features for saving the weight values of a network during the learning process, which allows for more precise control over the learning process. NETS is an interpreter. Its method of execution is the familiar "read-evaluate-print" loop found in interpreted languages such as BASIC and LISP. The user is presented with a prompt which is the simulator's way of asking for input. After a command is issued, NETS will attempt to evaluate the command, which may produce more prompts requesting specific information or an error if the command is not understood. The typical process involved when using NETS consists of translating the problem into a format which uses input/output pairs, designing a network configuration for the problem, and finally training the network with input/output pairs until an acceptable error is reached. NETS allows the user to generate C code to implement the network loaded into the system. This permits the placement of networks as components, or subroutines, in other systems. In short, once a network performs satisfactorily, the Generate C Code option provides the means for creating a program separate from NETS to run the network. Other features: files may be stored in binary or ASCII format; multiple input propagation is permitted; bias values may be included; capability to scale data without writing scaling code; quick interactive testing of network from the main menu; and several options that allow the user to manipulate learning efficiency. NETS is written in ANSI standard C language to be machine independent. The Macintosh version (MSC-22108) includes code for both a graphical user interface version and a command line interface version. The machine independent version (MSC-21588) only includes code for the command line interface version of NETS 3.0. The Macintosh version requires a Macintosh II series computer and has been successfully implemented under System 7. Four executables are included on these diskettes, two for floating point operations and two for integer arithmetic. It requires Think C 5.0 to compile. A minimum of 1Mb of RAM is required for execution. Sample input files and executables for both the command line version and the Macintosh user interface version are provided on the distribution medium. The Macintosh version is available on a set of three 3.5 inch 800K Macintosh format diskettes. 
The machine independent version has been successfully implemented on an IBM PC series compatible running MS-DOS, a DEC VAX running VMS, a SunIPC running SunOS, and a CRAY Y-MP running UNICOS. Two executables for the IBM PC version are included on the MS-DOS distribution media, one compiled for floating point operations and one for integer arithmetic. The machine independent version is available on a set of three 5.25 inch 360K MS-DOS format diskettes (standard distribution medium) or a .25 inch streaming magnetic tape cartridge in UNIX tar format. NETS was developed in 1989 and updated in 1992. IBM PC is a registered trademark of International Business Machines. MS-DOS is a registered trademark of Microsoft Corporation. DEC, VAX, and VMS are trademarks of Digital Equipment Corporation. SunIPC and SunOS are trademarks of Sun Microsystems, Inc. CRAY Y-MP and UNICOS are trademarks of Cray Research, Inc.
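The workflow NETS automates, training a feed-forward network on input/output pairs by back propagation until an acceptable error is reached, can be illustrated with the small NumPy sketch below. This is only an illustration of that workflow on the XOR problem; it is not the NETS C implementation and does not reproduce its file formats or generated code.

```python
# Minimal back-propagation sketch of the NETS workflow: train a feed-forward
# network on input/output pairs until an acceptable error is reached
# (NumPy illustration on XOR; not the NETS C implementation).
import numpy as np

rng = np.random.default_rng(4)

# Input/output pairs: the classic XOR problem.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of 4 nodes, sigmoid activations, bias values included.
W1, b1 = rng.normal(scale=0.5, size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(scale=0.5, size=(4, 1)), np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
learning_rate, acceptable_error = 0.5, 0.01

for epoch in range(20000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    error = np.mean((Y - out) ** 2)
    if error < acceptable_error:
        break
    # Backward pass: gradients of the squared error through the sigmoid layers.
    d_out = (out - Y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= learning_rate * h.T @ d_out
    b2 -= learning_rate * d_out.sum(axis=0)
    W1 -= learning_rate * X.T @ d_h
    b1 -= learning_rate * d_h.sum(axis=0)

print(f"stopped after {epoch} epochs with mean squared error {error:.4f}")
```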
NETS - A NEURAL NETWORK DEVELOPMENT TOOL, VERSION 3.0 (MACHINE INDEPENDENT VERSION)
NASA Technical Reports Server (NTRS)
Baffes, P. T.
1994-01-01
NETS, A Tool for the Development and Evaluation of Neural Networks, provides a simulation of Neural Network algorithms plus an environment for developing such algorithms. Neural Networks are a class of systems modeled after the human brain. Artificial Neural Networks are formed from hundreds or thousands of simulated neurons, connected to each other in a manner similar to brain neurons. Problems which involve pattern matching readily fit the class of problems which NETS is designed to solve. NETS uses the back propagation learning method for all of the networks which it creates. The nodes of a network are usually grouped together into clumps called layers. Generally, a network will have an input layer through which the various environment stimuli are presented to the network, and an output layer for determining the network's response. The number of nodes in these two layers is usually tied to some features of the problem being solved. Other layers, which form intermediate stops between the input and output layers, are called hidden layers. NETS allows the user to customize the patterns of connections between layers of a network. NETS also provides features for saving the weight values of a network during the learning process, which allows for more precise control over the learning process. NETS is an interpreter. Its method of execution is the familiar "read-evaluate-print" loop found in interpreted languages such as BASIC and LISP. The user is presented with a prompt which is the simulator's way of asking for input. After a command is issued, NETS will attempt to evaluate the command, which may produce more prompts requesting specific information or an error if the command is not understood. The typical process involved when using NETS consists of translating the problem into a format which uses input/output pairs, designing a network configuration for the problem, and finally training the network with input/output pairs until an acceptable error is reached. NETS allows the user to generate C code to implement the network loaded into the system. This permits the placement of networks as components, or subroutines, in other systems. In short, once a network performs satisfactorily, the Generate C Code option provides the means for creating a program separate from NETS to run the network. Other features: files may be stored in binary or ASCII format; multiple input propagation is permitted; bias values may be included; capability to scale data without writing scaling code; quick interactive testing of network from the main menu; and several options that allow the user to manipulate learning efficiency. NETS is written in ANSI standard C language to be machine independent. The Macintosh version (MSC-22108) includes code for both a graphical user interface version and a command line interface version. The machine independent version (MSC-21588) only includes code for the command line interface version of NETS 3.0. The Macintosh version requires a Macintosh II series computer and has been successfully implemented under System 7. Four executables are included on these diskettes, two for floating point operations and two for integer arithmetic. It requires Think C 5.0 to compile. A minimum of 1Mb of RAM is required for execution. Sample input files and executables for both the command line version and the Macintosh user interface version are provided on the distribution medium. The Macintosh version is available on a set of three 3.5 inch 800K Macintosh format diskettes. 
The machine independent version has been successfully implemented on an IBM PC series compatible running MS-DOS, a DEC VAX running VMS, a SunIPC running SunOS, and a CRAY Y-MP running UNICOS. Two executables for the IBM PC version are included on the MS-DOS distribution media, one compiled for floating point operations and one for integer arithmetic. The machine independent version is available on a set of three 5.25 inch 360K MS-DOS format diskettes (standard distribution medium) or a .25 inch streaming magnetic tape cartridge in UNIX tar format. NETS was developed in 1989 and updated in 1992. IBM PC is a registered trademark of International Business Machines. MS-DOS is a registered trademark of Microsoft Corporation. DEC, VAX, and VMS are trademarks of Digital Equipment Corporation. SunIPC and SunOS are trademarks of Sun Microsystems, Inc. CRAY Y-MP and UNICOS are trademarks of Cray Research, Inc.
Cunningham, Peter J; Bazzoli, Gloria J; Katz, Aaron
2008-01-01
This paper describes how intensifying competitive pressures in the health system are simultaneously driving increased demand for safety-net care and taxing safety-net providers' ability to maintain the mission of serving all, regardless of ability to pay. Although safety-net providers adapted to previous challenges arising from managed care, health system pressures have been more intense and more generalized across different sectors in recent years than in the past. Providers are adopting some of the same strategies being used in the private sector to attract higher-paying patients and changing their "image" as a safety-net provider.
MPL-Net Measurements of Aerosol and Cloud Vertical Distributions at Co-Located AERONET Sites
NASA Technical Reports Server (NTRS)
Welton, Ellsworth J.; Campbell, James R.; Berkoff, Timothy A.; Spinhirne, James D.; Tsay, Si-Chee; Holben, Brent; Starr, David OC. (Technical Monitor)
2002-01-01
In the early 1990s, the first small, eye-safe, and autonomous lidar system was developed, the Micropulse Lidar (MPL). The MPL acquires signal profiles of backscattered laser light from aerosols and clouds. The signals are analyzed to yield multiple layer heights, optical depths of each layer, average extinction-to-backscatter ratios for each layer, and profiles of extinction in each layer. In 2000, several MPL sites were organized into a coordinated network, called MPL-Net, by the Cloud and Aerosol Lidar Group at NASA Goddard Space Flight Center (GSFC) using funding provided by the NASA Earth Observing System. In addition to the funding provided by NASA EOS, the NASA CERES Ground Validation Group supplied four MPL systems to the project, and the NASA TOMS group contributed their MPL for work at GSFC. The Atmospheric Radiation Measurement Program (ARM) also agreed to make their data available to the MPL-Net project for processing. In addition to the initial NASA and ARM operated sites, several other independent research groups have also expressed interest in joining the network using their own instruments. Finally, a limited amount of EOS funding was set aside to participate in various field experiments each year. The NASA Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) project also provides funds to deploy their MPL during ocean research cruises. All together, the MPL-Net project has participated in four major field experiments since 2000. Most MPL-Net sites and field experiment locations are also co-located with sunphotometers in the NASA Aerosol Robotic Network (AERONET). Therefore, at these locations data are collected on both aerosol and cloud vertical structure as well as column optical depth and sky radiance. Real-time data products are now available from most MPL-Net sites. Our real-time products are generated at times of AERONET aerosol optical depth (AOD) measurements. The AERONET AOD is used as input to our processing routines, which calculate the aerosol layer top height and extinction profile, and our MPL calibration value. A variety of other data products are available or under development. We present an overview of the MPL-Net project and discuss data products useful to the AERONET community. Results from several sites and field experiments will be presented.
Money for nothing? The net costs of medical training.
Barros, Pedro P; Machado, Sara R
2010-09-01
One of the stages of medical training is the residency programme. Hosting institutions often claim compensation for the training provided. How much should this compensation be? According to our results, given the benefits arising from having residents among the house staff, no transfer (either tuition fee or subsidy) should be set to compensate the hosting institution for providing medical training. This paper quantifies the net costs of medical training, defined as the training costs over and above the wage paid. We jointly consider two effects. On the one hand, residents take extra time and resources from both the hosting institution and the supervisor. On the other hand, residents can be regarded as a less expensive substitute for nurses and/or graduate physicians in the production of health care, both in primary care centres and hospitals. The net effect can be either positive or negative. We use the fact that residents, in Portugal, are centrally allocated to National Health Service hospitals to treat them as a fixed exogenous production factor. The data used come from Portuguese hospitals and primary care centres. Cost function estimates point to a small negative marginal impact of residents on hospitals' (-0.02%) and primary care centres' (-0.9%) costs. Nonetheless, there is a positive relation between size and cost for the very large hospitals and primary care centres. Our approach to estimation of residents' costs controls for other teaching activities hospitals might have (namely undergraduate Medical Schools). Overall, the net costs of medical training appear to be quite small.
SpaceNet: Modeling and Simulating Space Logistics
NASA Technical Reports Server (NTRS)
Lee, Gene; Jordan, Elizabeth; Shishko, Robert; de Weck, Olivier; Armar, Nii; Siddiqi, Afreen
2008-01-01
This paper summarizes the current state of the art in interplanetary supply chain modeling and discusses SpaceNet as one particular method and tool to address space logistics modeling and simulation challenges. Fundamental upgrades to the interplanetary supply chain framework such as process groups, nested elements, and cargo sharing, enabled SpaceNet to model an integrated set of missions as a campaign. The capabilities and uses of SpaceNet are demonstrated by a step-by-step modeling and simulation of a lunar campaign.
Coverability graphs for a class of synchronously executed unbounded Petri net
NASA Technical Reports Server (NTRS)
Stotts, P. David; Pratt, Terrence W.
1990-01-01
After detailing a variant of the concurrent-execution rule for firing of maximal subsets, in which the simultaneous firing of conflicting transitions is prohibited, an algorithm is constructed for generating the coverability graph of a net executed under this synchronous firing rule. The omega insertion criteria in the algorithm are shown to be valid for any net on which the algorithm terminates. It is accordingly shown that the set of nets on which the algorithm terminates includes the 'conflict-free' class.
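The omega-insertion mechanism that makes coverability graphs finite for unbounded nets can be sketched as below for the standard one-transition-at-a-time firing rule. The paper analyses a synchronous maximal-subset firing rule instead, so this sketch only illustrates the classic acceleration step the algorithm builds on, not the paper's variant.

```python
# Sketch of coverability-tree construction with omega insertion for a Petri net
# under the standard interleaving firing rule (illustration only; the paper's
# variant fires maximal non-conflicting subsets of transitions synchronously).
OMEGA = float("inf")

def fire(marking, pre, post):
    """Return the successor marking, or None if the transition is not enabled."""
    if any(marking[p] < pre[p] for p in range(len(marking))):
        return None
    return tuple(m if m == OMEGA else m - pre[p] + post[p] for p, m in enumerate(marking))

def accelerate(marking, ancestors):
    """Insert OMEGA in every place that strictly grows relative to some covered ancestor."""
    m = list(marking)
    for anc in ancestors:
        if anc != marking and all(a <= b for a, b in zip(anc, marking)):
            for p, (a, b) in enumerate(zip(anc, marking)):
                if b > a:
                    m[p] = OMEGA
    return tuple(m)

def coverability(initial, transitions):
    """transitions: list of (pre, post) vectors.  Returns the set of coverability nodes."""
    nodes, work = set(), [(initial, ())]          # () = chain of ancestor markings on the path
    while work:
        marking, ancestors = work.pop()
        if marking in nodes:
            continue
        nodes.add(marking)
        for pre, post in transitions:
            succ = fire(marking, pre, post)
            if succ is not None:
                succ = accelerate(succ, ancestors + (marking,))
                work.append((succ, ancestors + (marking,)))
    return nodes

# Example: an unbounded net where the single transition keeps pumping tokens into p2.
print(coverability((1, 0), [((1, 0), (1, 1))]))   # {(1, 0), (1, OMEGA)}
```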
Database architectures for Space Telescope Science Institute
NASA Astrophysics Data System (ADS)
Lubow, Stephen
1993-08-01
At STScI nearly all large applications require database support. A general purpose architecture has been developed and is in use that relies upon an extended client-server paradigm. Processing is in general distributed across three processes, each of which generally resides on its own processor. Database queries are evaluated on one such process, called the DBMS server. The DBMS server software is provided by a database vendor. The application issues database queries and is called the application client. This client uses a set of generic DBMS application programming calls through our STDB/NET programming interface. Intermediate between the application client and the DBMS server is the STDB/NET server. This server accepts generic query requests from the application and converts them into the specific requirements of the DBMS server. In addition, it accepts query results from the DBMS server and passes them back to the application. Typically the STDB/NET server is local to the DBMS server, while the application client may be remote. The STDB/NET server provides additional capabilities such as database deadlock restart and performance monitoring. This architecture is currently in use for some major STScI applications, including the ground support system. We are currently investigating means of providing ad hoc query support to users through the above architecture. Such support is critical for providing flexible user interface capabilities. The Universal Relation advocated by Ullman, Kernighan, and others appears to be promising. In this approach, the user sees the entire database as a single table, thereby freeing the user from needing to understand the detailed schema. A software layer provides the translation between the user and detailed schema views of the database. However, many subtle issues arise in making this transformation. We are currently exploring this scheme for use in the Hubble Space Telescope user interface to the data archive system (DADS).
26 CFR 1.1402(a)-3 - Special rules for computing net earnings from self-employment.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 26 Internal Revenue 12 2010-04-01 2010-04-01 false Special rules for computing net earnings from....1402(a)-3 Special rules for computing net earnings from self-employment. For the purpose of computing... by a partnership of which he is a member shall be computed in accordance with the special rules set...
Secure Peer-to-Peer Networks for Scientific Information Sharing
NASA Technical Reports Server (NTRS)
Karimabadi, Homa
2012-01-01
The most common means of remote scientific collaboration today includes the trio of e-mail for electronic communication, FTP for file sharing, and personalized Web sites for dissemination of papers and research results. With the growth of broadband Internet, there has been a desire to share large files (e.g., movies and scientific data files) over the Internet. Email has limits on the size of files that can be attached and transmitted. FTP is often used to share large files, but this requires the user to set up an FTP site for which it is hard to set group privileges, it is not straightforward for everyone, and the content is not searchable. Peer-to-peer technology (P2P), which has been overwhelmingly successful in popular content distribution, is the basis for development of a scientific collaboratory called Scientific Peer Network (SciPerNet). This technology combines social networking with P2P file sharing. SciPerNet will be a standalone application, written in Java and Swing, thus ensuring portability to a number of different platforms. Some of the features include user authentication, search capability, seamless integration with a data center, the ability to create groups and social networks, and on-line chat. In contrast to P2P networks such as Gnutella, BitTorrent, and others, SciPerNet incorporates three design elements that are critical to the application of P2P for scientific purposes: user authentication, data integrity validation, and reliable searching. SciPerNet also provides a complementary solution to virtual observatories by enabling distributed collaboration and sharing of downloaded and/or processed data among scientists. This will, in turn, increase scientific returns from NASA missions. As such, SciPerNet can serve a two-fold purpose for NASA: cost-saving software as well as a productivity tool for scientists working with data from NASA missions.
Seafood Safety and Quality: The Consumer’s Role
Hicks, Doris T.
2016-01-01
All the good news about seafood—the health and nutritional benefits, the wide varieties and flavors—has had a positive effect on consumption: people are eating more seafood (http://www.seagrant.sunysb.edu/seafood/pdfs/SeafoodSavvy.pdf). Yet consumers want to be assured that seafood is as safe as, or safer to eat than, other foods. When you hear “seafood safety”, think of a safety net designed to protect you, the consumer, from food-borne illness. Every facet of the seafood industry, from harvester to consumer, plays a role in holding up the safety net. The role of state and federal agencies, fishermen, aquaculturists, retailers, processors, restaurants, and scientists is to provide, update, and carry out the necessary handling, processing, and inspection procedures to give consumers the safest seafood possible. The consumer’s responsibility is to follow through with proper handling techniques, from purchase to preparation. It doesn’t matter how many regulations and inspection procedures are set up; the final edge of the safety net is held by the consumer. This article will give you the information you need to educate yourself and be assured that the fish and shellfish you consume are safe. The most common food-borne illnesses are caused by a combination of bacteria naturally present in our environment and food handling errors made in commercial settings, food service institutions, or at home. PMID:28231165
The CAFE model: A net production model for global ocean phytoplankton
NASA Astrophysics Data System (ADS)
Silsbe, Greg M.; Behrenfeld, Michael J.; Halsey, Kimberly H.; Milligan, Allen J.; Westberry, Toby K.
2016-12-01
The Carbon, Absorption, and Fluorescence Euphotic-resolving (CAFE) net primary production model is an adaptable framework for advancing global ocean productivity assessments by exploiting state-of-the-art satellite ocean color analyses and addressing key physiological and ecological attributes of phytoplankton. Here we present the first implementation of the CAFE model that incorporates inherent optical properties derived from ocean color measurements into a mechanistic and accurate model of phytoplankton growth rates (μ) and net phytoplankton production (NPP). The CAFE model calculates NPP as the product of energy absorption (QPAR) and the efficiency (ϕμ) by which absorbed energy is converted into carbon biomass (CPhyto), while μ is calculated as NPP normalized to CPhyto. The CAFE model performance is evaluated alongside 21 other NPP models against a spatially robust and globally representative set of direct NPP measurements. This analysis demonstrates that the CAFE model explains the greatest amount of variance and has the lowest model bias relative to other NPP models analyzed with this data set. Global oceanic NPP from the CAFE model (52 Pg C yr-1) and mean division rates (0.34 day-1) are derived from climatological satellite data (2002-2014). This manuscript discusses and validates individual CAFE model parameters (e.g., QPAR and ϕμ), provides detailed sensitivity analyses, and compares the CAFE model results and parameterization to other widely cited models.
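Restating the definitions given in the abstract (no additional model detail is implied), the core CAFE quantities are related by:

```latex
\mathrm{NPP} = Q_{\mathrm{PAR}} \times \phi_{\mu},
\qquad
\mu = \frac{\mathrm{NPP}}{C_{\mathrm{Phyto}}}
```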
Software reuse issues affecting AdaNET
NASA Technical Reports Server (NTRS)
Mcbride, John G.
1989-01-01
The AdaNet program is reviewing its long-term goals and strategies. A significant concern is whether current AdaNet plans adequately address the major strategic issues of software reuse technology. The major reuse issues involved in providing AdaNet services, which should be addressed as part of future AdaNet development, are identified and reviewed. Before significant development proceeds, a plan should be developed to resolve the aforementioned issues. This plan should also specify a detailed approach to develop AdaNet. A three-phased strategy is recommended. The first phase would consist of requirements analysis and produce an AdaNet system requirements specification. It would consider the requirements of AdaNet in terms of mission needs, commercial realities, and administrative policies affecting development, and the experience of AdaNet and other projects promoting the transfer of software engineering technology. Specifically, requirements analysis would be performed to better understand the requirements for AdaNet functions. The second phase would provide a detailed design of the system. The AdaNet should be designed with emphasis on the use of existing technology readily available to the AdaNet program. A number of reuse products are available upon which AdaNet could be based. This would significantly reduce the risk and cost of providing an AdaNet system. Once a design was developed, implementation would proceed in the third phase.
Cloud-Based Perception and Control of Sensor Nets and Robot Swarms
2016-04-01
…distributed stream processing framework provides the necessary API and infrastructure to develop and execute such applications in a cluster of computation… streaming DDDAS applications based on challenges they present to the backend Cloud control system. [Figure 2: Parallel SLAM Application] …state-of-the-art deep learning-based object detectors can recognize among hundreds of object classes, and this capability would be very useful for mobile…
NASA Astrophysics Data System (ADS)
Hayashida, T.; Tajima, F.
2007-12-01
The Real-time Earthquake Information System (REIS, Horiuchi et al., 2005) detects earthquakes and determines event parameters using the Hi-net (High-sensitivity seismograph network Japan) data in Japan. The system also predicts the arrival time and seismic intensity at a given site before ground motions arrive. Here, the seismic intensity is estimated based on the intensity magnitude, which is derived from Hi-net data. As the Hi-net stations are located in boreholes, intensity estimation on the ground surface is evaluated using a constant for subsurface amplification. But the estimated intensities based on the conventionally used amplification constants are not always in agreement with those observed at specific sites on the ground surface. The KiK-net (KIBAN Kyoshin network Japan) consists of strong motion instruments. Each station has two sets of accelerometers: one set is installed on the ground surface and the other is co-located with a Hi-net station in the borehole. We use data recorded at the KiK-net stations to calibrate subsurface site amplification factors between the borehole and the ground surface. We selected data recorded for over 200 events during the period of 1997 to 2006 in Hiroshima prefecture and calculated the ratios of peak velocity amplitudes on the ground surface (Asurf) to those in the borehole (Abor). The subsurface amplification varies from station to station, showing dependency on the propagation distance as well as on the incident direction of seismic waves. Results suggest that the site amplification factors should be described as a function of distance and incident direction, and are not constants. Thus, we derived empirical amplification formulas between Asurf and the peak velocity amplitudes on the engineering bedrock (Abed) as a function of distance in place of the conventionally used amplification constants. Here, the engineering bedrock is defined as the depth where the S-wave velocity is 600 m/s. The estimated intensities show substantial improvement in accuracy at most stations as compared with those calculated using conventional constants. When the amplification dependence on the incident direction was accounted for, the estimated intensities improved somewhat. This calibration will help an earthquake early warning system such as REIS provide more accurate intensity estimates.
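As a rough illustration of the calibration step described above, the sketch below computes Asurf/Abor ratios and fits them as a simple function of distance. The log-linear form, the example numbers, and the function names are assumptions for illustration only; they are not the study's empirical formulas, and the dependence on incident direction is not modelled here.

```python
# Illustrative sketch only: compute surface/borehole peak-velocity ratios and
# fit them as a simple function of distance. The actual functional form used
# in the study is not reproduced; a log-linear fit is purely an example.
import numpy as np

def amplification_ratios(peak_surface, peak_borehole):
    """Element-wise Asurf/Abor for co-located KiK-net style records."""
    return np.asarray(peak_surface) / np.asarray(peak_borehole)

def fit_log_linear(distance_km, ratios):
    """Fit log10(ratio) = a + b*log10(distance); returns (a, b)."""
    x = np.log10(distance_km)
    y = np.log10(ratios)
    b, a = np.polyfit(x, y, 1)   # polyfit returns [slope, intercept]
    return a, b

# Hypothetical example values
dist  = np.array([20.0, 50.0, 80.0, 120.0])   # km
asurf = np.array([4.1, 2.9, 2.2, 1.8])        # peak velocity, surface
abor  = np.array([1.0, 1.0, 1.0, 1.0])        # peak velocity, borehole
a, b = fit_log_linear(dist, amplification_ratios(asurf, abor))
print(f"log10(A) ~ {a:.2f} + {b:.2f} * log10(d)")
```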
ERIC Educational Resources Information Center
Weber Guisan, Saskia; Voit, Janine; Lengauer, Sonja; Proinger, Eva; Duvekot, Ruud; Aagaard, Kirsten
2014-01-01
The present publication is one of the outcomes of the OBSERVAL-NET project (follow-up of the OBSERVAL project). The main aim of OBSERVAL-NET was to set up a stakeholder-centric network of organisations supporting the validation of non-formal and informal learning in Europe based on the formation of national working groups in the 8 participating…
An application of deep learning in the analysis of stellar spectra
NASA Astrophysics Data System (ADS)
Fabbro, S.; Venn, K. A.; O'Briain, T.; Bialek, S.; Kielty, C. L.; Jahandar, F.; Monty, S.
2018-04-01
Spectroscopic surveys require fast and efficient analysis methods to maximize their scientific impact. Here, we apply a deep neural network architecture to analyse both SDSS-III APOGEE DR13 and synthetic stellar spectra. When our convolutional neural network model (StarNet) is trained on APOGEE spectra, we show that the stellar parameters (temperature, gravity, and metallicity) are determined with similar precision and accuracy as the APOGEE pipeline. StarNet can also predict stellar parameters when trained on synthetic data, with excellent precision and accuracy for both APOGEE data and synthetic data, over a wide range of signal-to-noise ratios. In addition, the statistical uncertainties in the stellar parameter determinations are comparable to the differences between the APOGEE pipeline results and those determined independently from optical spectra. We compare StarNet to other data-driven methods; for example, StarNet and the Cannon 2 show similar behaviour when trained with the same data sets; however, StarNet performs poorly on small training sets like those used by the original Cannon. The influence of the spectral features on the stellar parameters is examined via partial derivatives of the StarNet model results with respect to the input spectra. While StarNet was developed using the APOGEE observed spectra and corresponding ASSET synthetic data, we suggest that this technique is applicable to other wavelength ranges and other spectral surveys.
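For readers unfamiliar with this class of model, the following is a generic sketch of a 1D convolutional regressor that maps a spectrum to three stellar labels. It is not the published StarNet architecture; the spectrum length (roughly APOGEE-like), layer sizes, and training details are illustrative assumptions.

```python
# Generic sketch of a 1D convolutional regressor mapping a stellar spectrum to
# three labels (Teff, log g, [Fe/H]). NOT the published StarNet architecture;
# all sizes below are illustrative assumptions.
import tensorflow as tf

n_pixels = 7214  # assumed spectrum length

model = tf.keras.Sequential([
    tf.keras.layers.Conv1D(16, kernel_size=8, activation="relu",
                           input_shape=(n_pixels, 1)),
    tf.keras.layers.Conv1D(32, kernel_size=8, activation="relu"),
    tf.keras.layers.MaxPooling1D(pool_size=4),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(3),  # Teff, log g, [Fe/H]
])
model.compile(optimizer="adam", loss="mse")
model.summary()
# Training would then use spectra with known labels, e.g.:
# model.fit(train_spectra[..., None], train_labels, epochs=10, validation_split=0.1)
```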
Hoddinott, Pat; Thomson, Gill; Morgan, Heather; Crossland, Nicola; MacLennan, Graeme; Dykes, Fiona; Stewart, Fiona; Bauld, Linda; Campbell, Marion K
2015-01-01
Objective: To explore the acceptability, mechanisms and consequences of provider incentives for smoking cessation and breast feeding as part of the Benefits of Incentives for Breastfeeding and Smoking cessation in pregnancy (BIBS) study. Design: Cross-sectional survey and qualitative interviews. Setting: Scotland and North West England. Participants: Early years professionals: 497 survey respondents included 156 doctors; 197 health visitors/maternity staff; 144 other health staff. Qualitative interviews or focus groups were conducted with 68 pregnant/postnatal women/family members; 32 service providers; 22 experts/decision-makers; 63 conference attendees. Methods: Early years professionals were surveyed via email about the acceptability of payments to local health services for reaching smoking cessation in pregnancy and breastfeeding targets. Agreement was measured on a 5-point scale using multivariable ordered logit models. A framework approach was used to analyse free-text survey responses and qualitative data. Results: Health professional net agreement for provider incentives for smoking cessation targets was 52.9% (263/497); net disagreement was 28.6% (142/497). Health visitors/maternity staff were more likely than doctors to agree: OR 2.35 (95% CI 1.51 to 3.64; p<0.001). Net agreement for provider incentives for breastfeeding targets was 44.1% (219/497) and net disagreement was 38.6% (192/497). Agreement was more likely for women (compared with men): OR 1.81 (1.09 to 3.00; p=0.023) and health visitors/maternity staff (compared with doctors): OR 2.54 (95% CI 1.65 to 3.91; p<0.001). Key emergent themes were ‘moral tensions around acceptability’, ‘need for incentives’, ‘goals’, ‘collective or divisive action’ and ‘monitoring and proof’. While provider incentives can focus action and resources, tensions around the impact on relationships raised concerns. Pressure, burden of proof, gaming, box-ticking bureaucracies and health inequalities were counterbalances to potential benefits. Conclusions: Provider incentives are favoured by non-medical staff. Solutions which increase trust and collaboration towards shared goals, without negatively impacting on relationships or increasing bureaucracy, are required. PMID:26567253
Decision net, directed graph, and neural net processing of imaging spectrometer data
NASA Technical Reports Server (NTRS)
Casasent, David; Liu, Shiaw-Dong; Yoneyama, Hideyuki; Barnard, Etienne
1989-01-01
A decision-net solution involving a novel hierarchical classifier and a set of multiple directed graphs, as well as a neural-net solution, are respectively presented for large-class problem and mixture problem treatments of imaging spectrometer data. The clustering method for hierarchical classifier design, when used with multiple directed graphs, yields an efficient decision net. New directed-graph rules for reducing local maxima as well as the number of perturbations required, and the new starting-node rules for extending the reachability and reducing the search time of the graphs, are noted to yield superior results, as indicated by an illustrative 500-class imaging spectrometer problem.
A Standardized Reference Data Set for Vertebrate Taxon Name Resolution
Zermoglio, Paula F.; Guralnick, Robert P.; Wieczorek, John R.
2016-01-01
Taxonomic names associated with digitized biocollections labels have flooded into repositories such as GBIF, iDigBio and VertNet. The names on these labels are often misspelled, out of date, or present other problems, as they were often captured only once during accessioning of specimens, or have a history of label changes without clear provenance. Before records are reliably usable in research, it is critical that these issues be addressed. However, still missing is an assessment of the scope of the problem, the effort needed to solve it, and a way to improve effectiveness of tools developed to aid the process. We present a carefully human-vetted analysis of 1000 verbatim scientific names taken at random from those published via the data aggregator VertNet, providing the first rigorously reviewed, reference validation data set. In addition to characterizing formatting problems, human vetting focused on detecting misspelling, synonymy, and the incorrect use of Darwin Core. Our results reveal a sobering view of the challenge ahead, as less than 47% of name strings were found to be currently valid. More optimistically, nearly 97% of name combinations could be resolved to a currently valid name, suggesting that computer-aided approaches may provide feasible means to improve digitized content. Finally, we associated names back to biocollections records and fit logistic models to test potential drivers of issues. A set of candidate variables (geographic region, year collected, higher-level clade, and the institutional digitally accessible data volume) and their 2-way interactions all predict the probability of records having taxon name issues, based on model selection approaches. We strongly encourage further experiments to use this reference data set as a means to compare automated or computer-aided taxon name tools for their ability to resolve and improve the existing wealth of legacy data. PMID:26760296
Barriguete-Meléndez, Jorge Armando; Hercberg, Serge; Galán, Pilar; Parodi, André; Baulieux, Jacques
2018-01-01
NutriNet-Salud Mexico is a digital health information system and e-epidemiology instrument, online, open and free, for recording and analysing the determinants of dietary habits and nutritional status of the Mexican population, for the prevention of overweight, obesity and noncommunicable diseases over the period 2018-2028. We describe the design, development and implementation of NutriNet-Salud Mexico from the French model NutriNet-Santé France 2008-2018. The NutriNet-Salud Mexico platform is the basis for the development of a health information system for a prospective cohort study, scheduled for a period of 10 years (2018-2028), with a dedicated website, and its development will make it possible to have multiple study populations within an initial set of five self-applicable questionnaires validated in the Mexican population. The information will make it possible to develop applied research, to learn about and monitor the dietary contributions and nutritional status of the population, to assess the impact of public health actions on feeding behavior and nutritional status, and to compare populations between countries (Mexico, France, Belgium and Switzerland) and between national institutes, universities and states. NutriNet-Salud Mexico will provide information to assist research and public action, especially to guide public policies on nutrition in Mexico. The scientific elements will make it possible to issue appropriate nutritional recommendations to different populations and to access a representative nominal population sample at low cost, in real time, and with a dual approach to e-epidemiology: a cohort study to identify causality and cross-sectional studies (descriptive research, monitoring and evaluation). Copyright: © 2018 Permanyer.
NASA Astrophysics Data System (ADS)
Tang, Hao; Hu, Fuxiang; Xu, Liuxiong; Dong, Shuchuang; Zhou, Cheng; Wang, Xuefang
2017-10-01
Knotless polyethylene (PE) netting has been widely used in aquaculture cages and fishing gears, especially in Japan. In this study, the hydrodynamic coefficients of six knotless PE netting panels with different solidity ratios were assessed in a flume tank under various attack angles of netting from 0° (parallel to flow) to 90° (perpendicular to flow) and current speeds from 40 cm s-1 to 130 cm s-1. It was found that the drag coefficient was related to Reynolds number, solidity ratio and attack angle of netting. The solidity ratio was positively related with drag coefficient for netting panels perpendicular to flow, whereas for netting panels set parallel to the flow the opposite result was obtained. For netting panels placed at an angle to the flow, the lift coefficient reached its maximum at an attack angle of 50° and then decreased as the attack angle further increased. The solidity ratio had a dual influence on the drag coefficient of inclined netting panels. Compared to results in the literature, the normal drag coefficient of knotless PE netting measured in this study is larger than that of nylon netting or Dyneema netting.
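For context, the drag coefficients discussed above enter the standard drag relation F_D = ½ ρ C_d A U²; the short sketch below evaluates it for hypothetical values (the coefficient, panel area, and current speed are not results from this study).

```python
# Standard drag-force relation used to interpret netting drag coefficients;
# the coefficient value, outline area and speed below are hypothetical.
RHO_SEAWATER = 1025.0  # kg/m^3, assumed

def drag_force(cd, area_m2, speed_m_s, rho=RHO_SEAWATER):
    """F_D = 0.5 * rho * Cd * A * U^2, in newtons."""
    return 0.5 * rho * cd * area_m2 * speed_m_s ** 2

# e.g. a 2 m^2 netting panel, Cd = 1.2, current of 0.8 m/s (~80 cm/s)
print(f"{drag_force(1.2, 2.0, 0.8):.1f} N")   # ~787 N
```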
Bioinformatics research in the Asia Pacific: a 2007 update.
Ranganathan, Shoba; Gribskov, Michael; Tan, Tin Wee
2008-01-01
We provide a 2007 update on the bioinformatics research in the Asia-Pacific from the Asia Pacific Bioinformatics Network (APBioNet), Asia's oldest bioinformatics organisation set up in 1998. From 2002, APBioNet has organized the first International Conference on Bioinformatics (InCoB) bringing together scientists working in the field of bioinformatics in the region. This year, the InCoB2007 Conference was organized as the 6th annual conference of the Asia-Pacific Bioinformatics Network, on Aug. 27-30, 2007 at Hong Kong, following a series of successful events in Bangkok (Thailand), Penang (Malaysia), Auckland (New Zealand), Busan (South Korea) and New Delhi (India). Besides a scientific meeting at Hong Kong, satellite events organized are a pre-conference training workshop at Hanoi, Vietnam and a post-conference workshop at Nansha, China. This Introduction provides a brief overview of the peer-reviewed manuscripts accepted for publication in this Supplement. We have organized the papers into thematic areas, highlighting the growing contribution of research excellence from this region, to global bioinformatics endeavours.
Fermi arc mediated entropy transport in topological semimetals
NASA Astrophysics Data System (ADS)
McCormick, Timothy M.; Watzman, Sarah J.; Heremans, Joseph P.; Trivedi, Nandini
2018-05-01
The low-energy excitations of topological Weyl semimetals are composed of linearly dispersing Weyl fermions that act as monopoles of Berry curvature in the bulk momentum space. Furthermore, on the surface there exist topologically protected Fermi arcs at the projections of these Weyl points. We propose a pathway for entropy transport involving Fermi arcs on one surface connecting to Fermi arcs on the other surface via the bulk Weyl monopoles. We present results for the temperature and magnetic field dependence of the magnetothermal conductance of this conveyor belt channel. The circulating currents result in a net entropy transport without any net charge transport. We provide results for the Fermi arc mediated magnetothermal conductivity in the low-field semiclassical limit as well as in the high-field ultraquantum limit, where only chiral Landau levels are involved. Our work provides a proposed signature of Fermi arc mediated magnetothermal transport and sets the stage for utilizing and manipulating the topological Fermi arcs in thermal applications.
Physics of volleyball: Spiking with a purpose
NASA Astrophysics Data System (ADS)
Behroozi, F.
1998-05-01
A few weeks ago our volleyball coach telephoned me with a problem: How high should a player jump to "spike" a "set" ball so it would clear the net and land at a known distance on the other side of the net?
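A drag-free idealization of this question, in which the ball is hit horizontally, has a closed-form answer; the sketch below works it out. The net height, attack-line distance, and landing distance used are assumptions, and the article's full treatment is not reproduced here.

```python
# Simplified, drag-free sketch of the spiking problem: the ball is hit
# horizontally from height h, must clear a net of height H at horizontal
# distance d_net, and land a total distance L from the hitter.
# With y(x) = h - g*x^2/(2*v^2) and y(L) = 0, clearing the net requires
#     h >= H / (1 - (d_net/L)**2).
# This is a textbook idealization, not the article's analysis.
import math

G = 9.81  # m/s^2

def required_contact_height(net_height, d_net, d_land):
    """Minimum height at which the horizontally hit ball must be contacted."""
    return net_height / (1.0 - (d_net / d_land) ** 2)

def required_speed(contact_height, d_land):
    """Horizontal speed needed to land at d_land from the contact height."""
    return d_land * math.sqrt(G / (2.0 * contact_height))

# Assumed numbers: 2.43 m net, hitter 3 m behind the net, ball lands 7 m away
h = required_contact_height(net_height=2.43, d_net=3.0, d_land=7.0)
print(f"contact height >= {h:.2f} m, speed ~ {required_speed(h, 7.0):.1f} m/s")
```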
2011-01-01
Background: Network inference methods reconstruct mathematical models of molecular or genetic networks directly from experimental data sets. We have previously reported a mathematical method which is exclusively data-driven, does not involve any heuristic decisions within the reconstruction process, and delivers all possible alternative minimal networks in terms of simple place/transition Petri nets that are consistent with a given discrete time series data set. Results: We fundamentally extended the previously published algorithm to consider catalysis and inhibition of the reactions that occur in the underlying network. The results of the reconstruction algorithm are encoded in the form of an extended Petri net involving control arcs. This allows the consideration of processes involving mass flow and/or regulatory interactions. As a non-trivial test case, the phosphate regulatory network of enterobacteria was reconstructed using in silico-generated time-series data sets on wild-type and in silico mutants. Conclusions: The new exact algorithm reconstructs extended Petri nets from time series data sets by finding all alternative minimal networks that are consistent with the data. It suggested alternative molecular mechanisms for certain reactions in the network. The algorithm is useful to combine data from wild-type and mutant cells and may potentially integrate physiological, biochemical, pharmacological, and genetic data in the form of a single model. PMID:21762503
SeaDataNet: Pan-European infrastructure for ocean and marine data management
NASA Astrophysics Data System (ADS)
Fichaut, M.; Schaap, D.; Maudire, G.; Manzella, G. M. R.
2012-04-01
The overall objective of the SeaDataNet project is to upgrade the present SeaDataNet infrastructure into an operationally robust and state-of-the-art Pan-European infrastructure for providing up-to-date and high quality access to ocean and marine metadata, data and data products originating from data acquisition activities by all engaged coastal states, by setting, adopting and promoting common data management standards and by realising technical and semantic interoperability with other relevant data management systems and initiatives on behalf of science, environmental management, policy making, and economy. SeaDataNet is undertaken by the National Oceanographic Data Centres (NODCs), and marine information services of major research institutes, from 31 coastal states bordering the European seas, and also includes Satellite Data Centres, expert modelling centres and the international organisations IOC, ICES and EU-JRC in its network. Its 40 data centres are highly skilled and have been actively engaged in data management for many years and have the essential capabilities and facilities for data quality control, long term stewardship, retrieval and distribution. SeaDataNet undertakes activities to achieve data access and data products services that meet requirements of end-users and intermediate user communities, such as GMES Marine Core Services (e.g. MyOcean), establishing SeaDataNet as the core data management component of the EMODNet infrastructure and contributing on behalf of Europe to global portal initiatives, such as the IOC/IODE - Ocean Data Portal (ODP), and GEOSS. Moreover it aims to achieve INSPIRE compliance and to contribute to the INSPIRE process for developing implementing rules for oceanography. • As part of the SeaDataNet upgrading and capacity building, training courses will be organised aiming at data managers and technicians at the data centres. For the data managers it is important that they learn to work with the upgraded common SeaDataNet formats and procedures and software tools for preparing and updating metadata, processing and quality control of data, and presentation of data in viewing services, and for production of data products. • SeaDataNet maintains and operates several discovery services with overviews of marine organisations in Europe and their engagement in marine research projects, managing large datasets, and data acquisition by research vessels and monitoring programmes for the European seas and global oceans: o European Directory of Marine Environmental Data (EDMED) (at present > 4300 entries from more than 600 data holding centres in Europe) is a comprehensive reference to the marine data and sample collections held within Europe providing marine scientists, engineers and policy makers with a simple discovery mechanism. It covers all marine environmental disciplines. This needs regular maintenance. o European Directory of Marine Environmental Research Projects (EDMERP) (at present > 2200 entries from more than 300 organisations in Europe) gives an overview of research projects relating to the marine environment that are relevant in the context of data sets and data acquisition activities (cruises, in situ monitoring networks, ...) that are covered in SeaDataNet. This needs regular updating, following activities by dataholding institutes for preparing metadata references for EDMED, EDIOS, CSR and CDI.
o Cruise Summary Reports (CSR) directory (at present > 43000 entries) provides a coarse-grained inventory for tracking oceanographic data collected by research vessels. o European Directory of Oceanographic Observing Systems (EDIOS) (at present > 10000 entries) is an initiative of EuroGOOS and gives an overview of the ocean measuring and monitoring systems operated by European countries. • European Directory of Marine Organisations (EDMO) (at present > 2000 entries) contains the contact information and activity profiles for the organisations whose data and activities are described by the discovery services. • Common Vocabularies (at present > 120000 terms in > 100 lists), covering a broad spectrum of ocean and marine disciplines. The common terms are used to mark up metadata, data and data products in a consistent and coherent way. Governance is regulated by an international board. • Common Data Index (CDI) data discovery and access service: SeaDataNet provides online unified access to distributed datasets via its portal website to the vast resources of marine and ocean datasets, managed by all the connected distributed data centres. The Common Data Index (CDI) service is the key Discovery and Delivery service. It enables users to have a detailed insight of the availability and geographical distribution of marine data, archived at the connected data centres, and it provides the means for downloading datasets in common formats via a transaction mechanism.
Sub-Audible Speech Recognition Based upon Electromyographic Signals
NASA Technical Reports Server (NTRS)
Jorgensen, Charles C. (Inventor); Agabon, Shane T. (Inventor); Lee, Diana D. (Inventor)
2012-01-01
Method and system for processing and identifying a sub-audible signal formed by a source of sub-audible sounds. Sequences of samples of sub-audible sound patterns ("SASPs") for known words/phrases in a selected database are received for overlapping time intervals, and Signal Processing Transforms ("SPTs") are formed for each sample, as part of a matrix of entry values. The matrix is decomposed into contiguous, non-overlapping two-dimensional cells of entries, and neural net analysis is applied to estimate reference sets of weight coefficients that provide sums with optimal matches to reference sets of values. The reference sets of weight coefficients are used to determine a correspondence between a new (unknown) word/phrase and a word/phrase in the database.
Influence of throat configuration and fish density on escapement of channel catfish from hoop nets
Porath, Mark T.; Pape, Larry D.; Richters, Lindsey K.
2011-01-01
In recent years, several state agencies have adopted the use of baited, tandem-set hoop nets to assess lentic channel catfish Ictalurus punctatus populations. Some level of escapement from the net is expected because an opening exists in each throat of the net, although factors influencing rates of escapement from hoop nets have not been quantified. We conducted experiments to quantify rates of escapement and to determine the influence of throat configuration and fish density within the net on escapement rates. An initial experiment to determine the rate of escapement from each net compartment utilized individually tagged channel catfish placed within the entrance (between the two throats) and cod (within the second throat) compartments of a single hoop net for overnight sets. From this experiment, the mean rate (±SE) of channel catfish escaping was 4.2% (±1.5) from the cod compartment (whose throat was additionally restricted relative to the traditionally manufactured product) and 74% (±4.2) from the entrance compartment. In a subsequent experiment, channel catfish were placed only in the cod compartment with different throat configurations (restricted or unrestricted) and at two densities (low [6 fish per net] and high [60 fish per net]) for overnight sets to determine the influence of fish density and throat configuration on escapement rates. Escapement rates between throat configurations were doubled at low fish density (13.3 ± 5.4% restricted versus 26.7 ± 5.6% unrestricted) and tripled at high fish density (14.3 ± 4.9% restricted versus 51.9 ± 5.0% unrestricted). These results suggest that retention efficiency is high for cod compartments with restricted throat entrances. However, managers and researchers need to be aware that modification to the cod throats (restrictions) is needed for hoop nets ordered from manufacturers. Managers need to be consistent in their use and reporting of cod-end throat configurations when using this gear.
Bait type influences on catch and bycatch in tandem hoop nets set in reservoirs
Long, James M.; Stewart, David R.; Shiflet, Jeremy; Balsman, Dane; Shoup, Daniel E.
2017-01-01
Tandem hoop nets have become the primary gear for sampling channel catfish Ictalurus punctatus, but suffer from high incidences of bycatch, particularly aquatic turtles that usually drown as a result. We sought to determine if bait type, ZOTE© soap and ground cheese logs, would influence catch of channel catfish (CPUE and mean TL) and bycatch of fishes and aquatic turtles. We sampled with tandem hoop nets in 13 Kentucky reservoirs (5–73 ha) using a crossover design and two sampling events. We found no difference in channel catfish catch rates between bait types, but mean sizes of fish caught using ZOTE© soap were approximately 24 mm longer compared to cheese. Fish bycatch was similar between bait types, but tandem hoop nets baited with ZOTE© soap caught up to 61% fewer turtles and mortality of turtles that were captured was up to 12% lower than those baited with cheese. Depth of net set, water temperature, and Secchi depth were environmental factors measured that affected catch and bycatch, but varied among species. Using ZOTE© soap as bait in tandem hoop nets appears to be a fairly simple and straightforward method for maintaining high catch rates of channel catfish while minimizing turtle mortality.
Motivation: In recent years there have been several efforts to generate sensitivity profiles of collections of genomically characterized cell lines to panels of candidate therapeutic compounds. These data provide the basis for the development of in silico models of sensitivity based on cellular, genetic, or expression biomarkers of cancer cells. However, a remaining challenge is an efficient way to identify accurate sets of biomarkers to validate.
ECHO Services: Foundational Middleware for a Science Cyberinfrastructure
NASA Technical Reports Server (NTRS)
Burnett, Michael
2005-01-01
This viewgraph presentation describes ECHO, an interoperability middleware solution. It uses open, XML-based APIs, and supports net-centric architectures and solutions. ECHO has a set of interoperable registries for both data (metadata) and services, and provides user accounts and a common infrastructure for the registries. It is built upon a layered architecture with extensible infrastructure for supporting community-unique protocols. It has been operational since November 2002 and is available as open source.
NASA Astrophysics Data System (ADS)
Hassell, David; Gregory, Jonathan; Blower, Jon; Lawrence, Bryan N.; Taylor, Karl E.
2017-12-01
The CF (Climate and Forecast) metadata conventions are designed to promote the creation, processing, and sharing of climate and forecasting data using Network Common Data Form (netCDF) files and libraries. The CF conventions provide a description of the physical meaning of data and of their spatial and temporal properties, but they depend on the netCDF file encoding which can currently only be fully understood and interpreted by someone familiar with the rules and relationships specified in the conventions documentation. To aid in development of CF-compliant software and to capture with a minimal set of elements all of the information contained in the CF conventions, we propose a formal data model for CF which is independent of netCDF and describes all possible CF-compliant data. Because such data will often be analysed and visualised using software based on other data models, we compare our CF data model with the ISO 19123 coverage model, the Open Geospatial Consortium CF netCDF standard, and the Unidata Common Data Model. To demonstrate that this CF data model can in fact be implemented, we present cf-python, a Python software library that conforms to the model and can manipulate any CF-compliant dataset.
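As a minimal illustration of working with CF-compliant data through the cf-python library mentioned above (exact API details may differ between library versions, and the file name here is hypothetical):

```python
# Minimal sketch of inspecting a CF-compliant netCDF file with cf-python.
# "air_temperature.nc" is a hypothetical file name; cf.read() returns a list
# of field constructs expressed in the CF data model. Behaviour and method
# names may vary somewhat between cf-python versions.
import cf

fields = cf.read("air_temperature.nc")
for field in fields:
    print(field)   # summary of each field construct and its coordinates
```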
CrosstalkNet: A Visualization Tool for Differential Co-expression Networks and Communities.
Manem, Venkata; Adam, George Alexandru; Gruosso, Tina; Gigoux, Mathieu; Bertos, Nicholas; Park, Morag; Haibe-Kains, Benjamin
2018-04-15
Variations in physiological conditions can rewire molecular interactions between biological compartments, which can yield novel insights into gain or loss of interactions specific to perturbations of interest. Networks are a promising tool to elucidate intercellular interactions, yet exploration of these large-scale networks remains a challenge due to their high dimensionality. To retrieve and mine interactions, we developed CrosstalkNet, a user-friendly, web-based network visualization tool that provides a statistical framework to infer condition-specific interactions coupled with a community detection algorithm for bipartite graphs to identify significantly dense subnetworks. As a case study, we used CrosstalkNet to mine a set of 54 and 22 gene-expression profiles from breast tumor and normal samples, respectively, with epithelial and stromal compartments extracted via laser microdissection. We show how CrosstalkNet can be used to explore large-scale co-expression networks and to obtain insights into the biological processes that govern cross-talk between different tumor compartments. Significance: This web application enables researchers to mine complex networks and to decipher novel biological processes in tumor epithelial-stroma cross-talk as well as in other studies of intercompartmental interactions. Cancer Res; 78(8); 2140-3. ©2018 AACR.
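As a toy illustration of the kind of bipartite community detection described above, the sketch below builds a small epithelial-stromal co-expression graph and extracts densely connected groups with a generic modularity routine; it does not reproduce CrosstalkNet's statistical framework, and the gene names and edges are made up.

```python
# Toy bipartite co-expression graph and community detection. This uses
# networkx's generic greedy modularity routine, not CrosstalkNet's own
# algorithm; all node names and edges are hypothetical.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

G = nx.Graph()
edges = [("epi_GENE1", "str_GENEA"), ("epi_GENE1", "str_GENEB"),
         ("epi_GENE2", "str_GENEA"), ("epi_GENE3", "str_GENEC"),
         ("epi_GENE4", "str_GENEC"), ("epi_GENE4", "str_GENED")]
G.add_edges_from(edges)

for community in greedy_modularity_communities(G):
    print(sorted(community))   # each densely connected epithelial/stromal group
```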
Equivalent Treatment and Survival after Resection of Pancreatic Cancer at Safety-Net Hospitals.
Dhar, Vikrom K; Hoehn, Richard S; Kim, Young; Xia, Brent T; Jung, Andrew D; Hanseman, Dennis J; Ahmad, Syed A; Shah, Shimul A
2018-01-01
Due to disparities in access to care, patients with Medicaid or no health insurance are at risk of not receiving appropriate adjuvant treatment following resection of pancreatic cancer. We have previously shown inferior short-term outcomes following surgery at safety-net hospitals. Subsequently, we hypothesized that safety-net hospitals caring for these vulnerable populations utilize less adjuvant chemoradiation, resulting in inferior long-term outcomes. The American College of Surgeons National Cancer Data Base was queried for patients diagnosed with pancreatic adenocarcinoma (n = 32,296) from 1998 to 2010. Hospitals were grouped according to safety-net burden, defined as the proportion of patients with Medicaid or no insurance. The highest quartile, representing safety-net hospitals, was compared to lower-burden hospitals with regard to patient demographics, disease characteristics, surgical management, delivery of multimodal systemic therapy, and survival. Patients at safety-net hospitals were less often white, had lower income, and were less educated. Safety-net hospital patients were just as likely to undergo surgical resection (OR 1.03, p = 0.73), achieving similar rates of negative surgical margins when compared to patients at medium and low burden hospitals (70% vs. 73% vs. 66%). Thirty-day mortality rates were 5.6% for high burden hospitals, 5.2% for medium burden hospitals, and 4.3% for low burden hospitals. No clinically significant differences were noted in the proportion of surgical patients receiving either chemotherapy (48% vs. 52% vs. 52%) or radiation therapy (26% vs. 30% vs. 29%) or the time between diagnosis and start of systemic therapy (58 days vs. 61 days vs. 53 days). Across safety-net burden groups, no difference was noted in stage-specific median survival (all p > 0.05) or receipt of adjuvant as opposed to neoadjuvant systemic therapy (82% vs. 85% vs. 85%). Multivariate analysis adjusting for cancer stage revealed no difference in survival for safety-net hospital patients who had surgery and survived > 30 days (HR 1.02, p = 0.63). For patients surviving the perioperative period following pancreatic cancer surgery, safety-net hospitals achieve long-term survival outcomes equivalent to those at non-safety-net hospitals, potentially because they deliver multimodal therapy at equivalent rates. Safety-net hospitals are a crucial resource that provides quality long-term cancer treatment for vulnerable populations.
Lorenz, Lena M; Overgaard, Hans J; Massue, Dennis J; Mageni, Zawadi D; Bradley, John; Moore, Jason D; Mandike, Renata; Kramer, Karen; Kisinza, William; Moore, Sarah J
2014-12-13
Long-Lasting Insecticidal Nets (LLINs) are one of the major malaria vector control tools, with most countries adopting free or subsidised universal coverage campaigns of populations at-risk from malaria. It is essential to understand LLIN durability so that public health policy makers can select the most cost effective nets that last for the longest time, and estimate the optimal timing of repeated distribution campaigns. However, there is limited knowledge from few countries of the durability of LLINs under user conditions. This study investigates LLIN durability in eight districts of Tanzania, selected for their demographic, geographic and ecological representativeness of the country as a whole. We use a two-stage approach: First, LLINs from recent national net campaigns will be evaluated retrospectively in 3,420 households. Those households will receive one of three leading LLIN products at random (Olyset®, PermaNet®2.0 or Netprotect®) and will be followed up for three years in a prospective study to compare their performance under user conditions. LLIN durability will be evaluated by measuring Attrition (the rate at which nets are discarded by households), Bioefficacy (the insecticidal efficacy of the nets measured by knock-down and mortality of mosquitoes), Chemical content (g/kg of insecticide available in net fibres) and physical Degradation (size and location of holes). In addition, we will extend the current national mosquito insecticide Resistance monitoring program to additional districts and use these data sets to provide GIS maps for use in health surveillance and decision making by the National Malaria Control Program (NMCP). The data will be of importance to policy makers and vector control specialists both in Tanzania and the SSA region to inform best practice for the maintenance of high and cost-effective coverage and to maximise current health gains in malaria control.
High resolution printing of charge
Rogers, John; Park, Jang-Ung
2015-06-16
Provided are methods of printing a pattern of charge on a substrate surface, such as by electrohydrodynamic (e-jet) printing. The methods relate to providing a nozzle containing a printable fluid, providing a substrate having a substrate surface, and generating from the nozzle an ejected printable fluid containing net charge. The ejected printable fluid containing net charge is directed to the substrate surface, wherein the net charge does not substantially degrade and is retained on the substrate surface. Also provided are functional devices made by any of the disclosed methods.
State-based verification of RTCP-nets with nuXmv
NASA Astrophysics Data System (ADS)
Biernacka, Agnieszka; Biernacki, Jerzy; Szpyrka, Marcin
2015-12-01
The paper deals with an algorithm of translation of RTCP-nets' (real-time coloured Petri nets) coverability graphs into nuXmv state machines. The approach enables users to verify RTCP-nets with model checking techniques provided by the nuXmv tool. Full details of the algorithm are presented and an illustrative example of the approach usefulness is provided.
George, Sheba; Moran, Erin; Fish, Allison; Ogunyemi, Lola
2013-01-01
Differential access to everyday technology and healthcare amongst safety net patients is associated with low technological and health literacies, respectively. These low rates of literacy produce a complex patient "knowledge gap" that influences the effectiveness of telehealth technologies. To understand this "knowledge gap", six focus groups (2 African-American and 4 Latino) were conducted with patients who received teleretinal screenings in U.S. urban safety-net settings. Findings indicate that patients' "knowledge gap" is primarily produced at three points: (1) when patients' preexisting personal barriers to care became exacerbated in the clinical setting; (2) through encounters with technology during screening; and (3) in doctor-patient follow-up. This "knowledge gap" can produce confusion and fear, potentially affecting patients' confidence in quality of care and limiting their disease management ability. In rethinking the digital divide to include the consequences of this knowledge gap faced by patients in the clinical setting, we suggest that patient education focus on both their disease and specific telehealth technologies deployed in care delivery.
SeaDataNet Pan-European infrastructure for Ocean & Marine Data Management
NASA Astrophysics Data System (ADS)
Manzella, G. M.; Maillard, C.; Maudire, G.; Schaap, D.; Rickards, L.; Nast, F.; Balopoulos, E.; Mikhailov, N.; Vladymyrov, V.; Pissierssens, P.; Schlitzer, R.; Beckers, J. M.; Barale, V.
2007-12-01
SEADATANET is developing a Pan-European data management infrastructure to ensure access to a large number of marine environmental data (i.e. temperature, salinity, current, sea level, chemical, physical and biological properties), as well as their safeguarding and long-term archiving. Data are derived from many different sensors installed on board research vessels, satellites and the various platforms of the marine observing system. SeaDataNet provides information on real-time and archived marine environmental data collected at a pan-European level, through directories on marine environmental data and projects. SeaDataNet allows access to the most comprehensive multidisciplinary sets of marine in-situ and remote sensing data, from about 40 laboratories, through user-friendly tools. Data selection and access are operated through the Common Data Index (CDI), XML files compliant with ISO standards and unified dictionaries. Technical developments carried out by SeaDataNet include: A library of Standards - metadata standards, compliant with ISO 19115, for communication and interoperability between the data platforms. Software for an interoperable on-line system - interconnection of distributed data centres by interfacing adapted communication technology tools. Off-line Data Management software - software representing the minimum equipment of all the data centres, "Ocean Data View (ODV)", developed by AWI. Training, Education and Capacity Building - training 'on the job' is carried out by IOC-Unesco in Ostende. The SeaDataNet Virtual Educational Centre internet portal provides basic tools for informal education.
CentNet—A deployable 100-station network for surface exchange research
NASA Astrophysics Data System (ADS)
Oncley, S.; Horst, T. W.; Semmer, S.; Militzer, J.; Maclean, G.; Knudson, K.
2014-12-01
Climate, air quality, atmospheric composition, surface hydrology, and ecological processes are directly affected by the Earth's surface. Complexity of this surface exists at multiple spatial scales, which complicates the understanding of these processes. NCAR/EOL currently provides a facility to the research community to make direct eddy-covariance flux observations to quantify surface-atmosphere interactions. However, just as model resolution has continued to increase, there is a need to increase the spatial density of flux measurements to capture the wide variety of scales that contribute to exchange processes close to the surface. NCAR/EOL has now developed the CentNet facility, which is envisioned to have on the order of 100 surface flux stations deployable for periods of months to years. Each station would measure standard meteorological variables, all components of the surface energy balance (including turbulence fluxes and radiation), atmospheric composition, and other quantities to characterize the surface. Thus, CentNet can support observational research in the biogeosciences, hydrology, urban meteorology, basic meteorology, and turbulence. CentNet has been designed to be adaptable to a wide variety of research problems while keeping operations manageable. Tower infrastructure has been designed to be lightweight, easily deployed, and with a minimal set-up footprint. CentNet uses sensor networks to increase spatial sampling at each station. The data system saves every sample on site to retain flexibility in data analysis. We welcome guidance on development and funding priorities as we build CentNet.
RadNet Air Data From Providence, RI
This page presents radiation air monitoring and air filter analysis data for Providence, RI from EPA's RadNet system. RadNet is a nationwide network of monitoring stations that measure radiation in air, drinking water and precipitation.
Wides, Cynthia; Alam, Sonia Rab; Mertz, Elizabeth
2014-02-01
In July 2009, California eliminated funding for most adult non-emergency Medicaid dental benefits (Denti-Cal). This paper presents the findings from a qualitative assessment of the impacts of the Denti-Cal cuts on California's oral health safety-net. Interviews were conducted with dental safety-net providers throughout the state, including public health departments, community health centers, dental schools, Native American health clinics, and private providers, and were coded thematically using Atlas.ti. Safety-net providers reported decreased utilization by Denti-Cal-eligible adults, who now primarily seek emergency dental services, and reported shifting to focus on pediatric and privately-insured patients. Significant changes were reported in safety-net clinic finances, operations, and ability to refer. The impact of the Denti-Cal cuts has been distributed unevenly across the safety-net, with private providers and County Health Departments bearing the highest burden.
Point-Process Models of Social Network Interactions: Parameter Estimation and Missing Data Recovery
2014-08-01
…treating them as zero will have a de minimis impact on the results, but avoiding computing them (and computing with them) saves tremendous time. … test the methods on simulated time series on artificial social networks, including some toy networks and some meant to resemble IkeNet. We conclude the section by discussing the results in detail. In each of our tests we begin with a complete data set, whether it is real (IkeNet) or simulated. Then…
NetProt: Complex-based Feature Selection.
Goh, Wilson Wen Bin; Wong, Limsoon
2017-08-04
Protein complex-based feature selection (PCBFS) provides unparalleled reproducibility with high phenotypic relevance on proteomics data. Currently, there are five PCBFS paradigms, but not all representative methods have been implemented or made readily available. To allow general users to take advantage of these methods, we developed the R-package NetProt, which provides implementations of representative feature-selection methods. NetProt also provides methods for generating simulated differential data and generating pseudocomplexes for complex-based performance benchmarking. The NetProt open source R package is available for download from https://github.com/gohwils/NetProt/releases/, and online documentation is available at http://rpubs.com/gohwils/204259.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mather, James
The Atmospheric Radiation Measurement (ARM) Program's standard data format is NetCDF 3 (Network Common Data Form). The object of this tutorial is to provide a basic introduction to NetCDF, with an emphasis on aspects of the ARM application of NetCDF. The goal is to provide basic instructions for reading and visualizing ARM NetCDF data, with the expectation that these examples can then be applied to more complex applications.
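In the spirit of the tutorial, a basic example of reading one variable from an ARM NetCDF file and plotting it might look like the following; the datastream file name and variable name are hypothetical and vary by instrument.

```python
# Basic example: read one variable from an ARM netCDF file and plot it.
# The file name ("sgpmetE13.b1.20230101.000000.cdf") and variable name
# ("temp_mean") are hypothetical; real ARM datastream names vary.
import netCDF4
import matplotlib.pyplot as plt

with netCDF4.Dataset("sgpmetE13.b1.20230101.000000.cdf") as nc:
    time = nc.variables["time"][:]          # typically seconds since the file's base time
    temp = nc.variables["temp_mean"][:]
    units = getattr(nc.variables["temp_mean"], "units", "")

plt.plot(time, temp)
plt.xlabel("Time (s since start of file)")
plt.ylabel(f"Mean temperature ({units})")
plt.title("ARM surface meteorology example")
plt.show()
```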
Action-based verification of RTCP-nets with CADP
NASA Astrophysics Data System (ADS)
Biernacki, Jerzy; Biernacka, Agnieszka; Szpyrka, Marcin
2015-12-01
The paper presents an RTCP-nets' (real-time coloured Petri nets) coverability graphs into Aldebaran format translation algorithm. The approach provides the possibility of automatic RTCP-nets verification using model checking techniques provided by the CADP toolbox. An actual fire alarm control panel system has been modelled and several of its crucial properties have been verified to demonstrate the usability of the approach.
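For orientation, the Aldebaran (.aut) format targeted by such translations is a plain-text list of labelled transitions; the sketch below writes a tiny, made-up labelled transition system in that format (it is not the paper's RTCP-net translation algorithm).

```python
# Sketch of emitting a labelled transition system in the Aldebaran (.aut)
# format consumed by CADP. The tiny graph here is made up; it is not an
# RTCP-net coverability graph.
def write_aut(path, initial, transitions, n_states):
    """transitions: iterable of (source_state, label, target_state) triples."""
    transitions = list(transitions)
    with open(path, "w") as f:
        # header: des (initial state, number of transitions, number of states)
        f.write(f"des ({initial}, {len(transitions)}, {n_states})\n")
        for src, label, dst in transitions:
            f.write(f'({src}, "{label}", {dst})\n')

write_aut("example.aut", initial=0, n_states=3,
          transitions=[(0, "fire_t1", 1), (1, "fire_t2", 2), (2, "reset", 0)])
```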
Roseen, Eric J; Cornelio-Flores, Oscar; Lemaster, Chelsey; Hernandez, Maria; Fong, Calvin; Resnick, Kirsten; Wardle, Jon; Hanser, Suzanne; Saper, Robert
2017-01-01
Little is known about the feasibility of providing massage or music therapy to medical inpatients at urban safety-net hospitals or the impact these treatments may have on patient experience. To determine the feasibility of providing massage and music therapy to medical inpatients and to assess the impact of these interventions on patient experience. Single-center 3-arm feasibility randomized controlled trial. Urban academic safety-net hospital. Adult inpatients on the Family Medicine ward. Massage therapy consisted of a standardized protocol adapted from a previous perioperative study. Music therapy involved a preference assessment, personalized compact disc, music-facilitated coping, singing/playing music, and/or songwriting. Credentialed therapists provided the interventions. Patient experience was measured with the Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) within 7 days of discharge. We compared the proportion of patients in each study arm reporting "top box" scores for the following a priori HCAHPS domains: pain management, recommendation of hospital, and overall hospital rating. Responses to additional open-ended postdischarge questions were transcribed, coded independently, and analyzed for common themes. From July to December 2014, 90 medical inpatients were enrolled; postdischarge data were collected on 68 (76%) medical inpatients. Participants were 70% females, 43% non-Hispanic black, and 23% Hispanic. No differences between groups were observed on HCAHPS. The qualitative analysis found that massage and music therapy were associated with improved overall hospital experience, pain management, and connectedness to the massage or music therapist. Providing music and massage therapy in an urban safety-net inpatient setting was feasible. There was no quantitative impact on HCAHPS. Qualitative findings suggest benefits related to an improved hospital experience, pain management, and connectedness to the massage or music therapist.
Rotation Control In A Cylindrical Acoustic Levitator
NASA Technical Reports Server (NTRS)
Barmatz, M. B.; Allen, J. L.
1988-01-01
Second driver introduces net circulation around levitated sample. Two transducers produce two sets of equal counterrotating acoustic fields. By appropriate adjustment of amplitudes and phases in two transducers, total acoustic field made to consist of two unequal counterrotating fields, producing net torque on levitated sample.
Improved methods for predicting peptide binding affinity to MHC class II molecules.
Jensen, Kamilla Kjaergaard; Andreatta, Massimo; Marcatili, Paolo; Buus, Søren; Greenbaum, Jason A; Yan, Zhen; Sette, Alessandro; Peters, Bjoern; Nielsen, Morten
2018-07-01
Major histocompatibility complex class II (MHC-II) molecules are expressed on the surface of professional antigen-presenting cells where they display peptides to T helper cells, which orchestrate the onset and outcome of many host immune responses. Understanding which peptides will be presented by the MHC-II molecule is therefore important for understanding the activation of T helper cells and can be used to identify T-cell epitopes. We here present updated versions of two MHC-II-peptide binding affinity prediction methods, NetMHCII and NetMHCIIpan. These were constructed using an extended data set of quantitative MHC-peptide binding affinity data obtained from the Immune Epitope Database covering HLA-DR, HLA-DQ, HLA-DP and H-2 mouse molecules. We show that training with this extended data set improved the performance for peptide binding predictions for both methods. Both methods are publicly available at www.cbs.dtu.dk/services/NetMHCII-2.3 and www.cbs.dtu.dk/services/NetMHCIIpan-3.2. © 2018 John Wiley & Sons Ltd.
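As background on how binding-affinity data of this kind are typically preprocessed, the sketch below applies the log-transform commonly used in this family of methods, mapping IC50 values (nM) onto a 0-1 score; it is not necessarily the exact preprocessing of the updated NetMHCII/NetMHCIIpan methods, and the example IC50 values are hypothetical.

    # Sketch of the common IC50 -> score transform: 1 - log(IC50)/log(50000).
    # Example IC50 values are illustrative only.
    import math

    def ic50_to_score(ic50_nm, cap=50000.0):
        ic50_nm = min(max(ic50_nm, 1.0), cap)     # clamp to [1, 50000] nM
        return 1.0 - math.log(ic50_nm) / math.log(cap)

    for ic50 in (5.0, 500.0, 50000.0):            # strong, intermediate, non-binder
        print(f"IC50 = {ic50:>8.1f} nM -> score = {ic50_to_score(ic50):.3f}")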
Perlow, Haley K; Ramey, Stephen J; Silver, Ben; Kwon, Deukwoo; Chinea, Felix M; Samuels, Stuart E; Samuels, Michael A; Elsayyad, Nagy; Yechieli, Raphael
2018-04-01
Objective: To examine the impact of treatment setting and demographic factors on oropharyngeal and laryngeal cancer time to treatment initiation (TTI). Study Design: Retrospective case series. Setting: Safety net hospital and adjacent private academic hospital. Subjects and Methods: Demographic, staging, and treatment details were retrospectively collected for 239 patients treated from January 1, 2014, to June 30, 2016. TTI was defined as days between diagnostic biopsy and initiation of curative treatment (defined as first day of radiotherapy [RT], surgery, or chemotherapy). Results: On multivariable analysis, safety net hospital treatment (vs private academic hospital treatment), initial diagnosis at outside hospital, and oropharyngeal cancer (vs laryngeal cancer) were all associated with increased TTI. Surgical treatment, severe comorbidity, and both N1 and N2 status were associated with decreased TTI. Conclusion: Safety net hospital treatment was associated with increased TTI. No differences in TTI were found when language spoken and socioeconomic status were examined in the overall cohort.
Tissue-specific NETs alter genome organization and regulation even in a heterologous system.
de Las Heras, Jose I; Zuleger, Nikolaj; Batrakou, Dzmitry G; Czapiewski, Rafal; Kerr, Alastair R W; Schirmer, Eric C
2017-01-02
Different cell types exhibit distinct patterns of 3D genome organization that correlate with changes in gene expression in tissue and differentiation systems. Several tissue-specific nuclear envelope transmembrane proteins (NETs) have been found to influence the spatial positioning of genes and chromosomes that normally occurs during tissue differentiation. Here we study 3 such NETs: NET29, NET39, and NET47, which are expressed preferentially in fat, muscle and liver, respectively. We found that even when exogenously expressed in a heterologous system they can specify particular genome organization patterns and alter gene expression. Each NET affected largely different subsets of genes. Notably, the liver-specific NET47 upregulated many genes in HT1080 fibroblast cells that are normally upregulated in hepatogenesis, showing that tissue-specific NETs can favor expression patterns associated with the tissue where the NET is normally expressed. Similarly, global profiling of peripheral chromatin after exogenous expression of these NETs using lamin B1 DamID revealed that each NET affected the nuclear positioning of distinct sets of genomic regions with a significant tissue-specific component. Thus NET influences on genome organization can contribute to gene expression changes associated with differentiation even in the absence of other factors and overt cellular differentiation changes.
The evolving role and care management approaches of safety-net Medicaid managed care plans.
Gusmano, Michael K; Sparer, Michael S; Brown, Lawrence D; Rowe, Catherine; Gray, Bradford
2002-12-01
This article provides new empirical data about the viability and the care management activities of Medicaid managed-care plans sponsored by provider organizations that serve Medicaid and other low-income populations. Using survey and case study methods, we studied these "safety-net" health plans in 1998 and 2000. Although the number of safety-net plans declined over this period, the surviving plans were larger and enjoying greater financial success than the plans we surveyed in 1998. We also found that, based on a partnership with providers, safety-net plans are moving toward more sophisticated efforts to manage the care of their enrollees. Our study suggests that, with supportive state policies, safety-net plans are capable of remaining viable. Contracting with safety-net plans may not be an efficient mechanism for enabling Medicaid recipients to "enter the mainstream of American health care," but it may provide states with an effective way to manage and coordinate the care of Medicaid recipients, while helping to maintain the health care safety-net for the uninsured.
AgdbNet – antigen sequence database software for bacterial typing
Jolley, Keith A; Maiden, Martin CJ
2006-01-01
Background: Bacterial typing schemes based on the sequences of genes encoding surface antigens require databases that provide a uniform, curated, and widely accepted nomenclature of the variants identified. Due to the differences in typing schemes, imposed by the diversity of genes targeted, creating these databases has typically required the writing of one-off code to link the database to a web interface. Here we describe agdbNet, widely applicable web database software that facilitates simultaneous BLAST querying of multiple loci using either nucleotide or peptide sequences. Results: Databases are described by XML files that are parsed by a Perl CGI script. Each database can have any number of loci, which may be defined by nucleotide and/or peptide sequences. The software is currently in use on at least five public databases for the typing of Neisseria meningitidis, Campylobacter jejuni and Streptococcus equi and can be set up to query internal isolate tables or suitably-configured external isolate databases, such as those used for multilocus sequence typing. The style of the resulting website can be fully configured by modifying stylesheets and through the use of customised header and footer files that surround the output of the script. Conclusion: The software provides a rapid means of setting up customised Internet antigen sequence databases. The flexible configuration options enable typing schemes with differing requirements to be accommodated. PMID:16790057
2010-09-01
The MasterNet project continued to expand in software and hardware complexity until its failure (Szilagyi, n.d.). Despite all of the issues...were used for MasterNet (Szilagyi, n.d.). Although executive management committed significant financial resources to MasterNet, Bank of America...implementation failure as well as project-management failure as a whole (Szilagyi, n.d.). The lesson learned from this vignette is the importance of setting
NASA Astrophysics Data System (ADS)
Fuchsberger, Jürgen; Kirchengast, Gottfried; Bichler, Christoph; Kabas, Thomas; Lenz, Gunther; Leuprecht, Armin
2017-04-01
The Feldbach region in southeast Austria, characteristic for experiencing a rich variety of weather and climate patterns, has been selected as the focus area for a pioneering weather and climate observation network at very high resolution: The WegenerNet comprises 153 meteorological stations measuring temperature, humidity, precipitation, and other parameters, in a tightly spaced grid within an area of about 20 km × 15 km centered near the city of Feldbach (46.93°N, 15.90°E). With its stations about every 2 km2, each with 5-min time sampling, the network provides regular measurements since January 2007. Detailed information is available in the recent description by Kirchengast et al. (2014) and via www.wegcenter.at/wegenernet. As a smaller "sister network" of the WegenerNet Feldbach region, the WegenerNet Johnsbachtal consists of eleven meteorological stations (complemented by one hydrographic station at the Johnsbach creek), measuring temperature, humidity, precipitation, radiation, wind, and other parameters in an alpine setting at altitudes ranging from below 700 m to over 2100 m. Data are available partly since 2007, partly since more recent dates and have a temporal resolution of 10 minutes. The networks are set to serve as a long-term monitoring and validation facility for weather and climate research and applications. Uses include validation of nonhydrostatic models operated at 1-km-scale resolution and of statistical downscaling techniques (in particular for precipitation), validation of radar and satellite data, study of orography-climate relationships, and many others. Quality-controlled station time series and gridded field data (spacing 200 m × 200 m) are available in near-real time (data latency less than 1-2 h) for visualization and download via a data portal (www.wegenernet.org). This data portal has been undergoing a complete renewal over the last year, and now serves as a modern gateway to the WegenerNet's more than 10 years of high-resolution data. The poster gives a brief introduction to the WegenerNet design and setup and shows a detailed overview of the new data portal. It also focuses on showing examples for high-resolution precipitation measurements, especially heavy-precipitation and convective events. Reference: Kirchengast, G., T. Kabas, A. Leuprecht, C. Bichler, and H. Truhetz (2014): WegenerNet: A pioneering high-resolution network for monitoring weather and climate. Bull. Amer. Meteor. Soc., 95, 227-242, doi:10.1175/BAMS-D-11-00161.1.
Using a mass balance to determine the potency loss during the production of a pharmaceutical blend.
Mackaplow, Michael B
2010-09-01
The manufacture of a blend containing the active pharmaceutical ingredient (API) and inert excipients is a precursor for the production of most pharmaceutical capsules and tablets. However, if there is a net water gain or preferential loss of API during production, the potency of the final drug product may be less than the target value. We use a mass balance to predict the mean potency loss during the production of a blend via wet granulation and fluidized bed drying. The result is an explicit analytical equation for the change in blend potency as a function of net water gain, solids losses (both regular and high-potency), and the fraction of excipients added extragranularly. This model predicts that each 1% gain in moisture content (as determined by a loss on drying test) will decrease the API concentration of the final blend by at least 1% LC. The effect of pre-blend solid losses increases with their degree of superpotency. This work supports Quality by Design by providing a rational method to set the process design space to minimize blend potency losses. When an overage is necessary, the model can help justify it by providing a quantitative, first-principles understanding of the sources of potency loss. The analysis is applicable to other manufacturing processes where the primary sources of potency loss are net water gain and/or mass losses.
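A simplified mass-balance sketch consistent with the statement above; it is not the paper's full equation (which also accounts for solids losses and extragranular excipients), and the batch masses are illustrative.

    # Simplified mass balance: retained water dilutes the API, so potency drops
    # roughly 1% of label claim for each 1% gain in moisture. Numbers are illustrative.
    def blend_potency_pct_lc(api_kg, excipient_kg, net_water_gain_kg):
        target = api_kg / (api_kg + excipient_kg)                       # intended API fraction
        actual = api_kg / (api_kg + excipient_kg + net_water_gain_kg)   # diluted by retained water
        return 100.0 * actual / target                                  # potency as % label claim

    batch = dict(api_kg=10.0, excipient_kg=90.0)
    for water_kg in (0.0, 1.0, 2.0):    # 0%, 1%, 2% of batch mass retained as water
        pct = blend_potency_pct_lc(net_water_gain_kg=water_kg, **batch)
        print(f"water gain {water_kg:.1f} kg -> {pct:.1f}% LC")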
Emerging strengths in Asia Pacific bioinformatics.
Ranganathan, Shoba; Hsu, Wen-Lian; Yang, Ueng-Cheng; Tan, Tin Wee
2008-12-12
The 2008 annual conference of the Asia Pacific Bioinformatics Network (APBioNet), Asia's oldest bioinformatics organisation set up in 1998, was organized as the 7th International Conference on Bioinformatics (InCoB), jointly with the Bioinformatics and Systems Biology in Taiwan (BIT 2008) Conference, Oct. 20-23, 2008 at Taipei, Taiwan. Besides bringing together scientists from the field of bioinformatics in this region, InCoB is actively involving researchers from the area of systems biology, to facilitate greater synergy between these two groups. Marking the 10th Anniversary of APBioNet, this InCoB 2008 meeting followed on from a series of successful annual events in Bangkok (Thailand), Penang (Malaysia), Auckland (New Zealand), Busan (South Korea), New Delhi (India) and Hong Kong. Additionally, tutorials and the Workshop on Education in Bioinformatics and Computational Biology (WEBCB) immediately prior to the 20th Federation of Asian and Oceanian Biochemists and Molecular Biologists (FAOBMB) Taipei Conference provided ample opportunity for inducting mainstream biochemists and molecular biologists from the region into a greater level of awareness of the importance of bioinformatics in their craft. In this editorial, we provide a brief overview of the peer-reviewed manuscripts accepted for publication herein, grouped into thematic areas. As the regional research expertise in bioinformatics matures, the papers fall into thematic areas, illustrating the specific contributions made by APBioNet to global bioinformatics efforts.
Emerging strengths in Asia Pacific bioinformatics
Ranganathan, Shoba; Hsu, Wen-Lian; Yang, Ueng-Cheng; Tan, Tin Wee
2008-01-01
The 2008 annual conference of the Asia Pacific Bioinformatics Network (APBioNet), Asia's oldest bioinformatics organisation set up in 1998, was organized as the 7th International Conference on Bioinformatics (InCoB), jointly with the Bioinformatics and Systems Biology in Taiwan (BIT 2008) Conference, Oct. 20–23, 2008 at Taipei, Taiwan. Besides bringing together scientists from the field of bioinformatics in this region, InCoB is actively involving researchers from the area of systems biology, to facilitate greater synergy between these two groups. Marking the 10th Anniversary of APBioNet, this InCoB 2008 meeting followed on from a series of successful annual events in Bangkok (Thailand), Penang (Malaysia), Auckland (New Zealand), Busan (South Korea), New Delhi (India) and Hong Kong. Additionally, tutorials and the Workshop on Education in Bioinformatics and Computational Biology (WEBCB) immediately prior to the 20th Federation of Asian and Oceanian Biochemists and Molecular Biologists (FAOBMB) Taipei Conference provided ample opportunity for inducting mainstream biochemists and molecular biologists from the region into a greater level of awareness of the importance of bioinformatics in their craft. In this editorial, we provide a brief overview of the peer-reviewed manuscripts accepted for publication herein, grouped into thematic areas. As the regional research expertise in bioinformatics matures, the papers fall into thematic areas, illustrating the specific contributions made by APBioNet to global bioinformatics efforts. PMID:19091008
Deep neural nets as a method for quantitative structure-activity relationships.
Ma, Junshui; Sheridan, Robert P; Liaw, Andy; Dahl, George E; Svetnik, Vladimir
2015-02-23
Neural networks were widely used for quantitative structure-activity relationships (QSAR) in the 1990s. Because of various practical issues (e.g., slow on large problems, difficult to train, prone to overfitting, etc.), they were superseded by more robust methods like support vector machine (SVM) and random forest (RF), which arose in the early 2000s. The last 10 years have witnessed a revival of neural networks in the machine learning community thanks to new methods for preventing overfitting, more efficient training algorithms, and advancements in computer hardware. In particular, deep neural nets (DNNs), i.e., neural nets with more than one hidden layer, have found great success in many applications, such as computer vision and natural language processing. Here we show that DNNs can routinely make better prospective predictions than RF on a set of large diverse QSAR data sets that are taken from Merck's drug discovery effort. The number of adjustable parameters needed for DNNs is fairly large, but our results show that it is not necessary to optimize them for individual data sets, and a single set of recommended parameters can achieve better performance than RF for most of the data sets we studied. The usefulness of the parameters is demonstrated on additional data sets not used in the calibration. Although training DNNs is still computationally intensive, using graphical processing units (GPUs) can make this issue manageable.
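A minimal sketch of a multi-hidden-layer neural network for QSAR-style regression using scikit-learn; the descriptor matrix and activities are synthetic stand-ins, and the architecture is not the set of parameters recommended in the paper.

    # Sketch: a small "deep" net (two hidden layers) regressing a synthetic
    # activity from synthetic molecular descriptors.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPRegressor
    from sklearn.metrics import r2_score

    rng = np.random.default_rng(1)
    X = rng.normal(size=(2000, 100))                           # 2000 compounds x 100 descriptors
    y = X[:, :5].sum(axis=1) + 0.3 * rng.normal(size=2000)     # synthetic activity

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    dnn = MLPRegressor(hidden_layer_sizes=(512, 128),          # more than one hidden layer
                       activation="relu", alpha=1e-4,
                       max_iter=300, random_state=0)
    dnn.fit(X_tr, y_tr)
    print("test R^2:", round(r2_score(y_te, dnn.predict(X_te)), 3))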
16 CFR 1610.6 - Test procedure.
Code of Federal Regulations, 2010 CFR
2010-01-01
... dimension of the specimen and arranged so the test flame impinges on a metallic thread. (iv) Embroidery. Embroidery on netting material shall be tested with two sets of preliminary specimens to determine the most flammable area (which offers the greatest amount of netting or embroidery in the 150 mm (6 in.) direction...
Second Language Teaching and Learning in the Net Generation
ERIC Educational Resources Information Center
Oxford, Raquel, Ed.; Oxford, Jeffrey, Ed.
2009-01-01
Today's young people--the Net Generation--have grown up with technology all around them. However, teachers cannot assume that students' familiarity with technology in general transfers successfully to pedagogical settings. This volume examines various technologies and offers concrete advice on how each can be successfully implemented in the second…
Convergent Validity of O*NET Holland Code Classifications
ERIC Educational Resources Information Center
Eggerth, Donald E.; Bowles, Shannon M.; Tunick, Roy H.; Andrew, Michael E.
2005-01-01
The interpretive ease and intuitive appeal of the Holland RIASEC typology have made it nearly ubiquitous in vocational guidance settings. Its incorporation into the Occupational Information Network (O*NET) has moved it another step closer to reification. This research investigated the rates of agreement between Holland code classifications from…
Agile convolutional neural network for pulmonary nodule classification using CT images.
Zhao, Xinzhuo; Liu, Liyao; Qi, Shouliang; Teng, Yueyang; Li, Jianhua; Qian, Wei
2018-04-01
To distinguish benign from malignant pulmonary nodules using CT images is critical for their precise diagnosis and treatment. A new Agile convolutional neural network (CNN) framework is proposed to conquer the challenges of a small-scale medical image database and the small size of the nodules, and it improves the performance of pulmonary nodule classification using CT images. A hybrid CNN of LeNet and AlexNet is constructed through combining the layer settings of LeNet and the parameter settings of AlexNet. A dataset with 743 CT image nodule samples is built up based on the 1018 CT scans of LIDC to train and evaluate the Agile CNN model. Through adjusting the parameters of the kernel size, learning rate, and other factors, the effect of these parameters on the performance of the CNN model is investigated, and an optimized setting of the CNN is obtained finally. After finely optimizing the settings of the CNN, the estimation accuracy and the area under the curve can reach 0.822 and 0.877, respectively. The accuracy of the CNN is significantly dependent on the kernel size, learning rate, training batch size, dropout, and weight initializations. The best performance is achieved when the kernel size is set to [Formula: see text], the learning rate is 0.005, the batch size is 32, and dropout and Gaussian initialization are used. This competitive performance demonstrates that our proposed CNN framework and the optimization strategy of the CNN parameters are suitable for pulmonary nodule classification characterized by small medical datasets and small targets. The classification model might help diagnose and treat pulmonary nodules effectively.
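A hedged sketch of a small CNN trained with the reported settings (learning rate 0.005, batch size 32, dropout, Gaussian weight initialization); the kernel size, layer widths, and the 1 x 32 x 32 input patches are placeholders, since the paper's exact architecture is not reproduced here.

    # Sketch of a small CNN for binary nodule classification. Kernel size and
    # layer widths are assumptions; lr=0.005, batch=32, dropout, and Gaussian
    # initialization follow the settings reported above.
    import torch
    import torch.nn as nn

    class NoduleCNN(nn.Module):
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            )
            self.classifier = nn.Sequential(
                nn.Flatten(), nn.Dropout(0.5),
                nn.Linear(32 * 8 * 8, 64), nn.ReLU(),
                nn.Linear(64, 2),                   # benign vs malignant
            )
            for m in self.modules():                # Gaussian weight initialization
                if isinstance(m, (nn.Conv2d, nn.Linear)):
                    nn.init.normal_(m.weight, mean=0.0, std=0.01)
                    nn.init.zeros_(m.bias)

        def forward(self, x):
            return self.classifier(self.features(x))

    model = NoduleCNN()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.005)
    loss_fn = nn.CrossEntropyLoss()
    x = torch.randn(32, 1, 32, 32)                  # one dummy batch of 32 patches
    loss = loss_fn(model(x), torch.randint(0, 2, (32,)))
    loss.backward()
    optimizer.step()
    print("dummy batch loss:", float(loss))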
Therapeutic benefit of selective inhibition of p110α PI3-kinase in pancreatic neuroendocrine tumors
Soler, Adriana; Figueiredo, Ana M; Castel, Pau; Martin, Laura; Monelli, Erika; Angulo-Urarte, Ana; Milà-Guasch, Maria; Viñals, Francesc; Casanovas, Oriol
2017-01-01
Purpose: Mutations in the PI3-kinase (PI3K) pathway occur in 16% of patients with pancreatic neuroendocrine tumors (PanNETs), which suggests that these tumors are an exciting setting for PI3K/AKT/mTOR pharmacological intervention. Everolimus, an mTOR inhibitor, is being used to treat patients with advanced PanNETs. However, resistance to mTOR targeted therapy is emerging partially due to the loss of mTOR-dependent feedback inhibition of AKT. In contrast, the response to PI3K inhibitors in PanNETs is unknown. Experimental Design: In the present study, we assessed the frequency of PI3K pathway activation in human PanNETs and in RIP1-Tag2 mice, a preclinical tumor model of PanNETs, and we investigated the therapeutic efficacy of inhibiting PI3K in RIP1-Tag2 mice using a combination of pan (GDC-0941) and p110α selective (GDC-0326) inhibitors and isoform specific PI3K kinase-dead mutant mice. Results: Human and mouse PanNETs showed enhanced pAKT, pPRAS40 and pS6 positivity compared to normal tissue. While treatment of RIP1-Tag2 mice with GDC-0941 led to reduced tumor growth with no impact on tumor vessels, the selective inactivation of the p110α PI3K isoform, either genetically or pharmacologically, reduced tumor growth as well as vascular area. Furthermore, GDC-0326 reduced the incidence of liver and lymph node (LN) metastasis compared to vehicle treated mice. We also demonstrated that tumor and stromal cells are implicated in the anti-tumor activity of GDC-0326 in RIP1-Tag2 tumors. Conclusion: Our data provide a rationale for p110α selective intervention in PanNETs and unravel a new function of this kinase in cancer biology through its role in promoting metastasis. PMID:27225693
Therapeutic Benefit of Selective Inhibition of p110α PI3-Kinase in Pancreatic Neuroendocrine Tumors.
Soler, Adriana; Figueiredo, Ana M; Castel, Pau; Martin, Laura; Monelli, Erika; Angulo-Urarte, Ana; Milà-Guasch, Maria; Viñals, Francesc; Baselga, Jose; Casanovas, Oriol; Graupera, Mariona
2016-12-01
Mutations in the PI3K pathway occur in 16% of patients with pancreatic neuroendocrine tumors (PanNETs), which suggests that these tumors are an exciting setting for PI3K/AKT/mTOR pharmacologic intervention. Everolimus, an mTOR inhibitor, is being used to treat patients with advanced PanNETs. However, resistance to mTOR-targeted therapy is emerging partially due to the loss of mTOR-dependent feedback inhibition of AKT. In contrast, the response to PI3K inhibitors in PanNETs is unknown. In the current study, we assessed the frequency of PI3K pathway activation in human PanNETs and in RIP1-Tag2 mice, a preclinical tumor model of PanNETs, and we investigated the therapeutic efficacy of inhibiting PI3K in RIP1-Tag2 mice using a combination of pan (GDC-0941) and p110α-selective (GDC-0326) inhibitors and isoform-specific PI3K kinase-dead-mutant mice. Human and mouse PanNETs showed enhanced pAKT, pPRAS40, and pS6 positivity compared with normal tissue. Although treatment of RIP1-Tag2 mice with GDC-0941 led to reduced tumor growth with no impact on tumor vessels, the selective inactivation of the p110α PI3K isoform, either genetically or pharmacologically, reduced tumor growth as well as vascular area. Furthermore, GDC-0326 reduced the incidence of liver and lymph node metastasis compared with vehicle-treated mice. We also demonstrated that tumor and stromal cells are implicated in the antitumor activity of GDC-0326 in RIP1-Tag2 tumors. Our data provide a rationale for p110α-selective intervention in PanNETs and unravel a new function of this kinase in cancer biology through its role in promoting metastasis. Clin Cancer Res; 22(23); 5805-17. ©2016 AACR. ©2016 American Association for Cancer Research.
StarNet: An application of deep learning in the analysis of stellar spectra
NASA Astrophysics Data System (ADS)
Kielty, Collin; Bialek, Spencer; Fabbro, Sebastien; Venn, Kim; O'Briain, Teaghan; Jahandar, Farbod; Monty, Stephanie
2018-06-01
In an era when spectroscopic surveys are capable of collecting spectra for hundreds of thousands of stars, fast and efficient analysis methods are required to maximize scientific impact. These surveys provide a homogeneous database of stellar spectra that are ideal for machine learning applications. In this poster, we present StarNet: a convolutional neural network model applied to the analysis of both SDSS-III APOGEE DR13 and synthetic stellar spectra. When trained on synthetic spectra alone, the calculated stellar parameters (temperature, surface gravity, and metallicity) are of excellent precision and accuracy for both APOGEE data and synthetic data, over a wide range of signal-to-noise ratios. While StarNet was developed using the APOGEE observed spectra and corresponding ASSeT synthetic grid, we suggest that this technique is applicable to other spectral resolutions, spectral surveys, and wavelength regimes. As a demonstration of this, we present a StarNet model trained on lower resolution, R=6000, IR synthetic spectra, describing the spectra delivered by Gemini/NIFS and the forthcoming Gemini/GIRMOS instrument (PI Sivanandam, UToronto). Preliminary results suggest that the stellar parameters determined from this low resolution StarNet model are comparable in precision to the high-resolution APOGEE results. The success of StarNet at lower resolution can be attributed to (1) a large training set of synthetic spectra (N ~200,000) with a priori stellar labels, and (2) the use of the entire spectrum in the solution rather than a few weighted windows, which are common methods in other spectral analysis tools (e.g. FERRE or The Cannon). Remaining challenges in our StarNet applications include rectification, continuum normalization, and wavelength coverage. Solutions to these problems could be used to guide decisions made in the development of future spectrographs, spectroscopic surveys, and data reduction pipelines, such as for the future MSE.
Mayo, Charles; Conners, Steve; Warren, Christopher; Miller, Robert; Court, Laurence; Popple, Richard
2013-01-01
Purpose: With the emergence of clinical outcomes databases as tools utilized routinely within institutions comes the need for software tools to support automated statistical analysis of these large data sets and intrainstitutional exchange from independent federated databases to support data pooling. In this paper, the authors present a design approach and analysis methodology that addresses both issues. Methods: A software application was constructed to automate analysis of patient outcomes data using a wide range of statistical metrics, by combining use of C#.Net and R code. The accuracy and speed of the code were evaluated using benchmark data sets. Results: The approach provides data needed to evaluate combinations of statistical measurements for ability to identify patterns of interest in the data. Through application of the tools to a benchmark data set for dose-response threshold and to SBRT lung data sets, an algorithm was developed that uses receiver operator characteristic curves to identify a threshold value and combines use of contingency tables, Fisher exact tests, Welch t-tests, and Kolmogorov-Smirnov tests to filter the large data set to identify values demonstrating dose-response. Kullback-Leibler divergences were used to provide additional confirmation. Conclusions: The work demonstrates the viability of the design approach and the software tool for analysis of large data sets. PMID:24320426
Mayo, Charles; Conners, Steve; Warren, Christopher; Miller, Robert; Court, Laurence; Popple, Richard
2013-11-01
With the emergence of clinical outcomes databases as tools utilized routinely within institutions comes the need for software tools to support automated statistical analysis of these large data sets and intrainstitutional exchange from independent federated databases to support data pooling. In this paper, the authors present a design approach and analysis methodology that addresses both issues. A software application was constructed to automate analysis of patient outcomes data using a wide range of statistical metrics, by combining use of C#.Net and R code. The accuracy and speed of the code were evaluated using benchmark data sets. The approach provides data needed to evaluate combinations of statistical measurements for ability to identify patterns of interest in the data. Through application of the tools to a benchmark data set for dose-response threshold and to SBRT lung data sets, an algorithm was developed that uses receiver operator characteristic curves to identify a threshold value and combines use of contingency tables, Fisher exact tests, Welch t-tests, and Kolmogorov-Smirnov tests to filter the large data set to identify values demonstrating dose-response. Kullback-Leibler divergences were used to provide additional confirmation. The work demonstrates the viability of the design approach and the software tool for analysis of large data sets.
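A sketch of this style of filtering using SciPy and scikit-learn: a threshold is taken from the ROC curve and the resulting split is then tested with a Fisher exact test, a Welch t-test, and a Kolmogorov-Smirnov test; the dose/outcome data are synthetic and the units are assumed.

    # Sketch: ROC-derived threshold plus three of the tests named above,
    # applied to a synthetic dose/outcome data set.
    import numpy as np
    from scipy import stats
    from sklearn.metrics import roc_curve

    rng = np.random.default_rng(2)
    dose = np.concatenate([rng.normal(10, 3, 200), rng.normal(20, 3, 100)])
    event = np.concatenate([np.zeros(200, dtype=int), np.ones(100, dtype=int)])  # e.g. a toxicity flag

    fpr, tpr, thresholds = roc_curve(event, dose)      # threshold maximizing sensitivity + specificity - 1
    threshold = thresholds[np.argmax(tpr - fpr)]

    high = dose >= threshold
    a = int(np.sum(high & (event == 1)))    # above threshold, event
    b = int(np.sum(high & (event == 0)))    # above threshold, no event
    c = int(np.sum(~high & (event == 1)))   # below threshold, event
    d = int(np.sum(~high & (event == 0)))   # below threshold, no event

    _, p_fisher = stats.fisher_exact([[a, b], [c, d]])
    _, p_welch = stats.ttest_ind(dose[event == 1], dose[event == 0], equal_var=False)
    _, p_ks = stats.ks_2samp(dose[event == 1], dose[event == 0])

    print(f"threshold = {threshold:.1f} (units as in the underlying data set)")
    print(f"Fisher p = {p_fisher:.2e}, Welch p = {p_welch:.2e}, KS p = {p_ks:.2e}")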
Gravel Transport Measured With Bedload Traps in Mountain Streams: Field Data Sets to be Published
NASA Astrophysics Data System (ADS)
Bunte, K.; Swingle, K. W.; Abt, S. R.; Ettema, R.; Cenderelli, D. A.
2017-12-01
Direct, accurate measurements of coarse bedload transport exist for only a few streams worldwide, because the task is laborious and requires a suitable device. However, sets of accurate field data would be useful for reference with unsampled sites and as a basis for model developments. The authors have carefully measured gravel transport and are compiling their data sets for publication. To ensure accurate measurements of gravel bedload in wadeable flow, the designed instrument consisted of an unflared aluminum frame (0.3 x 0.2 m) large enough for entry of cobbles. The attached 1 m or longer net with a 4 mm mesh held large bedload volumes. The frame was strapped onto a ground plate anchored onto the channel bed. This setup avoided involuntary sampler particle pick-up and enabled long sampling times, integrating over fluctuating transport. Beveled plates and frames facilitated easy particle entry. Accelerating flow over smooth plates compensated for deceleration within the net. Spacing multiple frames by 1 m enabled sampling much of the stream width. Long deployment, and storage of sampled bedload away from the frame's entrance, were attributes of traps rather than samplers; hence the name "bedload traps". The authors measured gravel transport with 4-6 bedload traps per cross-section at 10 mountain streams in CO, WY, and OR, accumulating 14 data sets (>1,350 samples). In 10 data sets, measurements covered much of the snowmelt high-flow season yielding 50-200 samples. Measurement time was typically 1 hour but ranged from 3 minutes to 3 hours, depending on transport intensity. Measuring back-to-back provided 6 to 10 samples over a 6 to 10-hour field day. Bedload transport was also measured with a 3-inch Helley-Smith sampler. The data set provides fractional (0.5 phi) transport rates in terms of particle mass and number for each bedload trap in the cross-section, the largest particle size, as well as total cross-sectional gravel transport rates. Ancillary field data include stage, discharge, long-term flow records if available, surface and subsurface sediment sizes, as well as longitudinal and cross-sectional site surveys. Besides transport relations, incipient motion conditions, hysteresis, and lateral variation, the data provide a reliable modeling basis to test insights and hypotheses regarding bedload transport.
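As an illustration of the basic arithmetic behind such transport-rate data, the sketch below converts trapped mass per sampling interval into a rate per unit stream width; the 0.3 m width matches the frame opening described above, while the masses, sampling times, and stream width are hypothetical.

    # Sketch: trapped bedload mass -> unit transport rate (g per m of width per s),
    # then a simple cross-sectional estimate. All sample values are hypothetical.
    TRAP_WIDTH_M = 0.3

    def unit_rate_g_per_m_s(mass_g, sample_minutes, trap_width_m=TRAP_WIDTH_M):
        """Bedload transport rate per unit stream width (g m^-1 s^-1)."""
        return mass_g / (trap_width_m * sample_minutes * 60.0)

    samples = [(250.0, 60), (800.0, 60), (1200.0, 60), (90.0, 60)]   # (grams, minutes) per trap
    rates = [unit_rate_g_per_m_s(m, t) for m, t in samples]
    stream_width_m = 5.0
    cross_section_rate = stream_width_m * sum(rates) / len(rates)    # mean unit rate x width, g/s
    print([round(r, 3) for r in rates], round(cross_section_rate, 2), "g/s")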
Yeast pheromone pathway modeling using Petri nets
2014-01-01
Background: Our environment is composed of biological components of varying magnitude. The relationships between the different biological elements can be represented as a biological network. The process of mating in S. cerevisiae is initiated by secretion of pheromone by one of the cells. Our interest lies in one particular question: how does a cell dynamically adapt the pathway to continue mating under severe environmental changes or under mutation (which might result in the loss of functionality of some proteins known to participate in the pheromone pathway)? Our work attempts to answer this question. To achieve this, we first propose a model to simulate the pheromone pathway using Petri nets. Petri nets are directed graphs that can be used for describing and modeling systems characterized as concurrent, asynchronous, distributed, parallel, non-deterministic, and/or stochastic. We then analyze our Petri net-based model of the pathway to investigate the following: 1) Given the model of the pheromone response pathway, under what conditions does the cell respond positively, i.e., mate? 2) What kinds of perturbations in the cell would result in changing a negative response to a positive one? Method: In our model, we classify proteins into two categories: core component proteins (set ψ) and additional proteins (set λ). We randomly generate our model's parameters in repeated simulations. To simulate the pathway, we carry out three different experiments. In the experiments, we simply change the concentration of the additional proteins (λ) available to the cell. The concentration of proteins in ψ is varied consistently from 300 to 400. In Experiment 1, the range of values for λ is set to be 100 to 150. In Experiment 2, it is set to be 151 to 200. In Experiment 3, the set λ is further split into σ and ς, with the idea that proteins in σ are more important than those in ς. The range of values for σ is set to be between 151 and 200 while that of ς is 100 to 150. Decision trees were derived from each of the first two experiments to allow us to more easily analyze the conditions under which the pheromone is expressed. Conclusion: The simulation results reveal that a cell can overcome the detrimental effects of the conditions by using more concentration of additional proteins in λ. The first two experiments provide evidence that employing more concentration of proteins might be one of the ways that the cell uses to adapt itself in inhibiting conditions to facilitate mating. The results of the third experiment reveal that in some cases the protein set σ is sufficient in regulating the response of the cell. Results of Experiments 4 and 5 reveal that there are certain conditions (parameters) in the model that are more important in determining whether a cell will respond positively or not. PMID:25080237
Yeast pheromone pathway modeling using Petri nets.
Majumdar, Abhishek; Scott, Stephen D; Deogun, Jitender S; Harris, Steven
2014-01-01
Our environment is composed of biological components of varying magnitude. The relationships between the different biological elements can be represented as a biological network. The process of mating in S. cerevisiae is initiated by secretion of pheromone by one of the cells. Our interest lies in one particular question: how does a cell dynamically adapt the pathway to continue mating under severe environmental changes or under mutation (which might result in the loss of functionality of some proteins known to participate in the pheromone pathway)? Our work attempts to answer this question. To achieve this, we first propose a model to simulate the pheromone pathway using Petri nets. Petri nets are directed graphs that can be used for describing and modeling systems characterized as concurrent, asynchronous, distributed, parallel, non-deterministic, and/or stochastic. We then analyze our Petri net-based model of the pathway to investigate the following: 1) Given the model of the pheromone response pathway, under what conditions does the cell respond positively, i.e., mate? 2) What kinds of perturbations in the cell would result in changing a negative response to a positive one? In our model, we classify proteins into two categories: core component proteins (set ψ) and additional proteins (set λ). We randomly generate our model's parameters in repeated simulations. To simulate the pathway, we carry out three different experiments. In the experiments, we simply change the concentration of the additional proteins (λ) available to the cell. The concentration of proteins in ψ is varied consistently from 300 to 400. In Experiment 1, the range of values for λ is set to be 100 to 150. In Experiment 2, it is set to be 151 to 200. In Experiment 3, the set λ is further split into σ and ς, with the idea that proteins in σ are more important than those in ς. The range of values for σ is set to be between 151 and 200 while that of ς is 100 to 150. Decision trees were derived from each of the first two experiments to allow us to more easily analyze the conditions under which the pheromone is expressed. The simulation results reveal that a cell can overcome the detrimental effects of the conditions by using more concentration of additional proteins in λ. The first two experiments provide evidence that employing more concentration of proteins might be one of the ways that the cell uses to adapt itself in inhibiting conditions to facilitate mating. The results of the third experiment reveal that in some cases the protein set σ is sufficient in regulating the response of the cell. Results of Experiments 4 and 5 reveal that there are certain conditions (parameters) in the model that are more important in determining whether a cell will respond positively or not.
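A minimal Petri-net-style sketch in Python showing how places (token counts) and transitions can represent a signalling step; the toy "bind"/"signal" net below is illustrative only and not the pathway model analysed in the paper.

    # Minimal Petri net sketch: places hold token counts (e.g. protein abundance);
    # a transition fires only when all of its input places hold enough tokens.
    import random

    marking = {"pheromone": 50, "receptor": 30, "bound_receptor": 0, "response": 0}

    transitions = {
        # name: (inputs {place: tokens consumed}, outputs {place: tokens produced})
        "bind":   ({"pheromone": 1, "receptor": 1}, {"bound_receptor": 1}),
        "signal": ({"bound_receptor": 1},           {"response": 1, "receptor": 1}),
    }

    def enabled(name):
        inputs, _ = transitions[name]
        return all(marking[p] >= n for p, n in inputs.items())

    def fire(name):
        inputs, outputs = transitions[name]
        for p, n in inputs.items():
            marking[p] -= n
        for p, n in outputs.items():
            marking[p] += n

    random.seed(0)
    for _ in range(200):                     # fire random enabled transitions
        choices = [t for t in transitions if enabled(t)]
        if not choices:
            break
        fire(random.choice(choices))
    print(marking)                           # "response" tokens indicate pathway output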
Amberg, Alexander; Barrett, Dave; Beale, Michael H.; Beger, Richard; Daykin, Clare A.; Fan, Teresa W.-M.; Fiehn, Oliver; Goodacre, Royston; Griffin, Julian L.; Hankemeier, Thomas; Hardy, Nigel; Harnly, James; Higashi, Richard; Kopka, Joachim; Lane, Andrew N.; Lindon, John C.; Marriott, Philip; Nicholls, Andrew W.; Reily, Michael D.; Thaden, John J.; Viant, Mark R.
2013-01-01
There is a general consensus that supports the need for standardized reporting of metadata or information describing large-scale metabolomics and other functional genomics data sets. Reporting of standard metadata provides a biological and empirical context for the data, facilitates experimental replication, and enables the re-interrogation and comparison of data by others. Accordingly, the Metabolomics Standards Initiative is building a general consensus concerning the minimum reporting standards for metabolomics experiments; the Chemical Analysis Working Group (CAWG) is a member of this community effort. This article proposes the minimum reporting standards related to the chemical analysis aspects of metabolomics experiments, including sample preparation, experimental analysis, quality control, metabolite identification, and data pre-processing. These minimum standards currently focus mostly upon mass spectrometry and nuclear magnetic resonance spectroscopy due to the popularity of these techniques in metabolomics. However, additional input concerning other techniques is welcomed and can be provided via the CAWG on-line discussion forum at http://msi-workgroups.sourceforge.net/ or http://Msi-workgroups-feedback@lists.sourceforge.net. Further, community input related to this document can also be provided via this electronic forum. PMID:24039616
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paegert, Martin; Stassun, Keivan G.; Burger, Dan M.
2014-08-01
We describe a new neural-net-based light curve classifier and provide it with documentation as a ready-to-use tool for the community. While optimized for identification and classification of eclipsing binary stars, the classifier is general purpose, and has been developed for speed in the context of upcoming massive surveys such as the Large Synoptic Survey Telescope. A challenge for classifiers in the context of neural-net training and massive data sets is to minimize the number of parameters required to describe each light curve. We show that a simple and fast geometric representation that encodes the overall light curve shape, together with a chi-square parameter to capture higher-order morphology information results in efficient yet robust light curve classification, especially for eclipsing binaries. Testing the classifier on the ASAS light curve database, we achieve a retrieval rate of 98% and a false-positive rate of 2% for eclipsing binaries. We achieve similarly high retrieval rates for most other periodic variable-star classes, including RR Lyrae, Mira, and delta Scuti. However, the classifier currently has difficulty discriminating between different sub-classes of eclipsing binaries, and suffers a relatively low (∼60%) retrieval rate for multi-mode delta Cepheid stars. We find that it is imperative to train the classifier's neural network with exemplars that include the full range of light curve quality to which the classifier will be expected to perform; the classifier performs well on noisy light curves only when trained with noisy exemplars. The classifier source code, ancillary programs, a trained neural net, and a guide for use, are provided.
Carbon Sequestration by Fruit Trees - Chinese Apple Orchards as an Example
Wu, Ting; Wang, Yi; Yu, Changjiang; Chiarawipa, Rawee; Zhang, Xinzhong; Han, Zhenhai; Wu, Lianhai
2012-01-01
Apple production systems are an important component in the Chinese agricultural sector with 1.99 million ha of plantations. The orchards in China could play an important role in the carbon (C) cycle of terrestrial ecosystems and contribute to C sequestration. The carbon sequestration capability in apple orchards was analyzed through identifying a set of potential assessment factors and their weighting factors determined by a field model study and literature. The dynamics of the net C sink in apple orchards in China were estimated based on the apple orchard inventory data from the 1990s and the capability analysis. The field study showed that the trees reached the peak of C sequestration capability when they were 18 years old, and then the capability began to decline with age. Carbon emissions derived from management practices would not be compensated through C storage in apple trees before reaching the mature stage. The net C sink in apple orchards in China ranged from 14 to 32 Tg C, and C storage in biomass from 230 to 475 Tg C between 1990 and 2010. The estimated net C sequestration in Chinese apple orchards from 1990 to 2010 was equal to 4.5% of the total net C sink in the terrestrial ecosystems in China. Therefore, in addition to providing fruit, apple production systems can potentially be considered C sinks, excluding the energy associated with fruit production. PMID:22719974
Carbon sequestration by fruit trees--Chinese apple orchards as an example.
Wu, Ting; Wang, Yi; Yu, Changjiang; Chiarawipa, Rawee; Zhang, Xinzhong; Han, Zhenhai; Wu, Lianhai
2012-01-01
Apple production systems are an important component in the Chinese agricultural sector with 1.99 million ha of plantations. The orchards in China could play an important role in the carbon (C) cycle of terrestrial ecosystems and contribute to C sequestration. The carbon sequestration capability in apple orchards was analyzed through identifying a set of potential assessment factors and their weighting factors determined by a field model study and literature. The dynamics of the net C sink in apple orchards in China were estimated based on the apple orchard inventory data from the 1990s and the capability analysis. The field study showed that the trees reached the peak of C sequestration capability when they were 18 years old, and then the capability began to decline with age. Carbon emissions derived from management practices would not be compensated through C storage in apple trees before reaching the mature stage. The net C sink in apple orchards in China ranged from 14 to 32 Tg C, and C storage in biomass from 230 to 475 Tg C between 1990 and 2010. The estimated net C sequestration in Chinese apple orchards from 1990 to 2010 was equal to 4.5% of the total net C sink in the terrestrial ecosystems in China. Therefore, in addition to providing fruit, apple production systems can potentially be considered C sinks, excluding the energy associated with fruit production.
SeaDataNet network services monitoring: Definition and Implementation of Service availability index
NASA Astrophysics Data System (ADS)
Lykiardopoulos, Angelos; Mpalopoulou, Stavroula; Vavilis, Panagiotis; Pantazi, Maria; Iona, Sissy
2014-05-01
SeaDataNet (SDN) is a standardized system for managing the large and diverse data sets collected by oceanographic fleets and automatic observation systems. The SeaDataNet network is constituted of national oceanographic data centres from 35 countries active in data collection. The SeaDataNet II project's objective is to upgrade the present SeaDataNet infrastructure into an operationally robust and state-of-the-art infrastructure; network monitoring is a step in this direction. The term network monitoring describes the use of a system that constantly monitors a computer network for slow or failing components and notifies the network administrator in case of outages. Network monitoring is crucial when implementing widely distributed systems over the Internet and in real-time systems, as it detects malfunctions that may occur and notifies the system administrator, who can immediately respond and correct the problem. In the framework of the SeaDataNet II project, a monitoring system was developed to monitor the SeaDataNet components. The core system is based on the Nagios software, and plug-ins were developed to support SeaDataNet modules. On top of the Nagios engine, a web portal was developed to give local administrators of SeaDataNet components access to detailed logs of their own service(s). Currently the system monitors 35 SeaDataNet Download Managers, 9 SeaDataNet services, 25 GeoSeas Download Managers and 23 UBSS Download Managers. Taking advantage of the continuous monitoring of SeaDataNet system components, a total availability index will be implemented. Availability can be defined as the ability of a functional unit to be in a state to perform a required function under given conditions at a given instant of time or over a given time interval, assuming that the required external resources are provided. Availability measures are an important benefit because the availability trends extracted from the stored measurements indicate the condition of the service modules, help in planning upgrades and the maintenance of the network service, and are a prerequisite when signing a Service Level Agreement. To construct the service availability index, a method for measuring the availability of the SeaDataNet network is developed and a database is implemented to store the measured values. Although measuring the availability of a single component in a network service is simple (it is the percentage of time in a year that the service is available to users), implementing a method to measure the total availability of a composite system can be complicated, and there is no standardized method to deal with it. The method followed to calculate the total availability index for SeaDataNet can be described as follows: the whole system was divided into operational modules, each providing a single service whose availability can be measured by the monitoring portal; next, the dependencies between these modules were defined in order to formulate the influence of each module's availability on the whole system; for each module, a weight coefficient depending on the module's involvement in total system productivity was defined; and a mathematical formula was developed to measure the index.
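A sketch of one simple way to combine per-module availabilities into a weighted index; the module names, weights, and availability values are hypothetical, and the formula actually adopted by SeaDataNet is not reproduced here.

    # Sketch of a weighted availability index: each monitored module contributes
    # its availability fraction scaled by a weight reflecting its role in the service.
    modules = {
        # name: (availability over the period, weight coefficient)
        "CDI user interface":     (0.998, 0.4),
        "Request Status Manager": (0.990, 0.3),
        "Download Manager pool":  (0.950, 0.3),
    }

    total = sum(a * w for a, w in modules.values()) / sum(w for _, w in modules.values())
    print(f"total service availability index: {100 * total:.2f}%")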
Automated analysis of Physarum network structure and dynamics
NASA Astrophysics Data System (ADS)
Fricker, Mark D.; Akita, Dai; Heaton, Luke LM; Jones, Nick; Obara, Boguslaw; Nakagaki, Toshiyuki
2017-06-01
We evaluate different ridge-enhancement and segmentation methods to automatically extract the network architecture from time-series of Physarum plasmodia withdrawing from an arena via a single exit. Whilst all methods gave reasonable results, judged by precision-recall analysis against a ground-truth skeleton, the mean phase angle (Feature Type) from intensity-independent, phase-congruency edge enhancement and watershed segmentation was the most robust to variation in threshold parameters. The resultant single pixel-wide segmented skeleton was converted to a graph representation as a set of weighted adjacency matrices containing the physical dimensions of each vein, and the inter-vein regions. We encapsulate the complete image processing and network analysis pipeline in a downloadable software package, and provide an extensive set of metrics that characterise the network structure, including hierarchical loop decomposition to analyse the nested structure of the developing network. In addition, the change in volume for each vein and intervening plasmodial sheet was used to predict the net flow across the network. The scaling relationships between predicted current, speed and shear force with vein radius were consistent with predictions from Murray’s law. This work was presented at PhysNet 2015.
SAFEGUARD: An Assured Safety Net Technology for UAS
NASA Technical Reports Server (NTRS)
Dill, Evan T.; Young, Steven D.; Hayhurst, Kelly J.
2016-01-01
As demands increase to use unmanned aircraft systems (UAS) for a broad spectrum of commercial applications, regulatory authorities are examining how to safely integrate them without loss of safety or major disruption to existing airspace operations. This work addresses the development of the Safeguard system as an assured safety net technology for UAS. The Safeguard system monitors and enforces conformance to a set of rules defined prior to flight (e.g., geospatial stay-out or stay-in regions, speed limits, altitude limits). Safeguard operates independently of the UAS autopilot and is strategically designed in a way that can be realized by a small set of verifiable functions to simplify compliance with regulatory standards for commercial aircraft. A framework is described that decouples the system from any other devices on the UAS as well as introduces complementary positioning source(s) for applications that require integrity and availability beyond what the Global Positioning System (GPS) can provide. Additionally, the high level logic embedded within the software is presented, as well as the steps being taken toward verification and validation (V&V) of proper functionality. Next, an initial prototype implementation of the described system is disclosed. Lastly, future work including development, testing, and system V&V is summarized.
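A sketch of the kind of rule-conformance check described above, combining a stay-in geofence with altitude and speed limits; the polygon, limits, and coordinates are hypothetical, and this is not the Safeguard implementation.

    # Sketch: evaluate a vehicle state against pre-flight rules
    # (stay-in polygon, altitude limit, speed limit).
    STAY_IN = [(0.0, 0.0), (1000.0, 0.0), (1000.0, 800.0), (0.0, 800.0)]   # local metres
    MAX_ALT_M = 120.0
    MAX_SPEED_MPS = 20.0

    def point_in_polygon(x, y, poly):
        """Ray-casting point-in-polygon test; poly is a list of (x, y) vertices."""
        inside = False
        n = len(poly)
        for i in range(n):
            x1, y1 = poly[i]
            x2, y2 = poly[(i + 1) % n]
            if (y1 > y) != (y2 > y):
                x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
                if x < x_cross:
                    inside = not inside
        return inside

    def conformant(x, y, alt_m, speed_mps):
        violations = []
        if not point_in_polygon(x, y, STAY_IN):
            violations.append("outside stay-in region")
        if alt_m > MAX_ALT_M:
            violations.append("altitude limit exceeded")
        if speed_mps > MAX_SPEED_MPS:
            violations.append("speed limit exceeded")
        return (len(violations) == 0, violations)

    print(conformant(500.0, 400.0, 80.0, 15.0))    # (True, [])
    print(conformant(1200.0, 400.0, 150.0, 15.0))  # (False, [...])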
Shallow infiltration processes at Yucca Mountain, Nevada : neutron logging data 1984-93
Flint, Lorraine E.; Flint, Alan L.
1995-01-01
To determine site suitability of Yucca Mountain, Nevada, as a potential high-level radioactive waste repository, a study was devised to characterize net infiltration. This study involves a detailed data set produced from 99 neutron boreholes that consisted of volumetric water-content readings with depth from 1984 through 1993 at Yucca Mountain. Boreholes were drilled with minimal disturbance to the surrounding soil or rock in order to best represent field conditions. Boreholes were located in topographic positions representing infiltration zones identified as ridgetops, sideslopes, terraces, and active channels. Through careful field calibration, neutron moisture logs, collected on a monthly basis and representing most of the areal locations at Yucca Mountain, illustrated that the depth of penetration of seasonal moisture, important for escaping loss to evapotranspiration, was influenced by several factors. It was increased (1) by thin soil cover, especially in locations where thin soil is underlain by fractured bedrock; (2) on ridgetops; and (3) during the winter when evapotranspiration is low and runoff is less frequent. This data set helps to provide a seasonal and areal distribution of changes in volumetric water content with which to assess hydrologic processes contributing to net infiltration.
NASA Astrophysics Data System (ADS)
Garner, G.; Hannah, D. M.; Malcolm, I.; Sadler, J. P.
2012-12-01
Riparian forest is recognised as important for moderating stream temperature variability and has the potential to mitigate thermal extremes in a changing climate. Previous research on the heat exchanges controlling water column temperature has often been short-term or seasonally-constrained, with the few multi-year studies limited to a maximum of two years. This study advances previous work by providing a longer-term perspective which allows assessment of inter-annual variability in stream temperature, microclimate and heat exchange dynamics between a semi-natural woodland and a moorland (no trees) reach of the Girnock Burn, a tributary of the Scottish Dee. Automatic weather stations collected 15-minute data over seven consecutive years, which to our knowledge is a unique data set in providing the longest term perspective to date on stream temperature, microclimate and heat exchange processes. Results for spring-summer indicate that the presence of a riparian canopy has a consistent effect between years in reducing the magnitude and variability of mean daily water column temperature and daily net energy totals. Differences in the magnitude and variability in net energy fluxes between the study reaches were driven primarily by fluctuations in net radiation and latent heat fluxes in response to between- and within-year variability in growth of the riparian forest canopy at the forest and prevailing weather conditions at both the forest and moorland. This research provides new insights on the inter-annual variability of stream energy exchanges for moorland and forested reaches under a wide range of climatological and hydrological conditions. The findings therefore provide a more robust process basis for modelling the impact of changes in forest practice and climate change on river thermal dynamics.
High-Throughput Classification of Radiographs Using Deep Convolutional Neural Networks.
Rajkomar, Alvin; Lingam, Sneha; Taylor, Andrew G; Blum, Michael; Mongan, John
2017-02-01
The study aimed to determine if computer vision techniques rooted in deep learning can use a small set of radiographs to perform clinically relevant image classification with high fidelity. One thousand eight hundred eighty-five chest radiographs on 909 patients obtained between January 2013 and July 2015 at our institution were retrieved and anonymized. The source images were manually annotated as frontal or lateral and randomly divided into training, validation, and test sets. Training and validation sets were augmented to over 150,000 images using standard image manipulations. We then pre-trained a series of deep convolutional networks based on the open-source GoogLeNet with various transformations of the open-source ImageNet (non-radiology) images. These trained networks were then fine-tuned using the original and augmented radiology images. The model with highest validation accuracy was applied to our institutional test set and a publicly available set. Accuracy was assessed by using the Youden Index to set a binary cutoff for frontal or lateral classification. This retrospective study was IRB approved prior to initiation. A network pre-trained on 1.2 million greyscale ImageNet images and fine-tuned on augmented radiographs was chosen. The binary classification method correctly classified 100 % (95 % CI 99.73-100 %) of both our test set and the publicly available images. Classification was rapid, at 38 images per second. A deep convolutional neural network created using non-radiological images, and an augmented set of radiographs is effective in highly accurate classification of chest radiograph view type and is a feasible, rapid method for high-throughput annotation.
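For reference, the sketch below shows how a Youden-index cutoff is chosen from an ROC curve to binarize a continuous classifier output, as in the study above; the scores and labels are synthetic.

    # Sketch: pick the threshold maximizing Youden's J = sensitivity + specificity - 1.
    import numpy as np
    from sklearn.metrics import roc_curve

    rng = np.random.default_rng(3)
    labels = np.concatenate([np.zeros(100, dtype=int), np.ones(100, dtype=int)])  # e.g. 0=lateral, 1=frontal
    scores = np.concatenate([rng.normal(0.3, 0.15, 100), rng.normal(0.7, 0.15, 100)])

    fpr, tpr, thresholds = roc_curve(labels, scores)
    cutoff = thresholds[np.argmax(tpr - fpr)]
    accuracy = np.mean((scores >= cutoff) == labels.astype(bool))
    print(f"Youden cutoff = {cutoff:.3f}, accuracy at cutoff = {accuracy:.3f}")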
Placental alpha-microglobulin-1 and combined traditional diagnostic test: a cost-benefit analysis.
Echebiri, Nelson C; McDoom, M Maya; Pullen, Jessica A; Aalto, Meaghan M; Patel, Natasha N; Doyle, Nora M
2015-01-01
We sought to evaluate whether the placental alpha-microglobulin (PAMG)-1 test vs the combined traditional diagnostic test (CTDT) of pooling, nitrazine, and ferning would be a cost-beneficial screening strategy in the setting of potential preterm premature rupture of membranes. A decision analysis model was used to estimate the economic impact of PAMG-1 test vs the CTDT on preterm delivery costs from a societal perspective. Our primary outcome was the annual net cost-benefit per person tested. Baseline probabilities and cost assumptions were derived from published literature. We conducted sensitivity analyses using both deterministic and probabilistic models. Cost estimates reflect 2013 US dollars. Annual net benefit from PAMG-1 was $20,014 per person tested, while CTDT had a net benefit of $15,757 per person tested. If the probability of rupture is <38%, PAMG-1 will be cost-beneficial with an annual net benefit of $16,000-37,000 per person tested, while CTDT will have an annual net benefit of $16,000-19,500 per person tested. If the probability of rupture is >38%, CTDT is more cost-beneficial. Monte Carlo simulations of 1 million trials selected PAMG-1 as the optimal strategy with a frequency of 89%, while CTDT was only selected as the optimal strategy with a frequency of 11%. Sensitivity analyses were robust. Our cost-benefit analysis provides the economic evidence for the adoption of PAMG-1 in diagnosing preterm premature rupture of membranes in uncertain presentations and when CTDT is equivocal at 34 to <37 weeks' gestation. Copyright © 2015 Elsevier Inc. All rights reserved.
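The probabilistic comparison described here is essentially a Monte Carlo net-benefit calculation. The sketch below shows the general mechanics under made-up probability, cost and test-accuracy assumptions; it does not reproduce the study's inputs or decision tree.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1_000_000  # number of Monte Carlo trials, as in the abstract

# Hypothetical distributions standing in for the study's published
# probability and cost assumptions (2013 US dollars).
p_rupture       = rng.beta(4, 8, N)                 # probability of membrane rupture
benefit_correct = rng.normal(40_000, 5_000, N)      # benefit of a correct diagnosis
cost_missed     = rng.normal(60_000, 10_000, N)     # cost of a missed or false result

def net_benefit(sensitivity, specificity, test_cost):
    """Expected annual net benefit per person tested for one strategy."""
    true_pos  = p_rupture * sensitivity
    false_neg = p_rupture * (1 - sensitivity)
    false_pos = (1 - p_rupture) * (1 - specificity)
    return (true_pos * benefit_correct
            - (false_neg + false_pos) * cost_missed
            - test_cost)

nb_pamg = net_benefit(sensitivity=0.96, specificity=0.98, test_cost=50)   # illustrative PAMG-1
nb_ctdt = net_benefit(sensitivity=0.85, specificity=0.90, test_cost=20)   # illustrative CTDT

print("PAMG-1 selected as optimal in",
      round(100 * np.mean(nb_pamg > nb_ctdt)), "% of trials")
```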
Assessing Cybercrime Through the Eyes of the WOMBAT
NASA Astrophysics Data System (ADS)
Dacier, Marc; Leita, Corrado; Thonnard, Olivier; van Pham, Hau; Kirda, Engin
The WOMBAT project is a collaborative European funded research project that aims at providing new means to understand the existing and emerging threats that are targeting the Internet economy and the net citizens. The approach carried out by the partners includes a data collection effort as well as sophisticated analysis techniques. In this chapter, we present one of the threat-related data collection systems in use by the project, as well as some of the early results obtained when digging into these data sets.
Using business analytics to improve outcomes.
Rivera, Jose; Delaney, Stephen
2015-02-01
Orlando Health has brought its hospital and physician practice revenue cycle systems into better balance using four sets of customized analytics: Physician performance analytics gauge the total net revenue for every employed physician. Patient-pay analytics provide financial risk scores for all patients on both the hospital and physician practice sides. Revenue management analytics bridge the gap between the back-end central business office and front-end physician practice managers and administrators. Enterprise management analytics allow the hospitals and physician practices to share important information about common patients.
Net returns, fiscal risks, and the optimal patient mix for a profit-maximizing hospital.
Ozatalay, S; Broyles, R
1987-10-01
As is well recognized, the provisions of PL98-21 not only transfer financial risks from the Medicare program to the hospital but also induce institutions to adjust the diagnostic mix of Medicare beneficiaries so as to maximize net income or minimize the net loss. This paper employs variation in the set of net returns as the sole measure of financial risk and develops a model that identifies the mix of beneficiaries that maximizes net income, subject to a given level of risk. The results indicate that the provisions of PL98-21 induce the institution to deny admission to elderly patients presenting conditions for which the net return is relatively low and the variance in the cost per case is large. Further, the paper suggests that the treatment of beneficiaries at a level commensurate with previous periods or the preferences of physicians may jeopardize the viability and solvency of Medicare-dependent hospitals.
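The model described above maximizes expected net income subject to a risk constraint, which is structurally similar to a mean-variance portfolio problem. The following sketch, with invented per-case returns and variances (not the paper's data or exact formulation), shows how such a constrained patient-mix optimization could be set up numerically.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative data: expected net return and variance of net return per case
# for three diagnostic categories (placeholders, not taken from the paper).
mean_return = np.array([1200.0, -300.0, 450.0])   # $ per case
variance    = np.array([4.0e4, 2.5e5, 9.0e4])
risk_cap    = 1.0e5                               # maximum tolerated mix variance

def neg_income(x):                 # maximise expected net income per case treated
    return -mean_return @ x

constraints = [
    {"type": "eq",   "fun": lambda x: x.sum() - 1.0},               # mix proportions sum to 1
    {"type": "ineq", "fun": lambda x: risk_cap - variance @ x**2},  # risk limit
]

res = minimize(neg_income, x0=np.full(3, 1 / 3), bounds=[(0, 1)] * 3,
               constraints=constraints, method="SLSQP")
print("optimal case mix:", res.x.round(3), "expected net return:", round(-res.fun, 1))
```

Tightening `risk_cap` forces the mix away from high-variance categories, mirroring the paper's conclusion that high-variance, low-return conditions are the ones a profit-maximizing hospital would avoid.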
Why is there net surface heating over the Antarctic Circumpolar Current?
NASA Astrophysics Data System (ADS)
Czaja, Arnaud; Marshall, John
2015-05-01
Using a combination of atmospheric reanalysis data, climate model outputs and a simple model, key mechanisms controlling net surface heating over the Southern Ocean are identified. All data sources used suggest that, in a streamline-averaged view, net surface heating over the Antarctic Circumpolar Current (ACC) is a result of net accumulation of solar radiation rather than a result of heat gain through turbulent fluxes (the latter systematically cool the upper ocean). It is proposed that the fraction of this net radiative heat gain realized as net ACC heating is set by two factors. First, the sea surface temperature at the southern edge of the ACC. Second, the relative strength of the negative heat flux feedbacks associated with evaporation at the sea surface and advection of heat by the residual flow in the oceanic mixed layer. A large advective feedback and a weak evaporative feedback maximize net ACC heating. It is shown that the present Southern Ocean and its circumpolar current are in this heating regime.
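The budget decomposition behind this argument is simply the sum of radiative and turbulent surface fluxes. The numbers below are illustrative placeholders (not the reanalysis values used in the paper) and only show how the streamline-averaged accounting works.

```python
# Streamline-averaged surface energy budget over the ACC (illustrative values,
# W m^-2; positive = ocean heat gain).
shortwave_net = 110.0    # net absorbed solar radiation
longwave_net  = -55.0    # net longwave emission
sensible      = -10.0    # turbulent sensible heat flux
latent        = -35.0    # turbulent latent (evaporative) heat flux

turbulent = sensible + latent
q_net = shortwave_net + longwave_net + turbulent

print(f"turbulent fluxes cool the surface: {turbulent:+.0f} W m^-2")
print(f"net surface heating left over from radiative gain: {q_net:+.0f} W m^-2")
```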
Balasubramanian, Bijal A.; Garcia, Michael P.; Corley, Douglas A.; Doubeni, Chyke A.; Haas, Jennifer S.; Kamineni, Aruna; Quinn, Virginia P.; Wernli, Karen; Zheng, Yingye; Skinner, Celette Sugg
2017-01-01
Previous research shows that patients in integrated health systems experience fewer racial disparities compared with more traditional healthcare systems. Little is known about patterns of racial/ethnic disparities between safety-net and non-safety-net integrated health systems. We evaluated racial/ethnic differences in body mass index (BMI) and the Charlson comorbidity index from 3 non-safety-net and 1 safety-net integrated health systems in a cross-sectional study. Multinomial logistic regression modeled comorbidity and BMI on race/ethnicity and health care system type, adjusting for age, sex, insurance, and zip-code-level income. The study included 1.38 million patients. Higher proportions of safety-net versus non-safety-net patients had comorbidity score of 3+ (11.1% vs. 5.0%) and BMI ≥35 (27.7% vs. 15.8%). In both types of systems, blacks and Hispanics were more likely than whites to have higher BMIs. Whites were more likely than blacks or Hispanics to have higher comorbidity scores in a safety-net system, but less likely to have higher scores in the non-safety-net systems. The odds of comorbidity score 3+ and BMI 35+ in blacks relative to whites were significantly lower in safety-net than in non-safety-net settings. Racial/ethnic differences were present within both safety-net and non-safety-net integrated health systems, but patterns differed. Understanding patterns of racial/ethnic differences in health outcomes in safety-net and non-safety-net integrated health systems is important to tailor interventions to eliminate racial/ethnic disparities in health and health care. PMID:28296752
Controls on the variability of net infiltration to desert sandstone
Heilweil, Victor M.; McKinney, Tim S.; Zhdanov, Michael S.; Watt, Dennis E.
2007-01-01
As populations grow in arid climates and desert bedrock aquifers are increasingly targeted for future development, understanding and quantifying the spatial variability of net infiltration becomes critically important for accurately inventorying water resources and mapping contamination vulnerability. This paper presents a conceptual model of net infiltration to desert sandstone and then develops an empirical equation for its spatial quantification at the watershed scale, using linear least squares inversion to evaluate controlling parameters (independent variables) against estimated net infiltration rates (dependent variables). Net infiltration rates used for this regression analysis were calculated from environmental tracers in boreholes and more than 3000 linear meters of vadose zone excavations in an upland basin in southwestern Utah underlain by Navajo sandstone. Soil coarseness, distance to upgradient outcrop, and topographic slope were shown to be the primary physical parameters controlling the spatial variability of net infiltration. Although the method should be transferable to other desert sandstone settings for determining the relative spatial distribution of net infiltration, further study is needed to evaluate the effects of other potential parameters such as slope aspect, outcrop parameters, and climate on absolute net infiltration rates.
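The regression step described here is a standard linear least-squares inversion. The sketch below uses synthetic predictor and infiltration values (not the study's data) purely to show the mechanics of fitting controlling parameters to tracer-derived net infiltration rates.

```python
import numpy as np

# Columns: soil coarseness (fraction), distance to upgradient outcrop (m),
# topographic slope (degrees). Values are synthetic.
predictors = np.array([
    [0.8, 120.0,  4.0],
    [0.5, 300.0, 10.0],
    [0.9,  60.0,  2.0],
    [0.3, 450.0, 14.0],
    [0.7, 150.0,  6.0],
])
net_infiltration = np.array([45.0, 12.0, 60.0, 4.0, 30.0])  # mm/yr (synthetic)

# Add an intercept column and solve the overdetermined system in a least-squares sense.
G = np.column_stack([np.ones(len(predictors)), predictors])
coeffs, residuals, rank, _ = np.linalg.lstsq(G, net_infiltration, rcond=None)
print("intercept and coefficients:", coeffs.round(3))

# The fitted empirical equation can then be applied to the same parameters
# mapped across a watershed grid to estimate the spatial pattern of net infiltration.
```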
Linking netCDF Data with the Semantic Web - Enhancing Data Discovery Across Domains
NASA Astrophysics Data System (ADS)
Biard, J. C.; Yu, J.; Hedley, M.; Cox, S. J. D.; Leadbetter, A.; Car, N. J.; Druken, K. A.; Nativi, S.; Davis, E.
2016-12-01
Geophysical data communities are publishing large quantities of data across a wide variety of scientific domains which are overlapping more and more. Whilst netCDF is a common format for many of these communities, it is only one of a large number of data storage and transfer formats. One of the major challenges ahead is finding ways to leverage these diverse data sets to advance our understanding of complex problems. We describe a methodology for incorporating Resource Description Framework (RDF) triples into netCDF files called netCDF-LD (netCDF Linked Data). NetCDF-LD explicitly connects the contents of netCDF files - both data and metadata, with external web-based resources, including vocabularies, standards definitions, and data collections, and through them, a whole host of related information. This approach also preserves and enhances the self describing essence of the netCDF format and its metadata, whilst addressing the challenge of integrating various conventions into files. We present a case study illustrating how reasoning over RDF graphs can empower researchers to discover datasets across domain boundaries.
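A minimal way to picture the idea is a netCDF file whose metadata carries explicit links to external web resources alongside the usual CF attributes. The sketch below uses the netCDF4 Python library; the attribute name and URI are illustrative only and do not reproduce the actual netCDF-LD convention, which should be consulted for real files.

```python
from netCDF4 import Dataset

# Create a small CF-style file and attach a linked-data style reference
# from a variable to an external vocabulary entry (illustrative URI).
with Dataset("example.nc", "w") as nc:
    nc.createDimension("time", None)
    sst = nc.createVariable("sea_surface_temperature", "f4", ("time",))

    sst.standard_name = "sea_surface_temperature"
    sst.units = "K"
    # Explicit web link to the governing vocabulary term (attribute name is a placeholder).
    sst.setncattr("external_vocabulary_uri",
                  "http://vocab.nerc.ac.uk/standard_name/sea_surface_temperature/")
    nc.setncattr("Conventions", "CF-1.6")
```

Tools that understand such links can then assemble RDF triples connecting the file's contents to standards definitions and related collections, as the abstract describes.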
NASA Astrophysics Data System (ADS)
Konolige, Kurt G.; Gutmann, Steffen; Guzzoni, Didier; Ficklin, Robert W.; Nicewarner, Keith E.
1999-08-01
Mobile robot hardware and software are developing to the point where interesting applications for groups of such robots can be contemplated. We envision a set of mobots acting to map and perform surveillance or other tasks within an indoor environment (the Sense Net). A typical application of the Sense Net would be to detect survivors in buildings damaged by earthquake or other disaster, where human searchers would be put at risk. As a team, the Sense Net could reconnoiter a set of buildings faster, more reliably, and more comprehensively than an individual mobot. The team, for example, could dynamically form subteams to perform tasks that cannot be done by individual robots, such as measuring the range to a distant object by forming a long-baseline stereo sensor from a pair of mobots. In addition, the team could automatically reconfigure itself to handle contingencies such as disabled mobots. This paper is a report of our current progress in developing the Sense Net, after the first year of a two-year project. In our approach, each mobot has sufficient autonomy to perform several tasks, such as mapping unknown areas, navigating to specific positions, and detecting, tracking, characterizing, and classifying human and vehicular activity. We detail how some of these tasks are accomplished, and how the mobot group is tasked.
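The long-baseline stereo idea follows the standard pinhole-stereo relation: range scales with baseline times focal length over image disparity. The function below is a generic illustration of that relation, not code from the Sense Net project.

```python
def stereo_range(baseline_m, focal_px, disparity_px):
    """Range to a target from a two-robot stereo pair (pinhole camera model).

    baseline_m   : distance between the two cameras (the two mobots), metres
    focal_px     : focal length expressed in pixels
    disparity_px : horizontal pixel offset of the target between the two images
    """
    if disparity_px <= 0:
        raise ValueError("target must be observed with positive disparity")
    return baseline_m * focal_px / disparity_px

# A 5 m inter-robot baseline resolves distant targets far better than the
# few-centimetre baseline available to a single robot's stereo head.
print(stereo_range(baseline_m=5.0, focal_px=800.0, disparity_px=10.0))  # 400.0 m
```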
Libsharp - spherical harmonic transforms revisited
NASA Astrophysics Data System (ADS)
Reinecke, M.; Seljebotn, D. S.
2013-06-01
We present libsharp, a code library for spherical harmonic transforms (SHTs), which evolved from the libpsht library and addresses several of its shortcomings, such as adding MPI support for distributed memory systems and SHTs of fields with arbitrary spin, but also supporting new developments in CPU instruction sets like the Advanced Vector Extensions (AVX) or fused multiply-accumulate (FMA) instructions. The library is implemented in portable C99 and provides an interface that can be easily accessed from other programming languages such as C++, Fortran, Python, etc. Generally, libsharp's performance is at least on par with that of its predecessor; however, significant improvements were made to the algorithms for scalar SHTs, which are roughly twice as fast when using the same CPU capabilities. The library is available at
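For readers who want to experiment with spherical harmonic transforms from Python, the healpy package exposes analysis and synthesis routines whose backend descends from the libpsht/libsharp lineage described above. The round trip below assumes healpy and numpy are installed; it is a generic usage sketch, not part of the libsharp distribution.

```python
import numpy as np
import healpy as hp

# Round-trip spherical harmonic transform of a random HEALPix map.
nside, lmax = 64, 128
npix = hp.nside2npix(nside)
sky = np.random.standard_normal(npix)

alm = hp.map2alm(sky, lmax=lmax)           # analysis: map -> harmonic coefficients
recon = hp.alm2map(alm, nside, lmax=lmax)  # synthesis: coefficients -> map

# The residual reflects the band limit lmax, since the input map is not band-limited.
print("mean absolute residual after round trip:", float(np.abs(sky - recon).mean()))
```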
Zhang, Qingzhou; Yang, Bo; Chen, Xujiao; Xu, Jing; Mei, Changlin; Mao, Zhiguo
2014-01-01
We present a bioinformatics database named Renal Gene Expression Database (RGED), which contains comprehensive gene expression data sets from renal disease research. The web-based interface of RGED allows users to query the gene expression profiles in various kidney-related samples, including renal cell lines, human kidney tissues and murine model kidneys. Researchers can explore certain gene profiles, the relationships between genes of interest and identify biomarkers or even drug targets in kidney diseases. The aim of this work is to provide a user-friendly utility for the renal disease research community to query expression profiles of genes of their own interest without the requirement of advanced computational skills. Availability and implementation: Website is implemented in PHP, R, MySQL and Nginx and freely available from http://rged.wall-eva.net. Database URL: http://rged.wall-eva.net PMID:25252782
Zhang, Qingzhou; Yang, Bo; Chen, Xujiao; Xu, Jing; Mei, Changlin; Mao, Zhiguo
2014-01-01
We present a bioinformatics database named Renal Gene Expression Database (RGED), which contains comprehensive gene expression data sets from renal disease research. The web-based interface of RGED allows users to query the gene expression profiles in various kidney-related samples, including renal cell lines, human kidney tissues and murine model kidneys. Researchers can explore certain gene profiles, the relationships between genes of interest and identify biomarkers or even drug targets in kidney diseases. The aim of this work is to provide a user-friendly utility for the renal disease research community to query expression profiles of genes of their own interest without the requirement of advanced computational skills. Website is implemented in PHP, R, MySQL and Nginx and freely available from http://rged.wall-eva.net. © The Author(s) 2014. Published by Oxford University Press.
Using Petri Net Tools to Study Properties and Dynamics of Biological Systems
Peleg, Mor; Rubin, Daniel; Altman, Russ B.
2005-01-01
Petri Nets (PNs) and their extensions are promising methods for modeling and simulating biological systems. We surveyed PN formalisms and tools and compared them based on their mathematical capabilities as well as by their appropriateness to represent typical biological processes. We measured the ability of these tools to model specific features of biological systems and answer a set of biological questions that we defined. We found that different tools are required to provide all capabilities that we assessed. We created software to translate a generic PN model into most of the formalisms and tools discussed. We have also made available three models and suggest that a library of such models would catalyze progress in qualitative modeling via PNs. Development and wide adoption of common formats would enable researchers to share models and use different tools to analyze them without the need to convert to proprietary formats. PMID:15561791
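To make the Petri net formalism concrete, the following self-contained sketch implements a minimal discrete Petri net (places, transitions, token firing) from scratch and runs a toy enzymatic reaction. It is written purely for illustration and is not one of the surveyed tools or exchange formats.

```python
# Minimal discrete Petri net: places hold integer token counts, transitions
# consume tokens from input places and produce tokens in output places.
class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)          # place -> token count
        self.transitions = {}                 # name -> (inputs, outputs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= w for p, w in inputs.items())

    def fire(self, name):
        inputs, outputs = self.transitions[name]
        if not self.enabled(name):
            raise RuntimeError(f"transition {name!r} is not enabled")
        for p, w in inputs.items():
            self.marking[p] -= w
        for p, w in outputs.items():
            self.marking[p] = self.marking.get(p, 0) + w

# Toy enzymatic step: E + S -> ES -> E + P
net = PetriNet({"E": 1, "S": 3, "ES": 0, "P": 0})
net.add_transition("bind",    {"E": 1, "S": 1}, {"ES": 1})
net.add_transition("convert", {"ES": 1},        {"E": 1, "P": 1})
while net.enabled("bind") or net.enabled("convert"):
    net.fire("bind" if net.enabled("bind") else "convert")
print(net.marking)   # all substrate converted to product
```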
Intermediate Decoding Skills. NetNews. Volume 4, Number 4
ERIC Educational Resources Information Center
LDA of Minnesota, 2004
2004-01-01
Intermediate decoding refers to word analysis skills that are beyond a beginning, one-syllable level as described in an earlier NetNews issue, yet are just as important for building adult level reading proficiency. Research from secondary settings indicates that struggling readers in middle school or high school programs often read between the…
Putative regulatory sites unraveled by network-embedded thermodynamic analysis of metabolome data
Kümmel, Anne; Panke, Sven; Heinemann, Matthias
2006-01-01
As one of the most recent members of the omics family, large-scale quantitative metabolomics data are currently complementing our systems biology data pool and offer the chance to integrate the metabolite level into the functional analysis of cellular networks. Network-embedded thermodynamic analysis (NET analysis) is presented as a framework for mechanistic and model-based analysis of these data. By coupling the data to an operating metabolic network via the second law of thermodynamics and the metabolites' Gibbs energies of formation, NET analysis allows inferring functional principles from quantitative metabolite data; for example it identifies reactions that are subject to active allosteric or genetic regulation as exemplified with quantitative metabolite data from Escherichia coli and Saccharomyces cerevisiae. Moreover, the optimization framework of NET analysis was demonstrated to be a valuable tool to systematically investigate data sets for consistency, for the extension of sub-omic metabolome data sets and for resolving intracompartmental concentrations from cell-averaged metabolome data. Without requiring any kind of kinetic modeling, NET analysis represents a perfectly scalable and unbiased approach to uncover insights from quantitative metabolome data. PMID:16788595
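The thermodynamic core of NET analysis is the second-law constraint that a reaction carrying net forward flux must have a negative transformed Gibbs energy of reaction at the measured metabolite concentrations. The snippet below illustrates that feasibility check for a single reaction with invented standard Gibbs energy and concentrations; the actual framework couples such constraints across the whole network in an optimization problem.

```python
import math

R = 8.314e-3   # gas constant, kJ mol^-1 K^-1
T = 310.0      # temperature, K

def reaction_gibbs(dG0, substrates, products):
    """Gibbs energy of reaction (kJ/mol) at given metabolite concentrations (M).

    dG0 is the standard Gibbs energy of reaction; the value used below is
    illustrative, not curated data from the NET analysis framework.
    """
    ln_q = sum(math.log(c) for c in products) - sum(math.log(c) for c in substrates)
    return dG0 + R * T * ln_q

# Second-law consistency check for a reaction observed to carry forward flux.
dG = reaction_gibbs(dG0=5.0, substrates=[2e-3], products=[5e-5])
print(f"dG = {dG:.1f} kJ/mol ->",
      "forward flux thermodynamically feasible" if dG < 0
      else "forward flux infeasible at these concentrations")
```

Reactions found to operate far from equilibrium by such constraints are the candidates for active allosteric or genetic regulation highlighted in the abstract.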
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanders, J; Tian, X; Segars, P
2016-06-15
Purpose: To develop an automated technique for estimating patient-specific regional imparted energy and dose from tube current modulated (TCM) computed tomography (CT) exams across a diverse set of head and body protocols. Methods: A library of 58 adult computational anthropomorphic extended cardiac-torso (XCAT) phantoms was used to model a patient population. A validated Monte Carlo program was used to simulate TCM CT exams on the entire library of phantoms for three head and 10 body protocols. The net imparted energy to the phantoms, normalized by dose length product (DLP), and the net tissue mass in each of the scan regions were computed. A knowledgebase containing relationships between normalized imparted energy and scanned mass was established. An automated computer algorithm was written to estimate the scanned mass from actual clinical CT exams. The scanned mass estimate, DLP of the exam, and knowledgebase were used to estimate the imparted energy to the patient. The algorithm was tested on 20 chest and 20 abdominopelvic TCM CT exams. Results: The normalized imparted energy increased with increasing kV for all protocols. However, the normalized imparted energy was relatively unaffected by the strength of the TCM. The average imparted energy was 681 ± 376 mJ for abdominopelvic exams and 274 ± 141 mJ for chest exams. Overall, the method was successful in providing patient-specific estimates of imparted energy for 98% of the cases tested. Conclusion: Imparted energy normalized by DLP increased with increasing tube potential. However, the strength of the TCM did not have a significant effect on the net amount of energy deposited to tissue. The automated program can be implemented into the clinical workflow to provide estimates of regional imparted energy and dose across a diverse set of clinical protocols.
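The estimation step reduces to a lookup-and-scale operation: interpolate the normalized imparted energy from the knowledgebase at the patient's scanned mass, then multiply by the exam's DLP. The table values below are invented to show the mechanics, not results from the study's Monte Carlo simulations.

```python
import numpy as np

# Illustrative knowledgebase for one body protocol: normalized imparted energy
# (mJ per mGy*cm of DLP) as a function of scanned mass (kg). Placeholder values.
kb_mass      = np.array([40.0, 60.0, 80.0, 100.0, 120.0])
kb_e_per_dlp = np.array([1.9, 1.6, 1.4, 1.25, 1.15])

def imparted_energy(scanned_mass_kg, dlp_mgy_cm):
    """Estimate imparted energy (mJ) = interpolated (E/DLP) x exam DLP."""
    e_per_dlp = np.interp(scanned_mass_kg, kb_mass, kb_e_per_dlp)
    return e_per_dlp * dlp_mgy_cm

print(round(imparted_energy(scanned_mass_kg=72.0, dlp_mgy_cm=450.0)), "mJ")
```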
Barnard, Juliana G; Dempsey, Amanda F; Brewer, Sarah E; Pyrzanowski, Jennifer; Mazzoni, Sara E; O'Leary, Sean T
2017-01-01
Many young and middle-aged women receive their primary health care from their obstetrician-gynecologists. A recent change to vaccination recommendations during pregnancy has forced the integration of new clinical processes at obstetrician-gynecology practices. Evidence-based best practices for vaccination delivery include the establishment of vaccination standing orders. As part of an intervention to increase adoption of evidence-based vaccination strategies for women in safety-net and private obstetrician-gynecology settings, we conducted a qualitative study to identify the facilitators and barriers experienced by obstetrician-gynecology sites when establishing vaccination standing orders. At 6 safety-net and private obstetrician-gynecology practices, 51 semistructured interviews were completed by trained qualitative researchers over 2 years with clinical staff and vaccination program personnel. Standardized qualitative research methods were used during data collection and team-based data analysis to identify major themes and subthemes within the interview data. All study practices achieved partial to full implementation of vaccine standing orders for human papillomavirus, tetanus diphtheria pertussis, and influenza vaccines. Facilitating factors for vaccine standing order adoption included process standardization, acceptance of a continual modification process, and staff training. Barriers to vaccine standing order adoption included practice- and staff-level competing demands, pregnant women's preference for medical providers to discuss vaccine information with them, and staff hesitation in determining HPV vaccine eligibility. With guidance and commitment to integration of new processes, obstetrician-gynecology practices are able to establish vaccine standing orders for pregnant and nonpregnant women. Attention to certain process barriers can aid the adoption of processes to support the delivery of vaccinations in obstetrician-gynecology practice settings and provide access to preventive health care for many women. Copyright © 2016 Elsevier Inc. All rights reserved.
Apinjoh, Tobias O; Anchang-Kimbi, Judith K; Mugri, Regina N; Tangoh, Delphine A; Nyingchu, Robert V; Chi, Hanesh F; Tata, Rolland B; Njumkeng, Charles; Njua-Yafi, Clarisse; Achidi, Eric A
2015-01-01
Insecticide Treated Nets (ITNs) have been shown to reduce morbidity and mortality, but coverage and proper utilization continues to be moderate in many parts of sub-Saharan Africa. The gains made through a nationwide free distribution were explored as well as the effect on malaria prevalence in semi-urban and rural communities in south western Cameroon. A cross sectional survey was conducted between August and December 2013. Information on net possession, status and use were collected using a structured questionnaire while malaria parasitaemia was determined on Giemsa-stained blood smears by light microscopy. ITN ownership increased from 41.9% to 68.1% following the free distribution campaign, with 58.3% (466/799) reportedly sleeping under the net. ITN ownership was lower in rural settings (adjusted OR = 1.93, 95%CI = 1.36-2.74, p<0.001) and at lower altitude (adjusted OR = 1.79, 95%CI = 1.22-2.62, p = 0.003) compared to semi-urban settings and intermediate altitude respectively. Conversely, ITN usage was higher in semi-urban settings (p = 0.002) and at intermediate altitude (p = 0.002) compared with rural localities and low altitude. Malaria parasitaemia prevalence was higher in rural (adjusted OR = 1.63, 95%CI = 1.07-2.49) compared to semi-urban settings and in those below 15 years compared to those 15 years and above. Overall, participants who did not sleep under ITN were more susceptible to malaria parasitaemia (adjusted OR = 1.70, 95%CI = 1.14-2.54, p = 0.009). Despite the free distribution campaign, ITN ownership and usage, though improved, is still low. As children who reside in rural settings have greater disease burden (parasitemia) than children in semi-urban settings, the potential gains on both reducing inequities in ITN possession as well as disease burden might be substantial if equitable distribution strategies are adopted.
NASA Astrophysics Data System (ADS)
Haran, T. M.; Brodzik, M. J.; Nordgren, B.; Estilow, T.; Scott, D. J.
2015-12-01
An increasing number of new Earth science datasets are being produced by data providers in self-describing, machine-independent file formats including Hierarchical Data Format version 5 (HDF5) and Network Common Data Form version 4 (netCDF-4). Furthermore, data providers may be producing netCDF-4 files that follow the conventions for Climate and Forecast metadata version 1.6 (CF 1.6) which, for datasets mapped to a projected raster grid covering all or a portion of the earth, includes the Coordinate Reference System (CRS) used to define how latitude and longitude are mapped to grid coordinates, i.e. columns and rows, and vice versa. One problem that users may encounter is that their preferred visualization and analysis tool may not yet include support for one of these newer formats. Moreover, data distributors such as NASA's NSIDC DAAC may not yet include support for on-the-fly conversion of data files for all data sets produced in a new format to a preferred older distributed format. There do exist open source solutions to this dilemma in the form of software packages that can translate files in one of the new formats to one of the preferred formats. However, these software packages require that the file to be translated conform to the specifications of its respective format. Although an online CF-Convention compliance checker is available from cfconventions.org, a recent NSIDC user services incident described here in detail involved an NSIDC-supported data set that passed the (then current) CF Checker Version 2.0.6, but was in fact lacking two variables necessary for conformance. This problem was not detected until GDAL, a software package which relied on the missing variables, was employed by a user in an attempt to translate the data into a different file format, namely GeoTIFF. This incident indicates that testing a candidate data product with one or more software products written to accept the advertised conventions is proposed as a practice which improves interoperability. Differences between data file contents and software package expectations are exposed, affording an opportunity to improve conformance of software, data or both. The incident can also serve as a demonstration that data providers, distributors, and users can work together to improve data product quality and interoperability.
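The translation step at which the missing grid-mapping information was detected can be exercised directly from Python via GDAL's bindings. The file and variable names below are placeholders; the subdataset string follows GDAL's netCDF naming scheme.

```python
from osgeo import gdal

# Translate one variable of a CF/netCDF-4 file to GeoTIFF. If the CRS
# variables expected by the CF convention are missing, this is where
# the problem typically surfaces.
gdal.UseExceptions()
gdal.Translate("analysed_field.tif", 'NETCDF:"input_grid.nc":analysed_field')
```

Running such a conversion as a routine check on candidate products is one concrete way to apply the practice recommended above.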
Mueller, Gordon A.; Wydoski, Richard; Best, Eric; Hiebert, Steve; Lantow, Jeff; Santee, Mark; Goettlicher, Bill; Millosovich, Joe
2008-01-01
Trammel netting is generally the accepted method of monitoring razorback sucker in reservoirs, but this method is ineffective for monitoring this fish in rivers. Trammel nets set in the current become fouled with debris, and nets set in backwaters capture high numbers of nontarget species. Nontargeted fish composed 97 percent of fish captured in previous studies (1999-2005). In 2005, discovery of a large spawning aggregation of razorback sucker in midchannel near Needles, Calif., prompted the development of more effective methods to monitor this and possibly other riverine fish populations. This study examined the effectiveness of four methods of monitoring razorback sucker in a riverine environment. Hoop netting, electrofishing, boat surveys, and aerial photography were evaluated in terms of data accuracy, costs, stress on targeted fish, and effect on nontargeted fish as compared with trammel netting. Trammel netting in the riverine portion of the Colorado River downstream of Davis Dam, Arizona-Nevada yielded an average of 43 razorback suckers a year (1999 to 2005). Capture rates averaged 0.5 razorback suckers per staff day effort, at a cost exceeding $1,100 per fish. Population estimates calculated for 2003-2005 were 3,570 (95 percent confidence limits [CL] = 1,306-8,925), 1,768 (CL = 878-3,867) and 1,652 (CL = 706-5,164); wide confidence ranges reflect the small sample size. By-catch associated with trammel netting included common carp, game fish and, occasionally, shorebirds, waterfowl, and muskrats. Hoop nets were prone to downstream drift owing to design and anchoring problems aggravated by hydropower ramping. Tests were dropped after the 2006 field season and replaced with electrofishing. Electrofishing at night during low flow and when spawning razorback suckers moved to the shoreline proved extremely effective. In 2006 and 2007, 263 and 299 (respectively) razorback suckers were taken. Capture rates averaged 8.3 razorback suckers per staff day at a cost of $62 per fish. The adult population was estimated at 1,196 (925-1,546) fish. Compared with trammel netting, confidence limits narrowed substantially, from ±500 percent to ±30 percent, reflecting more precise estimates. By-catch was limited to two common carp. No recreational game fish, waterfowl, or mammals were captured or handled during use of electrofishing. Aerial photography (2006 and 2007) suggested an annual average of 580 fish detected on imagery. Identification of species was not possible; carp commonly have been mistaken for razorback sucker. Field verification determined that the proportion of razorback suckers to other fish was 3:1. On that basis, we estimated 435 razorback suckers were photographed, which equals 8.4 razorback suckers per staff day at a cost of $78 per fish. The data did not lend itself to population estimates. Fish were more easily identified from boats, where their lateral rather than their dorsal aspect is visible. On average, 888 razorback suckers were positively identified each year. Observation rates averaged 29.6 razorback suckers per staff day at a cost less than $18 per fish observed. Sucker densities averaged 20.5 and 9.6 fish/hectare which equated to an average spawning population at Needles, Calif., of 2,520 in 2006 and 1,152 in 2007. The lower 2007 estimate reflected a refinement in sampling approach which removed a sampling bias.
Electrofishing and boat surveys were more cost effective than other methods tested, and they provided more accurate information without the by-catch associated with trammel netting. However, they provided different types of data. Handling fish may be necessary for research purposes but unnecessary for general trend analysis. Electrofishing was extremely effective but can harm fish if not used with caution. Unnecessary electrofishing increases the likelihood of spinal damage and possible damage to eggs and potential young, and it may alter spawning behavior or duration. B
Pourat, Nadereh; Martinez, Ana E; Crall, James J
2015-09-01
Community Health Centers (CHCs) are one of the principal safety-net providers of health care for low-income and uninsured populations. Co-locating dental services in primary care settings provides an opportunity to improve access to dental care. Yet this study of California CHCs that provide primary care services shows that only about one-third of them co-located primary and dental care services on-site. An additional one-third were members of multisite organizations in which at least one other site provided dental care. The remaining one-third of CHC sites had no dental care capacity. Policy options to promote co-location include requiring on-site availability of dental services, providing infrastructure funding to build and equip dental facilities, and offering financial incentives to provide dental care and recruit dental providers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doris, E.; Busche, S.; Hockett, S.
2009-12-01
The goal of the Minnesota net metering policy is to give the maximum possible encouragement to distributed generation assets, especially solar electric systems (MN 2008). However, according to a published set of best practices (NNEC 2008) that prioritize the maximum development of solar markets within states, the Minnesota policy does not incorporate many of the important best practices that may help other states transform their solar energy markets and increase the amount of grid-connected distributed solar generation assets. Reasons cited include the low system size limit of 40 kW (the best practices document recommends a 2 MW limit) and a lack of language protecting generators from additional utility fees. This study was conducted to compare Minnesota's policies to national best practices. It provides an overview of the current Minnesota policy in the context of these best practices and other jurisdictions' net metering policies, as well as a qualitative assessment of the impacts of raising the system size cap within the policy based on the experiences of other states.
Fabrication and Testing of a Thin-Film Heat Flux Sensor for a Stirling Convertor
NASA Technical Reports Server (NTRS)
Wilson, Scott D.; Fralick, Gustave; Wrbanek, John; Sayir, Ali
2009-01-01
The NASA Glenn Research Center (GRC) has been testing high efficiency free-piston Stirling convertors for potential use in radioisotope power systems since 1999. Stirling convertors are being operated for many years to demonstrate a radioisotope power system capable of providing reliable power for potential multi-year missions. Techniques used to monitor the convertors for change in performance include measurements of temperature, pressure, energy addition, and energy rejection. Micro-porous bulk insulation is used in the Stirling convertor test set up to minimize the loss of thermal energy from the electric heat source to the environment. The insulation is characterized before extended operation, enabling correlation of the net thermal energy addition to the convertor. Aging microporous bulk insulation changes insulation efficiency, introducing errors in the correlation for net thermal energy addition. A thin-film heat flux sensor was designed and fabricated to directly measure the net thermal energy addition to the Stirling convertor. The fabrication techniques include slip casting and using Physical Vapor Deposition (PVD). One micron thick noble metal thermocouples measure temperature on the surface of an Alumina ceramic disc and heat flux is calculated. Fabrication, integration, and test results of a thin-film heat flux sensor are presented.
Fuzzy Stochastic Petri Nets for Modeling Biological Systems with Uncertain Kinetic Parameters
Liu, Fei; Heiner, Monika; Yang, Ming
2016-01-01
Stochastic Petri nets (SPNs) have been widely used to model randomness which is an inherent feature of biological systems. However, for many biological systems, some kinetic parameters may be uncertain due to incomplete, vague or missing kinetic data (often called fuzzy uncertainty), or naturally vary, e.g., between different individuals, experimental conditions, etc. (often called variability), which has prevented a wider application of SPNs that require accurate parameters. Considering the strength of fuzzy sets to deal with uncertain information, we apply a specific type of stochastic Petri nets, fuzzy stochastic Petri nets (FSPNs), to model and analyze biological systems with uncertain kinetic parameters. FSPNs combine SPNs and fuzzy sets, thereby taking into account both randomness and fuzziness of biological systems. For a biological system, SPNs model the randomness, while fuzzy sets model kinetic parameters with fuzzy uncertainty or variability by associating each parameter with a fuzzy number instead of a crisp real value. We introduce a simulation-based analysis method for FSPNs to explore the uncertainties of outputs resulting from the uncertainties associated with input parameters, which works equally well for bounded and unbounded models. We illustrate our approach using a yeast polarization model having an infinite state space, which shows the appropriateness of FSPNs in combination with simulation-based analysis for modeling and analyzing biological systems with uncertain information. PMID:26910830
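One simple way to picture how fuzzy kinetic parameters propagate through a stochastic model is to evaluate the model at the bounds of each alpha-cut of a triangular fuzzy number. The sketch below does this for a first-order decay reaction simulated with exponential waiting times; it is a simplified illustration (it assumes the output depends monotonically on the rate constant) and does not implement the FSPN formalism itself. All numerical values are invented.

```python
import random

def triangular_alpha_cut(a, b, c, alpha):
    """Interval of a triangular fuzzy number (a, b, c) at membership level alpha."""
    return a + alpha * (b - a), c - alpha * (c - b)

def mean_extinction_time(k, n0=50, runs=200):
    """Stochastic simulation (exponential waiting times) of first-order decay A -> 0."""
    total = 0.0
    for _ in range(runs):
        t, n = 0.0, n0
        while n > 0:
            t += random.expovariate(k * n)   # waiting time to the next decay event
            n -= 1
        total += t
    return total / runs

# Propagate a fuzzy rate constant through the stochastic model, alpha-cut by alpha-cut.
for alpha in (0.0, 0.5, 1.0):
    k_lo, k_hi = triangular_alpha_cut(0.05, 0.1, 0.2, alpha)
    print(f"alpha={alpha}: extinction time in "
          f"[{mean_extinction_time(k_hi):.1f}, {mean_extinction_time(k_lo):.1f}] (a.u.)")
```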
36 CFR 2.4 - Weapons, traps and nets.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 36 Parks, Forests, and Public Property 1 2011-07-01 2011-07-01 false Weapons, traps and nets. 2.4... PROTECTION, PUBLIC USE AND RECREATION § 2.4 Weapons, traps and nets. (a)(1) Except as otherwise provided in... prohibited: (i) Possessing a weapon, trap or net (ii) Carrying a weapon, trap or net (iii) Using a weapon...
36 CFR 2.4 - Weapons, traps and nets.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 36 Parks, Forests, and Public Property 1 2012-07-01 2012-07-01 false Weapons, traps and nets. 2.4... PROTECTION, PUBLIC USE AND RECREATION § 2.4 Weapons, traps and nets. (a)(1) Except as otherwise provided in... prohibited: (i) Possessing a weapon, trap or net (ii) Carrying a weapon, trap or net (iii) Using a weapon...
36 CFR 2.4 - Weapons, traps and nets.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 36 Parks, Forests, and Public Property 1 2013-07-01 2013-07-01 false Weapons, traps and nets. 2.4... PROTECTION, PUBLIC USE AND RECREATION § 2.4 Weapons, traps and nets. (a)(1) Except as otherwise provided in... prohibited: (i) Possessing a weapon, trap or net (ii) Carrying a weapon, trap or net (iii) Using a weapon...
36 CFR 2.4 - Weapons, traps and nets.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 36 Parks, Forests, and Public Property 1 2014-07-01 2014-07-01 false Weapons, traps and nets. 2.4... PROTECTION, PUBLIC USE AND RECREATION § 2.4 Weapons, traps and nets. (a)(1) Except as otherwise provided in... prohibited: (i) Possessing a weapon, trap or net (ii) Carrying a weapon, trap or net (iii) Using a weapon...
36 CFR 2.4 - Weapons, traps and nets.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 36 Parks, Forests, and Public Property 1 2010-07-01 2010-07-01 false Weapons, traps and nets. 2.4... PROTECTION, PUBLIC USE AND RECREATION § 2.4 Weapons, traps and nets. (a)(1) Except as otherwise provided in... prohibited: (i) Possessing a weapon, trap or net (ii) Carrying a weapon, trap or net (iii) Using a weapon...
NASA Astrophysics Data System (ADS)
Clempner, Julio B.
2017-01-01
This paper presents a novel analytical method for soundness verification of workflow nets and reset workflow nets, using the well-known stability results of Lyapunov for Petri nets. We also prove that the soundness property is decidable for workflow nets and reset workflow nets. In addition, we provide several results, obtained via stability, concerning properties such as boundedness, liveness, reversibility and blocking. Our approach is validated theoretically and by a numerical example related to traffic signal-control synchronisation.
SpectralNET – an application for spectral graph analysis and visualization
Forman, Joshua J; Clemons, Paul A; Schreiber, Stuart L; Haggarty, Stephen J
2005-01-01
Background Graph theory provides a computational framework for modeling a variety of datasets including those emerging from genomics, proteomics, and chemical genetics. Networks of genes, proteins, small molecules, or other objects of study can be represented as graphs of nodes (vertices) and interactions (edges) that can carry different weights. SpectralNET is a flexible application for analyzing and visualizing these biological and chemical networks. Results Available both as a standalone .NET executable and as an ASP.NET web application, SpectralNET was designed specifically with the analysis of graph-theoretic metrics in mind, a computational task not easily accessible using currently available applications. Users can choose either to upload a network for analysis using a variety of input formats, or to have SpectralNET generate an idealized random network for comparison to a real-world dataset. Whichever graph-generation method is used, SpectralNET displays detailed information about each connected component of the graph, including graphs of degree distribution, clustering coefficient by degree, and average distance by degree. In addition, extensive information about the selected vertex is shown, including degree, clustering coefficient, various distance metrics, and the corresponding components of the adjacency, Laplacian, and normalized Laplacian eigenvectors. SpectralNET also displays several graph visualizations, including a linear dimensionality reduction for uploaded datasets (Principal Components Analysis) and a non-linear dimensionality reduction that provides an elegant view of global graph structure (Laplacian eigenvectors). Conclusion SpectralNET provides an easily accessible means of analyzing graph-theoretic metrics for data modeling and dimensionality reduction. SpectralNET is publicly available as both a .NET application and an ASP.NET web application from . Source code is available upon request. PMID:16236170
SpectralNET--an application for spectral graph analysis and visualization.
Forman, Joshua J; Clemons, Paul A; Schreiber, Stuart L; Haggarty, Stephen J
2005-10-19
Graph theory provides a computational framework for modeling a variety of datasets including those emerging from genomics, proteomics, and chemical genetics. Networks of genes, proteins, small molecules, or other objects of study can be represented as graphs of nodes (vertices) and interactions (edges) that can carry different weights. SpectralNET is a flexible application for analyzing and visualizing these biological and chemical networks. Available both as a standalone .NET executable and as an ASP.NET web application, SpectralNET was designed specifically with the analysis of graph-theoretic metrics in mind, a computational task not easily accessible using currently available applications. Users can choose either to upload a network for analysis using a variety of input formats, or to have SpectralNET generate an idealized random network for comparison to a real-world dataset. Whichever graph-generation method is used, SpectralNET displays detailed information about each connected component of the graph, including graphs of degree distribution, clustering coefficient by degree, and average distance by degree. In addition, extensive information about the selected vertex is shown, including degree, clustering coefficient, various distance metrics, and the corresponding components of the adjacency, Laplacian, and normalized Laplacian eigenvectors. SpectralNET also displays several graph visualizations, including a linear dimensionality reduction for uploaded datasets (Principal Components Analysis) and a non-linear dimensionality reduction that provides an elegant view of global graph structure (Laplacian eigenvectors). SpectralNET provides an easily accessible means of analyzing graph-theoretic metrics for data modeling and dimensionality reduction. SpectralNET is publicly available as both a .NET application and an ASP.NET web application from http://chembank.broad.harvard.edu/resources/. Source code is available upon request.
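The graph-theoretic and spectral quantities described above (degree, clustering coefficient, Laplacian eigenvectors used as a non-linear embedding) can be reproduced for a small network with standard Python libraries. The sketch below uses networkx and numpy rather than the SpectralNET application itself, and generates an idealized random graph in lieu of an uploaded dataset.

```python
import numpy as np
import networkx as nx

# Idealized random network; keep only its largest connected component.
G = nx.erdos_renyi_graph(n=60, p=0.08, seed=1)
G = G.subgraph(max(nx.connected_components(G), key=len)).copy()

degrees = dict(G.degree())
clustering = nx.clustering(G)

# Normalized Laplacian eigenvectors: the 2nd and 3rd smallest give a
# global-structure-preserving 2-D embedding (Laplacian eigenmap).
L = nx.normalized_laplacian_matrix(G).toarray()
eigvals, eigvecs = np.linalg.eigh(L)
embedding = eigvecs[:, 1:3]

print("mean degree:", round(float(np.mean(list(degrees.values()))), 2))
print("mean clustering coefficient:", round(float(np.mean(list(clustering.values()))), 2))
print("first non-trivial Laplacian eigenvalues:", eigvals[1:4].round(3))
```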
Optimizing Sampling Design to Deal with Mist-Net Avoidance in Amazonian Birds and Bats
Marques, João Tiago; Ramos Pereira, Maria J.; Marques, Tiago A.; Santos, Carlos David; Santana, Joana; Beja, Pedro; Palmeirim, Jorge M.
2013-01-01
Mist netting is a widely used technique to sample bird and bat assemblages. However, captures often decline with time because animals learn and avoid the locations of nets. This avoidance or net shyness can substantially decrease sampling efficiency. We quantified the day-to-day decline in captures of Amazonian birds and bats with mist nets set at the same location for four consecutive days. We also evaluated how net avoidance influences the efficiency of surveys under different logistic scenarios using re-sampling techniques. Net avoidance caused substantial declines in bird and bat captures, although more accentuated in the latter. Most of the decline occurred between the first and second days of netting: 28% in birds and 47% in bats. Captures of commoner species were more affected. The numbers of species detected also declined. Moving nets daily to minimize the avoidance effect increased captures by 30% in birds and 70% in bats. However, moving the location of nets may cause a reduction in netting time and captures. When moving the nets caused the loss of one netting day it was no longer advantageous to move the nets frequently. In bird surveys that could even decrease the number of individuals captured and species detected. Net avoidance can greatly affect sampling efficiency but adjustments in survey design can minimize this. Whenever nets can be moved without losing netting time and the objective is to capture many individuals, they should be moved daily. If the main objective is to survey species present then nets should still be moved for bats, but not for birds. However, if relocating nets causes a significant loss of netting time, moving them to reduce effects of shyness will not improve sampling efficiency in either group. Overall, our findings can improve the design of mist netting sampling strategies in other tropical areas. PMID:24058579
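A back-of-envelope calculation using the decline figures reported above shows why the design trade-off matters. The sketch simplifies the abstract's resampling analysis to a single constant between-day decline (the paper notes most of the decline occurs between days one and two), so the numbers are only indicative.

```python
def expected_captures(first_day, decline, days_netting):
    """Total expected captures given a constant per-day decline in capture rate."""
    captures, daily = 0.0, first_day
    for _ in range(days_netting):
        captures += daily
        daily *= (1 - decline)
    return captures

first_day_bats = 100.0  # arbitrary baseline
static_4_days = expected_captures(first_day_bats, decline=0.47, days_netting=4)  # nets left in place
moved_daily   = expected_captures(first_day_bats, decline=0.0,  days_netting=4)  # nets moved, no avoidance
moved_lose_1  = expected_captures(first_day_bats, decline=0.0,  days_netting=3)  # moving costs one netting day

print(f"static nets, 4 days:            {static_4_days:.0f}")
print(f"moved daily, 4 days:            {moved_daily:.0f}")
print(f"moved daily, one day lost:      {moved_lose_1:.0f}")
```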
Comparison of Bottomless Lift Nets and Breder Traps for Sampling Salt-Marsh Nekton
Data set contains: the length of mummichogs (Fundulus heteroclitus) caught on lift nets and Breder traps from May to September 2002; the sizes of green crabs caught in the lift nets and Breder traps during same time frame; the mean density and sample size data for each sampling time and each site (3 sites total) for total nekton sampled and total nekton minus shrimp. This dataset is associated with the following publication: Raposa, K., and M. Chintala. Comparison of Bottomless Lift Nets and Breder Traps for Sampling Salt-Marsh Nekton. TRANSACTIONS OF THE AMERICAN FISHERIES SOCIETY. American Fisheries Society, Bethesda, MD, USA, 145(1): 163-172, (2016).
Fault detection and initial state verification by linear programming for a class of Petri nets
NASA Technical Reports Server (NTRS)
Rachell, Traxon; Meyer, David G.
1992-01-01
The authors present an algorithmic approach to determining when the marking of a LSMG (live safe marked graph) or a LSFC (live safe free choice) net is in the set of live safe markings M. Hence, once the marking of a net is determined to be in M, then if at some time thereafter the marking of this net is determined not to be in M, this indicates a fault. It is shown how linear programming can be used to determine if m is an element of M. The worst-case computational complexity of each algorithm is bounded by the number of linear programs necessary to compute.
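The flavor of such a linear-programming test can be illustrated with the Petri net state equation m = m0 + C·σ, σ ≥ 0, whose feasibility is a standard necessary condition for m to be reachable from m0. The sketch below is a generic feasibility check with scipy, not the paper's class-specific algorithms, which add the structural conditions needed for marked graphs and free-choice nets.

```python
import numpy as np
from scipy.optimize import linprog

def state_equation_feasible(C, m0, m):
    """Check whether m = m0 + C @ sigma holds for some firing count vector sigma >= 0.

    C is the incidence matrix (places x transitions). This is a necessary
    condition for reachability, tested as an LP feasibility problem.
    """
    n_trans = C.shape[1]
    res = linprog(c=np.zeros(n_trans),
                  A_eq=C, b_eq=np.asarray(m) - np.asarray(m0),
                  bounds=[(0, None)] * n_trans, method="highs")
    return res.status == 0   # 0 = an optimal (hence feasible) solution was found

# Two-place, two-transition cycle: t1 moves the token p1 -> p2, t2 moves it back.
C = np.array([[-1,  1],
              [ 1, -1]])
print(state_equation_feasible(C, m0=[1, 0], m=[0, 1]))   # True: marking is consistent
print(state_equation_feasible(C, m0=[1, 0], m=[2, 2]))   # False: flags a fault
```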
Threats to the health care safety net.
Taylor, T B
2001-11-01
The American health care safety net is threatened due to inadequate funding in the face of increasing demand for services by virtually every segment of our society. The safety net is vital to public safety because it is the sole provider for first-line emergency care, as well as for routine health care of last resort, through hospital emergency departments (ED), emergency medical services providers (EMS), and public/free clinics. Despite the perceived complexity, the causes and solutions for the current crisis reside in simple economics. During the last two decades health care funding has radically changed, yet the fundamental infrastructure of the safety net has change little. In 1986, the Emergency Medical Treatment and Active Labor Act established federally mandated safety net care that inadvertently encouraged reliance on hospital EDs as the principal safety net resource. At the same time, decreasing health care funding from both private and public sources resulted in declining availability of services necessary to support this shift in demand, including hospital inpatient beds, EDs, EMS providers, on-call specialists, hospital-based nurses, and public hospitals/clinics. The result has been ED/hospital crowding and resource shortages that at times limit the ability to provide even true emergency care and threaten the ability of the traditional safety net to protect public health and safety. This paper explores the composition of the American health care safety net, the root causes for its disintegration, and offers short- and long-term solutions. The solutions discussed include restructuring of disproportionate share funding; presumed (deemed) eligibility for Medicaid eligibility; restructuring of funding for emergency care; health care for foreign nationals; the nursing shortage; utilization of a "health care resources commission"; "episodic (periodic)" health care coverage; best practices and health care services coordination; and government and hospital providers' roles. There is a base amount of funding that must be available to the American health care safety net to maintain its infrastructure and provide appropriate growth, research, development, and expansion of services. Fall below this level and the infrastructure will eventually crumble. America must patch the safety net with short-term funding and repair it with long-term health care policy and environmental changes.
Phoenix: SOA based information management services
NASA Astrophysics Data System (ADS)
Grant, Rob; Combs, Vaughn; Hanna, Jim; Lipa, Brian; Reilly, Jim
2009-05-01
The Air Force Research Laboratory (AFRL) has developed a reference set of Information Management (IM) Services that will provide an essential piece of the envisioned final Net-Centric IM solution for the Department of Defense (DoD). These IM Services will provide mission critical functionality to enable seamless interoperability between existing and future DoD systems and services while maintaining a highly available IM capability across the wide spectrum of differing scalability and performance requirements. AFRL designed this set of IM Services for integration with other DoD and commercial SOA environments. The services developed will provide capabilities for information submission, information brokering and discovery, repository, query, type management, dissemination, session management, authorization, service brokering and event notification. In addition, the IM services support common information models that facilitate the management and dissemination of information consistent with client needs and established policy. The services support flexible and extensible definitions of session, service, and channel contexts that enable the application of Quality of Service (QoS) and security policies at many levels within the SOA.
Advancing MEMS Technology Usage through the MUMPS (Multi-User MEMS Processes) Program
NASA Technical Reports Server (NTRS)
Koester, D. A.; Markus, K. W.; Dhuler, V.; Mahadevan, R.; Cowen, A.
1995-01-01
In order to help provide access to advanced micro-electro-mechanical systems (MEMS) technologies and lower the barriers for both industry and academia, the Microelectronic Center of North Carolina (MCNC) and ARPA have developed a program which provides users with access to both MEMS processes and advanced electronic integration techniques. The four distinct aspects of this program, the multi-user MEMS processes (MUMPs), the consolidated micro-mechanical element library, smart MEMS, and the MEMS technology network, are described in this paper. MUMPs is an ARPA-supported program created to provide inexpensive access to MEMS technology in a multi-user environment. It is both a proof-of-concept and educational tool that aids in the development of MEMS in the domestic community. MUMPs technologies currently include a 3-layer poly-silicon surface micromachining process and LIGA (lithography, electroforming, and injection molding) processes that provide reasonable design flexibility within set guidelines. The consolidated micromechanical element library (CaMEL) is a library of active and passive MEMS structures that can be downloaded by the MEMS community via the internet. Smart MEMS is the development of advanced electronics integration techniques for MEMS through the application of flip chip technology. The MEMS technology network (TechNet) is a menu of standard substrates and MEMS fabrication processes that can be purchased and combined to create unique process flows. TechNet provides the MEMS community greater flexibility and enhanced technology accessibility.
Daniel, J B; Friggens, N C; van Laar, H; Ingvartsen, K L; Sauvant, D
2018-06-01
The control of nutrient partitioning is complex and affected by many factors, among them physiological state and production potential. Therefore, the current model aims to provide for dairy cows a dynamic framework to predict a consistent set of reference performance patterns (milk component yields, body composition change, dry-matter intake) sensitive to physiological status across a range of milk production potentials (within and between breeds). Flows and partition of net energy toward maintenance, growth, gestation, body reserves and milk components are described in the model. The structure of the model is characterized by two sub-models, a regulating sub-model of homeorhetic control which sets dynamic partitioning rules along the lactation, and an operating sub-model that translates this into animal performance. The regulating sub-model describes lactation as the result of three driving forces: (1) use of previously acquired resources through mobilization, (2) acquisition of new resources with a priority of partition towards milk and (3) subsequent use of resources towards body reserves gain. The dynamics of these three driving forces were adjusted separately for fat (milk and body), protein (milk and body) and lactose (milk). Milk yield is predicted from lactose and protein yields with an empirical equation developed from literature data. The model predicts desired dry-matter intake as an outcome of net energy requirements for a given dietary net energy content. The parameters controlling milk component yields and body composition changes were calibrated using two data sets in which the diet was the same for all animals. Weekly data from Holstein dairy cows was used to calibrate the model within-breed across milk production potentials. A second data set was used to evaluate the model and to calibrate it for breed differences (Holstein, Danish Red and Jersey) on the mobilization/reconstitution of body composition and on the yield of individual milk components. These calibrations showed that the model framework was able to adequately simulate milk yield, milk component yields, body composition changes and dry-matter intake throughout lactation for primiparous and multiparous cows differing in their production level.
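The operating sub-model's bookkeeping can be pictured as summing net energy (NE) requirements and translating them into a desired dry-matter intake for a given dietary NE content. The function below is only a schematic of that accounting with placeholder coefficients; it does not reproduce the calibrated parameters or the homeorhetic regulating sub-model of the published framework.

```python
def desired_dmi(body_weight_kg, milk_fat_kg, milk_protein_kg, milk_lactose_kg,
                body_reserve_change_mj, gestation_mj, diet_ne_mj_per_kg_dm):
    """Desired dry-matter intake (kg DM/day) from summed NE requirements (placeholder coefficients)."""
    ne_maintenance = 0.29 * body_weight_kg ** 0.75                       # MJ/day
    ne_milk = 39.0 * milk_fat_kg + 24.0 * milk_protein_kg + 16.5 * milk_lactose_kg
    ne_reserve_gain = max(body_reserve_change_mj, 0.0)                   # gain is a cost...
    ne_from_mobilization = max(-body_reserve_change_mj, 0.0)             # ...mobilization supplies energy
    ne_required = ne_maintenance + ne_milk + gestation_mj + ne_reserve_gain
    return (ne_required - ne_from_mobilization) / diet_ne_mj_per_kg_dm

# Early-lactation example: body reserves are being mobilized (negative change).
print(round(desired_dmi(body_weight_kg=650, milk_fat_kg=1.4, milk_protein_kg=1.1,
                        milk_lactose_kg=1.6, body_reserve_change_mj=-20,
                        gestation_mj=0, diet_ne_mj_per_kg_dm=6.9), 1), "kg DM/day")
```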
Facing the Recession: How Did Safety-Net Hospitals Fare Financially Compared with Their Peers?
Reiter, Kristin L; Jiang, H Joanna; Wang, Jia
2014-01-01
Objective To examine the effect of the recession on the financial performance of safety-net versus non-safety-net hospitals. Data Sources/Study Setting Agency for Healthcare Research and Quality Hospital Cost and Utilization Project State Inpatient Databases, Medicare Cost Reports, American Hospital Association Annual Survey, InterStudy, and Area Health Resource File. Study Design Retrospective, longitudinal panel of hospitals, 2007–2011. Safety-net hospitals were identified using percentage of patients who were Medicaid or uninsured. Generalized estimating equations were used to estimate average effects of the recession on hospital operating and total margins, revenues and expenses in each year, 2008–2011, comparing safety-net with non-safety-net hospitals. Data Collection/Extraction Methods 1,453 urban, nonfederal, general acute hospitals in 32 states with complete data. Principal Findings Safety-net hospitals, as identified in 2007, had lower operating and total margins. The gap in operating margin between safety-net and non-safety-net hospitals was sustained throughout the recession; however, total margin was more negatively affected for non-safety-net hospitals in 2008. Higher percentages of Medicaid and uninsured patients were associated with lower revenue in private hospitals in all years, and lower revenue and expenses in public hospitals in 2011. Conclusions Safety-net hospitals may not be disproportionately vulnerable to macro-economic fluctuations, but their significantly lower margins leave less financial cushion to weather sustained financial pressure. PMID:25220012
Kweka, Eliningaya J; Lyaruu, Lucile J; Mahande, Aneth M
2017-01-18
Mosquitoes have developed resistance against pyrethroids, the only class of insecticides approved for use on long-lasting insecticidal nets (LLINs). The present study sought to evaluate the efficacy of the pyrethroid synergist PermaNet® 3.0 LLIN versus the pyrethroid-only PermaNet® 2.0 LLIN, in an East African hut design in Lower Moshi, northern Tanzania. In this setting, resistance to pyrethroid insecticides has been identified in Anopheles gambiae mosquitoes. Standard World Health Organization bioefficacy evaluations were conducted in both laboratory and experimental huts. Experimental hut evaluations were conducted in an area where there was presence of a population of highly pyrethroid-resistant An. arabiensis mosquitoes. All nets used were subjected to cone bioassays and then to experimental hut trials. Mosquito mortality, blood-feeding inhibition and personal protection rate were compared between untreated nets, unwashed LLINs and LLINs that were washed 20 times. Both washed and unwashed PermaNet® 2.0 and PermaNet® 3.0 LLINs had knockdown and mortality rates of 100% against a susceptible strain of An. gambiae sensu stricto. The adjusted mortality rate of the wild mosquito population after use of the unwashed PermaNet® 3.0 and PermaNet® 2.0 nets was found to be higher than after use of the washed PermaNet® 2.0 and PermaNet® 3.0 nets. Given the increasing incidence of pyrethroid resistance in An. gambiae mosquitoes in Tanzania, we recommend that consideration is given to its distribution in areas with pyrethroid-resistant malaria vectors within the framework of a national insecticide-resistance management plan.
Cantarello, Elena; Newton, Adrian C; Martin, Philip A; Evans, Paul M; Gosal, Arjan; Lucash, Melissa S
2017-11-01
Resilience is increasingly being considered as a new paradigm of forest management among scientists, practitioners, and policymakers. However, metrics of resilience to environmental change are lacking. Faced with novel disturbances, forests may be able to sustain existing ecosystem services and biodiversity by exhibiting resilience, or alternatively these attributes may undergo either a linear or nonlinear decline. Here we provide a novel quantitative approach for assessing forest resilience that focuses on three components of resilience, namely resistance, recovery, and net change, using a spatially explicit model of forest dynamics. Under the pulse set scenarios, we explored the resilience of nine ecosystem services and four biodiversity measures following a one-off disturbance applied to an increasing percentage of forest area. Under the pulse + press set scenarios, the six disturbance intensities explored during the pulse set were followed by a continuous disturbance. We detected thresholds in net change under pulse + press scenarios for the majority of the ecosystem services and biodiversity measures, which started to decline sharply when disturbance affected >40% of the landscape. Thresholds in net change were not observed under the pulse scenarios, with the exception of timber volume and ground flora species richness. Thresholds were most pronounced for aboveground biomass and timber volume with respect to the ecosystem services, and for ectomycorrhizal fungi and ground flora species richness with respect to the biodiversity measures. Synthesis and applications. The approach presented here illustrates how the multidimensionality of stability research in ecology can be addressed and how forest resilience can be estimated in practice. Managers should adopt specific management actions to support each of the three components of resilience separately, as these may respond differently to disturbance. In addition, management interventions aiming to deliver resilience should incorporate an assessment of both pulse and press disturbances to ensure detection of threshold responses to disturbance, so that appropriate management interventions can be identified.
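A simple way to make the three resilience components concrete is to compute them from a simulated time series. The definitions used below (resistance as the fraction retained at impact, recovery as the fraction of the loss regained, net change as the relative change over the whole run) are common illustrative choices and not necessarily those used in the cited study.

# Illustrative calculation of resistance, recovery, and net change from a
# simulated ecosystem-service trajectory; data and definitions are hypothetical.
def resilience_components(series, disturbance_index):
    pre = series[disturbance_index - 1]      # value just before disturbance
    post = series[disturbance_index]         # value just after disturbance
    final = series[-1]                       # value at end of simulation
    resistance = post / pre                  # fraction retained at impact
    recovery = (final - post) / (pre - post) if pre != post else 1.0
    net_change = (final - pre) / pre         # relative change over the run
    return resistance, recovery, net_change

# Hypothetical aboveground-biomass trajectory (arbitrary units), disturbed at t=3.
biomass = [100, 101, 102, 60, 70, 82, 90, 95]
r, rec, net = resilience_components(biomass, disturbance_index=3)
print(f"resistance={r:.2f}, recovery={rec:.2f}, net change={net:+.2%}")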
Addressing the NETS*S in K-12 Classrooms: Implications for Teacher Education
ERIC Educational Resources Information Center
Niederhauser, Dale S.; Lindstrom, Denise L.; Strobel, Johannes
2007-01-01
The National Educational Technology Standards for Students (NETS*S) were developed to provide guidelines for effective and meaningful technology use with K-12 students. In the present study we used the NETS*S as a framework to analyze ways that teachers integrated instructional technology use and provided opportunities for their students to…
Understanding the Knowledge Gap Experienced by U.S. Safety Net Patients in Teleretinal Screening.
George, Sheba M; Hayes, Erin Moran; Fish, Allison; Daskivich, Lauren Patty; Ogunyemi, Omolola I
2016-01-01
Safety-net patients' socioeconomic barriers interact with limited digital and health literacies to produce a "knowledge gap" that impacts the delivery of healthcare via telehealth technologies. Six focus groups (2 African-American and 4 Latino) were conducted with patients who received teleretinal screening in a U.S. urban safety-net setting. Focus groups were analyzed using a modified grounded theory methodology. Findings indicate that patients' knowledge gap is primarily produced at three points during the delivery of care: (1) exacerbation of patients' pre-existing personal barriers in the clinical setting; (2) encounters with technology during screening; and (3) lack of follow-up after the visit. This knowledge gap produces confusion, potentially limiting patients' perceptions of care and their ability to manage their own care. It may be ameliorated through delivery of patient education focused on both disease pathology and the specific role of telehealth technologies in disease management.
Innovation and Transformation in California’s Safety-net Healthcare Settings: An Inside Perspective
Lyles, Courtney R.; Aulakh, Veenu; Jameson, Wendy; Schillinger, Dean; Yee, Hal; Sarkar, Urmimala
2016-01-01
Background Health reform requires safety-net settings to transform care delivery, but how they will innovate in order to achieve this transformation is unknown. Methods We conducted two series of key informant interviews (N= 28) in 2012 with leadership from both California’s public hospital systems and community health centers. Interviews focused on how innovation was conceptualized and solicited examples of successful innovations. Results In contrast to disruptive innovation, interviewees often defined innovation as improving implementation, making incremental changes, and promoting integration. Many leaders gave examples of existing innovative practices such as patient-centered approaches to meeting their diverse patient needs. Participants expressed challenges to adapting quickly, but a desire to partner together. Conclusions Safety-net systems have already begun implementing innovative practices supporting their key priority areas. However, more support is needed, specifically to accelerate the change needed to succeed under health reform. PMID:24170938
A technique for global monitoring of net solar irradiance at the ocean surface. II - Validation
NASA Technical Reports Server (NTRS)
Chertock, Beth; Frouin, Robert; Gautier, Catherine
1992-01-01
The generation and validation of the first satellite-based long-term record of surface solar irradiance over the global oceans are addressed. The record is generated using Nimbus-7 earth radiation budget (ERB) wide-field-of-view planetary-albedo data as input to a numerical algorithm designed and implemented based on radiative transfer theory. The mean monthly values of net surface solar irradiance are computed on a 9-deg latitude-longitude spatial grid for November 1978-October 1985. The new data set is validated in comparisons with short-term, regional, high-resolution, satellite-based records. The ERB-based values of net surface solar irradiance are compared with corresponding values based on radiance measurements taken by the Visible-Infrared Spin Scan Radiometer aboard GOES series satellites. Errors in the new data set are estimated to lie between 10 and 20 W/sq m on monthly time scales.
Testing Electronic Algorithms to Create Disease Registries in a Safety Net System
Hanratty, Rebecca; Estacio, Raymond O.; Dickinson, L. Miriam; Chandramouli, Vijayalaxmi; Steiner, John F.; Havranek, Edward P.
2008-01-01
Electronic disease registries are a critical feature of the chronic disease management programs that are used to improve the care of individuals with chronic illnesses. These registries have been developed primarily in managed care settings; use in safety net institutions—organizations whose mission is to serve the uninsured and underserved—has not been described. We sought to assess the feasibility of developing disease registries from electronic data in a safety net institution, focusing on hypertension because of its importance in minority populations. We compared diagnoses obtained from algorithms utilizing electronic data, including laboratory and pharmacy records, against diagnoses derived from chart review. We found good concordance between diagnoses identified from electronic data and those identified by chart review, suggesting that registries of patients with chronic diseases can be developed outside the setting of closed panel managed care organizations. PMID:18469416
Greening the Net Generation: Outdoor Adult Learning in the Digital Age
ERIC Educational Resources Information Center
Walter, Pierre
2013-01-01
Adult learning today takes place primarily within walled classrooms or in other indoor settings, and often in front of various types of digital screens. As adults have adopted the digital technologies and indoor lifestyle attributed to the so-called "Net Generation," we have become detached from contact with the natural world outdoors.…
1968-01-01
...which forms a conducting medium between the electrodes of a dry cell, storage cell, or electrolytic capacitor. ELECTROMAGNETIC FIELD - A magnetic... (1) Dry cell batteries. (2) Vehicular batteries. (3) Hand generators. (4) Gas engine generators. (5) Wet cell batteries. 2-5. NETTING TWO RADIO SETS: To net... 1600 meters. Power output ... 5 watts. Power source ... dry cell battery flA-270/U. Battery life ...
Barley 4H QTL confers NFNB resistance to a global set of P. teres f. teres isolates
USDA-ARS?s Scientific Manuscript database
Net form net blotch (NFNB), caused by Pyrenophora teres f. teres Drechs., is prevalent in barley-growing regions worldwide. A population of 132 recombinant inbred lines (RILs) developed from a cross of the barley varieties 'Falcon' and 'Azhul' were used to evaluate resistance to NFNB due to their di...
Fleming, Mark D; Shim, Janet K; Yen, Irene H; Thompson-Lastad, Ariana; Rubin, Sara; Van Natta, Meredith; Burke, Nancy J
2017-06-01
Increasing "patient engagement" has become a priority for health care organizations and policy-makers seeking to reduce cost and improve the quality of care. While concepts of patient engagement have proliferated rapidly across health care settings, little is known about how health care providers make use of these concepts in clinical practice. This paper uses 20 months of ethnographic and interview research carried out from 2015 to 2016 to explore how health care providers working at two public, urban, safety-net hospitals in the United States define, discuss, and assess patient engagement. We investigate how health care providers describe engagement for high cost patients-the "super-utilizers" of the health care system-who often face complex challenges related to socioeconomic marginalization including poverty, housing insecurity, exposure to violence and trauma, cognitive and mental health issues, and substance use. The health care providers in our study faced institutional pressure to assess patient engagement and to direct care towards engaged patients. However, providers considered such assessments to be highly challenging and oftentimes inaccurate, particularly because they understood low patient engagement to be the result of difficult socioeconomic conditions. Providers tried to navigate the demand to assess patient engagement in care by looking for explicit positive and negative indicators of engagement, while also being sensitive to more subtle and intuitive signs of engagement for marginalized patients. Copyright © 2017 Elsevier Ltd. All rights reserved.
Overview of methods in economic analyses of behavioral interventions to promote oral health
O’Connell, Joan M.; Griffin, Susan
2016-01-01
Background Broad adoption of interventions that prove effective in randomized clinical trials or comparative effectiveness research may depend to a great extent on their costs and cost-effectiveness (CE). Many studies of behavioral health interventions for oral health promotion and disease prevention lack robust economic assessments of costs and CE. Objective To describe methodologies employed to assess intervention costs, potential savings, net costs, CE, and the financial sustainability of behavioral health interventions to promote oral health. Methods We provide an overview of terminology and strategies for conducting economic evaluations of behavioral interventions to improve oral health based on the recommendations of the Panel of Cost-Effectiveness in Health and Medicine. To illustrate these approaches, we summarize methodologies and findings from a limited number of published studies. The strategies include methods for assessing intervention costs, potential savings, net costs, CE, and financial sustainability from various perspectives (e.g., health-care provider, health system, health payer, employer, society). Statistical methods for estimating short-term and long-term economic outcomes and for examining the sensitivity of economic outcomes to cost parameters are described. Discussion Through the use of established protocols for evaluating costs and savings, it is possible to assess and compare intervention costs, net costs, CE, and financial sustainability. The addition of economic outcomes to outcomes reflecting effectiveness, appropriateness, acceptability, and organizational sustainability strengthens evaluations of oral health interventions and increases the potential that those found to be successful in research settings will be disseminated more broadly. PMID:21656966
An investigation of interference coordination in heterogeneous network for LTE-Advanced systems
NASA Astrophysics Data System (ADS)
Hasan, M. K.; Ismail, A. F.; H, Aisha-Hassan A.; Abdullah, Khaizuran; Ramli, H. A. M.
2013-12-01
The novel "femtocell" in Heterogeneous Network (HetNet) for LTE-Advanced (LTE-A) set-up will allow Malaysian wireless telecommunication operators (Maxis, Celcom, Digi, U-Mobile, P1, YTL and etc2.) to extend connectivity coverage where access would otherwise be limited or unavailable, particularly indoors of large building complexes. A femtocell is a small-sized cellular base station that encompasses all the functionality of a typical station. It therefore allows a simpler and self-contained deployment including private residences. For the Malaysian service providers, the main attractions of femtocell usage are the improvements to both coverage and capacity. The operators can provide a better service to end-users in turn reduce much of the agitations and complaints. There will be opportunity for new services at reduced cost. In addition, the operator not only benefits from the improved capacity and coverage but also can reduce both capital expenditure and operating expense i.e. alternative to brand new base station or macrocell installation. Interference is a key issue associated with femtocell development. There are a large number of issues associated with interference all of which need to be investigated, identified, quantified and solved. This is to ensure that the deployment of any femtocells will take place successfully. Among the most critical challenges in femtocell deployment is the interference between femtocell-to-macrocell and femtocell-to-femtocell in HetNets. In this paper, all proposed methods and algorithms will be investigated in the OFDMA femtocell system considering HetNet scenarios for LTE-A.
Electric power monthly, June 1988
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1988-09-15
Total net generation by electric utilities in the United States for the month of June 1988 was 232,183 gigawatthours, 3 percent higher than the amount reported a year ago. Although temperatures (measured by cooling degree days) for June 1988 were 9 percent warmer than normal, they were 3 percent cooler than for June 1987. A large portion of that higher demand for electricity was met by nuclear-powered generation. Net generation from nuclear power during June 1988 (44,079 gigawatthours) was only 1 percent below the record set in January of this year, and 21 percent above that reported in June 1987 (36,560 gigawatthours). The only energy source other than nuclear that reported higher levels of net generation during June 1988 was coal, up 2 percent over the same period last year. Warmer-than-normal temperatures did, however, have an effect on various parts of the country. For example, on Wednesday, June 22, 1988, unseasonably high temperatures forced the Pennsylvania, New Jersey, and Maryland Interconnection (PJM) into a system-wide 5-percent voltage reduction for 2 hours. Contributing to that reduction in voltage was the shutdown of the Three Mile Island, Unit 1, for refueling and the closing of the Peach Bottom Units 2 and 3 by the Nuclear Regulatory Commission. Three Mile Island, Unit 1, normally provides the PJM system with about 800 megawatts while the two Peach Bottom units, combined, provide approximately 2100 megawatts. 10 refs., 1 fig., 27 tabs.
NASA Astrophysics Data System (ADS)
Kharkar, Prashant S.; Reith, Maarten E. A.; Dutta, Aloke K.
2008-01-01
Three-dimensional quantitative structure-activity relationship (3D QSAR) using comparative molecular field analysis (CoMFA) was performed on a series of substituted tetrahydropyran (THP) derivatives possessing serotonin (SERT) and norepinephrine (NET) transporter inhibitory activities. The study aimed to rationalize the potency of these inhibitors for SERT and NET as well as the observed selectivity differences for NET over SERT. The dataset consisted of 29 molecules, of which 23 molecules were used as the training set for deriving CoMFA models for SERT and NET uptake inhibitory activities. Superimpositions were performed using atom-based fitting and 3-point pharmacophore-based alignment. Two charge calculation methods, Gasteiger-Hückel and semiempirical PM3, were tried. Both alignment methods were analyzed in terms of their predictive abilities and produced comparable results with high internal and external predictivities. The models obtained using the 3-point pharmacophore-based alignment outperformed the models with atom-based fitting in terms of relevant statistics and interpretability of the generated contour maps. Steric fields dominated electrostatic fields in terms of contribution. The selectivity analysis (NET over SERT), though it yielded models with good internal predictivity, showed very poor external test set predictions. The analysis was repeated with 24 molecules after systematically excluding so-called outliers (5 out of 29) from the model derivation process. The resulting CoMFA model using the atom-based fitting exhibited good statistics and was able to explain most of the selectivity (NET over SERT)-discriminating factors. The presence of an -OH substituent on the THP ring was found to be one of the most important factors governing the NET selectivity over SERT. Thus, a 4-point NET-selective pharmacophore, after introducing this newly found H-bond donor/acceptor feature in addition to the initial 3-point pharmacophore, was proposed.
Residual Deep Convolutional Neural Network Predicts MGMT Methylation Status.
Korfiatis, Panagiotis; Kline, Timothy L; Lachance, Daniel H; Parney, Ian F; Buckner, Jan C; Erickson, Bradley J
2017-10-01
Predicting the methylation status of the O6-methylguanine methyltransferase (MGMT) gene utilizing MRI is of high importance since it is a predictor of response and prognosis in brain tumors. In this study, we compare three different residual deep neural network (ResNet) architectures to evaluate their ability in predicting MGMT methylation status without the need for a distinct tumor segmentation step. We found that the ResNet50 (50 layers) architecture was the best performing model, achieving an accuracy of 94.90% (+/- 3.92%) for the test set (classification of a slice as no tumor, methylated MGMT, or non-methylated). ResNet34 (34 layers) achieved 80.72% (+/- 13.61%) while ResNet18 (18 layers) accuracy was 76.75% (+/- 20.67%). ResNet50 performance was statistically significantly better than both ResNet18 and ResNet34 architectures (p < 0.001). We report a method that alleviates the need for extensive preprocessing and acts as a proof of concept that deep neural architectures can be used to predict molecular biomarkers from routine medical images.
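A minimal sketch of the kind of classifier described above is shown below: a torchvision ResNet-50 with a three-class head (no tumor, methylated MGMT, non-methylated MGMT) and a single-channel input layer for MR slices. The input size, channel handling and training pipeline are assumptions for illustration, not the authors' implementation.

# Sketch of a 3-class ResNet-50 slice classifier; hypothetical setup, untrained.
import torch
import torch.nn as nn
from torchvision.models import resnet50

def build_classifier(n_classes=3):
    # Plain ResNet-50 backbone; resnet34/resnet18 could be swapped in to compare depths.
    model = resnet50()
    # Accept 1-channel MR slices instead of 3-channel RGB images.
    model.conv1 = nn.Conv2d(1, 64, kernel_size=7, stride=2, padding=3, bias=False)
    # Three output classes: no tumor, methylated MGMT, non-methylated MGMT.
    model.fc = nn.Linear(model.fc.in_features, n_classes)
    return model

model = build_classifier()
logits = model(torch.randn(2, 1, 224, 224))  # batch of two dummy slices
print(logits.shape)                          # torch.Size([2, 3])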
Finding the Optimal Nets for Self-Folding Kirigami
NASA Astrophysics Data System (ADS)
Araújo, N. A. M.; da Costa, R. A.; Dorogovtsev, S. N.; Mendes, J. F. F.
2018-05-01
Three-dimensional shells can be synthesized from the spontaneous self-folding of two-dimensional templates of interconnected panels, called nets. However, some nets are more likely to self-fold into the desired shell under random movements. The optimal nets are the ones that maximize the number of vertex connections, i.e., vertices that have only two of their faces cut away from each other in the net. Previous methods for finding such nets are based on random search, and thus, they do not guarantee the optimal solution. Here, we propose a deterministic procedure. We map the connectivity of the shell into a shell graph, where the nodes and links of the graph represent the vertices and edges of the shell, respectively. Identifying the nets that maximize the number of vertex connections corresponds to finding the set of maximum leaf spanning trees of the shell graph. This method allows us not only to design the self-assembly of much larger shell structures but also to apply additional design criteria, as a complete catalog of the maximum leaf spanning trees is obtained.
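To make the objective concrete, the brute-force sketch below enumerates all spanning trees of a small shell graph (the cube) and keeps those with the most leaves. The published approach is a deterministic maximum-leaf-spanning-tree procedure; exhaustive enumeration is shown here only because it is feasible for tiny shells.

# Brute-force illustration of the maximum-leaf-spanning-tree objective on a cube.
from itertools import combinations
import networkx as nx

shell_graph = nx.hypercube_graph(3)            # cube: 8 vertices, 12 edges
n = shell_graph.number_of_nodes()

best_leaves, best_trees = 0, []
for edges in combinations(shell_graph.edges(), n - 1):
    tree = nx.Graph(list(edges))
    # A spanning tree touches every vertex and is connected and acyclic.
    if tree.number_of_nodes() == n and nx.is_tree(tree):
        leaves = sum(1 for _, deg in tree.degree() if deg == 1)
        if leaves > best_leaves:
            best_leaves, best_trees = leaves, [edges]
        elif leaves == best_leaves:
            best_trees.append(edges)

print(f"maximum leaves: {best_leaves}, optimal spanning trees found: {len(best_trees)}")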
NASA Technical Reports Server (NTRS)
Farhat, Nabil H.
1987-01-01
Self-organization and learning is a distinctive feature of neural nets and processors that sets them apart from conventional approaches to signal processing. It leads to self-programmability which alleviates the problem of programming complexity in artificial neural nets. In this paper architectures for partitioning an optoelectronic analog of a neural net into distinct layers with prescribed interconnectivity pattern to enable stochastic learning by simulated annealing in the context of a Boltzmann machine are presented. Stochastic learning is of interest because of its relevance to the role of noise in biological neural nets. Practical considerations and methodologies for appreciably accelerating stochastic learning in such a multilayered net are described. These include the use of parallel optical computing of the global energy of the net, the use of fast nonvolatile programmable spatial light modulators to realize fast plasticity, optical generation of random number arrays, and an adaptive noisy thresholding scheme that also makes stochastic learning more biologically plausible. The findings reported predict optoelectronic chips that can be used in the realization of optical learning machines.
Capturing birds with mist nets: A review
Keyes, B.E.; Grue, C.E.
1982-01-01
Herein we have tried to provide a comprehensive review of mist-netting techniques suitable for both novice and experienced netters. General mist-netting procedures and modifications developed by netters for particular bird species and habitats are included. Factors which influence capture success, including site selection, net specifications and placement, weather, and time of day, are discussed. Guidelines are presented for the care of netted birds and the use of mist-net data in the study of bird communities. The advantages of the use of mist nets over other methods of capturing birds are also discussed.
47 CFR 69.610 - Other hypothetical net balances.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 47 Telecommunication 3 2011-10-01 2011-10-01 false Other hypothetical net balances. 69.610 Section... (CONTINUED) ACCESS CHARGES Exchange Carrier Association § 69.610 Other hypothetical net balances. (a) The hypothetical net balance for an access element other than a Common Line element shall be computed as provided...
47 CFR 69.610 - Other hypothetical net balances.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 47 Telecommunication 3 2010-10-01 2010-10-01 false Other hypothetical net balances. 69.610 Section... (CONTINUED) ACCESS CHARGES Exchange Carrier Association § 69.610 Other hypothetical net balances. (a) The hypothetical net balance for an access element other than a Common Line element shall be computed as provided...
Campylobacter Database Update: USDA VetNet (2004-2009)
USDA-ARS?s Scientific Manuscript database
In March 2004, USDA VetNet was launched with the intention of providing Pulsed-Field Gel Electrophoresis (PFGE) on Salmonella isolates originating from animals as a complement to the CDC PulseNet program. The objectives of USDA VetNet are to determine PFGE patterns of Salmonella and Campylobacter i...
7 CFR 1221.17 - Net market value.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 10 2010-01-01 2010-01-01 false Net market value. 1221.17 Section 1221.17 Agriculture... INFORMATION ORDER Sorghum Promotion, Research, and Information Order Definitions § 1221.17 Net market value. Net market value means: (a) Except as provided in paragraph (b)and (c) of this section, the value...
ShakeNet: a portable wireless sensor network for instrumenting large civil structures
Kohler, Monica D.; Hao, Shuai; Mishra, Nilesh; Govindan, Ramesh; Nigbor, Robert
2015-08-03
We report our findings from a U.S. Geological Survey (USGS) National Earthquake Hazards Reduction Program-funded project to develop and test a wireless, portable, strong-motion network of up to 40 triaxial accelerometers for structural health monitoring. The overall goal of the project was to record ambient vibrations for several days from USGS-instrumented structures. Structural health monitoring has important applications in fields like civil engineering and the study of earthquakes. The emergence of wireless sensor networks provides a promising means to such applications. However, while most wireless sensor networks are still in the experimentation stage, very few take into consideration the realistic earthquake engineering application requirements. To collect comprehensive data for structural health monitoring for civil engineers, high-resolution vibration sensors and sufficient sampling rates should be adopted, which makes it challenging for current wireless sensor network technology in the following ways: processing capabilities, storage limit, and communication bandwidth. The wireless sensor network has to meet expectations set by wired sensor devices prevalent in the structural health monitoring community. For this project, we built and tested an application-realistic, commercially based, portable, wireless sensor network called ShakeNet for instrumentation of large civil structures, especially for buildings, bridges, or dams after earthquakes. Two to three people can deploy ShakeNet sensors within hours after an earthquake to measure the structural response of the building or bridge during aftershocks. ShakeNet involved the development of a new sensing platform (ShakeBox) running a software suite for networking, data collection, and monitoring. Deployments reported here on a tall building and a large dam were real-world tests of ShakeNet operation, and helped to refine both hardware and software.
Costs and consequences of large-scale vector control for malaria
Yukich, Joshua O; Lengeler, Christian; Tediosi, Fabrizio; Brown, Nick; Mulligan, Jo-Ann; Chavasse, Des; Stevens, Warren; Justino, John; Conteh, Lesong; Maharaj, Rajendra; Erskine, Marcy; Mueller, Dirk H; Wiseman, Virginia; Ghebremeskel, Tewolde; Zerom, Mehari; Goodman, Catherine; McGuire, David; Urrutia, Juan Manuel; Sakho, Fana; Hanson, Kara; Sharp, Brian
2008-01-01
Background Five large insecticide-treated net (ITN) programmes and two indoor residual spraying (IRS) programmes were compared using a standardized costing methodology. Methods Costs were measured locally or derived from existing studies and focused on the provider perspective, but included the direct costs of net purchases by users, and are reported in 2005 USD. Effectiveness was estimated by combining programme outputs with standard impact indicators. Findings Conventional ITNs: The cost per treated net-year of protection ranged from USD 1.21 in Eritrea to USD 6.05 in Senegal. The cost per child death averted ranged from USD 438 to USD 2,199 when targeting to children was successful. Long-lasting insecticidal nets (LLIN) of five years duration: The cost per treated-net year of protection ranged from USD 1.38 in Eritrea to USD 1.90 in Togo. The cost per child death averted ranged from USD 502 to USD 692. IRS: The costs per person-year of protection for all ages were USD 3.27 in KwaZulu Natal and USD 3.90 in Mozambique. If only children under five years of age were included in the denominator the cost per person-year of protection was higher: USD 23.96 and USD 21.63. As a result, the cost per child death averted was higher than for ITNs: USD 3,933–4,357. Conclusion Both ITNs and IRS are highly cost-effective vector control strategies. Integrated ITN free distribution campaigns appeared to be the most efficient way to rapidly increase ITN coverage. Other approaches were as or more cost-effective, and appeared better suited to "keep-up" coverage levels. ITNs are more cost-effective than IRS for highly endemic settings, especially if high ITN coverage can be achieved with some demographic targeting. PMID:19091114
Roth, Robert Paul; Hahn, David C.; Scaringe, Robert P.
2015-12-08
A device and method are provided to improve performance of a vapor compression system using a retrofittable control board to start up the vapor compression system with the evaporator blower initially set to a high speed. A baseline evaporator operating temperature with the evaporator blower operating at the high speed is recorded, and then the device detects if a predetermined acceptable change in evaporator temperature has occurred. The evaporator blower speed is reduced from the initially set high speed as long as there is only a negligible change in the measured evaporator temperature and therefore a negligible difference in the compressor's power consumption so as to obtain a net increase in the Coefficient of Performance.
Sun, Duanchen; Liu, Yinliang; Zhang, Xiang-Sun; Wu, Ling-Yun
2017-09-21
High-throughput experimental techniques have been dramatically improved and widely applied in the past decades. However, biological interpretation of the high-throughput experimental results, such as differential expression gene sets derived from microarray or RNA-seq experiments, is still a challenging task. Gene Ontology (GO) is commonly used in functional enrichment studies. The GO terms identified via current functional enrichment analysis tools often contain direct parent or descendant terms in the GO hierarchical structure. Highly redundant terms make it difficult for users to analyze the underlying biological processes. In this paper, a novel network-based probabilistic generative model, NetGen, was proposed to perform the functional enrichment analysis. An additional protein-protein interaction (PPI) network was explicitly used to assist the identification of significantly enriched GO terms. NetGen achieved superior performance to the existing methods in the simulation studies. The effectiveness of NetGen was explored further on four real datasets. Notably, several GO terms which were not directly linked with the active gene list for each disease were identified. These terms were closely related to the corresponding diseases when checked against the curated literature. NetGen has been implemented in the R package CopTea, publicly available at GitHub (http://github.com/wulingyun/CopTea/). Our procedure leads to a more reasonable and interpretable result of the functional enrichment analysis. As a novel term combination-based functional enrichment analysis method, NetGen is complementary to current individual term-based methods, and can help to explore the underlying pathogenesis of complex diseases.
[The Competence Network for HIV/AIDS. Data, Samples, Facts].
Michalik, Claudia; Skaletz-Rorowski, Adriane; Brockmeyer, Norbert H
2016-04-01
With funding for the Competence Networks in Medicine from the Federal Ministry of Education and Research, the Competence Network for HIV/AIDS (KompNet HIV/AIDS) was established as an interdisciplinary research association. Essential working groups active in clinical and basic HIV/AIDS research were incorporated from all over Germany. After successful establishment, providing research infrastructure for national and international cooperation in the field of HIV/AIDS was the focus of the network. By bringing together research activities, preconditions are created for improving HIV infection treatment and increasing the life expectancy of HIV-infected patients. The members of KompNet HIV/AIDS are HIV experts from university clinics, HIV physicians, patient representatives, as well as national reference centers. As a scientific research basis, the network established an HIV patient cohort. Clinical and sociodemographic data of HIV patients were documented biannually and complemented by serum and DNA samples collected twice per year. Furthermore, a child cohort was set up. Within the KompNet HIV/AIDS, a research infrastructure for HIV was established for internal, external, as well as international scientists. Within the HIV cohort a total of 16,500 patients are documented. The associated biobank comprises ~56,000 serum samples and ~16,000 DNA samples. The child cohort consists of 647 HIV-exposed and 230 infected children. The KompNet HIV/AIDS cohorts became an important partner in several international collaborations. Nevertheless, the maintenance of such infrastructures without public funding is a challenge.
McDonough, Randal P; Harthan, Aaron A; McLeese, Kelly E; Doucette, William R
2010-01-01
To determine the net financial gain or loss for medication therapy management (MTM) services provided to patients by an independent community pharmacy during 16 months of operation. Retrospective study. Independent community pharmacy in Iowa City, IA, from September 1, 2006, to December 31, 2007. Patients receiving MTM services during the specified period who had proper documentation of reimbursement for the services. MTM services were provided to the patient and documented by the pharmacist or student pharmacist. Net financial gains or losses for providing MTM services. Sensitivity analyses included costs that might be incurred under various conditions of operation. 103 initial and 88 follow-up MTM visits were conducted during a 16-month time period. The total cost for these services to the pharmacy was $11,191.72. Total revenue from these services was $11,195.00; therefore, the pharmacy experienced a net financial gain of $3.28. Sensitivity analyses were conducted, revealing the net gain/loss to the pharmacy if a student pharmacist was used and the net gain/loss if the pharmacist needed extra training to provide the services. Using a student pharmacist resulted in a net gain of $6,308.48, while extra training for the pharmacist resulted in a net loss of $1,602.72. The MTM service programs showed a positive financial gain after 16 months of operation, which should encourage pharmacists to incorporate these services into their practice.
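The headline figures follow from simple revenue-minus-cost arithmetic; the sketch below reproduces the reported base-case net gain and back-calculates the costs implied by the two sensitivity results, using only the numbers stated in the abstract.

# Reproducing the reported arithmetic: net gain = total revenue - total cost.
# Sensitivity-scenario costs are back-calculated from the reported gains/losses
# purely for illustration; they are not stated directly in the abstract.
revenue = 11_195.00
base_cost = 11_191.72
print(f"base-case net gain: ${revenue - base_cost:,.2f}")   # $3.28

for label, net in [("student pharmacist delivers services", 6_308.48),
                   ("pharmacist requires extra training", -1_602.72)]:
    implied_cost = revenue - net
    print(f"{label}: net {'gain' if net >= 0 else 'loss'} ${abs(net):,.2f} "
          f"(implied cost ${implied_cost:,.2f})")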
Brown, Stephen; Hutton, Brian; Clifford, Tammy; Coyle, Doug; Grima, Daniel; Wells, George; Cameron, Chris
2014-09-29
The use of network meta-analysis has increased dramatically in recent years. WinBUGS, a freely available Bayesian software package, has been the most widely used software package to conduct network meta-analyses. However, the learning curve for WinBUGS can be daunting, especially for new users. Furthermore, critical appraisal of network meta-analyses conducted in WinBUGS can be challenging given its limited data manipulation capabilities and the fact that generation of graphical output from network meta-analyses often relies on different software packages than the analyses themselves. We developed a freely available Microsoft-Excel-based tool called NetMetaXL, programmed in Visual Basic for Applications, which provides an interface for conducting a Bayesian network meta-analysis using WinBUGS from within Microsoft Excel. This tool allows the user to easily prepare and enter data, set model assumptions, and run the network meta-analysis, with results being automatically displayed in an Excel spreadsheet. It also contains macros that use NetMetaXL's interface to generate evidence network diagrams, forest plots, league tables of pairwise comparisons, probability plots (rankograms), and inconsistency plots within Microsoft Excel. All figures generated are publication quality, thereby increasing the efficiency of knowledge transfer and manuscript preparation. We demonstrate the application of NetMetaXL using data from a network meta-analysis published previously which compares combined resynchronization and implantable defibrillator therapy in left ventricular dysfunction. We replicate results from the previous publication while demonstrating result summaries generated by the software. Use of the freely available NetMetaXL successfully demonstrated its ability to make running network meta-analyses more accessible to novice WinBUGS users by allowing analyses to be conducted entirely within Microsoft Excel. NetMetaXL also allows for more efficient and transparent critical appraisal of network meta-analyses, enhanced standardization of reporting, and integration with health economic evaluations which are frequently Excel-based.
Safety-net providers in some US communities have increasingly embraced coordinated care models.
Cunningham, Peter; Felland, Laurie; Stark, Lucy
2012-08-01
Safety-net organizations, which provide health services to uninsured and low-income people, increasingly are looking for ways to coordinate services among providers to improve access to and quality of care and to reduce costs. In this analysis, a part of the Community Tracking Study, we examined trends in safety-net coordination activities from 2000 to 2010 within twelve communities in the United States and found a notable increase in such activities. Six of the twelve communities had made formal efforts to link uninsured people to medical homes and coordinate care with specialists in 2010, compared to only two communities in 2000. We also identified key attributes of safety-net coordinated care systems, such as reliance on a medical home for meeting patients' primary care needs, and lingering challenges to safety-net integration, such as competition among hospitals and community health centers for Medicaid patients.
BioNetFit: a fitting tool compatible with BioNetGen, NFsim and distributed computing environments
Thomas, Brandon R.; Chylek, Lily A.; Colvin, Joshua; ...
2015-11-09
Rule-based models are analyzed with specialized simulators, such as those provided by the BioNetGen and NFsim open-source software packages. Here we present BioNetFit, a general-purpose fitting tool that is compatible with BioNetGen and NFsim. BioNetFit is designed to take advantage of distributed computing resources. This feature facilitates fitting (i.e., optimization of parameter values for consistency with data) when simulations are computationally expensive.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-26
Canola oil biodiesel (2005 diesel fuel baseline); net domestic agriculture (w/o land use change)... approach. 2. Models Used. The proposed analysis EPA has prepared for canola oil biodiesel uses the same set... gallons) covers the range of production likely by 2022. We believe that this modeled change in canola oil...
ERIC Educational Resources Information Center
St. John, Edward P.; Starkey, Johnny B.
1995-01-01
This study reviews higher education assumptions of traditional net-price theory and an emerging approach considering a set of prices and subsidies in enrollment and persistence decisions. Results suggest that within-year persistence decisions made by students from all income groups are more sensitive to tuition charges than to student aid.…
Dale, Ann Marie; Ekenga, Christine C; Buckner-Petty, Skye; Merlino, Linda; Thiese, Matthew S; Bao, Stephen; Meyers, Alysha Rose; Harris-Adamson, Carisa; Kapellusch, Jay; Eisen, Ellen A; Gerr, Fred; Hegmann, Kurt T; Silverstein, Barbara; Garg, Arun; Rempel, David; Zeringue, Angelique; Evanoff, Bradley A
2018-03-29
There is growing use of a job exposure matrix (JEM) to provide exposure estimates in studies of work-related musculoskeletal disorders; few studies have examined the validity of such estimates or compared associations obtained with a JEM with those obtained using other exposure measures. This study estimated upper extremity exposures using a JEM derived from a publicly available data set (Occupational Information Network, O*NET), and compared exposure-disease associations for incident carpal tunnel syndrome (CTS) with those obtained using observed physical exposure measures in a large prospective study. 2393 workers from several industries were followed for up to 2.8 years (5.5 person-years). Standard Occupational Classification (SOC) codes were assigned to the job at enrolment. SOC codes linked to physical exposures for forceful hand exertion and repetitive activities were extracted from O*NET. We used multivariable Cox proportional hazards regression models to describe exposure-disease associations for incident CTS for individually observed physical exposures and JEM exposures from O*NET. Both exposure methods found associations between incident CTS and exposures of force and repetition, with evidence of dose-response. Observed associations were similar across the two methods, with somewhat wider CIs for HRs calculated using the JEM method. Exposures estimated using a JEM provided similar exposure-disease associations for CTS when compared with associations obtained using the 'gold standard' method of individual observation. While JEMs have a number of limitations, in some studies they can provide useful exposure estimates in the absence of individual-level observed exposures. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
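A minimal sketch of the survival analysis described above is given below using the lifelines package: a Cox proportional hazards fit of incident CTS against JEM-style force and repetition scores. The data frame, column names and values are hypothetical, not the study's data or its full covariate set.

# Cox proportional-hazards sketch for an incident-outcome analysis; data are made up.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "followup_years": [2.8, 1.5, 2.1, 0.9, 2.8, 2.4, 1.1, 2.0, 2.6, 1.7],
    "incident_cts":   [0,   1,   0,   1,   0,   1,   0,   0,   1,   0],
    "jem_force":      [2.1, 4.3, 1.8, 2.7, 4.5, 3.8, 1.2, 2.9, 3.3, 3.6],
    "jem_repetition": [3.0, 4.1, 2.2, 3.5, 2.8, 3.9, 1.9, 4.2, 3.4, 2.6],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="incident_cts")
cph.print_summary()  # hazard ratios (exp(coef)) for the two exposure covariates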
Gong, Kuang; Yang, Jaewon; Kim, Kyungsang; El Fakhri, Georges; Seo, Youngho; Li, Quanzheng
2018-05-23
Positron Emission Tomography (PET) is a functional imaging modality widely used in neuroscience studies. To obtain meaningful quantitative results from PET images, attenuation correction is necessary during image reconstruction. For PET/MR hybrid systems, PET attenuation correction is challenging because Magnetic Resonance (MR) images do not reflect attenuation coefficients directly. To address this issue, we present deep neural network methods to derive the continuous attenuation coefficients for brain PET imaging from MR images. With only Dixon MR images as the network input, the existing U-net structure was adopted and analysis using forty patient data sets shows it is superior to other Dixon-based methods. When both Dixon and zero echo time (ZTE) images are available, we have proposed a modified U-net structure, named GroupU-net, to efficiently make use of both Dixon and ZTE information through group convolution modules when the network goes deeper. Quantitative analysis based on fourteen real patient data sets demonstrates that both network approaches can perform better than the standard methods, and the proposed network structure can further reduce the PET quantification error compared to the U-net structure. © 2018 Institute of Physics and Engineering in Medicine.
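The group-convolution idea behind the proposed GroupU-net can be illustrated with a single PyTorch block: with groups=2, the Dixon-derived and ZTE-derived channel halves are filtered by separate kernel sets inside one layer. The channel counts and block composition below are assumptions for illustration, not the paper's architecture.

# Group-convolution encoder block keeping two modality streams separate in one layer.
import torch
import torch.nn as nn

class GroupedEncoderBlock(nn.Module):
    def __init__(self, in_ch=32, out_ch=64, groups=2):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1, groups=groups),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.block(x)

# 16 Dixon-derived channels concatenated with 16 ZTE-derived channels.
dixon_feats = torch.randn(1, 16, 128, 128)
zte_feats = torch.randn(1, 16, 128, 128)
block = GroupedEncoderBlock()
out = block(torch.cat([dixon_feats, zte_feats], dim=1))
print(out.shape)   # torch.Size([1, 64, 128, 128])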
Low-Value Medical Services in the Safety-Net Population
Linder, Jeffrey A.; Clark, Cheryl R.; Sommers, Benjamin D.
2017-01-01
Importance National patterns of low-value and high-value care delivered to patients without insurance or with Medicaid could inform public policy but have not been previously examined. Objective To measure rates of low-value care and high-value care received by patients without insurance or with Medicaid, compared with privately insured patients, and provided by safety-net physicians vs non–safety-net physicians. Design, Setting, and Participants This multiyear cross-sectional observational study included all patients ages 18 to 64 years from the National Ambulatory Medical Care Survey (2005-2013) and the National Hospital Ambulatory Medical Care Survey (2005-2011) eligible for any of the 21 previously defined low-value or high-value care measures. All measures were analyzed with multivariable logistic regression and adjusted for patient and physician characteristics. Exposures Comparison of patients by insurance status (uninsured/Medicaid vs privately insured) and safety-net physicians (seeing >25% uninsured/Medicaid patients) vs non–safety-net physicians (seeing 1%-10%). Main Outcomes and Measures Delivery of 9 low-value or 12 high-value care measures, based on previous research definitions, and composite measures for any high-value or low-value care delivery during an office visit. Results Overall, 193 062 office visits were eligible for at least 1 measure. Mean (95% CI) age for privately insured patients (n = 94 707) was 44.7 (44.5-44.9) years; patients on Medicaid (n = 45 123), 39.8 (39.3-40.3) years; and uninsured patients (n = 19 530), 41.9 (41.5-42.4) years. Overall, low-value and high-value care was delivered in 19.4% (95% CI, 18.5%-20.2%) and 33.4% (95% CI, 32.4%-34.3%) of eligible encounters, respectively. Rates of low-value and high-value care delivery were similar across insurance types for the majority of services examined. Among Medicaid patients, adjusted rates of use were no different for 6 of 9 low-value and 9 of 12 high-value services compared with privately insured beneficiaries, whereas among the uninsured, rates were no different for 7 of 9 low-value and 9 of 12 high-value services. Safety-net physicians provided similar care compared with non–safety-net physicians, with no difference for 8 out of 9 low-value and for all 12 high-value services. Conclusions and Relevance Overuse of low-value care is common among patients without insurance or with Medicaid. Rates of low-value and high-value care were similar among physicians serving vulnerable patients and other physicians. Overuse of low-value care is a potentially important focus for state Medicaid programs and safety-net institutions to pursue cost savings and improved quality of health care delivery. PMID:28395014
NASA Technical Reports Server (NTRS)
Marshall, J.; Sauke, T.
1999-01-01
Electrostatic forces strongly influence the behavior of granular materials in both dispersed (cloud) systems and semi-packed systems. These forces can cause aggregation or dispersion of particles and are important in a variety of astrophysical and planetary settings. There are also many industrial and commercial settings where granular matter and electrostatics become partners for both good and bad. This partnership is important for human exploration on Mars where dust adheres to suits, machines, and habitats. Long-range Coulombic (electrostatic) forces, as opposed to contact-induced dipoles and van der Waals attractions, are generally regarded as resulting from net charge. We have proposed that in addition to net charge interactions, randomly distributed charge carriers on grains will result in a dipole moment regardless of any net charge. If grains are unconfined, or fluidized, they will rotate so that the dipole always induces attraction between grains. Aggregates are readily formed, and Coulombic polarity resulting from the dipole produces end-to-end stacking of grains to form filamentary aggregates. This has been demonstrated in USML experiments on Space Shuttle where microgravity facilitated the unmasking of static forces. It has also been demonstrated in a computer model using grains with charge carriers of both signs. Model results very closely resembled micro-g results with actual sand grains. Further computer modeling of the aggregation process has been conducted to improve our understanding of the aggregation process, and to provide a predictive tool for microgravity experiments slated for Space Station. These experiments will attempt to prove the dipole concept as outlined above. We have considerably enhanced the original computer model: refinements to the algorithm have improved the fidelity of grain behavior during grain contact, special attention has been paid to simulation time steps to enable establishment of a meaningful, quantitative time axis, and calibrations of rounding accuracies have been conducted to test cumulative numerical influences in the model. The model has been run for larger grain populations and variable initial cloud densities, and we have introduced random net charging to individual grains, as well as a net charge to the cloud as a whole. The model uses 3 positive and 3 negative charges randomly distributed on each grain, with up to 160 grains contained within various size "boxes" that define the initial number densities in the clouds. Each charge represents a localized charged region on a grain, but does not necessarily imply single quantized charge carriers. The Coulomb equations are then allowed to interact for each monopole: dipoles and any higher order charge coupling are a natural product of these "free" interactions over which the modeler exerts no influence. The charges are placed on surfaces of grains at random locations. A series of runs was conducted for neutral grains that had a perfect balance of negative and positive charge carriers. Runs were also conducted with grains having additional fractional charges ranging between 0 and 1. By adding fractional charges of one sign, the model created grain populations in which all grains had excess charges of the same sign, giving the cloud an overall net charge. This simulates clouds subjected to ionizing radiation (e.g., a protoplanetary debris disk around a protosun), or any other process of charge biasing in a grain population (e.g., volcanic plumes).
In another run series, random fractional charges of either sign were added to the grains so that some grains had a slight net positive charge while others had a slight net negative charge. This simulates triboelectrically-charged grain populations in which acquisition of an electron by one surface is at the expense of creating a hole elsewhere. This dual sign charging was applied in two ways: in one case the cloud remained neutral by ensuring that all grain excess charges added to zero; in the other case, the cloud was permitted a slight net charge by not imposing a charge-balance condition. Additional information is contained in the original.
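A minimal sketch of the monopole-level bookkeeping described above is shown below: each grain carries a few discrete positive and negative charges, and the net grain-grain force is obtained by summing all pairwise Coulomb interactions, so any dipole attraction emerges without being coded explicitly. Carrier counts, charge magnitudes and the two-grain geometry are hypothetical.

# Net Coulomb force between two grains, summed over discrete surface charges.
import numpy as np

K = 8.9875517923e9  # Coulomb constant, N*m^2/C^2

def net_force(carriers_a, carriers_b):
    """Net Coulomb force on grain A due to grain B.

    Each carrier is a (position_vector_m, charge_C) pair."""
    force = np.zeros(3)
    for pos_a, q_a in carriers_a:
        for pos_b, q_b in carriers_b:
            r = pos_a - pos_b
            dist = np.linalg.norm(r)
            force += K * q_a * q_b * r / dist**3
    return force

rng = np.random.default_rng(0)
e = 1.602e-19

def make_grain(center, radius=5e-6, n_pairs=3):
    """Grain with n_pairs positive and n_pairs negative carriers on its surface."""
    carriers = []
    for sign in (+1, -1):
        for _ in range(n_pairs):
            direction = rng.normal(size=3)
            direction /= np.linalg.norm(direction)
            carriers.append((center + radius * direction, sign * 1000 * e))
    return carriers  # net charge is zero, but random placement gives a dipole

grain_a = make_grain(np.array([0.0, 0.0, 0.0]))
grain_b = make_grain(np.array([20e-6, 0.0, 0.0]))
print("net force on grain A (N):", net_force(grain_a, grain_b))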
Learning fuzzy information in a hybrid connectionist, symbolic model
NASA Technical Reports Server (NTRS)
Romaniuk, Steve G.; Hall, Lawrence O.
1993-01-01
An instance-based learning system is presented. SC-net is a fuzzy hybrid connectionist, symbolic learning system. It remembers some examples and makes groups of examples into exemplars. All real-valued attributes are represented as fuzzy sets. The network representation and learning method is described. To illustrate this approach to learning in fuzzy domains, an example of segmenting magnetic resonance images of the brain is discussed. Clearly, the boundaries between human tissues are ill-defined or fuzzy. Example fuzzy rules for recognition are generated. Segmentations are presented that provide results that radiologists find useful.
Demand and willingness-to-pay for bed nets in Tanzania: results from a choice experiment.
Gingrich, Chris D; Ricotta, Emily; Kahwa, Amos; Kahabuka, Catherine; Koenker, Hannah
2017-07-14
Universal coverage campaigns for long-lasting insecticide-treated nets do not always reach the goal of one net for every two household members, and even when ownership of at least one net per household is high, many households may not own enough nets. The retail market provides these households options for replacing or increasing the number of nets they own with products that best fit their needs since a variety of net shapes, sizes, and colours are available. Hence, it is important to understand the factors affecting private net demand. This study explores private demand for nets in Tanzania using a discrete choice experiment. The experiment provides participants the option to buy nets with their own money, and thus should prove more accurate than a hypothetical survey of net preferences. Nearly 800 participants sampled in two regions showed an overall strong demand for nets, with 40% choosing to buy a net across all seven combinations of net prices and characteristics such as size, shape, and insecticide treatment. Only 8% of all participants chose not to buy a single net. A key factor influencing demand was whether a participant's household currently owned sufficient nets for all members, with rural participants showing lower net coverage and greater demand than urban participants. Both poor and less poor households showed strong evidence of making purchase decisions based on more than price alone. Mean willingness-to-pay values for a net started at US$1.10 and grew by US$0.50-1.40 for various attributes such as rectangular shape, large size, and insecticide treatment. The impact of price on demand was negative but small, with elasticity values between -0.25 and -0.45. The results suggest that private demand for nets in Tanzania could potentially supplement future coverage campaigns. Net manufacturers and retailers should advertise and promote consumers' preferred net attributes to improve sales and further expand net access and coverage. To overcome household liquidity concerns and best replicate the experiment results, policy makers should consider making credit available for interested buyers.
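For readers unfamiliar with the economics, the toy calculation below (with entirely hypothetical numbers, not the study's estimates) illustrates the two quantities reported above: a willingness-to-pay built up from a base value plus attribute premiums, and a midpoint (arc) price elasticity of demand chosen to land near the -0.25 to -0.45 range.

```python
# Toy illustration only: WTP build-up and arc price elasticity of demand.

def total_wtp(base=1.10, increments=None):
    """Sum a base WTP (US$) and per-attribute premiums (assumed values)."""
    increments = increments or {}
    return base + sum(increments.values())

# Hypothetical attribute premiums inside the US$0.50-1.40 range quoted above.
wtp = total_wtp(increments={"rectangular": 0.60, "large": 0.80, "treated": 1.20})
print(f"illustrative WTP: ${wtp:.2f}")

def arc_elasticity(q1, q2, p1, p2):
    """Midpoint (arc) price elasticity of demand."""
    dq = (q2 - q1) / ((q1 + q2) / 2)
    dp = (p2 - p1) / ((p1 + p2) / 2)
    return dq / dp

# Hypothetical uptake rates at two prices (US$), not survey results.
print("elasticity:", round(arc_elasticity(q1=0.42, q2=0.38, p1=2.0, p2=2.6), 2))
```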
36 CFR 1002.4 - Weapons, traps and nets.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 36 Parks, Forests, and Public Property 3 2012-07-01 2012-07-01 false Weapons, traps and nets. 1002... AND RECREATION § 1002.4 Weapons, traps and nets. (a)(1) Except as otherwise provided in this section, the following are prohibited: (i) Possessing a weapon, trap or net. (ii) Carrying a weapon, trap or...
36 CFR 1002.4 - Weapons, traps and nets.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 36 Parks, Forests, and Public Property 3 2014-07-01 2014-07-01 false Weapons, traps and nets. 1002... AND RECREATION § 1002.4 Weapons, traps and nets. (a)(1) Except as otherwise provided in this section, the following are prohibited: (i) Possessing a weapon, trap or net. (ii) Carrying a weapon, trap or...
26 CFR 1.1258-1 - Netting rule for certain conversion transactions.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 26 Internal Revenue 11 2010-04-01 2010-04-01 true Netting rule for certain conversion transactions....1258-1 Netting rule for certain conversion transactions. (a) Purpose. The purpose of this section is to provide taxpayers with a method to net certain gains and losses from positions of the same conversion...
26 CFR 1.1402(a)-7 - Net operating loss deduction.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 26 Internal Revenue 12 2010-04-01 2010-04-01 false Net operating loss deduction. 1.1402(a)-7...) INCOME TAX (CONTINUED) INCOME TAXES Tax on Self-Employment Income § 1.1402(a)-7 Net operating loss deduction. The deduction provided by section 172, relating to net operating losses sustained in years other...
In the Right Ballpark? Assessing the Accuracy of Net Price Calculators
ERIC Educational Resources Information Center
Anthony, Aaron M.; Page, Lindsay C.; Seldin, Abigail
2016-01-01
Large differences often exist between a college's sticker price and net price after accounting for financial aid. Net price calculators (NPCs) were designed to help students more accurately estimate their actual costs to attend a given college. This study assesses the accuracy of information provided by net price calculators. Specifically, we…
Nematodes enhance plant growth and nutrient uptake under C and N-rich conditions.
Gebremikael, Mesfin T; Steel, Hanne; Buchan, David; Bert, Wim; De Neve, Stefaan
2016-09-08
The role of soil fauna in crucial ecosystem services such as nutrient cycling remains poorly quantified, mainly because of the overly reductionistic approach adopted in most experimental studies. Given that increasing nitrogen inputs in various ecosystems influence the structure and functioning of soil microbes and the activity of fauna, we aimed to quantify the role of the entire soil nematode community in nutrient mineralization in an experimental set-up emulating nutrient-rich field conditions and accounting for crucial interactions amongst the soil microbial communities and plants. To this end, we reconstructed a complex soil foodweb in mesocosms that comprised largely undisturbed native microflora and the entire nematode community added into defaunated soil, planted with Lolium perenne as a model plant, and amended with fresh grass-clover residues. We determined N and P availability and plant uptake, plant biomass and abundance and structure of the microbial and nematode communities during a three-month incubation. The presence of nematodes significantly increased plant biomass production (+9%), net N (+25%) and net P (+23%) availability compared to their absence, demonstrating that nematodes link below- and above-ground processes, primarily through increasing nutrient availability. The experimental set-up presented here allows the crucial ecosystem services provided by the soil biota to be quantified realistically.
Nematodes enhance plant growth and nutrient uptake under C and N-rich conditions
NASA Astrophysics Data System (ADS)
Gebremikael, Mesfin T.; Steel, Hanne; Buchan, David; Bert, Wim; de Neve, Stefaan
2016-09-01
The role of soil fauna in crucial ecosystem services such as nutrient cycling remains poorly quantified, mainly because of the overly reductionistic approach adopted in most experimental studies. Given that increasing nitrogen inputs in various ecosystems influence the structure and functioning of soil microbes and the activity of fauna, we aimed to quantify the role of the entire soil nematode community in nutrient mineralization in an experimental set-up emulating nutrient-rich field conditions and accounting for crucial interactions amongst the soil microbial communities and plants. To this end, we reconstructed a complex soil foodweb in mesocosms that comprised largely undisturbed native microflora and the entire nematode community added into defaunated soil, planted with Lolium perenne as a model plant, and amended with fresh grass-clover residues. We determined N and P availability and plant uptake, plant biomass and abundance and structure of the microbial and nematode communities during a three-month incubation. The presence of nematodes significantly increased plant biomass production (+9%), net N (+25%) and net P (+23%) availability compared to their absence, demonstrating that nematodes link below- and above-ground processes, primarily through increasing nutrient availability. The experimental set-up presented here allows the crucial ecosystem services provided by the soil biota to be quantified realistically.
Scaling Deep Learning Workloads: NVIDIA DGX-1/Pascal and Intel Knights Landing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gawande, Nitin A.; Landwehr, Joshua B.; Daily, Jeffrey A.
Deep Learning (DL) algorithms have become ubiquitous in data analytics. As a result, major computing vendors --- including NVIDIA, Intel, AMD and IBM --- have architectural road-maps influenced by DL workloads. Furthermore, several vendors have recently advertised new computing products as accelerating DL workloads. Unfortunately, it is difficult for data scientists to quantify the potential of these different products. This paper provides a performance and power analysis of important DL workloads on two major parallel architectures: NVIDIA DGX-1 (eight Pascal P100 GPUs interconnected with NVLink) and Intel Knights Landing (KNL) CPUs interconnected with Intel Omni-Path. Our evaluation consists of a cross section of convolutional neural net workloads: CifarNet, CaffeNet, AlexNet and GoogleNet topologies using the Cifar10 and ImageNet datasets. The workloads are vendor optimized for each architecture. Our analysis indicates that although GPUs provide the highest overall raw performance, the gap can close for some convolutional networks, and KNL can be competitive when considering performance/watt. Furthermore, NVLink is critical to GPU scaling.
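The performance-per-watt comparison mentioned above reduces to a simple ratio once throughput and average power have been measured; the snippet below only shows that calculation, with made-up numbers rather than results from the paper.

```python
# Sketch of a throughput and performance-per-watt comparison.
# All figures are placeholders, not measurements from the study.

measurements = {
    # platform: (images/second, average board power in watts), assumed values
    "DGX-1 (8x P100)": (4200.0, 2400.0),
    "KNL cluster":     (1500.0,  900.0),
}

for name, (throughput, power) in measurements.items():
    print(f"{name:18s} {throughput:8.1f} img/s  {throughput / power:6.2f} img/s/W")
```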
Neural network classification of questionable EGRET events
NASA Astrophysics Data System (ADS)
Meetre, C. A.; Norris, J. P.
1992-02-01
High energy gamma rays (greater than 20 MeV) pair producing in the spark chamber of the Energetic Gamma Ray Telescope Experiment (EGRET) give rise to a characteristic but highly variable 3-D locus of spark sites, which must be processed to decide whether the event is to be included in the database. A significant fraction (about 15 percent, or ~10^4 events/day) of the candidate events cannot be categorized (accept/reject) by an automated rule-based procedure; they are therefore tagged, and must be examined and classified manually by a team of expert analysts. We describe a feedforward, back-propagation neural network approach to the classification of the questionable events. The algorithm computes a set of coefficients using representative exemplars drawn from the preclassified set of questionable events. These coefficients map a given input event into a decision vector that, ideally, describes the correct disposition of the event. The net's accuracy is then tested using a different subset of preclassified events. Preliminary results demonstrate the net's ability to correctly classify a large proportion of the events for some categories of questionables. Current work includes the use of much larger training sets to improve the accuracy of the net.
Neural network classification of questionable EGRET events
NASA Technical Reports Server (NTRS)
Meetre, C. A.; Norris, J. P.
1992-01-01
High energy gamma rays (greater than 20 MeV) pair producing in the spark chamber of the Energetic Gamma Ray Telescope Experiment (EGRET) give rise to a characteristic but highly variable 3-D locus of spark sites, which must be processed to decide whether the event is to be included in the database. A significant fraction (about 15 percent or 10(exp 4) events/day) of the candidate events cannot be categorized (accept/reject) by an automated rule-based procedure; they are therefore tagged, and must be examined and classified manually by a team of expert analysts. We describe a feedforward, back-propagation neural network approach to the classification of the questionable events. The algorithm computes a set of coefficients using representative exemplars drawn from the preclassified set of questionable events. These coefficients map a given input event into a decision vector that, ideally, describes the correct disposition of the event. The net's accuracy is then tested using a different subset of preclassified events. Preliminary results demonstrate the net's ability to correctly classify a large proportion of the events for some categories of questionables. Current work includes the use of much larger training sets to improve the accuracy of the net.
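A minimal sketch of the kind of classifier described in these two records, a feedforward net trained by back-propagation to map an event's feature vector to a decision vector, is shown below. The 16-element features, accept/reject labels, network size, and training rule are synthetic stand-ins for the preclassified EGRET exemplars, which are not reproduced here.

```python
import numpy as np

# Minimal feedforward/back-propagation sketch (not the EGRET pipeline).

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Synthetic stand-in for preclassified exemplars: 200 events, 16 features each.
X = rng.normal(size=(200, 16))
labels = (X[:, :4].sum(axis=1) > 0).astype(float)   # toy accept/reject rule
Y = np.stack([labels, 1.0 - labels], axis=1)        # two-element decision vector

# One hidden layer of 8 units.
W1 = rng.normal(scale=0.3, size=(16, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.3, size=(8, 2));  b2 = np.zeros(2)
lr = 0.5

for epoch in range(500):
    H = sigmoid(X @ W1 + b1)           # hidden activations
    P = sigmoid(H @ W2 + b2)           # output decision vector
    dP = (P - Y) * P * (1 - P)         # squared-error gradient at the output
    dH = (dP @ W2.T) * H * (1 - H)     # back-propagated hidden gradient
    W2 -= lr * (H.T @ dP) / len(X); b2 -= lr * dP.mean(axis=0)
    W1 -= lr * (X.T @ dH) / len(X); b1 -= lr * dH.mean(axis=0)

accuracy = np.mean((P[:, 0] > 0.5) == (Y[:, 0] > 0.5))
print(f"training accuracy: {accuracy:.2f}")
```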
Goicoechea, H C; Olivieri, A C
2001-07-01
A newly developed multivariate method involving net analyte preprocessing (NAP) was tested using central composite calibration designs of progressively decreasing size regarding the multivariate simultaneous spectrophotometric determination of three active components (phenylephrine, diphenhydramine and naphazoline) and one excipient (methylparaben) in nasal solutions. Its performance was evaluated and compared with that of partial least-squares (PLS-1). Minimisation of the calibration predicted error sum of squares (PRESS) as a function of a moving spectral window helped to select appropriate working spectral ranges for both methods. The comparison of NAP and PLS results was carried out using two tests: (1) the elliptical joint confidence region for the slope and intercept of a predicted versus actual concentrations plot for a large validation set of samples and (2) the D-optimality criterion concerning the information content of the calibration data matrix. Extensive simulations and experimental validation showed that, unlike PLS, the NAP method is able to furnish highly satisfactory results when the calibration set is reduced from a full four-component central composite to a fractional central composite, as expected from the modelling requirements of net analyte based methods.
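The PRESS-minimisation over a moving spectral window described above can be sketched as follows. This version uses ordinary PLS from scikit-learn on synthetic spectra (the net analyte preprocessing step itself is not reproduced), and the window width, step size, and component count are illustrative assumptions.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut

# Window selection by PRESS minimisation, sketched on synthetic data.
rng = np.random.default_rng(1)
n_samples, n_wavelengths = 30, 120
X = rng.normal(size=(n_samples, n_wavelengths))
y = X[:, 40:60].sum(axis=1) + 0.1 * rng.normal(size=n_samples)  # analyte "signal"

def press(X_window, y, n_components=2):
    """Predicted residual error sum of squares via leave-one-out CV."""
    total = 0.0
    for train, test in LeaveOneOut().split(X_window):
        model = PLSRegression(n_components=n_components)
        model.fit(X_window[train], y[train])
        total += float(np.sum((model.predict(X_window[test]).ravel() - y[test]) ** 2))
    return total

window_width = 30
best = min(
    ((start, press(X[:, start:start + window_width], y))
     for start in range(0, n_wavelengths - window_width, 10)),
    key=lambda item: item[1],
)
print(f"best window starts at channel {best[0]} with PRESS = {best[1]:.2f}")
```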
NASA Astrophysics Data System (ADS)
Hsieh, Fu-Shiung
2011-03-01
Design of robust supervisory controllers for manufacturing systems with unreliable resources has received significant attention recently. Robustness analysis provides an alternative way to analyse a perturbed system to quickly respond to resource failures. Although we have analysed the robustness properties of several subclasses of ordinary Petri nets (PNs), analysis for non-ordinary PNs has not been done. Non-ordinary PNs have weighted arcs and have the advantage of compactly modelling operations requiring multiple parts or resources. In this article, we consider a class of flexible assembly/disassembly manufacturing systems and propose a non-ordinary flexible assembly/disassembly Petri net (NFADPN) model for this class of systems. As the class of flexible assembly/disassembly manufacturing systems can be regarded as the integration and interactions of a set of assembly/disassembly subprocesses, a bottom-up approach is adopted in this article to construct the NFADPN models. Due to the routing flexibility in NFADPN, there may exist different ways to accomplish the tasks. To characterise these different ways, we propose the concept of completely connected subprocesses. As long as there exists a set of completely connected subprocesses for a certain type of product, production of that product type can still be maintained without requiring the whole NFADPN to be live. To take advantage of the alternative routes without enforcing liveness for the whole system, we generalise the previously proposed concept of persistent production to NFADPN. We propose a condition for persistent production based on the concept of completely connected subprocesses. We extend robustness analysis to NFADPN by exploiting its structure. We identify several patterns of resource failures and characterise the conditions to maintain operation in the presence of resource failures.
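To make the "weighted arcs" point concrete, here is a minimal sketch (not the NFADPN model) of a Petri net whose transition consumes and produces multiple tokens per firing, as needed when an assembly step requires several parts at once. Place and transition names are invented for illustration.

```python
# Minimal Petri net with weighted arcs (the "non-ordinary" case): a transition
# is enabled when every input place holds at least the arc weight, and firing
# moves tokens according to the weights.

class WeightedPetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)     # place -> token count
        self.transitions = {}            # name -> (inputs, outputs)

    def add_transition(self, name, inputs, outputs):
        """inputs/outputs map place -> arc weight."""
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= w for p, w in inputs.items())

    def fire(self, name):
        if not self.enabled(name):
            raise RuntimeError(f"{name} is not enabled")
        inputs, outputs = self.transitions[name]
        for p, w in inputs.items():
            self.marking[p] -= w
        for p, w in outputs.items():
            self.marking[p] = self.marking.get(p, 0) + w

# Assembly step consuming two parts and one robot, producing one subassembly.
net = WeightedPetriNet({"part": 4, "robot": 1, "subassembly": 0})
net.add_transition("assemble", inputs={"part": 2, "robot": 1},
                   outputs={"subassembly": 1, "robot": 1})
while net.enabled("assemble"):
    net.fire("assemble")
print(net.marking)   # {'part': 0, 'robot': 1, 'subassembly': 2}
```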
Martínez Vega, Mabel V; Sharifzadeh, Sara; Wulfsohn, Dvoralai; Skov, Thomas; Clemmensen, Line Harder; Toldam-Andersen, Torben B
2013-12-01
Visible-near infrared spectroscopy remains a method of increasing interest as a fast alternative for the evaluation of fruit quality. The success of the method is assumed to be achieved by using large sets of samples to produce robust calibration models. In this study we used representative samples of an early and a late season apple cultivar to evaluate model robustness (in terms of prediction ability and error) for soluble solids content (SSC) and acidity prediction, in the wavelength range 400-1100 nm. A total of 196 middle-early season and 219 late season apple (Malus domestica Borkh.) samples of cvs 'Aroma' and 'Holsteiner Cox' were used to construct spectral models for SSC and acidity. Partial least squares (PLS), ridge regression (RR) and elastic net (EN) models were used to build prediction models. Furthermore, we compared three sub-sample arrangements for forming training and test sets ('smooth fractionator', by date of measurement after harvest, and random). Using the 'smooth fractionator' sampling method, fewer spectral bands (26) and elastic net resulted in improved performance for SSC models of 'Aroma' apples, with a coefficient of variation CV(SSC) = 13%. The model showed consistently low errors and bias (PLS/EN: R(2)cal = 0.60/0.60; SEC = 0.88/0.88 °Brix; Bias(cal) = 0.00/0.00; R(2)val = 0.33/0.44; SEP = 1.14/1.03; Bias(val) = 0.04/0.03). However, the predictions of acidity and SSC (CV = 5%) for the late cultivar 'Holsteiner Cox' were inferior to those for 'Aroma'. It was possible to construct local SSC and acidity calibration models for early season apple cultivars with CVs of SSC and acidity around 10%. The overall model performance on these data sets also depends on the proper selection of training and test sets. The 'smooth fractionator' protocol provided an objective method for obtaining training and test sets that capture the existing variability of the fruit samples for construction of visible-NIR prediction models. The implication is that by using such 'efficient' sampling methods for obtaining an initial sample of fruit that represents the variability of the population, and for sub-sampling to form training and test sets, it should be possible to use relatively small sample sizes to develop spectral predictions of fruit quality. Using feature selection and elastic net appears to improve the SSC model performance in terms of R(2), RMSECV and RMSEP for 'Aroma' apples. © 2013 Society of Chemical Industry.
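A bare-bones version of the elastic-net calibration step, using scikit-learn on synthetic "spectra", is sketched below. Note that the study's 'smooth fractionator' train/test protocol is replaced here by a plain random split, and the data, wavelengths, and parameters are placeholders rather than values from the paper.

```python
import numpy as np
from sklearn.linear_model import ElasticNetCV
from sklearn.model_selection import train_test_split

# Elastic-net calibration of a spectral quality trait (e.g. SSC), on toy data.
rng = np.random.default_rng(7)
n_fruit, n_bands = 200, 350
spectra = rng.normal(size=(n_fruit, n_bands))
ssc = 12.0 + spectra[:, 100:126] @ rng.normal(scale=0.2, size=26) \
      + 0.5 * rng.normal(size=n_fruit)          # Brix-like response on ~26 bands

X_train, X_test, y_train, y_test = train_test_split(
    spectra, ssc, test_size=0.3, random_state=0)

model = ElasticNetCV(l1_ratio=[0.2, 0.5, 0.8], cv=5, max_iter=5000)
model.fit(X_train, y_train)
pred = model.predict(X_test)
rmsep = float(np.sqrt(np.mean((pred - y_test) ** 2)))
n_selected = int(np.sum(model.coef_ != 0))
print(f"RMSEP = {rmsep:.2f} Brix, {n_selected} bands retained")
```

The sparsity of the elastic-net coefficients is what plays the role of the feature (band) selection mentioned above.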
Synchronous, Remote, Internet Conferencing with Unique Populations in Various Settings.
ERIC Educational Resources Information Center
Mallory, James R.; MacKenzie, Douglas
This paper focuses on the authors' experiences with interactive, synchronous Internet video conferencing using Microsoft's NetMeeting software with deaf and hard-of-hearing students in two different settings. One setting involved teaching and tutoring computer programming to remote deaf and hard-of-hearing students in a remote situation using…
Picado, Albert; Kumar, Vijay; Das, Murari; Burniston, Ian; Roy, Lalita; Suman, Rijal; Dinesh, Diwakar; Coosemans, Marc; Sundar, Shyam; Shreekant, Kesari; Boelaert, Marleen; Davies, Clive; Cameron, Mary
2009-12-01
Observational studies in the Indian subcontinent have shown that untreated nets may be protective against visceral leishmaniasis (VL). In this study, we evaluated the effect of untreated nets on the blood feeding rates of Phlebotomus argentipes as well as the human blood index (HBI) in VL endemic villages in India and Nepal. The study had a 'before and after intervention' design in 58 households in six clusters. The use of untreated nets reduced the blood feeding rate by 85% (95% CI 76.5-91.1%) and the HBI by 42.2% (95% CI 11.1-62.5%). These results provide circumstantial evidence that untreated nets may provide some degree of personal protection against sand fly bites.
Net-mortality of Common Murres and Atlantic Puffins in Newfoundland, 1951-81
Piatt, John F.; Nettleship, David N.; Threlfall, William; Nettleship, David N.; Sanger, Gerald A.; Springer, Paul F.
1982-01-01
Band recoveries (N = 315) over 26 years (1951-77) and three surveys of seabird bycatch in inshore fishing nets (1972, 1980-81) indicate that there has been a substantial net-mortality of Atlantic Puffins (Fratercula arctica) and Common Murres (Uria aalge) in Newfoundland coastal waters for the past 2 decades. Offshore (e.g. Grand Banks) gill-netting is limited, but some data suggest that murre net-mortality also occurs offshore at murre wintering areas. The vast majority of inshore net-mortality incidents occur over a 2-week period during the annual inshore spawning migration of capelin (Mallotus villosus), the major prey item for alcids in eastern Canada. Most murres (83%) were drowned in bottom-set (30-185 m) cod (Gadus morhua) gill nets, whereas more puffins were drowned in surface-set salmon (Salmo salar) gill nets or cod traps (55%) than in cod gillnets (45%). Murre band recoveries, colony censuses, and fishing-effort data suggest that at the second largest Common Murre colony in Newfoundland (Witless Bay Seabird Sanctuary, 77,000 breeding pairs) net-mortality was relatively low in the 1950s and early 1960s, but increased during the 1960s as the murre population grew in size and gill-net fishing effort increased in the colony area. By 1971, net-mortality accounted for 70% of murre band recoveries and calculations show that almost 30,000 breeding adults, or about 20% of the local breeding population, were drowned in that year. More reliable estimates of alcid bycatch in the Witless Bay area have been made on the basis of actual bycatch surveys. In 1972 about 20,000 adult murres, or 13% of the breeding stock, were killed in gill-nets. Net-mortality of murres apparently diminished through the 1970s as capelin stocks declined and fewer birds foraged in heavily netted inshore areas. Bycatch surveys in the Witless Bay area in 1980-81 revealed that, relative to previous years, murre net-mortality was greatly reduced and resulted in the loss of only 3-4% of the breeding stock. Even these low mortality rates, however, are cause for concern as adult murre mortality from all sources (including hunting, oil, and natural mortality) should not exceed 6-12% per annum to maintain a stable breeding population. Little is known about the magnitude of net-mortality at other major Newfoundland murre colonies though it is known to be a problem in all colony areas. The bycatch of adult Atlantic Puffins in the Witless Bay area was low compared to murre bycatch and in 3 years of study never exceeded 1.6% of the breeding population. During the 1970s, fishing effort increased five-fold in colony areas and we predict that if capelin spawning stocks return to early 1970s size, then net-mortality of puffins and murres in Newfoundland coastal regions will increase dramatically. Indeed, preliminary examination of 1982 capelin spawning and seabird bycatch data suggests that capelin were much more abundant inshore and murre bycatch increased two- to three-fold over 1981.
Blanc-Durand, Paul; Van Der Gucht, Axel; Schaefer, Niklaus; Itti, Emmanuel; Prior, John O
2018-01-01
Amino-acid positron emission tomography (PET) is increasingly used in the diagnostic workup of patients with gliomas, including differential diagnosis, evaluation of tumor extension, treatment planning and follow-up. Recently, progress in computer vision and machine learning has been translated to medical imaging. The aim was to demonstrate the feasibility of automated 18F-fluoro-ethyl-tyrosine (18F-FET) PET lesion detection and segmentation relying on a full 3D U-Net Convolutional Neural Network (CNN). All dynamic 18F-FET PET brain image volumes were temporally realigned to the first dynamic acquisition, coregistered and spatially normalized onto the Montreal Neurological Institute template. Ground truth segmentations were obtained using manual delineation and thresholding (1.3 x background). The volumetric CNN was implemented based on a modified Keras implementation of a U-Net library with 3 layers for the encoding and decoding paths. The Dice similarity coefficient (DSC) was used as an accuracy measure of segmentation. Thirty-seven patients were included (26 [70%] in the training set and 11 [30%] in the validation set). All 11 lesions were accurately detected with no false positives, resulting in a sensitivity and a specificity for detection at the tumor level of 100%. After 150 epochs, DSC reached 0.7924 in the training set and 0.7911 in the validation set. After morphological dilatation and fixed thresholding of the predicted U-Net mask, a substantial improvement of the DSC to 0.8231 (+4.1%) was noted. At the voxel level, this segmentation led to a 0.88 sensitivity [95% CI, 87.1 to 88.2%], a 0.99 specificity [99.9 to 99.9%], a 0.78 positive predictive value [76.9 to 78.3%], and a 0.99 negative predictive value [99.9 to 99.9%]. With this relatively high performance, we propose the first fully automated 3D procedure for segmentation of 18F-FET PET brain images of patients with different gliomas using a U-Net CNN architecture.
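The evaluation metric and post-processing step mentioned above are easy to reproduce in isolation. The sketch below computes the Dice similarity coefficient for a toy 3-D mask and applies a morphological dilation followed by a fixed re-threshold; the volumes, probabilities, and thresholds are invented for illustration, and the U-Net itself is not reproduced.

```python
import numpy as np
from scipy.ndimage import binary_dilation

def dice(a, b):
    """Dice similarity coefficient between two binary masks."""
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

rng = np.random.default_rng(3)
truth = np.zeros((32, 32, 32), dtype=bool)
truth[10:20, 10:20, 10:20] = True                       # toy "lesion"
probs = np.clip(truth * 0.8 + rng.normal(scale=0.1, size=truth.shape), 0, 1)

raw_mask = probs > 0.5                                   # plain threshold
refined = binary_dilation(raw_mask) & (probs > 0.3)      # dilate, re-threshold

print(f"Dice (raw)     = {dice(raw_mask, truth):.3f}")
print(f"Dice (refined) = {dice(refined, truth):.3f}")
```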
36 CFR § 1002.4 - Weapons, traps and nets.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 36 Parks, Forests, and Public Property 3 2013-07-01 2012-07-01 true Weapons, traps and nets. ... USE AND RECREATION § 1002.4 Weapons, traps and nets. (a)(1) Except as otherwise provided in this section, the following are prohibited: (i) Possessing a weapon, trap or net. (ii) Carrying a weapon, trap...
Comparing the net cost of CSP-TES to PV deployed with battery storage
NASA Astrophysics Data System (ADS)
Jorgenson, Jennie; Mehos, Mark; Denholm, Paul
2016-05-01
Concentrated solar power with thermal energy storage (CSP-TES) is a unique source of renewable energy in that its energy can be shifted over time and it can provide the electricity system with dependable generation capacity. In this study, we provide a framework to determine whether the benefits of CSP-TES (shiftable energy and the ability to provide firm capacity) exceed the benefits of PV combined with firm capacity sources such as long-duration battery storage or conventional natural gas combustion turbines (CTs). The results of this study, using current capital cost estimates, indicate that a combination of PV and conventional gas CTs provides a lower net cost than CSP-TES or PV with batteries, although some configurations of CSP-TES have a lower net cost than PV with batteries even for the lowest battery cost estimate. Using projected capital cost targets, some configurations of CSP-TES have a lower net cost than PV with either option, even for the lowest battery cost estimate. The net cost of CSP-TES varies with configuration, and lower solar multiples coupled with less storage are more attractive at current cost levels, due to high component costs. However, higher solar multiples show a lower net cost using projected future costs for heliostats and thermal storage materials.
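The "net cost" framing described above amounts to annualizing capital and fixed costs and subtracting the value of the energy and firm capacity each portfolio provides. The fragment below shows only that arithmetic; the capital costs, value terms, and capital recovery factor are placeholders and are not taken from the study.

```python
# Back-of-the-envelope net-cost comparison with placeholder numbers.

def net_cost(capital, fixed_om, energy_value, capacity_value, crf=0.07):
    """Annualized net cost ($/yr): costs minus energy and capacity value."""
    return capital * crf + fixed_om - energy_value - capacity_value

portfolios = {
    "CSP-TES":     net_cost(capital=600e6, fixed_om=10e6,
                            energy_value=45e6, capacity_value=12e6),
    "PV + battery": net_cost(capital=550e6, fixed_om=8e6,
                             energy_value=40e6, capacity_value=10e6),
    "PV + gas CT":  net_cost(capital=480e6, fixed_om=9e6,
                             energy_value=40e6, capacity_value=11e6),
}
for name, cost in sorted(portfolios.items(), key=lambda kv: kv[1]):
    print(f"{name:12s} net cost = ${cost / 1e6:7.1f}M/yr")
```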
Kim, Woo-Yeon; Kang, Sungsoo; Kim, Byoung-Chul; Oh, Jeehyun; Cho, Seongwoong; Bhak, Jong; Choi, Jong-Soon
2008-01-01
Cyanobacteria are model organisms for studying photosynthesis, carbon and nitrogen assimilation, evolution of plant plastids, and adaptability to environmental stresses. Despite many studies on cyanobacteria, there is no web-based database of their regulatory and signaling protein-protein interaction networks to date. We report a database and website SynechoNET that provides predicted protein-protein interactions. SynechoNET shows cyanobacterial domain-domain interactions as well as their protein-level interactions using the model cyanobacterium, Synechocystis sp. PCC 6803. It predicts the protein-protein interactions using public interaction databases that contain mutually complementary and redundant data. Furthermore, SynechoNET provides information on transmembrane topology, signal peptide, and domain structure in order to support the analysis of regulatory membrane proteins. Such biological information can be queried and visualized in user-friendly web interfaces that include the interactive network viewer and search pages by keyword and functional category. SynechoNET is an integrated protein-protein interaction database designed to analyze regulatory membrane proteins in cyanobacteria. It provides a platform for biologists to extend the genomic data of cyanobacteria by predicting interaction partners, membrane association, and membrane topology of Synechocystis proteins. SynechoNET is freely available at http://synechocystis.org/ or directly at http://bioportal.kobic.kr/SynechoNET/.
Evaluation of Contrail Reduction Strategies Based on Environmental and Operational Costs
NASA Technical Reports Server (NTRS)
Chen, Neil Y.; Sridhar, Banavar; Ng, Hok K.; Li, Jinhua
2013-01-01
This paper evaluates a set of contrail reduction strategies based on environmental and operational costs. A linear climate model was first used to convert the climate effects of carbon dioxide emissions and aircraft contrails to changes in Absolute Global Temperature Potential, a metric that measures the mean surface temperature change due to aircraft emissions and persistent contrail formations. The concept of the social cost of carbon and the carbon auction price from California's recent cap-and-trade system were then used to relate the carbon dioxide emissions and contrail formations to an environmental cost index. The strategy for contrail reduction is based on minimizing contrail formations by altering the aircraft's cruising altitude. The strategy uses a user-defined factor to trade off between contrail reduction and additional fuel burn and carbon dioxide emissions. A higher value of the tradeoff factor results in more contrail reduction but also more fuel burn and carbon emissions. The strategy is considered favorable when the net environmental cost benefit exceeds the operational cost. The results show how the net environmental benefit varies with different decision-making time horizons and different carbon costs. The cost models provide guidance for selecting the trade-off factor that will result in the greatest net environmental benefit.
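The decision rule described above, that an altitude change is favorable when the avoided environmental cost exceeds the added operational cost, can be written compactly. The contrail cost per kilometre, carbon price, and fuel price below are placeholder values, not the paper's cost index.

```python
# Toy net-benefit calculation for a contrail-avoiding altitude change.

def net_benefit(contrail_km_avoided, extra_fuel_kg,
                contrail_cost_per_km=2.0,      # $ per km of persistent contrail (assumed)
                co2_per_kg_fuel=3.16,          # approx. kg CO2 per kg jet fuel burned
                carbon_price=40.0,             # $ per tonne CO2 (assumed)
                fuel_price=0.80):              # $ per kg fuel (assumed)
    env_saving = contrail_km_avoided * contrail_cost_per_km
    env_penalty = extra_fuel_kg * co2_per_kg_fuel / 1000.0 * carbon_price
    operational = extra_fuel_kg * fuel_price
    return env_saving - env_penalty - operational

for extra_fuel in (50, 200, 800):   # kg of extra fuel for the altitude change
    print(f"extra fuel {extra_fuel:4d} kg -> net benefit "
          f"${net_benefit(300, extra_fuel):8.2f}")
```

As in the paper's framing, the reroute stops being worthwhile once the fuel and emissions penalty outweighs the avoided contrail cost.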
Lifecycle greenhouse gas implications of US national scenarios for cellulosic ethanol production
NASA Astrophysics Data System (ADS)
Scown, Corinne D.; Nazaroff, William W.; Mishra, Umakant; Strogen, Bret; Lobscheid, Agnes B.; Masanet, Eric; Santero, Nicholas J.; Horvath, Arpad; McKone, Thomas E.
2012-03-01
The Energy Independence and Security Act of 2007 set an annual US national production goal of 39.7 billion l of cellulosic ethanol by 2020. This paper explores the possibility of meeting that target by growing and processing Miscanthus × giganteus. We define and assess six production scenarios in which active cropland and/or Conservation Reserve Program land are used to grow Miscanthus. The crop and biorefinery locations are chosen with consideration of economic, land-use, water management and greenhouse gas (GHG) emissions reduction objectives. Using lifecycle assessment, the net GHG footprint of each scenario is evaluated, providing insight into the climate costs and benefits associated with each scenario’s objectives. Assuming that indirect land-use change is successfully minimized or mitigated, the results suggest two major drivers for the overall GHG impact of cellulosic ethanol from Miscanthus: (a) net soil carbon sequestration or emissions during Miscanthus cultivation and (b) GHG offset credits for electricity exported by biorefineries to the grid. Without these factors, the GHG intensity of bioethanol from Miscanthus is calculated to be 11-13 g CO2-equivalent per MJ of fuel, which is 80-90% lower than gasoline. Including soil carbon sequestration and the power-offset credit results in net GHG sequestration of up to 26 g CO2-equivalent per MJ of fuel.
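The role of the two drivers identified above, soil carbon sequestration and the power-offset credit, can be illustrated with simple lifecycle bookkeeping. The base fuel-cycle intensity, credit sizes, and gasoline reference below are round illustrative numbers chosen to land near the ranges quoted in the abstract, not the study's inventory values.

```python
# Illustrative lifecycle GHG accounting for cellulosic ethanol (g CO2e per MJ).

def ghg_intensity(base_g_per_mj=12.0,          # farming + transport + conversion (assumed)
                  soil_c_credit_g_per_mj=0.0,  # net soil C sequestration credit
                  power_offset_g_per_mj=0.0):  # credit for exported electricity
    return base_g_per_mj - soil_c_credit_g_per_mj - power_offset_g_per_mj

gasoline = 94.0   # g CO2e/MJ, a commonly used reference value
for label, kwargs in [
    ("no credits",            {}),
    ("with soil C credit",    {"soil_c_credit_g_per_mj": 15.0}),
    ("soil C + power offset", {"soil_c_credit_g_per_mj": 15.0,
                               "power_offset_g_per_mj": 20.0}),
]:
    g = ghg_intensity(**kwargs)
    # A negative intensity corresponds to net sequestration per MJ of fuel.
    print(f"{label:22s} {g:6.1f} g CO2e/MJ  "
          f"({100 * (1 - g / gasoline):6.1f}% below gasoline)")
```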
Understanding and preventing type 1 diabetes through the unique working model of TrialNet.
Battaglia, Manuela; Anderson, Mark S; Buckner, Jane H; Geyer, Susan M; Gottlieb, Peter A; Kay, Thomas W H; Lernmark, Åke; Muller, Sarah; Pugliese, Alberto; Roep, Bart O; Greenbaum, Carla J; Peakman, Mark
2017-11-01
Type 1 diabetes is an autoimmune disease arising from the destruction of pancreatic insulin-producing beta cells. The disease represents a continuum, progressing sequentially at variable rates through identifiable stages prior to the onset of symptoms, through diagnosis and into the critical periods that follow, culminating in a variable depth of beta cell depletion. The ability to identify the very earliest of these presymptomatic stages has provided a setting in which prevention strategies can be trialled, as well as furnishing an unprecedented opportunity to study disease evolution, including intrinsic and extrinsic initiators and drivers. This niche opportunity is occupied by Type 1 Diabetes TrialNet, an international consortium of clinical trial centres that leads the field in intervention and prevention studies, accompanied by deep longitudinal bio-sampling. In this review, we focus on discoveries arising from this unique bioresource, comprising more than 70,000 samples, and outline the processes and science that have led to new biomarkers and mechanistic insights, as well as identifying new challenges and opportunities. We conclude that via integration of clinical trials and mechanistic studies, drawing in clinicians and scientists and developing partnership with industry, TrialNet embodies an enviable and unique working model for understanding a disease that to date has no cure and for designing new therapeutic approaches.
Understanding and preventing type 1 diabetes through the unique working model of TrialNet
Anderson, Mark S.; Buckner, Jane H.; Geyer, Susan M.; Gottlieb, Peter A.; Kay, Thomas W. H.; Lernmark, Åke; Muller, Sarah; Pugliese, Alberto; Roep, Bart O.; Greenbaum, Carla J.
2018-01-01
Type 1 diabetes is an autoimmune disease arising from the destruction of pancreatic insulin-producing beta cells. The disease represents a continuum, progressing sequentially at variable rates through identifiable stages prior to the onset of symptoms, through diagnosis and into the critical periods that follow, culminating in a variable depth of beta cell depletion. The ability to identify the very earliest of these presymptomatic stages has provided a setting in which prevention strategies can be trialled, as well as furnishing an unprecedented opportunity to study disease evolution, including intrinsic and extrinsic initiators and drivers. This niche opportunity is occupied by Type 1 Diabetes TrialNet, an international consortium of clinical trial centres that leads the field in intervention and prevention studies, accompanied by deep longitudinal bio-sampling. In this review, we focus on discoveries arising from this unique bioresource, comprising more than 70,000 samples, and outline the processes and science that have led to new biomarkers and mechanistic insights, as well as identifying new challenges and opportunities. We conclude that via integration of clinical trials and mechanistic studies, drawing in clinicians and scientists and developing partnership with industry, TrialNet embodies an enviable and unique working model for understanding a disease that to date has no cure and for designing new therapeutic approaches. PMID:28770323
Tokdemir, Sibel; Nelson, William H
2005-06-01
Three radical species were detected in an EPR/ENDOR study of X-irradiated hypoxanthine.HCl.H2O single crystals at room temperature: RI was identified as the product of net H addition to C8, RII was identified as the product of net H addition to C2, and RIII was identified as the product of OH addition to C8. The observed set of radicals was the same for room-temperature irradiation as for irradiation at 10 K followed by warming the crystals to room temperature; however, the C2 H-addition and C8 OH-addition radicals were not detectable after storage of the crystals for about 2 months at room temperature. Use of selectively deuterated crystals permitted unique assignment of the observed hyperfine couplings, and results of density functional theory calculations on each of the radical structures were consistent with the experimental results. Comparison of these experimental results with others from previous crystal-based systems and model system computations provides insight into the mechanisms by which the biologically important purine C8 hydroxyl addition products are formed. The evidence from solid systems supports the mechanism of net water addition to one-electron oxidized purine bases and demonstrates the importance of a facial approach between the reactants.
Papatheodorou, Irene; Ziehm, Matthias; Wieser, Daniela; Alic, Nazif; Partridge, Linda; Thornton, Janet M.
2012-01-01
A challenge of systems biology is to integrate incomplete knowledge on pathways with existing experimental data sets and relate these to measured phenotypes. Research on ageing often generates such incomplete data, creating difficulties in integrating RNA expression with information about biological processes and the phenotypes of ageing, including longevity. Here, we develop a logic-based method that employs Answer Set Programming, and use it to infer signalling effects of genetic perturbations, based on a model of the insulin signalling pathway. We apply our method to RNA expression data from Drosophila mutants in the insulin pathway that alter lifespan, in a foxo dependent fashion. We use this information to deduce how the pathway influences lifespan in the mutant animals. We also develop a method for inferring the largest common sub-paths within each of our signalling predictions. Our comparisons reveal consistent homeostatic mechanisms across both long- and short-lived mutants. The transcriptional changes observed in each mutation usually provide negative feedback to signalling predicted for that mutation. We also identify an S6K-mediated feedback in two long-lived mutants that suggests a crosstalk between these pathways in mutants of the insulin pathway, in vivo. By formulating the problem as a logic-based theory in a qualitative fashion, we are able to use the efficient search facilities of Answer Set Programming, allowing us to explore larger pathways, combine molecular changes with pathways and phenotype and infer effects on signalling in in vivo, whole-organism, mutants, where direct signalling stimulation assays are difficult to perform. Our methods are available in the web-service NetEffects: http://www.ebi.ac.uk/thornton-srv/software/NetEffects. PMID:23251396
Papatheodorou, Irene; Ziehm, Matthias; Wieser, Daniela; Alic, Nazif; Partridge, Linda; Thornton, Janet M
2012-01-01
A challenge of systems biology is to integrate incomplete knowledge on pathways with existing experimental data sets and relate these to measured phenotypes. Research on ageing often generates such incomplete data, creating difficulties in integrating RNA expression with information about biological processes and the phenotypes of ageing, including longevity. Here, we develop a logic-based method that employs Answer Set Programming, and use it to infer signalling effects of genetic perturbations, based on a model of the insulin signalling pathway. We apply our method to RNA expression data from Drosophila mutants in the insulin pathway that alter lifespan, in a foxo dependent fashion. We use this information to deduce how the pathway influences lifespan in the mutant animals. We also develop a method for inferring the largest common sub-paths within each of our signalling predictions. Our comparisons reveal consistent homeostatic mechanisms across both long- and short-lived mutants. The transcriptional changes observed in each mutation usually provide negative feedback to signalling predicted for that mutation. We also identify an S6K-mediated feedback in two long-lived mutants that suggests a crosstalk between these pathways in mutants of the insulin pathway, in vivo. By formulating the problem as a logic-based theory in a qualitative fashion, we are able to use the efficient search facilities of Answer Set Programming, allowing us to explore larger pathways, combine molecular changes with pathways and phenotype and infer effects on signalling in in vivo, whole-organism, mutants, where direct signalling stimulation assays are difficult to perform. Our methods are available in the web-service NetEffects: http://www.ebi.ac.uk/thornton-srv/software/NetEffects.
Romero-Campero, Francisco J; Perez-Hurtado, Ignacio; Lucas-Reina, Eva; Romero, Jose M; Valverde, Federico
2016-03-12
Chlamydomonas reinhardtii is the model organism that serves as a reference for studies in algal genomics and physiology. It is of special interest in the study of the evolution of regulatory pathways from algae to higher plants. Additionally, it has recently gained attention as a potential source for bio-fuel and bio-hydrogen production. The genome of Chlamydomonas is available, facilitating the analysis of its transcriptome by RNA-seq data. This has produced a massive amount of data that remains fragmented, making integrative approaches based on molecular systems biology necessary. We constructed a gene co-expression network based on RNA-seq data and developed a web-based tool, ChlamyNET, for the exploration of the Chlamydomonas transcriptome. ChlamyNET exhibits a scale-free and small-world topology. Applying clustering techniques, we identified nine gene clusters that capture the structure of the transcriptome under the analyzed conditions. One of the most central clusters was shown to be involved in carbon/nitrogen metabolism and signalling, whereas one of the most peripheral clusters was involved in DNA replication and cell cycle regulation. The transcription factors and regulators in the Chlamydomonas genome have been identified in ChlamyNET. The biological processes potentially regulated by them, as well as their putative transcription factor binding sites, were determined. The putative light-regulated transcription factors and regulators in the Chlamydomonas genome were analyzed in order to provide a case study on the use of ChlamyNET. Finally, we used an independent data set to cross-validate the predictive power of ChlamyNET. The topological properties of ChlamyNET suggest that the Chlamydomonas transcriptome possesses important characteristics related to error tolerance, vulnerability and information propagation. The central part of ChlamyNET constitutes the core of the transcriptome, where the most authoritative hub genes are located, interconnecting key biological processes such as light response with carbon and nitrogen metabolism. Our study reveals that key elements in the regulation of carbon and nitrogen metabolism, light response and cell cycle identified in higher plants were already established in Chlamydomonas. These conserved elements are not only limited to transcription factors, regulators and their targets, but also include the cis-regulatory elements recognized by them.
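The core construction behind such a co-expression network, linking genes whose expression profiles are strongly correlated and then inspecting hub genes, can be sketched with networkx on synthetic data. The correlation threshold and expression matrix are illustrative, and none of ChlamyNET's clustering or annotation layers are reproduced here.

```python
import numpy as np
import networkx as nx

# Toy gene co-expression network: connect genes whose expression profiles
# across samples have |Pearson correlation| above a threshold.
rng = np.random.default_rng(11)
n_genes, n_samples = 60, 20
expr = rng.normal(size=(n_genes, n_samples))
expr[:10] += rng.normal(size=n_samples)      # make the first 10 genes co-vary

corr = np.corrcoef(expr)
threshold = 0.7
G = nx.Graph()
G.add_nodes_from(range(n_genes))
for i in range(n_genes):
    for j in range(i + 1, n_genes):
        if abs(corr[i, j]) >= threshold:
            G.add_edge(i, j, weight=float(corr[i, j]))

degrees = dict(G.degree())
hubs = sorted(degrees, key=degrees.get, reverse=True)[:5]
print("edges:", G.number_of_edges(), "top hub genes:", hubs)
```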
The association between household bed net ownership and all-cause child mortality in Madagascar.
Meekers, Dominique; Yukich, Joshua O
2016-09-17
Malaria continues to be an important cause of morbidity and mortality in Madagascar. It has been estimated that the malaria burden costs Madagascar over $52 million annually in terms of treatment costs, lost productivity and prevention expenses. One of the key malaria prevention strategies of the Government of Madagascar consists of large-scale mass distribution campaigns of long-lasting insecticide-treated bed nets (LLIN). Although there is ample evidence that child mortality has decreased in Madagascar, it is unclear whether increases in LLIN ownership have contributed to this decline. This study analyses multiple recent cross-sectional survey data sets to examine the association between household bed net ownership and all-cause child mortality. Data on household-level bed net ownership confirm that the percentage of households that own one or more bed nets increased substantially following the 2009 and 2010 mass LLIN distribution campaigns. Additionally, all-cause child mortality in Madagascar has declined during the period 2008-2013. Bed net ownership was associated with a 22 % reduction in the all-cause child mortality hazard in Madagascar. Mass bed net distributions contributed strongly to the overall decline in child mortality in Madagascar during the period 2008-2013. However, the decline was not solely attributable to increases in bed net coverage, and nets alone were not able to eliminate most of the child mortality hazard across the island.
Innovation in the safety net: integrating community health centers through accountable care.
Lewis, Valerie A; Colla, Carrie H; Schoenherr, Karen E; Shortell, Stephen M; Fisher, Elliott S
2014-11-01
Safety net primary care providers, including community health centers, have long been isolated from mainstream health care providers. Current delivery system reforms such as Accountable Care Organizations (ACOs) may either reinforce the isolation of these providers or may spur new integration of safety net providers. This study examines the extent of community health center involvement in ACOs, as well as how and why ACOs are partnering with these safety net primary care providers. Mixed methods study pairing the cross-sectional National Survey of ACOs (conducted 2012 to 2013) with in-depth, qualitative interviews with a subset of ACOs that include community health centers (conducted 2013). One hundred and seventy-three ACOs completed the National Survey of ACOs. Executives from 18 ACOs that include health centers participated in in-depth interviews, along with leadership at eight community health centers participating in ACOs. Key survey measures include ACO organizational characteristics, care management and quality improvement capabilities. Qualitative interviews used a semi-structured interview guide. Interviews were recorded and transcribed, then coded for thematic content using NVivo software. Overall, 28% of ACOs include a community health center (CHC). ACOs with CHCs are similar to those without CHCs in organizational structure, care management and quality improvement capabilities. Qualitative results showed two major themes. First, ACOs with CHCs typically represent new relationships or formal partnerships between CHCs and other local health care providers. Second, CHCs are considered valued partners brought into ACOs to expand primary care capacity and expertise. A substantial number of ACOs include CHCs. These results suggest that rather than reinforcing segmentation of safety net providers from the broader delivery system, the ACO model may lead to the integration of safety net primary care providers.
Inferring Phylogenetic Networks Using PhyloNet.
Wen, Dingqiao; Yu, Yun; Zhu, Jiafan; Nakhleh, Luay
2018-07-01
PhyloNet was released in 2008 as a software package for representing and analyzing phylogenetic networks. At the time of its release, the main functionalities in PhyloNet consisted of measures for comparing network topologies and a single heuristic for reconciling gene trees with a species tree. Since then, PhyloNet has grown significantly. The software package now includes a wide array of methods for inferring phylogenetic networks from data sets of unlinked loci while accounting for both reticulation (e.g., hybridization) and incomplete lineage sorting. In particular, PhyloNet now allows for maximum parsimony, maximum likelihood, and Bayesian inference of phylogenetic networks from gene tree estimates. Furthermore, Bayesian inference directly from sequence data (sequence alignments or biallelic markers) is implemented. Maximum parsimony is based on an extension of the "minimizing deep coalescences" criterion to phylogenetic networks, whereas maximum likelihood and Bayesian inference are based on the multispecies network coalescent. All methods allow for multiple individuals per species. As computing the likelihood of a phylogenetic network is computationally hard, PhyloNet allows for evaluation and inference of networks using a pseudolikelihood measure. PhyloNet summarizes the results of the various analyses and generates phylogenetic networks in the extended Newick format that is readily viewable by existing visualization software.
Control of Wind Tunnel Operations Using Neural Net Interpretation of Flow Visualization Records
NASA Technical Reports Server (NTRS)
Buggele, Alvin E.; Decker, Arthur J.
1994-01-01
Neural net control of operations in a small subsonic/transonic/supersonic wind tunnel at Lewis Research Center is discussed. The tunnel and the layout for neural net control or control by other parallel processing techniques are described. The tunnel is an affordable, multiuser platform for testing instrumentation and components, as well as parallel processing and control strategies. Neural nets have already been tested on archival schlieren and holographic visualizations from this tunnel as well as recent supersonic and transonic shadowgraph. This paper discusses the performance of neural nets for interpreting shadowgraph images in connection with a recent exercise for tuning the tunnel in a subsonic/transonic cascade mode of operation. That mode was operated for performing wake surveys in connection with NASA's Advanced Subsonic Technology (AST) noise reduction program. The shadowgraph was presented to the neural nets as 60 by 60 pixel arrays. The outputs were tunnel parameters such as valve settings or tunnel state identifiers for selected tunnel operating points, conditions, or states. The neural nets were very sensitive, perhaps too sensitive, to shadowgraph pattern detail. However, the nets exhibited good immunity to variations in brightness, to noise, and to changes in contrast. The nets are fast enough so that ten or more can be combined per control operation to interpret flow visualization data, point sensor data, and model calculations. The pattern sensitivity of the nets will be utilized and tested to control wind tunnel operations at Mach 2.0 based on shock wave patterns.
The relative efficiency of nylon and cotton gill nets for taking lake trout in Lake Superior
Pycha, Richard L.
1962-01-01
The change from cotton to nylon twine for gill nets in 1949–52 resulted in a sharp increase in the efficiency of the most important gear used for taking lake trout in Lake Superior and, consequently, severely biased estimates of fishing intensity and abundance. From early May to the end of September 1961, short gangs (2000 or 4000 linear feet) of cotton and nylon nets were fished in parallel sets for lake trout. A total of 343,000 feet of gill netting was lifted. Nylon nets were 2.25 times as efficient as cotton nets for taking legal-sized fish and 2.8 times as efficient for undersized lake trout. The average lengths of legal, undersized, and all lake trout taken in nets of the two materials did not differ greatly. The percentage of the catch which was undersized (less than 1.25 lb, dressed weight) was 20.8 in nylon nets and 17.7 in cotton. The relative efficiency of cotton and nylon nets showed no trend during the season. The efficiency ratio determined in this study was closely similar to that obtained by earlier workers. Correction of estimates of fishing intensity and abundance for the greater efficiency of the nylon nets used since 1951 has not been attempted. The drastic decline of the lake trout fishery has forced fishermen to make changes in fishing practices in the past few years that cause new bias of an unknown extent to estimates of fishing intensity.
Petri net modelling of gene regulation of the Duchenne muscular dystrophy.
Grunwald, Stefanie; Speer, Astrid; Ackermann, Jörg; Koch, Ina
2008-05-01
Searching for therapeutic strategies for Duchenne muscular dystrophy, it is of great interest to understand the responsible molecular pathways downstream of dystrophin completely. For this reason we have performed real-time PCR experiments to compare mRNA expression levels of relevant genes in tissues of affected patients and controls. To bring experimental data into context with the underlying pathway, theoretical models are needed. Modelling of biological processes in the cell at higher description levels is still an open problem in the field of systems biology. In this paper, a new application of Petri net theory is presented to model gene regulatory processes of Duchenne muscular dystrophy. We have developed a Petri net model, which is based mainly on our own experimental data and literature data. We distinguish between up- and down-regulated states of gene expression. The analysis of the model comprises the computation of structural and dynamic properties with focus on a thorough T-invariant analysis, including clustering techniques and the decomposition of the network into maximal common transition sets (MCT-sets), which can be interpreted as functionally related building blocks. All possible pathways, which reflect the complex net behaviour in dependence on different gene expression patterns, are discussed. We introduce Mauritius maps of T-invariants, which enable, for example, theoretical knockout analysis. The resulting model serves as a basis for a better understanding of pathological processes, and thereby for planning the next experimental steps in the search for new therapeutic possibilities. The Petri net editor and animator Snoopy and the clustering tool PInA are freely available via http://www-dssz.informatik.tu-cottbus.de/~wwwdssz/. The Petri net models used can be accessed via http://www.tfh-berlin.de/bi/duchenne/.
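For readers unfamiliar with T-invariants: they are non-negative integer firing-count vectors x satisfying C x = 0, where C is the place-by-transition incidence matrix, so firing each transition x_t times returns the net to its starting marking. The toy sketch below enumerates them by brute force for a two-transition up/down-regulation cycle; real tools such as those cited above use dedicated algorithms, and the example net is invented for illustration.

```python
import itertools
import numpy as np

# Incidence matrix C (places x transitions) of a tiny cycle:
# t_up moves a gene token from "down" to "up", t_down moves it back.
C = np.array([
    [+1, -1],   # place: gene_up
    [-1, +1],   # place: gene_down
])

def t_invariants(C, max_firings=3):
    """Brute-force non-zero firing-count vectors x >= 0 with C @ x = 0."""
    n_transitions = C.shape[1]
    for x in itertools.product(range(max_firings + 1), repeat=n_transitions):
        if any(x) and not np.any(C @ np.array(x)):
            yield x

for inv in t_invariants(C):
    print("T-invariant:", inv)   # (1, 1) and its integer multiples
```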
Petri net controllers for distributed robotic systems
NASA Technical Reports Server (NTRS)
Lefebvre, D. R.; Saridis, George N.
1992-01-01
Petri nets are a well established modelling technique for analyzing parallel systems. When coupled with an event-driven operating system, Petri nets can provide an effective means for integrating and controlling the functions of distributed robotic applications. Recent work has shown that Petri net graphs can also serve as remarkably intuitive operator interfaces. In this paper, the advantages of using Petri nets as high-level controllers to coordinate robotic functions are outlined, the considerations for designing Petri net controllers are discussed, and simple Petri net structures for implementing an interface for operator supervision are presented. A detailed example is presented which illustrates these concepts for a sensor-based assembly application.
Orzeck-Byrnes, Natasha A; Aidasani, Sneha R; Moloney, Dana N; Nguyen, Lisa H; Park, Agnes; Hu, Lu; Langford, Aisha T; Wang, Binhuan; Sevick, Mary Ann; Rogers, Erin S
2018-01-01
Background The Mobile Insulin Titration Intervention (MITI) program helps patients with type 2 diabetes find their correct basal insulin dose without in-person care. Requiring only basic cell phone technology (text messages and phone calls), MITI is highly accessible to patients receiving care in safety-net settings. MITI was shown in a randomized controlled trial (RCT) to be efficacious at a New York City (NYC) safety-net clinic where patients often have challenges coming for in-person care. In 2016, MITI was implemented as usual care at Bellevue Hospital (the site of the original RCT) and at Gouverneur Health (a second NYC safety-net clinic) under 2 different staffing models. Objective This implementation study examined MITI’s transition into real-world settings. To understand MITI’s flexibility, generalizability, and acceptability among patients and providers, we evaluated whether MITI continued to produce positive outcomes in expanded underserved populations, outside of an RCT setting. Methods Patients enrolled in MITI received weekday text messages asking for their fasting blood glucose (FBG) values and a weekly titration call. The goal was for patients to reach their optimal insulin dose (OID), defined as either the dose of once-daily basal insulin required to achieve an FBG of 80-130 mg/dL (4.4-7.2 mmol/L) or the maximum dose of 50 units. After 12 weeks, if the OID was not reached, patients were asked to return to the clinic for in-person care and titration. MITI program outcomes, clinical outcomes, process outcomes, and patient satisfaction were assessed. Results MITI was successful at both sites, each with a different staffing model. Providers referred 170 patients to the program—129 of whom (75.9%, 129/170) were eligible. Of these, 113 (87.6%, 113/129) enrolled. Moreover, 84.1% (95/113) of patients reached their OID, and they did so in an average of 24 days. Clinical outcomes show that mean FBG levels fell from 209 mg/dL (11.6 mmol/L) to 141 mg/dL (7.8 mmol/L), P<.001. HbA1c levels fell from 11.4% (101 mmol/mol) to 10.0% (86 mmol/mol), P<.001. Process outcomes show that 90.1% of MITI’s text message prompts received a response, nurses connected with patients in 81.9% of weeks to provide titration instructions, and 85% of attending physicians made at least one referral to the MITI program. Satisfaction surveys showed that most patients felt comfortable sharing information over text and felt the texts reminded them to take their insulin, check their sugar, and make healthy food choices. Conclusions This implementation study showed MITI to have continued success after transitioning from an RCT program into real-world settings. MITI showed itself to be flexible and generalizable as it easily fits into a second site staffed by general medical clinic–registered nurses and remained acceptable to patients and staff who had high levels of engagement with the program. PMID:29555621
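Purely as an illustration of the control flow (the abstract does not give MITI's actual titration schedule, so the step sizes below are assumptions and emphatically not clinical guidance), a weekly basal-insulin titration loop toward the stated targets of FBG 80-130 mg/dL with a 50-unit cap might look like this:

```python
# Illustrative only: generic weekly titration loop, NOT the MITI protocol.

def weekly_titration(dose, mean_fbg, step_up=2, step_down=2, max_dose=50):
    """Return (new_dose, reached_oid) after one weekly review."""
    if 80 <= mean_fbg <= 130:
        return dose, True                       # optimal insulin dose reached
    if mean_fbg > 130:
        return min(dose + step_up, max_dose), False
    return max(dose - step_down, 0), False      # FBG below 80: back off

dose, weekly_fbg = 10, [210, 190, 172, 150, 128]   # made-up patient trajectory
for week, fbg in enumerate(weekly_fbg, start=1):
    dose, done = weekly_titration(dose, fbg)
    print(f"week {week}: mean FBG {fbg} mg/dL -> dose {dose} units"
          + ("  (OID reached)" if done else ""))
    if done or dose == 50:
        break
```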
Ontology-supported research on vaccine efficacy, safety and integrative biological networks.
He, Yongqun
2014-07-01
While vaccine efficacy and safety research has dramatically progressed with the methods of in silico prediction and data mining, many challenges still exist. A formal ontology is a human- and computer-interpretable set of terms and relations that represent entities in a specific domain and how these terms relate to each other. Several community-based ontologies (including Vaccine Ontology, Ontology of Adverse Events and Ontology of Vaccine Adverse Events) have been developed to support vaccine and adverse event representation, classification, data integration, literature mining of host-vaccine interaction networks, and analysis of vaccine adverse events. The author further proposes minimal vaccine information standards and their ontology representations, ontology-based linked open vaccine data and meta-analysis, an integrative One Network ('OneNet') Theory of Life, and ontology-based approaches to study and apply the OneNet theory. In the Big Data era, these proposed strategies provide a novel framework for advanced data integration and analysis of fundamental biological networks including vaccine immune mechanisms.
Surface dynamics of amorphous polymers used for high-voltage insulators.
Shemella, Philip T; Laino, Teodoro; Fritz, Oliver; Curioni, Alessandro
2011-11-24
Amorphous siloxane polymers are the backbone of high-voltage insulation materials. The natural hydrophobicity of their surface is a necessary property for avoiding leakage currents and dielectric breakdown. As these surfaces are exposed to the environment, electrical discharges or strong mechanical impact can temporarily destroy their water-repellent properties. After such events, however, a self-healing process sets in and restores the original hydrophobicity within some hours. In the present study, we investigate possible mechanisms of this restoration process. Using large-scale, all-atom molecular dynamics simulations, we show that molecules on the material surface have augmented motion that allows them to rearrange with a net polarization. The overall surface region has a net orientation that contributes to hydrophobicity, and charged groups that are placed at the surface migrate inward, away from the vacuum interface and into the bulk-like region. Our simulations provide insight into the mechanisms for hydrophobic self-recovery that repair material strength and functionality and suggest material compositions for future high-voltage insulators. © 2011 American Chemical Society
Modeling maintenance strategies with rainbow nets
NASA Astrophysics Data System (ADS)
Johnson, Allen M., Jr.; Schoenfelder, Michael A.; Lebold, David
The Rainbow net (RN) modeling technique offers a promising alternative to traditional reliability modeling techniques. RNs are evaluated through discrete event simulation. Using specialized tokens to represent systems and faults, an RN models the fault-handling behavior of an inventory of systems produced over time. In addition, a portion of the RN represents system repair and the vendor's spare part production. Various dependability parameters are measured and used to calculate the impact of four variations of maintenance strategies. Input variables are chosen to demonstrate the technique. The number of inputs allowed to vary is intentionally constrained to limit the volume of data presented and to avoid overloading the reader with complexity. If only availability data were reviewed, it is possible that the conclusion might be drawn that both strategies are about the same and therefore the cheaper strategy from the vendor's perspective may be chosen. The richer set of metrics provided by the RN simulation gives greater insight into the problem, which leads to better decisions. By using RNs, the impact of several different variables is integrated.
Structural Fingerprinting of Nanocrystals in the Transmission Electron Microscope
NASA Astrophysics Data System (ADS)
Rouvimov, Sergei; Plachinda, Pavel; Moeck, Peter
2010-03-01
Three novel strategies for the structural identification of nanocrystals in a transmission electron microscope are presented. Either a single high-resolution transmission electron microscopy image [1] or a single precession electron diffractogram (PED) [2] may be employed. PEDs from fine-grained crystal powders may also be utilized. Automation of the former two strategies is in progress and shall lead to statistically significant results on ensembles of nanocrystals. Open-access databases such as the Crystallography Open Database, which provides more than 81,500 crystal structure data sets [3], or its mainly inorganic and educational subsets [4] may be utilized. [1] http://www.scientificjournals.org/journals 2007/j/of/dissertation.htm [2] P. Moeck and S. Rouvimov, in: Drugs and the Pharmaceutical Sciences, Vol. 191, 2009, 270-313 [3] http://cod.ibt.lt, http://www.crystallography.net, http://cod.ensicaen.fr, http://nanocrystallography.org, http://nanocrystallography.net, http://journals.iucr.org/j/issues/2009/04/00/kk5039/kk5039.pdf [4] http://nanocrystallography.research.pdx.edu/CIF-searchable
The dental safety net in Connecticut.
Beazoglou, Tryfon; Heffley, Dennis; Lepowsky, Steven; Douglass, Joanna; Lopez, Monica; Bailit, Howard
2005-10-01
Many poor, medically disabled and geographically isolated populations have difficulty accessing private-sector dental care and are considered underserved. To address this problem, public- and voluntary-sector organizations have established clinics and provide care to the underserved. Collectively, these clinics are known as "the dental safety net." The authors describe the dental safety net in Connecticut and examine the capacity and efficiency of this system to provide care to the noninstitutionalized underserved population of the state. The authors describe Connecticut's dental safety net in terms of dentists, allied health staff members, operatories, patient visits and patients treated per dentist per year. The authors compare the productivity of safety-net dentists with that of private practitioners. They also estimate the capacity of the safety net to treat people enrolled in Medicaid and the State Children's Health Insurance Program. The safety net is made up of dental clinics in community health centers, hospitals, the dental school and public schools. One hundred eleven dentists, 38 hygienists and 95 dental assistants staff the clinics. Safety-net dentists have fewer patient visits and patients than do private practitioners. The Connecticut safety-net system has the capacity to treat about 28.2 percent of publicly insured patients. The dental safety net is an important community resource, and greater use of allied dental personnel could substantially improve the capacity of the system to care for the poor and other underserved populations.
Adaptive control with an expert system based supervisory level. Thesis
NASA Technical Reports Server (NTRS)
Sullivan, Gerald A.
1991-01-01
Adaptive control is presently one of the methods available which may be used to control plants with poorly modelled dynamics or time varying dynamics. Although many variations of adaptive controllers exist, a common characteristic of all adaptive control schemes is that input/output measurements from the plant are used to adjust a control law in an on-line fashion. Ideally the adjustment mechanism of the adaptive controller is able to learn enough about the dynamics of the plant from input/output measurements to effectively control the plant. In practice, problems such as measurement noise, controller saturation, and incorrect model order, to name a few, may prevent proper adjustment of the controller, and poor performance or instability may result. In this work we set out to avoid the inadequacies of procedurally implemented safety nets by introducing a two-level control scheme in which an expert system based 'supervisor' at the upper level provides all the safety net functions for an adaptive controller at the lower level. The expert system is based on a shell called IPEX (Interactive Process EXpert) that we developed specifically for the diagnosis and treatment of dynamic systems. Some of the more important functions that the IPEX system provides are: (1) temporal reasoning; (2) planning of diagnostic activities; and (3) interactive diagnosis. Also, because knowledge and control logic are separate, the incorporation of new diagnostic and treatment knowledge is relatively simple. We note that the flexibility available in the system to express diagnostic and treatment knowledge allows much greater functionality than could ever be reasonably expected from procedural implementations of safety nets. The remainder of this chapter is divided into three sections. In section 1.1 we give a detailed review of the literature in the area of supervisory systems for adaptive controllers. In particular, we describe the evolution of safety nets from simple ad hoc techniques, up to the use of expert systems for more advanced supervision capabilities.
Do evergreen and deciduous trees have different effects on net N mineralization in soil?
Mueller, Kevin E; Hobbie, Sarah E; Oleksyn, Jacek; Reich, Peter B; Eissenstat, David M
2012-06-01
Evergreen and deciduous plants are widely expected to have different impacts on soil nitrogen (N) availability because of differences in leaf litter chemistry and ensuing effects on net N mineralization (N(min)). We evaluated this hypothesis by compiling published data on net N(min) rates beneath co-occurring stands of evergreen and deciduous trees. The compiled data included 35 sets of co-occurring stands in temperate and boreal forests. Evergreen and deciduous stands did not have consistently divergent effects on net N(min) rates; net N(min) beneath deciduous trees was higher when comparing natural stands (19 contrasts), but equivalent to evergreens in plantations (16 contrasts). We also compared net N(min) rates beneath pairs of co-occurring genera. Most pairs of genera did not differ consistently, i.e., tree species from one genus had higher net N(min) at some sites and lower net N(min) at other sites. Moreover, several common deciduous genera (Acer, Betula, Populus) and deciduous Quercus spp. did not typically have higher net N(min) rates than common evergreen genera (Pinus, Picea). There are several reasons why tree effects on net N(min) are poorly predicted by leaf habit and phylogeny. For example, the amount of N mineralized from decomposing leaves might be less than the amount of N mineralized from organic matter pools that are less affected by leaf litter traits, such as dead roots and soil organic matter. Also, effects of plant traits and plant groups on net N(min) probably depend on site-specific factors such as stand age and soil type.
NASA Technical Reports Server (NTRS)
Kratochvil, D.; Bowyer, J.; Bhushan, C.; Steinnagel, K.; Kaushal, D.; Al-Kinani, G.
1983-01-01
Potential satellite-provided fixed communications services, baseline forecasts, net long haul forecasts, cost analysis, net addressable forecasts, capacity requirements, and satellite system market development are considered.
Health reform and primary care capacity: evidence from Houston/Harris County, Texas.
Begley, Charles; Le, Phuc; Lairson, David; Hanks, Jeanne; Omojasola, Anthony
2012-02-01
This study estimated the possible surge in demand for primary care among the low-income population in Houston/Harris County under the Patient Protection and Affordable Care Act, and related it to existing supply by safety-net providers. A model of the demand for primary care visits was developed based on California Health Interview Survey data and applied to the Houston/Harris County population. The current supply of primary care visits by safety-net providers was determined by a local survey. Comparisons indicate that safety-net providers in Houston/Harris County are currently meeting about 30% of the demand for primary care visits by the low-income population, and the rest are either met by private practice physicians or are unmet. Demand for primary care by this population is projected to increase by 30% under health reform leading to a drop in demand met by safety-net providers to less than 25%.
Coloured Petri Net Refinement Specification and Correctness Proof with Coq
NASA Technical Reports Server (NTRS)
Choppy, Christine; Mayero, Micaela; Petrucci, Laure
2009-01-01
In this work, we address the formalisation in Coq of refinement for symmetric nets, a subclass of coloured Petri nets. We first provide a formalisation of the net models and of their type refinement in Coq. Then the Coq proof assistant is used to prove the refinement correctness lemma. An example adapted from a protocol illustrates our work.
Tomography and generative training with quantum Boltzmann machines
NASA Astrophysics Data System (ADS)
Kieferová, Mária; Wiebe, Nathan
2017-12-01
The promise of quantum neural nets, which utilize quantum effects to model complex data sets, has made their development an aspirational goal for quantum machine learning and quantum computing in general. Here we provide methods of training quantum Boltzmann machines. Our work generalizes existing methods and provides additional approaches for training quantum neural networks that compare favorably to existing methods. We further demonstrate that quantum Boltzmann machines enable a form of partial quantum state tomography that further provides a generative model for the input quantum state. Classical Boltzmann machines are incapable of this. This verifies the long-conjectured connection between tomography and quantum machine learning. Finally, we prove that classical computers cannot simulate our training process in general unless BQP=BPP , provide lower bounds on the complexity of the training procedures and numerically investigate training for small nonstoquastic Hamiltonians.
Targeting Net Zero Energy for Military Installations (Presentation)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burman, K.
2012-05-01
Targeting Net Zero Energy for Military Installations in Kaneohe Bay, Hawaii. A net zero energy installation (NZEI) is one that produces as much energy from on-site renewable sources as it consumes. NZEI assessment provides a systematic approach to energy projects.
Hussein, Mustafa; Diez Roux, Ana V; Field, Robert I
2016-12-01
Neighborhood socioeconomic status (SES), an overall marker of neighborhood conditions, may determine residents' access to health care, independently of their own individual characteristics. It remains unclear, however, how the distinct settings where individuals seek care vary by neighborhood SES, particularly in US urban areas. With existing literature being relatively old, revealing how these associations might have changed in recent years is also timely in this US health care reform era. Using data on the Philadelphia region from 2002 to 2012, we performed multilevel analysis to examine the associations of neighborhood SES (measured as census tract median household income) with access to usual sources of primary care (physician offices, community health centers, and hospital outpatient clinics). We found no evidence that residence in a low-income (versus high-income) neighborhood was associated with poorer overall access. However, low-income neighborhood residence was associated with less reliance on physician offices [-4.40 percentage points; 95 % confidence intervals (CI) -5.80, -3.00] and greater reliance on the safety net provided by health centers [2.08; 95 % CI 1.42, 2.75] and outpatient clinics [1.61; 95 % CI 0.97, 2.26]. These patterns largely persisted over the 10 years investigated. These findings suggest that safety-net providers have continued to play an important role in ensuring access to primary care in urban, low-income communities, further underscoring the importance of supporting a strong safety net to ensure equitable access to care regardless of place of residence.
Dugassa, Sisay; Lindh, Jenny M; Lindsay, Steven W; Fillinger, Ulrike
2016-05-10
New sampling tools are needed for collecting exophilic malaria mosquitoes in sub-Saharan Africa to monitor the impact of vector control interventions. The OviART gravid trap and squares of electrocuting nets (e-nets) were recently developed under semi-field conditions for collecting oviposition site seeking Anopheles gambiae (sensu stricto) (s.s.). This study was designed to evaluate the efficacy of these traps for sampling malaria vectors under field conditions. Prior to field testing, two modifications to the prototype OviART gravid trap were evaluated by (i) increasing the surface area and volume of water in the artificial pond which forms part of the trap, and (ii) increasing the strength of the suction fan. Six sampling tools targeting gravid females (Box gravid trap, detergent-treated ponds, e-nets, insect glue-treated ponds, sticky boards and sticky floating-acetate sheets) were compared under field conditions to evaluate their relative catching performance and to select a method for comparison with the OviART gravid trap. Finally, the trapping efficacy of the OviART gravid trap and the square of e-nets was compared with a Box gravid trap during the long rainy season in three household clusters in western Kenya. The OviART gravid trap prototype's catch size was doubled by increasing the pond size [rate ratio (RR) 1.9; 95 % confidence interval (CI) 1.1-3.4] but a stronger fan did not improve the catch. The square of e-nets performed better than the other devices, collecting three times more gravid Anopheles spp. than the Box gravid trap (RR 3.3; 95 % CI 1.4-7.6). The OviART gravid trap collections were comparable to those from the e-nets and 3.3 (95 % CI 1.5-7.0) times higher than the number of An. gambiae sensu lato (s.l.) collected by the Box gravid trap. Both the OviART gravid trap and squares of e-nets collected wild gravid Anopheles gambiae (s.l.) where natural habitats were within 200-400 m of the trap. Whilst the e-nets are difficult to handle and might therefore only be useful as a research device, the OviART gravid trap presents a promising new surveillance tool. Further field testing is needed in different eco-epidemiological settings to provide recommendations for its use.
Prakash Nepal; Peter J. Ince; Kenneth E. Skog; Sun J. Chang
2012-01-01
This paper describes a set of empirical net forest growth models based on forest growing-stock density relationships for three U.S. regions (North, South, and West) and two species groups (softwoods and hardwoods) at the regional aggregate level. The growth models accurately predict historical U.S. timber inventory trends when we incorporate historical timber harvests...
Code of Federal Regulations, 2010 CFR
2010-04-01
...(g)-3T Section 1.904(g)-3T Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY (CONTINUED) INCOME TAX (CONTINUED) INCOME TAXES Income from Sources Without the United States § 1.904(g)-3T.... The rules must be applied in the order set forth in paragraphs (b) through (g) of this section. (b...
An Extensible NetLogo Model for Visualizing Message Routing Protocols
2017-08-01
the hard sciences to the social sciences to computer-generated art. NetLogo represents the world as a set of... describe the model is shown here; for the supporting methods, refer to the source code. ...if ticks - last-inject > time-to-inject [inject] if run# > #runs [stop] end... Next, we present some basic statistics collected for the
Code of Federal Regulations, 2013 CFR
2013-04-01
... tax return for year 1 reported a loss of three million dollars, which was carried to taxpayer's year 2...)(2)(iii)(B)(6) of this section to earlier taxable years in accordance with the rules set forth in § 1... section 482 and net section 482 transfer price adjustments. 1.6662-6 Section 1.6662-6 Internal Revenue...
Code of Federal Regulations, 2012 CFR
2012-04-01
... tax return for year 1 reported a loss of three million dollars, which was carried to taxpayer's year 2...)(2)(iii)(B)(6) of this section to earlier taxable years in accordance with the rules set forth in § 1... section 482 and net section 482 transfer price adjustments. 1.6662-6 Section 1.6662-6 Internal Revenue...
Code of Federal Regulations, 2014 CFR
2014-04-01
... tax return for year 1 reported a loss of three million dollars, which was carried to taxpayer's year 2...)(2)(iii)(B)(6) of this section to earlier taxable years in accordance with the rules set forth in § 1... section 482 and net section 482 transfer price adjustments. 1.6662-6 Section 1.6662-6 Internal Revenue...
Hierarchical Kohonen net for anomaly detection in network security.
Sarasamma, Suseela T; Zhu, Qiuming A; Huff, Julie
2005-04-01
A novel multilevel hierarchical Kohonen Net (K-Map) for an intrusion detection system is presented. Each level of the hierarchical map is modeled as a simple winner-take-all K-Map. One significant advantage of this multilevel hierarchical K-Map is its computational efficiency. Unlike other statistical anomaly detection methods such as nearest neighbor approach, K-means clustering or probabilistic analysis that employ distance computation in the feature space to identify the outliers, our approach does not involve costly point-to-point computation in organizing the data into clusters. Another advantage is the reduced network size. We use the classification capability of the K-Map on selected dimensions of data set in detecting anomalies. Randomly selected subsets that contain both attacks and normal records from the KDD Cup 1999 benchmark data are used to train the hierarchical net. We use a confidence measure to label the clusters. Then we use the test set from the same KDD Cup 1999 benchmark to test the hierarchical net. We show that a hierarchical K-Map in which each layer operates on a small subset of the feature space is superior to a single-layer K-Map operating on the whole feature space in detecting a variety of attacks in terms of detection rate as well as false positive rate.
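A rough feel for the winner-take-all training step can be given with a small sketch. The Python below is a single-layer toy on synthetic data, not the authors' multilevel system or the KDD Cup 1999 benchmark, and flagging by quantization error here is a simplification of the paper's confidence-based cluster labelling.

```python
# A minimal single-layer winner-take-all K-Map sketch (illustrative only):
# nodes compete for each record, only the winner moves toward it, and test
# records far from every learned node are flagged as potential anomalies.
import numpy as np

rng = np.random.default_rng(0)

def train_kmap(data, n_nodes=6, epochs=20, lr=0.5):
    weights = data[rng.choice(len(data), n_nodes, replace=False)].copy()
    for epoch in range(epochs):
        rate = lr * (1.0 - epoch / epochs)               # decaying learning rate
        for x in rng.permutation(data):
            winner = np.argmin(np.linalg.norm(weights - x, axis=1))
            weights[winner] += rate * (x - weights[winner])  # update winner only
    return weights

def quantization_error(data, weights):
    return np.array([np.min(np.linalg.norm(weights - x, axis=1)) for x in data])

normal = rng.normal(0.0, 1.0, size=(500, 3))             # stand-in for selected feature dims
weights = train_kmap(normal)
threshold = np.percentile(quantization_error(normal, weights), 99)

test = np.vstack([normal[:5], rng.normal(6.0, 1.0, size=(5, 3))])  # 5 normal, 5 odd records
flags = quantization_error(test, weights) > threshold
print(flags)                                             # the odd records exceed the threshold
```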
ASSESSING ACCURACY OF NET CHANGE DERIVED FROM LAND COVER MAPS
Net change derived from land-cover maps provides important descriptive information for environmental monitoring and is often used as an input or explanatory variable in environmental models. The sampling design and analysis for assessing net change accuracy differ from traditio...
Limitations of shallow nets approximation.
Lin, Shao-Bo
2017-10-01
In this paper, we aim at analyzing the approximation abilities of shallow networks in reproducing kernel Hilbert spaces (RKHSs). We prove that there is a probability measure such that the achievable lower bound for approximating by shallow nets can be realized for all functions in balls of reproducing kernel Hilbert space with high probability, which differs from the classical minimax approximation error estimates. This result, together with the existing approximation results for deep nets, shows the limitations of shallow nets and provides a theoretical explanation of why deep nets perform better than shallow nets. Copyright © 2017 Elsevier Ltd. All rights reserved.
Net reclassification index at event rate: properties and relationships.
Pencina, Michael J; Steyerberg, Ewout W; D'Agostino, Ralph B
2017-12-10
The net reclassification improvement (NRI) is an attractively simple summary measure quantifying improvement in performance because of addition of new risk marker(s) to a prediction model. Originally proposed for settings with well-established classification thresholds, it quickly extended into applications with no thresholds in common use. Here we aim to explore properties of the NRI at event rate. We express this NRI as a difference in performance measures for the new versus old model and show that the quantity underlying this difference is related to several global as well as decision analytic measures of model performance. It maximizes the relative utility (standardized net benefit) across all classification thresholds and can be viewed as the Kolmogorov-Smirnov distance between the distributions of risk among events and non-events. It can be expressed as a special case of the continuous NRI, measuring reclassification from the 'null' model with no predictors. It is also a criterion based on the value of information and quantifies the reduction in expected regret for a given regret function, casting the NRI at event rate as a measure of incremental reduction in expected regret. More generally, we find it informative to present plots of standardized net benefit/relative utility for the new versus old model across the domain of classification thresholds. Then, these plots can be summarized with their maximum values, and the increment in model performance can be described by the NRI at event rate. We provide theoretical examples and a clinical application on the evaluation of prognostic biomarkers for atrial fibrillation. Copyright © 2016 John Wiley & Sons, Ltd.
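The threshold-at-event-rate identity described above can be checked numerically. The Python sketch below uses simulated data, and the two risk vectors stand in for fitted old and new prediction models rather than the paper's atrial fibrillation application.

```python
# A minimal numerical sketch of the quantity discussed above: at a threshold
# equal to the event rate, standardized net benefit reduces to
# sensitivity + specificity - 1, and the NRI at event rate is its difference
# between the new and old models. Data and "models" are simulated illustrations.
import numpy as np

def snb_at_event_rate(risk, y):
    rate = y.mean()                      # event rate used as the threshold
    pred = risk >= rate
    sens = pred[y == 1].mean()
    spec = (~pred)[y == 0].mean()
    return sens + spec - 1.0             # relative utility at this threshold

rng = np.random.default_rng(1)
n = 20000
x_old = rng.normal(size=n)               # marker already in the model
x_new = rng.normal(size=n)               # candidate marker being added
lin = -2.0 + 1.0 * x_old + 0.8 * x_new
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-lin)))

risk_old = 1.0 / (1.0 + np.exp(-(-2.0 + 1.0 * x_old)))   # stand-in for the old fitted model
risk_new = 1.0 / (1.0 + np.exp(-lin))                     # stand-in for the new fitted model

nri_event_rate = snb_at_event_rate(risk_new, y) - snb_at_event_rate(risk_old, y)
print(round(float(nri_event_rate), 3))    # positive: the added marker improves discrimination
```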
Yu, Hua; Jiao, Bingke; Lu, Lu; Wang, Pengfei; Chen, Shuangcheng; Liang, Chengzhi; Liu, Wei
2018-01-01
Accurately reconstructing gene co-expression networks is of great importance for uncovering the genetic architecture underlying complex and various phenotypes. The recent availability of high-throughput RNA-seq sequencing has made genome-wide detection and quantification of novel, rare and low-abundance transcripts practical. However, its potential merits in reconstructing gene co-expression networks have still not been well explored. Using massive-scale RNA-seq samples, we have designed an ensemble pipeline, called NetMiner, for building a genome-scale and high-quality Gene Co-expression Network (GCN) by integrating three frequently used inference algorithms. We constructed an RNA-seq-based GCN in rice, a monocot species. The quality of the network obtained by our method was verified and evaluated against the curated gene functional association data sets, and it clearly outperformed each single method. In addition, the powerful capability of the network for associating genes with functions and agronomic traits was shown by enrichment analysis and case studies. In particular, we demonstrated the potential value of our proposed method to predict the biological roles of unknown protein-coding genes, long non-coding RNA (lncRNA) genes and circular RNA (circRNA) genes. Our results provided a valuable and highly reliable data source to select key candidate genes for subsequent experimental validation. To facilitate identification of novel genes regulating important biological processes and phenotypes in other plants or animals, we have published the source code of NetMiner, making it freely available at https://github.com/czllab/NetMiner.
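The ensemble idea of combining several inference methods can be illustrated with a toy rank-aggregation sketch. The two scorers below (Pearson and Spearman correlation) are illustrative stand-ins, not the three algorithms NetMiner actually integrates, and the expression matrix is random toy data.

```python
# A minimal rank-aggregation sketch of an ensemble co-expression pipeline:
# each scorer ranks all gene pairs, and pairs with the best average rank are
# kept as candidate edges. Illustrative only; not NetMiner's actual methods.
import numpy as np
from scipy.stats import rankdata, spearmanr

rng = np.random.default_rng(2)
expr = rng.normal(size=(40, 6))                 # 40 samples x 6 genes (toy data)

pearson = np.corrcoef(expr, rowvar=False)       # gene-by-gene correlation matrices
spearman_rho, _ = spearmanr(expr)

def edge_ranks(mat):
    scores = np.abs(mat[np.triu_indices_from(mat, k=1)])   # one score per gene pair
    return rankdata(-scores)                    # rank 1 = strongest putative edge

combined = (edge_ranks(pearson) + edge_ranks(spearman_rho)) / 2.0
pairs = list(zip(*np.triu_indices_from(pearson, k=1)))
top = [pairs[i] for i in np.argsort(combined)[:3]]
print(top)                                      # gene index pairs with the best average rank
```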
Colored petri net modeling of small interfering RNA-mediated messenger RNA degradation.
Nickaeen, Niloofar; Moein, Shiva; Heidary, Zarifeh; Ghaisari, Jafar
2016-01-01
Mathematical modeling of biological systems is an attractive way for studying complex biological systems and their behaviors. Petri Nets, due to their ability to model systems with various levels of qualitative information, have been widely used in modeling biological systems in which enough qualitative data may not be available. These nets have been used to answer questions regarding the dynamics of different cell behaviors including the translation process. In one stage of the translation process, the RNA sequence may be degraded. In the process of degradation of the RNA sequence, small noncoding RNA molecules known as small interfering RNA (siRNA) match the target RNA sequence. As a result of this matching, the target RNA sequence is destroyed. In this context, the process of matching and destruction is modeled using Colored Petri Nets (CPNs). The model is constructed using CPNs, which allow tokens to have a value or type on them. Thus, CPN is a suitable tool to model string structures in which each element of the string has a different type. Using CPNs, long RNA and siRNA strings are modeled with a finite set of colors. The model is simulated via CPN Tools. A CPN model of the matching between RNA and siRNA strings is constructed in the CPN Tools environment. In previous studies, a network of stoichiometric equations was modeled. However, in this particular study, we modeled the mechanism behind the silencing process. Modeling this kind of mechanism provides us with a tool to examine the effects of different factors such as mutation or drugs on the process.
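The matching-and-cleavage step being modelled can be paraphrased as a small string operation. The Python sketch below is a plain string-matching illustration with invented sequences; it is not the authors' CPN model or a CPN Tools simulation.

```python
# A minimal sketch of siRNA-directed degradation as string matching: a
# "degrade" step fires only when the siRNA guide is the reverse complement of
# some window of the mRNA. Sequences below are illustrative assumptions.
COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}

def reverse_complement(seq):
    return "".join(COMPLEMENT[b] for b in reversed(seq))

def degrade(mrna, sirna_guide):
    """Return the mRNA fragments left after siRNA-directed cleavage, if any."""
    target = reverse_complement(sirna_guide)
    pos = mrna.find(target)
    if pos == -1:
        return [mrna]                                    # no match: nothing fires
    return [mrna[:pos], mrna[pos + len(target):]]        # cleaved fragments

mrna = "AUGGCUUACGGAUCCGUAA"
guide = reverse_complement("UACGGAUCC")                  # perfectly matching guide
print(degrade(mrna, guide))                              # ['AUGGCU', 'GUAA']
```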
Robotic enucleation of benign pancreatic tumors
Ore, Ana Sofia; Barrows, Courtney E.; Solis-Velasco, Monica; Shaker, Jessica
2017-01-01
Robot-assisted enucleation provides the dual benefits of a minimally-invasive technique and pancreatic parenchymal conservation to selected patients with functional pancreatic neuroendocrine tumors (F-pNETs) and serous cystadenomas. Insulinomas, the most common F-pNETs, are ideal candidates for enucleation when <2 cm given the 80% probability of being benign. Current evidence suggests enucleation for the following: benign, isolated lesions with a distance between tumor and main pancreatic duct ≥3 mm (no focal stricture or dilation), insulinomas, gastrinomas <2 cm, and nonfunctional pancreatic neuroendocrine tumors (NF-pNETs) <1–2 cm and low Ki67 mitotic index. Minimally-invasive enucleation is an imaging-dependent procedure that requires recognizable anatomic landmarks for successful completion, including tumor proximity to the pancreatic duct as well as localization relative to major structures such as the gastroduodenal artery, bile duct, and portal vein. Tumor localization often mandates intraoperative ultrasound aided by duplex studies of intratumoral blood flow and frozen section confirmation. Five patients have undergone robot-assisted enucleation at Beth Israel Deaconess Medical Center between January 2014 and January 2017 with median tumor diameter of 1.3 cm (0.9–1.7 cm) located in the pancreatic head [2] and tail [3]. Surgical indications included insulinoma [2] and NF-pNETs [3]. Median operative time was 204 min (range, 137–347 min) and estimated blood loss of 50 mL. There were no conversions to open or transfusions. Robotic enucleation is a safe and feasible technique that allows parenchymal conservation in a minimally-invasive setting, reducing operative time and length of stay with equivalent pathological outcomes compared to open surgery. PMID:29302427
Playback Station #2 for Cal Net and 5-day-recorder tapes
Eaton, Jerry P.
1978-01-01
A second system (Playback Station #2) has been set up to play back Cal Net 1" tapes and 5-day-recorder 1/2" tapes. As with the first playback system (Playback Station #1) the tapes are played back on a Bell and Howell VR3700B tape deck and the records are written out on a 16-channel direct-writing Siemens "Oscillomink." Separate reproduce heads, tape guides, and tape tension sensor rollers are required for playing back 1" tapes and 1/2" tapes, but changing these tape deck components is a simple task that requires only a few minutes. The discriminators, patch panels, selector switches, filters, time code translators, and signal conditioning circuits for the time code translators and for the tape-speed-compensation signal are all mounted in an equipment rack that stands beside the playback tape deck. Changing playback speeds (15/16 ips or 3 3/4 ips) or changing from Cal Net tapes to 5-day-recorder tapes requires only flipping a few switches and/or changing a few patch cables on the patch panel (in addition to changing the reproduce heads, etc., to change from 1" tape to 1/2" tape). For the Cal Net tapes, the system provides for playback of 9 data channels (680 Hz thru 3060 Hz plus 400 Hz) and 3 time signals (IRIG-E, IRIG-C, and WWVB) at both 15/16 ips (x1 speed) and 3 3/4 ips (x4 speed). Available modes of compensation (using either a 4688 Hz reference or a 3125 Hz reference) are subtractive, capstan, capstan plus subtractive, or no compensation.
NASA Technical Reports Server (NTRS)
1993-01-01
This is the Final Technical Report for the NetView Technical Research task. This report is prepared in accordance with Contract Data Requirements List (CDRL) item A002. NetView assistance was provided and details are presented under the following headings: NetView Management Systems (NMS) project tasks; WBAFB IBM 3090; WPAFB AMDAHL; WPAFB IBM 3084; Hill AFB; McClellan AFB AMDAHL; McClellan AFB IBM 3090; and Warner-Robins AFB.
D.P. Turner; W.D. Ritts; B.E. Law; W.B. Cohen; Z. Yan; T. Hudiburg; J.L. Campbell; M. Duane
2007-01-01
Bottom-up scaling of net ecosystem production (NEP) and net biome production (NBP) was used to generate a carbon budget for a large heterogeneous region (the state of Oregon, 2.5 × 10^5 km^2) in the Western United States. Landsat resolution (30 m) remote sensing provided the basis for mapping land cover and disturbance history...
Under the radar: community safety nets for AIDS-affected households in sub-Saharan Africa.
Foster, G
2007-01-01
Safety nets are mechanisms to mitigate the effects of poverty on vulnerable households during times of stress. In sub-Saharan Africa, extended families, together with communities, are the most effective responses enabling access to support for households facing crises. This paper reviews literature on informal social security systems in sub-Saharan Africa, analyses changes taking place in their functioning as a result of HIV/AIDS and describes community safety net components including economic associations, cooperatives, loan providers, philanthropic groups and HIV/AIDS initiatives. Community safety nets target households in greatest need, respond rapidly to crises, are cost efficient, based on local needs and available resources, involve the specialized knowledge of community members and provide financial and psycho-social support. Their main limitations are lack of material resources and reliance on unpaid labour of women. Changes have taken place in safety net mechanisms because of HIV/AIDS, suggesting the resilience of communities rather than their impending collapse. Studies are lacking that assess the value of informal community-level transfers, describe how safety nets assist the poor or analyse modifications in response to HIV/AIDS. The role of community safety nets remains largely invisible under the radar of governments, non-governmental organizations and international bodies. External support can strengthen this system of informal social security that provides poor HIV/AIDS-affected households with significant support.
The development of the Project NetWork administrative records database for policy evaluation.
Rupp, K; Driessen, D; Kornfeld, R; Wood, M
1999-01-01
This article describes the development of SSA's administrative records database for the Project NetWork return-to-work experiment targeting persons with disabilities. The article is part of a series of papers on the evaluation of the Project NetWork demonstration. In addition to 8,248 Project NetWork participants randomly assigned to receive case management services and a control group, the simulation identified 138,613 eligible nonparticipants in the demonstration areas. The output data files contain detailed monthly information on Supplemental Security Income (SSI) and Disability Insurance (DI) benefits, annual earnings, and a set of demographic and diagnostic variables. The data allow for the measurement of net outcomes and the analysis of factors affecting participation. The results suggest that it is feasible to simulate complex eligibility rules using administrative records, and create a clean and edited data file for a comprehensive and credible evaluation. The study shows that it is feasible to use administrative records data for selecting control or comparison groups in future demonstration evaluations.
Interactive Visualization and Analysis of Geospatial Data Sets - TrikeND-iGlobe
NASA Astrophysics Data System (ADS)
Rosebrock, Uwe; Hogan, Patrick; Chandola, Varun
2013-04-01
The visualization of scientific datasets is becoming an ever-increasing challenge as advances in computing technologies have enabled scientists to build high resolution climate models that have produced petabytes of climate data. To interrogate and analyze these large datasets in real-time is a task that pushes the boundaries of computing hardware and software. But integration of climate datasets with geospatial data requires a considerable amount of effort and close familiarity with various data formats and projection systems, which has prevented widespread utilization outside of the climate community. TrikeND-iGlobe is a sophisticated software tool that bridges this gap, allows easy integration of climate datasets with geospatial datasets and provides sophisticated visualization and analysis capabilities. The objective for TrikeND-iGlobe is the continued building of an open source 4D virtual globe application using NASA World Wind technology that integrates analysis of climate model outputs with remote sensing observations as well as demographic and environmental data sets. This will facilitate a better understanding of global and regional phenomena, and the impact analysis of climate extreme events. The critical aim is real-time interactive interrogation. At the data centric level the primary aim is to enable the user to interact with the data in real-time for the purpose of analysis - locally or remotely. TrikeND-iGlobe provides the basis for the incorporation of modular tools that provide extended interactions with the data, including sub-setting, aggregation, re-shaping, time series analysis methods and animation to produce publication-quality imagery. TrikeND-iGlobe may be run locally or can be accessed via a web interface supported by high-performance visualization compute nodes placed close to the data. It supports visualizing heterogeneous data formats: traditional geospatial datasets along with scientific data sets with geographic coordinates (NetCDF, HDF, etc.). It also supports multiple data access mechanisms, including HTTP, FTP, WMS, WCS, and Thredds Data Server (for NetCDF data). For scientific data, TrikeND-iGlobe supports various visualization capabilities, including animations, vector field visualization, etc. TrikeND-iGlobe is a collaborative open-source project; contributors include NASA (ARC-PX), ORNL (Oak Ridge National Laboratory), Unidata, Kansas University, CSIRO CMAR Australia and Geoscience Australia.
ERIC Educational Resources Information Center
Sederburg, William A.
2002-01-01
Using the example of Ferris State University, discusses how a "net-enhanced" university functions and offers guiding principles: serve the core activity, recognize the limits to technology, create a policy structure, provide technical infrastructure, provide personnel infrastructure, build communities, digitize, and don't duplicate. (EV)
Ohio SchoolNet. Schools on the Move.
ERIC Educational Resources Information Center
Ohio State Dept. of Education, Columbus.
SchoolNet is a state-funded partnership that will facilitate the installation of computer and communications networking technology in public schools and classrooms across Ohio and coordinate its use. SchoolNet seeks to provide Ohio students with expanded course offerings; more individualized educational opportunities; interactive learning…
Koch, Ina; Junker, Björn H; Heiner, Monika
2005-04-01
Because of the complexity of metabolic networks and their regulation, formal modelling is a useful method to improve the understanding of these systems. An essential step in network modelling is to validate the network model. Petri net theory provides algorithms and methods, which can be applied directly to metabolic network modelling and analysis in order to validate the model. The metabolism between sucrose and starch in the potato tuber is of great research interest. Even if the metabolism is one of the best studied in sink organs, it is not yet fully understood. We provide an approach for model validation of metabolic networks using Petri net theory, which we demonstrate for the sucrose breakdown pathway in the potato tuber. We start with hierarchical modelling of the metabolic network as a Petri net and continue with the analysis of qualitative properties of the network. The results characterize the net structure and give insights into the complex net behaviour.
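One validation step that Petri net theory offers is checking invariants against the incidence matrix. The sketch below does this for a two-reaction toy fragment that is an illustrative assumption, not the authors' validated sucrose-breakdown model.

```python
# A minimal sketch of P-invariant checking: a P-invariant is a weight vector y
# with y^T C = 0, i.e. a weighted token count conserved by every firing of the
# net. The two-reaction fragment below is an illustrative assumption only.
import numpy as np

places = ["Sucrose", "Glucose", "Fructose", "G6P", "ATP", "ADP"]
# Columns: invertase-like step (Suc -> Glc + Frc), hexokinase-like step (Glc + ATP -> G6P + ADP)
C = np.array([
    [-1,  0],   # Sucrose
    [ 1, -1],   # Glucose
    [ 1,  0],   # Fructose
    [ 0,  1],   # G6P
    [ 0, -1],   # ATP
    [ 0,  1],   # ADP
])

y = np.array([0, 0, 0, 0, 1, 1])   # candidate invariant: ATP + ADP is constant
print("P-invariant" if np.all(y @ C == 0) else "not an invariant")

# Candidate invariant directions can also be read off the left null space of C:
_, s, Vh = np.linalg.svd(C.T)
null_vectors = [v for v in Vh if np.allclose(C.T @ v, 0)]
print(len(null_vectors), "independent invariant directions")
```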
Nawrotzki, Raphael J.; Jiang, Leiwen
2015-01-01
Although data for the total number of international migrant flows is now available, no global dataset concerning demographic characteristics, such as the age and gender composition of migrant flows exists. This paper reports on the methods used to generate the CDM-IM dataset of age and gender specific profiles of bilateral net (not gross) migrant flows. We employ raw data from the United Nations Global Migration Database and estimate net migrant flows by age and gender between two time points around the year 2000, accounting for various demographic processes (fertility, mortality). The dataset contains information on 3,713 net migrant flows. Validation analyses against existing data sets and the historical, geopolitical context demonstrate that the CDM-IM dataset is of reasonably high quality. PMID:26692590
Small town health care safety nets: report on a pilot study.
Taylor, Pat; Blewett, Lynn; Brasure, Michelle; Call, Kathleen Thiede; Larson, Eric; Gale, John; Hagopian, Amy; Hart, L Gary; Hartley, David; House, Peter; James, Mary Katherine; Ricketts, Thomas
2003-01-01
Very little is known about the health care safety net in small towns, especially in towns where there is no publicly subsidized safety-net health care. This pilot study of the primary care safety net in 7 such communities was conducted to start building knowledge about the rural safety net. Interviews were conducted and secondary data collected to assess the community need for safety-net care, the health care safety-net role of public officials, and the availability of safety-net care at private primary care practices and its financial impact on these practices. An estimated 20% to 40% of the people in these communities were inadequately insured and needed access to affordable health care, and private primary care practices in most towns played an important role in making primary care available to them. Most of the physician practices were owned or subsidized by a hospital or regional network, though not explicitly to provide charity care. It is likely this ownership or support enabled the practices to sustain a higher level of charity care than would have been possible otherwise. In the majority of communities studied, the leading public officials played no role in ensuring access to safety-net care. State and national government policy makers should consider subsidy programs for private primary care practices that attempt to meet the needs of the inadequately insured in the many rural communities where no publicly subsidized primary safety-net care is available. Subsidies should be directed to physicians in primary care shortage areas who provide safety-net care; this will improve safety-net access and, at the same time, improve physician retention by bolstering physician incomes. Options include enhanced Medicare physician bonuses and grants or tax credits to support income-related sliding fee scales.
Web-based visualization of gridded data sets using OceanBrowser
NASA Astrophysics Data System (ADS)
Barth, Alexander; Watelet, Sylvain; Troupin, Charles; Beckers, Jean-Marie
2015-04-01
OceanBrowser is a web-based visualization tool for gridded oceanographic data sets. Those data sets are typically four-dimensional (longitude, latitude, depth and time). OceanBrowser allows one to visualize horizontal sections at a given depth and time to examine the horizontal distribution of a given variable. It also offers the possibility to display the results on an arbitrary vertical section. To study the evolution of the variable in time, the horizontal and vertical sections can also be animated. Vertical section can be generated by using a fixed distance from coast or fixed ocean depth. The user can customize the plot by changing the color-map, the range of the color-bar, the type of the plot (linearly interpolated color, simple contours, filled contours) and download the current view as a simple image or as Keyhole Markup Language (KML) file for visualization in applications such as Google Earth. The data products can also be accessed as NetCDF files and through OPeNDAP. Third-party layers from a web map service can also be integrated. OceanBrowser is used in the frame of the SeaDataNet project (http://gher-diva.phys.ulg.ac.be/web-vis/) and EMODNET Chemistry (http://oceanbrowser.net/emodnet/) to distribute gridded data sets interpolated from in situ observation using DIVA (Data-Interpolating Variational Analysis).
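Access to such gridded products through OPeNDAP can be scripted in a few lines. The Python sketch below uses the netCDF4 package with a placeholder URL and assumed variable names, not an actual SeaDataNet or EMODNET endpoint.

```python
# A minimal sketch of pulling a horizontal section from a gridded product over
# OPeNDAP. The URL and variable names are placeholders (assumptions), not real
# SeaDataNet/EMODNET endpoints.
from netCDF4 import Dataset   # assumes the netCDF4-python package is installed

url = "https://example.org/thredds/dodsC/some_gridded_product.nc"  # hypothetical
with Dataset(url) as nc:
    temp = nc.variables["temperature"]        # dims assumed: (time, depth, lat, lon)
    lats = nc.variables["lat"][:]
    lons = nc.variables["lon"][:]
    section = temp[0, 0, :, :]                # first time step, shallowest level
    print(section.shape, float(section.mean()))
```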
Sluydts, Vincent; Durnez, Lies; Heng, Somony; Gryseels, Charlotte; Canier, Lydie; Kim, Saorin; Van Roey, Karel; Kerkhof, Karen; Khim, Nimol; Mao, Sokny; Uk, Sambunny; Sovannaroth, Siv; Grietens, Koen Peeters; Sochantha, Tho; Menard, Didier; Coosemans, Marc
2016-10-01
Although effective topical repellents provide personal protection against malaria, whether mass use of topical repellents in addition to long-lasting insecticidal nets can contribute to a further decline of malaria is not known, particularly in areas where outdoor transmission occurs. We aimed to assess the epidemiological efficacy of a highly effective topical repellent in addition to long-lasting insecticidal nets in reducing malaria prevalence in this setting. A cluster randomised controlled trial was done in the 117 most endemic villages in Ratanakiri province, Cambodia, to assess the efficacy of topical repellents in addition to long-lasting insecticidal nets in controlling malaria in a low-endemic setting. We did a pre-trial assessment of village accessibility and excluded four villages because of their inaccessibility during the rainy season. Another 25 villages were grouped because of their proximity to each other, resulting in 98 study clusters (comprising either a single village or multiple neighbouring villages). Clusters were randomly assigned (1:1) to either a control (long-lasting insecticidal nets) or intervention (long-lasting insecticidal nets plus topical repellent) study group after a restricted randomisation. All clusters received one long-lasting insecticidal net per individual, whereas those in the intervention group also received safe and effective topical repellents (picaridin KBR3023, SC Johnson, Racine, WI, USA), along with instruction and promotion of its daily use. Cross-sectional surveys of 65 randomly selected individuals per cluster were done at the beginning and end of the malaria transmission season in 2012 and 2013. The primary outcome was Plasmodium species-specific prevalence in participants obtained by real-time PCR, assessed in the intention-to-treat population. Complete safety analysis data will be published separately; any ad-hoc adverse events are reported here. This trial is registered with ClinicalTrials.gov, number NCT01663831. Of the 98 clusters that villages were split into, 49 were assigned to the control group and 49 were assigned to the intervention group. Despite having a successful distribution system, the daily use of repellents was suboptimum. No post-intervention differences in PCR Plasmodium prevalence were observed between study groups in 2012 (4·91% in the control group vs 4·86% in the intervention group; adjusted odds ratio [aOR] 1·01 [95% CI 0·60-1·70]; p=0·975) or in 2013 (2·96% in the control group vs 3·85% in the intervention group; aOR 1·31 [0·81-2·11]; p=0·266). Similar results were obtained according to Plasmodium species (1·33% of participants in the control group vs 1·10% in the intervention group were infected with Plasmodium falciparum; aOR 0·83 [0·44-1·56]; p=0·561; and 1·85% in the control group vs 2·67% in the intervention group were infected with Plasmodium vivax; aOR 1·51 [0·88-2·57]; p=0·133). 41 adverse event notifications from nine villages were received, of which 33 were classified as adverse reactions (11 of these 33 were cases of repellent abuse through oral ingestion, either accidental or not). All participants with adverse reactions fully recovered and 17 were advised to permanently stop using the repellent. Mass distribution of highly effective topical repellents in resource-sufficient conditions did not contribute to a further decline in malaria endemicity in a pre-elimination setting in the Greater Mekong subregion. Daily compliance and appropriate use of the repellents remain the main obstacle.
Bill & Melinda Gates Foundation. Copyright © 2016 The Author(s). Published by Elsevier Ltd. This is an Open Access article under the CC BY license. All rights reserved.
Management and development of local area network upgrade prototype
NASA Technical Reports Server (NTRS)
Fouser, T. J.
1981-01-01
Given the situation of having management and development users accessing a central computing facility and given the fact that these same users have the need for local computation and storage, the utilization of a commercially available networking system such as CP/NET from Digital Research provides the building blocks for communicating intelligent microsystems to file and print services. The major problems to be overcome in the implementation of such a network are the dearth of intelligent communication front-ends for the microcomputers and the lack of a rich set of management and software development tools.
Fast Response Shape Memory Effect Titanium Nickel (TiNi) Foam Torque Tubes
NASA Technical Reports Server (NTRS)
Jardine, Peter
2014-01-01
Shape Change Technologies has developed a process to manufacture net-shaped TiNi foam torque tubes that demonstrate the shape memory effect. The torque tubes dramatically reduce response time by a factor of 10. This Phase II project matured the actuator technology by rigorously characterizing the process to optimize the quality of the TiNi and developing a set of metrics to provide ISO 9002 quality assurance. A laboratory virtual instrument engineering workbench (LabVIEW™)-based, real-time control of the torsional actuators was developed. These actuators were developed with The Boeing Company for aerospace applications.
NASA Astrophysics Data System (ADS)
Ward-Garrison, C.; May, R.; Davis, E.; Arms, S. C.
2016-12-01
NetCDF is a set of software libraries and self-describing, machine-independent data formats that support the creation, access, and sharing of array-oriented scientific data. The Climate and Forecasting (CF) metadata conventions for netCDF foster the ability to work with netCDF files in general and useful ways. These conventions include metadata attributes for physical units, standard names, and spatial coordinate systems. While these conventions have been successful in easing the use of working with netCDF-formatted output from climate and forecast models, their use for point-based observation data has been less so. Unidata has prototyped using the discrete sampling geometry (DSG) CF conventions to serve, using the THREDDS Data Server, the real-time point observation data flowing across the Internet Data Distribution (IDD). These data originate in text format reports for individual stations (e.g. METAR surface data or TEMP upper air data) and are converted and stored in netCDF files in real-time. This work discusses the experiences and challenges of using the current CF DSG conventions for storing such real-time data. We also test how parts of netCDF's extended data model can address these challenges, in order to inform decisions for a future version of CF (CF 2.0) that would take advantage of features of the netCDF enhanced data model.
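A small example of the kind of file under discussion is sketched below: a CF discrete-sampling-geometry "timeSeries" dataset written with the netCDF4-python package. The station identifiers, values, and the indexed ragged-array layout are illustrative choices, not Unidata's actual ingest code.

```python
# A minimal sketch of a CF DSG "timeSeries" file using an indexed ragged array:
# each observation row carries an index pointing to its station. Station names
# and values are invented for illustration.
import numpy as np
from netCDF4 import Dataset

with Dataset("metar_like_obs.nc", "w") as nc:
    nc.Conventions = "CF-1.8"
    nc.featureType = "timeSeries"

    nc.createDimension("station", 2)
    nc.createDimension("obs", None)           # unlimited, grows as reports arrive

    station = nc.createVariable("station_id", str, ("station",))
    station.cf_role = "timeseries_id"

    time = nc.createVariable("time", "f8", ("obs",))
    time.units = "seconds since 1970-01-01 00:00:00"
    time.standard_name = "time"

    index = nc.createVariable("stationIndex", "i4", ("obs",))
    index.instance_dimension = "station"      # links each observation to its station

    temp = nc.createVariable("air_temperature", "f4", ("obs",))
    temp.standard_name = "air_temperature"
    temp.units = "K"

    station[0], station[1] = "KDEN", "KBOS"
    time[:] = np.array([0.0, 60.0, 120.0])
    index[:] = np.array([0, 1, 0])
    temp[:] = np.array([283.1, 279.4, 283.3])
```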
Clark, S.J.; Jackson, J.R.; Lochmann, S.E.
2007-01-01
We compared shoreline seines with fyke nets in terms of their ability to sample fish species in the littoral zone of 22 floodplain lakes of the White River, Arkansas. Lakes ranged in size from less than 0.5 to 51.0 ha. Most contained large amounts of coarse woody debris within the littoral zone, thus making seining in shallow areas difficult. We sampled large lakes (>2 ha) using three fyke nets; small lakes (<2 ha) were sampled using two fyke nets. Fyke nets were set for 24 h. Large lakes were sampled with an average of 11 seine hauls/ lake and small lakes were sampled with an average of 3 seine hauls/lake, but exact shoreline seining effort varied among lakes depending on the amount of open shoreline. Fyke nets collected more fish and produced greater species richness and diversity measures than did seining. Species evenness was similar for the two gear types. Two species were unique to seine samples, whereas 13 species and 3 families were unique to fyke-net samples. Although fyke nets collected more fish and more species than did shoreline seines, neither gear collected all the species present in the littoral zone of floodplain lakes. These results confirm the need for a multiple-gear approach to fully characterize the littoral fish assemblages in floodplain lakes. ?? Copyright by the American Fisheries Society 2007.
A comparison of lead lengths for mini-fyke nets to sample age-0 gar species
Long, James M.; Snow, Richard A.; Patterson, Chas P.
2016-01-01
Mini-fyke nets are often used to sample small-bodied fishes in shallow (<1 m depth) water, especially in vegetated shoreline habitats where seines are ineffective. Recent interest in gar (Lepisosteidae) ecology and conservation led us to explore the use of mini-fyke nets to capture age-0 gar and specifically how capture is affected by lead length of the fyke net. In the summers of 2012, 2013, and 2015, mini-fyke nets with two different lead lengths (4.57 m and 9.14 m) were set at random sites in backwaters and coves of the Red River arm of Lake Texoma, Oklahoma. Mean CPUE (catch-per-unit-effort; number per net night) was significantly lower for mini-fyke nets with short leads (0.52) compared to those with long leads (1.51). Additionally, Spotted Gar (Lepisosteus oculatus) were captured at a higher rate than the other three gar species present in Lake Texoma, although this could have been an artifact of sampling location. We found that differences in length-frequency of captured gar between gear types were nearly significant, with total length ranging from 47 mm to 590 mm. Mini-fyke nets with longer leads increased the efficiency of sampling for age-0 gar by increasing catch rate without affecting estimates of other population parameters and appear to be useful for this purpose.
NetCDF-U - Uncertainty conventions for netCDF datasets
NASA Astrophysics Data System (ADS)
Bigagli, Lorenzo; Nativi, Stefano; Domenico, Ben
2013-04-01
To facilitate the automated processing of uncertain data (e.g. uncertainty propagation in modeling applications), we have proposed a set of conventions for expressing uncertainty information within the netCDF data model and format: the NetCDF Uncertainty Conventions (NetCDF-U). From a theoretical perspective, it can be said that no dataset is a perfect representation of the reality it purports to represent. Inevitably, errors arise from the observation process, including the sensor system and subsequent processing, differences in scales of phenomena and the spatial support of the observation mechanism, lack of knowledge about the detailed conversion between the measured quantity and the target variable. This means that, in principle, all data should be treated as uncertain. The most natural representation of an uncertain quantity is in terms of random variables, with a probabilistic approach. However, it must be acknowledged that almost all existing data resources are not treated in this way. Most datasets come simply as a series of values, often without any uncertainty information. If uncertainty information is present, then it is typically within the metadata, as a data quality element. This is typically a global (dataset wide) representation of uncertainty, often derived through some form of validation process. Typically, it is a statistical measure of spread, for example the standard deviation of the residuals. The introduction of a mechanism by which such descriptions of uncertainty can be integrated into existing geospatial applications is considered a practical step towards a more accurate modeling of our uncertain understanding of any natural process. Given the generality and flexibility of the netCDF data model, conventions on naming, syntax, and semantics have been adopted by several communities of practice, as a means of improving data interoperability. Some of the existing conventions include provisions on uncertain elements and concepts, but, to our knowledge, no general convention on the encoding of uncertainty has been proposed, to date. In particular, the netCDF Climate and Forecast Conventions (NetCDF-CF), a de-facto standard for a large amount of data in Fluid Earth Sciences, mention the issue and provide limited support for uncertainty representation. NetCDF-U is designed to be fully compatible with NetCDF-CF, where possible adopting the same mechanisms (e.g. using the same attributes name with compatible semantics). The rationale for this is that a probabilistic description of scientific quantities is a crosscutting aspect, which may be modularized (note that a netCDF dataset may be compliant with more than one convention). The scope of NetCDF-U is to extend and qualify the netCDF classic data model (also known as netCDF3), to capture the uncertainty related to geospatial information encoded in that format. In the future, a netCDF4 approach for uncertainty encoding will be investigated. The NetCDF-U Conventions have the following rationale: • Compatibility with netCDF-CF Conventions 1.5. • Human-readability of conforming datasets structure. • Minimal difference between certain/agnostic and uncertain representations of data (e.g. with respect to dataset structure). NetCDF-U is based on a generic mechanism for annotating netCDF data variables with probability theory semantics. The Uncertainty Markup Language (UncertML) 2.0 is used as a controlled conceptual model and vocabulary for NetCDF-U annotations. 
The proposed mechanism anticipates generalized support for semantic annotations in netCDF. NetCDF-U defines syntactical conventions for encoding samples, summary statistics, and distributions, along with mechanisms for expressing dependency relationships among variables. The conventions were accepted as an Open Geospatial Consortium (OGC) Discussion Paper (OGC 11-163); related discussions are conducted on a public forum hosted by the OGC. NetCDF-U may have implications for future work directed at communicating geospatial data provenance and uncertainty in contexts other than netCDF. The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under Grant Agreement n° 248488.
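As an illustrative sketch only (not the normative NetCDF-U encoding), the snippet below writes a data variable plus an ancillary variable holding per-value standard deviations, linked CF-style and tagged with an UncertML concept URI. The file name, variable names, and the `ref` attribute spelling are assumptions for illustration; consult OGC 11-163 for the actual convention.

```python
# Sketch: annotating a netCDF variable with an uncertainty companion variable.
# Attribute names below are illustrative assumptions, not the official NetCDF-U names.
import numpy as np
from netCDF4 import Dataset

with Dataset("sst_with_uncertainty.nc", "w", format="NETCDF3_CLASSIC") as ds:
    ds.Conventions = "CF-1.5"          # NetCDF-U aims to stay compatible with CF 1.5
    ds.createDimension("time", 4)

    sst = ds.createVariable("sea_surface_temperature", "f4", ("time",))
    sst.units = "K"
    sst.ancillary_variables = "sea_surface_temperature_sd"   # CF-style link to the uncertainty
    sst[:] = np.array([290.1, 290.4, 291.0, 290.7], dtype="f4")

    sd = ds.createVariable("sea_surface_temperature_sd", "f4", ("time",))
    sd.units = "K"
    # Hypothetical annotation pointing at the UncertML 2.0 vocabulary:
    sd.ref = "http://www.uncertml.org/statistics/standard-deviation"
    sd[:] = np.array([0.3, 0.3, 0.4, 0.3], dtype="f4")
```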
Kilian, Albert; Byamukama, Wilson; Pigeon, Olivier; Atieli, Francis; Duchon, Stephan; Phan, Chi
2008-01-01
Background In order to evaluate whether criteria for LLIN field performance (phase III) set by the WHO Pesticide Evaluation Scheme are met, first and second generations of one of these products, PermaNet®, a polyester net using the coating technology, were tested. Methods A randomized, double-blinded study design was used, comparing LLIN to conventionally treated nets and following LLIN for three years under regular household use in rural conditions. Primary outcome measures were deltamethrin residue and bioassay performance (60-minute knock-down and 24-hour mortality after a three-minute exposure) using a strain of Anopheles gambiae s.s. sensitive to pyrethroid insecticides. Results Baseline concentration of deltamethrin was within targets for all net types but was rapidly lost in conventionally treated nets and first generation PermaNet®, with medians of 0.7 and 2.5 mg/m2 after six months, respectively. In contrast, second generation PermaNet® retained insecticide well and had 41.5% of the baseline dose after 36 months (28.7 mg/m2). Similarly, vector mortality and knock-down dropped to 18% and 70% respectively for first generation LLIN after six months but remained high (88.5% and 97.8% respectively) for second generation PermaNet® after 36 months of follow-up, at which time 90.0% of nets had either a knock-down rate ≥ 95% or mortality rate ≥ 80%. Conclusion Second generation PermaNet® showed excellent results after three years of field use and fulfilled the WHOPES criteria for LLIN. Loss of insecticide from LLIN using coating technology under field conditions was influenced far more by factors associated with handling than by washing. PMID:18355408
Complex versus simple models: ion-channel cardiac toxicity prediction.
Mistry, Hitesh B
2018-01-01
There is growing interest in applying detailed mathematical models of the heart to ion-channel related cardiac toxicity prediction. However, there is debate as to whether such complex models are required. Here, the predictive performance of two established large-scale biophysical cardiac models and a simple linear model, Bnet, was assessed. Three ion-channel data-sets were extracted from the literature. Each compound was assigned a cardiac risk category using two different classification schemes based on information within CredibleMeds. The predictive performance of each model within each data-set for each classification scheme was assessed via leave-one-out cross validation. Overall, the Bnet model performed as well as the leading cardiac models on two of the data-sets and outperformed both cardiac models on the most recent one. These results highlight the importance of benchmarking complex versus simple models and also encourage the development of simple models.
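A minimal sketch of the evaluation protocol described (not the paper's Bnet model or data): a plain linear classifier over three hypothetical ion-channel block features, scored by leave-one-out cross validation on synthetic compounds.

```python
# Leave-one-out cross validation of a simple linear classifier on hypothetical
# ion-channel features (e.g. fractional block of hERG, Nav1.5, Cav1.2 at a fixed
# concentration). Data are synthetic; feature names and coefficients are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(42)
n_compounds = 40
X = rng.uniform(0.0, 1.0, size=(n_compounds, 3))               # channel-block features
risk = (X[:, 0] - 0.5 * X[:, 1] + 0.1 * rng.normal(size=n_compounds) > 0.3).astype(int)

model = LogisticRegression()
scores = cross_val_score(model, X, risk, cv=LeaveOneOut())     # one held-out compound per fold
print(f"LOO accuracy: {scores.mean():.2f} over {len(scores)} folds")
```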
Tuot, Delphine S; McCulloch, Charles E; Velasquez, Alexandra; Schillinger, Dean; Hsu, Chi-Yuan; Handley, Margaret; Powe, Neil R
2018-04-23
Many individuals with chronic kidney disease (CKD) do not receive guideline-concordant care. We examined the impact of a team-based primary care CKD registry on clinical measures and processes of care among patients with CKD cared for in a public safety-net health care delivery system. Pragmatic trial of a CKD registry versus a usual-care registry for 1 year. Primary care providers (PCPs) and their patients with CKD in a safety-net primary care setting in San Francisco. The CKD registry identified at point of care all patients with CKD, those with blood pressure (BP)>140/90mmHg, those without angiotensin-converting enzyme (ACE) inhibitor/angiotensin receptor blocker (ARB) prescription, and those without albuminuria quantification in the past year. It also provided quarterly feedback pertinent to these metrics to promote "outreach" to patients with CKD. The usual-care registry provided point-of-care cancer screening and immunization data. Changes in systolic BP at 12 months (primary outcome), proportion of patients with BP control, prescription of ACE inhibitors/ARBs, quantification of albuminuria, severity of albuminuria, and estimated glomerular filtration rate. The patient population (n=746) had a mean age of 56.7±12.1 (standard deviation) years, was 53% women, and was diverse (8% non-Hispanic white, 35.7% black, 24.5% Hispanic, and 24.4% Asian). Randomization to the CKD registry (30 PCPs, 285 patients) versus the usual-care registry (49 PCPs, 461 patients) was associated with 2-fold greater odds of ACE inhibitor/ARB prescription (adjusted OR, 2.25; 95% CI, 1.45-3.49) and albuminuria quantification (adjusted OR, 2.44; 95% CI, 1.38-4.29) during the 1-year study period. Randomization to the CKD registry was not associated with changes in systolic BP, proportion of patients with uncontrolled BP, or degree of albuminuria or estimated glomerular filtration rate. Potential misclassification of CKD; missing baseline medication data; limited to study of a public safety-net health care system. A team-based safety-net primary care CKD registry did not improve BP parameters, but led to greater albuminuria quantification and more ACE inhibitor/ARB prescriptions after 1 year. Adoption of team-based CKD registries may represent an important step in translating evidence into practice for CKD management. Copyright © 2018 National Kidney Foundation, Inc. Published by Elsevier Inc. All rights reserved.
Martiník, Ivo
2015-01-01
Rich media describes a broad range of digital interactive media that is increasingly used on the Internet and in support of education. Last year, a special pilot audiovisual lecture room was built as part of the MERLINGO (MEdia-rich Repository of LearnING Objects) project. It contains all the elements of a modern lecture room intended for recording presentations based on rich-media technologies and publishing them online or on demand, with all of its elements accessible in automated mode, including automatic editing. Property-preserving Petri net process algebras (PPPA) were designed for the specification and verification of Petri net processes. PPPA does not need to verify the composition of Petri net processes because all of its algebraic operators preserve the specified set of properties. In this paper, the original PPPA are significantly generalized for the newly introduced class of SNT Petri process and agent nets. The PLACE-SUBST and ASYNC-PROC algebraic operators are defined for this class of Petri nets and their chosen properties are proved. The theory of SNT Petri process and agent nets was then applied to the design, verification, and implementation of the programming system providing the pilot audiovisual lecture room functionality.
Galvin, Kathleen T; Petford, Nick; Ajose, Frances; Davies, Dai
2011-01-01
Background: The effectiveness of malaria control programs is determined by an array of complex factors, including the acceptability and sustained use of preventative measures such as the bed net. A small-scale exploratory study was conducted in several locations in the Niger Delta region, Nigeria, to discover barriers against the use of bed nets, in the context of a current drive to scale up net use in Nigeria. Methods: A qualitative approach with a convenience sample was used. One-to-one interviews with mostly male adult volunteers were undertaken, which explored typical living and sleeping arrangements and perceptions about, and barriers against, the use of the mosquito-prevention bed net. Results: Several key issues emerged from the qualitative data. Bed nets were not reported as widely used in this small sample. The reasons reported for lack of use included issues of convenience, especially net set-up and dismantling; potential hazard and safety concerns; issues related to typical family composition and nature of accommodation; humid weather conditions; and perceptions of cost and effectiveness. Most barriers to net use concerned everyday practical living and sleeping arrangements and perceptions about comfort. Interviewees were aware of malaria infection risks, but several also indicated certain beliefs that were barriers to net use. Conclusions: Successful control of malaria and scale-up of insecticide-treated net coverage relies on community perceptions and practice. This small study has illuminated a number of important everyday-life issues that remain barriers to sustained net use, and has clarified further questions to be considered in net design and in future research studies. The study highlights the need for further research on the human concerns that contribute to sustained use of nets or, conversely, present significant barriers to their use. PMID:21544249
Neutrophil extracellular traps - the dark side of neutrophils.
Sørensen, Ole E; Borregaard, Niels
2016-05-02
Neutrophil extracellular traps (NETs) were discovered as extracellular strands of decondensed DNA in complex with histones and granule proteins, which were expelled from dying neutrophils to ensnare and kill microbes. NETs are formed during infection in vivo by mechanisms different from those originally described in vitro. Citrullination of histones by peptidyl arginine deiminase 4 (PAD4) is central for NET formation in vivo. NETs may spur formation of autoantibodies and may also serve as scaffolds for thrombosis, thereby providing a link among infection, autoimmunity, and thrombosis. In this review, we present the mechanisms by which NETs are formed and discuss the physiological and pathophysiological consequences of NET formation. We conclude that NETs may be of more importance in autoimmunity and thrombosis than in innate immune defense.
Modeling of Biometric Identification System Using the Colored Petri Nets
NASA Astrophysics Data System (ADS)
Petrosyan, G. R.; Ter-Vardanyan, L. A.; Gaboutchian, A. V.
2015-05-01
In this paper we present a model of a biometric identification system expressed as a Petri Net. Petri Nets, as a graphical and mathematical tool, provide a uniform environment for modelling, formal analysis, and design of discrete event systems. The main objective of this paper is to introduce the fundamental concepts of Petri Nets to researchers and practitioners who are involved in modelling and analysing biometric identification systems, as well as to those who may potentially become involved in these areas. In addition, the paper introduces high-level Petri Nets in the form of Colored Petri Nets (CPN). The Colored Petri Net model describes the identification process much more simply.
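A minimal "token game" sketch, assuming a toy identification step (sample plus template yields a match decision); this is not the authors' CPN model, and the place names, token colors, and threshold are hypothetical.

```python
# Minimal Petri-net token game: places hold (colored) tokens and a transition fires
# only when every input place holds a token. Toy sketch, not the paper's model.
from collections import defaultdict

places = defaultdict(list)
places["captured_sample"].append({"subject": "A", "quality": 0.91})   # colored tokens
places["stored_template"].append({"subject": "A"})

def enabled(inputs):
    return all(places[p] for p in inputs)

def fire_match():
    """Transition: consume a sample and a template, produce a decision token."""
    if not enabled(["captured_sample", "stored_template"]):
        return False
    sample = places["captured_sample"].pop()
    template = places["stored_template"].pop()
    decision = sample["subject"] == template["subject"] and sample["quality"] > 0.8
    places["decision"].append({"accepted": decision})
    return True

while fire_match():
    pass
print(places["decision"])   # -> [{'accepted': True}]
```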
QuarkNet: A Unique and Transformative Physics Education Program
ERIC Educational Resources Information Center
Bardeen, Marjorie; Wayne, Mitchell; Young, M. Jean
2018-01-01
The QuarkNet Collaboration has forged nontraditional relationships among particle physicists, high school teachers, and their students. QuarkNet centers are located at 50+ universities and labs across the United States and Puerto Rico. We provide professional development for teachers and create opportunities for teachers and students to engage in…
BioNet Digital Communications Framework
NASA Technical Reports Server (NTRS)
Gifford, Kevin; Kuzminsky, Sebastian; Williams, Shea
2010-01-01
BioNet v2 is a peer-to-peer middleware that enables digital communication devices to talk to each other. It provides a software development framework, standardized application, network-transparent device integration services, a flexible messaging model, and network communications for distributed applications. BioNet is an implementation of the Constellation Program Command, Control, Communications and Information (C3I) Interoperability specification, given in CxP 70022-01. The system architecture provides the necessary infrastructure for the integration of heterogeneous wired and wireless sensing and control devices into a unified data system with a standardized application interface, providing plug-and-play operation for hardware and software systems. BioNet v2 features a naming schema for mobility and coarse-grained localization information, data normalization within a network-transparent device driver framework, enabling of network communications to non-IP devices, and fine-grained application control of data subscription bandwidth usage. BioNet directly integrates Disruption Tolerant Networking (DTN) as a communications technology, enabling networked communications with assets that are only intermittently connected, including orbiting relay satellites and planetary rover vehicles.
Squeeze-SegNet: a new fast deep convolutional neural network for semantic segmentation
NASA Astrophysics Data System (ADS)
Nanfack, Geraldin; Elhassouny, Azeddine; Oulad Haj Thami, Rachid
2018-04-01
Recent research on deep convolutional neural networks has focused on improving accuracy and has produced significant advances. Although these networks were initially limited to classification tasks, with contributions from the scientific community they have become very useful in higher-level tasks such as object detection and pixel-wise semantic segmentation. Ideas from deep learning have thus advanced the state of the art in semantic segmentation accuracy, but the resulting architectures are difficult to deploy in embedded systems, as is the case for autonomous driving. We present a new deep fully convolutional neural network for pixel-wise semantic segmentation, which we call Squeeze-SegNet. The architecture follows an encoder-decoder style: we use a SqueezeNet-like encoder and a decoder formed by our proposed squeeze-decoder module and an upsampling layer that uses downsampling indices as in SegNet, and we add a deconvolution layer to produce the final multi-channel feature map. On datasets such as CamVid and Cityscapes, our net achieves SegNet-level accuracy with about 10 times fewer parameters than SegNet.
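A hedged sketch of the two ingredients named above: a SqueezeNet-style fire module and SegNet-style max-pooling with stored indices followed by max-unpooling. Channel sizes and layer counts are arbitrary illustrations, not the published Squeeze-SegNet architecture.

```python
# Fire module (squeeze 1x1, expand 1x1 + 3x3) and SegNet-style pooling/unpooling
# with stored indices. Illustrative only; not the authors' exact network.
import torch
import torch.nn as nn

class Fire(nn.Module):
    def __init__(self, c_in, squeeze, expand):
        super().__init__()
        self.squeeze = nn.Sequential(nn.Conv2d(c_in, squeeze, 1), nn.ReLU(inplace=True))
        self.e1 = nn.Conv2d(squeeze, expand, 1)
        self.e3 = nn.Conv2d(squeeze, expand, 3, padding=1)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        x = self.squeeze(x)
        return self.relu(torch.cat([self.e1(x), self.e3(x)], dim=1))

pool = nn.MaxPool2d(2, stride=2, return_indices=True)   # encoder keeps the argmax indices
unpool = nn.MaxUnpool2d(2, stride=2)                     # decoder reuses them to upsample

x = torch.randn(1, 3, 64, 64)
y = Fire(3, 16, 32)(x)        # -> (1, 64, 64, 64)
y_p, idx = pool(y)            # -> (1, 64, 32, 32)
y_u = unpool(y_p, idx)        # restores 64x64 spatial size using the stored indices
```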
Marketing netcoatings for aquaculture.
Martin, Robert J
2014-10-17
Unsustainable harvesting of natural fish stocks is driving an ever growing marine aquaculture industry. Part of the aquaculture support industry is net suppliers who provide producers with nets used in confining fish while they are grown to market size. Biofouling must be addressed in marine environments to ensure maximum product growth by maintaining water flow and waste removal through the nets. Biofouling is managed with copper and organic biocide based net coatings. The aquaculture industry provides a case study for business issues related to entry of improved fouling management technology into the marketplace. Several major hurdles hinder entry of improved novel technologies into the market. The first hurdle is due to the structure of business relationships. Net suppliers can actually cut their business profits dramatically by introducing improved technologies. A second major hurdle is financial costs of registration and demonstration of efficacy and quality product with a new technology. Costs of registration are prohibitive if only the net coatings market is involved. Demonstration of quality product requires collaboration and a team approach between formulators, net suppliers and farmers. An alternative solution is a vertically integrated business model in which the support business and product production business are part of the same company.
NASA Astrophysics Data System (ADS)
Cristofoletti, P.; Esposito, A.; Anzidei, M.
2003-04-01
This paper presents the methodologies and issues involved in using GIS techniques to manage geodetic information derived from networks in seismic and volcanic areas. Organizing and manipulating different geodetic, geological and seismic databases poses a new challenge in interpreting information that has several dimensions, including spatial and temporal variations, and the flexibility and broad range of tools available in GeoNetGIS make it an attractive platform for earthquake risk assessment. During the last decade, the use of geodetic networks based on the Global Positioning System, devoted to geophysical applications and especially to crustal deformation monitoring in seismic and volcanic areas, increased dramatically. The large amount of data provided by these networks, combined with different and independent observations, such as epicentre distributions of recent and historical earthquakes, geological and structural data, and photo interpretation of aerial and satellite images, can aid the detection and parameterization of seismogenic sources. In particular, we applied our geodetically oriented GIS to a new GPS network recently set up and surveyed in the Central Apennine region: the CA-GeoNet. GeoNetGIS is designed to analyze GPS sources in three and four dimensions and to improve crustal deformation analysis and its interpretation in relation to tectonic structures and seismicity. It manages several databases (DBMS) consisting of different classes, such as Geodesy, Topography, Seismicity, Geology, Geography and Raster Images, administered as thematic layers. GeoNetGIS represents a powerful research tool that joins the analysis of all data layers and integrates the different databases, aiding the identification of the activity of known faults or structures and suggesting new evidence of active tectonics. The new approach to data integration offered by GeoNetGIS allows us to create and deliver a wide range of maps and digital, 3-dimensional data analysis applications for geophysical users and civil defense agencies, distributing them on the World Wide Web or over wireless connections to PDA computers. It runs on a powerful PC platform under Windows 2000 Professional and is based on ESRI ArcGIS 8.2 software.
BioNetFit: a fitting tool compatible with BioNetGen, NFsim and distributed computing environments
Thomas, Brandon R.; Chylek, Lily A.; Colvin, Joshua; Sirimulla, Suman; Clayton, Andrew H.A.; Hlavacek, William S.; Posner, Richard G.
2016-01-01
Summary: Rule-based models are analyzed with specialized simulators, such as those provided by the BioNetGen and NFsim open-source software packages. Here, we present BioNetFit, a general-purpose fitting tool that is compatible with BioNetGen and NFsim. BioNetFit is designed to take advantage of distributed computing resources. This feature facilitates fitting (i.e. optimization of parameter values for consistency with data) when simulations are computationally expensive. Availability and implementation: BioNetFit can be used on stand-alone Mac, Windows/Cygwin, and Linux platforms and on Linux-based clusters running SLURM, Torque/PBS, or SGE. The BioNetFit source code (Perl) is freely available (http://bionetfit.nau.edu). Supplementary information: Supplementary data are available at Bioinformatics online. Contact: bionetgen.help@gmail.com PMID:26556387
NETopathies? Unraveling the Dark Side of Old Diseases through Neutrophils.
Mitsios, Alexandros; Arampatzioglou, Athanasios; Arelaki, Stella; Mitroulis, Ioannis; Ritis, Konstantinos
2016-01-01
Neutrophil extracellular traps (NETs) were initially described as an antimicrobial mechanism of neutrophils. Over the last decade, several lines of evidence support the involvement of NETs in a plethora of pathological conditions. Clinical and experimental data indicate that NET release constitutes a shared mechanism, which is involved in a different degree in various manifestations of non-infectious diseases. Even though the backbone of NETs is similar, there are differences in their protein load in different diseases, which represent alterations in neutrophil protein expression in distinct disorder-specific microenvironments. The characterization of NET protein load in different NET-driven disorders could be of significant diagnostic and/or therapeutic value. Additionally, it will provide further evidence for the role of NETs in disease pathogenesis, and it will enable the characterization of disorders in which neutrophils and NET-dependent inflammation are of critical importance.
Gold, M; Mittler, J; Lyons, B
2000-12-01
Studies have highlighted the tensions that can arise between Medicaid managed care organizations and safety net providers. This article seeks to identify what other states can learn from Maryland's effort to include protections for safety net providers in its Medicaid managed care program--HealthChoice. Under HealthChoice, traditional provider systems can sponsor managed care organizations, historical providers are assured of having a role, patients can self-refer and have open access to certain public health providers, and capitation rates are risk adjusted through the use of adjusted clinical groups and claims data. The article is based on a week-long site visit to Maryland in fall 1998 that was one part of a seven-state study. Maryland's experience suggests that states have much to gain in the way of "good" public policy by considering the impact of their Medicaid managed care programs on the safety net, but states should not underestimate the challenges involved in balancing the need to protect the safety net with the need to contain costs and minimize the administrative burden on providers. No amount of protection can compensate for a poorly designed or implemented program. As the health care environment continues to change, so may the need for and the types of protections change. It also may be most difficult to guarantee adequate protections to those who need it most--among relatively financially insecure providers that have a limited management infrastructure and that depend heavily on Medicaid and the state for funds to care for the uninsured.
Wilson, Justin B; Osterhaus, Matt C; Farris, Karen B; Doucette, William R; Currie, Jay D; Bullock, Tammy; Kumbera, Patty
2005-01-01
To perform a retrospective financial analysis on the implementation of a self-insured company's wellness program from the pharmaceutical care provider's perspective and conduct sensitivity analyses to estimate costs versus revenues for pharmacies without resident pharmacists, program implementation for a second employer, the second year of the program, and a range of pharmacist wages. Cost-benefit and sensitivity analyses. Self-insured employer with headquarters in Canton, N.C. 36 employees at facility in Clinton, Iowa. Pharmacist-provided cardiovascular wellness program. Costs and revenues collected from pharmacy records, including pharmacy purchasing records, billing records, and pharmacists' time estimates. All costs and revenues were calculated for the development and first year of the intervention program. Costs included initial and follow-up screening supplies, office supplies, screening/group presentation time, service provision time, documentation/preparation time, travel expenses, claims submission time, and administrative fees. Revenues included initial screening revenues, follow-up screening revenues, group session revenues, and Heart Smart program revenues. For the development and first year of Heart Smart, net benefit to the pharmacy (revenues minus costs) amounted to $2,413. All sensitivity analyses showed a net benefit. For pharmacies without a resident pharmacist, the net benefit was $106; for Heart Smart in a second employer, the net benefit was $6,024; for the second year, the projected net benefit was $6,844; factoring in a lower pharmacist salary, the net benefit was $2,905; and for a higher pharmacist salary, the net benefit was $1,265. For the development and first year of Heart Smart, the revenues of the wellness program in a self-insured company outweighed the costs.
Robinson, Thomas N.; Walters, Paul A.
1987-01-01
Computer-based health education has been employed in many settings. However, data on resultant behavior change are lacking. A randomized, controlled, prospective study was performed to test the efficacy of Stanford Health-Net in changing community health behaviors. Graduate and undergraduate students (N=1003) were randomly assigned to treatment and control conditions. The treatment group received access to Health-Net, a health promotion computer network emphasizing specific self-care and preventive strategies. Over a four-month intervention period, 26% of the treatment group used Health-Net an average of 6.4 times each (range 1 to 97). Users rated Health-Net favorably. The mean number of ambulatory medical visits decreased 22.5% more in the treatment group than in the control group (P<.05), while hospitalizations did not differ significantly between groups. In addition, perceived self-efficacy for preventing the acquisition of an STD and herpes increased 577% (P<.05) and 261% (P<.01) more, respectively, in the treatment group than in the control group. These findings suggest that access to Stanford Health-Net can result in significant health behavior change. The advantages of the network approach make it a potential model for other communities.
Microbial genotype-phenotype mapping by class association rule mining.
Tamura, Makio; D'haeseleer, Patrik
2008-07-01
Microbial phenotypes are typically due to the concerted action of multiple gene functions, yet the presence of each gene may have only a weak correlation with the observed phenotype. Hence, it may be more appropriate to examine co-occurrence between sets of genes and a phenotype (multiple-to-one) instead of pairwise relations between a single gene and the phenotype. Here, we propose an efficient class association rule mining algorithm, netCAR, in order to extract sets of COGs (clusters of orthologous groups of proteins) associated with a phenotype from COG phylogenetic profiles and a phenotype profile. netCAR takes into account the phylogenetic co-occurrence graph between COGs to restrict the hypothesis space, and uses mutual information to evaluate the biconditional relation. We examined the mining capability of pairwise and multiple-to-one association by using netCAR to extract COGs relevant to six microbial phenotypes (aerobic, anaerobic, facultative, endospore, motility and Gram negative) from 11,969 unique COG profiles across 155 prokaryotic organisms. With the same level of false discovery rate, multiple-to-one association can extract about 10 times more relevant COGs than one-to-one association. We also reveal various topologies of association networks among COGs (modules) from extracted multiple-to-one correlation rules relevant to the six phenotypes, including a well-connected network for motility, a star-shaped network for aerobic and intermediate topologies for the other phenotypes. netCAR outperforms a standard CAR mining algorithm, CARapriori, while requiring several orders of magnitude less computational time for extracting 3-COG sets. Source code of the Java implementation is available as Supplementary Material at the Bioinformatics online website, or upon request to the author. Supplementary data are available at Bioinformatics online.
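A brute-force sketch of the core scoring idea, assuming synthetic presence/absence profiles: the co-occurrence of a whole COG set (a logical AND across its members) is scored against the phenotype profile by mutual information. The real netCAR additionally prunes the search using the phylogenetic co-occurrence graph and controls the false discovery rate, which this sketch omits.

```python
# Multiple-to-one association scoring: mutual information between the joint presence
# of a COG set and a phenotype, across organisms. Profiles here are synthetic.
import numpy as np
from itertools import combinations
from sklearn.metrics import mutual_info_score

rng = np.random.default_rng(0)
n_org, n_cog = 155, 30
cogs = rng.integers(0, 2, size=(n_org, n_cog))        # presence/absence profiles (hypothetical)
phen = rng.integers(0, 2, size=n_org)                  # phenotype profile (hypothetical)

def score(cog_set):
    joint = cogs[:, cog_set].all(axis=1).astype(int)   # co-occurrence of the whole set
    return mutual_info_score(phen, joint)

best = max(combinations(range(n_cog), 3), key=score)   # exhaustive search over 3-COG sets
print("best 3-COG set:", best, "MI:", round(score(best), 4))
```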
Wilkinson, S N; Dougall, C; Kinsey-Henderson, A E; Searle, R D; Ellis, R J; Bartley, R
2014-01-15
The use of river basin modelling to guide mitigation of non-point source pollution of wetlands, estuaries and coastal waters has become widespread. To assess and simulate the impacts of alternate land use or climate scenarios on river washload requires modelling techniques that represent sediment sources and transport at the time scales of system response. Building on the mean-annual SedNet model, we propose a new D-SedNet model which constructs daily budgets of fine sediment sources, transport and deposition for each link in a river network. Erosion rates (hillslope, gully and streambank erosion) and fine sediment sinks (floodplains and reservoirs) are disaggregated from mean annual rates based on daily rainfall and runoff. The model is evaluated in the Burdekin basin in tropical Australia, where policy targets have been set for reducing sediment and nutrient loads to the Great Barrier Reef (GBR) lagoon from grazing and cropping land. D-SedNet predicted annual loads with similar performance to that of a sediment rating curve calibrated to monitored suspended sediment concentrations. Relative to a 22-year reference load time series at the basin outlet derived from a dynamic general additive model based on monitoring data, D-SedNet had a median absolute error of 68% compared with 112% for the rating curve. RMS error was slightly higher for D-SedNet than for the rating curve due to large relative errors on small loads in several drought years. This accuracy is similar to existing agricultural system models used in arable or humid environments. Predicted river loads were sensitive to ground vegetation cover. We conclude that the river network sediment budget model provides some capacity for predicting load time-series independent of monitoring data in ungauged basins, and for evaluating the impact of land management on river sediment load time-series, which is challenging across large regions in data-poor environments. © 2013. Published by Elsevier B.V. All rights reserved.
2006-01-01
Background Insecticide-treated bed nets (ITN) provide real hope for the reduction of the malaria burden across Africa. Understanding factors that determine access to ITN is crucial to debates surrounding the optimal delivery systems. The influence of homestead wealth on use of nets purchased from the retail sector is well documented, however, the competing influence of mother's education and physical access to net providers is less well understood. Methods Between December 2004 and January 2005, a random sample of 72 rural communities was selected across four Kenyan districts. Demographic, assets, education and net use data were collected at homestead, mother and child (aged < 5 years) levels. An assets-based wealth index was developed using principal components analysis, travel time to net sources was modelled using geographic information systems, and factors influencing the use of retail sector nets explored using a multivariable logistic regression model. Results Homestead heads and guardians of 3,755 children < 5 years of age were interviewed. Approximately 15% (562) of children slept under a net the night before the interview; 58% (327) of the nets used were purchased from the retail sector. Homestead wealth (adjusted OR = 10.17, 95% CI = 5.45–18.98), travel time to nearest market centres (adjusted OR = 0.51, 95% CI = 0.37–0.72) and mother's education (adjusted OR = 2.92, 95% CI = 1.93–4.41) were significantly associated with use of retail sector nets by children aged less than 5 years. Conclusion Approaches to promoting access to nets through the retail sector disadvantage poor and remote communities where mothers are less well educated. PMID:16436216
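The two analysis steps described above (an assets-based wealth index from principal components analysis, then a multivariable logistic regression of retail-net use) can be sketched as follows. All data, asset indicators and coefficients below are synthetic stand-ins, not the study's survey data.

```python
# (1) wealth index = first principal component of household asset indicators,
# (2) logistic regression of retail-net use on wealth, travel time and mother's education.
import numpy as np
import statsmodels.api as sm
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
n = 500
assets = rng.integers(0, 2, size=(n, 8)).astype(float)      # owns radio, bicycle, ... (hypothetical)
wealth = PCA(n_components=1).fit_transform(assets)[:, 0]     # first PC as the wealth index

travel_hours = rng.exponential(1.0, size=n)                  # modelled travel time to market
mother_edu = rng.integers(0, 2, size=n)                      # any schooling (hypothetical coding)
logit = 0.8 * wealth - 0.6 * travel_hours + 0.9 * mother_edu - 1.0
use_retail_net = rng.binomial(1, 1 / (1 + np.exp(-logit)))   # synthetic outcome

X = sm.add_constant(np.column_stack([wealth, travel_hours, mother_edu]))
fit = sm.Logit(use_retail_net, X).fit(disp=False)
print(np.exp(fit.params[1:]))   # adjusted odds ratios for wealth, travel time, education
```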
Worldwide estimates and bibliography of net primary productivity derived from pre-1982 publications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Esser, G.; Lieth, H.F.H.; Scurlock, J.M.O.
An extensive compilation of more than 700 field estimates of net primary productivity of natural and agricultural ecosystems worldwide was synthesized in Germany in the 1970s and early 1980s. Although the Osnabrueck data set has not been updated since the 1980s, it represents a wealth of information for use in model development and validation. This report documents the development of this data set, its contents, and its recent availability on the Internet from the Oak Ridge National Laboratory Distributed Active Archive Center for Biogeochemical Dynamics. Caution is advised in using these data, which necessarily include assumptions and conversions that may not be universally applicable to all sites.
On INM's Use of Corrected Net Thrust for the Prediction of Jet Aircraft Noise
NASA Technical Reports Server (NTRS)
McAninch, Gerry L.; Shepherd, Kevin P.
2011-01-01
The Federal Aviation Administration's (FAA) Integrated Noise Model (INM) employs a prediction methodology that relies on corrected net thrust as the sole correlating parameter between aircraft and engine operating states and aircraft noise. Thus aircraft noise measured for one set of atmospheric and aircraft operating conditions is assumed to be applicable to all other conditions as long as the corrected net thrust remains constant. This hypothesis is investigated under two primary assumptions: (1) the sound field generated by the aircraft is dominated by jet noise, and (2) the sound field generated by the jet flow is adequately described by Lighthill's theory of noise generated by turbulence.
Markets, voucher subsidies and free nets combine to achieve high bed net coverage in rural Tanzania.
Khatib, Rashid A; Killeen, Gerry F; Abdulla, Salim M K; Kahigwa, Elizeus; McElroy, Peter D; Gerrets, Rene P M; Mshinda, Hassan; Mwita, Alex; Kachur, S Patrick
2008-06-02
Tanzania has a well-developed network of commercial ITN retailers. In 2004, the government introduced a voucher subsidy for pregnant women and, in mid-2005, helped distribute free nets to under-fives in a small number of districts, including Rufiji on the southern coast, during a child health campaign. Contributions of these multiple insecticide-treated net delivery strategies, existing at the same time and place, to coverage in a poor rural community were assessed. A cross-sectional household survey of 6,331 members of 1,752 randomly selected households in 31 rural villages of the Demographic Surveillance System in Rufiji district, Southern Tanzania, was conducted in 2006. A questionnaire was administered to every consenting respondent about net use, treatment status and delivery mechanism. Net use was 62.7% overall, 87.2% amongst infants (0 to 1 year), 81.8% amongst young children (>1 to 5 years), 54.5% amongst older children (6 to 15 years) and 59.6% amongst adults (>15 years). 30.2% of all nets had been treated six months prior to interview. The biggest source of nets used by infants was purchase from the private sector with a voucher subsidy (41.8%). Half of nets used by young children (50.0%) and over a third of those used by older children (37.2%) were obtained free of charge through the vaccination campaign. The largest source of nets amongst the population overall was commercial purchase (45.1% use) and was the primary means for protecting adults (60.2% use). All delivery mechanisms, especially sale of nets at full market price, under-served the poorest, but no difference in equity was observed between voucher-subsidized and freely distributed nets. All three delivery strategies enabled a poor rural community to achieve net coverage high enough to yield both personal and community level protection for the entire population. Each of them reached its relevant target group and free nets only temporarily suppressed the net market, illustrating that in this setting these are complementary rather than mutually exclusive approaches.
The value of personal health record (PHR) systems.
Kaelber, David; Pan, Eric C
2008-11-06
Personal health records (PHRs) are a rapidly growing area of health information technology despite a lack of significant value-based assessment. Here we present an assessment of the potential value of PHR systems, looking at both costs and benefits. We examine provider-tethered, payer-tethered, and third-party PHRs, as well as idealized interoperable PHRs. An analytical model was developed that considered eight PHR application and infrastructure functions. Our analysis projects the initial and annual costs and annual benefits of PHRs to the entire US over the next 10 years. This PHR analysis shows that all forms of PHRs have initial net negative value. However, at the end of 10 years, steady-state annual net value ranges from $13 billion to -$29 billion. Interoperable PHRs provide the most value, followed by third-party PHRs and payer-tethered PHRs, which also show positive net value. Provider-tethered PHRs consistently demonstrate negative net value.
NASA Astrophysics Data System (ADS)
Kearney, Michael R.; Maino, James L.
2018-06-01
Accurate models of soil moisture are vital for solving core problems in meteorology, hydrology, agriculture and ecology. The capacity for soil moisture modelling is growing rapidly with the development of high-resolution, continent-scale gridded weather and soil data together with advances in modelling methods. In particular, the GlobalSoilMap.net initiative represents next-generation, depth-specific gridded soil products that may substantially increase soil moisture modelling capacity. Here we present an implementation of Campbell's infiltration and redistribution model within the NicheMapR microclimate modelling package for the R environment, and use it to assess the predictive power provided by the GlobalSoilMap.net product Soil and Landscape Grid of Australia (SLGA, ∼100 m) as well as the coarser resolution global product SoilGrids (SG, ∼250 m). Predictions were tested in detail against 3 years of root-zone (3-75 cm) soil moisture observation data from 35 monitoring sites within the OzNet project in Australia, with additional tests of the finalised modelling approach against cosmic-ray neutron (CosmOz, 0-50 cm, 9 sites from 2011 to 2017) and satellite (ASCAT, 0-2 cm, continent-wide from 2007 to 2009) observations. The model was forced by daily 0.05° (∼5 km) gridded meteorological data. The NicheMapR system predicted soil moisture to within experimental error for all data sets. Using the SLGA or the SG soil database, the OzNet soil moisture could be predicted with a root mean square error (rmse) of ∼0.075 m3 m-3 and a correlation coefficient (r) of 0.65 consistently through the soil profile without any parameter tuning. Soil moisture predictions based on the SLGA and SG datasets were ≈ 17% closer to the observations than when using a choropleth-derived soil data set (Digital Atlas of Australian Soils), with the greatest improvements occurring for deeper layers. The CosmOz observations were predicted with similar accuracy (r = 0.76 and rmse of ∼0.085 m3 m-3). Comparisons at the continental scale to 0-2 cm satellite data (ASCAT) showed that the SLGA/SG datasets increased model fit over simulations using the DAAS soil properties (r ∼ 0.63 & rmse 15% vs. r ∼ 0.48 & rmse 18%, respectively). Overall, our results demonstrate the advantages of using GlobalSoilMap.net products in combination with gridded weather data for modelling soil moisture at fine spatial and temporal resolution at the continental scale.
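A minimal sketch of the evaluation statistics quoted above (rmse and Pearson r between modelled and observed volumetric soil moisture), driven by a crude single-bucket water balance as a stand-in time stepper. The bucket is not Campbell's infiltration/redistribution scheme, and all parameters and forcing are hypothetical.

```python
# rmse and r between modelled and "observed" soil moisture, with a toy bucket model.
import numpy as np

def daily_bucket(rain_mm, pet_mm, capacity_mm=150.0, start_mm=75.0):
    """Very simple store: add rain, remove evapotranspiration, cap at capacity."""
    store, out = start_mm, []
    for r, e in zip(rain_mm, pet_mm):
        store = min(max(store + r - e, 0.0), capacity_mm)
        out.append(store / capacity_mm * 0.4)          # nominal conversion to m3/m3
    return np.array(out)

rng = np.random.default_rng(7)
rain = rng.gamma(0.4, 8.0, size=365)                   # hypothetical daily rainfall (mm)
pet = np.full(365, 3.0)                                # hypothetical evapotranspiration (mm)
modelled = daily_bucket(rain, pet)
observed = modelled + rng.normal(0, 0.05, size=365)    # pretend observations

rmse = float(np.sqrt(np.mean((modelled - observed) ** 2)))
r = float(np.corrcoef(modelled, observed)[0, 1])
print(f"rmse = {rmse:.3f} m3/m3, r = {r:.2f}")         # study reports ~0.075 and ~0.65
```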
Estimating the cost of no-shows and evaluating the effects of mitigation strategies.
Berg, Bjorn P; Murr, Michael; Chermak, David; Woodall, Jonathan; Pignone, Michael; Sandler, Robert S; Denton, Brian T
2013-11-01
To measure the cost of nonattendance ("no-shows") and benefit of overbooking and interventions to reduce no-shows for an outpatient endoscopy suite. We used a discrete-event simulation model to determine improved overbooking scheduling policies and examine the effect of no-shows on procedure utilization and expected net gain, defined as the difference in expected revenue based on Centers for Medicare & Medicaid Services reimbursement rates and variable costs based on the sum of patient waiting time and provider and staff overtime. No-show rates were estimated from historical attendance (18% on average, with a sensitivity range of 12%-24%). We then evaluated the effectiveness of scheduling additional patients and the effect of no-show reduction interventions on the expected net gain. The base schedule booked 24 patients per day. The daily expected net gain with perfect attendance is $4433.32. The daily loss attributed to the base case no-show rate of 18% is $725.42 (16.4% of net gain), ranging from $472.14 to $1019.29 (10.7%-23.0% of net gain). Implementing no-show interventions reduced net loss by $166.61 to $463.09 (3.8%-10.5% of net gain). The overbooking policy of 9 additional patients per day resulted in no loss in expected net gain when compared with the reference scenario. No-shows can significantly decrease the expected net gain of outpatient procedure centers. Overbooking can help mitigate the impact of no-shows on a suite's expected net gain and has a lower expected cost of implementation to the provider than intervention strategies.
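A Monte-Carlo sketch of the trade-off studied: each booked patient no-shows with some probability, attended procedures earn revenue, and patients beyond daily capacity incur overtime/waiting cost. The dollar figures and cost structure below are hypothetical illustrations, not the paper's reimbursement or cost inputs or its discrete-event simulation.

```python
# Expected daily net gain under overbooking with probabilistic no-shows (toy model).
import numpy as np

def expected_net_gain(booked, p_noshow=0.18, capacity=24, revenue=400.0,
                      overflow_cost=250.0, n_sims=100_000, seed=0):
    rng = np.random.default_rng(seed)
    attended = rng.binomial(booked, 1.0 - p_noshow, size=n_sims)
    gain = attended * revenue - np.maximum(attended - capacity, 0) * overflow_cost
    return gain.mean()

for extra in (0, 3, 6, 9):
    print(f"overbook +{extra}: expected net gain ≈ {expected_net_gain(24 + extra):.1f}")
# Overbooking recovers revenue lost to no-shows until overflow costs start to dominate.
```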
On comparison of net survival curves.
Pavlič, Klemen; Perme, Maja Pohar
2017-05-02
Relative survival analysis is a subfield of survival analysis where competing risks data are observed, but the causes of death are unknown. A first step in the analysis of such data is usually the estimation of a net survival curve, possibly followed by regression modelling. Recently, a log-rank type test for comparison of net survival curves has been introduced, and the goal of this paper is to explore its properties and put this methodological advance into the context of the field. We build on the association between the log-rank test and the univariate or stratified Cox model and show the analogy in the relative survival setting. We study the properties of the methods using both theoretical arguments and simulations. We provide an R function to enable practical usage of the log-rank type test. Both the log-rank type test and its model alternatives perform satisfactorily under the null, even if the correlation between their p-values is rather low, implying that both approaches cannot be used simultaneously. The stratified version has a higher power in case of non-homogeneous hazards, but also carries a different interpretation. The log-rank type test and its stratified version can be interpreted in the same way as the results of an analogous semi-parametric additive regression model, despite the fact that no direct theoretical link can be established between the test statistics.
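For orientation, the ordinary (observed-survival) log-rank comparison of two groups can be run with lifelines as below. The paper's test operates on net survival using population hazard tables and is provided by the authors as an R function; that variant is not reproduced here, and the data below are synthetic.

```python
# Standard two-sample log-rank test (observed survival), as a baseline illustration.
import numpy as np
from lifelines.statistics import logrank_test

rng = np.random.default_rng(3)
t_a = rng.exponential(10.0, size=120)          # group A survival times (synthetic)
t_b = rng.exponential(8.0, size=120)           # group B, slightly higher hazard
e_a = rng.binomial(1, 0.8, size=120)           # 1 = death observed, 0 = censored
e_b = rng.binomial(1, 0.8, size=120)

res = logrank_test(t_a, t_b, event_observed_A=e_a, event_observed_B=e_b)
print(res.test_statistic, res.p_value)
```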
ERIC Educational Resources Information Center
Shor, Mikhael
2003-01-01
States that making game theory relevant and accessible to students is challenging. Describes the primary goal of GameTheory.net: to provide interactive teaching tools. Indicates the site strives to unite educators from economics, political and computer science, and ecology by providing a repository of lecture notes and tests for courses using…
[Automated Assessment for Bone Age of Left Wrist Joint in Uyghur Teenagers by Deep Learning].
Hu, T H; Huo, Z; Liu, T A; Wang, F; Wan, L; Wang, M W; Chen, T; Wang, Y H
2018-02-01
To realize automated bone age assessment by applying deep learning to digital radiography (DR) image recognition of the left wrist joint in Uyghur teenagers, and to explore its practical application value in forensic bone age assessment. Pretreated X-ray films of the left wrist joint, taken from 245 male and 227 female Uyghur teenagers in the Uygur Autonomous Region aged 13.0 to 19.0 years, were chosen as subjects. AlexNet was used as a regression model for image recognition. From the total samples above, 60% of the male and female DR images of the left wrist joint were selected as the training set and 10% as the validation set. The remaining 30% were used as the test set to obtain the image recognition accuracy within error ranges of ±1.0 and ±0.7 years of the real age. The modelling results of the deep learning algorithm showed that, for error ranges of ±1.0 and ±0.7 years respectively, the accuracy on the training set was 81.4% and 75.6% in males and 80.5% and 74.8% in females. For the same error ranges, the accuracy on the test set was 79.5% and 71.2% in males and 79.4% and 66.2% in females, respectively. The combination of deep learning with bone age research on teenagers' left wrist joint has high accuracy and good feasibility, and can serve as the research basis of an automatic bone age assessment system for the other joints of the body. Copyright© by the Editorial Department of Journal of Forensic Medicine.
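A fine-tuning sketch of the setup described: torchvision's AlexNet with its final 1000-way classifier replaced by a single regression output for age, trained with a mean-squared-error loss. Image preprocessing, the wrist-joint data and the 60/10/30 split are not reproduced; tensors below are random stand-ins.

```python
# AlexNet repurposed as an age regressor (illustrative, not the paper's exact pipeline).
import torch
import torch.nn as nn
from torchvision import models

model = models.alexnet(weights=None)            # architecture only, no pretrained weights
model.classifier[6] = nn.Linear(4096, 1)        # 1000-class head -> one age value

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

images = torch.randn(8, 3, 224, 224)            # stand-in for DR wrist-joint images
ages = torch.empty(8, 1).uniform_(13.0, 19.0)   # stand-in for chronological ages

optimizer.zero_grad()
pred = model(images)
loss = loss_fn(pred, ages)
loss.backward()
optimizer.step()

# "Accuracy" as reported: fraction of predictions within ±1.0 years of the true age.
within_1yr = ((pred - ages).abs() <= 1.0).float().mean()
```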
A Study of NetCDF as an Approach for High Performance Medical Image Storage
NASA Astrophysics Data System (ADS)
Magnus, Marcone; Coelho Prado, Thiago; von Wangenhein, Aldo; de Macedo, Douglas D. J.; Dantas, M. A. R.
2012-02-01
The spread of telemedicine systems increases every day, and systems and PACS based on DICOM images have become common. This rise reflects the need to develop new storage systems that are more efficient and have lower computational costs. With this in mind, this article discusses a study of the NetCDF data format as the basic platform for storage of DICOM images. The case study compares an ordinary database, HDF5 and NetCDF for storing the medical images. Empirical results, using a real set of images, indicate that retrieving large images from NetCDF has higher latency than the other two methods. In addition, the latency is proportional to the file size, which represents a drawback for a telemedicine system characterized by a large number of large image files.
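A minimal sketch of the kind of storage/retrieval experiment described: a stack of 16-bit image frames is written into a netCDF variable and the read latency for one slice is timed. File name, array sizes and the DICOM-to-array conversion are hypothetical, and the pixel data are random.

```python
# Write image frames to netCDF, then time reading one frame back (toy benchmark).
import time
import numpy as np
from netCDF4 import Dataset

frames = np.random.randint(0, 4096, size=(32, 512, 512), dtype=np.uint16)

with Dataset("image_store.nc", "w") as ds:       # default NETCDF4 format
    ds.createDimension("frame", frames.shape[0])
    ds.createDimension("row", frames.shape[1])
    ds.createDimension("col", frames.shape[2])
    var = ds.createVariable("pixels", "u2", ("frame", "row", "col"), zlib=True)
    var[:] = frames

t0 = time.perf_counter()
with Dataset("image_store.nc") as ds:
    one_frame = ds.variables["pixels"][10, :, :]
print(f"read one 512x512 frame in {time.perf_counter() - t0:.4f} s")
```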
Processing Ocean Images to Detect Large Drift Nets
NASA Technical Reports Server (NTRS)
Veenstra, Tim
2009-01-01
A computer program processes the digitized outputs of a set of downward-looking video cameras aboard an aircraft flying over the ocean. The purpose served by this software is to facilitate the detection of large drift nets that have been lost, abandoned, or jettisoned. The development of this software and of the associated imaging hardware is part of a larger effort to develop means of detecting and removing large drift nets before they cause further environmental damage to the ocean and to shores on which they sometimes impinge. The software is capable of near-realtime processing of as many as three video feeds at a rate of 30 frames per second. After a user sets the parameters of an adjustable algorithm, the software analyzes each video stream, detects any anomaly, issues a command to point a high-resolution camera toward the location of the anomaly, and, once the camera has been so aimed, issues a command to trigger the camera shutter. The resulting high-resolution image is digitized, and the resulting data are automatically uploaded to the operator's computer for analysis.
2017-01-01
Although deep learning approaches have had tremendous success in image, video and audio processing, computer vision, and speech recognition, their applications to three-dimensional (3D) biomolecular structural data sets have been hindered by the geometric and biological complexity. To address this problem we introduce the element-specific persistent homology (ESPH) method. ESPH represents 3D complex geometry by one-dimensional (1D) topological invariants and retains important biological information via a multichannel image-like representation. This representation reveals hidden structure-function relationships in biomolecules. We further integrate ESPH and deep convolutional neural networks to construct a multichannel topological neural network (TopologyNet) for the predictions of protein-ligand binding affinities and protein stability changes upon mutation. To overcome the deep learning limitations from small and noisy training sets, we propose a multi-task multichannel topological convolutional neural network (MM-TCNN). We demonstrate that TopologyNet outperforms the latest methods in the prediction of protein-ligand binding affinities, mutation induced globular protein folding free energy changes, and mutation induced membrane protein folding free energy changes. Availability: weilab.math.msu.edu/TDL/ PMID:28749969
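An illustrative multichannel 1D CNN in the spirit of the description above: each input channel is an element-specific topological signature (e.g. a persistence summary binned along a filtration axis) and the network regresses an affinity value. The channel count, bin count and layers are hypothetical and far smaller than the published TopologyNet/MM-TCNN models; the persistent-homology feature extraction itself is not shown.

```python
# Toy multichannel 1D CNN over element-specific topological feature channels.
import torch
import torch.nn as nn

n_channels, n_bins = 8, 64                 # e.g. element pairs x filtration bins (assumed)

net = nn.Sequential(
    nn.Conv1d(n_channels, 32, kernel_size=5, padding=2), nn.ReLU(),
    nn.MaxPool1d(2),
    nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
    nn.AdaptiveAvgPool1d(1), nn.Flatten(),
    nn.Linear(64, 1),                      # predicted binding affinity (or folding ddG)
)

x = torch.randn(16, n_channels, n_bins)    # 16 hypothetical protein-ligand complexes
affinity = net(x)                          # shape: (16, 1)
```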
A Java program for LRE-based real-time qPCR that enables large-scale absolute quantification.
Rutledge, Robert G
2011-03-02
Linear regression of efficiency (LRE) introduced a new paradigm for real-time qPCR that enables large-scale absolute quantification by eliminating the need for standard curves. Developed through the application of sigmoidal mathematics to SYBR Green I-based assays, target quantity is derived directly from fluorescence readings within the central region of an amplification profile. However, a major challenge of implementing LRE quantification is the labor intensive nature of the analysis. Utilizing the extensive resources that are available for developing Java-based software, the LRE Analyzer was written using the NetBeans IDE, and is built on top of the modular architecture and windowing system provided by the NetBeans Platform. This fully featured desktop application determines the number of target molecules within a sample with little or no intervention by the user, in addition to providing extensive database capabilities. MS Excel is used to import data, allowing LRE quantification to be conducted with any real-time PCR instrument that provides access to the raw fluorescence readings. An extensive help set also provides an in-depth introduction to LRE, in addition to guidelines on how to implement LRE quantification. The LRE Analyzer provides the automated analysis and data storage capabilities required by large-scale qPCR projects wanting to exploit the many advantages of absolute quantification. Foremost is the universal perspective afforded by absolute quantification, which among other attributes, provides the ability to directly compare quantitative data produced by different assays and/or instruments. Furthermore, absolute quantification has important implications for gene expression profiling in that it provides the foundation for comparing transcript quantities produced by any gene with any other gene, within and between samples.
Whellan, David J; Reed, Shelby D; Liao, Lawrence; Gould, Stuart D; O'connor, Christopher M; Schulman, Kevin A
2007-01-15
Although heart failure disease management (HFDM) programs improve patient outcomes, the implementation of these programs has been limited because of financial barriers. We undertook the present study to understand the economic incentives and disincentives for adoption of disease management strategies from the perspectives of a physician (group), a hospital, an integrated health system, and a third-party payer. Using the combined results of a group of randomized controlled trials and a set of financial assumptions from a single academic medical center, a financial model was developed to compute the expected costs before and after the implementation of a HFDM program by 3 provider types (physicians, hospitals, and health systems), as well as the costs incurred from a payer perspective. The base-case model showed that implementation of HFDM results in a net financial loss to all potential providers of HFDM. Implementation of HFDM as described in our base-case analysis would create a net loss of US $179,549 in the first year for a physician practice, US $464,132 for an integrated health system, and US $652,643 in the first year for a hospital. Third-party payers would be able to save US $713,661 annually for the care of 350 patients with heart failure in a HFDM program. In conclusion, although HFDM programs may provide patients with improved clinical outcomes and decreased hospitalizations that save third-party payers money, limited financial incentives are currently in place for healthcare providers and hospitals to initiate these programs.
The State of Research on Racial/Ethnic Discrimination in The Receipt of Health Care
Fagan, Pebbles; Jones, Dionne; Klein, William M. P.; Boyington, Josephine; Moten, Carmen; Rorie, Edward
2012-01-01
Objectives. We conducted a review to examine current literature on the effects of interpersonal and institutional racism and discrimination occurring within health care settings on the health care received by racial/ethnic minority patients. Methods. We searched the PsychNet, PubMed, and Scopus databases for articles on US populations published between January 1, 2008 and November 1, 2011. We used various combinations of the following search terms: discrimination, perceived discrimination, race, ethnicity, racism, institutional racism, stereotype, prejudice or bias, and health or health care. Fifty-eight articles were reviewed. Results. Patient perception of discriminatory treatment and implicit provider biases were the most frequently examined topics in health care settings. Few studies examined the overall prevalence of racial/ethnic discrimination and none examined temporal trends. In general, measures used were insufficient for examining the impact of interpersonal discrimination or institutional racism within health care settings on racial/ethnic disparities in health care. Conclusions. Better instrumentation, innovative methodology, and strategies are needed for identifying and tracking racial/ethnic discrimination in health care settings. PMID:22494002
Bradley, Cathy J; Dahman, Bassam; Sabik, Lindsay M
2015-02-01
We examined whether safety net hospitals reduce the likelihood of emergency colorectal cancer (CRC) surgery in uninsured and Medicaid-insured patients. If these patients have better access to care through safety net providers, they should be less likely to undergo emergency resection relative to similar patients at non-safety-net hospitals. Using population-based data, we estimated the relationship between safety net hospitals, patient insurance status, and emergency CRC surgery. We extracted inpatient admission data from the Virginia Health Information discharge database and matched them to the Virginia Cancer Registry for patients aged 21 to 64 years who underwent a CRC resection between January 1, 1999, and December 31, 2005 (n = 5488). We differentiated between medically defined emergencies and those that originated in the emergency department (ED). For each definition of emergency surgery, we estimated linear probability models of the effect of being treated at a safety net hospital on the probability of having an emergency resection. Safety net hospitals reduce emergency surgeries among uninsured and Medicaid CRC patients. When defining emergency resections as those that involved an ED visit, these patients were 15 to 20 percentage points less likely to have an emergency resection when treated in a safety net hospital. Our results suggest that these hospitals provide a benefit, most likely through the access they afford to timely and appropriate care, to uninsured and Medicaid-insured patients relative to hospitals without a safety net mission.
Cost-outcome description of clinical pharmacist interventions in a university teaching hospital.
Gallagher, James; Byrne, Stephen; Woods, Noel; Lynch, Deirdre; McCarthy, Suzanne
2014-04-17
Pharmacist interventions are one of the pivotal parts of a clinical pharmacy service within a hospital. This study estimates the cost avoidance generated by pharmacist interventions due to the prevention of adverse drug events (ADE). The types of interventions identified are also analysed. Interventions recorded by a team of hospital pharmacists over a one-year period were included in the study. Interventions were assigned a rating score, determined by the probability that an ADE would have occurred in the absence of an intervention. These scores were then used to calculate cost avoidance. Net cost benefit and cost-benefit ratio were the primary outcomes. Categories of interventions were also analysed. A total cost avoidance of €708,221 was generated. Input costs were calculated at €81,942. This resulted in a net cost benefit of €626,279 and a cost-benefit ratio of 8.64:1. The most common type of intervention was the identification of medication omissions, followed by dosage adjustments and requests to review therapies. This study provides further evidence that pharmacist interventions provide substantial cost avoidance to the healthcare payer. There is a serious issue of patients' regular medications being omitted on transfer to an inpatient setting in Irish hospitals.
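The headline economics reduce to two lines of arithmetic, reproduced here as a check on the figures quoted above.

```python
# Figures reported in the abstract (euros).
cost_avoidance = 708_221   # avoided costs from prevented adverse drug events
input_costs = 81_942       # cost of providing the pharmacist service

net_cost_benefit = cost_avoidance - input_costs    # 626,279
cost_benefit_ratio = cost_avoidance / input_costs  # ~8.64

print(f"net cost benefit: EUR {net_cost_benefit:,}")
print(f"cost-benefit ratio: {cost_benefit_ratio:.2f}:1")
```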
NiftyNet: a deep-learning platform for medical imaging.
Gibson, Eli; Li, Wenqi; Sudre, Carole; Fidon, Lucas; Shakir, Dzhoshkun I; Wang, Guotai; Eaton-Rosen, Zach; Gray, Robert; Doel, Tom; Hu, Yipeng; Whyntie, Tom; Nachev, Parashkev; Modat, Marc; Barratt, Dean C; Ourselin, Sébastien; Cardoso, M Jorge; Vercauteren, Tom
2018-05-01
Medical image analysis and computer-assisted intervention problems are increasingly being addressed with deep-learning-based solutions. Established deep-learning platforms are flexible but do not provide specific functionality for medical image analysis, and adapting them for this domain of application requires substantial implementation effort. Consequently, there has been substantial duplication of effort and incompatible infrastructure developed across many research groups. This work presents the open-source NiftyNet platform for deep learning in medical imaging. The ambition of NiftyNet is to accelerate and simplify the development of these solutions, and to provide a common mechanism for disseminating research outputs for the community to use, adapt and build upon. The NiftyNet infrastructure provides a modular deep-learning pipeline for a range of medical imaging applications, including segmentation, regression, image generation and representation learning. Components of the NiftyNet pipeline, including data loading, data augmentation, network architectures, loss functions and evaluation metrics, are tailored to, and take advantage of, the idiosyncrasies of medical image analysis and computer-assisted intervention. NiftyNet is built on the TensorFlow framework and supports features such as TensorBoard visualization of 2D and 3D images and computational graphs by default. We present three illustrative medical image analysis applications built using the NiftyNet infrastructure: (1) segmentation of multiple abdominal organs from computed tomography; (2) image regression to predict computed tomography attenuation maps from brain magnetic resonance images; and (3) generation of simulated ultrasound images for specified anatomical poses. The NiftyNet infrastructure enables researchers to rapidly develop and distribute deep learning solutions for segmentation, regression, image generation and representation learning applications, or to extend the platform to new applications. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
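As a concrete example of the kind of segmentation-specific component such a platform bundles, the sketch below implements a generic soft Dice loss in TensorFlow. It illustrates the concept only; it is not NiftyNet's own implementation or API, and platforms of this kind typically expose such components through configuration rather than hand-written code.

```python
import tensorflow as tf

def soft_dice_loss(y_true, y_pred, eps=1e-6):
    """Generic soft Dice loss for segmentation (illustrative, not NiftyNet code).

    y_true, y_pred: float tensors of shape (batch, ..., classes) with values in [0, 1].
    """
    axes = list(range(1, len(y_pred.shape)))               # reduce over all but the batch axis
    intersection = tf.reduce_sum(y_true * y_pred, axis=axes)
    denom = tf.reduce_sum(y_true, axis=axes) + tf.reduce_sum(y_pred, axis=axes)
    dice = (2.0 * intersection + eps) / (denom + eps)
    return 1.0 - tf.reduce_mean(dice)                       # average over the batch
```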
Dennehy, Patricia; White, Mary P; Hamilton, Andrew; Pohl, Joanne M; Tanner, Clare; Onifade, Tiffiani J
2011-01-01
Objective To present a partnership-based and community-oriented approach designed to ease provider anxiety and facilitate the implementation of electronic health records (EHR) in resource-limited primary care settings. Materials and Methods The approach, referred to as partnership model, was developed and iteratively refined through the research team's previous work on implementing health information technology (HIT) in over 30 safety net practices. This paper uses two case studies to illustrate how the model was applied to help two nurse-managed health centers (NMHC), a particularly vulnerable primary care setting, implement EHR and get prepared to meet the meaningful use criteria. Results The strong focus of the model on continuous quality improvement led to eventual implementation success at both sites, despite difficulties encountered during the initial stages of the project. Discussion There has been a lack of research, particularly in resource-limited primary care settings, on strategies for abating provider anxiety and preparing them to manage complex changes associated with EHR uptake. The partnership model described in this paper may provide useful insights into the work shepherded by HIT regional extension centers dedicated to supporting resource-limited communities disproportionally affected by EHR adoption barriers. Conclusion NMHC, similar to other primary care settings, are often poorly resourced, understaffed, and lack the necessary expertise to deploy EHR and integrate its use into their day-to-day practice. This study demonstrates that implementation of EHR, a prerequisite to meaningful use, can be successfully achieved in this setting, and partnership efforts extending far beyond the initial software deployment stage may be the key. PMID:21828225
NetMHCcons: a consensus method for the major histocompatibility complex class I predictions.
Karosiene, Edita; Lundegaard, Claus; Lund, Ole; Nielsen, Morten
2012-03-01
A key role in cell-mediated immunity is dedicated to the major histocompatibility complex (MHC) molecules that bind peptides for presentation on the cell surface. Several in silico methods capable of predicting peptide binding to MHC class I have been developed. The accuracy of these methods depends on the data available characterizing the binding specificity of the MHC molecules. It has, moreover, been demonstrated that consensus methods defined as combinations of two or more different methods led to improved prediction accuracy. This plethora of methods makes it very difficult for the non-expert user to choose the most suitable method for predicting binding to a given MHC molecule. In this study, we have therefore made an in-depth analysis of combinations of three state-of-the-art MHC-peptide binding prediction methods (NetMHC, NetMHCpan and PickPocket). We demonstrate that a simple combination of NetMHC and NetMHCpan gives the highest performance when the allele in question is included in the training and is characterized by at least 50 data points with at least ten binders. Otherwise, NetMHCpan is the best predictor. When an allele has not been characterized, the performance depends on the distance to the training data. NetMHCpan has the highest performance when close neighbours are present in the training set, while the combination of NetMHCpan and PickPocket outperforms either of the two methods for alleles with more remote neighbours. The final method, NetMHCcons, is publicly available at www.cbs.dtu.dk/services/NetMHCcons , and allows the user in an automatic manner to obtain the most accurate predictions for any given MHC molecule.
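The flavour of the consensus step can be conveyed with a toy averaging of two predictors' scores for one peptide-MHC pair. This is an illustration only; the actual NetMHCcons selection rules, which depend on how well an allele is characterised in the training data, are as described in the abstract, and the scores below are hypothetical.

```python
def consensus_score(netmhc_score, netmhcpan_score):
    """Toy consensus: average the scores of two predictors for one peptide-MHC pair.

    Illustration only; NetMHCcons chooses between NetMHC, NetMHCpan, PickPocket
    and their combinations depending on allele coverage in the training data.
    """
    return 0.5 * (netmhc_score + netmhcpan_score)

# Hypothetical prediction scores (e.g. 1 - log50k-transformed affinities)
print(consensus_score(0.72, 0.66))   # 0.69
```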
Hazeldine, Jon; Harris, Phillipa; Chapple, Iain L; Grant, Melissa; Greenwood, Hannah; Livesey, Amy; Sapey, Elizabeth; Lord, Janet M
2014-08-01
Neutrophil extracellular traps (NETs) are a recently discovered addition to the defensive armamentarium of neutrophils, assisting in the immune response against rapidly dividing bacteria. Although older adults are more susceptible to such infections, no study has examined whether aging in humans influences NET formation. We report that TNF-α-primed neutrophils generate significantly more NETs than unprimed neutrophils and that lipopolysaccharide (LPS)- and interleukin-8 (IL-8)-induced NET formation exhibits a significant age-related decline. NET formation requires generation of reactive oxygen species (ROS), and this was also reduced in neutrophils from older donors identifying a mechanism for reduced NET formation. Expression of IL-8 receptors (CXCR1 and CXCR2) and the LPS receptor TLR4 was similar on neutrophils from young and old subjects, and neutrophils challenged with phorbol-12-myristate-13-acetate (PMA) showed no age-associated differences in ROS or NET production. Taken together, these data suggest a defect in proximal signalling underlies the age-related decline in NET and ROS generation. TNF-α priming involves signalling through p38 MAP kinase, but activation kinetics were comparable in neutrophils from young and old donors. In a clinical setting, we assessed the capacity of neutrophils from young and older patients with chronic periodontitis to generate NETs in response to PMA and hypochlorous acid (HOCL). Neutrophil extracellular trap generation to HOCL, but not PMA, was lower in older periodontitis patients but not in comparison with age-matched controls. Impaired NET formation is thus a novel defect of innate immunity in older adults but does not appear to contribute to the increased incidence of periodontitis in older adults. © 2014 The Authors. Aging Cell published by the Anatomical Society and John Wiley & Sons Ltd.
Russell, Cheryl L.; Sallau, Adamu; Emukah, Emmanuel; Graves, Patricia M.; Noland, Gregory S.; Ngondi, Jeremiah M.; Ozaki, Masayo; Nwankwo, Lawrence; Miri, Emmanuel; McFarland, Deborah A.; Richards, Frank O.; Patterson, Amy E.
2015-01-01
Millions of long-lasting insecticide treated nets (LLINs) have been distributed as part of the global malaria control strategy. LLIN ownership, however, does not necessarily guarantee use. Thus, even in the ideal setting in which universal coverage with LLINs has been achieved, maximal malaria protection will only be achieved if LLINs are used both correctly and consistently. This study investigated the factors associated with net use, independent of net ownership. Data were collected during a household survey conducted in Ebonyi State in southeastern Nigeria in November 2011 following a statewide mass LLIN distribution campaign and, in select locations, a community-based social behavior change (SBC) intervention. Logistic regression analyses, controlling for household bed net ownership, were conducted to examine the association between individual net use and various demographic, environmental, behavioral and social factors. The odds of net use increased among individuals who were exposed to tailored SBC in the context of a home visit (OR = 17.11; 95% CI 4.45–65.79) or who received greater degrees of social support from friends and family (p-trend < 0.001). Factors associated with decreased odds of net use included: increasing education level (p-trend = 0.020), increasing malaria knowledge level (p-trend = 0.022), and reporting any disadvantage of bed nets (OR = 0.39; 95% CI 0.23–0.78). The findings suggest that LLIN use is significantly influenced by social support and exposure to a malaria-related SBC home visit. The malaria community should thus further consider the importance of community outreach, interpersonal communication and social support on adoption of net use behaviors when designing future research and interventions. PMID:26430747
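The analytical step described above, logistic regression of individual net use on exposure variables while controlling for household net ownership, can be sketched with synthetic data as follows. The variable names, effect sizes and the data themselves are hypothetical; only the modelling pattern is illustrated.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the household survey (all variables hypothetical).
rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "owns_net": rng.integers(0, 2, n),        # household bed net ownership
    "sbc_home_visit": rng.integers(0, 2, n),  # exposure to an SBC home visit
})
logit_p = -1.0 + 1.5 * df.owns_net + 1.2 * df.sbc_home_visit
df["uses_net"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

# Logistic regression of individual net use, controlling for ownership.
model = smf.logit("uses_net ~ sbc_home_visit + owns_net", data=df).fit(disp=False)
print(np.exp(model.params))   # exponentiated coefficients = odds ratios
```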
Energy Independence and Security Act of 2007: A Summary of Major Provisions
2007-12-21
...establishes a zero-energy commercial buildings initiative. A national goal is set to achieve zero-net-energy use for new commercial buildings built...after 2025. A further goal is to retrofit all pre-2025 buildings to zero-net-energy use by 2050. Section 423 requires that DOE establish a national
NASA Astrophysics Data System (ADS)
Porto, Paolo; Walling, Des E.; Cogliandro, Vanessa; Callegari, Giovanni
2016-07-01
Use of the fallout radionuclides cesium-137 and excess lead-210 offers important advantages over traditional methods of quantifying erosion and soil redistribution rates. However, both radionuclides provide information on longer-term (i.e., 50-100 years) average rates of soil redistribution. Beryllium-7, with its half-life of 53 days, can provide a basis for documenting short-term soil redistribution and it has been successfully employed in several studies. However, the approach commonly used introduces several important constraints related to the timing and duration of the study period. A new approach proposed by the authors that overcomes these constraints has been successfully validated using an erosion plot experiment undertaken in southern Italy. Here, a further validation exercise undertaken in a small (1.38 ha) catchment is reported. The catchment was instrumented to measure event sediment yields and beryllium-7 measurements were employed to document the net soil loss for a series of 13 events that occurred between November 2013 and June 2015. In the absence of significant sediment storage within the catchment's ephemeral channel system and of a significant contribution from channel erosion to the measured sediment yield, the estimates of net soil loss for the individual events could be directly compared with the measured sediment yields to validate the former. The close agreement of the two sets of values is seen as successfully validating the use of beryllium-7 measurements and the new approach to obtain estimates of net soil loss for a sequence of individual events occurring over an extended period at the scale of a small catchment.
From striving to thriving: systems thinking, strategy, and the performance of safety net hospitals.
Clark, Jonathan; Singer, Sara; Kane, Nancy; Valentine, Melissa
2013-01-01
Safety net hospitals (SNH) have, on average, experienced declining financial margins and faced an elevated risk of closure over the past decade. Despite these challenges, not all SNHs are weakening and some are prospering. These higher-performing SNHs provide substantial care to safety net populations and produce sustainable financial returns. Drawing on the alternative structural positioning and resource-based views, we explore strategic management as a source of performance differences across SNHs. We employ a mixed-method design, blending quantitative and qualitative data and analysis. We measure financial performance using hospital operating margin and quantitatively evaluate its relationship with a limited set of well-defined structural positions. We further evaluate these structures and also explore the internal resources of SNHs based on nine in-depth case studies developed from site visits and extensive interviews. Quantitative results suggest that structural positions alone are not related to performance. Comparative case studies suggest that higher-performing SNH differ in four respects: (1) coordinating patient flow across the care continuum, (2) engaging in partnerships with other providers, (3) managing scope of services, and (4) investing in human capital. On the basis of these findings, we propose a model of strategic action related to systems thinking--the ability to see wholes and interrelationships rather than individual parts alone. Our exploratory findings suggest the need to move beyond generic strategies alone and acknowledge the importance of underlying managerial capabilities. Specifically, our findings suggest that effective strategy is a function of both the internal resources (e.g., managers' systems-thinking capability) and structural positions (e.g., partnerships) of organizations. From this perspective, framing resources and positioning as distinct alternatives misses the nuances of how strategic advantage is actually achieved.
A framework for quantifying net benefits of alternative prognostic models.
Rapsomaniki, Eleni; White, Ian R; Wood, Angela M; Thompson, Simon G
2012-01-30
New prognostic models are traditionally evaluated using measures of discrimination and risk reclassification, but these do not take full account of the clinical and health economic context. We propose a framework for comparing prognostic models by quantifying the public health impact (net benefit) of the treatment decisions they support, assuming a set of predetermined clinical treatment guidelines. The change in net benefit is more clinically interpretable than changes in traditional measures and can be used in full health economic evaluations of prognostic models used for screening and allocating risk reduction interventions. We extend previous work in this area by quantifying net benefits in life years, thus linking prognostic performance to health economic measures; by taking full account of the occurrence of events over time; and by considering estimation and cross-validation in a multiple-study setting. The method is illustrated in the context of cardiovascular disease risk prediction using an individual participant data meta-analysis. We estimate the number of cardiovascular-disease-free life years gained when statin treatment is allocated based on a risk prediction model with five established risk factors instead of a model with just age, gender and region. We explore methodological issues associated with the multistudy design and show that cost-effectiveness comparisons based on the proposed methodology are robust against a range of modelling assumptions, including adjusting for competing risks. Copyright © 2011 John Wiley & Sons, Ltd.
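A stripped-down illustration of comparing two risk models by expected life years gained under a fixed treatment threshold is sketched below. All quantities (the threshold, relative risk reduction, life years lost per event, and the synthetic cohort) are hypothetical, and the sketch ignores the time-to-event, competing-risk and multi-study aspects that the framework above handles.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
true_risk = rng.beta(2, 18, n)                                        # underlying 10-year risk
risk_model_a = np.clip(true_risk + rng.normal(0, 0.08, n), 0, 1)      # weaker model
risk_model_b = np.clip(true_risk + rng.normal(0, 0.03, n), 0, 1)      # better model

threshold = 0.20                  # treatment guideline: treat if predicted risk >= 20%
relative_risk_reduction = 0.25    # assumed treatment effect
life_years_lost_per_event = 5.0   # assumed loss per event

def expected_life_years_saved(predicted_risk):
    treated = predicted_risk >= threshold
    events_averted = (true_risk[treated] * relative_risk_reduction).sum()
    return events_averted * life_years_lost_per_event

for name, risk in [("model A", risk_model_a), ("model B", risk_model_b)]:
    print(name, f"{expected_life_years_saved(risk):.0f} life years saved in this cohort")
```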
Fuzzy net present valuation based on risk assessment of Malaysian infrastructure
NASA Astrophysics Data System (ADS)
Shaffie, Siti Salihah; Jaaman, Saiful Hafizah; Mohamad, Daud
2017-04-01
In recent years, build-operate-transfer (BOT) projects have become widely accepted under project financing for infrastructure developments in many countries. Such projects require high levels of financing and involve complex, shared risks, and the assessment of these risks is vital to avert large financial losses. Net present value is widely applied to BOT projects, where the uncertain cash flows are usually treated as deterministic values. This study proposes a fuzzy net present value model that takes into consideration the assessment of risks in a BOT project. The proposed model provides a more flexible net present valuation of the project. It is shown that the improved fuzzy cash flow model yields a valuation that is close to the real value of the project.
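A minimal sketch of a fuzzy NPV with triangular fuzzy cash flows and a crisp discount rate follows. The numbers are hypothetical and the risk-adjustment step of the proposed model is not reproduced; only the idea of carrying a (pessimistic, most likely, optimistic) triple through the discounting is shown.

```python
# Cash flows as triangular fuzzy numbers (low, mode, high), discounted
# componentwise with a crisp rate; all figures hypothetical.
rate = 0.08
initial_outlay = -100.0                      # year 0, treated as crisp
cash_flows = [(20.0, 30.0, 38.0),            # year 1
              (25.0, 35.0, 45.0),            # year 2
              (30.0, 40.0, 55.0)]            # year 3

npv = [initial_outlay, initial_outlay, initial_outlay]
for t, tri in enumerate(cash_flows, start=1):
    for i, component in enumerate(tri):
        npv[i] += component / (1 + rate) ** t

print("fuzzy NPV (low, mode, high):", tuple(round(v, 2) for v in npv))
```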
Decentralised control of continuous Petri nets
NASA Astrophysics Data System (ADS)
Wang, Liewei; Wang, Xu
2017-05-01
This paper focuses on decentralised control of systems modelled by continuous Petri nets, in which a target marking control problem is discussed. In some previous works, an efficient ON/OFF strategy-based minimum-time controller was developed. Nevertheless, the convergence is only proved for subclasses like Choice-Free nets. For a general net, the pre-conditions of applying the ON/OFF strategy are not given; therefore, the application scope of the method is unclear. In this work, we provide two sufficient conditions of applying the ON/OFF strategy-based controller to general nets. Furthermore, an extended algorithm for general nets is proposed, in which control laws are computed based on some limited information, without knowing the detailed structure of subsystems.
Illinois WorkNet System, NOCTI Partner for Real-World Credentials
ERIC Educational Resources Information Center
Telger, Natasha; Foster, John C.
2011-01-01
This article describes one assessment that provides a college- and career-ready individual for employers. In Illinois, workNet is the state's primary online workforce development Web site and resource for Workforce Investment Act services. With help from NOCTI, workNet offers assessments that identify the skills and interests of participants,…
LabNet: Toward A Community of Practice. Technology in Education Series.
ERIC Educational Resources Information Center
Ruopp, Richard, Ed.; And Others
Many educators advocate the use of projects in the science classroom. This document describes an effort (LabNet) that has successfully implemented a program that allows students to learn science using projects. Chapter 1, "An Introduction to LabNet" (Richard Ruopp, Megham Pfister), provides an initial framework for understanding the…
Evaluation of the Texas Nutrition Education and Training Program for Federal Fiscal Year 1997.
ERIC Educational Resources Information Center
Ahmad, Mahassen
This report summarizes the results of the 1997 Texas Nutrition Education and Training (NET) program, one of the U.S. Department of Agriculture's Child Nutrition Programs. NET provides nutrition education and instructional resources for children and key individuals in their learning environment. NET's target population includes parents or…
ScaleNet: a literature-based model of scale insect biology and systematics
García Morales, Mayrolin; Denno, Barbara D.; Miller, Douglass R.; Miller, Gary L.; Ben-Dov, Yair; Hardy, Nate B.
2016-01-01
Scale insects (Hemiptera: Coccoidea) are small herbivorous insects found on all continents except Antarctica. They are extremely invasive, and many species are serious agricultural pests. They are also emerging models for studies of the evolution of genetic systems, endosymbiosis and plant-insect interactions. ScaleNet was launched in 1995 to provide insect identifiers, pest managers, insect systematists, evolutionary biologists and ecologists efficient access to information about scale insect biological diversity. It provides comprehensive information on scale insects taken directly from the primary literature. Currently, it draws from 23 477 articles and describes the systematics and biology of 8194 valid species. For 20 years, ScaleNet ran on the same software platform. That platform is no longer viable. Here, we present a new, open-source implementation of ScaleNet. We have normalized the data model, begun the process of correcting invalid data, upgraded the user interface, and added online administrative tools. These improvements make ScaleNet easier to use and maintain and make the ScaleNet data more accurate and extendable. Database URL: http://scalenet.info PMID:26861659
TokenPasser: A petri net specification tool. Thesis
NASA Technical Reports Server (NTRS)
Mittmann, Michael
1991-01-01
In computer program design it is essential to know the effectiveness of different design options in improving performance, and dependability. This paper provides a description of a CAD tool for distributed hierarchical Petri nets. After a brief review of Petri nets, Petri net languages, and Petri net transducers, and descriptions of several current Petri net tools, the specifications and design of the TokenPasser tool are presented. TokenPasser is a tool to allow design of distributed hierarchical systems based on Petri nets. A case study for an intelligent robotic system is conducted, a coordination structure with one dispatcher controlling three coordinators is built to model a proposed robotic assembly system. The system is implemented using TokenPasser, and the results are analyzed to allow judgment of the tool.
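A minimal Petri net (places, transitions, marking, and the firing rule) of the kind such a tool lets users compose can be sketched as follows. The three-place net below is a hypothetical example, not one of TokenPasser's models.

```python
import numpy as np

# Incidence structure: rows = places, columns = transitions.
pre = np.array([[1, 0],     # tokens consumed from each place by t0, t1
                [0, 1],
                [0, 0]])
post = np.array([[0, 0],    # tokens produced into each place by t0, t1
                 [1, 0],
                 [0, 1]])
marking = np.array([1, 0, 0])   # initial marking: one token in place p0

def enabled(m, t):
    return np.all(m >= pre[:, t])

def fire(m, t):
    assert enabled(m, t), "transition not enabled"
    return m - pre[:, t] + post[:, t]

marking = fire(marking, 0)   # p0 -> p1
marking = fire(marking, 1)   # p1 -> p2
print(marking)               # [0 0 1]
```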
Tools and strategies for instrument monitoring, data mining and data access
NASA Astrophysics Data System (ADS)
van Hees, R. M., ,, Dr
2009-04-01
The ever-growing size of the data sets produced by various satellite instruments creates a challenge in data management. Three main tasks were identified: instrument performance monitoring, data mining by users, and data deployment. In this presentation, I will discuss the three tasks and our solution. As a practical example to illustrate the problem and make the discussion less abstract, I will use Sciamachy on board the ESA satellite Envisat. Since the launch of Envisat in March 2002, Sciamachy has performed nearly a billion science measurements as well as daily calibration measurements. The total size of the data set (not including reprocessed data) is over 30 TB, distributed over 150,000 files. [Instrument Monitoring] Most instruments produce house-keeping data, which may include time, geo-location, the temperature of different parts of the instrument, and instrument settings and configuration. In addition, many instruments perform calibration measurements. Instrument performance monitoring requires automated analysis of critical parameters for events, and the option to inspect off-line the behavior of various parameters over time. We chose to extract the necessary information from the SCIAMACHY data products and store everything in one file, where we separated house-keeping data from calibration measurements. Due to the large volume and the need for quick random access, the Hierarchical Data Format (HDF5) was our obvious choice. The HDF5 format is self-describing and designed to organize different types of data in one file. For example, one data set may contain the metadata of the calibration measurements: time, geo-location, instrument settings, quality parameters (temperature of the instrument), while a second large data set contains the actual measurements. The HDF5 high-level packet table API is ideal for tables that only grow (by appending rows), while the HDF5 table API is better suited for tables where rows need to be updated, inserted or replaced. In particular, the packet table API allows very compact storage of compound data sets and very fast read/write access. Details about this implementation and its pitfalls will be given in the presentation. [Data Mining] The ability to select relevant data is a capability that all data centers have to offer. The NL-SCIA-DC allows users to select data using several criteria, including time, geo-location, type of observation and data quality. The results of the query are [i] the location and name of relevant data products (files), [ii] a listing of the metadata of the relevant measurements, or [iii] a listing of the measurements (level 2 or higher). For this application, we need the power of a relational database, the SQL language, and the availability of spatial functions. PostgreSQL, extended with PostGIS support, turned out to be a good choice. Common queries on tables with millions of rows can be executed within seconds. [Data Deployment] The dissemination of scientific data is often hampered by the use of many different formats to store the products. Therefore, time-consuming and inefficient conversions are needed to use data products of different origins. Within the Atmospheric Data Access for the Geospatial User Community (ADAGUC) project we provide selected spaceborne atmospheric and land data sets in the same data format and with a consistent internal structure, so that users can easily use and combine data. The common format for storage is HDF5, but the netCDF-4 API is used to create the data sets.
The standard for metadata and dataset attributes follows the netCDF Climate and Forecast conventions; in addition, metadata complying with the ISO 19115:2003 INSPIRE profile is added. The advantage of netCDF-4 is that the API is essentially equal to that of netCDF-3 (with a few extensions), while the data format is HDF5 (recognized by many scientific tools). The added metadata ensures product traceability. Details will be given in the presentation and several posters.
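A hedged sketch of writing a small CF-style netCDF-4 file with the netCDF4 Python API is shown below. The variable names, units and attribute values are illustrative and are not the ADAGUC product definitions.

```python
import numpy as np
from netCDF4 import Dataset

with Dataset("example_product.nc", "w", format="NETCDF4") as nc:
    nc.Conventions = "CF-1.6"
    nc.title = "Illustrative atmospheric product"

    nc.createDimension("time", None)          # unlimited, so records can be appended
    time = nc.createVariable("time", "f8", ("time",))
    time.units = "hours since 2002-03-01 00:00:00"
    time.standard_name = "time"

    no2 = nc.createVariable("no2_column", "f4", ("time",), zlib=True)
    no2.units = "mol m-2"
    no2.long_name = "illustrative trace-gas column"

    time[:] = np.arange(3)
    no2[:] = [1.2e-4, 1.1e-4, 1.3e-4]
```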
Yukich, Joshua; Bennett, Adam; Keating, Joseph; Yukich, Rudy K; Lynch, Matt; Eisele, Thomas P; Kolaczinski, Kate
2013-06-14
Mass distribution of long-lasting insecticide treated bed nets (LLINs) has led to large increases in LLIN coverage in many African countries. As LLIN ownership levels increase, planners of future mass distributions face the challenge of deciding whether to ignore the nets already owned by households or to take these into account and attempt to target individuals or households without nets. Taking existing nets into account would reduce commodity costs but require more sophisticated, and potentially more costly, distribution procedures. The decision may also have implications for the average age of nets in use and therefore for the maintenance of universal LLIN coverage over time. A stochastic simulation model based on the NetCALC algorithm was used to determine the scenarios under which it would be cost saving to take existing nets into account, and the potential effects of doing so on the age profile of LLINs owned. The model accounted for variability in timing of distributions, concomitant use of continuous distribution systems, population growth, sampling error in pre-campaign coverage surveys, variable net 'decay' parameters and other factors including the feasibility and accuracy of identifying existing nets in the field. Results indicate that (i) where pre-campaign coverage is around 40% (of households owning at least 1 LLIN), accounting for existing nets in the campaign will have little effect on the mean age of the net population and (ii) even at pre-campaign coverage levels above 40%, an approach that reduces LLIN distribution requirements by taking existing nets into account may have only a small chance of being cost-saving overall, depending largely on the feasibility of identifying nets in the field. Based on existing literature, the epidemiological implications of such a strategy are likely to vary by transmission setting, and the risks of leaving older nets in the field when accounting for existing nets must be considered. Where pre-campaign coverage levels established by a household survey are below 40%, we recommend that planners do not take such LLINs into account and instead plan a blanket mass distribution. At pre-campaign coverage levels above 40%, campaign planners should make explicit consideration of the cost and feasibility of accounting for existing LLINs before planning blanket mass distributions. Planners should also consider restricting the coverage estimates used for this decision to only include nets under two years of age in order to ensure that old and damaged nets do not compose too large a fraction of existing net coverage.
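The kind of question the NetCALC-based model addresses can be illustrated with a toy stochastic sketch of net survival after a campaign plus a smaller continuous channel. The survival probability, distribution volumes and time horizon below are hypothetical and far simpler than the model described above.

```python
import numpy as np

rng = np.random.default_rng(42)
annual_survival = 0.75            # assumed net "decay" parameter
horizon = 4                       # years after the mass campaign

cohorts = {0: 100_000}            # year distributed -> nets handed out (campaign)
for year in range(1, horizon + 1):
    cohorts[year] = 15_000        # continuous (e.g. routine) distribution

surviving = {}
for year_given, n in cohorts.items():
    age = horizon - year_given
    # each net independently survives each year with probability annual_survival
    surviving[age] = rng.binomial(n, annual_survival ** age)

total = sum(surviving.values())
older_than_two = sum(n for age, n in surviving.items() if age > 2)
print(f"nets in use at year {horizon}: {total:,}")
print(f"share older than two years: {older_than_two / total:.0%}")
```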
Assessing NETS.T Performance in Teacher Candidates: Exploring the Wayfind Teacher Assessment
ERIC Educational Resources Information Center
Banister, Savilla; Vannatta Reinhart, Rachel
2013-01-01
To effectively integrate digital technologies in K-12 schools, teachers must be provided with undergraduate experiences that strongly support these integration resources and strategies. The National Educational Technology Standards for Teachers (NETS.T) provide a framework for teacher candidates and inservice teachers to identify their…
Cyber Defence in the Armed Forces of the Czech Republic
2010-11-01
undesirable action backward discovery. This solution is based on special tools using NetFlow protocol. Active network elements or specialized hardware...probes attached to the backbone network using a tap can be the sources of NetFlow data. The principal advantage of NetFlow protocol is the fact that it...provides primary data in the open form, which can be easily utilized in the subsequent operations. The FlowMon Probe 4000 is mostly used NetFlow
DisGeNET: a discovery platform for the dynamical exploration of human diseases and their genes.
Piñero, Janet; Queralt-Rosinach, Núria; Bravo, Àlex; Deu-Pons, Jordi; Bauer-Mehren, Anna; Baron, Martin; Sanz, Ferran; Furlong, Laura I
2015-01-01
DisGeNET is a comprehensive discovery platform designed to address a variety of questions concerning the genetic underpinning of human diseases. DisGeNET contains over 380,000 associations between >16,000 genes and 13,000 diseases, which makes it one of the largest repositories currently available of its kind. DisGeNET integrates expert-curated databases with text-mined data, covers information on Mendelian and complex diseases, and includes data from animal disease models. It features a score based on the supporting evidence to prioritize gene-disease associations. It is an open access resource available through a web interface, a Cytoscape plugin and as a Semantic Web resource. The web interface supports user-friendly data exploration and navigation. DisGeNET data can also be analysed via the DisGeNET Cytoscape plugin, and enriched with the annotations of other plugins of this popular network analysis software suite. Finally, the information contained in DisGeNET can be expanded and complemented using Semantic Web technologies and linked to a variety of resources already present in the Linked Data cloud. Hence, DisGeNET offers one of the most comprehensive collections of human gene-disease associations and a valuable set of tools for investigating the molecular mechanisms underlying diseases of genetic origin, designed to fulfill the needs of different user profiles, including bioinformaticians, biologists and health-care practitioners. Database URL: http://www.disgenet.org/ © The Author(s) 2015. Published by Oxford University Press.
Overview of Heat Addition and Efficiency Predictions for an Advanced Stirling Convertor
NASA Technical Reports Server (NTRS)
Wilson, Scott D.; Reid, Terry; Schifer, Nicholas; Briggs, Maxwell
2011-01-01
Past methods of predicting net heat input needed to be validated. The validation effort pursued several paths, including improving model inputs, using test hardware to provide validation data, and validating high-fidelity models. The validation test hardware provided a direct measurement of net heat input for comparison with predicted values. The predicted value of net heat input was 1.7 percent less than the measured value, and initial calculations of the measurement uncertainty were 2.1 percent (under review). Lessons learned during the validation effort were incorporated into the convertor modeling approach, which improved predictions of convertor efficiency.
BioNetFit: a fitting tool compatible with BioNetGen, NFsim and distributed computing environments.
Thomas, Brandon R; Chylek, Lily A; Colvin, Joshua; Sirimulla, Suman; Clayton, Andrew H A; Hlavacek, William S; Posner, Richard G
2016-03-01
Rule-based models are analyzed with specialized simulators, such as those provided by the BioNetGen and NFsim open-source software packages. Here, we present BioNetFit, a general-purpose fitting tool that is compatible with BioNetGen and NFsim. BioNetFit is designed to take advantage of distributed computing resources. This feature facilitates fitting (i.e. optimization of parameter values for consistency with data) when simulations are computationally expensive. BioNetFit can be used on stand-alone Mac, Windows/Cygwin, and Linux platforms and on Linux-based clusters running SLURM, Torque/PBS, or SGE. The BioNetFit source code (Perl) is freely available (http://bionetfit.nau.edu). Supplementary data are available at Bioinformatics online. bionetgen.help@gmail.com. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
NETopathies? Unraveling the Dark Side of Old Diseases through Neutrophils
Mitsios, Alexandros; Arampatzioglou, Athanasios; Arelaki, Stella; Mitroulis, Ioannis; Ritis, Konstantinos
2017-01-01
Neutrophil extracellular traps (NETs) were initially described as an antimicrobial mechanism of neutrophils. Over the last decade, several lines of evidence support the involvement of NETs in a plethora of pathological conditions. Clinical and experimental data indicate that NET release constitutes a shared mechanism, which is involved in a different degree in various manifestations of non-infectious diseases. Even though the backbone of NETs is similar, there are differences in their protein load in different diseases, which represent alterations in neutrophil protein expression in distinct disorder-specific microenvironments. The characterization of NET protein load in different NET-driven disorders could be of significant diagnostic and/or therapeutic value. Additionally, it will provide further evidence for the role of NETs in disease pathogenesis, and it will enable the characterization of disorders in which neutrophils and NET-dependent inflammation are of critical importance. PMID:28123386
Costs and effects of the Tanzanian national voucher scheme for insecticide-treated nets
Mulligan, Jo-Ann; Yukich, Joshua; Hanson, Kara
2008-01-01
Background The cost-effectiveness of insecticide-treated nets (ITNs) in reducing morbidity and mortality is well established. International focus has now moved on to how best to scale up coverage and what financing mechanisms might be used to achieve this. The approach in Tanzania has been to deliver a targeted subsidy for those most vulnerable to the effects of malaria while at the same time providing support to the development of the commercial ITN distribution system. In October 2004, with funds from the Global Fund to Fight AIDS Tuberculosis and Malaria, the government launched the Tanzania National Voucher Scheme (TNVS), a nationwide discounted voucher scheme for ITNs for pregnant women and their infants. This paper analyses the costs and effects of the scheme and compares it with other approaches to distribution. Methods Economic costs were estimated using the ingredients approach whereby all resources required in the delivery of the intervention (including the user contribution) are quantified and valued. Effects were measured in terms of number of vouchers used (and therefore nets delivered) and treated nets years. Estimates were also made for the cost per malaria case and death averted. Results and Conclusion The total financial cost of the programme represents around 5% of the Ministry of Health's total budget. The average economic cost of delivering an ITN using the voucher scheme, including the user contribution, was $7.57. The cost-effectiveness results are within the benchmarks set by other malaria prevention studies. The Government of Tanzania's approach to scaling up ITNs uses both the public and private sectors in order to achieve and sustain the level of coverage required to meet the Abuja targets. The results presented here suggest that the TNVS is a cost-effective strategy for delivering subsidized ITNs to targeted vulnerable groups. PMID:18279509
Ozone Production from the 2004 North American Boreal Fires
NASA Technical Reports Server (NTRS)
Pfister, G. G.; Emmons, L. K.; Hess, P. G.; Honrath, R.; Lamarque, J.-F.; Val Martin, M.; Owen, R. C.; Avery, M. A.; Browell, E. V.; Holloway, J. S.;
2006-01-01
We examine the ozone production from boreal forest fires based on a case study of wildfires in Alaska and Canada in summer 2004. The model simulations were performed with the chemistry transport model, MOZART-4, and were evaluated by comparison with a comprehensive set of aircraft measurements. In the analysis we use measurements and model simulations of carbon monoxide (CO) and ozone (O3) at the PICO-NARE station located in the Azores within the pathway of North American outflow. The modeled mixing ratios were used to test the robustness of the enhancement ratio ΔO3/ΔCO (defined as the excess O3 mixing ratio normalized by the increase in CO) and the feasibility of using this ratio in estimating the O3 production from the wildfires. Modeled and observed enhancement ratios are about 0.25 ppbv/ppbv, which is in the range of values found in the literature, and results in a global net O3 production of 12.9 ± 2 Tg O3 during summer 2004. This matches the net O3 production calculated in the model for a region extending from Alaska to the East Atlantic (9-11 Tg O3), indicating that observations at PICO-NARE representing photochemically well-aged plumes provide a good measure of the O3 production of North American boreal fires. However, net chemical loss of fire-related O3 dominates in regions far downwind from the fires (e.g., Europe and Asia), resulting in a global net O3 production of 6 Tg O3 during the same time period. On average, the fires increased the O3 burden (surface-300 mbar) over Alaska and Canada during summer 2004 by about 7-9%, and over Europe by about 2-3%.
Representing Simple Geometry Types in NetCDF-CF
NASA Astrophysics Data System (ADS)
Blodgett, D. L.; Koziol, B. W.; Whiteaker, T. L.; Simons, R.
2016-12-01
The Climate and Forecast (CF) metadata convention is well-suited for representing gridded and point-based observational datasets. However, CF currently has no accepted mechanism for representing simple geometry types such as lines and polygons. Lack of support for simple geometries within CF has unintentionally excluded a broad set of geoscientific data types from NetCDF-CF data encodings. For example, hydrologic datasets often contain polygon watershed catchments and polyline stream reaches in addition to point sampling stations and water management infrastructure. The latter has an associated CF specification. In the interest of supporting all simple geometry types within CF, a working group was formed following an EarthCube workshop on Advancing NetCDF-CF [1] to draft a CF specification for simple geometries: points, lines, polygons, and their associated multi-geometry representations [2]. The draft also includes parametric geometry types such as circles and ellipses. This presentation will provide an overview of the scope and content of the proposed specification focusing on mechanisms for representing coordinate arrays using variable length or continuous ragged arrays, capturing multi-geometries, and accounting for type-specific geometry artifacts such as polygon holes/interiors, node ordering, etc. The concepts contained in the specification proposal will be described with a use case representing streamflow in rivers and evapotranspiration from HUC12 watersheds. We will also introduce Python and R reference implementations developed alongside the technical specification. These in-development, open source Python and R libraries convert between commonly used GIS software objects (i.e. GEOS-based primitives) and their associated simple geometry CF representation. [1] http://www.unidata.ucar.edu/events/2016CFWorkshop/[2] https://github.com/bekozi/netCDF-CF-simple-geometry
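In the spirit of the proposal, polygon node coordinates can be stored as continuous ragged arrays with a per-geometry node count, as in the hedged netCDF4 sketch below. The variable and dimension names are illustrative and only loosely follow the eventual CF geometries specification.

```python
from netCDF4 import Dataset

# Two toy polygons stored as continuous ragged arrays of node coordinates.
with Dataset("watersheds.nc", "w") as nc:
    nc.createDimension("instance", 2)        # number of geometries
    nc.createDimension("node", 7)            # total nodes across all geometries

    node_count = nc.createVariable("node_count", "i4", ("instance",))
    x = nc.createVariable("node_x", "f8", ("node",))
    y = nc.createVariable("node_y", "f8", ("node",))

    node_count[:] = [4, 3]                   # first polygon has 4 nodes, second has 3
    x[:] = [0, 1, 1, 0, 2, 3, 2.5]
    y[:] = [0, 0, 1, 1, 0, 0, 1]
```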
Dale, Michael; Benson, Sally M
2013-04-02
A combination of declining costs and policy measures motivated by greenhouse gas (GHG) emissions reduction and energy security has driven rapid growth in the global installed capacity of solar photovoltaics (PV). This paper develops a number of unique data sets, namely the following: calculation of distribution of global capacity factor for PV deployment; meta-analysis of energy consumption in PV system manufacture and deployment; and documentation of reduction in energetic costs of PV system production. These data are used as input into a new net energy analysis of the global PV industry, as opposed to a device-level analysis. In addition, the paper introduces a new concept: a model tracking energetic costs of manufacturing and installing PV systems, including balance of system (BOS) components. The model is used to forecast electrical energy requirements to scale up the PV industry and determine the electricity balance of the global PV industry to 2020. Results suggest that the industry was a net consumer of electricity as recently as 2010. However, there is a >50% probability that in 2012 the PV industry is a net electricity provider and will "pay back" the electrical energy required for its early growth before 2020. Further reducing energetic costs of PV deployment will enable more rapid growth of the PV industry. There is also great potential to increase the capacity factor of PV deployment. These conclusions have a number of implications for R&D and deployment, including the following: monitoring of the energy embodied within PV systems; designing more efficient and durable systems; and deploying PV systems in locations that will achieve high capacity factors.
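The industry-level (rather than device-level) electricity balance can be illustrated with a toy model in which each year's new capacity costs embodied energy up front while all previously installed capacity generates. The growth rate, capacity factor and embodied-energy figure below are hypothetical, not the paper's estimates.

```python
growth_rate = 0.40                 # annual growth in newly installed capacity (assumed)
capacity_factor = 0.15             # assumed fleet-average capacity factor
embodied_kwh_per_w = 4.0           # assumed electrical-equivalent cost to make & install 1 W
hours_per_year = 8760

new_capacity_w = 1.0               # arbitrary units, year 0
installed_w = 0.0
cumulative_balance = 0.0           # kWh generated minus kWh invested, to date

for year in range(15):
    installed_w += new_capacity_w
    generated = installed_w * capacity_factor * hours_per_year / 1000  # kWh this year
    invested = new_capacity_w * embodied_kwh_per_w                     # kWh spent this year
    cumulative_balance += generated - invested
    status = "net provider" if cumulative_balance > 0 else "net consumer"
    print(f"year {year:2d}: cumulative balance {cumulative_balance:10.1f} kWh ({status})")
    new_capacity_w *= 1 + growth_rate
```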
SU-E-T-637: Age and Batch Dependence of Gafchromic EBT Films in Photon and Proton Beam Dosimetry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Das, I; Akino, Y
2014-06-15
Purpose: Gafchromic films have undergone significant changes in characteristics over time, reflected by the HS, EBT, EBT2 and EBT3 names. Inter- and intra-EBT film variability has been studied and found to be significant. However, the effects of age and lot/batch type have not been studied in the various radiation beams investigated here. Methods: Thirteen sets of films (2 EBT, 6 EBT2 and 5 EBT3) with different lot numbers and expiration dates were acquired. Films were cut longitudinally in 3 cm widths and sandwiched between two solid water slabs placed in a water phantom to eliminate air gaps. Each set of films was irradiated longitudinally at dmax with 6 and 15 MV photon beams as well as in a reference condition (16 cm range, 10 cm SOBP) in our uniform scanning proton beam. Films were scanned using an Epson flatbed scanner (ES-10000G) after 48 hours to achieve full polymerization. The profiles were compared with the depth dose measured with an ionization chamber, and the net optical density (net OD) was calculated. Results: The net OD versus dose for EBT, EBT2 and EBT3 films of different ages showed a similar trend but with different slopes. Even after calibration, differences are clearly visible in net OD in proton and photon beams. A net OD difference of nearly 0.5 was observed in the photon beams, but this was limited to 0.2–0.3 in the proton beam. This corresponds to a 20% and 15% dosimetric difference in the photon and proton beams, respectively, over age and type of film. Conclusion: Net OD related to dose is dependent on the age and lot of the film in both photon and proton beams. Before any set of film is used, a calibration film should be exposed for meaningful dosimetry. Expired films showed larger OD variation than unexpired films.
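The net optical density referred to above is typically computed from scanner readings of exposed and unexposed film from the same batch; one common definition is sketched below with hypothetical 16-bit pixel values.

```python
import numpy as np

# Hypothetical flatbed-scanner pixel values for unexposed and exposed film.
pv_unexposed = 42_000.0
pv_exposed = 28_500.0

# One common definition: net OD = log10(PV_unexposed / PV_exposed).
net_od = np.log10(pv_unexposed / pv_exposed)
print(f"net OD = {net_od:.3f}")   # ~0.168
```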
Health safety nets can break cycles of poverty and disease: a stochastic ecological model.
Plucinski, Mateusz M; Ngonghala, Calistus N; Bonds, Matthew H
2011-12-07
The persistence of extreme poverty is increasingly attributed to dynamic interactions between biophysical processes and economics, though there remains a dearth of integrated theoretical frameworks that can inform policy. Here, we present a stochastic model of disease-driven poverty traps. Whereas deterministic models can result in poverty traps that can only be broken by substantial external changes to the initial conditions, in the stochastic model there is always some probability that a population will leave or enter a poverty trap. We show that a 'safety net', defined as an externally enforced minimum level of health or economic conditions, can guarantee ultimate escape from a poverty trap, even if the safety net is set within the basin of attraction of the poverty trap, and even if the safety net is only in the form of a public health measure. Whereas the deterministic model implies that small improvements in initial conditions near the poverty-trap equilibrium are futile, the stochastic model suggests that the impact of changes in the location of the safety net on the rate of development may be strongest near the poverty-trap equilibrium.
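The qualitative behaviour described above, that a floor on economic or health conditions makes eventual escape from the trap certain, can be reproduced with a much simpler toy model. The dynamics, parameters and the notion of "escape" below are all hypothetical and are not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(7)

def years_to_escape(floor, trap_threshold=1.0, escape_level=2.0, max_years=50_000):
    """Years until income first reaches escape_level, or None if it never does."""
    income = 0.5 * trap_threshold              # start inside the trap
    for year in range(max_years):
        # negative expected growth below the trap threshold, positive above it
        drift = 1.05 if income >= trap_threshold else 0.92
        shock = rng.lognormal(0.0, 0.15)       # multiplicative random shock
        income = max(floor, income * drift * shock)   # the floor is the "safety net"
        if income >= escape_level:
            return year
    return None

print("no safety net    :", years_to_escape(floor=0.0))
print("safety net at 0.8:", years_to_escape(floor=0.8))
```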
Electric nets and sticky materials for analysing oviposition behaviour of gravid malaria vectors
2012-01-01
Background Little is known about how malaria mosquitoes locate oviposition sites in nature. Such knowledge is important to help devise monitoring and control measures that could be used to target gravid females. This study set out to develop a suite of tools that can be used to study the attraction of gravid Anopheles gambiae s.s. towards visual or olfactory cues associated with aquatic habitats. Methods Firstly, the study developed and assessed methods for using electrocuting nets to analyse the orientation of gravid females towards an aquatic habitat. Electric nets (1 m high × 0.5 m wide) were powered by a 12 V battery via a spark box. High and low energy settings were compared for mosquito electrocution, and a collection device was developed to retain electrocuted mosquitoes as they fell to the ground. Secondly, a range of sticky materials and a detergent were tested to quantify if and where gravid females land to lay their eggs, by treating the edge of the ponds and the water surface. A randomized complete block design was used for all experiments, with 200 mosquitoes released each day. Experiments were conducted in screened semi-field systems using insectary-reared An. gambiae s.s. Data were analysed by generalized estimating equations. Results An electric net operated at the highest spark box energy of 400 volts direct current made the net spark, creating a crackling sound, a burst of light and a burning smell. This setting caught 64% fewer mosquitoes than a net powered by a reduced voltage output that could neither be heard nor seen (odds ratio (OR) 0.46; 95% confidence interval (CI) 0.40-0.53, p < 0.001). Three sticky boards (transparent film, glue-coated black fly-screen and yellow film) were evaluated as catching devices under electric nets. The transparent and shiny black surfaces were found highly attractive for gravid mosquitoes to land on compared to the yellow sticky film board (OR 41.6, 95% CI 19.8 – 87.3, p < 0.001 and OR 28.8, 95% CI 14.5 – 56.8, p < 0.001, respectively) and were therefore unsuitable as collection devices under the e-nets. With a square of four e-nets around a pond combined with yellow sticky boards, on average 33% (95% CI 28-38%) of released mosquitoes were collected. Sticky materials and detergent in the water worked well in collecting mosquitoes as they landed on the edge of the pond or on the water surface. Over 80% of collected females were found on the water surface (mean 103, 95% CI 93–115) as compared to the edge of the artificial pond (mean 24, 95% CI 20–28). Conclusion A square of four e-nets with yellow sticky boards as a collection device can be used for quantifying the numbers of mosquitoes approaching a small oviposition site. Shiny sticky surfaces attract gravid females, possibly because they are visually mistaken for aquatic habitats. These materials might be developed further as gravid traps. Anopheles gambiae s.s. primarily land on the water surface for oviposition. This behaviour can be exploited for the development of new trapping and control strategies. PMID:23151023
Genetic deletion of the norepinephrine transporter decreases vulnerability to seizures
Kaminski, Rafal M.; Shippenberg, Toni S.; Witkin, Jeffrey M.; Rocha, Beatriz A.
2005-01-01
Norepinephrine (NE) has been reported to modulate neuronal excitability and act as an endogenous anticonvulsant. In the present study we used NE transporter knock-out mice (NET-KO), which are characterized by high levels of extracellular NE, to investigate the role of endogenous NE in seizure susceptibility. Seizure thresholds for cocaine (i.p.), pentylenetetrazol (i.v.) and kainic acid (i.v.) were compared in NET-KO, heterozygous (NET-HT) and wild-type (NET-WT) female mice. The dose-response curve for cocaine-induced convulsions was significantly shifted to the right in NET-KO mice, indicating higher seizure thresholds. The threshold doses of pentylenetetrazol that induced clonic and tonic seizures were also significantly higher in NET-KO mice when compared to NET-WT mice. Similarly, NET-KO mice displayed higher resistance to convulsions engendered by kainic acid. For all drugs tested, the response of NET-HT mice was always intermediate. These data provide further support for a role of endogenous NE in the control of seizure susceptibility. PMID:15911120
MPIGeneNet: Parallel Calculation of Gene Co-Expression Networks on Multicore Clusters.
Gonzalez-Dominguez, Jorge; Martin, Maria J
2017-10-10
In this work we present MPIGeneNet, a parallel tool that applies Pearson's correlation and Random Matrix Theory to construct gene co-expression networks. It is based on the state-of-the-art sequential tool RMTGeneNet, which provides networks with high robustness and sensitivity at the expense of relatively long runtimes for large-scale input datasets. MPIGeneNet returns the same results as RMTGeneNet but improves the memory management, reduces the I/O cost, and accelerates the two most computationally demanding steps of co-expression network construction by exploiting the compute capabilities of common multicore CPU clusters. Our performance evaluation on two different systems using three typical input datasets shows that MPIGeneNet is significantly faster than RMTGeneNet. As an example, our tool is up to 175.41 times faster on a cluster with eight nodes, each one containing two 12-core Intel Haswell processors. Source code of MPIGeneNet, as well as a reference manual, are available at https://sourceforge.net/projects/mpigenenet/.
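A minimal serial sketch of the core construction step (not MPIGeneNet's parallel implementation, and omitting the Random Matrix Theory threshold selection): compute the gene-gene Pearson correlation matrix and threshold it into an adjacency matrix.

```python
import numpy as np

def coexpression_network(expr, threshold=0.8):
    """expr: genes x samples matrix. Returns a boolean adjacency matrix obtained by
    thresholding absolute Pearson correlations (RMT-based threshold choice omitted)."""
    corr = np.corrcoef(expr)            # pairwise Pearson correlation between gene rows
    adj = np.abs(corr) >= threshold
    np.fill_diagonal(adj, False)        # no self-edges
    return adj

rng = np.random.default_rng(0)
expr = rng.normal(size=(100, 30))       # synthetic data: 100 genes, 30 samples
adj = coexpression_network(expr, threshold=0.6)
print(adj.sum() // 2, "edges")
```

The parallelization in the tool distributes exactly these correlation and thresholding computations across MPI processes; the serial version above only conveys the data flow.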
Measures of net oxidant concentration in seawater
NASA Astrophysics Data System (ADS)
Jackson, George A.; Williams, Peter M.
1988-02-01
Dissolved oxygen deficits in the ocean have been used as a measure of the organic matter oxidized in a volume of water. Such organic matter is usually assumed to be predominantly settled particles. Using dissolved oxygen concentration in this way has two problems: first, it does not differentiate between oxidant consumed by the pool of dissolved organic matter present near the ocean surface and oxidant consumed by organic matter contained in falling particles; second, it does not account for other oxidant sources, such as nitrate, which can be as important to organic matter decay as oxygen in low-oxygen water, such as off Peru or in the Southern California submarine basins. New parameters provide better measures of the net oxidant concentration in a water parcel. One such parameter, NetOx, is changed only by gaseous exchange with the atmosphere, exchange with the benthos, or the production or consumption of sinking particles. A simplified version of NetOx, NetOx = [O2] + 1.25[NO3-] - [TOC], where TOC (total organic carbon) is the dissolved organic carbon (DOC) plus the suspended particulate organic carbon (POC), provides an index based on the usually dominant variables. Calculation of NetOx and a second property, NetOC ([O2] - [TOC]), for data from GEOSECS and ourselves in the Atlantic and Pacific oceans using property-property graphs shows differences from those from oxygen deficits alone. Comparison of NetOx and NetOC concentrations at high and low latitudes of the Pacific Ocean shows that the difference in surface water oxidant concentrations is even larger than the difference in oxygen concentration. Vertical particle fluxes off Peru calculated from NetOx gradients are much greater than those calculated from oxygen gradients. The potential value of NetOx and NetOC as parameters to understand particle fluxes implies that determination of TOC should be a routine part of hydrographic measurements.
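A small sketch of the two indices exactly as defined above; the depth-profile values are hypothetical and all quantities are assumed to be expressed in mutually consistent oxygen-equivalent concentration units.

```python
import numpy as np

def net_ox(o2, no3, toc):
    """Simplified NetOx = [O2] + 1.25*[NO3-] - [TOC]."""
    return np.asarray(o2) + 1.25 * np.asarray(no3) - np.asarray(toc)

def net_oc(o2, toc):
    """NetOC = [O2] - [TOC]."""
    return np.asarray(o2) - np.asarray(toc)

# Hypothetical depth-profile values (illustration only)
o2  = [210.0, 150.0, 40.0]
no3 = [  2.0,  15.0, 30.0]
toc = [ 75.0,  55.0, 45.0]
print(net_ox(o2, no3, toc))
print(net_oc(o2, toc))
```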
Maira, S M; Wurtz, J M; Wasylyk, B
1996-01-01
The three ternary complex factors (TCFs), Net (ERP/SAP-2), ELK-1 and SAP-1, are highly related ets oncogene family members that participate in the response of the cell to Ras and growth signals. Understanding the different roles of these factors will provide insights into how the signals result in coordinate regulation of the cell. We show that Net inhibits transcription under basal conditions, in which SAP-1a is inactive and ELK-1 stimulates. Repression is mediated by the NID, the Net Inhibitory Domain of about 50 amino acids, which autoregulates the Net protein and also inhibits when it is isolated in a heterologous fusion protein. Net is particularly sensitive to Ras activation. Ras activates Net through the C-domain, which is conserved between the three TCFs, and the NID is an efficient inhibitor of Ras activation. The NID, as well as more C-terminal sequences, inhibit DNA binding. Net is more refractory to DNA binding than the other TCFs, possibly due to the presence of multiple inhibitory elements. The NID may adopt a helix-loop-helix (HLH) structure, as evidenced by homology to other HLH motifs, structure predictions, model building and mutagenesis of critical residues. The sequence resemblance with myogenic factors suggested that Net may form complexes with the same partners. Indeed, we found that Net can interact in vivo with the basic HLH factor, E47. We propose that Net is regulated at the level of its latent DNA-binding activity by protein interactions and/or phosphorylation. Net may form complexes with HLH proteins as well as SRF on specific promoter sequences. The identification of the novel inhibitory domain provides a new inroad into exploring the different roles of the ternary complex factors in growth control and transformation. PMID:8918463
Rogue taxa phenomenon: a biological companion to simulation analysis
Westover, Kristi M.; Rusinko, Joseph P.; Hoin, Jon; Neal, Matthew
2013-01-01
To provide a baseline biological comparison to simulation study predictions about the frequency of rogue taxa effects, we evaluated the frequency of a rogue taxa effect using viral data sets which differed in diversity. Using a quartet-tree framework, we measured the frequency of a rogue taxa effect in three data sets of increasing genetic variability (within viral serotype, between viral serotypes, and between viral families) to test whether the rogue taxa effect was correlated with the mean sequence diversity of the respective data sets. We found a slight increase in the percentage of rogues as nucleotide diversity increased. Even though the number of rogues increased with diversity, the distribution of the types of rogues (friendly, crazy, or evil) did not depend on the diversity, and in the case of the order-level data set the net rogue effect was slightly positive. This study, assessing the frequency of the rogue taxa effect using biological data, indicated that simulation studies may over-predict the prevalence of the rogue taxa effect. Further investigations are necessary to understand which types of data sets are susceptible to a negative rogue effect and thus merit the removal of taxa from large phylogenetic reconstructions. PMID:23707704
PANDORA: keyword-based analysis of protein sets by integration of annotation sources.
Kaplan, Noam; Vaaknin, Avishay; Linial, Michal
2003-10-01
Recent advances in high-throughput methods and the application of computational tools for automatic classification of proteins have made it possible to carry out large-scale proteomic analyses. Biological analysis and interpretation of sets of proteins is a time-consuming undertaking carried out manually by experts. We have developed PANDORA (Protein ANnotation Diagram ORiented Analysis), a web-based tool that provides an automatic representation of the biological knowledge associated with any set of proteins. PANDORA uses a unique approach of keyword-based graphical analysis that focuses on detecting subsets of proteins that share unique biological properties and the intersections of such sets. PANDORA currently supports SwissProt keywords, NCBI Taxonomy, InterPro entries and the hierarchical classification terms from ENZYME, SCOP and GO databases. The integrated study of several annotation sources simultaneously allows a representation of biological relations of structure, function, cellular location, taxonomy, domains and motifs. PANDORA is also integrated into the ProtoNet system, thus allowing the testing of thousands of automatically generated clusters. We illustrate how PANDORA enhances the biological understanding of large, non-uniform sets of proteins originating from experimental and computational sources, without the need for prior biological knowledge on individual proteins.
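A toy sketch of the underlying idea of keyword-based set analysis (not PANDORA's implementation, and using made-up protein identifiers and keywords): group proteins by shared annotation keywords and inspect the intersections of those groups.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical protein -> annotation-keyword mapping (illustration only)
annotations = {
    "P1": {"kinase", "membrane"},
    "P2": {"kinase", "nucleus"},
    "P3": {"membrane", "transport"},
    "P4": {"kinase", "membrane"},
}

# Invert to keyword -> protein subset
by_keyword = defaultdict(set)
for protein, keywords in annotations.items():
    for kw in keywords:
        by_keyword[kw].add(protein)

for kw, proteins in sorted(by_keyword.items()):
    print(kw, sorted(proteins))

# Intersections of keyword-defined subsets reveal proteins sharing several properties
for (kw1, s1), (kw2, s2) in combinations(sorted(by_keyword.items()), 2):
    shared = s1 & s2
    if shared:
        print(f"{kw1} & {kw2}:", sorted(shared))
```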
Gitonga, Caroline W; Edwards, Tansy; Karanja, Peris N; Noor, Abdisalan M; Snow, Robert W; Brooker, Simon J
2012-07-01
To investigate risk factors, including reported net use, for Plasmodium infection and anaemia among school children and to explore variations in effects across different malaria ecologies occurring in Kenya. This study analysed data for 49 975 school children in 480 schools surveyed during a national school malaria survey, 2008-2010. Mixed effects logistic regression was used to investigate factors associated with Plasmodium infection and anaemia within different malaria transmission zones. Insecticide-treated net (ITN) use was associated with reduction in the odds of Plasmodium infection in coastal and western highlands epidemic zones and among boys in the lakeside high transmission zone. Other risk factors for Plasmodium infection and for anaemia also varied by zone. Plasmodium infection was negatively associated with increasing socio-economic status in all transmission settings, except in the semi-arid north-east zone. Plasmodium infection was a risk factor for anaemia in lakeside high transmission, western highlands epidemic and central low-risk zones, whereas ITN use was only associated with lower levels of anaemia in coastal and central zones and among boys in the lakeside high transmission zone. The risk factors for Plasmodium infection and anaemia, including the protective associations with ITN use, vary according to malaria transmission settings in Kenya, and future efforts to control malaria and anaemia should take into account such heterogeneities among school children. © 2012 Blackwell Publishing Ltd.
Keuffel, Eric; Jaskiewicz, Wanda; Paphassarang, Chanthakhath; Tulenko, Kate
2013-11-01
Many developing countries are examining whether to institute incentive packages that increase the share of health workers who opt to locate in rural settings; however, uncertainty exists with respect to the expected net cost (or benefit) of these packages. We utilize the findings from discrete choice experiment surveys applied to students training to be health professionals, together with costing analyses, in Lao People's Democratic Republic to model the anticipated effect of incentive packages on new worker location decisions and direct costs. Incorporating evidence on health worker density and health outcomes, we then estimate the expected 5-year net cost (or benefit) of each incentive package for 3 health worker cadres: physicians, nurses/midwives, and medical assistants. Under base case assumptions, the optimal incentive package for each cadre produced a 5-year net benefit (maximum net benefit for physicians: US$ 44,000; nurses/midwives: US$ 5.6 million; medical assistants: US$ 485,000). After accounting for health effects, the expected net cost of select incentive packages would be substantially less than the original estimate of direct costs. In the case of Lao People's Democratic Republic, incentive packages that do not invest in capital-intensive components should generally produce larger net benefits. This combination of discrete choice experiment surveys, costing surveys and cost-benefit analysis methods may be replicated by other developing countries to calculate whether health worker incentive packages are viable policy options.
Behavior-based aggregation of land categories for temporal change analysis
NASA Astrophysics Data System (ADS)
Aldwaik, Safaa Zakaria; Onsted, Jeffrey A.; Pontius, Robert Gilmore, Jr.
2015-03-01
Comparison between two time points of the same categorical variable for the same study extent can reveal changes among categories over time, such as transitions among land categories. If many categories exist, then analysis can be difficult to interpret. Category aggregation is the procedure that combines two or more categories to create a single broader category. Aggregation can simplify interpretation, and can also influence the sizes and types of changes. Some classifications have an a priori hierarchy to facilitate aggregation, but an a priori aggregation might make researchers blind to important category dynamics. We created an algorithm to aggregate categories in a sequence of steps based on the categories' behaviors in terms of gross losses and gross gains. The behavior-based algorithm aggregates net gaining categories with net gaining categories and aggregates net losing categories with net losing categories, but never aggregates a net gaining category with a net losing category. The behavior-based algorithm at each step in the sequence maintains net change and maximizes swap change. We present a case study where data from 2001 and 2006 for 64 land categories indicate change on 17% of the study extent. The behavior-based algorithm produces a set of 10 categories that maintains nearly the original amount of change. In contrast, an a priori aggregation produces 10 categories while reducing the change to 9%. We offer a free computer program to perform the behavior-based aggregation.
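A simplified sketch of the behavior-based grouping rule (the published algorithm's sequential, swap-maximizing merge order is not reproduced here, and the category names and change quantities are made up): categories whose gross gain is at least their gross loss are treated as net gainers, the rest as net losers, and merging only ever occurs within a group.

```python
# Simplified illustration of the behavior-based aggregation rule: net gaining
# categories are only merged with other net gaining categories, and net losing
# categories only with other net losing categories.

def split_by_behavior(gross_gain, gross_loss):
    gainers = [c for c in gross_gain if gross_gain[c] >= gross_loss[c]]
    losers = [c for c in gross_gain if gross_gain[c] < gross_loss[c]]
    return gainers, losers

def aggregate(group, label):
    """Collapse all categories of one behavioral group into a single broader category."""
    return {label: sorted(group)}

# Hypothetical gross gains and losses (percent of study extent)
gross_gain = {"forest": 2.0, "urban": 5.0, "crop": 1.0, "wetland": 0.5}
gross_loss = {"forest": 6.0, "urban": 0.5, "crop": 4.0, "wetland": 0.2}

gainers, losers = split_by_behavior(gross_gain, gross_loss)
print(aggregate(gainers, "net_gaining"), aggregate(losers, "net_losing"))
```

Because a gainer is never merged with a loser, gains and losses cannot cancel inside an aggregated category, which is how the procedure preserves net change.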
Modelling and analysis of workflow for lean supply chains
NASA Astrophysics Data System (ADS)
Ma, Jinping; Wang, Kanliang; Xu, Lida
2011-11-01
Cross-organisational workflow systems are a component of enterprise information systems which support collaborative business processes among organisations in a supply chain. Currently, the majority of workflow systems are developed from the perspective of information modelling, without considering the actual requirements of supply chain management. In this article, we focus on the modelling and analysis of cross-organisational workflow systems in the context of the lean supply chain (LSC) using Petri nets. First, the article describes the assumed conditions of a cross-organisational workflow net according to the idea of the LSC and then discusses the standardisation of collaborative business processes between organisations in the context of the LSC. Second, the concept of labelled time Petri nets (LTPNs) is defined by combining labelled Petri nets with time Petri nets, and the concept of labelled time workflow nets (LTWNs) is also defined based on LTPNs. Cross-organisational labelled time workflow nets (CLTWNs) are then defined based on LTWNs. Third, the article proposes the notion of OR-silent CLTWNs and a verification approach to the soundness of LTWNs and CLTWNs. Finally, this article illustrates how to use the proposed method with a simple example. The purpose of this research is to establish a formal method for the modelling and analysis of workflow systems for LSCs. This study initiates a new perspective of research on cross-organisational workflow management and promotes operation management of LSCs in real-world settings.
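A minimal sketch of the underlying Petri-net formalism (places, transitions, token firing); the labelled time extensions (LTPN, LTWN, CLTWN) defined in the article are not reproduced here, and the two-step workflow is a made-up example.

```python
# Minimal Petri-net sketch: places hold tokens, transitions fire when all their
# input places contain a token, moving tokens from inputs to outputs.

class PetriNet:
    def __init__(self, transitions, marking):
        # transitions: {name: (input_places, output_places)}
        self.transitions = transitions
        self.marking = dict(marking)          # tokens per place

    def enabled(self, t):
        inputs, _ = self.transitions[t]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, t):
        if not self.enabled(t):
            raise ValueError(f"transition {t} is not enabled")
        inputs, outputs = self.transitions[t]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# A trivial two-step workflow: receive order -> ship order
net = PetriNet(
    transitions={"receive": (["start"], ["received"]),
                 "ship": (["received"], ["done"])},
    marking={"start": 1},
)
net.fire("receive")
net.fire("ship")
print(net.marking)   # {'start': 0, 'received': 0, 'done': 1}
```

Workflow nets add a dedicated source and sink place and a soundness property (every case started can complete with no tokens left behind), which is what the verification approach in the article targets for the timed, labelled extensions.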
Kornfeld, R; Rupp, K
2000-01-01
The Social Security Administration (SSA) initiated Project NetWork in 1991 to test case management as a means of promoting employment among persons with disabilities. The demonstration, which targeted Social Security Disability Insurance (DI) beneficiaries and Supplemental Security Income (SSI) applicants and recipients, offered intensive outreach, work-incentive waivers, and case management/referral services. Participation in Project NetWork was voluntary. Volunteers were randomly assigned to the "treatment" group or the "control" group. Those assigned to the treatment group met individually with a case or referral manager who arranged for rehabilitation and employment services, helped clients develop an individual employment plan, and provided direct employment counseling services. Volunteers assigned to the control group could not receive services from Project NetWork but remained eligible for any employment assistance already available in their communities. For both treatment and control groups, the demonstration waived specific DI and SSI program rules considered to be work disincentives. The experimental impact study thus measures the incremental effects of case and referral management services. The eight demonstration sites were successful in implementing the experimental design roughly as planned. Project NetWork staff were able to recruit large numbers of participants and to provide rehabilitation and employment services on a substantial scale. Most of the sites easily reached their enrollment targets and were able to attract volunteers with demographic characteristics similar to those of the entire SSI and DI caseload and a broad range of moderate and severe disabilities. However, by many measures, volunteers were generally more "work-ready" than project-eligible persons in the demonstration areas who did not volunteer to receive NetWork services. Project NetWork case management increased average annual earnings by $220 per year over the first 2 years following random assignment. This statistically significant impact, an approximate 11-percent increase in earnings, is based on administrative data on earnings. For about 70 percent of sample members, a third year of followup data was available. For this limited sample, the estimated effect of Project NetWork on annual earnings declined to roughly zero in the third followup year. The findings suggest that the increase in earnings may have been short-lived and may have disappeared by the time Project NetWork services ended. Project NetWork did not reduce reliance on SSI or DI benefits by statistically significant amounts over the 30-42 month followup period. The services provided by Project NetWork thus did not reduce overall SSI and DI caseloads or benefits by substantial amounts, especially given that only about 5 percent of the eligible caseload volunteered to participate in Project NetWork. Project NetWork produced modest net benefits to persons with disabilities and net costs to taxpayers. Persons with disabilities gained mainly because the increases in their earnings easily outweighed the small (if any) reduction in average SSI and DI benefits. For SSA and the federal government as a whole, the costs of Project NetWork were not sufficiently offset by increases in tax receipts resulting from increased earnings or reductions in average SSI and DI benefits. The modest net benefits of Project NetWork to persons with disabilities are encouraging.
How such benefits of an experimental intervention should be weighed against costs to taxpayers depends on the value judgments of policymakers. Because different case management projects involve different kinds of services, these results cannot be directly generalized to other case management interventions. They are nevertheless instructive for planning new initiatives. Combining case and referral management services with various other interventions, such as longer term financial support for work or altered provider incentives, could produc
NASA Astrophysics Data System (ADS)
Mochizuki, M.; Uehira, K.; Kanazawa, T.; Shiomi, K.; Kunugi, T.; Aoi, S.; Matsumoto, T.; Sekiguchi, S.; Yamamoto, N.; Takahashi, N.; Nakamura, T.; Shinohara, M.; Yamada, T.
2017-12-01
After the occurrence of the 2011 Tohoku Earthquake, NIED launched a project to construct a seafloor observatory network for tsunamis and earthquakes, to enhance the reliability of early warnings of tsunamis and earthquakes. The observatory network was named "S-net". The S-net project has been financially supported by MEXT. The S-net consists of 150 seafloor observatories which are connected in line with submarine optical cables. The total length of submarine optical cable is about 5,500 km. The S-net covers the focal region of the 2011 Tohoku Earthquake and its vicinity. Each observatory is equipped with two high-sensitivity pressure gauges serving as tsunami meters and four sets of three-component seismometers. The S-net is composed of six segment networks. Five of the six segment networks had already been installed. Installation of the last segment network, covering the outer rise area, was finally finished by the end of FY2016. The outer rise segment has features that distinguish it from the other five segments of the S-net: deep water and long cable distances. Most of the 25 observatories on the outer rise segment are located at depths greater than 6,000 m WD. In particular, three observatories are set on the seafloor deeper than about 7,000 m WD, and these three observatories are equipped with pressure gauges rated for use at up to 8,000 m WD. The total length of the submarine cables of the outer rise segment is about twice that of the other segments. The longer the cable system, the higher the supply voltage needed, and thus the observatories on the outer rise segment are designed to withstand higher voltages. For the outer rise segment cable we employ a low-loss dispersion-managed line formed by combining multiple optical fibres, in order to achieve long-distance, high-speed and large-capacity data transmission. With the installation of the outer rise segment finished, full-scale operation of the S-net has started. All data from the 150 seafloor observatories are transferred to and stored at the Tsukuba DC. Some data are transmitted directly to JMA and are used for monitoring earthquakes and tsunamis. In this presentation we will report on the construction and operation of the S-net system as well as an outline of the obtained data.
McGeachie, Michael J; Chang, Hsun-Hsien; Weiss, Scott T
2014-06-01
Bayesian Networks (BN) have been a popular predictive modeling formalism in bioinformatics, but their application in modern genomics has been slowed by an inability to cleanly handle domains with mixed discrete and continuous variables. Existing free BN software packages either discretize continuous variables, which can lead to information loss, or do not include inference routines, which makes prediction with the BN impossible. We present CGBayesNets, a BN package focused around prediction of a clinical phenotype from mixed discrete and continuous variables, which fills these gaps. CGBayesNets implements Bayesian likelihood and inference algorithms for the conditional Gaussian Bayesian network (CGBNs) formalism, one appropriate for predicting an outcome of interest from, e.g., multimodal genomic data. We provide four different network learning algorithms, each making a different tradeoff between computational cost and network likelihood. CGBayesNets provides a full suite of functions for model exploration and verification, including cross validation, bootstrapping, and AUC manipulation. We highlight several results obtained previously with CGBayesNets, including predictive models of wood properties from tree genomics, leukemia subtype classification from mixed genomic data, and robust prediction of intensive care unit mortality outcomes from metabolomic profiles. We also provide detailed example analysis on public metabolomic and gene expression datasets. CGBayesNets is implemented in MATLAB and available as MATLAB source code, under an Open Source license and anonymous download at http://www.cgbayesnets.com.
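As a toy illustration of the conditional Gaussian idea behind CGBNs (not CGBayesNets code, and with made-up parameters), consider a continuous node whose mean and standard deviation depend on the state of a discrete parent, and the posterior over that parent given an observation.

```python
import math

# Per discrete-parent state: (mean, std) of the continuous child node
params = {
    "case":    (2.0, 0.5),
    "control": (0.0, 0.5),
}

def log_likelihood(x, parent_state):
    """Gaussian log-likelihood of the continuous child given the discrete parent."""
    mu, sigma = params[parent_state]
    return -0.5 * math.log(2 * math.pi * sigma ** 2) - (x - mu) ** 2 / (2 * sigma ** 2)

def posterior_case(x, prior_case=0.5):
    """P(parent = 'case' | x) by Bayes' rule over the two discrete states."""
    num = prior_case * math.exp(log_likelihood(x, "case"))
    den = num + (1 - prior_case) * math.exp(log_likelihood(x, "control"))
    return num / den

print(round(posterior_case(1.5), 3))   # observation closer to the 'case' mean
```

A full CGBN repeats this pattern over many nodes with learned structure and parameters; the two-state, single-node example only conveys the mixed discrete/continuous likelihood used for prediction.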
Colorectal Cancer Safety Net: Is It Catching Patients Appropriately?
Althans, Alison R; Brady, Justin T; Times, Melissa L; Keller, Deborah S; Harvey, Alexis R; Kelly, Molly E; Patel, Nilam D; Steele, Scott R
2018-01-01
Disparities in access to colorectal cancer care are multifactorial and are affected by socioeconomic elements. Uninsured and Medicaid patients present with advanced stage disease and have worse outcomes compared with similar privately insured patients. Safety net hospitals are a major care provider to this vulnerable population. Few studies have evaluated outcomes for safety net hospitals compared with private institutions in colorectal cancer. The purpose of this study was to compare demographics, screening rates, presentation stage, and survival rates between a safety net hospital and a tertiary care center. A comparative review of patients at 2 institutions in the same metropolitan area was conducted. The study included colorectal cancer care delivered either at 1 safety net hospital or 1 private tertiary care center in the same city from 2010 to 2016. A total of 350 patients with colorectal cancer from each hospital were evaluated. Overall survival across hospital systems was measured. The safety net hospital had significantly more uninsured and Medicaid patients (46% vs 13%; p < 0.001) and a significantly lower median household income than the tertiary care center ($39,299 vs $49,741; p < 0.0001). At initial presentation, a similar percentage of patients at each hospital presented with stage IV disease (26% vs 20%; p = 0.06). For those undergoing resection, final pathologic stage distribution was similar across groups (p = 0.10). After a comparable median follow-up period (26.6 mo for the safety net hospital vs 29.2 mo for the tertiary care center), the log-rank test for overall survival favored the safety net hospital (p = 0.05); disease-free survival was similar between hospitals (p = 0.40). This was a retrospective review, with data reported from medical charts. Our results support the value of safety net hospitals for providing quality colorectal cancer care, with survival and recurrence outcomes equivalent to or improved compared with a local tertiary care center. Because safety net hospitals can provide equivalent outcomes despite socioeconomic inequalities and financial constraints, emphasis should be focused on ensuring that adequate funding for these institutions continues. See Video Abstract at http://links.lww.com/DCR/A454.